diff --git a/.coveragerc b/.coveragerc new file mode 100644 index 000000000..2398f62e3 --- /dev/null +++ b/.coveragerc @@ -0,0 +1,3 @@ +[report] +omit = + tests/* \ No newline at end of file diff --git a/.gitignore b/.gitignore index 9a4bb620f..58e83214e 100644 --- a/.gitignore +++ b/.gitignore @@ -44,6 +44,7 @@ nosetests.xml coverage.xml *,cover .hypothesis/ +*.pytest_cache # Translations *.mo @@ -70,3 +71,8 @@ target/ # dotenv .env +.idea + +# for macOS +.DS_Store +._.DS_Store diff --git a/.travis.yml b/.travis.yml index e0932e6b2..e465e8e4c 100644 --- a/.travis.yml +++ b/.travis.yml @@ -1,20 +1,19 @@ -language: - - python +language: python python: - - "3.4" + - 3.5 + - 3.6 + - 3.7 + - 3.8 before_install: - git submodule update --remote install: - - pip install six - - pip install flake8 - - pip install ipython - - pip install matplotlib + - pip install --upgrade -r requirements.txt script: - - py.test + - py.test --cov=./ - python -m doctest -v *.py after_success: diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index 1123ef95f..f92643700 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -1,28 +1,50 @@ How to Contribute to aima-python ========================== -Thanks for considering contributing to `aima-python`! Whether you are an aspiring [Google Summer of Code](https://summerofcode.withgoogle.com/organizations/5663121491361792/) student, or an independent contributor, here is a guide to how you can help: +Thanks for considering contributing to `aima-python`! Whether you are an aspiring [Google Summer of Code](https://summerofcode.withgoogle.com/organizations/5431334980288512/) student, or an independent contributor, here is a guide on how you can help. + +First of all, you can read these write-ups from past GSoC students to get an idea about what you can do for the project. 
[Chipe1](https://github.com/aimacode/aima-python/issues/641) - [MrDupin](https://github.com/aimacode/aima-python/issues/632) + +In general, the main ways you can contribute to the repository are the following: + +1. Implement algorithms from the [list of algorithms](https://github.com/aimacode/aima-python/blob/master/README.md#index-of-algorithms). +1. Add tests for algorithms. +1. Take care of [issues](https://github.com/aimacode/aima-python/issues). +1. Write on the notebooks (`.ipynb` files). +1. Add and edit documentation (the docstrings in `.py` files). + +In more detail: ## Read the Code and Start on an Issue - First, read and understand the code to get a feel for the extent and the style. - Look at the [issues](https://github.com/aimacode/aima-python/issues) and pick one to work on. -- One of the issues is that some algorithms are missing from the [list of algorithms](https://github.com/aimacode/aima-python/blob/master/README.md#index-of-algorithms). +- One of the issues is that some algorithms are missing from the [list of algorithms](https://github.com/aimacode/aima-python/blob/master/README.md#index-of-algorithms) and that some don't have tests. -## Port to Python 3; Pythonic Idioms; py.test +## Port to Python 3; Pythonic Idioms -- Check for common problems in [porting to Python 3](http://python3porting.com/problems.html), such as: `print` is now a function; `range` and `map` and other functions no longer produce `list`s; objects of different types can no longer be compared with `<`; strings are now Unicode; it would be nice to move `%` string formating to `.format`; there is a new `next` function for generators; integer division now returns a float; we can now use set literals. 
+- Check for common problems in [porting to Python 3](http://python3porting.com/problems.html), such as: `print` is now a function; `range` and `map` and other functions no longer produce `list`; objects of different types can no longer be compared with `<`; strings are now Unicode; it would be nice to move `%` string formatting to `.format`; there is a new `next` function for generators; integer division now returns a float; we can now use set literals. - Replace old Lisp-based idioms with proper Python idioms. For example, we have many functions that were taken directly from Common Lisp, such as the `every` function: `every(callable, items)` returns true if every element of `items` is callable. This is good Lisp style, but good Python style would be to use `all` and a generator expression: `all(callable(f) for f in items)`. Eventually, fix all calls to these legacy Lisp functions and then remove the functions. -- Add more tests in `_test.py` files. Strive for terseness; it is ok to group multiple asserts into one `def test_something():` function. Move most tests to `_test.py`, but it is fine to have a single `doctest` example in the docstring of a function in the `.py` file, if the purpose of the doctest is to explain how to use the function, rather than test the implementation. ## New and Improved Algorithms - Implement functions that were in the third edition of the book but were not yet implemented in the code. Check the [list of pseudocode algorithms (pdf)](https://github.com/aimacode/pseudocode/blob/master/aima3e-algorithms.pdf) to see what's missing. - As we finish chapters for the new fourth edition, we will share the new pseudocode in the [`aima-pseudocode`](https://github.com/aimacode/aima-pseudocode) repository, and describe what changes are necessary. -We hope to have a `algorithm-name.md` file for each algorithm, eventually; it would be great if contributors could add some for the existing algorithms. 
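To make the Lisp-to-Python idiom swap described above concrete, here is a minimal sketch; the `every` function below is a simplified stand-in written for illustration, not the repository's exact legacy code:

```python
# Legacy Lisp-style helper (a simplified stand-in for the old utility):
def every(predicate, items):
    """Return True if predicate(item) is true for every item."""
    for item in items:
        if not predicate(item):
            return False
    return True

# Idiomatic Python replacement: the built-in all() with a generator expression.
nums = [2, 4, 6]
legacy = every(lambda x: x % 2 == 0, nums)
modern = all(x % 2 == 0 for x in nums)
assert legacy == modern
```

Once all call sites use `all()` directly, the legacy helper can simply be deleted.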
-- Give examples of how to use the code in the `.ipynb` file. +We hope to have an `algorithm-name.md` file for each algorithm, eventually; it would be great if contributors could add some for the existing algorithms. + +## Jupyter Notebooks -We still support a legacy branch, `aima3python2` (for the third edition of the textbook and for Python 2 code). +In this project we use Jupyter/IPython Notebooks to showcase the algorithms in the book. They serve as short tutorials on what the algorithms do, how they are implemented and how one can use them. To install Jupyter, you can follow the instructions [here](https://jupyter.org/install.html). These are some ways you can contribute to the notebooks: + +- Proofread the notebooks for grammar mistakes, typos, or general errors. +- Move visualization code, and other code unrelated to the algorithms, from notebooks to `notebook.py` (a file used to store code for the notebooks, like visualization and other miscellaneous stuff). Make sure the notebooks still work and have their outputs showing! +- Replace the `%psource` magic notebook command with the function `psource` from `notebook.py` where needed. Examples where this is useful are a) when we want to show the code for an algorithm's implementation and b) when we have consecutive cells with the magic keyword (in this case, if the code is large, it's best to leave the output hidden). +- Add the function `pseudocode(algorithm_name)` in algorithm sections. The function prints the pseudocode of the algorithm. You can see some example usage in [`knowledge.ipynb`](https://github.com/aimacode/aima-python/blob/master/knowledge.ipynb). +- Edit existing sections for algorithms to add more information and/or examples. +- Add visualizations for algorithms. The visualization code should go in `notebook.py` to keep things clean. +- Add new sections for algorithms not yet covered.
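At its core, the `psource` helper mentioned above amounts to printing a function's source code; the sketch below is a simplified, hypothetical stand-in built on the standard library's `inspect.getsource` (the real helper lives in `notebook.py` and may add niceties such as syntax highlighting):

```python
import inspect

def psource_sketch(*functions):
    """Print the source of the given functions -- a simplified,
    hypothetical stand-in for the psource helper in notebook.py.
    Note: inspect.getsource needs the functions to be defined in a file."""
    print('\n\n'.join(inspect.getsource(fn) for fn in functions))

def example(x):
    """Toy function so there is something to display."""
    return x + 1

psource_sketch(example)  # prints the definition of example() above
```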
The general format we use in the notebooks is the following: First start with an overview of the algorithm, printing the pseudocode and explaining how it works. Then, add some implementation details, including showing the code (using `psource`). Finally, add examples for the implementations, showing how the algorithms work. Don't fret over adding complex, real-world examples; the project is meant for educational purposes. You can of course choose another format if something better suits an algorithm. + +Apart from the notebooks explaining how the algorithms work, we also have notebooks showcasing some indicative applications of the algorithms. These notebooks are in the `*_apps.ipynb` format. We aim to have an `apps` notebook for each module, so if you don't see one for the module you would like to contribute to, feel free to create it from scratch! In these notebooks we are looking for applications showing what the algorithms can do. The general format of these sections is this: Add a description of the problem you are trying to solve, then explain how you are going to solve it and finally provide your solution with examples. Note that any code you write should not require any external libraries apart from the ones already provided (like `matplotlib`). # Style Guide @@ -43,54 +65,27 @@ a one-line docstring suffices. It is rarely necessary to list what each argument - At some point I may add [Pep 484](https://www.python.org/dev/peps/pep-0484/) type annotations, but I think I'll hold off for now; I want to get more experience with them, and some people may still be in Python 3.4. - -Contributing a Patch ==================== - -1. Submit an issue describing your proposed change to the repo in question (or work on an existing issue). -1. The repo owner will respond to your issue promptly. -1. Fork the desired repo, develop and test your code changes. -1. Submit a pull request. - Reporting Issues ================ - Under which versions of Python does this happen?
+- Provide an example of the issue occurring. + - Is anybody working on this? Patch Rules =========== -- Ensure that the patch is python 3.4 compliant. +- Ensure that the patch is Python 3.4 compliant. - Include tests if your patch is supposed to solve a bug, and explain clearly under which circumstances the bug happens. Make sure the test fails without your patch. - Follow the style guidelines described above. - -Running the Test-Suite ===================== - -The minimal requirement for running the testsuite is ``py.test``. You can -install it with: - - pip install pytest - -Clone this repository: - - git clone https://github.com/aimacode/aima-python.git - -Fetch the aima-data submodule: - - cd aima-python - git submodule init - git submodule update - -Then you can run the testsuite from the `aima-python` or `tests` directory with: - - py.test +- Refer to the issue you have fixed. +- Briefly explain what changes you have made and name the affected files. # Choice of Programming Languages diff --git a/README.md index 0174290c2..17f1d6085 100644 --- a/README.md +++ b/README.md @@ -1,125 +1,172 @@ -
-

-
+ # `aima-python` [![Build Status](https://travis-ci.org/aimacode/aima-python.svg?branch=master)](https://travis-ci.org/aimacode/aima-python) [![Binder](http://mybinder.org/badge.svg)](http://mybinder.org/repo/aimacode/aima-python) Python code for the book *[Artificial Intelligence: A Modern Approach](http://aima.cs.berkeley.edu).* You can use this in conjunction with a course on AI, or for study on your own. We're looking for [solid contributors](https://github.com/aimacode/aima-python/blob/master/CONTRIBUTING.md) to help. -## Python 3.4 +# Updates for 4th Edition + +The 4th edition of the book is out now in 2020, and thus we are updating the code. All code here will reflect the 4th edition. Changes include: + +- Move from Python 3.5 to 3.7. +- More emphasis on Jupyter (IPython) notebooks. +- More projects using external packages (tensorflow, etc.). + + + +# Structure of the Project + +When complete, this project will have Python implementations for all the pseudocode algorithms in the book, as well as tests and examples of use. For each major topic, such as `search`, we provide the following files: + +- `search.ipynb` and `search.py`: Implementations of all the pseudocode algorithms, and necessary support functions/classes/data. The `.py` file is generated automatically from the `.ipynb` file; the idea is that it is easier to read the documentation in the `.ipynb` file. +- `search_XX.ipynb`: Notebooks that show how to use the code, broken out into various topics (the `XX`). +- `tests/test_search.py`: A lightweight test suite, using `assert` statements, designed for use with [`py.test`](http://pytest.org/latest/), but also usable on its own. + +# Python 3.7 and up + +The code for the 3rd edition was in Python 3.5; the current 4th edition code is in Python 3.7. It should also run in later versions, but does not run in Python 2.
You can [install Python](https://www.python.org/downloads) or use a browser-based Python interpreter such as [repl.it](https://repl.it/languages/python3). +You can run the code in an IDE, or from the command line with `python -i filename.py` where the `-i` option puts you in an interactive loop where you can run Python functions. All notebooks are available in a [binder environment](http://mybinder.org/repo/aimacode/aima-python). Alternatively, visit [jupyter.org](http://jupyter.org/) for instructions on setting up your own Jupyter notebook environment. + +Features from Python 3.6 and 3.7 that we will be using for this version of the code: +- [f-strings](https://docs.python.org/3.6/whatsnew/3.6.html#whatsnew36-pep498): all string formatting should be done with `f'var = {var}'`, not with `'var = {}'.format(var)` nor `'var = %s' % var`. +- [`typing` module](https://docs.python.org/3.7/library/typing.html): declare functions with type hints: `def successors(state) -> List[State]:`; that is, give type declarations, but omit them when it is obvious. I don't need to say `state: State`, but in another context it would make sense to say `s: State`. +- Underscores in numerics: write a million as `1_000_000` not as `1000000`. +- [`dataclasses` module](https://docs.python.org/3.7/library/dataclasses.html#module-dataclasses): replace `namedtuple` with `dataclass`. + + +[//]: # (There is a sibling [aima-docker](https://github.com/rajatjain1997/aima-docker) project that shows you how to use docker containers to run more complex problems in more complex software environments.)
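A small sketch pulling the features listed above together; the `State` and `successors` names below are illustrative only, not code from the repository:

```python
from dataclasses import dataclass
from typing import List

@dataclass                     # dataclass instead of namedtuple
class State:
    name: str
    cost: int = 0

def successors(state: State) -> List[State]:   # type hints where helpful
    """Toy successor function returning one child state."""
    return [State(f'{state.name}-child', state.cost + 1)]

POPULATION = 1_000_000         # underscores in numeric literals

root = State('root')
children = successors(root)
print(f'expanded {root.name}: {len(children)} successor(s)')   # f-string formatting
```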
+ + +## Installation Guide + +To download the repository: + +`git clone https://github.com/aimacode/aima-python.git` + +Then you need to install the basic dependencies to run the project on your system: + +``` +cd aima-python +pip install -r requirements.txt +``` + +You also need to fetch the datasets from the [`aima-data`](https://github.com/aimacode/aima-data) repository: + +``` +git submodule init +git submodule update +``` + +Wait for the datasets to download; it may take a while. Once they are downloaded, you need to install `pytest`, so that you can run the test suite: -This code is in Python 3.4 (Python 3.5 and later also works, but Python 2.x does not). You can [install the latest Python version](https://www.python.org/downloads) or use a browser-based Python interpreter such as [repl.it](https://repl.it/languages/python3). -You can run the code in an IDE, or from the command line with `python -i filename.py` where the `-i` option puts you in an interactive loop where you can run Python functions. +`pip install pytest` -In addition to the `filename.py` files, there are also `filename.ipynb` files, which are Jupyter (formerly IPython) notebooks. You can read these notebooks, and you can also run the code embedded with them. See [jupyter.org](http://jupyter.org/) for instructions on setting up a Jupyter notebook environment. Some modules also have `filename_apps.ipynb` files, which are notebooks for applications of the module. +Then to run the tests: -## Structure of the Project +`py.test` -When complete, this project will have Python code for all the pseudocode algorithms in the book. For each major topic, such as `nlp`, we will have the following three files in the main branch: +And you are good to go! -- `nlp.py`: Implementations of all the pseudocode algorithms, and necessary support functions/classes/data. -- `nlp.ipynb`: A Jupyter (IPython) notebook that explains and gives examples of how to use the code.
-- `nlp_apps.ipynb`: A Jupyter notebook that gives example applications of the code. -- `tests/test_nlp.py`: A lightweight test suite, using `assert` statements, designed for use with [`py.test`](http://pytest.org/latest/), but also usable on their own. # Index of Algorithms -Here is a table of algorithms, the figure, name of the algorithm in the book and in the repository, and the file where they are implemented in the repository. This chart was made for the third edition of the book and needs to be updated for the upcoming fourth edition. Empty implementations are a good place for contributors to look for an issue. The [aima-pseudocode](https://github.com/aimacode/aima-pseudocode) project describes all the algorithms from the book. An asterisk next to the file name denotes the algorithm is not fully implemented. - -| **Figure** | **Name (in 3rd edition)** | **Name (in repository)** | **File** | **Tests** -|:--------|:-------------------|:---------|:-----------|:-------| -| 2.1 | Environment | `Environment` | [`agents.py`][agents] | Done | -| 2.1 | Agent | `Agent` | [`agents.py`][agents] | Done | -| 2.3 | Table-Driven-Vacuum-Agent | `TableDrivenVacuumAgent` | [`agents.py`][agents] | | -| 2.7 | Table-Driven-Agent | `TableDrivenAgent` | [`agents.py`][agents] | | -| 2.8 | Reflex-Vacuum-Agent | `ReflexVacuumAgent` | [`agents.py`][agents] | Done | -| 2.10 | Simple-Reflex-Agent | `SimpleReflexAgent` | [`agents.py`][agents] | | -| 2.12 | Model-Based-Reflex-Agent | `ReflexAgentWithState` | [`agents.py`][agents] | | -| 3 | Problem | `Problem` | [`search.py`][search] | Done | -| 3 | Node | `Node` | [`search.py`][search] | Done | -| 3 | Queue | `Queue` | [`utils.py`][utils] | Done | -| 3.1 | Simple-Problem-Solving-Agent | `SimpleProblemSolvingAgent` | [`search.py`][search] | | -| 3.2 | Romania | `romania` | [`search.py`][search] | Done | -| 3.7 | Tree-Search | `tree_search` | [`search.py`][search] | Done | -| 3.7 | Graph-Search | `graph_search` | [`search.py`][search] | Done 
| -| 3.11 | Breadth-First-Search | `breadth_first_search` | [`search.py`][search] | Done | -| 3.14 | Uniform-Cost-Search | `uniform_cost_search` | [`search.py`][search] | Done | -| 3.17 | Depth-Limited-Search | `depth_limited_search` | [`search.py`][search] | Done | -| 3.18 | Iterative-Deepening-Search | `iterative_deepening_search` | [`search.py`][search] | Done | -| 3.22 | Best-First-Search | `best_first_graph_search` | [`search.py`][search] | Done | -| 3.24 | A\*-Search | `astar_search` | [`search.py`][search] | Done | -| 3.26 | Recursive-Best-First-Search | `recursive_best_first_search` | [`search.py`][search] | Done | -| 4.2 | Hill-Climbing | `hill_climbing` | [`search.py`][search] | Done | -| 4.5 | Simulated-Annealing | `simulated_annealing` | [`search.py`][search] | Done | -| 4.8 | Genetic-Algorithm | `genetic_algorithm` | [`search.py`][search] | Done | -| 4.11 | And-Or-Graph-Search | `and_or_graph_search` | [`search.py`][search] | Done | -| 4.21 | Online-DFS-Agent | `online_dfs_agent` | [`search.py`][search] | | -| 4.24 | LRTA\*-Agent | `LRTAStarAgent` | [`search.py`][search] | Done | -| 5.3 | Minimax-Decision | `minimax_decision` | [`games.py`][games] | Done | -| 5.7 | Alpha-Beta-Search | `alphabeta_search` | [`games.py`][games] | Done | -| 6 | CSP | `CSP` | [`csp.py`][csp] | Done | -| 6.3 | AC-3 | `AC3` | [`csp.py`][csp] | Done | -| 6.5 | Backtracking-Search | `backtracking_search` | [`csp.py`][csp] | Done | -| 6.8 | Min-Conflicts | `min_conflicts` | [`csp.py`][csp] | Done | -| 6.11 | Tree-CSP-Solver | `tree_csp_solver` | [`csp.py`][csp] | Done | -| 7 | KB | `KB` | [`logic.py`][logic] | Done | -| 7.1 | KB-Agent | `KB_Agent` | [`logic.py`][logic] | Done | -| 7.7 | Propositional Logic Sentence | `Expr` | [`logic.py`][logic] | Done | -| 7.10 | TT-Entails | `tt_entails` | [`logic.py`][logic] | Done | -| 7.12 | PL-Resolution | `pl_resolution` | [`logic.py`][logic] | Done | -| 7.14 | Convert to CNF | `to_cnf` | [`logic.py`][logic] | Done | -| 7.15 | 
PL-FC-Entails? | `pl_fc_resolution` | [`logic.py`][logic] | Done | -| 7.17 | DPLL-Satisfiable? | `dpll_satisfiable` | [`logic.py`][logic] | Done | -| 7.18 | WalkSAT | `WalkSAT` | [`logic.py`][logic] | Done | -| 7.20 | Hybrid-Wumpus-Agent | `HybridWumpusAgent` | | | -| 7.22 | SATPlan | `SAT_plan` | [`logic.py`][logic] | Done | -| 9 | Subst | `subst` | [`logic.py`][logic] | Done | -| 9.1 | Unify | `unify` | [`logic.py`][logic] | Done | -| 9.3 | FOL-FC-Ask | `fol_fc_ask` | [`logic.py`][logic] | Done | -| 9.6 | FOL-BC-Ask | `fol_bc_ask` | [`logic.py`][logic] | Done | -| 9.8 | Append | | | | -| 10.1 | Air-Cargo-problem | `air_cargo` | [`planning.py`][planning] | Done | -| 10.2 | Spare-Tire-Problem | `spare_tire` | [`planning.py`][planning] | Done | -| 10.3 | Three-Block-Tower | `three_block_tower` | [`planning.py`][planning] | Done | -| 10.7 | Cake-Problem | `have_cake_and_eat_cake_too` | [`planning.py`][planning] | Done | -| 10.9 | Graphplan | `GraphPlan` | [`planning.py`][planning] | | -| 10.13 | Partial-Order-Planner | | | | -| 11.1 | Job-Shop-Problem-With-Resources | `job_shop_problem` | [`planning.py`][planning] | Done | -| 11.5 | Hierarchical-Search | `hierarchical_search` | [`planning.py`][planning] | | -| 11.8 | Angelic-Search | | | | -| 11.10 | Doubles-tennis | `double_tennis_problem` | [`planning.py`][planning] | | -| 13 | Discrete Probability Distribution | `ProbDist` | [`probability.py`][probability] | Done | -| 13.1 | DT-Agent | `DTAgent` | [`probability.py`][probability] | | -| 14.9 | Enumeration-Ask | `enumeration_ask` | [`probability.py`][probability] | Done | -| 14.11 | Elimination-Ask | `elimination_ask` | [`probability.py`][probability] | Done | -| 14.13 | Prior-Sample | `prior_sample` | [`probability.py`][probability] | | -| 14.14 | Rejection-Sampling | `rejection_sampling` | [`probability.py`][probability] | Done | -| 14.15 | Likelihood-Weighting | `likelihood_weighting` | [`probability.py`][probability] | Done | -| 14.16 | Gibbs-Ask | `gibbs_ask` | 
[`probability.py`][probability] | | -| 15.4 | Forward-Backward | `forward_backward` | [`probability.py`][probability] | Done | -| 15.6 | Fixed-Lag-Smoothing | `fixed_lag_smoothing` | [`probability.py`][probability] | Done | -| 15.17 | Particle-Filtering | `particle_filtering` | [`probability.py`][probability] | Done | -| 16.9 | Information-Gathering-Agent | | | -| 17.4 | Value-Iteration | `value_iteration` | [`mdp.py`][mdp] | Done | -| 17.7 | Policy-Iteration | `policy_iteration` | [`mdp.py`][mdp] | Done | -| 17.9 | POMDP-Value-Iteration | | | | -| 18.5 | Decision-Tree-Learning | `DecisionTreeLearner` | [`learning.py`][learning] | Done | -| 18.8 | Cross-Validation | `cross_validation` | [`learning.py`][learning] | | -| 18.11 | Decision-List-Learning | `DecisionListLearner` | [`learning.py`][learning]\* | | -| 18.24 | Back-Prop-Learning | `BackPropagationLearner` | [`learning.py`][learning] | Done | -| 18.34 | AdaBoost | `AdaBoost` | [`learning.py`][learning] | | -| 19.2 | Current-Best-Learning | `current_best_learning` | [`knowledge.py`](knowledge.py) | Done | -| 19.3 | Version-Space-Learning | `version_space_learning` | [`knowledge.py`](knowledge.py) | Done | -| 19.8 | Minimal-Consistent-Det | `minimal_consistent_det` | [`knowledge.py`](knowledge.py) | Done | -| 19.12 | FOIL | `FOIL_container` | [`knowledge.py`](knowledge.py) | Done | -| 21.2 | Passive-ADP-Agent | `PassiveADPAgent` | [`rl.py`][rl] | Done | -| 21.4 | Passive-TD-Agent | `PassiveTDAgent` | [`rl.py`][rl] | Done | -| 21.8 | Q-Learning-Agent | `QLearningAgent` | [`rl.py`][rl] | Done | -| 22.1 | HITS | `HITS` | [`nlp.py`][nlp] | Done | -| 23 | Chart-Parse | `Chart` | [`nlp.py`][nlp] | Done | -| 23.5 | CYK-Parse | `CYK_parse` | [`nlp.py`][nlp] | Done | -| 25.9 | Monte-Carlo-Localization| `monte_carlo_localization` | [`probability.py`][probability] | Done | +Here is a table of algorithms, the figure, name of the algorithm in the book and in the repository, and the file where they are implemented in the 
repository. This chart was made for the third edition of the book and is being updated for the upcoming fourth edition. Empty implementations are a good place for contributors to look for an issue. The [aima-pseudocode](https://github.com/aimacode/aima-pseudocode) project describes all the algorithms from the book. An asterisk next to the file name denotes the algorithm is not fully implemented. Another great place for contributors to start is by adding tests and writing on the notebooks. You can see which algorithms have tests and notebook sections below. If the algorithm you want to work on is covered, don't worry! You can still add more tests and provide some examples of use in the notebook! + +| **Figure** | **Name (in 3rd edition)** | **Name (in repository)** | **File** | **Tests** | **Notebook** +|:-------|:----------------------------------|:------------------------------|:--------------------------------|:-----|:---------| +| 2 | Random-Vacuum-Agent | `RandomVacuumAgent` | [`agents.py`][agents] | Done | Included | +| 2 | Model-Based-Vacuum-Agent | `ModelBasedVacuumAgent` | [`agents.py`][agents] | Done | Included | +| 2.1 | Environment | `Environment` | [`agents.py`][agents] | Done | Included | +| 2.1 | Agent | `Agent` | [`agents.py`][agents] | Done | Included | +| 2.3 | Table-Driven-Vacuum-Agent | `TableDrivenVacuumAgent` | [`agents.py`][agents] | Done | Included | +| 2.7 | Table-Driven-Agent | `TableDrivenAgent` | [`agents.py`][agents] | Done | Included | +| 2.8 | Reflex-Vacuum-Agent | `ReflexVacuumAgent` | [`agents.py`][agents] | Done | Included | +| 2.10 | Simple-Reflex-Agent | `SimpleReflexAgent` | [`agents.py`][agents] | Done | Included | +| 2.12 | Model-Based-Reflex-Agent | `ReflexAgentWithState` | [`agents.py`][agents] | Done | Included | +| 3 | Problem | `Problem` | [`search.py`][search] | Done | Included | +| 3 | Node | `Node` | [`search.py`][search] | Done | Included | +| 3 | Queue | `Queue` | [`utils.py`][utils] | Done | No Need | +| 3.1 | 
Simple-Problem-Solving-Agent | `SimpleProblemSolvingAgent` | [`search.py`][search] | Done | Included | +| 3.2 | Romania | `romania` | [`search.py`][search] | Done | Included | +| 3.7 | Tree-Search | `depth/breadth_first_tree_search` | [`search.py`][search] | Done | Included | +| 3.7 | Graph-Search | `depth/breadth_first_graph_search` | [`search.py`][search] | Done | Included | +| 3.11 | Breadth-First-Search | `breadth_first_graph_search` | [`search.py`][search] | Done | Included | +| 3.14 | Uniform-Cost-Search | `uniform_cost_search` | [`search.py`][search] | Done | Included | +| 3.17 | Depth-Limited-Search | `depth_limited_search` | [`search.py`][search] | Done | Included | +| 3.18 | Iterative-Deepening-Search | `iterative_deepening_search` | [`search.py`][search] | Done | Included | +| 3.22 | Best-First-Search | `best_first_graph_search` | [`search.py`][search] | Done | Included | +| 3.24 | A\*-Search | `astar_search` | [`search.py`][search] | Done | Included | +| 3.26 | Recursive-Best-First-Search | `recursive_best_first_search` | [`search.py`][search] | Done | Included | +| 4.2 | Hill-Climbing | `hill_climbing` | [`search.py`][search] | Done | Included | +| 4.5 | Simulated-Annealing | `simulated_annealing` | [`search.py`][search] | Done | Included | +| 4.8 | Genetic-Algorithm | `genetic_algorithm` | [`search.py`][search] | Done | Included | +| 4.11 | And-Or-Graph-Search | `and_or_graph_search` | [`search.py`][search] | Done | Included | +| 4.21 | Online-DFS-Agent | `online_dfs_agent` | [`search.py`][search] | Done | Included | +| 4.24 | LRTA\*-Agent | `LRTAStarAgent` | [`search.py`][search] | Done | Included | +| 5.3 | Minimax-Decision | `minimax_decision` | [`games.py`][games] | Done | Included | +| 5.7 | Alpha-Beta-Search | `alphabeta_search` | [`games.py`][games] | Done | Included | +| 6 | CSP | `CSP` | [`csp.py`][csp] | Done | Included | +| 6.3 | AC-3 | `AC3` | [`csp.py`][csp] | Done | Included | +| 6.5 | Backtracking-Search | `backtracking_search` | 
[`csp.py`][csp] | Done | Included | +| 6.8 | Min-Conflicts | `min_conflicts` | [`csp.py`][csp] | Done | Included | +| 6.11 | Tree-CSP-Solver | `tree_csp_solver` | [`csp.py`][csp] | Done | Included | +| 7 | KB | `KB` | [`logic.py`][logic] | Done | Included | +| 7.1 | KB-Agent | `KB_AgentProgram` | [`logic.py`][logic] | Done | Included | +| 7.7 | Propositional Logic Sentence | `Expr` | [`utils.py`][utils] | Done | Included | +| 7.10 | TT-Entails | `tt_entails` | [`logic.py`][logic] | Done | Included | +| 7.12 | PL-Resolution | `pl_resolution` | [`logic.py`][logic] | Done | Included | +| 7.14 | Convert to CNF | `to_cnf` | [`logic.py`][logic] | Done | Included | +| 7.15 | PL-FC-Entails? | `pl_fc_entails` | [`logic.py`][logic] | Done | Included | +| 7.17 | DPLL-Satisfiable? | `dpll_satisfiable` | [`logic.py`][logic] | Done | Included | +| 7.18 | WalkSAT | `WalkSAT` | [`logic.py`][logic] | Done | Included | +| 7.20 | Hybrid-Wumpus-Agent | `HybridWumpusAgent` | | | | +| 7.22 | SATPlan | `SAT_plan` | [`logic.py`][logic] | Done | Included | +| 9 | Subst | `subst` | [`logic.py`][logic] | Done | Included | +| 9.1 | Unify | `unify` | [`logic.py`][logic] | Done | Included | +| 9.3 | FOL-FC-Ask | `fol_fc_ask` | [`logic.py`][logic] | Done | Included | +| 9.6 | FOL-BC-Ask | `fol_bc_ask` | [`logic.py`][logic] | Done | Included | +| 10.1 | Air-Cargo-problem | `air_cargo` | [`planning.py`][planning] | Done | Included | +| 10.2 | Spare-Tire-Problem | `spare_tire` | [`planning.py`][planning] | Done | Included | +| 10.3 | Three-Block-Tower | `three_block_tower` | [`planning.py`][planning] | Done | Included | +| 10.7 | Cake-Problem | `have_cake_and_eat_cake_too` | [`planning.py`][planning] | Done | Included | +| 10.9 | Graphplan | `GraphPlan` | [`planning.py`][planning] | Done | Included | +| 10.13 | Partial-Order-Planner | `PartialOrderPlanner` | [`planning.py`][planning] | Done | Included | +| 11.1 | Job-Shop-Problem-With-Resources | `job_shop_problem` | [`planning.py`][planning] | 
Done | Included | +| 11.5 | Hierarchical-Search | `hierarchical_search` | [`planning.py`][planning] | Done | Included | +| 11.8 | Angelic-Search | `angelic_search` | [`planning.py`][planning] | Done | Included | +| 11.10 | Doubles-tennis | `double_tennis_problem` | [`planning.py`][planning] | Done | Included | +| 13 | Discrete Probability Distribution | `ProbDist` | [`probability.py`][probability] | Done | Included | +| 13.1 | DT-Agent | `DTAgent` | [`probability.py`][probability] | Done | Included | +| 14.9 | Enumeration-Ask | `enumeration_ask` | [`probability.py`][probability] | Done | Included | +| 14.11 | Elimination-Ask | `elimination_ask` | [`probability.py`][probability] | Done | Included | +| 14.13 | Prior-Sample | `prior_sample` | [`probability.py`][probability] | Done | Included | +| 14.14 | Rejection-Sampling | `rejection_sampling` | [`probability.py`][probability] | Done | Included | +| 14.15 | Likelihood-Weighting | `likelihood_weighting` | [`probability.py`][probability] | Done | Included | +| 14.16 | Gibbs-Ask | `gibbs_ask` | [`probability.py`][probability] | Done | Included | +| 15.4 | Forward-Backward | `forward_backward` | [`probability.py`][probability] | Done | Included | +| 15.6 | Fixed-Lag-Smoothing | `fixed_lag_smoothing` | [`probability.py`][probability] | Done | Included | +| 15.17 | Particle-Filtering | `particle_filtering` | [`probability.py`][probability] | Done | Included | +| 16.9 | Information-Gathering-Agent | `InformationGatheringAgent` | [`probability.py`][probability] | Done | Included | +| 17.4 | Value-Iteration | `value_iteration` | [`mdp.py`][mdp] | Done | Included | +| 17.7 | Policy-Iteration | `policy_iteration` | [`mdp.py`][mdp] | Done | Included | +| 17.9 | POMDP-Value-Iteration | `pomdp_value_iteration` | [`mdp.py`][mdp] | Done | Included | +| 18.5 | Decision-Tree-Learning | `DecisionTreeLearner` | [`learning.py`][learning] | Done | Included | +| 18.8 | Cross-Validation | `cross_validation` | [`learning.py`][learning]\* | 
| | +| 18.11 | Decision-List-Learning | `DecisionListLearner` | [`learning.py`][learning]\* | | | +| 18.24 | Back-Prop-Learning | `BackPropagationLearner` | [`learning.py`][learning] | Done | Included | +| 18.34 | AdaBoost | `AdaBoost` | [`learning.py`][learning] | Done | Included | +| 19.2 | Current-Best-Learning | `current_best_learning` | [`knowledge.py`](knowledge.py) | Done | Included | +| 19.3 | Version-Space-Learning | `version_space_learning` | [`knowledge.py`](knowledge.py) | Done | Included | +| 19.8 | Minimal-Consistent-Det | `minimal_consistent_det` | [`knowledge.py`](knowledge.py) | Done | Included | +| 19.12 | FOIL | `FOIL_container` | [`knowledge.py`](knowledge.py) | Done | Included | +| 21.2 | Passive-ADP-Agent | `PassiveADPAgent` | [`rl.py`][rl] | Done | Included | +| 21.4 | Passive-TD-Agent | `PassiveTDAgent` | [`rl.py`][rl] | Done | Included | +| 21.8 | Q-Learning-Agent | `QLearningAgent` | [`rl.py`][rl] | Done | Included | +| 22.1 | HITS | `HITS` | [`nlp.py`][nlp] | Done | Included | +| 23 | Chart-Parse | `Chart` | [`nlp.py`][nlp] | Done | Included | +| 23.5 | CYK-Parse | `CYK_parse` | [`nlp.py`][nlp] | Done | Included | +| 25.9 | Monte-Carlo-Localization | `monte_carlo_localization` | [`probability.py`][probability] | Done | Included | # Index of data structures @@ -127,20 +174,20 @@ Here is a table of algorithms, the figure, name of the algorithm in the book and Here is a table of the implemented data structures, the figure, name of the implementation in the repository, and the file where they are implemented. 
| **Figure** | **Name (in repository)** | **File** | -|:-----------|:-------------------------|:---------| -| 3.2 | romania_map | [`search.py`][search] | -| 4.9 | vacumm_world | [`search.py`][search] | -| 4.23 | one_dim_state_space | [`search.py`][search] | -| 6.1 | australia_map | [`search.py`][search] | -| 7.13 | wumpus_world_inference | [`logic.py`][logic] | -| 7.16 | horn_clauses_KB | [`logic.py`][logic] | -| 17.1 | sequential_decision_environment | [`mdp.py`][mdp] | -| 18.2 | waiting_decision_tree | [`learning.py`][learning] | +|:-------|:--------------------------------|:--------------------------| +| 3.2 | romania_map | [`search.py`][search] | +| 4.9 | vacumm_world | [`search.py`][search] | +| 4.23 | one_dim_state_space | [`search.py`][search] | +| 6.1 | australia_map | [`search.py`][search] | +| 7.13 | wumpus_world_inference | [`logic.py`][logic] | +| 7.16 | horn_clauses_KB | [`logic.py`][logic] | +| 17.1 | sequential_decision_environment | [`mdp.py`][mdp] | +| 18.2 | waiting_decision_tree | [`learning.py`][learning] | # Acknowledgements -Many thanks for contributions over the years. I got bug reports, corrected code, and other support from Darius Bacon, Phil Ruggera, Peng Shao, Amit Patil, Ted Nienstedt, Jim Martin, Ben Catanzariti, and others. Now that the project is on GitHub, you can see the [contributors](https://github.com/aimacode/aima-python/graphs/contributors) who are doing a great job of actively improving the project. Many thanks to all contributors, especially @darius, @SnShine, @reachtarunhere, @MrDupin, and @Chipe1. +Many thanks for contributions over the years. I got bug reports, corrected code, and other support from Darius Bacon, Phil Ruggera, Peng Shao, Amit Patil, Ted Nienstedt, Jim Martin, Ben Catanzariti, and others. Now that the project is on GitHub, you can see the [contributors](https://github.com/aimacode/aima-python/graphs/contributors) who are doing a great job of actively improving the project. 
Many thanks to all contributors, especially [@darius](https://github.com/darius), [@SnShine](https://github.com/SnShine), [@reachtarunhere](https://github.com/reachtarunhere), [@antmarakis](https://github.com/antmarakis), [@Chipe1](https://github.com/Chipe1), [@ad71](https://github.com/ad71) and [@MariannaSpyrakou](https://github.com/MariannaSpyrakou). [agents]:../master/agents.py diff --git a/SUBMODULE.md b/SUBMODULE.md new file mode 100644 index 000000000..2c080bb91 --- /dev/null +++ b/SUBMODULE.md @@ -0,0 +1,11 @@ +This is a guide on how to update the `aima-data` submodule to the latest version. This needs to be done every time something changes in the [aima-data](https://github.com/aimacode/aima-data) repository. All the below commands should be executed from the local directory of the `aima-python` repository, using `git`. + +``` +git submodule deinit aima-data +git rm aima-data +git submodule add https://github.com/aimacode/aima-data.git aima-data +git commit +git push origin +``` + +Then you need to pull request the changes (unless you are a collaborator, in which case you can commit directly to the master). diff --git a/agents.ipynb b/agents.ipynb index 968c8cdc9..636df75e3 100644 --- a/agents.ipynb +++ b/agents.ipynb @@ -4,26 +4,120 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# AGENT #\n", + "# Intelligent Agents #\n", "\n", - "An agent, as defined in 2.1 is anything that can perceive its environment through sensors, and act upon that environment through actuators based on its agent program. This can be a dog, robot, or even you. As long as you can perceive the environment and act on it, you are an agent. 
This notebook will explain how to implement a simple agent, create an environment, and create a program that helps the agent act on the environment based on its percepts.\n", +    "This notebook serves as supporting material for topics covered in **Chapter 2 - Intelligent Agents** from the book *Artificial Intelligence: A Modern Approach.* This notebook uses implementations from the [agents.py](https://github.com/aimacode/aima-python/blob/master/agents.py) module. Let's start by importing everything from the agents module." +   ] +  }, +  { +   "cell_type": "code", +   "execution_count": null, +   "metadata": {}, +   "outputs": [], +   "source": [ +    "from agents import *\n", +    "from notebook import psource" +   ] +  }, +  { +   "cell_type": "markdown", +   "metadata": {}, +   "source": [ +    "## CONTENTS\n", "\n", -    "Before moving on, review the Agent and Environment classes in [agents.py](https://github.com/aimacode/aima-python/blob/master/agents.py).\n", +    "* Overview\n", +    "* Agent\n", +    "* Environment\n", +    "* Simple Agent and Environment\n", +    "* Agents in a 2-D Environment\n", +    "* Wumpus Environment\n", "\n", -    "Let's begin by importing all the functions from the agents.py module and creating our first agent - a blind dog." +    "## OVERVIEW\n", +    "\n", +    "An agent, as defined in 2.1, is anything that can perceive its environment through sensors, and act upon that environment through actuators based on its agent program. This can be a dog, a robot, or even you. As long as you can perceive the environment and act on it, you are an agent. This notebook will explain how to implement a simple agent, create an environment, and implement a program that helps the agent act on the environment based on its percepts.\n", +    "\n", +    "## AGENT\n", +    "\n", +    "Let us now see how we define an agent. Run the next cell to see how `Agent` is defined in the agents module."
] }, { "cell_type": "code", -   "execution_count": 1, -   "metadata": { -    "collapsed": false, -    "scrolled": true -   }, +   "execution_count": null, +   "metadata": {}, "outputs": [], "source": [ -    "from agents import *\n", +    "psource(Agent)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The `Agent` class has two methods.\n", "* `__init__(self, program=None)`: The constructor defines various attributes of the Agent. These include\n", "\n", "    * `alive`: which keeps track of whether the agent is alive or not \n", "    \n", "    * `bump`: which tracks if the agent collides with an edge of the environment (e.g. a wall in a park)\n", "    \n", "    * `holding`: which is a list containing the `Things` an agent is holding \n", "    \n", "    * `performance`: which evaluates the performance metrics of the agent \n", "    \n", "    * `program`: which is the agent program and maps an agent's percepts to actions in the environment. If no implementation is provided, it defaults to asking the user to provide actions for each percept.\n", "    \n", "* `can_grab(self, thing)`: is used when an environment contains things that an agent can grab and carry. By default, an agent can carry nothing.\n", "\n", "## ENVIRONMENT\n", "Now, let us see how environments are defined. Running the next cell will display an implementation of the abstract `Environment` class." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "psource(Environment)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The `Environment` class has a lot of methods! 
But most of them are incredibly simple, so let's see the ones we'll be using in this notebook.\n", "\n", "* `thing_classes(self)`: Returns a static array of `Thing` sub-classes that determine what things are allowed in the environment and what aren't\n", "\n", "* `add_thing(self, thing, location=None)`: Adds a thing to the environment at location\n", "\n", "* `run(self, steps)`: Runs an environment with the agent in it for a given number of steps.\n", "\n", "* `is_done(self)`: Returns true if the objective of the agent and the environment has been completed\n", "\n", "The next two functions must be implemented by each subclass of `Environment` for the agent to receive percepts and execute actions \n", "\n", "* `percept(self, agent)`: Given an agent, this method returns a list of percepts that the agent sees at the current time\n", "\n", "* `execute_action(self, agent, action)`: The environment reacts to an action performed by a given agent. The changes may result in the agent experiencing new percepts or other elements reacting to agent input." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## SIMPLE AGENT AND ENVIRONMENT\n", "\n", "Let's begin by using the `Agent` class to create our first agent - a blind dog."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ "class BlindDog(Agent):\n", " def eat(self, thing):\n", " print(\"Dog: Ate food at {}.\".format(self.location))\n", @@ -43,19 +137,9 @@ }, { "cell_type": "code", - "execution_count": 2, - "metadata": { - "collapsed": false - }, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "True\n" - ] - } - ], + "execution_count": null, + "metadata": {}, + "outputs": [], "source": [ "print(dog.alive)" ] @@ -72,20 +156,15 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# ENVIRONMENT #\n", - "\n", - "A park is an example of an environment because our dog can perceive and act upon it. The Environment class in agents.py is an abstract class, so we will have to create our own subclass from it before we can use it. The abstract class must contain the following methods:\n", + "### ENVIRONMENT - Park\n", "\n", - "
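The percept/action contract just described can also be sketched outside the notebook in a few lines of plain Python. The `TinyAgent`, `TinyRoom`, and `Snack` names below are invented stand-ins for illustration, not classes from agents.py; they only mirror the `percept`/`execute_action` interface explained above.

```python
# Minimal self-contained sketch of the percept -> action loop described above.
class Snack:
    pass

class TinyAgent:
    def __init__(self, program):
        self.location = 0
        self.program = program  # maps a list of percepts to an action

class TinyRoom:
    """A 1-D row of locations; a real Environment subclass looks similar."""
    def __init__(self, things):
        self.things = things  # maps location -> list of things

    def percept(self, agent):
        # what the agent senses at its current location
        return self.things.get(agent.location, [])

    def execute_action(self, agent, action):
        # the environment changes state in response to the action
        if action == 'eat':
            self.things[agent.location] = []
        elif action == 'move':
            agent.location += 1

    def step(self, agent):
        agent.last_action = agent.program(self.percept(agent))
        self.execute_action(agent, agent.last_action)

def program(percepts):
    return 'eat' if any(isinstance(p, Snack) for p in percepts) else 'move'

room = TinyRoom({2: [Snack()]})
dog = TinyAgent(program)
for _ in range(3):
    room.step(dog)
print(dog.location, dog.last_action)  # the agent moves 0 -> 1 -> 2, then eats
```

The real `Environment.run(steps)` in agents.py does essentially what `step` does here, repeated for every agent on every step.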
  • percept(self, agent) - returns what the agent perceives
  • \n", - "
  • execute_action(self, agent, action) - changes the state of the environment based on what the agent does.
  • " + "A park is an example of an environment because our dog can perceive and act upon it. The Environment class is an abstract class, so we will have to create our own subclass from it before we can use it." ] }, { "cell_type": "code", - "execution_count": 3, - "metadata": { - "collapsed": false - }, + "execution_count": null, + "metadata": {}, "outputs": [], "source": [ "class Food(Thing):\n", @@ -96,7 +175,7 @@ "\n", "class Park(Environment):\n", " def percept(self, agent):\n", - " '''prints & return a list of things that are in our agent's location'''\n", + " '''return a list of things that are in our agent's location'''\n", " things = self.list_things_at(agent.location)\n", " return things\n", " \n", @@ -130,35 +209,16 @@ }, { "cell_type": "markdown", - "metadata": { - "collapsed": true - }, + "metadata": {}, "source": [ - "# PROGRAM - BlindDog #\n", - "Now that we have a Park Class, we need to implement a program module for our dog. A program controls how the dog acts upon it's environment. Our program will be very simple, and is shown in the table below.\n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
    Percept: Feel Food Feel WaterFeel Nothing
    Action: eatdrinkmove down
    \n" + "### PROGRAM - BlindDog\n", + "Now that we have a Park Class, we re-implement our BlindDog to be able to move down and eat food or drink water only if it is present.\n" ] }, { "cell_type": "code", - "execution_count": 4, - "metadata": { - "collapsed": false - }, + "execution_count": null, + "metadata": {}, "outputs": [], "source": [ "class BlindDog(Agent):\n", @@ -170,19 +230,46 @@ " def eat(self, thing):\n", " '''returns True upon success or False otherwise'''\n", " if isinstance(thing, Food):\n", - " #print(\"Dog: Ate food at {}.\".format(self.location))\n", " return True\n", " return False\n", " \n", " def drink(self, thing):\n", " ''' returns True upon success or False otherwise'''\n", " if isinstance(thing, Water):\n", - " #print(\"Dog: Drank water at {}.\".format(self.location))\n", " return True\n", - " return False\n", + " return False" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now its time to implement a program module for our dog. A program controls how the dog acts upon its environment. Our program will be very simple, and is shown in the table below.\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", " \n", + "
    Percept: Feel Food Feel WaterFeel Nothing
    Action: eatdrinkmove down
    " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ "def program(percepts):\n", - " '''Returns an action based on it's percepts'''\n", + " '''Returns an action based on the dog's percepts'''\n", " for p in percepts:\n", " if isinstance(p, Food):\n", " return 'eat'\n", @@ -195,28 +282,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Lets now run our simulation by creating a park with some food, water, and our dog." + "Let's now run our simulation by creating a park with some food, water, and our dog." ] }, { "cell_type": "code", - "execution_count": 5, - "metadata": { - "collapsed": false - }, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "BlindDog decided to move down at location: 1\n", - "BlindDog decided to move down at location: 2\n", - "BlindDog decided to move down at location: 3\n", - "BlindDog decided to move down at location: 4\n", - "BlindDog ate Food at location: 5\n" - ] - } - ], + "execution_count": null, + "metadata": {}, + "outputs": [], "source": [ "park = Park()\n", "dog = BlindDog(program)\n", @@ -235,26 +308,14 @@ "source": [ "Notice that the dog moved from location 1 to 4, over 4 steps, and ate food at location 5 in the 5th step.\n", "\n", - "Lets continue this simulation for 5 more steps." + "Let's continue this simulation for 5 more steps." ] }, { "cell_type": "code", - "execution_count": 6, - "metadata": { - "collapsed": false - }, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "BlindDog decided to move down at location: 5\n", - "BlindDog decided to move down at location: 6\n", - "BlindDog drank Water at location: 7\n" - ] - } - ], + "execution_count": null, + "metadata": {}, + "outputs": [], "source": [ "park.run(5)" ] @@ -263,32 +324,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Perfect! 
Note how the simulation stopped after the dog drank the water - exhausting all the food and water ends our simulation, as we had defined before. Lets add some more water and see if our dog can reach it." + "Perfect! Note how the simulation stopped after the dog drank the water - exhausting all the food and water ends our simulation, as we had defined before. Let's add some more water and see if our dog can reach it." ] }, { "cell_type": "code", - "execution_count": 7, - "metadata": { - "collapsed": false - }, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "BlindDog decided to move down at location: 7\n", - "BlindDog decided to move down at location: 8\n", - "BlindDog decided to move down at location: 9\n", - "BlindDog decided to move down at location: 10\n", - "BlindDog decided to move down at location: 11\n", - "BlindDog decided to move down at location: 12\n", - "BlindDog decided to move down at location: 13\n", - "BlindDog decided to move down at location: 14\n", - "BlindDog drank Water at location: 15\n" - ] - } - ], + "execution_count": null, + "metadata": {}, + "outputs": [], "source": [ "park.add_thing(water, 15)\n", "park.run(10)" @@ -298,26 +341,29 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "This is how to implement an agent, its program, and environment. However, this was a very simple case. Lets try a 2-Dimentional environment now with multiple agents.\n", + "Above, we learnt to implement an agent, its program, and an environment on which it acts. However, this was a very simple case. Let's try to add complexity to it by creating a 2-Dimensional environment!\n", + "\n", "\n", + "## AGENTS IN A 2D ENVIRONMENT\n", "\n", - "# 2D Environment #\n", - "To make our Park 2D, we will need to make it a subclass of XYEnvironment instead of Environment. 
Please note that our park is indexed in the 4th quadrant of the X-Y plane.\n", "\n", -    "We will also eventually add a person to pet the dog." +    "To avoid reading through so many logs of what our dog did, let's add a bit of graphics while making our Park 2D. To do so, we will need to make it a subclass of GraphicEnvironment instead of Environment. A park implemented by subclassing the GraphicEnvironment class gains these extra properties:\n", "\n", "  - Our park is indexed in the 4th quadrant of the X-Y plane.\n", "  - Every time we create a park subclassing GraphicEnvironment, we need to define the colors of all the things we plan to put into the park. The colors are defined in typical [RGB digital 8-bit format](https://en.wikipedia.org/wiki/RGB_color_model#Numeric_representations), common across the web.\n", "  - Fences are added automatically to all parks so that our dog does not go outside the park's boundary - it just isn't safe for blind dogs to be outside the park by themselves! GraphicEnvironment provides an `is_inbounds` function to check if our dog tries to leave the park.\n", "  \n", "First, let us try to upgrade our 1-dimensional `Park` environment by just replacing its superclass with `GraphicEnvironment`. 
" ] }, { "cell_type": "code", - "execution_count": 8, - "metadata": { - "collapsed": true - }, + "execution_count": null, + "metadata": {}, "outputs": [], "source": [ - "class Park2D(XYEnvironment):\n", + "class Park2D(GraphicEnvironment):\n", " def percept(self, agent):\n", - " '''prints & return a list of things that are in our agent's location'''\n", + " '''return a list of things that are in our agent's location'''\n", " things = self.list_things_at(agent.location)\n", " return things\n", " \n", @@ -349,8 +395,8 @@ " return dead_agents or no_edibles\n", "\n", "class BlindDog(Agent):\n", - " location = [0,1]# change location to a 2d value\n", - " direction = Direction(\"down\")# variable to store the direction our dog is facing\n", + " location = [0,1] # change location to a 2d value\n", + " direction = Direction(\"down\") # variable to store the direction our dog is facing\n", " \n", " def movedown(self):\n", " self.location[1] += 1\n", @@ -365,58 +411,23 @@ " ''' returns True upon success or False otherwise'''\n", " if isinstance(thing, Water):\n", " return True\n", - " return False\n", - " \n", - "def program(percepts):\n", - " '''Returns an action based on it's percepts'''\n", - " for p in percepts:\n", - " if isinstance(p, Food):\n", - " return 'eat'\n", - " elif isinstance(p, Water):\n", - " return 'drink'\n", - " return 'move down'" + " return False" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Now lets test this new park with our same dog, food and water" + "Now let's test this new park with our same dog, food and water. We color our dog with a nice red and mark food and water with orange and blue respectively." 
] }, { "cell_type": "code", - "execution_count": 9, - "metadata": { - "collapsed": false - }, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "BlindDog decided to move down at location: [0, 1]\n", - "BlindDog decided to move down at location: [0, 2]\n", - "BlindDog decided to move down at location: [0, 3]\n", - "BlindDog decided to move down at location: [0, 4]\n", - "BlindDog ate Food at location: [0, 5]\n", - "BlindDog decided to move down at location: [0, 5]\n", - "BlindDog decided to move down at location: [0, 6]\n", - "BlindDog drank Water at location: [0, 7]\n", - "BlindDog decided to move down at location: [0, 7]\n", - "BlindDog decided to move down at location: [0, 8]\n", - "BlindDog decided to move down at location: [0, 9]\n", - "BlindDog decided to move down at location: [0, 10]\n", - "BlindDog decided to move down at location: [0, 11]\n", - "BlindDog decided to move down at location: [0, 12]\n", - "BlindDog decided to move down at location: [0, 13]\n", - "BlindDog decided to move down at location: [0, 14]\n", - "BlindDog drank Water at location: [0, 15]\n" - ] - } - ], + "execution_count": null, + "metadata": {}, + "outputs": [], "source": [ - "park = Park2D(5,20) # park width is set to 5, and height to 20\n", + "park = Park2D(5,20, color={'BlindDog': (200,0,0), 'Water': (0, 200, 200), 'Food': (230, 115, 40)}) # park width is set to 5, and height to 20\n", "dog = BlindDog(program)\n", "dogfood = Food()\n", "water = Water()\n", @@ -425,6 +436,7 @@ "park.add_thing(water, [0,7])\n", "morewater = Water()\n", "park.add_thing(morewater, [0,15])\n", + "print(\"BlindDog starts at (1,1) facing downwards, lets see if he can find any food!\")\n", "park.run(20)" ] }, @@ -432,11 +444,11 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "This works, but our blind dog doesn't make any use of the 2 dimensional space available to him. 
Let's make our dog more energetic so that he turns and moves forward, instead of always moving down. We'll also need to make appropriate changes to our environment to be able to handle this extra motion.\n", + "Adding some graphics was a good idea! We immediately see that the code works, but our blind dog doesn't make any use of the 2 dimensional space available to him. Let's make our dog more energetic so that he turns and moves forward, instead of always moving down. In doing so, we'll also need to make some changes to our environment to be able to handle this extra motion.\n", "\n", - "# PROGRAM - EnergeticBlindDog #\n", + "### PROGRAM - EnergeticBlindDog\n", "\n", - "Lets make our dog turn or move forwards at random - except when he's at the edge of our park - in which case we make him change his direction explicitly by turning to avoid trying to leave the park. Our dog is blind, however, so he wouldn't know which way to turn - he'd just have to try arbitrarily.\n", + "Let's make our dog turn or move forwards at random - except when he's at the edge of our park - in which case we make him change his direction explicitly by turning to avoid trying to leave the park. However, our dog is blind so he wouldn't know which way to turn - he'd just have to try arbitrarily.\n", "\n", "\n", " \n", @@ -470,24 +482,19 @@ }, { "cell_type": "code", - "execution_count": 10, - "metadata": { - "collapsed": false - }, + "execution_count": null, + "metadata": {}, "outputs": [], "source": [ "from random import choice\n", "\n", - "turn = False# global variable to remember to turn if our dog hits the boundary\n", "class EnergeticBlindDog(Agent):\n", " location = [0,1]\n", " direction = Direction(\"down\")\n", " \n", " def moveforward(self, success=True):\n", - " '''moveforward possible only if success (ie valid destination location)'''\n", - " global turn\n", + " '''moveforward possible only if success (i.e. 
valid destination location)'''\n", "        if not success:\n", -    "            turn = True # if edge has been reached, remember to turn\n", "            return\n", "        if self.direction.direction == Direction.R:\n", "            self.location[0] += 1\n", @@ -504,30 +511,28 @@ "    def eat(self, thing):\n", "        '''returns True upon success or False otherwise'''\n", "        if isinstance(thing, Food):\n", -    "            #print(\"Dog: Ate food at {}.\".format(self.location))\n", "            return True\n", "        return False\n", "    \n", "    def drink(self, thing):\n", "        ''' returns True upon success or False otherwise'''\n", "        if isinstance(thing, Water):\n", -    "            #print(\"Dog: Drank water at {}.\".format(self.location))\n", "            return True\n", "        return False\n", "    \n", "def program(percepts):\n", "    '''Returns an action based on its percepts'''\n", -    "    global turn\n", +    "    \n", "    for p in percepts: # first eat or drink - you're a dog!\n", "        if isinstance(p, Food):\n", "            return 'eat'\n", "        elif isinstance(p, Water):\n", "            return 'drink'\n", -    "    if turn: # then recall if you were at an edge and had to turn\n", -    "        turn = False\n", -    "        choice = random.choice((1,2));\n", -    "    else:\n", -    "        choice = random.choice((1,2,3,4)) # 1-right, 2-left, others-forward\n", +    "    if any(isinstance(p, Bump) for p in percepts): # then check if you are at an edge and have to turn\n", +    "        choice = random.choice((1,2))\n", +    "    else:\n", +    "        choice = random.choice((1,2,3,4)) # 1-right, 2-left, others-forward\n", "    if choice == 1:\n", "        return 'turnright'\n", "    elif choice == 2:\n", @@ -541,141 +546,33 @@ "cell_type": "markdown", "metadata": {}, "source": [ +    "### ENVIRONMENT - Park2D\n", +    "\n", "We also need to modify our park accordingly, in order to be able to handle all the new actions our dog wishes to execute. Additionally, we'll need to prevent our dog from moving to locations beyond our park boundary - it just isn't safe for blind dogs to be outside the park by themselves."
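The percept-to-action policy described above can be sketched stand-alone. Here `Food`, `Water`, and `Bump` are minimal stand-in classes (in the notebook they come from agents.py), and checking the whole percept list with `any(...)` is one way to detect a bump regardless of where it appears among the percepts.

```python
import random

# Stand-in percept classes; in the notebook these come from agents.py.
class Food: pass
class Water: pass
class Bump: pass

def program(percepts):
    '''Eat or drink when possible; otherwise wander, turning if we bumped.'''
    for p in percepts:                      # first eat or drink - you're a dog!
        if isinstance(p, Food):
            return 'eat'
        if isinstance(p, Water):
            return 'drink'
    if any(isinstance(p, Bump) for p in percepts):
        roll = random.choice((1, 2))        # at an edge: must turn
    else:
        roll = random.choice((1, 2, 3, 4))  # 1-right, 2-left, others-forward
    if roll == 1:
        return 'turnright'
    if roll == 2:
        return 'turnleft'
    return 'moveforward'

print(program([Food()]))                               # -> 'eat'
print(program([Bump()]) in ('turnright', 'turnleft'))  # always turns at an edge
```

Because eating and drinking are checked first, a bump percept only influences the action when there is nothing to consume at the current location.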
] }, { "cell_type": "code", - "execution_count": 11, - "metadata": { - "collapsed": true - }, - "outputs": [], - "source": [ - "class Park2D(XYEnvironment):\n", - " def percept(self, agent):\n", - " '''prints & return a list of things that are in our agent's location'''\n", - " things = self.list_things_at(agent.location)\n", - " return things\n", - " \n", - " def execute_action(self, agent, action):\n", - " '''changes the state of the environment based on what the agent does.'''\n", - " if action == 'turnright':\n", - " print('{} decided to {} at location: {}'.format(str(agent)[1:-1], action, agent.location))\n", - " agent.turn(Direction.R)\n", - " #print('now facing {}'.format(agent.direction.direction))\n", - " elif action == 'turnleft':\n", - " print('{} decided to {} at location: {}'.format(str(agent)[1:-1], action, agent.location))\n", - " agent.turn(Direction.L)\n", - " #print('now facing {}'.format(agent.direction.direction))\n", - " elif action == 'moveforward':\n", - " loc = copy.deepcopy(agent.location) # find out the target location\n", - " if agent.direction.direction == Direction.R:\n", - " loc[0] += 1\n", - " elif agent.direction.direction == Direction.L:\n", - " loc[0] -= 1\n", - " elif agent.direction.direction == Direction.D:\n", - " loc[1] += 1\n", - " elif agent.direction.direction == Direction.U:\n", - " loc[1] -= 1\n", - " #print('{} at {} facing {}'.format(agent, loc, agent.direction.direction))\n", - " if self.is_inbounds(loc):# move only if the target is a valid location\n", - " print('{} decided to move {}wards at location: {}'.format(str(agent)[1:-1], agent.direction.direction, agent.location))\n", - " agent.moveforward()\n", - " else:\n", - " print('{} decided to move {}wards at location: {}, but couldnt'.format(str(agent)[1:-1], agent.direction.direction, agent.location))\n", - " agent.moveforward(False)\n", - " elif action == \"eat\":\n", - " items = self.list_things_at(agent.location, tclass=Food)\n", - " if len(items) != 0:\n", - " 
if agent.eat(items[0]):\n", - " print('{} ate {} at location: {}'\n", - " .format(str(agent)[1:-1], str(items[0])[1:-1], agent.location))\n", - " self.delete_thing(items[0])\n", - " elif action == \"drink\":\n", - " items = self.list_things_at(agent.location, tclass=Water)\n", - " if len(items) != 0:\n", - " if agent.drink(items[0]):\n", - " print('{} drank {} at location: {}'\n", - " .format(str(agent)[1:-1], str(items[0])[1:-1], agent.location))\n", - " self.delete_thing(items[0])\n", - " \n", - " def is_done(self):\n", - " '''By default, we're done when we can't find a live agent, \n", - " but to prevent killing our cute dog, we will stop before itself - when there is no more food or water'''\n", - " no_edibles = not any(isinstance(thing, Food) or isinstance(thing, Water) for thing in self.things)\n", - " dead_agents = not any(agent.is_alive() for agent in self.agents)\n", - " return dead_agents or no_edibles\n" - ] - }, - { - "cell_type": "code", - "execution_count": 12, - "metadata": { - "collapsed": false - }, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "dog started at [0,0], facing down. 
Lets see if he found any food or water!\n", - "EnergeticBlindDog decided to move downwards at location: [0, 0]\n", - "EnergeticBlindDog decided to move downwards at location: [0, 1]\n", - "EnergeticBlindDog drank Water at location: [0, 2]\n", - "EnergeticBlindDog decided to turnright at location: [0, 2]\n", - "EnergeticBlindDog decided to move leftwards at location: [0, 2], but couldnt\n", - "EnergeticBlindDog decided to turnright at location: [0, 2]\n", - "EnergeticBlindDog decided to turnright at location: [0, 2]\n", - "EnergeticBlindDog decided to turnleft at location: [0, 2]\n", - "EnergeticBlindDog decided to turnleft at location: [0, 2]\n", - "EnergeticBlindDog decided to move leftwards at location: [0, 2], but couldnt\n", - "EnergeticBlindDog decided to turnleft at location: [0, 2]\n", - "EnergeticBlindDog decided to turnright at location: [0, 2]\n", - "EnergeticBlindDog decided to move leftwards at location: [0, 2], but couldnt\n", - "EnergeticBlindDog decided to turnleft at location: [0, 2]\n", - "EnergeticBlindDog decided to move downwards at location: [0, 2], but couldnt\n", - "EnergeticBlindDog decided to turnright at location: [0, 2]\n", - "EnergeticBlindDog decided to turnleft at location: [0, 2]\n", - "EnergeticBlindDog decided to turnleft at location: [0, 2]\n", - "EnergeticBlindDog decided to move rightwards at location: [0, 2]\n", - "EnergeticBlindDog ate Food at location: [1, 2]\n" - ] - } - ], - "source": [ - "park = Park2D(3,3)\n", - "dog = EnergeticBlindDog(program)\n", - "dogfood = Food()\n", - "water = Water()\n", - "park.add_thing(dog, [0,0])\n", - "park.add_thing(dogfood, [1,2])\n", - "park.add_thing(water, [2,1])\n", - "morewater = Water()\n", - "park.add_thing(morewater, [0,2])\n", - "print('dog started at [0,0], facing down. Lets see if he found any food or water!')\n", - "park.run(20)" - ] - }, - { - "cell_type": "markdown", + "execution_count": null, "metadata": {}, - "source": [ - "This is good, but it still lacks graphics. 
What if we wanted to visualize our park as it changed? To do that, all we have to do is make our park a subclass of GraphicEnvironment instead of XYEnvironment. Lets see how this looks." - ] - }, - { - "cell_type": "code", - "execution_count": 13, - "metadata": { - "collapsed": true - }, "outputs": [], "source": [ - "class GraphicPark(GraphicEnvironment):\n", + "class Park2D(GraphicEnvironment):\n", " def percept(self, agent):\n", - " '''prints & return a list of things that are in our agent's location'''\n", + " '''return a list of things that are in our agent's location'''\n", " things = self.list_things_at(agent.location)\n", + " loc = copy.deepcopy(agent.location) # find out the target location\n", + " #Check if agent is about to bump into a wall\n", + " if agent.direction.direction == Direction.R:\n", + " loc[0] += 1\n", + " elif agent.direction.direction == Direction.L:\n", + " loc[0] -= 1\n", + " elif agent.direction.direction == Direction.D:\n", + " loc[1] += 1\n", + " elif agent.direction.direction == Direction.U:\n", + " loc[1] -= 1\n", + " if not self.is_inbounds(loc):\n", + " things.append(Bump())\n", " return things\n", " \n", " def execute_action(self, agent, action):\n", @@ -683,28 +580,12 @@ " if action == 'turnright':\n", " print('{} decided to {} at location: {}'.format(str(agent)[1:-1], action, agent.location))\n", " agent.turn(Direction.R)\n", - " #print('now facing {}'.format(agent.direction.direction))\n", " elif action == 'turnleft':\n", " print('{} decided to {} at location: {}'.format(str(agent)[1:-1], action, agent.location))\n", " agent.turn(Direction.L)\n", - " #print('now facing {}'.format(agent.direction.direction))\n", " elif action == 'moveforward':\n", - " loc = copy.deepcopy(agent.location) # find out the target location\n", - " if agent.direction.direction == Direction.R:\n", - " loc[0] += 1\n", - " elif agent.direction.direction == Direction.L:\n", - " loc[0] -= 1\n", - " elif agent.direction.direction == Direction.D:\n", - " 
loc[1] += 1\n", - " elif agent.direction.direction == Direction.U:\n", - " loc[1] -= 1\n", - " #print('{} at {} facing {}'.format(agent, loc, agent.direction.direction))\n", - " if self.is_inbounds(loc):# move only if the target is a valid location\n", - " print('{} decided to move {}wards at location: {}'.format(str(agent)[1:-1], agent.direction.direction, agent.location))\n", - " agent.moveforward()\n", - " else:\n", - " print('{} decided to move {}wards at location: {}, but couldnt'.format(str(agent)[1:-1], agent.direction.direction, agent.location))\n", - " agent.moveforward(False)\n", + " print('{} decided to move {}wards at location: {}'.format(str(agent)[1:-1], agent.direction.direction, agent.location))\n", + " agent.moveforward()\n", " elif action == \"eat\":\n", " items = self.list_things_at(agent.location, tclass=Food)\n", " if len(items) != 0:\n", @@ -732,419 +613,16 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "That is the only change we make. The rest of our code stays the same. There is a slight difference in usage though. Every time we create a GraphicPark, we need to define the colors of all the things we plan to put into the park. The colors are defined in typical [RGB digital 8-bit format](https://en.wikipedia.org/wiki/RGB_color_model#Numeric_representations), common across the web." + "Now that our park is ready for the 2D motion of our energetic dog, lets test it!" ] }, { "cell_type": "code", - "execution_count": 19, - "metadata": { - "collapsed": false, - "scrolled": true - }, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "dog started at [0,0], facing down. Lets see if he found any food or water!\n" - ] - }, - { - "data": { - "text/html": [ - "
    " - ], - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "EnergeticBlindDog decided to move downwards at location: [0, 0]\n" - ] - }, - { - "data": { - "text/html": [ - "
    " - ], - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "EnergeticBlindDog drank Water at location: [0, 1]\n" - ] - }, - { - "data": { - "text/html": [ - "
    " - ], - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "EnergeticBlindDog decided to turnleft at location: [0, 1]\n" - ] - }, - { - "data": { - "text/html": [ - "
    " - ], - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "EnergeticBlindDog decided to turnright at location: [0, 1]\n" - ] - }, - { - "data": { - "text/html": [ - "
    " - ], - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "EnergeticBlindDog decided to move downwards at location: [0, 1]\n" - ] - }, - { - "data": { - "text/html": [ - "
    " - ], - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "EnergeticBlindDog decided to turnleft at location: [0, 2]\n" - ] - }, - { - "data": { - "text/html": [ - "
    " - ], - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "EnergeticBlindDog decided to move rightwards at location: [0, 2]\n" - ] - }, - { - "data": { - "text/html": [ - "
    " - ], - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "EnergeticBlindDog ate Food at location: [1, 2]\n" - ] - }, - { - "data": { - "text/html": [ - "
    " - ], - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "EnergeticBlindDog decided to turnleft at location: [1, 2]\n" - ] - }, - { - "data": { - "text/html": [ - "
    " - ], - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "EnergeticBlindDog decided to turnleft at location: [1, 2]\n" - ] - }, - { - "data": { - "text/html": [ - "
    " - ], - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "EnergeticBlindDog decided to turnleft at location: [1, 2]\n" - ] - }, - { - "data": { - "text/html": [ - "
    " - ], - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "EnergeticBlindDog decided to move downwards at location: [1, 2]\n" - ] - }, - { - "data": { - "text/html": [ - "
    " - ], - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "EnergeticBlindDog decided to turnright at location: [1, 3]\n" - ] - }, - { - "data": { - "text/html": [ - "
    " - ], - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "EnergeticBlindDog decided to move leftwards at location: [1, 3]\n" - ] - }, - { - "data": { - "text/html": [ - "
    " - ], - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "EnergeticBlindDog decided to move leftwards at location: [0, 3], but couldnt\n" - ] - }, - { - "data": { - "text/html": [ - "
    " - ], - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "EnergeticBlindDog decided to turnleft at location: [0, 3]\n" - ] - }, - { - "data": { - "text/html": [ - "
    " - ], - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "EnergeticBlindDog decided to turnright at location: [0, 3]\n" - ] - }, - { - "data": { - "text/html": [ - "
    " - ], - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "EnergeticBlindDog decided to move leftwards at location: [0, 3], but couldnt\n" - ] - }, - { - "data": { - "text/html": [ - "
    " - ], - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "EnergeticBlindDog decided to turnright at location: [0, 3]\n" - ] - }, - { - "data": { - "text/html": [ - "
    " - ], - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "EnergeticBlindDog decided to move upwards at location: [0, 3]\n" - ] - }, - { - "data": { - "text/html": [ - "
    " - ], - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], + "execution_count": null, + "metadata": {}, + "outputs": [], "source": [ - "park = GraphicPark(5,5, color={'EnergeticBlindDog': (200,0,0), 'Water': (0, 200, 200), 'Food': (230, 115, 40)})\n", + "park = Park2D(5,5, color={'EnergeticBlindDog': (200,0,0), 'Water': (0, 200, 200), 'Food': (230, 115, 40)})\n", "dog = EnergeticBlindDog(program)\n", "dogfood = Food()\n", "water = Water()\n", @@ -1155,7 +633,7 @@ "morefood = Food()\n", "park.add_thing(morewater, [2,4])\n", "park.add_thing(morefood, [4,3])\n", - "print('dog started at [0,0], facing down. Lets see if he found any food or water!')\n", + "print(\"dog started at [0,0], facing down. Let's see if he found any food or water!\")\n", "park.run(20)" ] }, @@ -1176,10 +654,8 @@ }, { "cell_type": "code", - "execution_count": 4, - "metadata": { - "collapsed": false - }, + "execution_count": null, + "metadata": {}, "outputs": [], "source": [ "from ipythonblocks import BlockGrid\n", @@ -1220,32 +696,9 @@ }, { "cell_type": "code", - "execution_count": 5, - "metadata": { - "collapsed": false - }, - "outputs": [ - { - "data": { - "text/html": [ - "
    " - ], - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "[[], [], [], [], [, None]]\n", - "Forward\n" - ] - } - ], + "execution_count": null, + "metadata": {}, + "outputs": [], "source": [ "step()" ] @@ -1253,9 +706,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [] } @@ -1276,9 +727,9 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.4.3" + "version": "3.6.4" } }, "nbformat": 4, - "nbformat_minor": 0 + "nbformat_minor": 1 } diff --git a/agents.py b/agents.py index db93ca795..d29b0c382 100644 --- a/agents.py +++ b/agents.py @@ -1,4 +1,5 @@ -"""Implement Agents and Environments (Chapters 1-2). +""" +Implement Agents and Environments. (Chapters 1-2) The class hierarchies are as follows: @@ -23,24 +24,21 @@ EnvToolbar ## contains buttons for controlling EnvGUI EnvCanvas ## Canvas to display the environment of an EnvGUI - """ -# TO DO: -# Implement grabbing correctly. -# When an object is grabbed, does it still have a location? -# What if it is released? -# What if the grabbed or the grabber is deleted? -# What if the grabber moves? -# +# TODO # Speed control in GUI does not have any effect -- fix it. from utils import distance_squared, turn_heading from statistics import mean +from ipythonblocks import BlockGrid +from IPython.display import HTML, display, clear_output +from time import sleep import random import copy import collections +import numbers # ______________________________________________________________________________ @@ -69,26 +67,25 @@ def display(self, canvas, x, y, width, height): class Agent(Thing): - """An Agent is a subclass of Thing with one required slot, - .program, which should hold a function that takes one argument, the - percept, and returns an action. 
(What counts as a percept or action + """An Agent is a subclass of Thing with one required instance attribute + (aka slot), .program, which should hold a function that takes one argument, + the percept, and returns an action. (What counts as a percept or action will depend on the specific environment in which the agent exists.) - Note that 'program' is a slot, not a method. If it were a method, - then the program could 'cheat' and look at aspects of the agent. - It's not supposed to do that: the program can only look at the - percepts. An agent program that needs a model of the world (and of - the agent itself) will have to build and maintain its own model. - There is an optional slot, .performance, which is a number giving - the performance measure of the agent in its environment.""" + Note that 'program' is a slot, not a method. If it were a method, then the + program could 'cheat' and look at aspects of the agent. It's not supposed + to do that: the program can only look at the percepts. An agent program + that needs a model of the world (and of the agent itself) will have to + build and maintain its own model. There is an optional slot, .performance, + which is a number giving the performance measure of the agent in its + environment.""" def __init__(self, program=None): self.alive = True self.bump = False self.holding = [] self.performance = 0 - if program is None or not isinstance(program, collections.Callable): - print("Can't find a valid program for {}, falling back to default.".format( - self.__class__.__name__)) + if program is None or not isinstance(program, collections.abc.Callable): + print("Can't find a valid program for {}, falling back to default.".format(self.__class__.__name__)) def program(percept): return eval(input('Percept={}; action? '.format(percept))) @@ -96,7 +93,7 @@ def program(percept): self.program = program def can_grab(self, thing): - """Returns True if this agent can grab this thing. 
+ """Return True if this agent can grab this thing. Override for appropriate subclasses of Agent and Thing.""" return False @@ -110,50 +107,76 @@ def new_program(percept): action = old_program(percept) print('{} perceives {} and does {}'.format(agent, percept, action)) return action + agent.program = new_program return agent + # ______________________________________________________________________________ def TableDrivenAgentProgram(table): - """This agent selects an action based on the percept sequence. + """ + [Figure 2.7] + This agent selects an action based on the percept sequence. It is practical only for tiny domains. To customize it, provide as table a dictionary of all - {percept_sequence:action} pairs. [Figure 2.7]""" + {percept_sequence:action} pairs. + """ percepts = [] def program(percept): percepts.append(percept) action = table.get(tuple(percepts)) return action + return program def RandomAgentProgram(actions): - """An agent that chooses an action at random, ignoring all percepts.""" + """An agent that chooses an action at random, ignoring all percepts. + >>> list = ['Right', 'Left', 'Suck', 'NoOp'] + >>> program = RandomAgentProgram(list) + >>> agent = Agent(program) + >>> environment = TrivialVacuumEnvironment() + >>> environment.add_thing(agent) + >>> environment.run() + >>> environment.status == {(1, 0): 'Clean' , (0, 0): 'Clean'} + True + """ return lambda percept: random.choice(actions) + # ______________________________________________________________________________ def SimpleReflexAgentProgram(rules, interpret_input): - """This agent takes action based solely on the percept. [Figure 2.10]""" + """ + [Figure 2.10] + This agent takes action based solely on the percept. + """ + def program(percept): state = interpret_input(percept) rule = rule_match(state, rules) action = rule.action return action + return program def ModelBasedReflexAgentProgram(rules, update_state, model): - """This agent takes action based on the percept and state. 
[Figure 2.12]""" + """ + [Figure 2.12] + This agent takes action based on the percept and state. + """ + def program(percept): program.state = update_state(program.state, program.action, percept, model) rule = rule_match(program.state, rules) action = rule.action return action + program.state = program.action = None return program @@ -164,6 +187,7 @@ def rule_match(state, rules): if rule.matches(state): return rule + # ______________________________________________________________________________ @@ -171,28 +195,51 @@ def rule_match(state, rules): def RandomVacuumAgent(): - """Randomly choose one of the actions from the vacuum environment.""" + """Randomly choose one of the actions from the vacuum environment. + >>> agent = RandomVacuumAgent() + >>> environment = TrivialVacuumEnvironment() + >>> environment.add_thing(agent) + >>> environment.run() + >>> environment.status == {(1,0):'Clean' , (0,0) : 'Clean'} + True + """ return Agent(RandomAgentProgram(['Right', 'Left', 'Suck', 'NoOp'])) def TableDrivenVacuumAgent(): - """[Figure 2.3]""" + """Tabular approach towards vacuum world as mentioned in [Figure 2.3] + >>> agent = TableDrivenVacuumAgent() + >>> environment = TrivialVacuumEnvironment() + >>> environment.add_thing(agent) + >>> environment.run() + >>> environment.status == {(1,0):'Clean' , (0,0) : 'Clean'} + True + """ table = {((loc_A, 'Clean'),): 'Right', ((loc_A, 'Dirty'),): 'Suck', ((loc_B, 'Clean'),): 'Left', ((loc_B, 'Dirty'),): 'Suck', - ((loc_A, 'Clean'), (loc_A, 'Clean')): 'Right', - ((loc_A, 'Clean'), (loc_A, 'Dirty')): 'Suck', - # ... - ((loc_A, 'Clean'), (loc_A, 'Clean'), (loc_A, 'Clean')): 'Right', - ((loc_A, 'Clean'), (loc_A, 'Clean'), (loc_A, 'Dirty')): 'Suck', - # ... 
- } + ((loc_A, 'Dirty'), (loc_A, 'Clean')): 'Right', + ((loc_A, 'Clean'), (loc_B, 'Dirty')): 'Suck', + ((loc_B, 'Clean'), (loc_A, 'Dirty')): 'Suck', + ((loc_B, 'Dirty'), (loc_B, 'Clean')): 'Left', + ((loc_A, 'Dirty'), (loc_A, 'Clean'), (loc_B, 'Dirty')): 'Suck', + ((loc_B, 'Dirty'), (loc_B, 'Clean'), (loc_A, 'Dirty')): 'Suck'} return Agent(TableDrivenAgentProgram(table)) def ReflexVacuumAgent(): - """A reflex agent for the two-state vacuum environment. [Figure 2.8]""" + """ + [Figure 2.8] + A reflex agent for the two-state vacuum environment. + >>> agent = ReflexVacuumAgent() + >>> environment = TrivialVacuumEnvironment() + >>> environment.add_thing(agent) + >>> environment.run() + >>> environment.status == {(1,0):'Clean' , (0,0) : 'Clean'} + True + """ + def program(percept): location, status = percept if status == 'Dirty': @@ -201,11 +248,19 @@ def program(percept): return 'Right' elif location == loc_B: return 'Left' + return Agent(program) def ModelBasedVacuumAgent(): - """An agent that keeps track of what locations are clean or dirty.""" + """An agent that keeps track of what locations are clean or dirty. 
+ >>> agent = ModelBasedVacuumAgent() + >>> environment = TrivialVacuumEnvironment() + >>> environment.add_thing(agent) + >>> environment.run() + >>> environment.status == {(1,0):'Clean' , (0,0) : 'Clean'} + True + """ model = {loc_A: None, loc_B: None} def program(percept): @@ -220,8 +275,10 @@ def program(percept): return 'Right' elif location == loc_B: return 'Left' + return Agent(program) + # ______________________________________________________________________________ @@ -288,8 +345,11 @@ def run(self, steps=1000): def list_things_at(self, location, tclass=Thing): """Return all things exactly at a given location.""" + if isinstance(location, numbers.Number): + return [thing for thing in self.things + if thing.location == location and isinstance(thing, tclass)] return [thing for thing in self.things - if thing.location == location and isinstance(thing, tclass)] + if all(x == y for x, y in zip(thing.location, location)) and isinstance(thing, tclass)] def some_things_at(self, location, tclass=Thing): """Return true if at least one of the things at location @@ -299,7 +359,7 @@ def some_things_at(self, location, tclass=Thing): def add_thing(self, thing, location=None): """Add a thing to the environment, setting its location. For convenience, if thing is an agent program we make a new agent - for it. (Shouldn't need to override this.""" + for it. 
(Shouldn't need to override this.)""" if not isinstance(thing, Thing): thing = Agent(thing) if thing in self.things: @@ -342,37 +402,65 @@ def __init__(self, direction): self.direction = direction def __add__(self, heading): + """ + >>> d = Direction('right') + >>> l1 = d.__add__(Direction.L) + >>> l2 = d.__add__(Direction.R) + >>> l1.direction + 'up' + >>> l2.direction + 'down' + >>> d = Direction('down') + >>> l1 = d.__add__('right') + >>> l2 = d.__add__('left') + >>> l1.direction == Direction.L + True + >>> l2.direction == Direction.R + True + """ if self.direction == self.R: - return{ + return { self.R: Direction(self.D), self.L: Direction(self.U), }.get(heading, None) elif self.direction == self.L: - return{ + return { self.R: Direction(self.U), self.L: Direction(self.D), }.get(heading, None) elif self.direction == self.U: - return{ + return { self.R: Direction(self.R), self.L: Direction(self.L), }.get(heading, None) elif self.direction == self.D: - return{ + return { self.R: Direction(self.L), self.L: Direction(self.R), }.get(heading, None) def move_forward(self, from_location): + """ + >>> d = Direction('up') + >>> l1 = d.move_forward((0, 0)) + >>> l1 + (0, -1) + >>> d = Direction(Direction.R) + >>> l1 = d.move_forward((0, 0)) + >>> l1 + (1, 0) + """ + # get the iterable class to return + iclass = from_location.__class__ x, y = from_location if self.direction == self.R: - return (x + 1, y) + return iclass((x + 1, y)) elif self.direction == self.L: - return (x - 1, y) + return iclass((x - 1, y)) elif self.direction == self.U: - return (x, y - 1) + return iclass((x, y - 1)) elif self.direction == self.D: - return (x, y + 1) + return iclass((x, y + 1)) class XYEnvironment(Environment): @@ -403,7 +491,7 @@ def things_near(self, location, radius=None): radius2 = radius * radius return [(thing, radius2 - distance_squared(location, thing.location)) for thing in self.things if distance_squared( - location, thing.location) <= radius2] + location, thing.location) <= 
radius2] def percept(self, agent): """By default, agent perceives things within a default radius.""" @@ -417,17 +505,24 @@ def execute_action(self, agent, action): agent.direction += Direction.L elif action == 'Forward': agent.bump = self.move_to(agent, agent.direction.move_forward(agent.location)) -# elif action == 'Grab': -# things = [thing for thing in self.list_things_at(agent.location) -# if agent.can_grab(thing)] -# if things: -# agent.holding.append(things[0]) + elif action == 'Grab': + things = [thing for thing in self.list_things_at(agent.location) if agent.can_grab(thing)] + if things: + agent.holding.append(things[0]) + print("Grabbing ", things[0].__class__.__name__) + self.delete_thing(things[0]) elif action == 'Release': if agent.holding: - agent.holding.pop() + dropped = agent.holding.pop() + print("Dropping ", dropped.__class__.__name__) + self.add_thing(dropped, location=agent.location) def default_location(self, thing): - return (random.choice(self.width), random.choice(self.height)) + location = self.random_location_inbounds() + while self.some_things_at(location, Obstacle): + # we will find a random location with no obstacles + location = self.random_location_inbounds() + return location def move_to(self, thing, destination): """Move a thing to a new location. Returns True on success or False if there is an Obstacle. @@ -443,10 +538,12 @@ def move_to(self, thing, destination): t.location = destination return thing.bump - def add_thing(self, thing, location=(1, 1), exclude_duplicate_class_items=False): - """Adds things to the world. If (exclude_duplicate_class_items) then the item won't be + def add_thing(self, thing, location=None, exclude_duplicate_class_items=False): + """Add things to the world. 
If (exclude_duplicate_class_items) then the item won't be added if the location has at least one item of the same class.""" - if (self.is_inbounds(location)): + if location is None: + super().add_thing(thing) + elif self.is_inbounds(location): if (exclude_duplicate_class_items and any(isinstance(t, thing.__class__) for t in self.list_things_at(location))): return @@ -455,14 +552,14 @@ def add_thing(self, thing, location=(1, 1), exclude_duplicate_class_items=False) def is_inbounds(self, location): """Checks to make sure that the location is inbounds (within walls if we have walls)""" x, y = location - return not (x < self.x_start or x >= self.x_end or y < self.y_start or y >= self.y_end) + return not (x < self.x_start or x > self.x_end or y < self.y_start or y > self.y_end) def random_location_inbounds(self, exclude=None): """Returns a random location that is inbounds (within walls if we have walls)""" location = (random.randint(self.x_start, self.x_end), random.randint(self.y_start, self.y_end)) if exclude is not None: - while(location == exclude): + while location == exclude: location = (random.randint(self.x_start, self.x_end), random.randint(self.y_start, self.y_end)) return location @@ -470,10 +567,7 @@ def random_location_inbounds(self, exclude=None): def delete_thing(self, thing): """Deletes thing, and everything it is holding (if thing is an agent)""" if isinstance(thing, Agent): - for obj in thing.holding: - super().delete_thing(obj) - for obs in self.observers: - obs.thing_deleted(obj) + del thing.holding super().delete_thing(thing) for obs in self.observers: @@ -484,7 +578,7 @@ def add_walls(self): for x in range(self.width): self.add_thing(Wall(), (x, 0)) self.add_thing(Wall(), (x, self.height - 1)) - for y in range(self.height): + for y in range(1, self.height - 1): self.add_thing(Wall(), (0, y)) self.add_thing(Wall(), (self.width - 1, y)) @@ -515,15 +609,8 @@ class Obstacle(Thing): class Wall(Obstacle): pass -# 
______________________________________________________________________________ - -try: - from ipythonblocks import BlockGrid - from IPython.display import HTML, display - from time import sleep -except: - pass +# ______________________________________________________________________________ class GraphicEnvironment(XYEnvironment): @@ -549,7 +636,7 @@ def get_world(self): for x in range(x_start, x_end): row = [] for y in range(y_start, y_end): - row.append(self.list_things_at([x, y])) + row.append(self.list_things_at((x, y))) result.append(row) return result @@ -582,16 +669,16 @@ def run(self, steps=1000, delay=1): def update(self, delay=1): sleep(delay) - if self.visible: - self.conceal() - self.reveal() - else: - self.reveal() + self.reveal() def reveal(self): """Display the BlockGrid for this world - the last thing to be added at a location defines the location color.""" self.draw_world() + # wait for the world to update and + # apply changes to the same grid instead + # of making a new one. + clear_output(1) self.grid.show() self.visible = True @@ -631,6 +718,7 @@ def __init__(self, coordinates): super().__init__() self.coordinates = coordinates + # ______________________________________________________________________________ # Vacuum environment @@ -640,7 +728,6 @@ class Dirt(Thing): class VacuumEnvironment(XYEnvironment): - """The environment of [Ex. 2.12]. 
Agent perceives dirty or clean, and bump (into obstacle) or not; 2D discrete world of unknown size; performance measure is 100 for each dirt cleaned, and -1 for @@ -659,10 +746,11 @@ def percept(self, agent): Unlike the TrivialVacuumEnvironment, location is NOT perceived.""" status = ('Dirty' if self.some_things_at( agent.location, Dirt) else 'Clean') - bump = ('Bump' if agent.bump else'None') - return (status, bump) + bump = ('Bump' if agent.bump else 'None') + return status, bump def execute_action(self, agent, action): + agent.bump = False if action == 'Suck': dirt_list = self.list_things_at(agent.location, Dirt) if dirt_list != []: @@ -677,7 +765,6 @@ def execute_action(self, agent, action): class TrivialVacuumEnvironment(Environment): - """This environment has two locations, A and B. Each can be Dirty or Clean. The agent perceives its location and the location's status. This serves as an example of how to implement a simple @@ -689,12 +776,11 @@ def __init__(self): loc_B: random.choice(['Clean', 'Dirty'])} def thing_classes(self): - return [Wall, Dirt, ReflexVacuumAgent, RandomVacuumAgent, - TableDrivenVacuumAgent, ModelBasedVacuumAgent] + return [Wall, Dirt, ReflexVacuumAgent, RandomVacuumAgent, TableDrivenVacuumAgent, ModelBasedVacuumAgent] def percept(self, agent): """Returns the agent's location, and the location status (Dirty/Clean).""" - return (agent.location, self.status[agent.location]) + return agent.location, self.status[agent.location] def execute_action(self, agent, action): """Change agent's location and/or location's status; track performance. 
@@ -714,6 +800,7 @@ def default_location(self, thing): """Agents start in either location at random.""" return random.choice([loc_A, loc_B]) + # ______________________________________________________________________________ # The Wumpus World @@ -723,6 +810,7 @@ class Gold(Thing): def __eq__(self, rhs): """All Gold are equal""" return rhs.__class__ == Gold + pass @@ -772,6 +860,7 @@ def can_grab(self, thing): class WumpusEnvironment(XYEnvironment): pit_probability = 0.2 # Probability to spawn a pit in a location. (From Chapter 7.2) + # Room should be 4x4 grid of rooms. The extra 2 for walls def __init__(self, agent_program, width=6, height=6): @@ -809,7 +898,7 @@ def init_world(self, program): self.add_thing(Explorer(program), (1, 1), True) def get_world(self, show_walls=True): - """Returns the items in the world""" + """Return the items in the world""" result = [] x_start, y_start = (0, 0) if show_walls else (1, 1) @@ -826,7 +915,7 @@ def get_world(self, show_walls=True): return result def percepts_from(self, agent, location, tclass=Thing): - """Returns percepts from a given location, + """Return percepts from a given location, and replaces some items with percepts from chapter 7.""" thing_percepts = { Gold: Glitter(), @@ -846,7 +935,7 @@ def percepts_from(self, agent, location, tclass=Thing): return result if len(result) else [None] def percept(self, agent): - """Returns things in adjacent (not diagonal) cells of the agent. + """Return things in adjacent (not diagonal) cells of the agent. 
Result format: [Left, Right, Up, Down, Center / Current location]""" x, y = agent.location result = [] @@ -870,24 +959,10 @@ def execute_action(self, agent, action): if isinstance(agent, Explorer) and self.in_danger(agent): return - + agent.bump = False - if action == 'TurnRight': - agent.direction += Direction.R - agent.performance -= 1 - elif action == 'TurnLeft': - agent.direction += Direction.L - agent.performance -= 1 - elif action == 'Forward': - agent.bump = self.move_to(agent, agent.direction.move_forward(agent.location)) - agent.performance -= 1 - elif action == 'Grab': - things = [thing for thing in self.list_things_at(agent.location) - if agent.can_grab(thing)] - if len(things): - print("Grabbing", things[0].__class__.__name__) - if len(things): - agent.holding.append(things[0]) + if action in ['TurnRight', 'TurnLeft', 'Forward', 'Grab']: + super().execute_action(agent, action) agent.performance -= 1 elif action == 'Climb': if agent.location == (1, 1): # Agent can only climb out of (1,1) @@ -897,7 +972,7 @@ def execute_action(self, agent, action): """The arrow travels straight down the path the agent is facing""" if agent.has_arrow: arrow_travel = agent.direction.move_forward(agent.location) - while(self.is_inbounds(arrow_travel)): + while self.is_inbounds(arrow_travel): wumpus = [thing for thing in self.list_things_at(arrow_travel) if isinstance(thing, Wumpus)] if len(wumpus): @@ -907,7 +982,7 @@ def execute_action(self, agent, action): agent.has_arrow = False def in_danger(self, agent): - """Checks if Explorer is in danger (Pit or Wumpus), if he is, kill him""" + """Check if Explorer is in danger (Pit or Wumpus), if he is, kill him""" for thing in self.list_things_at(agent.location): if isinstance(thing, Pit) or (isinstance(thing, Wumpus) and thing.alive): agent.alive = False @@ -927,12 +1002,12 @@ def is_done(self): print("Death by {} [-1000].".format(explorer[0].killed_by)) else: print("Explorer climbed out {}." - .format( - "with Gold [+1000]!" 
if Gold() not in self.things else "without Gold [+0]")) + .format("with Gold [+1000]!" if Gold() not in self.things else "without Gold [+0]")) return True - # TODO: Arrow needs to be implemented + + # ______________________________________________________________________________ @@ -940,21 +1015,40 @@ def compare_agents(EnvFactory, AgentFactories, n=10, steps=1000): """See how well each of several agents do in n instances of an environment. Pass in a factory (constructor) for environments, and several for agents. Create n instances of the environment, and run each agent in copies of - each one for steps. Return a list of (agent, average-score) tuples.""" + each one for steps. Return a list of (agent, average-score) tuples. + >>> environment = TrivialVacuumEnvironment + >>> agents = [ModelBasedVacuumAgent, ReflexVacuumAgent] + >>> result = compare_agents(environment, agents) + >>> performance_ModelBasedVacuumAgent = result[0][1] + >>> performance_ReflexVacuumAgent = result[1][1] + >>> performance_ReflexVacuumAgent <= performance_ModelBasedVacuumAgent + True + """ envs = [EnvFactory() for i in range(n)] return [(A, test_agent(A, steps, copy.deepcopy(envs))) for A in AgentFactories] def test_agent(AgentFactory, steps, envs): - """Return the mean score of running an agent in each of the envs, for steps""" + """Return the mean score of running an agent in each of the envs, for steps + >>> def constant_prog(percept): + ... return percept + ... + >>> agent = Agent(constant_prog) + >>> result = agent.program(5) + >>> result == 5 + True + """ + def score(env): agent = AgentFactory() env.add_thing(agent) env.run(steps) return agent.performance + return mean(map(score, envs)) + # _________________________________________________________________________ diff --git a/agents4e.py b/agents4e.py new file mode 100644 index 000000000..75369a69a --- /dev/null +++ b/agents4e.py @@ -0,0 +1,1089 @@ +""" +Implement Agents and Environments. 
(Chapters 1-2) + +The class hierarchies are as follows: + +Thing ## A physical object that can exist in an environment + Agent + Wumpus + Dirt + Wall + ... + +Environment ## An environment holds objects, runs simulations + XYEnvironment + VacuumEnvironment + WumpusEnvironment + +An agent program is a callable instance, taking percepts and choosing actions + SimpleReflexAgentProgram + ... + +EnvGUI ## A window with a graphical representation of the Environment + +EnvToolbar ## contains buttons for controlling EnvGUI + +EnvCanvas ## Canvas to display the environment of an EnvGUI +""" + +# TODO +# Implement grabbing correctly. +# When an object is grabbed, does it still have a location? +# What if it is released? +# What if the grabbed or the grabber is deleted? +# What if the grabber moves? +# Speed control in GUI does not have any effect -- fix it. + +from utils4e import distance_squared, turn_heading +from statistics import mean +from ipythonblocks import BlockGrid +from IPython.display import HTML, display, clear_output +from time import sleep + +import random +import copy +import collections +import numbers + + +# ______________________________________________________________________________ + + +class Thing: + """This represents any physical object that can appear in an Environment. + You subclass Thing to get the things you want. Each thing can have a + .__name__ slot (used for output only).""" + + def __repr__(self): + return '<{}>'.format(getattr(self, '__name__', self.__class__.__name__)) + + def is_alive(self): + """Things that are 'alive' should return true.""" + return hasattr(self, 'alive') and self.alive + + def show_state(self): + """Display the agent's internal state. Subclasses should override.""" + print("I don't know how to show_state.") + + def display(self, canvas, x, y, width, height): + """Display an image of this Thing on the canvas.""" + # Do we need this? 
+ pass + + +class Agent(Thing): + """An Agent is a subclass of Thing with one required slot, + .program, which should hold a function that takes one argument, the + percept, and returns an action. (What counts as a percept or action + will depend on the specific environment in which the agent exists.) + Note that 'program' is a slot, not a method. If it were a method, + then the program could 'cheat' and look at aspects of the agent. + It's not supposed to do that: the program can only look at the + percepts. An agent program that needs a model of the world (and of + the agent itself) will have to build and maintain its own model. + There is an optional slot, .performance, which is a number giving + the performance measure of the agent in its environment.""" + + def __init__(self, program=None): + self.alive = True + self.bump = False + self.holding = [] + self.performance = 0 + if program is None or not isinstance(program, collections.abc.Callable): + print("Can't find a valid program for {}, falling back to default.".format(self.__class__.__name__)) + + def program(percept): + return eval(input('Percept={}; action? '.format(percept))) + + self.program = program + + def can_grab(self, thing): + """Return True if this agent can grab this thing. + Override for appropriate subclasses of Agent and Thing.""" + return False + + +def TraceAgent(agent): + """Wrap the agent's program to print its input and output. This will let + you see what the agent is doing in the environment.""" + old_program = agent.program + + def new_program(percept): + action = old_program(percept) + print('{} perceives {} and does {}'.format(agent, percept, action)) + return action + + agent.program = new_program + return agent + + +# ______________________________________________________________________________ + + +def TableDrivenAgentProgram(table): + """ + [Figure 2.7] + This agent selects an action based on the percept sequence. + It is practical only for tiny domains. 
+ To customize it, provide as table a dictionary of all + {percept_sequence:action} pairs. + """ + percepts = [] + + def program(percept): + percepts.append(percept) + action = table.get(tuple(percepts)) + return action + + return program + + +def RandomAgentProgram(actions): + """An agent that chooses an action at random, ignoring all percepts. + >>> list = ['Right', 'Left', 'Suck', 'NoOp'] + >>> program = RandomAgentProgram(list) + >>> agent = Agent(program) + >>> environment = TrivialVacuumEnvironment() + >>> environment.add_thing(agent) + >>> environment.run() + >>> environment.status == {(1, 0): 'Clean' , (0, 0): 'Clean'} + True + """ + return lambda percept: random.choice(actions) + + +# ______________________________________________________________________________ + + +def SimpleReflexAgentProgram(rules, interpret_input): + """ + [Figure 2.10] + This agent takes action based solely on the percept. + """ + + def program(percept): + state = interpret_input(percept) + rule = rule_match(state, rules) + action = rule.action + return action + + return program + + +def ModelBasedReflexAgentProgram(rules, update_state, transition_model, sensor_model): + """ + [Figure 2.12] + This agent takes action based on the percept and state. + """ + + def program(percept): + program.state = update_state(program.state, program.action, percept, transition_model, sensor_model) + rule = rule_match(program.state, rules) + action = rule.action + return action + + program.state = program.action = None + return program + + +def rule_match(state, rules): + """Find the first rule that matches state.""" + for rule in rules: + if rule.matches(state): + return rule + + +# ______________________________________________________________________________ + + +loc_A, loc_B = (0, 0), (1, 0) # The two locations for the Vacuum world + + +def RandomVacuumAgent(): + """Randomly choose one of the actions from the vacuum environment. 
+ >>> agent = RandomVacuumAgent() + >>> environment = TrivialVacuumEnvironment() + >>> environment.add_thing(agent) + >>> environment.run() + >>> environment.status == {(1,0):'Clean' , (0,0) : 'Clean'} + True + """ + return Agent(RandomAgentProgram(['Right', 'Left', 'Suck', 'NoOp'])) + + +def TableDrivenVacuumAgent(): + """Tabular approach towards vacuum world as mentioned in [Figure 2.3] + >>> agent = TableDrivenVacuumAgent() + >>> environment = TrivialVacuumEnvironment() + >>> environment.add_thing(agent) + >>> environment.run() + >>> environment.status == {(1,0):'Clean' , (0,0) : 'Clean'} + True + """ + table = {((loc_A, 'Clean'),): 'Right', + ((loc_A, 'Dirty'),): 'Suck', + ((loc_B, 'Clean'),): 'Left', + ((loc_B, 'Dirty'),): 'Suck', + ((loc_A, 'Dirty'), (loc_A, 'Clean')): 'Right', + ((loc_A, 'Clean'), (loc_B, 'Dirty')): 'Suck', + ((loc_B, 'Clean'), (loc_A, 'Dirty')): 'Suck', + ((loc_B, 'Dirty'), (loc_B, 'Clean')): 'Left', + ((loc_A, 'Dirty'), (loc_A, 'Clean'), (loc_B, 'Dirty')): 'Suck', + ((loc_B, 'Dirty'), (loc_B, 'Clean'), (loc_A, 'Dirty')): 'Suck'} + return Agent(TableDrivenAgentProgram(table)) + + +def ReflexVacuumAgent(): + """ + [Figure 2.8] + A reflex agent for the two-state vacuum environment. + >>> agent = ReflexVacuumAgent() + >>> environment = TrivialVacuumEnvironment() + >>> environment.add_thing(agent) + >>> environment.run() + >>> environment.status == {(1,0):'Clean' , (0,0) : 'Clean'} + True + """ + + def program(percept): + location, status = percept + if status == 'Dirty': + return 'Suck' + elif location == loc_A: + return 'Right' + elif location == loc_B: + return 'Left' + + return Agent(program) + + +def ModelBasedVacuumAgent(): + """An agent that keeps track of what locations are clean or dirty. 
+ >>> agent = ModelBasedVacuumAgent() + >>> environment = TrivialVacuumEnvironment() + >>> environment.add_thing(agent) + >>> environment.run() + >>> environment.status == {(1,0):'Clean' , (0,0) : 'Clean'} + True + """ + model = {loc_A: None, loc_B: None} + + def program(percept): + """Same as ReflexVacuumAgent, except if everything is clean, do NoOp.""" + location, status = percept + model[location] = status # Update the model here + if model[loc_A] == model[loc_B] == 'Clean': + return 'NoOp' + elif status == 'Dirty': + return 'Suck' + elif location == loc_A: + return 'Right' + elif location == loc_B: + return 'Left' + + return Agent(program) + + +# ______________________________________________________________________________ + + +class Environment: + """Abstract class representing an Environment. 'Real' Environment classes + inherit from this. Your Environment will typically need to implement: + percept: Define the percept that an agent sees. + execute_action: Define the effects of executing an action. + Also update the agent.performance slot. + The environment keeps a list of .things and .agents (which is a subset + of .things). Each agent has a .performance slot, initialized to 0. + Each thing has a .location slot, even though some environments may not + need this.""" + + def __init__(self): + self.things = [] + self.agents = [] + + def thing_classes(self): + return [] # List of classes that can go into environment + + def percept(self, agent): + """Return the percept that the agent sees at this point. (Implement this.)""" + raise NotImplementedError + + def execute_action(self, agent, action): + """Change the world to reflect this action. 
(Implement this.)""" + raise NotImplementedError + + def default_location(self, thing): + """Default location to place a new thing with unspecified location.""" + return None + + def exogenous_change(self): + """If there is spontaneous change in the world, override this.""" + pass + + def is_done(self): + """By default, we're done when we can't find a live agent.""" + return not any(agent.is_alive() for agent in self.agents) + + def step(self): + """Run the environment for one time step. If the + actions and exogenous changes are independent, this method will + do. If there are interactions between them, you'll need to + override this method.""" + if not self.is_done(): + actions = [] + for agent in self.agents: + if agent.alive: + actions.append(agent.program(self.percept(agent))) + else: + actions.append("") + for (agent, action) in zip(self.agents, actions): + self.execute_action(agent, action) + self.exogenous_change() + + def run(self, steps=1000): + """Run the Environment for given number of time steps.""" + for step in range(steps): + if self.is_done(): + return + self.step() + + def list_things_at(self, location, tclass=Thing): + """Return all things exactly at a given location.""" + if isinstance(location, numbers.Number): + return [thing for thing in self.things + if thing.location == location and isinstance(thing, tclass)] + return [thing for thing in self.things + if all(x == y for x, y in zip(thing.location, location)) and isinstance(thing, tclass)] + + def some_things_at(self, location, tclass=Thing): + """Return true if at least one of the things at location + is an instance of class tclass (or a subclass).""" + return self.list_things_at(location, tclass) != [] + + def add_thing(self, thing, location=None): + """Add a thing to the environment, setting its location. For + convenience, if thing is an agent program we make a new agent + for it. 
(Shouldn't need to override this.)""" + if not isinstance(thing, Thing): + thing = Agent(thing) + if thing in self.things: + print("Can't add the same thing twice") + else: + thing.location = location if location is not None else self.default_location(thing) + self.things.append(thing) + if isinstance(thing, Agent): + thing.performance = 0 + self.agents.append(thing) + + def delete_thing(self, thing): + """Remove a thing from the environment.""" + try: + self.things.remove(thing) + except ValueError as e: + print(e) + print(" in Environment delete_thing") + print(" Thing to be removed: {} at {}".format(thing, thing.location)) + print(" from list: {}".format([(thing, thing.location) for thing in self.things])) + if thing in self.agents: + self.agents.remove(thing) + + +class Direction: + """A direction class for agents that want to move in a 2D plane + Usage: + d = Direction("down") + To change directions: + d = d + "right" or d = d + Direction.R #Both do the same thing + Note that the argument to __add__ must be a string and not a Direction object. 
+ Also, it (the argument) can only be right or left.""" + + R = "right" + L = "left" + U = "up" + D = "down" + + def __init__(self, direction): + self.direction = direction + + def __add__(self, heading): + """ + >>> d = Direction('right') + >>> l1 = d.__add__(Direction.L) + >>> l2 = d.__add__(Direction.R) + >>> l1.direction + 'up' + >>> l2.direction + 'down' + >>> d = Direction('down') + >>> l1 = d.__add__('right') + >>> l2 = d.__add__('left') + >>> l1.direction == Direction.L + True + >>> l2.direction == Direction.R + True + """ + if self.direction == self.R: + return { + self.R: Direction(self.D), + self.L: Direction(self.U), + }.get(heading, None) + elif self.direction == self.L: + return { + self.R: Direction(self.U), + self.L: Direction(self.D), + }.get(heading, None) + elif self.direction == self.U: + return { + self.R: Direction(self.R), + self.L: Direction(self.L), + }.get(heading, None) + elif self.direction == self.D: + return { + self.R: Direction(self.L), + self.L: Direction(self.R), + }.get(heading, None) + + def move_forward(self, from_location): + """ + >>> d = Direction('up') + >>> l1 = d.move_forward((0, 0)) + >>> l1 + (0, -1) + >>> d = Direction(Direction.R) + >>> l1 = d.move_forward((0, 0)) + >>> l1 + (1, 0) + """ + # get the iterable class to return + iclass = from_location.__class__ + x, y = from_location + if self.direction == self.R: + return iclass((x + 1, y)) + elif self.direction == self.L: + return iclass((x - 1, y)) + elif self.direction == self.U: + return iclass((x, y - 1)) + elif self.direction == self.D: + return iclass((x, y + 1)) + + +class XYEnvironment(Environment): + """This class is for environments on a 2D plane, with locations + labelled by (x, y) points, either discrete or continuous. + + Agents perceive things within a radius. 
Each agent in the
+    environment has a .location slot which should be a location such
+    as (0, 1), and a .holding slot, which should be a list of things
+    that are held."""
+
+    def __init__(self, width=10, height=10):
+        super().__init__()
+
+        self.width = width
+        self.height = height
+        self.observers = []
+        # Sets iteration start and end (no walls).
+        self.x_start, self.y_start = (0, 0)
+        self.x_end, self.y_end = (self.width, self.height)
+
+    perceptible_distance = 1
+
+    def things_near(self, location, radius=None):
+        """Return all things within radius of location."""
+        if radius is None:
+            radius = self.perceptible_distance
+        radius2 = radius * radius
+        return [(thing, radius2 - distance_squared(location, thing.location))
+                for thing in self.things if distance_squared(
+                location, thing.location) <= radius2]
+
+    def percept(self, agent):
+        """By default, agent perceives things within a default radius."""
+        return self.things_near(agent.location)
+
+    def execute_action(self, agent, action):
+        agent.bump = False
+        if action == 'TurnRight':
+            agent.direction += Direction.R
+        elif action == 'TurnLeft':
+            agent.direction += Direction.L
+        elif action == 'Forward':
+            agent.bump = self.move_to(agent, agent.direction.move_forward(agent.location))
+        # elif action == 'Grab':
+        #     things = [thing for thing in self.list_things_at(agent.location)
+        #               if agent.can_grab(thing)]
+        #     if things:
+        #         agent.holding.append(things[0])
+        elif action == 'Release':
+            if agent.holding:
+                agent.holding.pop()
+
+    def default_location(self, thing):
+        location = self.random_location_inbounds()
+        while self.some_things_at(location, Obstacle):
+            # we will find a random location with no obstacles
+            location = self.random_location_inbounds()
+        return location
+
+    def move_to(self, thing, destination):
+        """Move a thing to a new location. Returns the bump status: True if
+        an Obstacle blocks the destination (move failed), False on success.
+ If thing is holding anything, they move with him.""" + thing.bump = self.some_things_at(destination, Obstacle) + if not thing.bump: + thing.location = destination + for o in self.observers: + o.thing_moved(thing) + for t in thing.holding: + self.delete_thing(t) + self.add_thing(t, destination) + t.location = destination + return thing.bump + + def add_thing(self, thing, location=None, exclude_duplicate_class_items=False): + """Add things to the world. If (exclude_duplicate_class_items) then the item won't be + added if the location has at least one item of the same class.""" + if location is None: + super().add_thing(thing) + elif self.is_inbounds(location): + if (exclude_duplicate_class_items and + any(isinstance(t, thing.__class__) for t in self.list_things_at(location))): + return + super().add_thing(thing, location) + + def is_inbounds(self, location): + """Checks to make sure that the location is inbounds (within walls if we have walls)""" + x, y = location + return not (x < self.x_start or x > self.x_end or y < self.y_start or y > self.y_end) + + def random_location_inbounds(self, exclude=None): + """Returns a random location that is inbounds (within walls if we have walls)""" + location = (random.randint(self.x_start, self.x_end), + random.randint(self.y_start, self.y_end)) + if exclude is not None: + while location == exclude: + location = (random.randint(self.x_start, self.x_end), + random.randint(self.y_start, self.y_end)) + return location + + def delete_thing(self, thing): + """Deletes thing, and everything it is holding (if thing is an agent)""" + if isinstance(thing, Agent): + for obj in thing.holding: + super().delete_thing(obj) + for obs in self.observers: + obs.thing_deleted(obj) + + super().delete_thing(thing) + for obs in self.observers: + obs.thing_deleted(thing) + + def add_walls(self): + """Put walls around the entire perimeter of the grid.""" + for x in range(self.width): + self.add_thing(Wall(), (x, 0)) + self.add_thing(Wall(), (x, 
self.height - 1)) + for y in range(1, self.height - 1): + self.add_thing(Wall(), (0, y)) + self.add_thing(Wall(), (self.width - 1, y)) + + # Updates iteration start and end (with walls). + self.x_start, self.y_start = (1, 1) + self.x_end, self.y_end = (self.width - 1, self.height - 1) + + def add_observer(self, observer): + """Adds an observer to the list of observers. + An observer is typically an EnvGUI. + + Each observer is notified of changes in move_to and add_thing, + by calling the observer's methods thing_moved(thing) + and thing_added(thing, loc).""" + self.observers.append(observer) + + def turn_heading(self, heading, inc): + """Return the heading to the left (inc=+1) or right (inc=-1) of heading.""" + return turn_heading(heading, inc) + + +class Obstacle(Thing): + """Something that can cause a bump, preventing an agent from + moving into the same square it's in.""" + pass + + +class Wall(Obstacle): + pass + + +# ______________________________________________________________________________ + + +class GraphicEnvironment(XYEnvironment): + def __init__(self, width=10, height=10, boundary=True, color={}, display=False): + """Define all the usual XYEnvironment characteristics, + but initialise a BlockGrid for GUI too.""" + super().__init__(width, height) + self.grid = BlockGrid(width, height, fill=(200, 200, 200)) + if display: + self.grid.show() + self.visible = True + else: + self.visible = False + self.bounded = boundary + self.colors = color + + def get_world(self): + """Returns all the items in the world in a format + understandable by the ipythonblocks BlockGrid.""" + result = [] + x_start, y_start = (0, 0) + x_end, y_end = self.width, self.height + for x in range(x_start, x_end): + row = [] + for y in range(y_start, y_end): + row.append(self.list_things_at((x, y))) + result.append(row) + return result + + """ + def run(self, steps=1000, delay=1): + "" "Run the Environment for given number of time steps, + but update the GUI too." 
"" + for step in range(steps): + sleep(delay) + if self.visible: + self.reveal() + if self.is_done(): + if self.visible: + self.reveal() + return + self.step() + if self.visible: + self.reveal() + """ + + def run(self, steps=1000, delay=1): + """Run the Environment for given number of time steps, + but update the GUI too.""" + for step in range(steps): + self.update(delay) + if self.is_done(): + break + self.step() + self.update(delay) + + def update(self, delay=1): + sleep(delay) + self.reveal() + + def reveal(self): + """Display the BlockGrid for this world - the last thing to be added + at a location defines the location color.""" + self.draw_world() + # wait for the world to update and + # apply changes to the same grid instead + # of making a new one. + clear_output(1) + self.grid.show() + self.visible = True + + def draw_world(self): + self.grid[:] = (200, 200, 200) + world = self.get_world() + for x in range(0, len(world)): + for y in range(0, len(world[x])): + if len(world[x][y]): + self.grid[y, x] = self.colors[world[x][y][-1].__class__.__name__] + + def conceal(self): + """Hide the BlockGrid for this world""" + self.visible = False + display(HTML('')) + + +# ______________________________________________________________________________ +# Continuous environment + +class ContinuousWorld(Environment): + """Model for Continuous World""" + + def __init__(self, width=10, height=10): + super().__init__() + self.width = width + self.height = height + + def add_obstacle(self, coordinates): + self.things.append(PolygonObstacle(coordinates)) + + +class PolygonObstacle(Obstacle): + + def __init__(self, coordinates): + """Coordinates is a list of tuples.""" + super().__init__() + self.coordinates = coordinates + + +# ______________________________________________________________________________ +# Vacuum environment + + +class Dirt(Thing): + pass + + +class VacuumEnvironment(XYEnvironment): + """The environment of [Ex. 2.12]. 
Agent perceives dirty or clean, + and bump (into obstacle) or not; 2D discrete world of unknown size; + performance measure is 100 for each dirt cleaned, and -1 for + each turn taken.""" + + def __init__(self, width=10, height=10): + super().__init__(width, height) + self.add_walls() + + def thing_classes(self): + return [Wall, Dirt, ReflexVacuumAgent, RandomVacuumAgent, + TableDrivenVacuumAgent, ModelBasedVacuumAgent] + + def percept(self, agent): + """The percept is a tuple of ('Dirty' or 'Clean', 'Bump' or 'None'). + Unlike the TrivialVacuumEnvironment, location is NOT perceived.""" + status = ('Dirty' if self.some_things_at( + agent.location, Dirt) else 'Clean') + bump = ('Bump' if agent.bump else 'None') + return status, bump + + def execute_action(self, agent, action): + agent.bump = False + if action == 'Suck': + dirt_list = self.list_things_at(agent.location, Dirt) + if dirt_list != []: + dirt = dirt_list[0] + agent.performance += 100 + self.delete_thing(dirt) + else: + super().execute_action(agent, action) + + if action != 'NoOp': + agent.performance -= 1 + + +class TrivialVacuumEnvironment(Environment): + """This environment has two locations, A and B. Each can be Dirty + or Clean. The agent perceives its location and the location's + status. This serves as an example of how to implement a simple + Environment.""" + + def __init__(self): + super().__init__() + self.status = {loc_A: random.choice(['Clean', 'Dirty']), + loc_B: random.choice(['Clean', 'Dirty'])} + + def thing_classes(self): + return [Wall, Dirt, ReflexVacuumAgent, RandomVacuumAgent, TableDrivenVacuumAgent, ModelBasedVacuumAgent] + + def percept(self, agent): + """Returns the agent's location, and the location status (Dirty/Clean).""" + return agent.location, self.status[agent.location] + + def execute_action(self, agent, action): + """Change agent's location and/or location's status; track performance. 
+ Score 10 for each dirt cleaned; -1 for each move.""" + if action == 'Right': + agent.location = loc_B + agent.performance -= 1 + elif action == 'Left': + agent.location = loc_A + agent.performance -= 1 + elif action == 'Suck': + if self.status[agent.location] == 'Dirty': + agent.performance += 10 + self.status[agent.location] = 'Clean' + + def default_location(self, thing): + """Agents start in either location at random.""" + return random.choice([loc_A, loc_B]) + + +# ______________________________________________________________________________ +# The Wumpus World + + +class Gold(Thing): + + def __eq__(self, rhs): + """All Gold are equal""" + return rhs.__class__ == Gold + + pass + + +class Bump(Thing): + pass + + +class Glitter(Thing): + pass + + +class Pit(Thing): + pass + + +class Breeze(Thing): + pass + + +class Arrow(Thing): + pass + + +class Scream(Thing): + pass + + +class Wumpus(Agent): + screamed = False + pass + + +class Stench(Thing): + pass + + +class Explorer(Agent): + holding = [] + has_arrow = True + killed_by = "" + direction = Direction("right") + + def can_grab(self, thing): + """Explorer can only grab gold""" + return thing.__class__ == Gold + + +class WumpusEnvironment(XYEnvironment): + pit_probability = 0.2 # Probability to spawn a pit in a location. (From Chapter 7.2) + + # Room should be 4x4 grid of rooms. 
The extra 2 for walls + + def __init__(self, agent_program, width=6, height=6): + super().__init__(width, height) + self.init_world(agent_program) + + def init_world(self, program): + """Spawn items in the world based on probabilities from the book""" + + "WALLS" + self.add_walls() + + "PITS" + for x in range(self.x_start, self.x_end): + for y in range(self.y_start, self.y_end): + if random.random() < self.pit_probability: + self.add_thing(Pit(), (x, y), True) + self.add_thing(Breeze(), (x - 1, y), True) + self.add_thing(Breeze(), (x, y - 1), True) + self.add_thing(Breeze(), (x + 1, y), True) + self.add_thing(Breeze(), (x, y + 1), True) + + "WUMPUS" + w_x, w_y = self.random_location_inbounds(exclude=(1, 1)) + self.add_thing(Wumpus(lambda x: ""), (w_x, w_y), True) + self.add_thing(Stench(), (w_x - 1, w_y), True) + self.add_thing(Stench(), (w_x + 1, w_y), True) + self.add_thing(Stench(), (w_x, w_y - 1), True) + self.add_thing(Stench(), (w_x, w_y + 1), True) + + "GOLD" + self.add_thing(Gold(), self.random_location_inbounds(exclude=(1, 1)), True) + + "AGENT" + self.add_thing(Explorer(program), (1, 1), True) + + def get_world(self, show_walls=True): + """Return the items in the world""" + result = [] + x_start, y_start = (0, 0) if show_walls else (1, 1) + + if show_walls: + x_end, y_end = self.width, self.height + else: + x_end, y_end = self.width - 1, self.height - 1 + + for x in range(x_start, x_end): + row = [] + for y in range(y_start, y_end): + row.append(self.list_things_at((x, y))) + result.append(row) + return result + + def percepts_from(self, agent, location, tclass=Thing): + """Return percepts from a given location, + and replaces some items with percepts from chapter 7.""" + thing_percepts = { + Gold: Glitter(), + Wall: Bump(), + Wumpus: Stench(), + Pit: Breeze()} + + """Agents don't need to get their percepts""" + thing_percepts[agent.__class__] = None + + """Gold only glitters in its cell""" + if location != agent.location: + thing_percepts[Gold] = None + 
+ result = [thing_percepts.get(thing.__class__, thing) for thing in self.things + if thing.location == location and isinstance(thing, tclass)] + return result if len(result) else [None] + + def percept(self, agent): + """Return things in adjacent (not diagonal) cells of the agent. + Result format: [Left, Right, Up, Down, Center / Current location]""" + x, y = agent.location + result = [] + result.append(self.percepts_from(agent, (x - 1, y))) + result.append(self.percepts_from(agent, (x + 1, y))) + result.append(self.percepts_from(agent, (x, y - 1))) + result.append(self.percepts_from(agent, (x, y + 1))) + result.append(self.percepts_from(agent, (x, y))) + + """The wumpus gives out a loud scream once it's killed.""" + wumpus = [thing for thing in self.things if isinstance(thing, Wumpus)] + if len(wumpus) and not wumpus[0].alive and not wumpus[0].screamed: + result[-1].append(Scream()) + wumpus[0].screamed = True + + return result + + def execute_action(self, agent, action): + """Modify the state of the environment based on the agent's actions. 
Performance score taken directly out of the book."""
+
+        if isinstance(agent, Explorer) and self.in_danger(agent):
+            return
+
+        agent.bump = False
+        if action == 'TurnRight':
+            agent.direction += Direction.R
+            agent.performance -= 1
+        elif action == 'TurnLeft':
+            agent.direction += Direction.L
+            agent.performance -= 1
+        elif action == 'Forward':
+            agent.bump = self.move_to(agent, agent.direction.move_forward(agent.location))
+            agent.performance -= 1
+        elif action == 'Grab':
+            things = [thing for thing in self.list_things_at(agent.location)
+                      if agent.can_grab(thing)]
+            if len(things):
+                print("Grabbing", things[0].__class__.__name__)
+                agent.holding.append(things[0])
+            agent.performance -= 1
+        elif action == 'Climb':
+            if agent.location == (1, 1):  # Agent can only climb out of (1,1)
+                agent.performance += 1000 if Gold() in agent.holding else 0
+                self.delete_thing(agent)
+        elif action == 'Shoot':
+            """The arrow travels straight down the path the agent is facing"""
+            if agent.has_arrow:
+                arrow_travel = agent.direction.move_forward(agent.location)
+                while self.is_inbounds(arrow_travel):
+                    wumpus = [thing for thing in self.list_things_at(arrow_travel)
+                              if isinstance(thing, Wumpus)]
+                    if len(wumpus):
+                        wumpus[0].alive = False
+                        break
+                    # Advance the arrow one square per iteration; stepping from
+                    # arrow_travel (not agent.location) keeps the loop finite.
+                    arrow_travel = agent.direction.move_forward(arrow_travel)
+                agent.has_arrow = False
+
+    def in_danger(self, agent):
+        """Check if the Explorer is in danger (Pit or live Wumpus); if he is, kill him."""
+        for thing in self.list_things_at(agent.location):
+            if isinstance(thing, Pit) or (isinstance(thing, Wumpus) and thing.alive):
+                agent.alive = False
+                agent.performance -= 1000
+                agent.killed_by = thing.__class__.__name__
+                return True
+        return False
+
+    def is_done(self):
+        """The game is over when the Explorer is killed,
+        or when he climbs out of the cave (possible only at (1, 1))."""
+        explorer = [agent for agent in self.agents if isinstance(agent, Explorer)]
+        if len(explorer):
+            if explorer[0].alive:
+                return False
+            else:
+
print("Death by {} [-1000].".format(explorer[0].killed_by)) + else: + print("Explorer climbed out {}." + .format("with Gold [+1000]!" if Gold() not in self.things else "without Gold [+0]")) + return True + + # TODO: Arrow needs to be implemented + + +# ______________________________________________________________________________ + + +def compare_agents(EnvFactory, AgentFactories, n=10, steps=1000): + """See how well each of several agents do in n instances of an environment. + Pass in a factory (constructor) for environments, and several for agents. + Create n instances of the environment, and run each agent in copies of + each one for steps. Return a list of (agent, average-score) tuples. + >>> environment = TrivialVacuumEnvironment + >>> agents = [ModelBasedVacuumAgent, ReflexVacuumAgent] + >>> result = compare_agents(environment, agents) + >>> performance_ModelBasedVacuumAgent = result[0][1] + >>> performance_ReflexVacuumAgent = result[1][1] + >>> performance_ReflexVacuumAgent <= performance_ModelBasedVacuumAgent + True + """ + envs = [EnvFactory() for i in range(n)] + return [(A, test_agent(A, steps, copy.deepcopy(envs))) + for A in AgentFactories] + + +def test_agent(AgentFactory, steps, envs): + """Return the mean score of running an agent in each of the envs, for steps + >>> def constant_prog(percept): + ... return percept + ... 
+    >>> agent = Agent(constant_prog)
+    >>> result = agent.program(5)
+    >>> result == 5
+    True
+    """
+
+    def score(env):
+        agent = AgentFactory()
+        env.add_thing(agent)
+        env.run(steps)
+        return agent.performance
+
+    return mean(map(score, envs))
+
+
+# _________________________________________________________________________
+
+
+__doc__ += """
+>>> a = ReflexVacuumAgent()
+>>> a.program((loc_A, 'Clean'))
+'Right'
+>>> a.program((loc_B, 'Clean'))
+'Left'
+>>> a.program((loc_A, 'Dirty'))
+'Suck'
+>>> a.program((loc_A, 'Dirty'))
+'Suck'
+
+>>> e = TrivialVacuumEnvironment()
+>>> e.add_thing(ModelBasedVacuumAgent())
+>>> e.run(5)
+
+"""
diff --git a/aima-data b/aima-data
index 6ce56c0b6..f6cbea61a 160000
--- a/aima-data
+++ b/aima-data
@@ -1 +1 @@
-Subproject commit 6ce56c0b67206bae91b04fb20f0d8d70c9a86b6a
+Subproject commit f6cbea61ad0c21c6b7be826d17af5a8d3a7c2c86
diff --git a/arc_consistency_heuristics.ipynb b/arc_consistency_heuristics.ipynb
new file mode 100644
index 000000000..fb2241819
--- /dev/null
+++ b/arc_consistency_heuristics.ipynb
@@ -0,0 +1,1999 @@
+{
+ "cells": [
+  {
+   "cell_type": "markdown",
+   "metadata": {
+    "pycharm": {}
+   },
+   "source": [
+    "# Constraint Satisfaction Problems\n",
+    "---\n",
+    "# Heuristics for Arc-Consistency Algorithms\n",
+    "\n",
+    "## Introduction\n",
+    "A ***Constraint Satisfaction Problem*** is a triple $(X,D,C)$ where: \n",
+    "- $X$ is a set of variables $X_1, …, X_n$;\n",
+    "- $D$ is a set of domains $D_1, …, D_n$, one for each variable, each of which consists of a set of allowable values $v_1, ..., v_k$;\n",
+    "- $C$ is a set of constraints that specify allowable combinations of values.\n",
+    "\n",
+    "A CSP is called *arc-consistent* if every value in the domain of every variable is supported by all the neighbors of the variable, while it is called *inconsistent* if it has no solutions.
    \n",
+    "***Arc-consistency algorithms*** remove all unsupported values from the domains of variables, making the CSP *arc-consistent*, or decide that a CSP is *inconsistent* by finding that some variable has no supported values in its domain.
    \n",
+    "Heuristics significantly enhance the efficiency of *arc-consistency algorithms*, improving their average performance in terms of *consistency-checks*, which can be considered a standard measure of goodness for such algorithms. *Arc-heuristics* operate at arc-level and select the constraint that will be used for the next check, while *domain-heuristics* operate at domain-level and select which values will be used for the next support-check."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 1,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "from csp import *"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Domain-Heuristics for Arc-Consistency Algorithms\n",
+    "The effects of a *domain-heuristic* based on the notion of a *double-support check* are investigated in [[1]](#cite-van2002domain) by studying its average time-complexity.\n",
+    "\n",
+    "The objective of *arc-consistency algorithms* is to resolve some uncertainty; it has to be known, for each $v_i \in D_i$ and for each $v_j \in D_j$, whether it is supported.\n",
+    "\n",
+    "A *single-support check*, $(v_i, v_j) \in C_{ij}$, is one in which, before the check is done, it is already known that either $v_i$ or $v_j$ is supported.\n",
+    "\n",
+    "A *double-support check*, $(v_i, v_j) \in C_{ij}$, is one in which there is still, before the check, uncertainty about the support-status of both $v_i$ and $v_j$.\n",
+    "\n",
+    "If a *double-support check* is successful, two uncertainties are resolved. If a *single-support check* is successful, only one uncertainty is resolved. 
A good *arc-consistency algorithm*, therefore, would always choose to do a *double-support check* in preference to a *single-support check*, because the former offers the potentially higher payback.\n", + "\n", + "The improvement with *double-support checks* is that, where possible, *consistency-checks* are used to find supports for two values, one value in the domain of each variable, whose support-status was previously unknown. It is motivated by the insight that *in order to minimize the number of consistency-checks it is necessary to maximize the number of uncertainties which are resolved per check*." + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": {} + }, + "source": [ + "### AC-3b: an improved version of AC-3 with Double-Support Checks" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As shown in [[2]](#cite-van2000improving), the idea is to use *double-support checks* to improve the average performance of `AC3`, which does not exploit the fact that relations are bidirectional; this results in a new general-purpose *arc-consistency algorithm* called `AC3b`."
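The payoff of preferring double-support checks can be sketched with a toy experiment (illustrative only; `count_checks` and the equality constraint are hypothetical helpers, not part of `csp.py`). Trying values of $X_j$ whose support-status is still unknown first finds the same supports with fewer checks:

```python
def count_checks(Di, Dj, constraint, prefer_unknown):
    """Count consistency-checks needed to find one support for each value in Di.
    With prefer_unknown=True, values of Xj not yet known to be supported are
    tried first, so successful checks tend to be double-support checks."""
    supported_j = set()
    checks = 0
    for vi in Di:
        # stable sort: unknown-status values (key False) come before known ones
        order = sorted(Dj, key=lambda vj: vj in supported_j) if prefer_unknown else list(Dj)
        for vj in order:
            checks += 1
            if constraint(vi, vj):
                supported_j.add(vj)
                break
    return checks

D = [1, 2, 3, 4]
equal = lambda a, b: a == b  # each value has exactly one support
print(count_checks(D, D, equal, prefer_unknown=False))  # 10 checks
print(count_checks(D, D, equal, prefer_unknown=True))   # 4 checks
```

Every successful check in the second run resolves two uncertainties at once, which is exactly the behavior the double-support heuristic tries to maximize.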
+ ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0mAC3\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mcsp\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mqueue\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mNone\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mremovals\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mNone\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0marc_heuristic\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mdom_j_up\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"[Figure 6.3]\"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mqueue\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0;32mNone\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mqueue\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m{\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mXk\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mXi\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mvariables\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mXk\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mneighbors\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0msupport_pruning\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mqueue\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0marc_heuristic\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mcsp\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mqueue\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mchecks\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;36m0\u001b[0m\u001b[0;34m\u001b[0m\n", + 
"\u001b[0;34m\u001b[0m \u001b[0;32mwhile\u001b[0m \u001b[0mqueue\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m(\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mXj\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mqueue\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mpop\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mrevised\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mchecks\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mrevise\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mcsp\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mXi\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mXj\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mremovals\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mchecks\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mrevised\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcurr_domains\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mchecks\u001b[0m \u001b[0;31m# CSP is inconsistent\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mXk\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mneighbors\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mXk\u001b[0m \u001b[0;34m!=\u001b[0m \u001b[0mXj\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m 
\u001b[0mqueue\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0madd\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mXk\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mXi\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mchecks\u001b[0m \u001b[0;31m# CSP is satisfiable\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource AC3" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0mrevise\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mcsp\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mXi\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mXj\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mremovals\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mchecks\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"Return true if we remove a value.\"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mrevised\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mx\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcurr_domains\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# If Xi=x conflicts with Xj=y for every possible y, eliminate Xi=x\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# if all(not csp.constraints(Xi, x, Xj, y) for y in csp.curr_domains[Xj]):\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m 
\u001b[0mconflict\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0my\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcurr_domains\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mXj\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mconstraints\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mx\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mXj\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0my\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mconflict\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mchecks\u001b[0m \u001b[0;34m+=\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0mconflict\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mbreak\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mconflict\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mprune\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mx\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mremovals\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mrevised\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mrevised\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mchecks\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + 
"%psource revise" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "At any stage in the process of making a 2-variable CSP *arc-consistent* in `AC3b`:\n", + "- there is a set $S_i^+ \subseteq D_i$ whose values are all known to be supported by $X_j$;\n", + "- there is a set $S_i^? = D_i \setminus S_i^+$ whose values are unknown, as yet, to be supported by $X_j$.\n", + "\n", + "The same holds if the roles of $X_i$ and $X_j$ are exchanged.\n", + "\n", + "In order to establish support for a value $v_i^? \in S_i^?$ it seems better to try to find a support among the values in $S_j^?$ first, because for each $v_j^? \in S_j^?$ the check $(v_i^?,v_j^?) \in C_{ij}$ is a *double-support check* and it is just as likely that any $v_j^? \in S_j^?$ supports $v_i^?$ as it is that any $v_j^+ \in S_j^+$ does. Only if no support can be found among the elements in $S_j^?$ should the elements $v_j^+$ in $S_j^+$ be used for *single-support checks* $(v_i^?,v_j^+) \in C_{ij}$. After it has been decided for each value in $D_i$ whether it is supported or not, either $S_i^+ = \emptyset$ and the 2-variable CSP is *inconsistent*, or $S_i^+ \neq \emptyset$ and the CSP is *satisfiable*. In the latter case, the elements of $D_i$ which are supported by $X_j$ are given by $S_i^+$. The elements in $D_j$ which are supported by $X_i$ are given by the union of $S_j^+$ with the set of those elements of $S_j^?$ which further processing will show to be supported by some $v_i^+ \in S_i^+$."
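The partitioning described above can be reduced to a few lines for a single pair of domains. The following simplified sketch (illustrative only; it takes the constraint as a plain predicate, unlike the full `partition` in `csp.py` shown in the next cell, which works on CSP objects and also counts checks) tries double-support checks first and falls back to single-support checks:

```python
def partition(Di, Dj, constraint):
    """Split the domains into supported (+) and still-unknown (?) parts.
    Values of Dj not yet known to be supported are tried first
    (double-support checks); values already in Sj_p are only used
    as a fallback (single-support checks)."""
    Si_p, Sj_p = set(), set()
    for vi in Di:
        # double-support candidates first, then single-support fallback
        for vj in [v for v in Dj if v not in Sj_p] + sorted(Sj_p):
            if constraint(vi, vj):
                Si_p.add(vi)
                Sj_p.add(vj)  # no-op when vj was already known to be supported
                break
    return Si_p, Sj_p, set(Dj) - Sj_p

Si_p, Sj_p, Sj_u = partition([1, 2, 3], [1, 2], lambda a, b: a <= b)
print(Si_p, Sj_p, Sj_u)  # {1, 2} {1, 2} set()  (3 has no support in Dj)
```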
+ ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0mAC3b\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mcsp\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mqueue\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mNone\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mremovals\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mNone\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0marc_heuristic\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mdom_j_up\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mqueue\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0;32mNone\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mqueue\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m{\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mXk\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mXi\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mvariables\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mXk\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mneighbors\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0msupport_pruning\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mqueue\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0marc_heuristic\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mcsp\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mqueue\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mchecks\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;36m0\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mwhile\u001b[0m 
\u001b[0mqueue\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m(\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mXj\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mqueue\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mpop\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# Si_p values are all known to be supported by Xj\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# Sj_p values are all known to be supported by Xi\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# Dj - Sj_p = Sj_u values are unknown, as yet, to be supported by Xi\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mSi_p\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mSj_p\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mSj_u\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mchecks\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mpartition\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mcsp\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mXi\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mXj\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mchecks\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0mSi_p\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mchecks\u001b[0m \u001b[0;31m# CSP is inconsistent\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mrevised\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mx\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mset\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcurr_domains\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m-\u001b[0m 
\u001b[0mSi_p\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mprune\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mx\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mremovals\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mrevised\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mrevised\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mXk\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mneighbors\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mXk\u001b[0m \u001b[0;34m!=\u001b[0m \u001b[0mXj\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mqueue\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0madd\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mXk\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mXi\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0;34m(\u001b[0m\u001b[0mXj\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mXi\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mqueue\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0misinstance\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mqueue\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mset\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# or queue -= {(Xj, Xi)} or queue.remove((Xj, Xi))\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m 
\u001b[0mqueue\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdifference_update\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m{\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mXj\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mXi\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mqueue\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdifference_update\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mXj\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mXi\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# the elements in D_j which are supported by Xi are given by the union of Sj_p with the set of those\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# elements of Sj_u which further processing will show to be supported by some vi_p in Si_p\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mvj_p\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mSj_u\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mvi_p\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mSi_p\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mconflict\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mconstraints\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mXj\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mvj_p\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mXi\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mvi_p\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mconflict\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m 
\u001b[0mSj_p\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0madd\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mvj_p\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mchecks\u001b[0m \u001b[0;34m+=\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0mconflict\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mbreak\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mrevised\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mx\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mset\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcurr_domains\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mXj\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m-\u001b[0m \u001b[0mSj_p\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mprune\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mXj\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mx\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mremovals\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mrevised\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mrevised\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mXk\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mneighbors\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mXj\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mXk\u001b[0m \u001b[0;34m!=\u001b[0m \u001b[0mXi\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + 
"\u001b[0;34m\u001b[0m \u001b[0mqueue\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0madd\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mXk\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mXj\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mchecks\u001b[0m \u001b[0;31m# CSP is satisfiable\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource AC3b" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0mpartition\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mcsp\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mXi\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mXj\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mchecks\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mSi_p\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mset\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mSj_p\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mset\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mSj_u\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mset\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcurr_domains\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mXj\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mvi_u\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcurr_domains\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + 
"\u001b[0;34m\u001b[0m \u001b[0mconflict\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# now, in order to establish support for a value vi_u in Di it seems better to try to find a support among\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# the values in Sj_u first, because for each vj_u in Sj_u the check (vi_u, vj_u) is a double-support check\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# and it is just as likely that any vj_u in Sj_u supports vi_u than it is that any vj_p in Sj_p does...\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mvj_u\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mSj_u\u001b[0m \u001b[0;34m-\u001b[0m \u001b[0mSj_p\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# double-support check\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mconstraints\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mvi_u\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mXj\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mvj_u\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mconflict\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mSi_p\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0madd\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mvi_u\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mSj_p\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0madd\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mvj_u\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mchecks\u001b[0m \u001b[0;34m+=\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m 
\u001b[0;32mnot\u001b[0m \u001b[0mconflict\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mbreak\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# ... and only if no support can be found among the elements in Sj_u, should the elements vj_p in Sj_p be used\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# for single-support checks (vi_u, vj_p)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mconflict\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mvj_p\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mSj_p\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# single-support check\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mconstraints\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mvi_u\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mXj\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mvj_p\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mconflict\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mSi_p\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0madd\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mvi_u\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mchecks\u001b[0m \u001b[0;34m+=\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0mconflict\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mbreak\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mSi_p\u001b[0m\u001b[0;34m,\u001b[0m 
\u001b[0mSj_p\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mSj_u\u001b[0m \u001b[0;34m-\u001b[0m \u001b[0mSj_p\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mchecks\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource partition" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": {} + }, + "source": [ + "`AC3b` is a refinement of the `AC3` algorithm: when arc $(i,j)$ is being processed and the reverse arc $(j,i)$ is also in the queue, consistency-checks can be saved because only support for the elements in $S_j^?$ has to be found (as opposed to support for all the elements in $D_j$ in the\n", + "`AC3` algorithm).
    \n", + "`AC3b` inherits from `AC3` all its properties, such as the $\mathcal{O}(ed^3)$ time-complexity and the $\mathcal{O}(e + nd)$ space-complexity, where $n$ denotes the number of variables in the CSP, $e$ denotes the number of binary constraints and $d$ denotes the maximum domain-size of the variables." + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": {} + }, + "source": [ + "## Arc-Heuristics for Arc-Consistency Algorithms" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": {} + }, + "source": [ + "Many *arc-heuristics* can be devised, based on three major features of CSPs:\n", + "- the number of acceptable pairs in each constraint (the *constraint size* or *satisfiability*);\n", + "- the *domain size*;\n", + "- the number of binary constraints that each variable participates in, equal to the *degree* of the node of that variable in the constraint graph. \n", + "\n", + "Simple examples of heuristics that might be expected to improve the efficiency of relaxation are:\n", + "- ordering the list of variable pairs by *increasing* relative *satisfiability*;\n", + "- ordering by *increasing size of the domain* of the variable $v_j$ relaxed against $v_i$;\n", + "- ordering by *descending degree* of node of the variable relaxed.\n", + "\n", + "In
[[3]](#cite-wallace1992ordering) the effects of these *arc-heuristics* are investigated empirically on random CSPs. Their results demonstrate that the first two, later called `sat up` and `dom j up` for n-ary and binary CSPs respectively, significantly reduce the number of *consistency-checks*." + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0mdom_j_up\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mcsp\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mqueue\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mSortedSet\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mqueue\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mkey\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mlambda\u001b[0m \u001b[0mt\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0mneg\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcurr_domains\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mt\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource dom_j_up" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0msat_up\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mto_do\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mSortedSet\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mto_do\u001b[0m\u001b[0;34m,\u001b[0m
\u001b[0mkey\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mlambda\u001b[0m \u001b[0mt\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0;36m1\u001b[0m \u001b[0;34m/\u001b[0m \u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mvar\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mvar\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mt\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mscope\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource sat_up" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": {} + }, + "source": [ + "## Experimental Results" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": {} + }, + "source": [ + "For the experiments below on binary CSPs, in addition to the two *arc-consistency algorithms* already cited above, `AC3` and `AC3b`, the `AC4` algorithm was used.
    \n", + "The `AC4` algorithm runs in $\mathcal{O}(ed^2)$ worst-case time but can be slower than `AC3` in the average case." + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0mAC4\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mcsp\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mqueue\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mNone\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mremovals\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mNone\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0marc_heuristic\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mdom_j_up\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mqueue\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0;32mNone\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mqueue\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m{\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mXk\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mXi\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mvariables\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mXk\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mneighbors\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0msupport_pruning\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mqueue\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0marc_heuristic\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mcsp\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mqueue\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0msupport_counter\u001b[0m \u001b[0;34m=\u001b[0m
\u001b[0mCounter\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mvariable_value_pairs_supported\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mdefaultdict\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mset\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0munsupported_variable_value_pairs\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mchecks\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;36m0\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# construction and initialization of support sets\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mwhile\u001b[0m \u001b[0mqueue\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m(\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mXj\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mqueue\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mpop\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mrevised\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mx\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcurr_domains\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0my\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcurr_domains\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mXj\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m 
\u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mconstraints\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mx\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mXj\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0my\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0msupport_counter\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mx\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mXj\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m+=\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mvariable_value_pairs_supported\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mXj\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0my\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0madd\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mx\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mchecks\u001b[0m \u001b[0;34m+=\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0msupport_counter\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mx\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mXj\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0;36m0\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mprune\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mx\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mremovals\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mrevised\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m 
\u001b[0munsupported_variable_value_pairs\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mx\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mrevised\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcurr_domains\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mchecks\u001b[0m \u001b[0;31m# CSP is inconsistent\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# propagation of removed values\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mwhile\u001b[0m \u001b[0munsupported_variable_value_pairs\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mXj\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0my\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0munsupported_variable_value_pairs\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mpop\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mXi\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mx\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mvariable_value_pairs_supported\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mXj\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0my\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mrevised\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mx\u001b[0m \u001b[0;32min\u001b[0m 
\u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcurr_domains\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0msupport_counter\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mx\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mXj\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0msupport_counter\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mx\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mXj\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0;36m0\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mprune\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mx\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mremovals\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mrevised\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0munsupported_variable_value_pairs\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mx\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mrevised\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0;32mnot\u001b[0m 
\u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcurr_domains\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mXi\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mchecks\u001b[0m \u001b[0;31m# CSP is inconsistent\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mchecks\u001b[0m \u001b[0;31m# CSP is satisfiable\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource AC4" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Sudoku" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": {} + }, + "source": [ + "#### Easy Sudoku" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + ". . 3 | . 2 . | 6 . .\n", + "9 . . | 3 . 5 | . . 1\n", + ". . 1 | 8 . 6 | 4 . .\n", + "------+-------+------\n", + ". . 8 | 1 . 2 | 9 . .\n", + "7 . . | . . . | . . 8\n", + ". . 6 | 7 . 8 | 2 . .\n", + "------+-------+------\n", + ". . 2 | 6 . 9 | 5 . .\n", + "8 . . | 2 . 3 | . . 9\n", + ". . 5 | . 1 . | 3 . 
.\n" + ] + } + ], + "source": [ + "sudoku = Sudoku(easy1)\n", + "sudoku.display(sudoku.infer_assignment())" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 23.6 ms, sys: 0 ns, total: 23.6 ms\n", + "Wall time: 22.4 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "'AC3 needs 11322 consistency-checks'" + ] + }, + "execution_count": 10, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time _, checks = AC3(sudoku, arc_heuristic=no_arc_heuristic)\n", + "f'AC3 needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 7.43 ms, sys: 3.68 ms, total: 11.1 ms\n", + "Wall time: 10.7 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "'AC3b needs 8345 consistency-checks'" + ] + }, + "execution_count": 11, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "sudoku = Sudoku(easy1)\n", + "%time _, checks = AC3b(sudoku, arc_heuristic=no_arc_heuristic)\n", + "f'AC3b needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 56.3 ms, sys: 0 ns, total: 56.3 ms\n", + "Wall time: 55.4 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "'AC4 needs 27718 consistency-checks'" + ] + }, + "execution_count": 12, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "sudoku = Sudoku(easy1)\n", + "%time _, checks = AC4(sudoku, arc_heuristic=no_arc_heuristic)\n", + "f'AC4 needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": 
"stdout", + "output_type": "stream", + "text": [ + "CPU times: user 17.2 ms, sys: 0 ns, total: 17.2 ms\n", + "Wall time: 16.9 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "'AC3 with DOM J UP arc heuristic needs 6925 consistency-checks'" + ] + }, + "execution_count": 13, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "sudoku = Sudoku(easy1)\n", + "%time _, checks = AC3(sudoku, arc_heuristic=dom_j_up)\n", + "f'AC3 with DOM J UP arc heuristic needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 40.9 ms, sys: 2.47 ms, total: 43.4 ms\n", + "Wall time: 41.7 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "'AC3b with DOM J UP arc heuristic needs 6278 consistency-checks'" + ] + }, + "execution_count": 14, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "sudoku = Sudoku(easy1)\n", + "%time _, checks = AC3b(sudoku, arc_heuristic=dom_j_up)\n", + "f'AC3b with DOM J UP arc heuristic needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 38.9 ms, sys: 1.96 ms, total: 40.9 ms\n", + "Wall time: 40.7 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "'AC4 with DOM J UP arc heuristic needs 9393 consistency-checks'" + ] + }, + "execution_count": 15, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "sudoku = Sudoku(easy1)\n", + "%time _, checks = AC4(sudoku, arc_heuristic=dom_j_up)\n", + "f'AC4 with DOM J UP arc heuristic needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "4 8 3 | 9 2 1 | 6 5 7\n", + 
"9 6 7 | 3 4 5 | 8 2 1\n", + "2 5 1 | 8 7 6 | 4 9 3\n", + "------+-------+------\n", + "5 4 8 | 1 3 2 | 9 7 6\n", + "7 2 9 | 5 6 4 | 1 3 8\n", + "1 3 6 | 7 9 8 | 2 4 5\n", + "------+-------+------\n", + "3 7 2 | 6 8 9 | 5 1 4\n", + "8 1 4 | 2 5 3 | 7 6 9\n", + "6 9 5 | 4 1 7 | 3 8 2\n" + ] + } + ], + "source": [ + "backtracking_search(sudoku, select_unassigned_variable=mrv, inference=forward_checking)\n", + "sudoku.display(sudoku.infer_assignment())" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": {} + }, + "source": [ + "#### Harder Sudoku" + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "4 1 7 | 3 6 9 | 8 . 5\n", + ". 3 . | . . . | . . .\n", + ". . . | 7 . . | . . .\n", + "------+-------+------\n", + ". 2 . | . . . | . 6 .\n", + ". . . | . 8 . | 4 . .\n", + ". . . | . 1 . | . . .\n", + "------+-------+------\n", + ". . . | 6 . 3 | . 7 .\n", + "5 . . | 2 . . | . . .\n", + "1 . 4 | . . . | . . 
.\n" + ] + } + ], + "source": [ + "sudoku = Sudoku(harder1)\n", + "sudoku.display(sudoku.infer_assignment())" + ] + }, + { + "cell_type": "code", + "execution_count": 18, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 17.7 ms, sys: 481 µs, total: 18.2 ms\n", + "Wall time: 17.2 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "'AC3 needs 12837 consistency-checks'" + ] + }, + "execution_count": 18, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time _, checks = AC3(sudoku, arc_heuristic=no_arc_heuristic)\n", + "f'AC3 needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 19, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 24.1 ms, sys: 2.6 ms, total: 26.7 ms\n", + "Wall time: 25.1 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "'AC3b needs 8864 consistency-checks'" + ] + }, + "execution_count": 19, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "sudoku = Sudoku(harder1)\n", + "%time _, checks = AC3b(sudoku, arc_heuristic=no_arc_heuristic)\n", + "f'AC3b needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 63.4 ms, sys: 3.48 ms, total: 66.9 ms\n", + "Wall time: 65.5 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "'AC4 needs 44213 consistency-checks'" + ] + }, + "execution_count": 20, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "sudoku = Sudoku(harder1)\n", + "%time _, checks = AC4(sudoku, arc_heuristic=no_arc_heuristic)\n", + "f'AC4 needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + 
"name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 9.96 ms, sys: 570 µs, total: 10.5 ms\n", + "Wall time: 10.3 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "'AC3 with DOM J UP arc heuristic needs 7045 consistency-checks'" + ] + }, + "execution_count": 21, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "sudoku = Sudoku(harder1)\n", + "%time _, checks = AC3(sudoku, arc_heuristic=dom_j_up)\n", + "f'AC3 with DOM J UP arc heuristic needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 36.1 ms, sys: 0 ns, total: 36.1 ms\n", + "Wall time: 35.5 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "'AC3b with DOM J UP arc heuristic needs 6994 consistency-checks'" + ] + }, + "execution_count": 22, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "sudoku = Sudoku(harder1)\n", + "%time _, checks = AC3b(sudoku, arc_heuristic=dom_j_up)\n", + "f'AC3b with DOM J UP arc heuristic needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 23, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 40.3 ms, sys: 0 ns, total: 40.3 ms\n", + "Wall time: 39.7 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "'AC4 with DOM J UP arc heuristic needs 19210 consistency-checks'" + ] + }, + "execution_count": 23, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "sudoku = Sudoku(harder1)\n", + "%time _, checks = AC4(sudoku, arc_heuristic=dom_j_up)\n", + "f'AC4 with DOM J UP arc heuristic needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 24, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + 
"4 1 7 | 3 6 9 | 8 2 5\n", + "6 3 2 | 1 5 8 | 9 4 7\n", + "9 5 8 | 7 2 4 | 3 1 6\n", + "------+-------+------\n", + "8 2 5 | 4 3 7 | 1 6 9\n", + "7 9 1 | 5 8 6 | 4 3 2\n", + "3 4 6 | 9 1 2 | 7 5 8\n", + "------+-------+------\n", + "2 8 9 | 6 4 3 | 5 7 1\n", + "5 7 3 | 2 9 1 | 6 8 4\n", + "1 6 4 | 8 7 5 | 2 9 3\n" + ] + } + ], + "source": [ + "backtracking_search(sudoku, select_unassigned_variable=mrv, inference=forward_checking)\n", + "sudoku.display(sudoku.infer_assignment())" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": {} + }, + "source": [ + "### 8 Queens" + ] + }, + { + "cell_type": "code", + "execution_count": 27, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + ". - . - . - . - 0 0 0 0 0 0 0 0 \n", + "- . - . - . - . 0 0 0 0 0 0 0 0 \n", + ". - . - . - . - 0 0 0 0 0 0 0 0 \n", + "- . - . - . - . 0 0 0 0 0 0 0 0 \n", + ". - . - . - . - 0 0 0 0 0 0 0 0 \n", + "- . - . - . - . 0 0 0 0 0 0 0 0 \n", + ". - . - . - . - 0 0 0 0 0 0 0 0 \n", + "- . - . - . - . 
0 0 0 0 0 0 0 0 \n" + ] + } + ], + "source": [ + "chess = NQueensCSP(8)\n", + "chess.display(chess.infer_assignment())" + ] + }, + { + "cell_type": "code", + "execution_count": 28, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 689 µs, sys: 193 µs, total: 882 µs\n", + "Wall time: 892 µs\n" + ] + }, + { + "data": { + "text/plain": [ + "'AC3 needs 666 consistency-checks'" + ] + }, + "execution_count": 28, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time _, checks = AC3(chess, arc_heuristic=no_arc_heuristic)\n", + "f'AC3 needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 30, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 451 µs, sys: 127 µs, total: 578 µs\n", + "Wall time: 584 µs\n" + ] + }, + { + "data": { + "text/plain": [ + "'AC3b needs 428 consistency-checks'" + ] + }, + "execution_count": 30, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "chess = NQueensCSP(8)\n", + "%time _, checks = AC3b(chess, arc_heuristic=no_arc_heuristic)\n", + "f'AC3b needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 32, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 8.53 ms, sys: 109 µs, total: 8.64 ms\n", + "Wall time: 8.48 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "'AC4 needs 4096 consistency-checks'" + ] + }, + "execution_count": 32, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "chess = NQueensCSP(8)\n", + "%time _, checks = AC4(chess, arc_heuristic=no_arc_heuristic)\n", + "f'AC4 needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 34, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": 
"stdout", + "output_type": "stream", + "text": [ + "CPU times: user 1.88 ms, sys: 0 ns, total: 1.88 ms\n", + "Wall time: 1.88 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "'AC3 with DOM J UP arc heuristic needs 666 consistency-checks'" + ] + }, + "execution_count": 34, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "chess = NQueensCSP(8)\n", + "%time _, checks = AC3(chess, arc_heuristic=dom_j_up)\n", + "f'AC3 with DOM J UP arc heuristic needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 36, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 1.21 ms, sys: 326 µs, total: 1.53 ms\n", + "Wall time: 1.54 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "'AC3b with DOM J UP arc heuristic needs 792 consistency-checks'" + ] + }, + "execution_count": 36, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "chess = NQueensCSP(8)\n", + "%time _, checks = AC3b(chess, arc_heuristic=dom_j_up)\n", + "f'AC3b with DOM J UP arc heuristic needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 38, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 4.71 ms, sys: 0 ns, total: 4.71 ms\n", + "Wall time: 4.65 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "'AC4 with DOM J UP arc heuristic needs 4096 consistency-checks'" + ] + }, + "execution_count": 38, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "chess = NQueensCSP(8)\n", + "%time _, checks = AC4(chess, arc_heuristic=dom_j_up)\n", + "f'AC4 with DOM J UP arc heuristic needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 39, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + ". - . - Q - . 
- 2 2 3 3 0* 1 1 2 \n", + "- Q - . - . - . 1 0* 3 3 2 2 2 2 \n", + ". - . - . Q . - 3 2 3 2 2 0* 3 2 \n", + "Q . - . - . - . 0* 3 1 2 3 3 3 3 \n", + ". - . - . - Q - 2 2 2 2 3 3 0* 2 \n", + "- . - Q - . - . 2 1 3 0* 2 3 2 2 \n", + ". - . - . - . Q 1 3 2 3 3 1 2 0* \n", + "- . Q . - . - . 2 2 0* 2 2 2 2 2 \n" + ] + } + ], + "source": [ + "backtracking_search(chess, select_unassigned_variable=mrv, inference=forward_checking)\n", + "chess.display(chess.infer_assignment())" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "For the experiments below on n-ary CSPs, due to the n-ary constraints, the `GAC` algorithm was used.
    \n", + "The `GAC` algorithm has $\\mathcal{O}(er^2d^t)$ time-complexity and $\\mathcal{O}(erd)$ space-complexity where $e$ denotes the number of n-ary constraints, $r$ denotes the constraint arity and $d$ denotes the maximum domain-size of the variables." + ] + }, + { + "cell_type": "code", + "execution_count": 40, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "data": { + "text/plain": [ + " \u001b[0;32mdef\u001b[0m \u001b[0mGAC\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0morig_domains\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mNone\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mto_do\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mNone\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0marc_heuristic\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0msat_up\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"Makes this CSP arc-consistent using Generalized Arc Consistency\u001b[0m\n", + "\u001b[0;34m orig_domains is the original domains\u001b[0m\n", + "\u001b[0;34m to_do is a set of (variable,constraint) pairs\u001b[0m\n", + "\u001b[0;34m returns the reduced domains (an arc-consistent variable:domain dictionary)\u001b[0m\n", + "\u001b[0;34m \"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0morig_domains\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0;32mNone\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0morig_domains\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdomains\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mto_do\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0;32mNone\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mto_do\u001b[0m \u001b[0;34m=\u001b[0m 
\u001b[0;34m{\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mvar\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mconst\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mconst\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcsp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mconstraints\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mvar\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mconst\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mscope\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mto_do\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mto_do\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcopy\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mdomains\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0morig_domains\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcopy\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mto_do\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0marc_heuristic\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mto_do\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mchecks\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;36m0\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mwhile\u001b[0m \u001b[0mto_do\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mvar\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mconst\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mto_do\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mpop\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mother_vars\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0mov\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mov\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mconst\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mscope\u001b[0m 
\u001b[0;32mif\u001b[0m \u001b[0mov\u001b[0m \u001b[0;34m!=\u001b[0m \u001b[0mvar\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mnew_domain\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mset\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mother_vars\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0;36m0\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mval\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mdomains\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mvar\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mconst\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mholds\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m{\u001b[0m\u001b[0mvar\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0mval\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mnew_domain\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0madd\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mval\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mchecks\u001b[0m \u001b[0;34m+=\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# new_domain = {val for val in domains[var]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# if const.holds({var: val})}\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32melif\u001b[0m \u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mother_vars\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mother\u001b[0m \u001b[0;34m=\u001b[0m 
\u001b[0mother_vars\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mval\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mdomains\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mvar\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mother_val\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mdomains\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mother\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mchecks\u001b[0m \u001b[0;34m+=\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mconst\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mholds\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m{\u001b[0m\u001b[0mvar\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0mval\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mother\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0mother_val\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mnew_domain\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0madd\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mval\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mbreak\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# new_domain = {val for val in domains[var]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# if any(const.holds({var: val, other: other_val})\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# for other_val in domains[other])}\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0;31m# general case\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mval\u001b[0m \u001b[0;32min\u001b[0m 
\u001b[0mdomains\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mvar\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mholds\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mchecks\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0many_holds\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mdomains\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mconst\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m{\u001b[0m\u001b[0mvar\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0mval\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mother_vars\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mchecks\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mchecks\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mholds\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mnew_domain\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0madd\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mval\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# new_domain = {val for val in domains[var]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# if self.any_holds(domains, const, {var: val}, other_vars)}\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mnew_domain\u001b[0m \u001b[0;34m!=\u001b[0m \u001b[0mdomains\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mvar\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mdomains\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mvar\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mnew_domain\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0mnew_domain\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m,\u001b[0m 
\u001b[0mdomains\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mchecks\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0madd_to_do\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mnew_to_do\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mvar\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mconst\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdifference\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mto_do\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mto_do\u001b[0m \u001b[0;34m|=\u001b[0m \u001b[0madd_to_do\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mdomains\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mchecks\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource ACSolver.GAC" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": {} + }, + "source": [ + "### Crossword" + ] + }, + { + "cell_type": "code", + "execution_count": 41, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[_] [_] [_] [*] [*] \n", + "[_] [*] [_] [*] [*] \n", + "[_] [_] [_] [_] [*] \n", + "[_] [*] [_] [*] [*] \n", + "[*] [*] [_] [_] [_] \n", + "[*] [*] [_] [*] [*] \n" + ] + }, + { + "data": { + "text/plain": [ + "{'ant',\n", + " 'big',\n", + " 'book',\n", + " 'bus',\n", + " 'buys',\n", + " 'car',\n", + " 'ginger',\n", + " 'has',\n", + " 'hold',\n", + " 'lane',\n", + " 'search',\n", + " 'symbol',\n", + " 'syntax',\n", + " 'year'}" + ] + }, + "execution_count": 41, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "crossword = Crossword(crossword1, words1)\n", + "crossword.display()\n", + "words1" + ] + }, + { + "cell_type": "code", + "execution_count": 36, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + 
"name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 1min 20s, sys: 2.02 ms, total: 1min 20s\n", + "Wall time: 1min 20s\n" + ] + }, + { + "data": { + "text/plain": [ + "'GAC needs 64617645 consistency-checks'" + ] + }, + "execution_count": 36, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time _, _, checks = ACSolver(crossword).GAC(arc_heuristic=no_heuristic)\n", + "f'GAC needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 42, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 1.19 s, sys: 0 ns, total: 1.19 s\n", + "Wall time: 1.19 s\n" + ] + }, + { + "data": { + "text/plain": [ + "'GAC with SAT UP arc heuristic needs 908015 consistency-checks'" + ] + }, + "execution_count": 42, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "crossword = Crossword(crossword1, words1)\n", + "%time _, _, checks = ACSolver(crossword).GAC(arc_heuristic=sat_up)\n", + "f'GAC with SAT UP arc heuristic needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 43, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[B] [U] [S] [*] [*] \n", + "[U] [*] [E] [*] [*] \n", + "[Y] [E] [A] [R] [*] \n", + "[S] [*] [R] [*] [*] \n", + "[*] [*] [C] [A] [R] \n", + "[*] [*] [H] [*] [*] \n" + ] + } + ], + "source": [ + "crossword.display(ACSolver(crossword).domain_splitting())" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": {} + }, + "source": [ + "### Kakuro" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Easy Kakuro" + ] + }, + { + "cell_type": "code", + "execution_count": 44, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[*]\t10\\\t13\\\t[*]\t\n", + 
"\\3\t[_]\t[_]\t13\\\t\n", + "\\12\t[_]\t[_]\t[_]\t\n", + "\\21\t[_]\t[_]\t[_]\t\n" + ] + } + ], + "source": [ + "kakuro = Kakuro(kakuro2)\n", + "kakuro.display()" + ] + }, + { + "cell_type": "code", + "execution_count": 45, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 17.8 ms, sys: 171 µs, total: 18 ms\n", + "Wall time: 16.4 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "'GAC needs 2752 consistency-checks'" + ] + }, + "execution_count": 45, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time _, _, checks = ACSolver(kakuro).GAC(arc_heuristic=no_heuristic)\n", + "f'GAC needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 46, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 8.55 ms, sys: 0 ns, total: 8.55 ms\n", + "Wall time: 8.39 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "'GAC with SAT UP arc heuristic needs 1765 consistency-checks'" + ] + }, + "execution_count": 46, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "kakuro = Kakuro(kakuro2)\n", + "%time _, _, checks = ACSolver(kakuro).GAC(arc_heuristic=sat_up)\n", + "f'GAC with SAT UP arc heuristic needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[*]\t10\\\t13\\\t[*]\t\n", + "\\3\t[1]\t[2]\t13\\\t\n", + "\\12\t[5]\t[3]\t[4]\t\n", + "\\21\t[4]\t[8]\t[9]\t\n" + ] + } + ], + "source": [ + "kakuro.display(ACSolver(kakuro).domain_splitting())" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": {} + }, + "source": [ + "#### Medium Kakuro" + ] + }, + { + "cell_type": "code", + "execution_count": 48, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + 
"name": "stdout", + "output_type": "stream", + "text": [ + "[*]\t17\\\t28\\\t[*]\t42\\\t22\\\t\n", + "\\9\t[_]\t[_]\t31\\14\t[_]\t[_]\t\n", + "\\20\t[_]\t[_]\t[_]\t[_]\t[_]\t\n", + "[*]\t\\30\t[_]\t[_]\t[_]\t[_]\t\n", + "[*]\t22\\24\t[_]\t[_]\t[_]\t[*]\t\n", + "\\25\t[_]\t[_]\t[_]\t[_]\t11\\\t\n", + "\\20\t[_]\t[_]\t[_]\t[_]\t[_]\t\n", + "\\14\t[_]\t[_]\t\\17\t[_]\t[_]\t\n" + ] + } + ], + "source": [ + "kakuro = Kakuro(kakuro3)\n", + "kakuro.display()" + ] + }, + { + "cell_type": "code", + "execution_count": 49, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 1.96 s, sys: 0 ns, total: 1.96 s\n", + "Wall time: 1.96 s\n" + ] + }, + { + "data": { + "text/plain": [ + "'GAC needs 1290179 consistency-checks'" + ] + }, + "execution_count": 49, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time _, _, checks = ACSolver(kakuro).GAC(arc_heuristic=no_heuristic)\n", + "f'GAC needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 50, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 225 ms, sys: 0 ns, total: 225 ms\n", + "Wall time: 223 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "'GAC with SAT UP arc heuristic needs 148780 consistency-checks'" + ] + }, + "execution_count": 50, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "kakuro = Kakuro(kakuro3)\n", + "%time _, _, checks = ACSolver(kakuro).GAC(arc_heuristic=sat_up)\n", + "f'GAC with SAT UP arc heuristic needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 51, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[*]\t17\\\t28\\\t[*]\t42\\\t22\\\t\n", + "\\9\t[8]\t[1]\t31\\14\t[5]\t[9]\t\n", + "\\20\t[9]\t[2]\t[1]\t[3]\t[5]\t\n", + 
"[*]\t\\30\t[6]\t[9]\t[7]\t[8]\t\n", + "[*]\t22\\24\t[7]\t[8]\t[9]\t[*]\t\n", + "\\25\t[8]\t[4]\t[7]\t[6]\t11\\\t\n", + "\\20\t[5]\t[3]\t[6]\t[4]\t[2]\t\n", + "\\14\t[9]\t[5]\t\\17\t[8]\t[9]\t\n" + ] + } + ], + "source": [ + "kakuro.display(ACSolver(kakuro).domain_splitting())" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": {} + }, + "source": [ + "#### Harder Kakuro" + ] + }, + { + "cell_type": "code", + "execution_count": 52, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[*]\t[*]\t[*]\t[*]\t[*]\t4\\\t24\\\t11\\\t[*]\t[*]\t[*]\t11\\\t17\\\t[*]\t[*]\t\n", + "[*]\t[*]\t[*]\t17\\\t11\\12\t[_]\t[_]\t[_]\t[*]\t[*]\t24\\10\t[_]\t[_]\t11\\\t[*]\t\n", + "[*]\t4\\\t16\\26\t[_]\t[_]\t[_]\t[_]\t[_]\t[*]\t\\20\t[_]\t[_]\t[_]\t[_]\t16\\\t\n", + "\\20\t[_]\t[_]\t[_]\t[_]\t24\\13\t[_]\t[_]\t16\\\t\\12\t[_]\t[_]\t23\\10\t[_]\t[_]\t\n", + "\\10\t[_]\t[_]\t24\\12\t[_]\t[_]\t16\\5\t[_]\t[_]\t16\\30\t[_]\t[_]\t[_]\t[_]\t[_]\t\n", + "[*]\t[*]\t3\\26\t[_]\t[_]\t[_]\t[_]\t\\12\t[_]\t[_]\t4\\\t16\\14\t[_]\t[_]\t[*]\t\n", + "[*]\t\\8\t[_]\t[_]\t\\15\t[_]\t[_]\t34\\26\t[_]\t[_]\t[_]\t[_]\t[_]\t[*]\t[*]\t\n", + "[*]\t\\11\t[_]\t[_]\t3\\\t17\\\t\\14\t[_]\t[_]\t\\8\t[_]\t[_]\t7\\\t17\\\t[*]\t\n", + "[*]\t[*]\t[*]\t23\\10\t[_]\t[_]\t3\\9\t[_]\t[_]\t4\\\t23\\\t\\13\t[_]\t[_]\t[*]\t\n", + "[*]\t[*]\t10\\26\t[_]\t[_]\t[_]\t[_]\t[_]\t\\7\t[_]\t[_]\t30\\9\t[_]\t[_]\t[*]\t\n", + "[*]\t17\\11\t[_]\t[_]\t11\\\t24\\8\t[_]\t[_]\t11\\21\t[_]\t[_]\t[_]\t[_]\t16\\\t17\\\t\n", + "\\29\t[_]\t[_]\t[_]\t[_]\t[_]\t\\7\t[_]\t[_]\t23\\14\t[_]\t[_]\t3\\17\t[_]\t[_]\t\n", + "\\10\t[_]\t[_]\t3\\10\t[_]\t[_]\t[*]\t\\8\t[_]\t[_]\t4\\25\t[_]\t[_]\t[_]\t[_]\t\n", + "[*]\t\\16\t[_]\t[_]\t[_]\t[_]\t[*]\t\\23\t[_]\t[_]\t[_]\t[_]\t[_]\t[*]\t[*]\t\n", + "[*]\t[*]\t\\6\t[_]\t[_]\t[*]\t[*]\t\\15\t[_]\t[_]\t[_]\t[*]\t[*]\t[*]\t[*]\t\n" + ] + } + ], + "source": [ + "kakuro = Kakuro(kakuro4)\n", + "kakuro.display()" + ] + }, + { + 
"cell_type": "code", + "execution_count": 53, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 76.5 ms, sys: 847 µs, total: 77.4 ms\n", + "Wall time: 77 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "'GAC needs 46633 consistency-checks'" + ] + }, + "execution_count": 53, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time _, _, checks = ACSolver(kakuro).GAC()\n", + "f'GAC needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 54, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 64.6 ms, sys: 0 ns, total: 64.6 ms\n", + "Wall time: 63.6 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "'GAC with SAT UP arc heuristic needs 36828 consistency-checks'" + ] + }, + "execution_count": 54, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "kakuro = Kakuro(kakuro4)\n", + "%time _, _, checks = ACSolver(kakuro).GAC(arc_heuristic=sat_up)\n", + "f'GAC with SAT UP arc heuristic needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 55, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[*]\t[*]\t[*]\t[*]\t[*]\t4\\\t24\\\t11\\\t[*]\t[*]\t[*]\t11\\\t17\\\t[*]\t[*]\t\n", + "[*]\t[*]\t[*]\t17\\\t11\\12\t[3]\t[7]\t[2]\t[*]\t[*]\t24\\10\t[2]\t[8]\t11\\\t[*]\t\n", + "[*]\t4\\\t16\\26\t[8]\t[5]\t[1]\t[9]\t[3]\t[*]\t\\20\t[8]\t[1]\t[9]\t[2]\t16\\\t\n", + "\\20\t[3]\t[7]\t[9]\t[1]\t24\\13\t[8]\t[5]\t16\\\t\\12\t[9]\t[3]\t23\\10\t[3]\t[7]\t\n", + "\\10\t[1]\t[9]\t24\\12\t[3]\t[9]\t16\\5\t[1]\t[4]\t16\\30\t[7]\t[5]\t[8]\t[1]\t[9]\t\n", + "[*]\t[*]\t3\\26\t[8]\t[2]\t[7]\t[9]\t\\12\t[3]\t[9]\t4\\\t16\\14\t[9]\t[5]\t[*]\t\n", + "[*]\t\\8\t[1]\t[7]\t\\15\t[8]\t[7]\t34\\26\t[1]\t[7]\t[3]\t[9]\t[6]\t[*]\t[*]\t\n", + 
"[*]\t\\11\t[2]\t[9]\t3\\\t17\\\t\\14\t[8]\t[6]\t\\8\t[1]\t[7]\t7\\\t17\\\t[*]\t\n", + "[*]\t[*]\t[*]\t23\\10\t[1]\t[9]\t3\\9\t[7]\t[2]\t4\\\t23\\\t\\13\t[4]\t[9]\t[*]\t\n", + "[*]\t[*]\t10\\26\t[6]\t[2]\t[8]\t[1]\t[9]\t\\7\t[1]\t[6]\t30\\9\t[1]\t[8]\t[*]\t\n", + "[*]\t17\\11\t[3]\t[8]\t11\\\t24\\8\t[2]\t[6]\t11\\21\t[3]\t[9]\t[7]\t[2]\t16\\\t17\\\t\n", + "\\29\t[8]\t[2]\t[9]\t[3]\t[7]\t\\7\t[4]\t[3]\t23\\14\t[8]\t[6]\t3\\17\t[9]\t[8]\t\n", + "\\10\t[9]\t[1]\t3\\10\t[2]\t[8]\t[*]\t\\8\t[2]\t[6]\t4\\25\t[8]\t[1]\t[7]\t[9]\t\n", + "[*]\t\\16\t[4]\t[2]\t[1]\t[9]\t[*]\t\\23\t[1]\t[8]\t[3]\t[9]\t[2]\t[*]\t[*]\t\n", + "[*]\t[*]\t\\6\t[1]\t[5]\t[*]\t[*]\t\\15\t[5]\t[9]\t[1]\t[*]\t[*]\t[*]\t[*]\t\n" + ] + } + ], + "source": [ + "kakuro.display(ACSolver(kakuro).domain_splitting())" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": {} + }, + "source": [ + "### Cryptarithmetic Puzzle" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "$$\n", + "\\begin{array}{@{}r@{}}\n", + " S E N D \\\\\n", + "{} + M O R E \\\\\n", + " \\hline\n", + " M O N E Y\n", + "\\end{array}\n", + "$$" + ] + }, + { + "cell_type": "code", + "execution_count": 57, + "metadata": { + "pycharm": {} + }, + "outputs": [], + "source": [ + "cryptarithmetic = NaryCSP(\n", + " {'S': set(range(1, 10)), 'M': set(range(1, 10)),\n", + " 'E': set(range(0, 10)), 'N': set(range(0, 10)), 'D': set(range(0, 10)),\n", + " 'O': set(range(0, 10)), 'R': set(range(0, 10)), 'Y': set(range(0, 10)),\n", + " 'C1': set(range(0, 2)), 'C2': set(range(0, 2)), 'C3': set(range(0, 2)),\n", + " 'C4': set(range(0, 2))},\n", + " [Constraint(('S', 'E', 'N', 'D', 'M', 'O', 'R', 'Y'), all_diff),\n", + " Constraint(('D', 'E', 'Y', 'C1'), lambda d, e, y, c1: d + e == y + 10 * c1),\n", + " Constraint(('N', 'R', 'E', 'C1', 'C2'), lambda n, r, e, c1, c2: c1 + n + r == e + 10 * c2),\n", + " Constraint(('E', 'O', 'N', 'C2', 'C3'), lambda e, o, n, c2, c3: c2 + e + o == n + 10 * c3),\n", + " Constraint(('S', 
'M', 'O', 'C3', 'C4'), lambda s, m, o, c3, c4: c3 + s + m == o + 10 * c4),\n", + " Constraint(('M', 'C4'), eq)])" + ] + }, + { + "cell_type": "code", + "execution_count": 52, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 21.7 s, sys: 0 ns, total: 21.7 s\n", + "Wall time: 21.7 s\n" + ] + }, + { + "data": { + "text/plain": [ + "'GAC needs 14080592 consistency-checks'" + ] + }, + "execution_count": 52, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time _, _, checks = ACSolver(cryptarithmetic).GAC(arc_heuristic=no_heuristic)\n", + "f'GAC needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 58, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 939 ms, sys: 0 ns, total: 939 ms\n", + "Wall time: 938 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "'GAC with SAT UP arc heuristic needs 573120 consistency-checks'" + ] + }, + "execution_count": 58, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time _, _, checks = ACSolver(cryptarithmetic).GAC(arc_heuristic=sat_up)\n", + "f'GAC with SAT UP arc heuristic needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 59, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "data": { + "text/latex": [ + "\\begin{array}{@{}r@{}} 9567 \\\\ + 1085 \\\\ \\hline 10652 \\end{array}" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "assignment = ACSolver(cryptarithmetic).domain_splitting()\n", + "\n", + "from IPython.display import Latex\n", + "display(Latex(r'\\begin{array}{@{}r@{}} ' + '{}{}{}{}'.format(assignment['S'], assignment['E'], assignment['N'], assignment['D']) + r' \\\\ + ' + \n", + " '{}{}{}{}'.format(assignment['M'], assignment['O'], assignment['R'], 
assignment['E']) + r' \\\\ \\hline ' + \n", + " '{}{}{}{}{}'.format(assignment['M'], assignment['O'], assignment['N'], assignment['E'], assignment['Y']) + ' \\end{array}'))" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": {} + }, + "source": [ + "## References\n", + "\n", + "
    [[1]](#ref-1) Van Dongen, Marc RC. 2002. _Domain-heuristics for arc-consistency algorithms_.\n", + "\n", + "[[2]](#ref-2) Van Dongen, MRC and Bowen, JA. 2000. _Improving arc-consistency algorithms with double-support checks_.\n", + "\n", + "[[3]](#ref-3) Wallace, Richard J and Freuder, Eugene Charles. 1992. _Ordering heuristics for arc consistency algorithms_." + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.7.5rc1" + } + }, + "nbformat": 4, + "nbformat_minor": 4 +} diff --git a/classical_planning_approaches.ipynb b/classical_planning_approaches.ipynb new file mode 100644 index 000000000..b3373b367 --- /dev/null +++ b/classical_planning_approaches.ipynb @@ -0,0 +1,2402 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Classical Planning\n", + "---\n", + "# Classical Planning Approaches\n", + "\n", + "## Introduction \n", + "***Planning*** combines the two major areas of AI: *search* and *logic*. A planner can be seen either as a program that searches for a solution or as one that constructively proves the existence of a solution.\n", + "\n", + "Currently, the most popular and effective approaches to fully automated planning are:\n", + "- searching using a *planning graph*;\n", + "- *state-space search* with heuristics;\n", + "- translating to a *constraint satisfaction (CSP) problem*;\n", + "- translating to a *boolean satisfiability (SAT) problem*." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": {}, + "outputs": [], + "source": [ + "from planning import *" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Planning as Planning Graph Search\n", + "\n", + "A *planning graph* is a directed graph organized into levels, each of which contains information about the current state of the knowledge base and the possible state-action links to and from that level. \n", + "\n", + "The first level contains the initial state, with nodes representing each fluent that holds in that level. This level has state-action links linking each state to valid actions in that state. Each action is linked to all its preconditions and its effect states. Based on these effects, the next level is constructed and contains similarly structured information about the next state. In this way, the graph is expanded using state-action links until we reach a state where all the required goals hold true simultaneously.\n", + "\n", + "In every planning problem, we are allowed to carry out the *no-op* action, i.e., we can choose no action for a particular state. Such actions are called persistence actions and have the same effects as their preconditions. This enables us to carry a state forward to the next level.\n", + "\n", + "Mutual exclusivity (*mutex*) between two actions means that they cannot be taken together; this occurs in the following cases:\n", + "- *inconsistent effects*: one action negates the effect of the other;\n", + "- *interference*: one of the effects of an action is the negation of a precondition of the other;\n", + "- *competing needs*: one of the preconditions of one action is mutually exclusive with a precondition of the other.\n", + "\n", + "We can say that we have reached our goal if none of the goal states in the current level are mutually exclusive."
+ ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mclass\u001b[0m \u001b[0mGraph\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"\u001b[0m\n", + "\u001b[0;34m Contains levels of state and actions\u001b[0m\n", + "\u001b[0;34m Used in graph planning algorithm to extract a solution\u001b[0m\n", + "\u001b[0;34m \"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m__init__\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mplanning_problem\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mplanning_problem\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mplanning_problem\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mkb\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mFolKB\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mplanning_problem\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0minitial\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlevels\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0mLevel\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mkb\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mobjects\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mset\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0marg\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mclause\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mkb\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mclauses\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0marg\u001b[0m 
\u001b[0;32min\u001b[0m \u001b[0mclause\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m__call__\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mexpand_graph\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mexpand_graph\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"Expands the graph by a level\"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mlast_level\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlevels\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mlast_level\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mplanning_problem\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mactions\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mobjects\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlevels\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlast_level\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mperform_actions\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m 
\u001b[0mnon_mutex_goals\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mgoals\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mindex\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"Checks whether the goals are mutually exclusive\"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mgoal_perm\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mitertools\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcombinations\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mgoals\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;36m2\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mg\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mgoal_perm\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mset\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mg\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlevels\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mindex\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mmutex\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource Graph" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mclass\u001b[0m \u001b[0mLevel\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"\u001b[0m\n", + "\u001b[0;34m Contains the state of the planning problem\u001b[0m\n", + "\u001b[0;34m and exhaustive list 
of actions which use the\u001b[0m\n", + "\u001b[0;34m states as pre-condition.\u001b[0m\n", + "\u001b[0;34m \"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m__init__\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mkb\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"Initializes variables to hold state and action details of a level\"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mkb\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mkb\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# current state\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcurrent_state\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mkb\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mclauses\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# current action to state link\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcurrent_action_links\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m{\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# current state to action link\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcurrent_state_links\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m{\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# current action to next state link\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mnext_action_links\u001b[0m \u001b[0;34m=\u001b[0m 
\u001b[0;34m{\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# next state to current action link\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mnext_state_links\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m{\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# mutually exclusive actions\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mmutex\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m__call__\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mactions\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mobjects\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mbuild\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mactions\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mobjects\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mfind_mutex\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mseparate\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0me\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"Separates an iterable of elements into positive and negative parts\"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mpositive\u001b[0m \u001b[0;34m=\u001b[0m 
\u001b[0;34m[\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mnegative\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mclause\u001b[0m \u001b[0;32min\u001b[0m \u001b[0me\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mclause\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mop\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;36m3\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0;34m'Not'\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mnegative\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mpositive\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mpositive\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mnegative\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mfind_mutex\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"Finds mutually exclusive actions\"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# Inconsistent effects\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mpos_nsl\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mneg_nsl\u001b[0m \u001b[0;34m=\u001b[0m 
\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mseparate\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mnext_state_links\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mnegeff\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mneg_nsl\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mnew_negeff\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mExpr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnegeff\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mop\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m3\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m*\u001b[0m\u001b[0mnegeff\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mposeff\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mpos_nsl\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mnew_negeff\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0mposeff\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0ma\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mnext_state_links\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mposeff\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mb\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mnext_state_links\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mnegeff\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0;34m{\u001b[0m\u001b[0ma\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mb\u001b[0m\u001b[0;34m}\u001b[0m \u001b[0;32mnot\u001b[0m 
\u001b[0;32min\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mmutex\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mmutex\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m{\u001b[0m\u001b[0ma\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mb\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# Interference will be calculated with the last step\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mpos_csl\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mneg_csl\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mseparate\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcurrent_state_links\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# Competing needs\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mpos_precond\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mpos_csl\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mneg_precond\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mneg_csl\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mnew_neg_precond\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mExpr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mneg_precond\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mop\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m3\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m*\u001b[0m\u001b[0mneg_precond\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mnew_neg_precond\u001b[0m 
\u001b[0;34m==\u001b[0m \u001b[0mpos_precond\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0ma\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcurrent_state_links\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mpos_precond\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mb\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcurrent_state_links\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mneg_precond\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0;34m{\u001b[0m\u001b[0ma\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mb\u001b[0m\u001b[0;34m}\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mmutex\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mmutex\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m{\u001b[0m\u001b[0ma\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mb\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# Inconsistent support\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mstate_mutex\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mpair\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mmutex\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mnext_state_0\u001b[0m \u001b[0;34m=\u001b[0m 
\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mnext_action_links\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mlist\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mpair\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mpair\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0;36m2\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mnext_state_1\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mnext_action_links\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mlist\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mpair\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mnext_state_1\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mnext_action_links\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mlist\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mpair\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0;34m(\u001b[0m\u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnext_state_0\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mand\u001b[0m \u001b[0;34m(\u001b[0m\u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnext_state_1\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m 
\u001b[0mstate_mutex\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m{\u001b[0m\u001b[0mnext_state_0\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mnext_state_1\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mmutex\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mmutex\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0mstate_mutex\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mbuild\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mactions\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mobjects\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"Populates the lists and dictionaries containing the state action dependencies\"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mclause\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcurrent_state\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mp_expr\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mExpr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'P'\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0mclause\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mop\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m*\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m 
\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcurrent_action_links\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mp_expr\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mnext_action_links\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mp_expr\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcurrent_state_links\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0mp_expr\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mnext_state_links\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0mp_expr\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0ma\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mactions\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mnum_args\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0ma\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mpossible_args\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mtuple\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mitertools\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mpermutations\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mobjects\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mnum_args\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + 
"\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0marg\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mpossible_args\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0ma\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcheck_precond\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mkb\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0marg\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mnum\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0msymbol\u001b[0m \u001b[0;32min\u001b[0m \u001b[0menumerate\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0ma\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0msymbol\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mop\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mislower\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0marg\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mlist\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0marg\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0marg\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mnum\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0msymbol\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0marg\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mtuple\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0marg\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mnew_action\u001b[0m \u001b[0;34m=\u001b[0m 
\u001b[0ma\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0msubstitute\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mExpr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0ma\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mname\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m*\u001b[0m\u001b[0ma\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0marg\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcurrent_action_links\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mnew_action\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mclause\u001b[0m \u001b[0;32min\u001b[0m \u001b[0ma\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mprecond\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mnew_clause\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0ma\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0msubstitute\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0marg\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcurrent_action_links\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mnew_action\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnew_clause\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mnew_clause\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcurrent_state_links\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m 
\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcurrent_state_links\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mnew_clause\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnew_action\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcurrent_state_links\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mnew_clause\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0mnew_action\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mnext_action_links\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mnew_action\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mclause\u001b[0m \u001b[0;32min\u001b[0m \u001b[0ma\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0meffect\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mnew_clause\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0ma\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0msubstitute\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0marg\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mnext_action_links\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mnew_action\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnew_clause\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mnew_clause\u001b[0m \u001b[0;32min\u001b[0m 
\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mnext_state_links\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mnext_state_links\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mnew_clause\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnew_action\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mnext_state_links\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mnew_clause\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0mnew_action\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mperform_actions\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"Performs the necessary actions and returns a new Level\"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mnew_kb\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mFolKB\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlist\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mset\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mnext_state_links\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mkeys\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mLevel\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnew_kb\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } 
+ ], + "source": [ + "%psource Level" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "A *planning graph* can be used to give better heuristic estimates which can be applied to any of the search techniques. Alternatively, we can search for a solution over the space formed by the planning graph, using an algorithm called `GraphPlan`.\n", + "\n", + "The `GraphPlan` algorithm repeatedly adds a level to a planning graph. Once all the goals show up as non-mutex in the graph, the algorithm runs backward from the last level to the first searching for a plan that solves the problem. If that fails, it records the (level , goals) pair as a *no-good* (as in constraint learning for CSPs), expands another level and tries again, terminating with failure when there is no reason to go on. " + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mclass\u001b[0m \u001b[0mGraphPlan\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"\u001b[0m\n", + "\u001b[0;34m Class for formulation GraphPlan algorithm\u001b[0m\n", + "\u001b[0;34m Constructs a graph of state and action space\u001b[0m\n", + "\u001b[0;34m Returns solution for the planning problem\u001b[0m\n", + "\u001b[0;34m \"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m__init__\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mplanning_problem\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgraph\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mGraph\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mplanning_problem\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m 
\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mno_goods\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0msolution\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mcheck_leveloff\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"Checks if the graph has levelled off\"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mcheck\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m(\u001b[0m\u001b[0mset\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgraph\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlevels\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcurrent_state\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0mset\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgraph\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlevels\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m2\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcurrent_state\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mcheck\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m 
\u001b[0mextract_solution\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mgoals\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mindex\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"Extracts the solution\"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mlevel\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgraph\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlevels\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mindex\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgraph\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mnon_mutex_goals\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mgoals\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mindex\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mno_goods\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlevel\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mgoals\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mlevel\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgraph\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlevels\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mindex\u001b[0m \u001b[0;34m-\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# Create all combinations of actions that satisfy the goal\u001b[0m\u001b[0;34m\u001b[0m\n", + 
"\u001b[0;34m\u001b[0m \u001b[0mactions\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mgoal\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mgoals\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mactions\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlevel\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mnext_state_links\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mgoal\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mall_actions\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mlist\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mitertools\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mproduct\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m*\u001b[0m\u001b[0mactions\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# Filter out non-mutex actions\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mnon_mutex_actions\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0maction_tuple\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mall_actions\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0maction_pairs\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mitertools\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcombinations\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlist\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mset\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0maction_tuple\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;36m2\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m 
\u001b[0mnon_mutex_actions\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlist\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mset\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0maction_tuple\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mpair\u001b[0m \u001b[0;32min\u001b[0m \u001b[0maction_pairs\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mset\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mpair\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mlevel\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mmutex\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mnon_mutex_actions\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mpop\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mbreak\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# Recursion\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0maction_list\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mnon_mutex_actions\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0maction_list\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mindex\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0msolution\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0msolution\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0maction_list\u001b[0m\u001b[0;34m,\u001b[0m 
\u001b[0mindex\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mnew_goals\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mact\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mset\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0maction_list\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mact\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mlevel\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcurrent_action_links\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mnew_goals\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mnew_goals\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0mlevel\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcurrent_action_links\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mact\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mabs\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mindex\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0;36m1\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgraph\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlevels\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32melif\u001b[0m \u001b[0;34m(\u001b[0m\u001b[0mlevel\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mnew_goals\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mno_goods\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m 
\u001b[0;32mreturn\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mextract_solution\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnew_goals\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mindex\u001b[0m \u001b[0;34m-\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# Level-Order multiple solutions\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0msolution\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mitem\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0msolution\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mitem\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0;34m-\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0msolution\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0msolution\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mitem\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m 
\u001b[0msolution\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mitem\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mnum\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mitem\u001b[0m \u001b[0;32min\u001b[0m \u001b[0menumerate\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msolution\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mitem\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mreverse\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0msolution\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mnum\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mitem\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0msolution\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mgoal_test\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mkb\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mall\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mkb\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mask\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mq\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0;32mFalse\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mq\u001b[0m \u001b[0;32min\u001b[0m 
\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgraph\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mplanning_problem\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgoals\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mexecute\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"Executes the GraphPlan algorithm for the given problem\"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mwhile\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgraph\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mexpand_graph\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgoal_test\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgraph\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlevels\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mkb\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mand\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgraph\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mnon_mutex_goals\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgraph\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mplanning_problem\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgoals\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m-\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m 
\u001b[0msolution\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mextract_solution\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgraph\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mplanning_problem\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgoals\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m-\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0msolution\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0msolution\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgraph\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlevels\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m>=\u001b[0m \u001b[0;36m2\u001b[0m \u001b[0;32mand\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcheck_leveloff\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;32mNone\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource GraphPlan" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Planning as State-Space Search" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The description of a planning problem defines a search problem: we can search from the initial state through the space of states, looking for a goal. One of the nice advantages of the declarative representation of action schemas is that we can also search backward from the goal, looking for the initial state. 
\n", + "\n", + "However, neither forward nor backward search is efficient without a good heuristic function because the real-world planning problems often have large state spaces. A heuristic function $h(s)$ estimates the distance from a state $s$ to the goal and, if it is admissible, ie if does not overestimate, then we can use $A^∗$ search to find optimal solutions.\n", + "\n", + "Planning uses a factored representation for states and action schemas which makes it possible to define good domain-independent heuristics to prune the search space.\n", + "\n", + "An admissible heuristic can be derived by defining a relaxed problem that is easier to solve. The length of the solution of this easier problem then becomes the heuristic for the original problem. Assume that all goals and preconditions contain only positive literals, ie that the problem is defined according to the *Stanford Research Institute Problem Solver* (STRIPS) notation: we want to create a relaxed version of the original problem that will be easier to solve by ignoring delete lists from all actions, ie removing all negative literals from effects. As shown in [[1]](#cite-hoffmann2001ff) the planning graph of a relaxed problem does not contain any mutex relations at all (which is the crucial thing when building a planning graph) and for this reason GraphPlan will never backtrack looking for a solution: for this reason the **ignore delete lists** heuristic makes it possible to find the optimal solution for relaxed problem in polynomial time through `GraphPlan` algorithm." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": {}, + "outputs": [], + "source": [ + "from search import *" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Forward State-Space Search" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Forward search through the space of states, starting in the initial state and using the problem’s actions to search forward for a member of the set of goal states." + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mclass\u001b[0m \u001b[0mForwardPlan\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msearch\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mProblem\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"\u001b[0m\n", + "\u001b[0;34m [Section 10.2.1]\u001b[0m\n", + "\u001b[0;34m Forward state-space search\u001b[0m\n", + "\u001b[0;34m \"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m__init__\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mplanning_problem\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0msuper\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__init__\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0massociate\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'&'\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mplanning_problem\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0minitial\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0massociate\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'&'\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mplanning_problem\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgoals\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m 
\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mplanning_problem\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mplanning_problem\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mexpanded_actions\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mplanning_problem\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mexpand_actions\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mactions\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstate\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0maction\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0maction\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mexpanded_actions\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mall\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mpre\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mconjuncts\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mstate\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mpre\u001b[0m \u001b[0;32min\u001b[0m \u001b[0maction\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mprecond\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mresult\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstate\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0maction\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0massociate\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'&'\u001b[0m\u001b[0;34m,\u001b[0m 
\u001b[0maction\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mconjuncts\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mstate\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0maction\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mclauses\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mgoal_test\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstate\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mall\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mgoal\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mconjuncts\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mstate\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mgoal\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mplanning_problem\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgoals\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mh\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstate\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"\u001b[0m\n", + "\u001b[0;34m Computes ignore delete lists heuristic by creating a relaxed version of the original problem (we can do that\u001b[0m\n", + "\u001b[0;34m by removing the delete lists from all actions, i.e. 
removing all negative literals from effects) that will be\u001b[0m\n", + "\u001b[0;34m easier to solve through GraphPlan and where the length of the solution will serve as a good heuristic.\u001b[0m\n", + "\u001b[0;34m \"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mrelaxed_planning_problem\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mPlanningProblem\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0minitial\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mstate\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mstate\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mgoals\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgoal\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mactions\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0maction\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mrelaxed\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0maction\u001b[0m \u001b[0;32min\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mplanning_problem\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mactions\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mtry\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlinearize\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mGraphPlan\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mrelaxed_planning_problem\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mexecute\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mexcept\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m 
\u001b[0mfloat\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'inf'\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource ForwardPlan" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Backward Relevant-States Search" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Backward search through sets of relevant states, starting at the set of states representing the goal and using the inverse of the actions to search backward for the initial state." + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mclass\u001b[0m \u001b[0mBackwardPlan\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msearch\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mProblem\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"\u001b[0m\n", + "\u001b[0;34m [Section 10.2.2]\u001b[0m\n", + "\u001b[0;34m Backward relevant-states search\u001b[0m\n", + "\u001b[0;34m \"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m__init__\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mplanning_problem\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0msuper\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__init__\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0massociate\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'&'\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mplanning_problem\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgoals\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0massociate\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'&'\u001b[0m\u001b[0;34m,\u001b[0m 
\u001b[0mplanning_problem\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0minitial\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mplanning_problem\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mplanning_problem\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mexpanded_actions\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mplanning_problem\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mexpand_actions\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mactions\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0msubgoal\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"\u001b[0m\n", + "\u001b[0;34m Returns True if the action is relevant to the subgoal, i.e.:\u001b[0m\n", + "\u001b[0;34m - the action achieves an element of the effects\u001b[0m\n", + "\u001b[0;34m - the action doesn't delete something that needs to be achieved\u001b[0m\n", + "\u001b[0;34m - the preconditions are consistent with other subgoals that need to be achieved\u001b[0m\n", + "\u001b[0;34m \"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mnegate_clause\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mExpr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mop\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mreplace\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'Not'\u001b[0m\u001b[0;34m,\u001b[0m 
\u001b[0;34m''\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m*\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mclause\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mop\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;36m3\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0;34m'Not'\u001b[0m \u001b[0;32melse\u001b[0m \u001b[0mExpr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m'Not'\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0mclause\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mop\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m*\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0msubgoal\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mconjuncts\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msubgoal\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0maction\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0maction\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mexpanded_actions\u001b[0m \u001b[0;32mif\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m(\u001b[0m\u001b[0many\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mprop\u001b[0m \u001b[0;32min\u001b[0m \u001b[0maction\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0meffect\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mprop\u001b[0m \u001b[0;32min\u001b[0m \u001b[0msubgoal\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mand\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0many\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnegate_clause\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mprop\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32min\u001b[0m \u001b[0msubgoal\u001b[0m 
\u001b[0;32mfor\u001b[0m \u001b[0mprop\u001b[0m \u001b[0;32min\u001b[0m \u001b[0maction\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0meffect\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mand\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0many\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnegate_clause\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mprop\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32min\u001b[0m \u001b[0msubgoal\u001b[0m \u001b[0;32mand\u001b[0m \u001b[0mnegate_clause\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mprop\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0;32min\u001b[0m \u001b[0maction\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0meffect\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mprop\u001b[0m \u001b[0;32min\u001b[0m \u001b[0maction\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mprecond\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mresult\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0msubgoal\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0maction\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# g' = (g - effects(a)) + preconds(a)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0massociate\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'&'\u001b[0m\u001b[0;34m,\u001b[0m 
\u001b[0mset\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mset\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mconjuncts\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msubgoal\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdifference\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0maction\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0meffect\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0munion\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0maction\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mprecond\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mgoal_test\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0msubgoal\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mall\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mgoal\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mconjuncts\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgoal\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mgoal\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mconjuncts\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msubgoal\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mh\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0msubgoal\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"\u001b[0m\n", + "\u001b[0;34m Computes ignore delete lists heuristic by creating a relaxed version of the original problem (we can do that\u001b[0m\n", + "\u001b[0;34m by removing the delete lists from all actions, i.e. 
removing all negative literals from effects) that will be\u001b[0m\n", + "\u001b[0;34m easier to solve through GraphPlan and where the length of the solution will serve as a good heuristic.\u001b[0m\n", + "\u001b[0;34m \"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mrelaxed_planning_problem\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mPlanningProblem\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0minitial\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgoal\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mgoals\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0msubgoal\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mstate\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mactions\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0maction\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mrelaxed\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0maction\u001b[0m \u001b[0;32min\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mplanning_problem\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mactions\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mtry\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlinearize\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mGraphPlan\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mrelaxed_planning_problem\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mexecute\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mexcept\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m 
\u001b[0mfloat\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'inf'\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource BackwardPlan" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Planning as Constraint Satisfaction Problem" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In forward planning, the search is constrained by the initial state and only uses the goal as a stopping criterion and as a source for heuristics. In regression planning, the search is constrained by the goal and only uses the start state as a stopping criterion and as a source for heuristics. By converting the problem to a constraint satisfaction problem (CSP), the initial state can be used to prune what is not reachable and the goal to prune what is not useful. The CSP will be defined for a finite number of steps; the number of steps can be adjusted to find the shortest plan. One of the CSP methods can then be used to solve the CSP and thus find a plan.\n", + "\n", + "To construct a CSP from a planning problem, first choose a fixed planning *horizon*, which is the number of time steps over which to plan. Suppose the horizon is \n", + "$k$. The CSP has the following variables:\n", + "\n", + "- a *state variable* for each feature and each time from 0 to $k$. If there are $n$ features for a horizon of $k$, there are $n \\cdot (k+1)$ state variables. The domain of the state variable is the domain of the corresponding feature;\n", + "- an *action variable*, $Action_t$, for each $t$ in the range 0 to $k-1$. 
The domain of $Action_t$ represents the action that takes the agent from the state at time $t$ to the state at time $t+1$.\n", + "\n", + "There are several types of constraints:\n", + "\n", + "- a *precondition constraint* between a state variable at time $t$ and the variable $Action_t$ constrains what actions are legal at time $t$;\n", + "- an *effect constraint* between $Action_t$ and a state variable at time $t+1$ constrains the values of a state variable that is a direct effect of the action;\n", + "- a *frame constraint* among a state variable at time $t$, the variable $Action_t$, and the corresponding state variable at time $t+1$ specifies that a variable that does not change as a result of an action has the same value before and after the action;\n", + "- an *initial-state constraint* constrains a variable on the initial state (at time 0). The initial state is represented as a set of domain constraints on the state variables at time 0;\n", + "- a *goal constraint* constrains the final state to be a state that satisfies the achievement goal. These are domain constraints on the variables that appear in the goal;\n", + "- a *state constraint* is a constraint among variables at the same time step. These can include physical constraints on the state or can ensure that states that violate maintenance goals are forbidden. This is extra knowledge beyond the power of the feature-based or PDDL representations of the action.\n", + "\n", + "The PDDL representation gives precondition, effect and frame constraints for each time \n", + "$t$ as follows:\n", + "\n", + "- for each $Var = v$ in the precondition of action $A$, there is a precondition constraint:\n", + "$$ Var_t = v \leftarrow Action_t = A $$\n", + "that specifies that if the action is to be $A$, $Var_t$ must have value $v$ immediately before. 
This constraint is violated when $Action_t = A$ and $Var_t \neq v$, and thus is equivalent to $\lnot{(Var_t \neq v \land Action_t = A)}$;\n", + "- for each $Var = v$ in the effect of action $A$, there is an effect constraint:\n", + "$$ Var_{t+1} = v \leftarrow Action_t = A $$\n", + "which is violated when $Action_t = A$ and $Var_{t+1} \neq v$, and thus is equivalent to $\lnot{(Var_{t+1} \neq v \land Action_t = A)}$;\n", + "- for each $Var$, there is a frame constraint, where $As$ is the set of actions that include $Var$ in the effect of the action:\n", + "$$ Var_{t+1} = Var_t \leftarrow Action_t \notin As $$\n", + "which specifies that the feature $Var$ has the same value before and after any action that does not affect $Var$.\n", + "\n", + "The CSP representation assumes a fixed planning horizon (i.e. a fixed number of steps). To find a plan over any number of steps, the algorithm can be run for a horizon of $k = 0, 1, 2, \dots$ until a solution is found." + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": {}, + "outputs": [], + "source": [ + "from csp import *" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0mCSPlan\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mplanning_problem\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0msolution_length\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mCSP_solver\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mac_search_solver\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0marc_heuristic\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0msat_up\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"\u001b[0m\n", + "\u001b[0;34m [Section 10.4.3]\u001b[0m\n", + "\u001b[0;34m Planning as Constraint Satisfaction Problem\u001b[0m\n", + "\u001b[0;34m \"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m 
\u001b[0;32mdef\u001b[0m \u001b[0mst\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mvar\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstage\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"Returns a string for the var-stage pair that can be used as a variable\"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mstr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mvar\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0;34m\"_\"\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0mstr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mstage\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mif_\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mv1\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mv2\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"If the second argument is v2, the first argument must be v1\"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mif_fun\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mx1\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mx2\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mx1\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0mv1\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mx2\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0mv2\u001b[0m \u001b[0;32melse\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mif_fun\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__name__\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m\"if the second argument is \"\u001b[0m \u001b[0;34m+\u001b[0m 
\u001b[0mstr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mv2\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0;34m\" then the first argument is \"\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0mstr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mv1\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0;34m\" \"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mif_fun\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0meq_if_not_in_\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mactset\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"First and third arguments are equal if action is not in actset\"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0meq_if_not_in\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mx1\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0ma\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mx2\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mx1\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0mx2\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0ma\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mactset\u001b[0m \u001b[0;32melse\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0meq_if_not_in\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__name__\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m\"first and third arguments are equal if action is not in \"\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0mstr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mactset\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0;34m\" \"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m 
\u001b[0meq_if_not_in\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mexpanded_actions\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mplanning_problem\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mexpand_actions\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mfluent_values\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mplanning_problem\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mexpand_fluents\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mhorizon\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mrange\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msolution_length\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mact_vars\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0mst\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'action'\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstage\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mstage\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mrange\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mhorizon\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mdomains\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m{\u001b[0m\u001b[0mav\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0mlist\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mmap\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;32mlambda\u001b[0m \u001b[0maction\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0mexpr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mstr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0maction\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mexpanded_actions\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mav\u001b[0m \u001b[0;32min\u001b[0m 
\u001b[0mact_vars\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mdomains\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mupdate\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m{\u001b[0m\u001b[0mst\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mvar\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstage\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0;34m{\u001b[0m\u001b[0;32mTrue\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m}\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mvar\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mfluent_values\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mstage\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mrange\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mhorizon\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0;36m2\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# initial state constraints\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mconstraints\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0mConstraint\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mst\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mvar\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;36m0\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mis_\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mval\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0;34m(\u001b[0m\u001b[0mvar\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mval\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32min\u001b[0m \u001b[0;34m{\u001b[0m\u001b[0mexpr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mstr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mfluent\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mreplace\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'Not'\u001b[0m\u001b[0;34m,\u001b[0m 
\u001b[0;34m''\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mTrue\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mfluent\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mop\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;36m3\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m!=\u001b[0m \u001b[0;34m'Not'\u001b[0m \u001b[0;32melse\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mfluent\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mplanning_problem\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0minitial\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mitems\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mconstraints\u001b[0m \u001b[0;34m+=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0mConstraint\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mst\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mvar\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;36m0\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mis_\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;32mFalse\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mvar\u001b[0m \u001b[0;32min\u001b[0m \u001b[0;34m{\u001b[0m\u001b[0mexpr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mstr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mfluent\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mreplace\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'Not'\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m''\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mfluent\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mfluent_values\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mfluent\u001b[0m \u001b[0;32mnot\u001b[0m 
\u001b[0;32min\u001b[0m \u001b[0mplanning_problem\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0minitial\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# goal state constraints\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mconstraints\u001b[0m \u001b[0;34m+=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0mConstraint\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mst\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mvar\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mhorizon\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mis_\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mval\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0;34m(\u001b[0m\u001b[0mvar\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mval\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32min\u001b[0m \u001b[0;34m{\u001b[0m\u001b[0mexpr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mstr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mfluent\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mreplace\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'Not'\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m''\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mTrue\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mfluent\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mop\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;36m3\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m!=\u001b[0m \u001b[0;34m'Not'\u001b[0m \u001b[0;32melse\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mfluent\u001b[0m \u001b[0;32min\u001b[0m 
\u001b[0mplanning_problem\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgoals\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mitems\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# precondition constraints\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mconstraints\u001b[0m \u001b[0;34m+=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0mConstraint\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mst\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mvar\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstage\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mst\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'action'\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstage\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mif_\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mval\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mact\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# st(var, stage) == val if st('action', stage) == act\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mact\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstrps\u001b[0m \u001b[0;32min\u001b[0m \u001b[0;34m{\u001b[0m\u001b[0mexpr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mstr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0maction\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0maction\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0maction\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mexpanded_actions\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mitems\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mvar\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mval\u001b[0m \u001b[0;32min\u001b[0m 
\u001b[0;34m{\u001b[0m\u001b[0mexpr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mstr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mfluent\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mreplace\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'Not'\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m''\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mTrue\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mfluent\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mop\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;36m3\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m!=\u001b[0m \u001b[0;34m'Not'\u001b[0m \u001b[0;32melse\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mfluent\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mstrps\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mprecond\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mitems\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mstage\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mrange\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mhorizon\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# effect constraints\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mconstraints\u001b[0m \u001b[0;34m+=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0mConstraint\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mst\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mvar\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstage\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mst\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'action'\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstage\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m 
\u001b[0mif_\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mval\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mact\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# st(var, stage + 1) == val if st('action', stage) == act\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mact\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstrps\u001b[0m \u001b[0;32min\u001b[0m \u001b[0;34m{\u001b[0m\u001b[0mexpr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mstr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0maction\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0maction\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0maction\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mexpanded_actions\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mitems\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mvar\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mval\u001b[0m \u001b[0;32min\u001b[0m \u001b[0;34m{\u001b[0m\u001b[0mexpr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mstr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mfluent\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mreplace\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'Not'\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m''\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0;32mTrue\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mfluent\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mop\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;36m3\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m!=\u001b[0m \u001b[0;34m'Not'\u001b[0m \u001b[0;32melse\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mfluent\u001b[0m \u001b[0;32min\u001b[0m 
\u001b[0mstrps\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0meffect\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mitems\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mstage\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mrange\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mhorizon\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# frame constraints\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mconstraints\u001b[0m \u001b[0;34m+=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0mConstraint\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mst\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mvar\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstage\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mst\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'action'\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstage\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mst\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mvar\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstage\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0meq_if_not_in_\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mset\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mmap\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;32mlambda\u001b[0m \u001b[0maction\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0mexpr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mstr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0maction\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m{\u001b[0m\u001b[0mact\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mact\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mexpanded_actions\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mvar\u001b[0m 
\u001b[0;32min\u001b[0m \u001b[0mact\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0meffect\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mor\u001b[0m \u001b[0mExpr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'Not'\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0mvar\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mop\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m*\u001b[0m\u001b[0mvar\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mact\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0meffect\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mvar\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mfluent_values\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mstage\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mrange\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mhorizon\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mcsp\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mNaryCSP\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mdomains\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mconstraints\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0msol\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mCSP_solver\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mcsp\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0marc_heuristic\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0marc_heuristic\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0msol\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0msol\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0ma\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0ma\u001b[0m \u001b[0;32min\u001b[0m 
\u001b[0mact_vars\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource CSPlan" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Planning as Boolean Satisfiability Problem\n", + "\n", + "As shown in [[2]](cite-kautz1992planning) the translation of a *Planning Domain Definition Language* (PDDL) description into a *Conjunctive Normal Form* (CNF) formula is a series of straightforward steps:\n", + "- *propositionalize the actions*: replace each action schema with a set of ground actions formed by substituting constants for each of the variables. These ground actions are not part of the translation, but will be used in subsequent steps;\n", + "- *define the initial state*: assert $F^0$ for every fluent $F$ in the problem’s initial state, and $\\lnot{F}$ for every fluent not mentioned in the initial state;\n", + "- *propositionalize the goal*: for every variable in the goal, replace the literals that contain the variable with a disjunction over constants;\n", + "- *add successor-state axioms*: for each fluent $F$, add an axiom of the form\n", + "\n", + "$$ F^{t+1} \\iff ActionCausesF^t \\lor (F^t \\land \\lnot{ActionCausesNotF^t}) $$\n", + "\n", + "where $ActionCausesF$ is a disjunction of all the ground actions that have $F$ in their add list, and $ActionCausesNotF$ is a disjunction of all the ground actions that have $F$ in their delete list;\n", + "- *add precondition axioms*: for each ground action $A$, add the axiom $A^t \\implies PRE(A)^t$, that is, if an action is taken at time $t$, then the preconditions must have been true;\n", + "- *add action exclusion axioms*: say that every action is distinct from every other action.\n", + "\n", + "A propositional planning procedure implements the basic idea just given but, because the agent does not know how many steps it will take to reach the goal, the algorithm tries 
each possible number of steps $t$, up to some maximum conceivable plan length $T_{max}$. In this way, it is guaranteed to find the shortest plan if one exists. Because of the way the propositional planning procedure searches for a solution, this approach cannot be used in a partially observable environment: a satisfiability solver such as WalkSAT would simply set the unobservable variables to whatever values it needs to create a solution."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 10,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "from logic import *"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 11,
+   "metadata": {},
+   "outputs": [
+    {
+     "data": {
+      "text/plain": [
+       "\u001b[0;32mdef\u001b[0m \u001b[0mSATPlan\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mplanning_problem\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0msolution_length\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mSAT_solver\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mcdcl_satisfiable\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n",
+       "\u001b[0;34m\u001b[0m    \u001b[0;34m\"\"\"\u001b[0m\n",
+       "\u001b[0;34m    [Section 10.4.1]\u001b[0m\n",
+       "\u001b[0;34m    Planning as Boolean satisfiability\u001b[0m\n",
+       "\u001b[0;34m    \"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n",
+       "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n",
+       "\u001b[0;34m\u001b[0m    \u001b[0;32mdef\u001b[0m \u001b[0mexpand_transitions\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mstate\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mactions\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n",
+       "\u001b[0;34m\u001b[0m        \u001b[0mstate\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0msorted\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mconjuncts\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mstate\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n",
+       "\u001b[0;34m\u001b[0m        \u001b[0;32mfor\u001b[0m \u001b[0maction\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mfilter\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;32mlambda\u001b[0m 
\u001b[0mact\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0mact\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcheck_precond\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mstate\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mact\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mactions\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mtransition\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0massociate\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'&'\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstate\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mupdate\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m{\u001b[0m\u001b[0mExpr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0maction\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mname\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m*\u001b[0m\u001b[0maction\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0massociate\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'&'\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0msorted\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mset\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mfilter\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;32mlambda\u001b[0m \u001b[0mclause\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0mclause\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mop\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;36m3\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m!=\u001b[0m \u001b[0;34m'Not'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0maction\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mstate\u001b[0m\u001b[0;34m,\u001b[0m 
\u001b[0maction\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mclauses\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mplanning_problem\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mis_strips\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32melse\u001b[0m \u001b[0massociate\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'&'\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0msorted\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mset\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0maction\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mstate\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0maction\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mclauses\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mstate\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mtransition\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0massociate\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'&'\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstate\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mvalues\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mstate\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mtransition\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mexpand_transitions\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mexpr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mstate\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mactions\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + 
"\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mtransition\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mdefaultdict\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mdict\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mexpand_transitions\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0massociate\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'&'\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mplanning_problem\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0minitial\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mplanning_problem\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mexpand_actions\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mSAT_plan\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0massociate\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'&'\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0msorted\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mplanning_problem\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0minitial\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mtransition\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0massociate\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'&'\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0msorted\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mplanning_problem\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgoals\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0msolution_length\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mSAT_solver\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mSAT_solver\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource SATPlan" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": {}, + "outputs": [ + { + "data": { + 
"text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0mSAT_plan\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0minit\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mtransition\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mgoal\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mt_max\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mSAT_solver\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mcdcl_satisfiable\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"Converts a planning problem to Satisfaction problem by translating it to a cnf sentence.\u001b[0m\n", + "\u001b[0;34m [Figure 7.22]\u001b[0m\n", + "\u001b[0;34m >>> transition = {'A': {'Left': 'A', 'Right': 'B'}, 'B': {'Left': 'A', 'Right': 'C'}, 'C': {'Left': 'B', 'Right': 'C'}}\u001b[0m\n", + "\u001b[0;34m >>> SAT_plan('A', transition, 'C', 1) is None\u001b[0m\n", + "\u001b[0;34m True\u001b[0m\n", + "\u001b[0;34m \"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# Functions used by SAT_plan\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mtranslate_to_SAT\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0minit\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mtransition\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mgoal\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mtime\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mclauses\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mstates\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0mstate\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mstate\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mtransition\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# Symbol claiming state s at time t\u001b[0m\u001b[0;34m\u001b[0m\n", + 
"\u001b[0;34m\u001b[0m \u001b[0mstate_counter\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mitertools\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcount\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0ms\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mstates\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mt\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mrange\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mtime\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mstate_sym\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0ms\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mt\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mExpr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m\"S{}\"\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mformat\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnext\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mstate_counter\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# Add initial state axiom\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mstate_sym\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0minit\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;36m0\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# Add goal state axiom\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m 
\u001b[0mclauses\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mstate_sym\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mfirst\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mclause\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mstate_sym\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mset\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mconjuncts\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0missuperset\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mconjuncts\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mgoal\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mtime\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m \\\n", + " \u001b[0;32mif\u001b[0m \u001b[0misinstance\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mgoal\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mExpr\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32melse\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mstate_sym\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mgoal\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mtime\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# All possible transitions\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mtransition_counter\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mitertools\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcount\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0ms\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mstates\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", 
+ "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0maction\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mtransition\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0ms\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0ms_\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mtransition\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0ms\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0maction\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mt\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mrange\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mtime\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# Action 'action' taken from state 's' at time 't' to reach 's_'\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0maction_sym\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0ms\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0maction\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mt\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mExpr\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m\"T{}\"\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mformat\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnext\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mtransition_counter\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# Change the state from s to s_\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0maction_sym\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0ms\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0maction\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mt\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m|\u001b[0m \u001b[0;34m'==>'\u001b[0m \u001b[0;34m|\u001b[0m 
\u001b[0mstate_sym\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0ms\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mt\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0maction_sym\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0ms\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0maction\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mt\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m|\u001b[0m \u001b[0;34m'==>'\u001b[0m \u001b[0;34m|\u001b[0m \u001b[0mstate_sym\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0ms_\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mt\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# Allow only one state at any time\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mt\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mrange\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mtime\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# must be a state at any time\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0massociate\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'|'\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0mstate_sym\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0ms\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mt\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0ms\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mstates\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0ms\u001b[0m 
\u001b[0;32min\u001b[0m \u001b[0mstates\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0ms_\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mstates\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mstates\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mindex\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0ms\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# for each pair of states s, s_ only one is possible at time t\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m~\u001b[0m\u001b[0mstate_sym\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0ms\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mt\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m|\u001b[0m \u001b[0;34m(\u001b[0m\u001b[0;34m~\u001b[0m\u001b[0mstate_sym\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0ms_\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mt\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# Restrict to one transition per timestep\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mt\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mrange\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mtime\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# list of possible transitions at time t\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mtransitions_t\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0mtr\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mtr\u001b[0m \u001b[0;32min\u001b[0m \u001b[0maction_sym\u001b[0m \u001b[0;32mif\u001b[0m 
\u001b[0mtr\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m2\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0mt\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# make sure at least one of the transitions happens\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0massociate\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'|'\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0maction_sym\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mtr\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mtr\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mtransitions_t\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mtr\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mtransitions_t\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mtr_\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mtransitions_t\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mtransitions_t\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mindex\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mtr\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# there cannot be two transitions tr and tr_ at time t\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m~\u001b[0m\u001b[0maction_sym\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mtr\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m|\u001b[0m 
\u001b[0;34m~\u001b[0m\u001b[0maction_sym\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mtr_\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# Combine the clauses to form the cnf\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0massociate\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'&'\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mextract_solution\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mmodel\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mtrue_transitions\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0mt\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mt\u001b[0m \u001b[0;32min\u001b[0m \u001b[0maction_sym\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0maction_sym\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mt\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# Sort transitions based on time, which is the 3rd element of the tuple\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mtrue_transitions\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0msort\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mkey\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mlambda\u001b[0m \u001b[0mx\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0mx\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m2\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0maction\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0ms\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0maction\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mtime\u001b[0m 
\u001b[0;32min\u001b[0m \u001b[0mtrue_transitions\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# Body of SAT_plan algorithm\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mt\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mrange\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mt_max\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# dictionaries to help extract the solution from model\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mstate_sym\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m{\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0maction_sym\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m{\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mcnf\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mtranslate_to_SAT\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0minit\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mtransition\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mgoal\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mt\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mmodel\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mSAT_solver\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mcnf\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mmodel\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mextract_solution\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mmodel\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m 
\u001b[0;32mNone\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource SAT_plan" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": {} + }, + "source": [ + "## Experimental Results" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Blocks World" + ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0mthree_block_tower\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"\u001b[0m\n", + "\u001b[0;34m [Figure 10.3] THREE-BLOCK-TOWER\u001b[0m\n", + "\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m A blocks-world problem of stacking three blocks in a certain configuration,\u001b[0m\n", + "\u001b[0;34m also known as the Sussman Anomaly.\u001b[0m\n", + "\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m Example:\u001b[0m\n", + "\u001b[0;34m >>> from planning import *\u001b[0m\n", + "\u001b[0;34m >>> tbt = three_block_tower()\u001b[0m\n", + "\u001b[0;34m >>> tbt.goal_test()\u001b[0m\n", + "\u001b[0;34m False\u001b[0m\n", + "\u001b[0;34m >>> tbt.act(expr('MoveToTable(C, A)'))\u001b[0m\n", + "\u001b[0;34m >>> tbt.act(expr('Move(B, Table, C)'))\u001b[0m\n", + "\u001b[0;34m >>> tbt.goal_test()\u001b[0m\n", + "\u001b[0;34m False\u001b[0m\n", + "\u001b[0;34m >>> tbt.act(expr('Move(A, Table, B)'))\u001b[0m\n", + "\u001b[0;34m >>> tbt.goal_test()\u001b[0m\n", + "\u001b[0;34m True\u001b[0m\n", + "\u001b[0;34m >>>\u001b[0m\n", + "\u001b[0;34m \"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mPlanningProblem\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0minitial\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'On(A, Table) & On(B, Table) & On(C, A) & Clear(B) & 
Clear(C)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mgoals\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'On(A, B) & On(B, C)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mactions\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mAction\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'Move(b, x, y)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mprecond\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'On(b, x) & Clear(b) & Clear(y)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0meffect\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'On(b, y) & Clear(x) & ~On(b, x) & ~Clear(y)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mdomain\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'Block(b) & Block(y)'\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mAction\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'MoveToTable(b, x)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mprecond\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'On(b, x) & Clear(b)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0meffect\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'On(b, Table) & Clear(x) & ~On(b, x)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mdomain\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'Block(b) & Block(x)'\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mdomain\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'Block(A) & Block(B) & Block(C)'\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource 
three_block_tower" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### GraphPlan" + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 4.46 ms, sys: 124 µs, total: 4.59 ms\n", + "Wall time: 4.48 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "[MoveToTable(C, A), Move(B, Table, C), Move(A, Table, B)]" + ] + }, + "execution_count": 14, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time blocks_world_solution = GraphPlan(three_block_tower()).execute()\n", + "linearize(blocks_world_solution)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### ForwardPlan" + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "14 paths have been expanded and 28 paths remain in the frontier\n", + "CPU times: user 91 ms, sys: 0 ns, total: 91 ms\n", + "Wall time: 89.8 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "[MoveToTable(C, A), Move(B, Table, C), Move(A, Table, B)]" + ] + }, + "execution_count": 15, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time blocks_world_solution = uniform_cost_search(ForwardPlan(three_block_tower()), display=True).solution()\n", + "blocks_world_solution = list(map(lambda action: Expr(action.name, *action.args), blocks_world_solution))\n", + "blocks_world_solution" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### ForwardPlan with Ignore Delete Lists Heuristic" + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "3 paths have been expanded and 9 paths remain in the frontier\n", + "CPU times: user 81.3 ms, sys: 3.11 ms, total: 84.5 ms\n", + "Wall time: 83 ms\n" + ] + }, + { 
+ "data": { + "text/plain": [ + "[MoveToTable(C, A), Move(B, Table, C), Move(A, Table, B)]" + ] + }, + "execution_count": 16, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time blocks_world_solution = astar_search(ForwardPlan(three_block_tower()), display=True).solution()\n", + "blocks_world_solution = list(map(lambda action: Expr(action.name, *action.args), blocks_world_solution))\n", + "blocks_world_solution" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### BackwardPlan" + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "116 paths have been expanded and 289 paths remain in the frontier\n", + "CPU times: user 266 ms, sys: 718 µs, total: 267 ms\n", + "Wall time: 265 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "[MoveToTable(C, A), Move(B, Table, C), Move(A, Table, B)]" + ] + }, + "execution_count": 17, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time blocks_world_solution = uniform_cost_search(BackwardPlan(three_block_tower()), display=True).solution()\n", + "blocks_world_solution = list(map(lambda action: Expr(action.name, *action.args), blocks_world_solution))\n", + "blocks_world_solution[::-1]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### BackwardPlan with Ignore Delete Lists Heuristic" + ] + }, + { + "cell_type": "code", + "execution_count": 18, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "4 paths have been expanded and 20 paths remain in the frontier\n", + "CPU times: user 477 ms, sys: 450 µs, total: 477 ms\n", + "Wall time: 476 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "[MoveToTable(C, A), Move(B, Table, C), Move(A, Table, B)]" + ] + }, + "execution_count": 18, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time 
blocks_world_solution = astar_search(BackwardPlan(three_block_tower()), display=True).solution()\n", + "blocks_world_solution = list(map(lambda action: Expr(action.name, *action.args), blocks_world_solution))\n", + "blocks_world_solution[::-1]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### CSPlan" + ] + }, + { + "cell_type": "code", + "execution_count": 19, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 172 ms, sys: 4.52 ms, total: 176 ms\n", + "Wall time: 175 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "[MoveToTable(C, A), Move(B, Table, C), Move(A, Table, B)]" + ] + }, + "execution_count": 19, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time blocks_world_solution = CSPlan(three_block_tower(), 3, arc_heuristic=no_heuristic)\n", + "blocks_world_solution" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### CSPlan with SAT UP Arc Heuristic" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 267 ms, sys: 0 ns, total: 267 ms\n", + "Wall time: 266 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "[MoveToTable(C, A), Move(B, Table, C), Move(A, Table, B)]" + ] + }, + "execution_count": 20, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time blocks_world_solution = CSPlan(three_block_tower(), 3, arc_heuristic=sat_up)\n", + "blocks_world_solution" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### SATPlan with DPLL" + ] + }, + { + "cell_type": "code", + "execution_count": 27, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 34.9 s, sys: 15.9 ms, total: 34.9 s\n", + "Wall time: 34.9 s\n" + ] + }, + { + "data": { + "text/plain": [ + "[MoveToTable(C, A), 
Move(B, Table, C), Move(A, Table, B)]" + ] + }, + "execution_count": 27, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time blocks_world_solution = SATPlan(three_block_tower(), 4, SAT_solver=dpll_satisfiable)\n", + "blocks_world_solution" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### SATPlan with CDCL" + ] + }, + { + "cell_type": "code", + "execution_count": 28, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 1.15 s, sys: 4.01 ms, total: 1.15 s\n", + "Wall time: 1.15 s\n" + ] + }, + { + "data": { + "text/plain": [ + "[MoveToTable(C, A), Move(B, Table, C), Move(A, Table, B)]" + ] + }, + "execution_count": 28, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time blocks_world_solution = SATPlan(three_block_tower(), 4, SAT_solver=cdcl_satisfiable)\n", + "blocks_world_solution" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Spare Tire" + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0mspare_tire\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"\u001b[0m\n", + "\u001b[0;34m [Figure 10.2] SPARE-TIRE-PROBLEM\u001b[0m\n", + "\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m A problem involving changing the flat tire of a car\u001b[0m\n", + "\u001b[0;34m with a spare tire from the trunk.\u001b[0m\n", + "\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m Example:\u001b[0m\n", + "\u001b[0;34m >>> from planning import *\u001b[0m\n", + "\u001b[0;34m >>> st = spare_tire()\u001b[0m\n", + "\u001b[0;34m >>> st.goal_test()\u001b[0m\n", + "\u001b[0;34m False\u001b[0m\n", + "\u001b[0;34m >>> st.act(expr('Remove(Spare, Trunk)'))\u001b[0m\n", + "\u001b[0;34m >>> st.act(expr('Remove(Flat, 
Axle)'))\u001b[0m\n", + "\u001b[0;34m >>> st.goal_test()\u001b[0m\n", + "\u001b[0;34m False\u001b[0m\n", + "\u001b[0;34m >>> st.act(expr('PutOn(Spare, Axle)'))\u001b[0m\n", + "\u001b[0;34m >>> st.goal_test()\u001b[0m\n", + "\u001b[0;34m True\u001b[0m\n", + "\u001b[0;34m >>>\u001b[0m\n", + "\u001b[0;34m \"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mPlanningProblem\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0minitial\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'At(Flat, Axle) & At(Spare, Trunk)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mgoals\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'At(Spare, Axle) & At(Flat, Ground)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mactions\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mAction\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'Remove(obj, loc)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mprecond\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'At(obj, loc)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0meffect\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'At(obj, Ground) & ~At(obj, loc)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mdomain\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'Tire(obj)'\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mAction\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'PutOn(t, Axle)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mprecond\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'At(t, Ground) & ~At(Flat, Axle)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0meffect\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'At(t, Axle) & ~At(t, 
Ground)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mdomain\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'Tire(t)'\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mAction\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'LeaveOvernight'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mprecond\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m''\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0meffect\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'~At(Spare, Ground) & ~At(Spare, Axle) & ~At(Spare, Trunk) & \\\u001b[0m\n", + "\u001b[0;34m ~At(Flat, Ground) & ~At(Flat, Axle) & ~At(Flat, Trunk)'\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mdomain\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'Tire(Flat) & Tire(Spare)'\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource spare_tire" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### GraphPlan" + ] + }, + { + "cell_type": "code", + "execution_count": 29, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 4.24 ms, sys: 1 µs, total: 4.24 ms\n", + "Wall time: 4.16 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "[Remove(Flat, Axle), Remove(Spare, Trunk), PutOn(Spare, Axle)]" + ] + }, + "execution_count": 29, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time spare_tire_solution = GraphPlan(spare_tire()).execute()\n", + "linearize(spare_tire_solution)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### ForwardPlan" + ] + }, + { + "cell_type": "code", + "execution_count": 30, + "metadata": {}, + 
"outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "11 paths have been expanded and 9 paths remain in the frontier\n", + "CPU times: user 10.3 ms, sys: 0 ns, total: 10.3 ms\n", + "Wall time: 9.89 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "[Remove(Flat, Axle), Remove(Spare, Trunk), PutOn(Spare, Axle)]" + ] + }, + "execution_count": 30, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time spare_tire_solution = uniform_cost_search(ForwardPlan(spare_tire()), display=True).solution()\n", + "spare_tire_solution = list(map(lambda action: Expr(action.name, *action.args), spare_tire_solution))\n", + "spare_tire_solution" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### ForwardPlan with Ignore Delete Lists Heuristic" + ] + }, + { + "cell_type": "code", + "execution_count": 31, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "5 paths have been expanded and 8 paths remain in the frontier\n", + "CPU times: user 20.4 ms, sys: 1 µs, total: 20.4 ms\n", + "Wall time: 19.4 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "[Remove(Flat, Axle), Remove(Spare, Trunk), PutOn(Spare, Axle)]" + ] + }, + "execution_count": 31, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time spare_tire_solution = astar_search(ForwardPlan(spare_tire()), display=True).solution()\n", + "spare_tire_solution = list(map(lambda action: Expr(action.name, *action.args), spare_tire_solution))\n", + "spare_tire_solution" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### BackwardPlan" + ] + }, + { + "cell_type": "code", + "execution_count": 32, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "29 paths have been expanded and 22 paths remain in the frontier\n", + "CPU times: user 22.2 ms, sys: 7 µs, total: 22.2 ms\n", + "Wall time: 21.3 ms\n" + ] + }, + { + 
"data": { + "text/plain": [ + "[Remove(Flat, Axle), Remove(Spare, Trunk), PutOn(Spare, Axle)]" + ] + }, + "execution_count": 32, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time spare_tire_solution = uniform_cost_search(BackwardPlan(spare_tire()), display=True).solution()\n", + "spare_tire_solution = list(map(lambda action: Expr(action.name, *action.args), spare_tire_solution))\n", + "spare_tire_solution[::-1]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### BackwardPlan with Ignore Delete Lists Heuristic" + ] + }, + { + "cell_type": "code", + "execution_count": 33, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "3 paths have been expanded and 11 paths remain in the frontier\n", + "CPU times: user 13 ms, sys: 0 ns, total: 13 ms\n", + "Wall time: 12.5 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "[Remove(Spare, Trunk), Remove(Flat, Axle), PutOn(Spare, Axle)]" + ] + }, + "execution_count": 33, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time spare_tire_solution = astar_search(BackwardPlan(spare_tire()), display=True).solution()\n", + "spare_tire_solution = list(map(lambda action: Expr(action.name, *action.args), spare_tire_solution))\n", + "spare_tire_solution[::-1]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### CSPlan" + ] + }, + { + "cell_type": "code", + "execution_count": 34, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 94.7 ms, sys: 0 ns, total: 94.7 ms\n", + "Wall time: 93.2 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "[Remove(Spare, Trunk), Remove(Flat, Axle), PutOn(Spare, Axle)]" + ] + }, + "execution_count": 34, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time spare_tire_solution = CSPlan(spare_tire(), 3, arc_heuristic=no_heuristic)\n", + 
"spare_tire_solution" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### CSPlan with SAT UP Arc Heuristic" + ] + }, + { + "cell_type": "code", + "execution_count": 35, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 119 ms, sys: 0 ns, total: 119 ms\n", + "Wall time: 118 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "[Remove(Spare, Trunk), Remove(Flat, Axle), PutOn(Spare, Axle)]" + ] + }, + "execution_count": 35, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time spare_tire_solution = CSPlan(spare_tire(), 3, arc_heuristic=sat_up)\n", + "spare_tire_solution" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### SATPlan with DPLL" + ] + }, + { + "cell_type": "code", + "execution_count": 36, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 9.01 s, sys: 3.98 ms, total: 9.01 s\n", + "Wall time: 9.01 s\n" + ] + }, + { + "data": { + "text/plain": [ + "[Remove(Flat, Axle), Remove(Spare, Trunk), PutOn(Spare, Axle)]" + ] + }, + "execution_count": 36, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time spare_tire_solution = SATPlan(spare_tire(), 4, SAT_solver=dpll_satisfiable)\n", + "spare_tire_solution" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### SATPlan with CDCL" + ] + }, + { + "cell_type": "code", + "execution_count": 37, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 630 ms, sys: 6 µs, total: 630 ms\n", + "Wall time: 628 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "[Remove(Spare, Trunk), Remove(Flat, Axle), PutOn(Spare, Axle)]" + ] + }, + "execution_count": 37, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time spare_tire_solution = SATPlan(spare_tire(), 4, 
SAT_solver=cdcl_satisfiable)\n", + "spare_tire_solution" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Shopping Problem" + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0mshopping_problem\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"\u001b[0m\n", + "\u001b[0;34m SHOPPING-PROBLEM\u001b[0m\n", + "\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m A problem of acquiring some items given their availability at certain stores.\u001b[0m\n", + "\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m Example:\u001b[0m\n", + "\u001b[0;34m >>> from planning import *\u001b[0m\n", + "\u001b[0;34m >>> sp = shopping_problem()\u001b[0m\n", + "\u001b[0;34m >>> sp.goal_test()\u001b[0m\n", + "\u001b[0;34m False\u001b[0m\n", + "\u001b[0;34m >>> sp.act(expr('Go(Home, HW)'))\u001b[0m\n", + "\u001b[0;34m >>> sp.act(expr('Buy(Drill, HW)'))\u001b[0m\n", + "\u001b[0;34m >>> sp.act(expr('Go(HW, SM)'))\u001b[0m\n", + "\u001b[0;34m >>> sp.act(expr('Buy(Banana, SM)'))\u001b[0m\n", + "\u001b[0;34m >>> sp.goal_test()\u001b[0m\n", + "\u001b[0;34m False\u001b[0m\n", + "\u001b[0;34m >>> sp.act(expr('Buy(Milk, SM)'))\u001b[0m\n", + "\u001b[0;34m >>> sp.goal_test()\u001b[0m\n", + "\u001b[0;34m True\u001b[0m\n", + "\u001b[0;34m >>>\u001b[0m\n", + "\u001b[0;34m \"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mPlanningProblem\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0minitial\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'At(Home) & Sells(SM, Milk) & Sells(SM, Banana) & Sells(HW, Drill)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mgoals\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'Have(Milk) & Have(Banana) & 
Have(Drill)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mactions\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mAction\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'Buy(x, store)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mprecond\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'At(store) & Sells(store, x)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0meffect\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'Have(x)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mdomain\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'Store(store) & Item(x)'\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mAction\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'Go(x, y)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mprecond\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'At(x)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0meffect\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'At(y) & ~At(x)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mdomain\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'Place(x) & Place(y)'\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mdomain\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'Place(Home) & Place(SM) & Place(HW) & Store(SM) & Store(HW) & '\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m'Item(Milk) & Item(Banana) & Item(Drill)'\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource shopping_problem" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### 
GraphPlan" + ] + }, + { + "cell_type": "code", + "execution_count": 45, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 5.08 ms, sys: 3 µs, total: 5.08 ms\n", + "Wall time: 5.03 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "[Go(Home, HW), Go(Home, SM), Buy(Milk, SM), Buy(Drill, HW), Buy(Banana, SM)]" + ] + }, + "execution_count": 45, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time shopping_problem_solution = GraphPlan(shopping_problem()).execute()\n", + "linearize(shopping_problem_solution)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### ForwardPlan" + ] + }, + { + "cell_type": "code", + "execution_count": 46, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "167 paths have been expanded and 257 paths remain in the frontier\n", + "CPU times: user 187 ms, sys: 4.01 ms, total: 191 ms\n", + "Wall time: 190 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "[Go(Home, SM), Buy(Banana, SM), Buy(Milk, SM), Go(SM, HW), Buy(Drill, HW)]" + ] + }, + "execution_count": 46, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time shopping_problem_solution = uniform_cost_search(ForwardPlan(shopping_problem()), display=True).solution()\n", + "shopping_problem_solution = list(map(lambda action: Expr(action.name, *action.args), shopping_problem_solution))\n", + "shopping_problem_solution" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### ForwardPlan with Ignore Delete Lists Heuristic" + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "9 paths have been expanded and 22 paths remain in the frontier\n", + "CPU times: user 101 ms, sys: 3 µs, total: 101 ms\n", + "Wall time: 100 ms\n" + ] + }, + { + "data": { + "text/plain": [ + 
"[Go(Home, SM), Buy(Banana, SM), Buy(Milk, SM), Go(SM, HW), Buy(Drill, HW)]" + ] + }, + "execution_count": 47, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time shopping_problem_solution = astar_search(ForwardPlan(shopping_problem()), display=True).solution()\n", + "shopping_problem_solution = list(map(lambda action: Expr(action.name, *action.args), shopping_problem_solution))\n", + "shopping_problem_solution" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### BackwardPlan" + ] + }, + { + "cell_type": "code", + "execution_count": 48, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "176 paths have been expanded and 7 paths remain in the frontier\n", + "CPU times: user 109 ms, sys: 2 µs, total: 109 ms\n", + "Wall time: 107 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "[Go(Home, HW), Buy(Drill, HW), Go(HW, SM), Buy(Milk, SM), Buy(Banana, SM)]" + ] + }, + "execution_count": 48, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time shopping_problem_solution = uniform_cost_search(BackwardPlan(shopping_problem()), display=True).solution()\n", + "shopping_problem_solution = list(map(lambda action: Expr(action.name, *action.args), shopping_problem_solution))\n", + "shopping_problem_solution[::-1]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### BackwardPlan with Ignore Delete Lists Heuristic" + ] + }, + { + "cell_type": "code", + "execution_count": 49, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "18 paths have been expanded and 28 paths remain in the frontier\n", + "CPU times: user 235 ms, sys: 9 µs, total: 235 ms\n", + "Wall time: 234 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "[Go(Home, SM), Buy(Banana, SM), Buy(Milk, SM), Go(SM, HW), Buy(Drill, HW)]" + ] + }, + "execution_count": 49, + "metadata": {}, + "output_type": 
"execute_result" + } + ], + "source": [ + "%time shopping_problem_solution = astar_search(BackwardPlan(shopping_problem()), display=True).solution()\n", + "shopping_problem_solution = list(map(lambda action: Expr(action.name, *action.args), shopping_problem_solution))\n", + "shopping_problem_solution[::-1]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### CSPlan" + ] + }, + { + "cell_type": "code", + "execution_count": 50, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 194 ms, sys: 6 µs, total: 194 ms\n", + "Wall time: 192 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "[Go(Home, HW), Buy(Drill, HW), Go(HW, SM), Buy(Banana, SM), Buy(Milk, SM)]" + ] + }, + "execution_count": 50, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time shopping_problem_solution = CSPlan(shopping_problem(), 5, arc_heuristic=no_heuristic)\n", + "shopping_problem_solution" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### CSPlan with SAT UP Arc Heuristic" + ] + }, + { + "cell_type": "code", + "execution_count": 51, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 235 ms, sys: 7 µs, total: 235 ms\n", + "Wall time: 233 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "[Go(Home, HW), Buy(Drill, HW), Go(HW, SM), Buy(Banana, SM), Buy(Milk, SM)]" + ] + }, + "execution_count": 51, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time shopping_problem_solution = CSPlan(shopping_problem(), 5, arc_heuristic=sat_up)\n", + "shopping_problem_solution" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### SATPlan with CDCL" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 1min 29s, sys: 36 ms, 
total: 1min 29s\n", + "Wall time: 1min 29s\n" + ] + }, + { + "data": { + "text/plain": [ + "[Go(Home, HW), Buy(Drill, HW), Go(HW, SM), Buy(Banana, SM), Buy(Milk, SM)]" + ] + }, + "execution_count": 11, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time shopping_problem_solution = SATPlan(shopping_problem(), 5, SAT_solver=cdcl_satisfiable)\n", + "shopping_problem_solution" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Air Cargo" + ] + }, + { + "cell_type": "code", + "execution_count": 23, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0mair_cargo\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"\u001b[0m\n", + "\u001b[0;34m [Figure 10.1] AIR-CARGO-PROBLEM\u001b[0m\n", + "\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m An air-cargo shipment problem for delivering cargo to different locations,\u001b[0m\n", + "\u001b[0;34m given the starting location and airplanes.\u001b[0m\n", + "\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m Example:\u001b[0m\n", + "\u001b[0;34m >>> from planning import *\u001b[0m\n", + "\u001b[0;34m >>> ac = air_cargo()\u001b[0m\n", + "\u001b[0;34m >>> ac.goal_test()\u001b[0m\n", + "\u001b[0;34m False\u001b[0m\n", + "\u001b[0;34m >>> ac.act(expr('Load(C2, P2, JFK)'))\u001b[0m\n", + "\u001b[0;34m >>> ac.act(expr('Load(C1, P1, SFO)'))\u001b[0m\n", + "\u001b[0;34m >>> ac.act(expr('Fly(P1, SFO, JFK)'))\u001b[0m\n", + "\u001b[0;34m >>> ac.act(expr('Fly(P2, JFK, SFO)'))\u001b[0m\n", + "\u001b[0;34m >>> ac.act(expr('Unload(C2, P2, SFO)'))\u001b[0m\n", + "\u001b[0;34m >>> ac.goal_test()\u001b[0m\n", + "\u001b[0;34m False\u001b[0m\n", + "\u001b[0;34m >>> ac.act(expr('Unload(C1, P1, JFK)'))\u001b[0m\n", + "\u001b[0;34m >>> ac.goal_test()\u001b[0m\n", + "\u001b[0;34m True\u001b[0m\n", + "\u001b[0;34m >>>\u001b[0m\n", + "\u001b[0;34m 
\"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mPlanningProblem\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0minitial\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'At(C1, SFO) & At(C2, JFK) & At(P1, SFO) & At(P2, JFK)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mgoals\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'At(C1, JFK) & At(C2, SFO)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mactions\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mAction\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'Load(c, p, a)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mprecond\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'At(c, a) & At(p, a)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0meffect\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'In(c, p) & ~At(c, a)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mdomain\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'Cargo(c) & Plane(p) & Airport(a)'\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mAction\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'Unload(c, p, a)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mprecond\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'In(c, p) & At(p, a)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0meffect\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'At(c, a) & ~In(c, p)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mdomain\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'Cargo(c) & Plane(p) & Airport(a)'\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m 
\u001b[0mAction\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'Fly(p, f, to)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mprecond\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'At(p, f)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0meffect\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'At(p, to) & ~At(p, f)'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mdomain\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'Plane(p) & Airport(f) & Airport(to)'\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mdomain\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'Cargo(C1) & Cargo(C2) & Plane(P1) & Plane(P2) & Airport(SFO) & Airport(JFK)'\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource air_cargo" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### GraphPlan" + ] + }, + { + "cell_type": "code", + "execution_count": 38, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 9.06 ms, sys: 3 µs, total: 9.06 ms\n", + "Wall time: 8.94 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "[Load(C2, P2, JFK),\n", + " Fly(P2, JFK, SFO),\n", + " Load(C1, P1, SFO),\n", + " Fly(P1, SFO, JFK),\n", + " Unload(C1, P1, JFK),\n", + " Unload(C2, P2, SFO)]" + ] + }, + "execution_count": 38, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time air_cargo_solution = GraphPlan(air_cargo()).execute()\n", + "linearize(air_cargo_solution)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### ForwardPlan" + ] + }, + { + "cell_type": "code", + "execution_count": 39, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": 
"stream", + "text": [ + "838 paths have been expanded and 1288 paths remain in the frontier\n", + "CPU times: user 3.56 s, sys: 4 ms, total: 3.57 s\n", + "Wall time: 3.56 s\n" + ] + }, + { + "data": { + "text/plain": [ + "[Load(C2, P2, JFK),\n", + " Fly(P2, JFK, SFO),\n", + " Unload(C2, P2, SFO),\n", + " Load(C1, P2, SFO),\n", + " Fly(P2, SFO, JFK),\n", + " Unload(C1, P2, JFK)]" + ] + }, + "execution_count": 39, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time air_cargo_solution = uniform_cost_search(ForwardPlan(air_cargo()), display=True).solution()\n", + "air_cargo_solution = list(map(lambda action: Expr(action.name, *action.args), air_cargo_solution))\n", + "air_cargo_solution" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### ForwardPlan with Ignore Delete Lists Heuristic" + ] + }, + { + "cell_type": "code", + "execution_count": 40, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "17 paths have been expanded and 54 paths remain in the frontier\n", + "CPU times: user 716 ms, sys: 0 ns, total: 716 ms\n", + "Wall time: 717 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "[Load(C2, P2, JFK),\n", + " Fly(P2, JFK, SFO),\n", + " Unload(C2, P2, SFO),\n", + " Load(C1, P2, SFO),\n", + " Fly(P2, SFO, JFK),\n", + " Unload(C1, P2, JFK)]" + ] + }, + "execution_count": 40, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time air_cargo_solution = astar_search(ForwardPlan(air_cargo()), display=True).solution()\n", + "air_cargo_solution = list(map(lambda action: Expr(action.name, *action.args), air_cargo_solution))\n", + "air_cargo_solution" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### BackwardPlan" + ] + }, + { + "cell_type": "code", + "execution_count": 41, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "506 paths have been expanded and 65 paths 
remain in the frontier\n", + "CPU times: user 970 ms, sys: 0 ns, total: 970 ms\n", + "Wall time: 971 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "[Load(C1, P1, SFO),\n", + " Fly(P1, SFO, JFK),\n", + " Load(C2, P1, JFK),\n", + " Unload(C1, P1, JFK),\n", + " Fly(P1, JFK, SFO),\n", + " Unload(C2, P1, SFO)]" + ] + }, + "execution_count": 41, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time air_cargo_solution = uniform_cost_search(BackwardPlan(air_cargo()), display=True).solution()\n", + "air_cargo_solution = list(map(lambda action: Expr(action.name, *action.args), air_cargo_solution))\n", + "air_cargo_solution[::-1]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### BackwardPlan with Ignore Delete Lists Heuristic" + ] + }, + { + "cell_type": "code", + "execution_count": 42, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "23 paths have been expanded and 50 paths remain in the frontier\n", + "CPU times: user 1.19 s, sys: 2 µs, total: 1.19 s\n", + "Wall time: 1.2 s\n" + ] + }, + { + "data": { + "text/plain": [ + "[Load(C2, P2, JFK),\n", + " Fly(P2, JFK, SFO),\n", + " Unload(C2, P2, SFO),\n", + " Load(C1, P2, SFO),\n", + " Fly(P2, SFO, JFK),\n", + " Unload(C1, P2, JFK)]" + ] + }, + "execution_count": 42, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time air_cargo_solution = astar_search(BackwardPlan(air_cargo()), display=True).solution()\n", + "air_cargo_solution = list(map(lambda action: Expr(action.name, *action.args), air_cargo_solution))\n", + "air_cargo_solution[::-1]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### CSPlan" + ] + }, + { + "cell_type": "code", + "execution_count": 43, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 6.5 s, sys: 0 ns, total: 6.5 s\n", + "Wall time: 6.51 s\n" + ] + }, + { + "data": { + 
"text/plain": [ + "[Load(C1, P1, SFO),\n", + " Fly(P1, SFO, JFK),\n", + " Load(C2, P1, JFK),\n", + " Unload(C1, P1, JFK),\n", + " Fly(P1, JFK, SFO),\n", + " Unload(C2, P1, SFO)]" + ] + }, + "execution_count": 43, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time air_cargo_solution = CSPlan(air_cargo(), 6, arc_heuristic=no_heuristic)\n", + "air_cargo_solution" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### CSPlan with SAT UP Arc Heuristic" + ] + }, + { + "cell_type": "code", + "execution_count": 44, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 13.6 s, sys: 7.98 ms, total: 13.7 s\n", + "Wall time: 13.7 s\n" + ] + }, + { + "data": { + "text/plain": [ + "[Load(C1, P1, SFO),\n", + " Fly(P1, SFO, JFK),\n", + " Load(C2, P1, JFK),\n", + " Unload(C1, P1, JFK),\n", + " Fly(P1, JFK, SFO),\n", + " Unload(C2, P1, SFO)]" + ] + }, + "execution_count": 44, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time air_cargo_solution = CSPlan(air_cargo(), 6, arc_heuristic=sat_up)\n", + "air_cargo_solution" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## References\n", + "\n", + "[[1]](#ref-1) Hoffmann, Jörg. 2001. _FF: The fast-forward planning system_.\n", + "\n", + "[[2]](#ref-2) Kautz, Henry A and Selman, Bart and others. 1992. _Planning as Satisfiability_." 
+ ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.7.5rc1" + } + }, + "nbformat": 4, + "nbformat_minor": 4 +} diff --git a/csp.ipynb b/csp.ipynb index 2192352cf..5d490846b 100644 --- a/csp.ipynb +++ b/csp.ipynb @@ -6,21 +6,20 @@ "source": [ "# CONSTRAINT SATISFACTION PROBLEMS\n", "\n", - "This IPy notebook acts as supporting material for topics covered in **Chapter 6 Constraint Satisfaction Problems** of the book* Artificial Intelligence: A Modern Approach*. We make use of the implementations in **csp.py** module. Even though this notebook includes a brief summary of the main topics familiarity with the material present in the book is expected. We will look at some visualizations and solve some of the CSP problems described in the book. Let us import everything from the csp module to get started." + "This IPy notebook acts as supporting material for topics covered in **Chapter 6 Constraint Satisfaction Problems** of the book *Artificial Intelligence: A Modern Approach*. We make use of the implementations in the **csp.py** module. Even though this notebook includes a brief summary of the main topics, familiarity with the material presented in the book is expected. We will look at some visualizations and solve some of the CSP problems described in the book. Let us import everything from the csp module to get started." 
] }, { "cell_type": "code", "execution_count": 1, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "from csp import *\n", - "from notebook import psource, pseudocode\n", + "from notebook import psource, plot_NQueens\n", + "%matplotlib inline\n", "\n", - "# Needed to hide warnings in the matplotlib sections\n", + "# Hide warnings in the matplotlib sections\n", "import warnings\n", "warnings.filterwarnings(\"ignore\")" ] @@ -34,6 +33,7 @@ "* Overview\n", "* Graph Coloring\n", "* N-Queens\n", + "* AC-3\n", "* Backtracking Search\n", "* Tree CSP Solver\n", "* Graph Coloring Visualization\n", @@ -51,9 +51,252 @@ }, { "cell_type": "code", - "execution_count": null, + "execution_count": 2, "metadata": {}, - "outputs": [], + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " \n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    class CSP(search.Problem):\n",
    +       "    """This class describes finite-domain Constraint Satisfaction Problems.\n",
    +       "    A CSP is specified by the following inputs:\n",
    +       "        variables   A list of variables; each is atomic (e.g. int or string).\n",
    +       "        domains     A dict of {var:[possible_value, ...]} entries.\n",
    +       "        neighbors   A dict of {var:[var,...]} that for each variable lists\n",
    +       "                    the other variables that participate in constraints.\n",
    +       "        constraints A function f(A, a, B, b) that returns true if neighbors\n",
    +       "                    A, B satisfy the constraint when they have values A=a, B=b\n",
    +       "\n",
    +       "    In the textbook and in most mathematical definitions, the\n",
    +       "    constraints are specified as explicit pairs of allowable values,\n",
    +       "    but the formulation here is easier to express and more compact for\n",
    +       "    most cases. (For example, the n-Queens problem can be represented\n",
    +       "    in O(n) space using this notation, instead of O(N^4) for the\n",
    +       "    explicit representation.) In terms of describing the CSP as a\n",
    +       "    problem, that's all there is.\n",
    +       "\n",
    +       "    However, the class also supports data structures and methods that help you\n",
    +       "    solve CSPs by calling a search function on the CSP. Methods and slots are\n",
    +       "    as follows, where the argument 'a' represents an assignment, which is a\n",
    +       "    dict of {var:val} entries:\n",
    +       "        assign(var, val, a)     Assign a[var] = val; do other bookkeeping\n",
    +       "        unassign(var, a)        Do del a[var], plus other bookkeeping\n",
    +       "        nconflicts(var, val, a) Return the number of other variables that\n",
    +       "                                conflict with var=val\n",
    +       "        curr_domains[var]       Slot: remaining consistent values for var\n",
    +       "                                Used by constraint propagation routines.\n",
    +       "    The following methods are used only by graph_search and tree_search:\n",
    +       "        actions(state)          Return a list of actions\n",
    +       "        result(state, action)   Return a successor of state\n",
    +       "        goal_test(state)        Return true if all constraints satisfied\n",
    +       "    The following are just for debugging purposes:\n",
    +       "        nassigns                Slot: tracks the number of assignments made\n",
    +       "        display(a)              Print a human-readable representation\n",
    +       "    """\n",
    +       "\n",
    +       "    def __init__(self, variables, domains, neighbors, constraints):\n",
    +       "        """Construct a CSP problem. If variables is empty, it becomes domains.keys()."""\n",
    +       "        variables = variables or list(domains.keys())\n",
    +       "        self.variables = variables\n",
    +       "        self.domains = domains\n",
    +       "        self.neighbors = neighbors\n",
    +       "        self.constraints = constraints\n",
    +       "        self.initial = ()\n",
    +       "        self.curr_domains = None\n",
    +       "        self.nassigns = 0\n",
    +       "\n",
    +       "    def assign(self, var, val, assignment):\n",
    +       "        """Add {var: val} to assignment; Discard the old value if any."""\n",
    +       "        assignment[var] = val\n",
    +       "        self.nassigns += 1\n",
    +       "\n",
    +       "    def unassign(self, var, assignment):\n",
    +       "        """Remove {var: val} from assignment.\n",
    +       "        DO NOT call this if you are changing a variable to a new value;\n",
    +       "        just call assign for that."""\n",
    +       "        if var in assignment:\n",
    +       "            del assignment[var]\n",
    +       "\n",
    +       "    def nconflicts(self, var, val, assignment):\n",
    +       "        """Return the number of conflicts var=val has with other variables."""\n",
    +       "\n",
    +       "        # Subclasses may implement this more efficiently\n",
    +       "        def conflict(var2):\n",
    +       "            return (var2 in assignment and\n",
    +       "                    not self.constraints(var, val, var2, assignment[var2]))\n",
    +       "\n",
    +       "        return count(conflict(v) for v in self.neighbors[var])\n",
    +       "\n",
    +       "    def display(self, assignment):\n",
    +       "        """Show a human-readable representation of the CSP."""\n",
    +       "        # Subclasses can print in a prettier way, or display with a GUI\n",
    +       "        print('CSP:', self, 'with assignment:', assignment)\n",
    +       "\n",
    +       "    # These methods are for the tree and graph-search interface:\n",
    +       "\n",
    +       "    def actions(self, state):\n",
    +       "        """Return a list of applicable actions: nonconflicting\n",
    +       "        assignments to an unassigned variable."""\n",
    +       "        if len(state) == len(self.variables):\n",
    +       "            return []\n",
    +       "        else:\n",
    +       "            assignment = dict(state)\n",
    +       "            var = first([v for v in self.variables if v not in assignment])\n",
    +       "            return [(var, val) for val in self.domains[var]\n",
    +       "                    if self.nconflicts(var, val, assignment) == 0]\n",
    +       "\n",
    +       "    def result(self, state, action):\n",
    +       "        """Perform an action and return the new state."""\n",
    +       "        (var, val) = action\n",
    +       "        return state + ((var, val),)\n",
    +       "\n",
    +       "    def goal_test(self, state):\n",
    +       "        """The goal is to assign all variables, with all constraints satisfied."""\n",
    +       "        assignment = dict(state)\n",
    +       "        return (len(assignment) == len(self.variables)\n",
    +       "                and all(self.nconflicts(variables, assignment[variables], assignment) == 0\n",
    +       "                        for variables in self.variables))\n",
    +       "\n",
    +       "    # These are for constraint propagation\n",
    +       "\n",
    +       "    def support_pruning(self):\n",
    +       "        """Make sure we can prune values from domains. (We want to pay\n",
    +       "        for this only if we use it.)"""\n",
    +       "        if self.curr_domains is None:\n",
    +       "            self.curr_domains = {v: list(self.domains[v]) for v in self.variables}\n",
    +       "\n",
    +       "    def suppose(self, var, value):\n",
    +       "        """Start accumulating inferences from assuming var=value."""\n",
    +       "        self.support_pruning()\n",
    +       "        removals = [(var, a) for a in self.curr_domains[var] if a != value]\n",
    +       "        self.curr_domains[var] = [value]\n",
    +       "        return removals\n",
    +       "\n",
    +       "    def prune(self, var, value, removals):\n",
    +       "        """Rule out var=value."""\n",
    +       "        self.curr_domains[var].remove(value)\n",
    +       "        if removals is not None:\n",
    +       "            removals.append((var, value))\n",
    +       "\n",
    +       "    def choices(self, var):\n",
    +       "        """Return all values for var that aren't currently ruled out."""\n",
    +       "        return (self.curr_domains or self.domains)[var]\n",
    +       "\n",
    +       "    def infer_assignment(self):\n",
    +       "        """Return the partial assignment implied by the current inferences."""\n",
    +       "        self.support_pruning()\n",
    +       "        return {v: self.curr_domains[v][0]\n",
    +       "                for v in self.variables if 1 == len(self.curr_domains[v])}\n",
    +       "\n",
    +       "    def restore(self, removals):\n",
    +       "        """Undo a supposition and all inferences from it."""\n",
    +       "        for B, b in removals:\n",
    +       "            self.curr_domains[B].append(b)\n",
    +       "\n",
    +       "    # This is for min_conflicts search\n",
    +       "\n",
    +       "    def conflicted_vars(self, current):\n",
    +       "        """Return a list of variables in current assignment that are in conflict"""\n",
    +       "        return [var for var in self.variables\n",
    +       "                if self.nconflicts(var, current[var], current) > 0]\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ "psource(CSP)" ] @@ -62,7 +305,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The __ _ _init_ _ __ method parameters specify the CSP. Variable can be passed as a list of strings or integers. Domains are passed as dict where key specify the variables and value specify the domains. The variables are passed as an empty list. Variables are extracted from the keys of the domain dictionary. Neighbor is a dict of variables that essentially describes the constraint graph. Here each variable key has a list its value which are the variables that are constraint along with it. The constraint parameter should be a function **f(A, a, B, b**) that **returns true** if neighbors A, B **satisfy the constraint** when they have values **A=a, B=b**. We have additional parameters like nassings which is incremented each time an assignment is made when calling the assign method. You can read more about the methods and parameters in the class doc string. We will talk more about them as we encounter their use. Let us jump to an example." + "The __ _ _init_ _ __ method parameters specify the CSP. Variables can be passed as a list of strings or integers. Domains are passed as dict (dictionary datatpye) where \"key\" specifies the variables and \"value\" specifies the domains. The variables are passed as an empty list. Variables are extracted from the keys of the domain dictionary. Neighbor is a dict of variables that essentially describes the constraint graph. Here each variable key has a list of its values which are the variables that are constraint along with it. The constraint parameter should be a function **f(A, a, B, b**) that **returns true** if neighbors A, B **satisfy the constraint** when they have values **A=a, B=b**. 
We have additional parameters like nassigns, which is incremented each time an assignment is made when calling the assign method. You can read more about the methods and parameters in the class docstring. We will talk more about them as we encounter their use. Let us jump to an example." ] }, { @@ -71,12 +314,12 @@ "source": [ "## GRAPH COLORING\n", "\n", - "We use the graph coloring problem as our running example for demonstrating the different algorithms in the **csp module**. The idea of map coloring problem is that the adjacent nodes (those connected by edges) should not have the same color throughout the graph. The graph can be colored using a fixed number of colors. Here each node is a variable and the values are the colors that can be assigned to them. Given that the domain will be the same for all our nodes we use a custom dict defined by the **UniversalDict** class. The **UniversalDict** Class takes in a parameter which it returns as value for all the keys of the dict. It is very similar to **defaultdict** in Python except that it does not support item assignment." + "We use the graph coloring problem as our running example for demonstrating the different algorithms in the **csp module**. The idea of the map coloring problem is that adjacent nodes (those connected by edges) should not have the same color anywhere in the graph. The graph can be colored using a fixed number of colors. Here each node is a variable and the values are the colors that can be assigned to it. Given that the domain will be the same for all our nodes, we use a custom dict defined by the **UniversalDict** class. The **UniversalDict** class takes in a parameter and returns it as the value for every key of the dict. It is very similar to **defaultdict** in Python except that it does not support item assignment." 
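The behavior described above can be sketched in a few lines (a minimal stand-in for illustration, not necessarily the module's exact implementation):

```python
# Minimal sketch of the UniversalDict idea: every key maps to the same
# value, and item assignment is deliberately not supported.
class UniversalDict:
    """A dict-like object that returns the same value for any key."""

    def __init__(self, value):
        self.value = value

    def __getitem__(self, key):
        # Any key -- 'WA', 42, ... -- yields the one stored value.
        return self.value

    def __repr__(self):
        return '{{Any: {0!r}}}'.format(self.value)


domains = UniversalDict(['R', 'G', 'B'])
print(domains['WA'])  # ['R', 'G', 'B']
print(domains[42])    # ['R', 'G', 'B'] -- same domain for every node
```

This is handy for map coloring precisely because every region shares the one domain of colors.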
] }, { "cell_type": "code", - "execution_count": 2, + "execution_count": 3, "metadata": {}, "outputs": [ { @@ -85,7 +328,7 @@ "['R', 'G', 'B']" ] }, - "execution_count": 2, + "execution_count": 3, "metadata": {}, "output_type": "execute_result" } @@ -99,14 +342,118 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "For our CSP we also need to define a constraint function **f(A, a, B, b)**. In this what we need is that the neighbors must not have the same color. This is defined in the function **different_values_constraint** of the module." + "For our CSP we also need to define a constraint function **f(A, a, B, b)**. In this, we need to ensure that the neighbors don't have the same color. This is defined in the function **different_values_constraint** of the module." ] }, { "cell_type": "code", - "execution_count": null, + "execution_count": 4, "metadata": {}, - "outputs": [], + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def different_values_constraint(A, a, B, b):\n",
    +       "    """A constraint saying two neighboring variables must differ in value."""\n",
    +       "    return a != b\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ "psource(different_values_constraint)" ] @@ -115,15 +462,13 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The CSP class takes neighbors in the form of a Dict. The module specifies a simple helper function named **parse_neighbors** which allows to take input in the form of strings and return a Dict of the form compatible with the **CSP Class**." + "The CSP class takes neighbors in the form of a Dict. The module specifies a simple helper function named **parse_neighbors** which allows us to take input in the form of strings and return a Dict of a form that is compatible with the **CSP Class**." ] }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, + "execution_count": 5, + "metadata": {}, "outputs": [], "source": [ "%pdoc parse_neighbors" @@ -133,38 +478,148 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The **MapColoringCSP** function creates and returns a CSP with the above constraint function and states. The variables our the keys of the neighbors dict and the constraint is the one specified by the **different_values_constratint** function. **australia**, **usa** and **france** are three CSPs that have been created using **MapColoringCSP**. **australia** corresponds to ** Figure 6.1 ** in the book." + "The **MapColoringCSP** function creates and returns a CSP with the above constraint function and states. The variables are the keys of the neighbors dict and the constraint is the one specified by the **different_values_constratint** function. **Australia**, **USA** and **France** are three CSPs that have been created using **MapColoringCSP**. **Australia** corresponds to ** Figure 6.1 ** in the book." 
] }, { "cell_type": "code", - "execution_count": null, + "execution_count": 6, "metadata": {}, - "outputs": [], + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def MapColoringCSP(colors, neighbors):\n",
    +       "    """Make a CSP for the problem of coloring a map with different colors\n",
    +       "    for any two adjacent regions. Arguments are a list of colors, and a\n",
    +       "    dict of {region: [neighbor,...]} entries. This dict may also be\n",
    +       "    specified as a string of the form defined by parse_neighbors."""\n",
    +       "    if isinstance(neighbors, str):\n",
    +       "        neighbors = parse_neighbors(neighbors)\n",
    +       "    return CSP(list(neighbors.keys()), UniversalDict(colors), neighbors,\n",
    +       "               different_values_constraint)\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ "psource(MapColoringCSP)" ] }, { "cell_type": "code", - "execution_count": 3, + "execution_count": 7, "metadata": {}, "outputs": [ { "data": { "text/plain": [ - "(,\n", - " ,\n", - " )" + "(,\n", + " ,\n", + " )" ] }, - "execution_count": 3, + "execution_count": 7, "metadata": {}, "output_type": "execute_result" } ], "source": [ - "australia, usa, france" + "australia_csp, usa_csp, france_csp" ] }, { @@ -173,14 +628,119 @@ "source": [ "## N-QUEENS\n", "\n", - "The N-queens puzzle is the problem of placing N chess queens on a N×N chessboard so that no two queens threaten each other. Here N is a natural number. Like the graph coloring, problem NQueens is also implemented in the csp module. The **NQueensCSP** class inherits from the **CSP** class. It makes some modifications in the methods to suit the particular problem. The queens are assumed to be placed one per column, from left to right. That means position (x, y) represents (var, val) in the CSP. The constraint that needs to be passed on the CSP is defined in the **queen_constraint** function. The constraint is satisfied (true) if A, B are really the same variable, or if they are not in the same row, down diagonal, or up diagonal. " + "The N-queens puzzle is the problem of placing N chess queens on an N×N chessboard in a way such that no two queens threaten each other. Here N is a natural number. Like the graph coloring problem, NQueens is also implemented in the csp module. The **NQueensCSP** class inherits from the **CSP** class. It makes some modifications in the methods to suit this particular problem. The queens are assumed to be placed one per column, from left to right. That means position (x, y) represents (var, val) in the CSP. The constraint that needs to be passed to the CSP is defined in the **queen_constraint** function. 
The constraint is satisfied (true) if A, B are really the same variable, or if they are not in the same row, down diagonal, or up diagonal. " ] }, { "cell_type": "code", - "execution_count": null, + "execution_count": 8, "metadata": {}, - "outputs": [], + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "
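A few direct calls show how this arithmetic rules out attacks (the function body is reproduced from the module; the column/row pairs below are made-up examples):

```python
def queen_constraint(A, a, B, b):
    """Constraint is satisfied (true) if A, B are really the same variable,
    or if they are not in the same row, down diagonal, or up diagonal."""
    return A == B or (a != b and A + a != B + b and A - a != B - b)

# Queens in columns 0 and 1 (column = variable, row = value):
print(queen_constraint(0, 0, 1, 2))  # True:  rows and both diagonals differ
print(queen_constraint(0, 0, 1, 1))  # False: A - a == B - b, a shared diagonal
print(queen_constraint(0, 3, 1, 3))  # False: same row (a == b)
```

The sums `A + a` and differences `A - a` are constant along the two diagonal directions, which is why comparing them suffices.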

    \n", + "\n", + "
    def queen_constraint(A, a, B, b):\n",
    +       "    """Constraint is satisfied (true) if A, B are really the same variable,\n",
    +       "    or if they are not in the same row, down diagonal, or up diagonal."""\n",
    +       "    return A == B or (a != b and A + a != B + b and A - a != B - b)\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ "psource(queen_constraint)" ] @@ -189,14 +749,196 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The **NQueensCSP** method implements methods that support solving the problem via **min_conflicts** which is one of the techniques for solving CSPs. Because **min_conflicts** hill climbs the number of conflicts to solve the CSP **assign** and **unassign** are modified to record conflicts. More details about the structures **rows**, **downs**, **ups** which help in recording conflicts are explained in the docstring." + "The **NQueensCSP** method implements methods that support solving the problem via **min_conflicts** which is one of the many popular techniques for solving CSPs. Because **min_conflicts** hill climbs the number of conflicts to solve, the CSP **assign** and **unassign** are modified to record conflicts. More details about the structures: **rows**, **downs**, **ups** which help in recording conflicts are explained in the docstring." ] }, { "cell_type": "code", - "execution_count": null, + "execution_count": 9, "metadata": {}, - "outputs": [], + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    class NQueensCSP(CSP):\n",
    +       "    """Make a CSP for the nQueens problem for search with min_conflicts.\n",
    +       "    Suitable for large n, it uses only data structures of size O(n).\n",
    +       "    Think of placing queens one per column, from left to right.\n",
    +       "    That means position (x, y) represents (var, val) in the CSP.\n",
    +       "    The main structures are three arrays to count queens that could conflict:\n",
    +       "        rows[i]      Number of queens in the ith row (i.e val == i)\n",
    +       "        downs[i]     Number of queens in the \\ diagonal\n",
    +       "                     such that their (x, y) coordinates sum to i\n",
    +       "        ups[i]       Number of queens in the / diagonal\n",
    +       "                     such that their (x, y) coordinates have x-y+n-1 = i\n",
    +       "    We increment/decrement these counts each time a queen is placed/moved from\n",
    +       "    a row/diagonal. So moving is O(1), as is nconflicts.  But choosing\n",
    +       "    a variable, and a best value for the variable, are each O(n).\n",
    +       "    If you want, you can keep track of conflicted variables, then variable\n",
    +       "    selection will also be O(1).\n",
    +       "    >>> len(backtracking_search(NQueensCSP(8)))\n",
    +       "    8\n",
    +       "    """\n",
    +       "\n",
    +       "    def __init__(self, n):\n",
    +       "        """Initialize data structures for n Queens."""\n",
    +       "        CSP.__init__(self, list(range(n)), UniversalDict(list(range(n))),\n",
    +       "                     UniversalDict(list(range(n))), queen_constraint)\n",
    +       "\n",
    +       "        self.rows = [0] * n\n",
    +       "        self.ups = [0] * (2 * n - 1)\n",
    +       "        self.downs = [0] * (2 * n - 1)\n",
    +       "\n",
    +       "    def nconflicts(self, var, val, assignment):\n",
    +       "        """The number of conflicts, as recorded with each assignment.\n",
    +       "        Count conflicts in row and in up, down diagonals. If there\n",
    +       "        is a queen there, it can't conflict with itself, so subtract 3."""\n",
    +       "        n = len(self.variables)\n",
    +       "        c = self.rows[val] + self.downs[var + val] + self.ups[var - val + n - 1]\n",
    +       "        if assignment.get(var, None) == val:\n",
    +       "            c -= 3\n",
    +       "        return c\n",
    +       "\n",
    +       "    def assign(self, var, val, assignment):\n",
    +       "        """Assign var, and keep track of conflicts."""\n",
    +       "        oldval = assignment.get(var, None)\n",
    +       "        if val != oldval:\n",
    +       "            if oldval is not None:  # Remove old val if there was one\n",
    +       "                self.record_conflict(assignment, var, oldval, -1)\n",
    +       "            self.record_conflict(assignment, var, val, +1)\n",
    +       "            CSP.assign(self, var, val, assignment)\n",
    +       "\n",
    +       "    def unassign(self, var, assignment):\n",
    +       "        """Remove var from assignment (if it is there) and track conflicts."""\n",
    +       "        if var in assignment:\n",
    +       "            self.record_conflict(assignment, var, assignment[var], -1)\n",
    +       "        CSP.unassign(self, var, assignment)\n",
    +       "\n",
    +       "    def record_conflict(self, assignment, var, val, delta):\n",
    +       "        """Record conflicts caused by addition or deletion of a Queen."""\n",
    +       "        n = len(self.variables)\n",
    +       "        self.rows[val] += delta\n",
    +       "        self.downs[var + val] += delta\n",
    +       "        self.ups[var - val + n - 1] += delta\n",
    +       "\n",
    +       "    def display(self, assignment):\n",
    +       "        """Print the queens and the nconflicts values (for debugging)."""\n",
    +       "        n = len(self.variables)\n",
    +       "        for val in range(n):\n",
    +       "            for var in range(n):\n",
    +       "                if assignment.get(var, '') == val:\n",
    +       "                    ch = 'Q'\n",
    +       "                elif (var + val) % 2 == 0:\n",
    +       "                    ch = '.'\n",
    +       "                else:\n",
    +       "                    ch = '-'\n",
    +       "                print(ch, end=' ')\n",
    +       "            print('    ', end=' ')\n",
    +       "            for var in range(n):\n",
    +       "                if assignment.get(var, '') == val:\n",
    +       "                    ch = '*'\n",
    +       "                else:\n",
    +       "                    ch = ' '\n",
    +       "                print(str(self.nconflicts(var, val, assignment)) + ch, end=' ')\n",
    +       "            print()\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ "psource(NQueensCSP)" ] @@ -205,35 +947,268 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The _ ___init___ _ method takes only one parameter **n** the size of the problem. To create an instance we just pass the required n into the constructor." + "The _ ___init___ _ method takes only one parameter **n** i.e. the size of the problem. To create an instance, we just pass the required value of n into the constructor." ] }, { "cell_type": "code", - "execution_count": 4, - "metadata": { - "collapsed": true - }, + "execution_count": 10, + "metadata": {}, "outputs": [], "source": [ "eight_queens = NQueensCSP(8)" ] }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We have defined our CSP. \n", + "Now, we need to solve this.\n", + "\n", + "### Min-conflicts\n", + "As stated above, the `min_conflicts` algorithm is an efficient method to solve such a problem.\n", + "
    \n", + "In the start, all the variables of the CSP are _randomly_ initialized. \n", + "
    \n", + "The algorithm then randomly selects a variable that has conflicts and violates some constraints of the CSP.\n", + "
    \n", + "The selected variable is then assigned a value that _minimizes_ the number of conflicts.\n", + "
    \n", + "This is a simple **stochastic algorithm** which works on a principle similar to **Hill-climbing**.\n", + "The conflicting state is repeatedly changed into a state with fewer conflicts in an attempt to reach an approximate solution.\n", + "
    \n", + "This algorithm sometimes benefits from having a good initial assignment.\n", + "Using greedy techniques to get a good initial assignment and then using `min_conflicts` to solve the CSP can speed up the procedure dramatically, especially for CSPs with a large state space." + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def min_conflicts(csp, max_steps=100000):\n",
    +       "    """Solve a CSP by stochastic Hill Climbing on the number of conflicts."""\n",
    +       "    # Generate a complete assignment for all variables (probably with conflicts)\n",
    +       "    csp.current = current = {}\n",
    +       "    for var in csp.variables:\n",
    +       "        val = min_conflicts_value(csp, var, current)\n",
    +       "        csp.assign(var, val, current)\n",
    +       "    # Now repeatedly choose a random conflicted variable and change it\n",
    +       "    for i in range(max_steps):\n",
    +       "        conflicted = csp.conflicted_vars(current)\n",
    +       "        if not conflicted:\n",
    +       "            return current\n",
    +       "        var = random.choice(conflicted)\n",
    +       "        val = min_conflicts_value(csp, var, current)\n",
    +       "        csp.assign(var, val, current)\n",
    +       "    return None\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(min_conflicts)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's use this algorithm to solve the `eight_queens` CSP." + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": {}, + "outputs": [], + "source": [ + "solution = min_conflicts(eight_queens)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This is indeed a valid solution. \n", + "
    \n", + "`notebook.py` has a helper function to visualize the solution space." + ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAd0AAAHwCAYAAADjD7WGAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8QZhcZAAAgAElEQVR4nO3df7QU5Z3v+8932Aii/BDYoALXYJKVs+YYMdKjzqBcYkgICEbvmZuBa8zR3FzOzT2GEHEyI2tlxWSdxBwViBNzJydHBzwnKppxjKgTJTGCASNOwygzmpm7HDURCT+2sAO6TQTmuX/U7tndvetXd1V1d1W/X2vt1d1VTz31bZ+9/fI89TxV5pwTAADI3u+1OwAAALoFSRcAgBYh6QIA0CIkXQAAWoSkCwBAi5B0AQBoEZIuAAAtQtIFAKBFSLpAC5jZe8zsb83ssJntM7M7zKwnpPwEM/vLwbIDZvYPZvYfWxkzgPSRdIHW+H8lHZB0hqTzJP2vkv4fv4JmdpKkn0g6S9IfShov6U8l3WJmK1oSLYBMkHSB1pgp6QHn3G+dc/skPS7p3weUvVrS/yLpf3fOveqcO+ace1zSCkn/xcxOlSQzc2b2vspBZrbBzP5L1efFZva8mfWb2TNmdm7VvjPN7EEzO2hmr1YnczO7ycweMLP/YWZHzexFMytV7f8zM3tjcN8/m9lH0vlPBBQfSRdojW9JWmpmY8xsmqSF8hKvn49K+pFz7u267Q9KGiOv9xvKzD4k6a8k/SdJkyT9N0mbzGyUmf2epEckvSBpmqSPSFppZguqqrhc0kZJEyRtknTHYL0fkHSdpD9wzo2VtEDSa1HxAPCQdIHWeFpez/aIpD2SypJ+GFB2sqRf1290zh2X1CepN8b5lkv6b865Hc65E865uyX9TtJFkv5AUq9z7mvOuXedc69I+u+SllYdv80597fOuROS/qekWYPbT0gaJen3zWykc+4159y/xIgHgEi6QOYGe5aPS/obSafIS6qnSfqvAYf0ybv2W19Pz+CxfTFOe5akVYNDy/1m1i9phqQzB/edWbdvtaSpVcfvq3o/IGm0mfU4516WtFLSTZIOmNlGMzszRjwARNIFWmGivGu0dzjnfuece1PSekmLAsr/RNJCMzulbvt/kPSupB2DnwfkDTdXnF71/nVJX3fOTaj6GeOcu29w36t1+8Y654LiqeGcu9c5d7G85O0U/I8HAHVIukDGnHN9kl6V9Dkz6zGzCZL+o6TdAYf8T3lD0D8YXGo0cvB6619IutU595vBcs9L+j/MbISZfVzejOiK/y7p/zazC81zipldZmZjJT0n6ejghKiTB48/x8z+IOq7mNkHzOxSMxsl6beS3pH0rw3/RwG6FEkXaI3/TdLHJR2U9LKkY5K+6FfQOfc7SfPl9Uh3yEtsj8ubjPXVqqJfkLREUr+kq1R1jdg5V5b0f8mbAHV48JzXDO47IWmxvKVLr8obrr5T3tKkKKMkfXPwmH2Spki6McZxACSZc67dMQAIYWYjJf1I0huSrnH80QK5RU8X6HDOuWPyruf+i6QPtDkcAAnQ0wUAoEXo6QIA0CKBN1xPYvLkye4973lPFlV3hJ07d7Y7hEzNnj273SFkjjbMN9ov/4rehs4589ueyfByqVRy5XI59Xo7hZnvf8vCSO13YmcK/51mZ3P5gzbMN9ov/7qgDX2/IMPLSNf+W71km0bClYbq2r8mnfoAoI1IukjHsTe95LjnS9nUv+cGr/5j+7OpHwBaIJNr
uugyafVq49g9eKfDjIadASBL9HSRTCsTbiecFwASIOmiObtGtT/x7TTp0Mb2xgAADSDponE7TXLvJq7multSiOXVZe1P/gAQE9d00ZhdoxNXYaWh9995wHt1SVeY7Rolnf+7hJUAQLbo6aIxLjqx9c6X7vmR/77qhBtne2wp9LwBIGskXcQXMYxrJe+nr1/61JeTJ9JKfZWfcz6ZLD4AaDeSLuKJSGjfvt9/e7OJ1++4F1+JcSCJF0AHI+ki2vEDkUVW3NqCOBQziR/vyzwOAGgGSRfRXpiaWlVBE6YST6Sq9kJvipUBQHqYvYxwvx5a1+PXy6wkS1eOP5TsytLRAWncXOnI09LYMfHDWf+Vofdh8WjfOun0L8avGABagJ4uwu39M0nBCXVP1cjznFnD9wf1YCuJNijhBh13zRLv9Vf7/Pf/W5xvXO9fAADaiKSLRGYsGnq/7a7aZBk2ZPz+K73XSZcGl6mvq/rzWYsbixMAOgFJF8ESzgR+I2T+1cuve6+HjgSXCdsXCzOZAXQYki4SWTQneN/0RcH74gjrBS++JFndANAOJF3EMrDdf/tjt7c2jopH1vlvf+eZ1sYBAI0g6cLfsdqZSieP8q6pnjxqaFucZT4bHmnu9A9vjS5Tff4xo73Po0+qK3TsYHMBAEAGSLrwt/sM380D26VjO7z3cZYIXfvV4duOn6j93Nc/vMwVq6Lrrpy/f4v09raAQrunRFcEAC1C0kXDekYkO/6ki2o/985PVt/4U5MdDwCtQtJFInF6u0tX1352Lrz8Z76WznkBoNOQdJG5+zc3Vn79pmziAIB2i5V0zezjZvbPZvaymf151kGh/a5fG79sq3udjZyvke8BAFmLTLpmNkLSdyQtlPT7kpaZ2e9nHRjaa23Kd1H83M3xyqX9tKK0vwcAJBGnp3uBpJedc684596VtFHSJ7INC3mzeGX4/u8+6L1u3eW/f9PT3mvQc3kr6mc1f/qy6NgAoFPESbrTJL1e9XnP4LYaZrbczMpmVj54kLWRRTfzzNrPjwUt2akzb7n/9k/E7JHWr9+922dJEgB0qtQmUjnnvuecKznnSr29PM+06H525/BtC1eEHzMx5LaOknTah8P3r1wTvh8AOl2cpPuGpBlVn6cPbkORzQofrZjmc8+JxyNuwXg44gEG/UfD999+X/h+X+f2NXEQAGQjTtL9O0nvN7OZZnaSpKWSWNRRdD2Tmzosq5nMV97Q5IEjJ6UaBwAk0RNVwDl33Myuk/SEpBGS/so592LmkQFVfril3REAQHKRSVeSnHN/K+lvM44FOTN1orT/UPvOf+E57Ts3ADSDO1Ih2Ozw+zXua/BOU9U++D5p/gXSe6c3X8ezGyIKRMQPAK0Wq6cLBHHl4Ou4i+Yke97uguukzc8GnxcA8oaki3DTb5P2hM9i6t8iTZjnvd+/WZoysXb/NTdJdz8a/5RzZknb7pKeuGNo26t7pbMv997H6mHP+Iv4JwSAFjEX9ciXJpRKJVcuF7crYmbtDiFTw34ndkZ/XysN9T43bpaWrQ4v34h7vy4tWzD8PKEihpa7rg0LhvbLvy5oQ98vSNJtQhf8stRuOHYw1sPg4y4XWjJXunaJNG+2dPio9PPd0jfWSy+9EiO2OL9W5/ZFLhXqujYsGNov/7qgDX2/IMPLiDay+TuMbVrrJdkgp42Tzp4mXbWwdvu256VLPtvkSVmbC6BDkXQRz2wXOcxcmVQ1skd6t24CVCM3zXBl6eLzhnq1Iy+Ujp9IZ1gZANqJpIv4YiReaSjhNnt3qurjTjwnHdsRsy4SLoAOxzpdNGZm9A2QrRScJG9aLh1+yuu1Vn4Gtnvb/Yy4IGbCnfmDGIUAoL2YSNWELpgAEF4goLdbnxyvmCc9dFvzcSxb7c2Erokt6NeqwV5u17dhztF++dcFbcjs5bR0wS9LdKFdYyT3Ts0mK0l9T0qTxtcWHTtXemsg/vknjpPe/Gnttm9ukG68wyfpzrxP
mrg0fuWVWGnDXKP98q8L2pDZy0jR+YNZtK7X2zNCmnm59Nre5qs+dKS21/zLR4f3eCVxDRdA7nBNF8lUJT5Xlh7emizh+jlrsbeut6aXS8IFkEMMLzehC4ZFGj/o2CFpdwvWx557ING64QraMN9ov/zrgjb0/YL0dJGOkRO93ueMddnUP+N2r/4UEi4AtAvXdJGuKSu9HynWmt5IDCMDKBB6usjObDf0M+vwsN2r/DrF5/669jgAKBB6umiNngnDkuia77cpFgBoE3q6AAC0CEkXAIAWIekCANAiJF0AAFokk4lUO3fuLPTC56IvXC9y21XQhvlG++VfkduwVAp+NBqzlwGg4sRh6fmJNZtWrZPWfLGu3Ll7pZFntC4uFAZJF0B3i7iJy7CEK0m7z6z9zJpyxMQ1XQDdZ/+tXrJN465p0lBd+9ekUx8Ki6QLoHsce9NLjnu+lE39e27w6j+2P5v6kXsMLwPoDmn1auPYfbr3yrAz6tDTBVB8rUy4nXBedCySLoDi2jWq/Ylvp0mHNrY3BnQMki6AYtppkns3cTXX3ZJCLK8ua3/yR0fgmi6A4tk1OnEVVnV/g+884L26csJKd42Szv9dwkqQZ/R0ARSPi05svfOle37kv88CbigUtD22FHreyDeSLoBiiRjGtZL309cvferLyRNppb7KzzmfTBYfio2kC6A4IhLat+/3395s4vU77sVXYhxI4u1aJF0AxXD8QGSRFbe2IA7FTOLH+zKPA52HpAugGF6YmlpVQROmEk+kqvZCb4qVIS+YvQwg/349tK7Hr5dZSZauHH8o2ZWlowPSuLnSkaelsWPih7P+K0Pvw+LRvnXS6X5PVEBR0dMFkH97/0xScELdUzXyPGfW8P1BPdhKog1KuEHHXbPEe/3VPv/9/xbnG9f7F0BhkXQBFN6MRUPvt91VmyzDhozff6X3OunS4DL1dVV/PmtxY3Gi+Ei6APIt4UzgN0LmX738uvd66EhwmbB9sTCTuauQdAEU3qI5wfumLwreF0dYL3jxJcnqRvGQdAEUxsB2/+2P3d7aOCoeWee//Z1nWhsHOgdJF0B+HaudqXTyKO+a6smjhrbFWeaz4ZHmTv/w1ugy1ecfM9r7PPqkukLHDjYXAHKHpAsgv3af4bt5YLt0bIf3Ps4SoWu/Onzb8RO1n/v6h5e5YlV03ZXz92+R3t4WUGj3lOiKUAgkXQCF1DMi2fEnXVT7uXd+svrGn5rseBQDSRdA4cXp7S5dXfvZufDyn/laOudFd4lMumb2V2Z2wMz+sRUBAUA73L+5sfLrN2UTB4otTk93g6SPZxwHADTs+rXxy7a619nI+Rr5Hsi3yKTrnHta0qEWxAIADVmb8l0UP3dzvHJpP60o7e+BzsU1XQBdY/HK8P3ffdB73brLf/+mp73XoOfyVtTPav70ZdGxoTuklnTNbLmZlc0szYdfAUDTZp5Z+/mxoCU7deYt99/+iZg90vr1u3f7LElCd0ot6TrnvuecKznnmK8HoCP87M7h2xauCD9mYshtHSXptA+H71+5Jnw/uhvDywDya1b4nZym+dxz4vGIWzAejniAQf/R8P233xe+39e5fU0chDyKs2ToPkk/l/QBM9tjZv9n9mEBQAw9k5s6LKuZzFfe0OSBIyelGgc6V09UAefcslYEAgB598Mt7Y4AnY7hZQCFNnVie89/4TntPT86C0kXQL7NDr9f474G7zRV7YPvk+ZfIL13evN1PLshokBE/CiWyOFlAMg7Vw6+jrtoTrLn7S64Ttr8bPB5gWokXQD5N/02aU/4LKb+LdKEed77/ZulKXXDztfcJN39aPxTzpklbbtLeuKOoW2v7pXOvtx7H6uHPeMv4p8QhWAu6lEazVRqVujxkiz+m3USM2t3CJmjDfPNt/12Rn9nKw31PjdulpatDi/fiHu/Li1bMPw8oQKGloveflKx/wZLpZLK5bJvI5J0m1DkXxaJP/giKHob+rbfsYOxHgYf
d7nQkrnStUukebOlw0eln++WvrFeeumVGPHFSbjn9gUuFSp6+0nF/hsMS7oMLwMohpG9TR+6aa2XZIOcNk46e5p01cLa7duely75bJMnZW1uVyLpAiiO2S5ymLkyqWpkj/Ru3QSoRm6a4crSxecN9WpHXigdP5FsWBnFR9IFUCwxEq80lHCbvTtV9XEnnpOO7YhZFwm3q7FOF0DxzIy+AbKVgpPkTculw095vdbKz8B2b7ufERfETLgzfxCjEIqMiVRNKPIEAIlJHEVQ9DaM1X4Bvd365HjFPOmh25qPZdlqbyZ0tcAh5pi93KK3n1Tsv0FmL6esyL8sEn/wRVD0NozdfrvGSO6dmk1WkvqelCaNry06dq701kD8GCaOk978ae22b26QbrzDJ+nOvE+auDR23UVvP6nYf4PMXgbQnc4fzKJ1vd6eEdLMy6XX9jZf9aEjtb3mXz46vMcriWu4qME1XQDFV5X4XFl6eGuyhOvnrMXeut6aXi4JF3UYXm5CkYdFJIa2iqDobdh0+x07JO1uwfrYcw8kWjdc9PaTiv03GDa8TE8XQPcYOdHrfc5Yl039M2736k+QcFFsXNMF0H2mrPR+pFhreiMxjIyY6OkC6G6z3dDPrMPDdq/y6xSf++va44CY6OkCQEXPhGFJdM332xQLComeLgAALULSBQCgRUi6AAC0SCbXdGfPnq1yOc7zrfKp6Gvoirx+roI2zDfaL/+K3oZB6OkCANAizF4GAORW4BOdGtDsM5WbQU8XAJArN1w99JzjNFTquv6qdOoLk8m9l0ulkuOabn5xPSn/it6GtF/+NdOGfo9TzMLUj0kHDiWrwznHo/0AAPmUVq82jv2Dj2jMYtiZ4WUAQEdrZcLN+rwkXQBAR/rtM+1LuBWuLP3JR9Orj6QLAOg4riyNOil5PdfdkryOjTenl/y5pgsA6CjvbE9eR/X12O884L0mTZy/fUYa/UfJ6qCnCwDoKKNHRZfpnS/d8yP/fUEToJJOjEqj503SBQB0jKjeqJW8n75+6VNfTp5IK/VVfs75ZLL4opB0AQAdISqhfft+/+3NJl6/4158Jfq4JImXpAsAaLveidFlVtyafRxSvCQ+aXxzdZN0AQBtd2BzenUF9UTTXH7U92RzxzF7GQDQVn969dB7v15mJVm6cvyhZFeWjg5I4+ZKR56Wxo6JH8/6r8SLZ+Uy6Vv3xa9XoqcLAGizW77gvQYl1D0Hht7PmTV8f1APtpJogxJu0HHXLPFef7XPf38lznWr/PeHIekCADrajEVD77fdVZssw4aM33+l9zrp0uAy9XVVfz5rcWNxxkHSBQC0TdLrrG8cCN738uve66EjwWXC9sXRaPwkXQBAR1s0J3jf9EXB++II6wUvviRZ3X5IugCAjjAQcPvHx25vbRwVj6zz3/7OM83XSdIFALTF1Em1n08e5Q3Xnlx1G8g4w7cbHmnu/A9vjS5Tff4xo73Po+tuBzl5QvxzknQBAG2x7wn/7QPbpWM7vPdxlghd+9Xh246fqP3c1z+8zBUxZh9Xzt+/RXp7m3+Zgz+JrqeCpAsA6Dg9I5Idf9JFtZ975yerb/ypyY6vIOkCADpanN7u0tW1n50LL/+Zr6Vz3kaRdAEAuXd/g7eRXL8pmziiRCZdM5thZk+Z2Utm9qKZfaEVgQEAiu36tfHLZtHrTOt8jXyPOD3d45JWOed+X9JFkv6zmf1+/FMAADDc2uvTre9zN8crl/bTihr5HpFJ1zn3a+fcrsH3RyX9QtK0ZoMDAKAZi1eG7//ug97r1l3++zc97b0GPZe3on5W86cvi44troau6ZrZeyR9SNIOn33LzaxsZuWDBw+mEx0AoGvNPLP282MBS3bqzVvuv/0TMXuk9et37/ZZktSs2EnXzE6V9KCklc65YXerdM59zzlXcs6Vent704sQANCVfnbn8G0LV4QfMzHkto6SdNqHw/evXBO+P6lYSdfMRspLuPc45/4m25AAAN1g
8kfC90+bMnzb4xG3YDwc8QCD/qPh+29v8Pm4Uvj9m+vFmb1sku6S9AvnXANztAAACPbmb5o7LquZzFfe0NxxjTypKE5Pd46kqyVdambPD/4kfK4DAACd5Ydbsj9HT1QB59w2SZZ9KAAA1Jo6Udp/qH3nv/CcdOvjjlQAgLaJGire1+Cdpqp98H3S/Auk905vvo5nN4Tvb3SoO7KnCwBAO7lycHJbNCfZ83YXXCdtfjb4vGkj6QIA2mrVOmnNF8PL9G+RJszz3u/fLE2ZWLv/mpukux+Nf845s6Rtd0lP3DG07dW90tmXe+/j9LA/38SdrcxFPYqhCaVSyZXLGfwToUN4E7qLK4vfiU5DG+Yb7Zd/9W0Yp1dppaFyGzdLy1aHl2/EvV+Xli0Yfp6oeII453x/SUm6TeAPPv9ow3yj/fKvvg0nT4j3MPi411CXzJWuXSLNmy0dPir9fLf0jfXSS69EHxsn4U66NHypUFDSZXgZANB2ff3NH7tprZdkg5w2Tjp7mnTVwtrt256XLvlsc+dsZG1uNZIuAKAjxBnWrUyqGtkjvVs3AaqRmcSuLF183tD5Rl4oHT+RfFg5CkkXANAx4l5PrSTcZhNg9XEnnpOO7YhXV9K7YbFOFwDQUZbeGF3GSsEJ8Kbl0uGnvORd+RnY7m33M+KCeMn0j78UXSYKE6mawCSO/KMN8432y7+oNgzq7dYnxyvmSQ/d1nwcy1Z7M6GbOXcYZi+niD/4/KMN8432y784bfj2NmnM6LrjSlLfk9Kk8bXbx86V3hqIf/6J46Q3f1q77ZsbpBvvGJ50l94o3f/j+HVLzF4GAOTMKRd7r/VJsGeENPNy6bW9zdd96Ehtz/WXjw7v8UrpP9GIa7oAgI5WnfhcWXp4a7KE6+esxd663uoEn8UjBBlebgJDW/lHG+Yb7Zd/zbThaWOlQ09lEEyd3vnJ1g1LwcPL9HQBALlw+KjX+1y5Jpv6V9w6eM04YcINQ0+3CfwrO/9ow3yj/fIvrTZM40lAWQwj09MFABROZb2ulYaeQlRt1brh205fUHtcKzF7GQBQCL95yz+Jrr2n9bEEoacLAECLkHQBAGgRki4AAC1C0gUAoEUymUi1c+fOQk/pL/p0/iK3XQVtmG+0X/4VuQ1LpeAp0cxe7hQnDkvPT6zZtGqdtOaLdeXO3SuNPKN1cQEAUkPSbaed4f+aHZZwJWn3mbWfZxf3X4sAUDRc0221/bd6yTYi4cZWqWt/RvdFAwCkhqTbKsfe9JLjni9lU/+eG7z6j+3Ppn4AQGIML7dCWr3aOHaf7r0y7AwAHYeebtZamXA74bwAgEAk3azsGtX+xLfTpEMb2xsDAODfkHSzsNMk927iaq67JYVYXl3W/uQPAJDENd307RqduIrqp2R85wHvNfEzI3eNks7/XcJKAABJ0NNNm4tObL3zpXt+5L8v6NmOiZ/5mELPGwCQDEk3TRHDuJUHJvf1S5/6cvJEWv0QZitJ53wyWXwAgGyRdNMSkdC+fb//9mYTr99xL74S40ASLwC0DUk3DccPRBZZcWsL4lDMJH68L/M4AADDkXTT8MLU1KoKmjCVeCJVtRd6U6wMABAXs5eT+vXQuh6/XmYlWbpy/KFkV5aODkjj5kpHnpbGjokfzvqvDL0Pi0f71kmn+z1RAQCQFXq6Se39M0nBCXVP1cjznFnD9wf1YCuJNijhBh13zRLv9Vf7/Pf/W5xvXO9fAACQGZJuxmYsGnq/7a7aZBk2ZPz+K73XSZcGl6mvq/rzWYsbixMAkD2SbhIJZwK/ETL/6uXXvddDR4LLhO2LhZnMANBSJN2MLZoTvG/6ouB9cYT1ghdfkqxuAED6SLopGdjuv/2x21sbR8Uj6/y3v/NMa+MAAAwh6TbrWO1MpZNHeddUTx41tC3OMp8NjzR3+oe3RpepPv+Y0d7n0SfVFTp2sLkAAAANI+k2a/cZvpsHtkvHdnjv4ywRuvarw7cdP1H7ua9/eJkr
VkXXXTl//xbp7W0BhXZPia4IAJAKkm4GekYkO/6ki2o/985PVt/4U5MdDwBIB0k3Y3F6u0tX1352Lrz8Z76WznkBAK0VmXTNbLSZPWdmL5jZi2bmMyCKJO7f3Fj59ZuyiQMAkK04Pd3fSbrUOTdL0nmSPm5mF0UcU3jXr41fttW9zkbO18j3AAAkE5l0neetwY8jB38iBkCLb23Kd1H83M3xyqX9tKK0vwcAIFisa7pmNsLMnpd0QNKPnXM7fMosN7OymaX5PJzCWLwyfP93H/Ret+7y37/pae816Lm8FfWzmj99WXRsAIDWiJV0nXMnnHPnSZou6QIzO8enzPeccyXnHFN4JM08s/bzY0FLdurMW+6//RMxe6T163fv5go8AHSMhmYvO+f6JT0l6ePZhFMcP7tz+LaFK8KPmRhyW0dJOu3D4ftXrgnfDwBorzizl3vNbMLg+5MlfVTSP2UdWMebFX4np2k+95x4POIWjIcjHmDQfzR8/+33he/3dW5fEwcBAJoR5yH2Z0i628xGyEvSDzjnHs02rBzomdzUYVnNZL7yhiYPHDkp1TgAAMEik65zbrekD7UgFiTwwy3tjgAAEIU7UmVo6sT2nv/CYdPdAADtRNJNYnb4cuV9Dd5pqtoH3yfNv0B67/Tm63h2Q0SBiPgBAOmKc00XCbhy8HXcRXOSPW93wXXS5meDzwsA6Cwk3aSm3ybtCZ/F1L9FmjDPe79/szSlbtj5mpukuxuYmjZnlrTtLumJO4a2vbpXOvty732sHvaMv4h/QgBAKsxFPdKmmUrNCj1uOey/2U6LPMZKQ73PjZulZavDyzfi3q9LyxYMP0+okKFls+jvk3dZ/N53kqK3Ie2Xf0Vuw1KppHK57NuIJN0mDPtvduxgrIfBx10utGSudO0Sad5s6fBR6ee7pW+sl156JUZscRLuuX2hS4X4g8+/orch7Zd/RW7DsKTL8HIaRvY2feimtV6SDXLaOOnsadJVC2u3b3teuuSzTZ6UtbkA0BYk3bTMdpHDzJVJVSN7pHfrJkA1ctMMV5YuPm+oVzvyQun4ieTDygCAbJF00xQj8UpDCbfZu1NVH3fiOenYjph1kXABoK1Yp5u2mdE3QLZScJK8abl0+Cmv11r5Gdjubfcz4oKYCXfmD2IUAgBkiYlUTYj8bxbQ261PjlfMkx66rfk4lq32ZkLXxBY0xNxAL5dJHPlX9Dak/fKvyG3I7OWUxfpvtmuM5N6p2WQlqe9JadL42qJj50pvDcQ//8Rx0ps/rd32zQ3SjXf4JN2Z90kTl8avXPzBF0HR25D2y78ityGzl9vh/D6YMZgAACAASURBVMEsWtfr7Rkhzbxcem1v81UfOlLba/7lo8N7vJK4hgsAHYZrulmrSnyuLD28NVnC9XPWYm9db00vl4QLAB2H4eUmNPXf7NghaXcL1seeeyDRumGJoa0iKHob0n75V+Q2DBtepqfbKiMner3PGeuyqX/G7V79CRMuACA7XNNttSkrvR8p1preSAwjA0Bu0NNtp9lu6GfW4WG7V/l1is/9de1xAIDcoKfbKXomDEuia77fplgAAJmgpwsAQIuQdAEAaBGSLgAALZLJNd3Zs2erXI7znLl8KvoauiKvn6ugDfON9su/ordhEHq6AAC0CLOXgQQCn+rUgGafqwwgf+jpAg264eqhZx2noVLX9VelUx+AzpXJvZdLpZLjmm5+cT3Jn98jFbMw9WPSgUPJ6ih6G/I3mH9d0IY82g9oVlq92jj2Dz6mkWFnoHgYXgYitDLhdsJ5AWSHpAsE+O0z7U98riz9yUfbGwOA9JB0AR+uLI06KXk9192SvI6NN7c/+QNIB9d0gTrvbE9eR/X12O884L0mTZy/fUYa/UfJ6gDQXvR0gTqjR0WX6Z0v3fMj/31BE6CSToxKo+cNoL1IukCVqN6olbyfvn7pU19Onkgr9VV+zvlksvgAdDaSLjAoKqF9+37/7c0mXr/jXnwl+jgS
L5BfJF1AUu/E6DIrbs0+DileEp80Pvs4AKSPpAtIOrA5vbqCeqJp9lD7nkyvLgCtw+xldL0/vXrovV8vs5IsXTn+ULIrS0cHpHFzpSNPS2PHxI9n/VfixbNymfSt++LXC6D96Omi693yBe81KKHuOTD0fs6s4fuDerCVRBuUcIOOu2aJ9/qrff77K3GuW+W/H0DnIukCEWYsGnq/7a7aZBk2ZPz+K73XSZcGl6mvq/rzWYsbixNA5yPpoqslvc76xoHgfS+/7r0eOhJcJmxfHMxkBvKFpAtEWDQneN/0RcH74gjrBS++JFndADoPSRcYNBBw+8fHbm9tHBWPrPPf/s4zrY0DQHpIuuhaUyfVfj55lDdce3LVbSDjDN9ueKS58z+8NbpM9fnHjPY+j667HeTkCc2dH0DrkXTRtfY94b99YLt0bIf3Ps4SoWu/Onzb8RO1n/v6h5e5Isbs48r5+7dIb2/zL3PwJ9H1AOgMJF3AR8+IZMefdFHt5975yeobf2qy4wF0BpIuECFOb3fp6trPzoWX/8zX0jkvgHyJnXTNbISZ/b2ZPZplQEAe3d/gbSTXb8omDgCdrZGe7hck/SKrQIBWu35t/LKt7nU2cr5GvgeA9oqVdM1suqTLJN2ZbThA66y9Pt36PndzvHJpP60o7e8BIDtxe7rfkvQlSf8aVMDMlptZ2czKBw8eTCU4oJMsXhm+/7sPeq9bd/nv3/S09xr0XN6K+lnNn74sOjYA+RCZdM1ssaQDzrmdYeWcc99zzpWcc6Xe3t7UAgTaZeaZtZ8fC1iyU2/ecv/tn4jZI61fv3u3z5IkAPkUp6c7R9LlZvaapI2SLjWz72caFdABfuZzMWXhivBjJobc1lGSTvtw+P6Va8L3A8i3yKTrnLvROTfdOfceSUsl/dQ596nMIwMyNvkj4funTRm+7fGIWzAejniAQf/R8P23N/F83LD7NwPoLKzTRdd68zfNHZfVTOYrb2juuKRPKgLQOj2NFHbObZG0JZNIgC73wy3tjgBA1ujpAiGmTmzv+S88p73nB5Auki66WtRQ8b4G7zRV7YPvk+ZfIL13evN1PLshfD+3igTypaHhZaAbuXJwcls0J9nzdhdcJ21+Nvi8AIqFpIuut2qdtOaL4WX6t0gT5nnv92+WptQNO19zk3R3A3clnzNL2naX9MQdQ9te3Sudfbn3Pk4P+/Mp39kKQPbMRT0OpQmlUsmVy8X9Z7qZtTuETGXxO9Fp6tswTq/SSkPlNm6Wlq0OL9+Ie78uLVsw/DxR8QQpehvyN5h/XdCGvl+QpNuELvhlaXcImatvw8kT4j0MPu411CVzpWuXSPNmS4ePSj/fLX1jvfTSK9HHxkm4ky4NXypU9DbkbzD/uqANfb8gw8uApL7+5o/dtNZLskFOGyedPU26amHt9m3PS5d8trlzsjYXyCeSLjAozrBuZVLVyB7p3boJUI3MJHZl6eLzhs438kLp+Inkw8oAOhtJF6gS93pqJeE2mwCrjzvxnHRsR7y6SLhAvrFOF6iz9MboMlYKToA3LZcOP+Ul78rPwHZvu58RF8RLpn/8pegyADobE6ma0AUTANodQuai2jCot1ufHK+YJz10W/NxLFvtzYRu5txhit6G/A3mXxe0IbOX09IFvyztDiFzcdrw7W3SmNF1x5WkvielSeNrt4+dK701EP/8E8dJb/60dts3N0g33jE86S69Ubr/x/HrlorfhvwN5l8XtCGzl4FGnHKx91qfBHtGSDMvl17b23zdh47U9lx/+ejwHq/ENVygaLimC0SoTnyuLD28NVnC9XPWYm9db3WCJ+ECxcPwchO6YFik3SFkrpk2PG2sdOipDIKp0zs/2bphqfhtyN9g/nVBG/p+QXq6QEyHj3q9z5Vrsql/xa2D14wTJlwAnYuebhO64F9o7Q4hc2m1YRpPAspiGLnobcjfYP51QRvS0wXSVlmva6WhpxBVW7Vu+LbTF9QeB6B7MHsZSMlv
3vJPomvvaX0sADoTPV0AAFqEpAsAQIuQdAEAaJFMrunu3Lmz0DPTij6zsMhtV0Eb5hvtl39FbsNSKXiGJD1dAABapGNnL3fq+kcAAJrVUT3dG64eev5oGip1XX9VOvUBAJBEJnekMrOGKvV7zFkWpn5MOnAoeT1FvhYhcT2pCIrehrRf/hW5DUulksrlcmc+2i+tXm0c+wcfncawMwCgHdo6vNzKhNsJ5wUAdLe2JN3fPtP+xOfK0p98tL0xAAC6S8uTritLo05KXs91tySvY+PN7U/+AIDu0dJruu9sT15H9fXY7zzgvSZNnL99Rhr9R8nqAAAgSkt7uqNHRZfpnS/d8yP/fUEToJJOjEqj5w0AQJSWJd2o3mjl2aJ9/dKnvpw8kVY/r9RK0jmfTBYfAABJtSTpRiW0b9/vv73ZxOt33IuvRB9H4gUAZCnzpNs7MbrMiluzjsITJ4lPGp99HACA7pR50j2wOb26gnqiafZQ+55Mry4AAKplOnv5T68eeu/Xy6wkS1eOP5TsytLRAWncXOnI09LYMfHjWf+VePGsXCZ967749QIAEEemPd1bvuC9BiXUPQeG3s+ZNXx/UA+2kmiDEm7Qcdcs8V5/tc9/fyXOdav89wMAkERbbwM5Y9HQ+2131SbLsCHj91/pvU66NLhMfV3Vn89a3FicAACkIbOkm/Q66xsHgve9/Lr3euhIcJmwfXEwkxkAkLa29nQXzQneN31R8L44wnrBiy9JVjcAAM1oSdIdCLj942O3t+Lswz2yzn/7O8+0Ng4AQHfJJOlOnVT7+eRR3nDtyVW3gYwzfLvhkebO//DW6DLV5x8z2vs8uu52kJMnNHd+AAD8ZJJ09z3hv31gu3Rsh/c+zhKha786fNvxE7Wf+/qHl7kixuzjyvn7t0hvb/Mvc/An0fUAABBXy6/p9oxIdvxJF9V+7p2frL7xpyY7HgCAuNo6kSpOb3fp6trPzoWX/8zX0jkvAABpi5V0zew1M/sHM3vezFq6mOb+Bm8juX5TNnEAAJBUIz3dDzvnznPORfYTr18bv9JW9zobOV8j3wMAgCiZDC+vvT7d+j53c7xyaT+tKO3vAQDobnGTrpO02cx2mtlyvwJmttzMys0MPy9eGb7/uw96r1t3+e/f9LT3GvRc3or6Wc2fviw6NgAA0mIuamaSJDOb5px7w8ymSPqxpM87554OPGCnhVZ69uXSq3trt1XWzQYN/0Y9iShsf1DdcdYK+z6NKMZ/szwzs3aHkDnaMN9ov/wrchuWSiWVy2XfRozV03XOvTH4ekDSQ5IuSBLQz+4cvm3hivBjJobc1lGSTvtw+P6Va8L3AwCQtcika2anmNnYyntJH5P0j2HHTP5IeJ3Tpgzf9njELRgPRzzAoP9o+P7bm3g+btj9mwEAaFSch9hPlfTQ4HBHj6R7nXOPhx3w5m+aCyarmcxX3tDccUmfVAQAQLXIpOuce0WSzyPm8+OHW9odAQAAbbwj1dSJ7Tqz58Jz2nt+AED3ySzpRg0V72vwTlPVPvg+af4F0nunN1/HsxvC93OrSABA2uJc081M2DKfRXOSPW93wXXS5meDzwsAQKtlmnRXrZPWfDG8TP8WacI87/3+zdKUumHna26S7n40/jnnzJK23SU9ccfQtlf3emuDpXg97M+nfGcrAACkmDfHaLhSG7o5RtwbUFTKbdwsLVsdXr4R935dWrZg+Hmi4glT5EXdEgvzi6DobUj75V+R2zDs5hiZJ93JE+I9DD7uNdQlc6Vrl0jzZkuHj0o/3y19Y7300ivRx8ZJuJMujV4qVORfFok/+CIoehvSfvlX5DYMS7qZX9Pt62/+2E1rvSQb5LRx0tnTpKsW1m7f9rx0yWebOydrcwEAWWnJRKo4w7qVSVUje6R36yZANTKT2JWli88bOt/IC6XjJ9IZVgYAIImWzV6Oez21knCbTYDVx514Tjq2I15dJFwAQNZaenOMpTdGl7FScAK8abl0+Ckv
eVd+BrZ72/2MuCBeMv3jL0WXAQAgqcwnUtUL6u3WJ8cr5kkP3dZ8DMtWezOhmzl3lCJPAJCYxFEERW9D2i//ityGbZ297OftbdKY0XXHlKS+J6VJ42u3j50rvTUQ/9wTx0lv/rR22zc3SDfeMTzpLr1Ruv/H8euuKPIvi8QffBEUvQ1pv/wrchu2dfayn1Mu9l7rk2DPCGnm5dJre4cfE9ehI7U9118+OrzHK3ENFwDQem174IFUm/hcWXp4a7KE6+esxd663uoET8IFALRDW4aX6502Vjr0VOphDNM7P9m64YoiD4tIDG0VQdHbkPbLvyK3Ydjwclt7uhWHj3q9z5Vrsql/xa2D14xTSLgAADSrI3q6ftJ4ElBWw8hF/heaxL+yi6DobUj75V+R27Dje7p+Kut1rTT0FKJqq9YN33b6gtrjAADoJG19nm5cv3nLP4muvaf1sQAA0KyO7ekCAFA0JF0AAFqEpAsAQItkck139uzZKpdTmH7coYo+s7DIsworaMN8o/3yr+htGISeLgAALULSBQCgRXKxZAgA0KSdKQzjzi7+cHer0NMFgKLZf6uXbNNIuNJQXfszuldvFyHpAkBRHHvTS457vpRN/Xtu8Oo/tj+b+rsAw8sAUARp9Wrj2H2698qwc8Po6QJA3rUy4XbCeXOMpAsAebVrVPsT306TDm1sbww5QtIFgDzaaZJ7N3E1192SQiyvLmt/8s8JrukCQN7sGp24iuont33nAe818XPMd42Szv9dwkqKjZ4uAOSNi05svfOle37kvy/oeeOJn0OeQs+76Ei6AJAnEcO4VvJ++vqlT305eSKt1Ff5OeeTyeLrdiRdAMiLiIT27fv9tzebeP2Oe/GVGAeSeAORdAEgD44fiCyy4tYWxKGYSfx4X+Zx5BFJFwDy4IWpqVUVNGEq8USqai/0plhZcTB7GQA63a+H1vX49TIrydKV4w8lu7J0dEAaN1c68rQ0dkz8cNZ/Zeh9WDzat046/YvxK+4C9HQBoNPt/TNJwQl1T9XI85xZw/cH9WAriTYo4QYdd80S7/VX+/z3/1ucb1zvX6CLkXQBIOdmLBp6v+2u2mQZNmT8/iu910mXBpepr6v681mLG4sTJF0A6GwJZwK/ETL/6uXXvddDR4LLhO2LhZnMNUi6AJBzi+YE75u+KHhfHGG94MWXJKu7G5F0ASAnBrb7b3/s9tbGUfHIOv/t7zzT2jjyhKQLAJ3qWO1MpZNHeddUTx41tC3OMp8NjzR3+oe3RpepPv+Y0d7n0SfVFTp2sLkACoikCwCdavcZvpsHtkvHdnjv4ywRuvarw7cdP1H7ua9/eJkrVkXXXTl//xbp7W0BhXZPia6oS5B0ASCHekYkO/6ki2o/985PVt/4U5Md3y1iJV0zm2Bmf21m/2RmvzCzP8w6MABAPHF6u0tX1352Lrz8Z76WznlRK25P93ZJjzvn/p2kWZJ+kV1IAIC03b+5sfLrN2UTR7eLTLpmNl7SXEl3SZJz7l3nnM/oPwAgTdevjV+21b3ORs7XyPcoujg93ZmSDkpab2Z/b2Z3mtkpGccFAF1vbcp3UfzczfHKpf20orS/R57FSbo9ks6X9JfOuQ9JelvSn9cXMrPlZlY2s/LBg0wPB4BWW7wyfP93H/Ret+7y37/pae816Lm8FfWzmj99WXRs8MRJunsk7XHODU5Q11/LS8I1nHPfc86VnHOl3l4e6QQAWZt5Zu3nx4KW7NSZt9x/+ydi9kjr1+/e7bMkCf4ik65zbp+k183sA4ObPiLppUyjAgBE+tmdw7ctXBF+zMSQ2zpK0mkfDt+/ck34foSL+zzdz0u6x8xOkvSKpGuzCwkAIEmadTD0YfDTfO458XjELRgPRzzAoP9o+P7b7wvf7+vcviYOKqZYSdc597wkVmQBQCv1TG7qsKxmMl95Q5MHjpyUahx5xh2pAACx/HBLuyPIP5IuAOTY1IntPf+F57T3/HlD0gWATjY7/H6N
+xq801S1D75Pmn+B9N7pzdfx7IaIAhHxd5u4E6kAAB3KlYOv4y6ak+x5uwuukzY/G3xeNIakCwCdbvpt0p7wWUz9W6QJ87z3+zdLU+qGna+5Sbr70finnDNL2naX9MQdQ9te3Sudfbn3PlYPe8ZfxD9hlzAX9aiJJpRKJVcuF/efQGbW7hAylcXvRKehDfOtK9tvZ/R3ttJQ73PjZmnZ6vDyjbj369KyBcPPEypkaLkL2tD3C5J0m9AFvyztDiFztGG+dWX7HTsY62HwcZcLLZkrXbtEmjdbOnxU+vlu6RvrpZdeiRFfnP+9n9sXulSoC9rQ9wsyvAwAeTCy+dvrblrrJdkgp42Tzp4mXbWwdvu256VLPtvkSVmb64ukCwB5MdtFDjNXJlWN7JHerZsA1chNM1xZuvi8oV7tyAul4yeSDyt3O5IuAORJjMQrDSXcZu9OVX3cieekYzti1kXCDcU6XQDIm5nRN0C2UnCSvGm5dPgpr9da+RnY7m33M+KCmAl35g9iFOpuTKRqQhdMAGh3CJmjDfON9lNgb7c+OV4xT3rotuZjWbbamwldLXCIuYFebhe0IbOX09IFvyztDiFztGG+0X6Ddo2R3Ds1m6wk9T0pTRpfW3TsXOmtgfgxTBwnvfnT2m3f3CDdeIdP0p15nzRxafzK1RVtyOxlACiU8wezaF2vt2eENPNy6bW9zVd96Ehtr/mXjw7v8UriGm6DuKYLAHlXlfhcWXp4a7KE6+esxd663ppeLgm3YQwvN6ELhkXaHULmaMN8o/0CHDsk7W7B+thzDyRaNyx1RRv6fkF6ugBQFCMner3PGeuyqX/G7V79CRNuN+OaLgAUzZSV3o8Ua01vJIaRU0NPFwCKbLYb+pl1eNjuVX6d4nN/XXscUkNPFwC6Rc+EYUl0zffbFEuXoqcLAECLkHQBAGgRki4AAC2SyTXdnTt3FnoNFmsg8482zDfaL/+K3IalUvDTIejpAgDQIsxeBhAo1gPLIzT7PFegiOjpAqhxw9VDz1hNQ6Wu669Kpz4gzzK597KZFXewXsW+FiFxPakImmlDv0e5ZWHqx6QDh5LVQfvlX5HbsFQqqVwu82g/AP7S6tXGsX/w8XAMO6MbMbwMdLlWJtxOOC/QTiRdoEv99pn2Jz5Xlv7ko+2NAWglki7QhVxZGnVS8nquuyV5HRtvbn/yB1qFa7pAl3lne/I6qq/HfucB7zVp4vztM9LoP0pWB9Dp6OkCXWb0qOgyvfOle37kvy9oAlTSiVFp9LyBTkfSBbpIVG/USt5PX7/0qS8nT6SV+io/53wyWXxA3pF0gS4RldC+fb//9mYTr99xL74SfRyJF0VG0gW6QO/E6DIrbs0+DileEp80Pvs4gHYg6QJd4MDm9OoK6omm2UPtezK9uoBOwuxloOD+9Oqh9369zEqydOX4Q8muLB0dkMbNlY48LY0dEz+e9V+JF8/KZdK37otfL5AH9HSBgrvlC95rUELdc2Do/ZxZw/cH9WAriTYo4QYdd80S7/VX+/z3V+Jct8p/P5BnJF2gy81YNPR+2121yTJsyPj9V3qvky4NLlNfV/XnsxY3FidQBCRdoMCSXmd940Dwvpdf914PHQkuE7YvDmYyo2hIukCXWzQneN/0RcH74gjrBS++JFndQB6RdIEuMRBw+8fHbm9tHBWPrPPf/s4zrY0DaCWSLlBQUyfVfj55lDdce3LVbSDjDN9ueKS58z+8NbpM9fnHjPY+j667HeTkCc2dH+hEJF2goPY94b99YLt0bIf3Ps4SoWu/Onzb8RO1n/v6h5e5Isbs48r5+7dIb2/zL3PwJ9H1AHlB0gW6UM+IZMefdFHt5975yeobf2qy44G8IOkCXS5Ob3fp6trPzoWX/8zX0jkvUDSRSdfMPmBmz1f9HDGzla0IDkBnuL/B20iu35RNHEDeRSZd59w/O+fOc86dJ2m2pAFJD2UeGYBErl8bv2yre52NnK+R7wF0ukaHlz8i
6V+cc7/MIhgA6Vl7fbr1fe7meOXSflpR2t8DaKdGk+5SSb63IDez5WZWNjPuIQPk0OKIi0bffdB73brLf/+mp73XoOfyVtTPav70ZdGxAUVhLmpGRKWg2UmS9kr69865/RFl41WaU3H/m+WVmbU7hMx1QxtGrcE9+3Lp1b212yrHBA3/Rj2JKGx/UN1x1goPO6YL2q/oityGpVJJ5XLZtxEb6ekulLQrKuECyIef3Tl828IV4cdMDLmtoySd9uHw/SvXhO8Hiq6RpLtMAUPLADrP5I+E7582Zfi2xyNuwXg44gEG/UfD99/exP9Bwu7fDORNrKRrZqdI+qikv8k2HABpefM3zR2X1UzmK29o7rikTyoCOklPnELOubclTYosCAABfril3REA7ccdqYAuNnVie89/4TntPT/QaiRdoMCihor3NXinqWoffJ80/wLpvdObr+PZDeH7uVUkiibW8DKA4gpb5rNoTrLn7S64Ttr8bPB5gW5D0gUKbtU6ac0Xw8v0b5EmzPPe798sTakbdr7mJunuR+Ofc84sadtd0hN3DG17da+3NliK18P+fMp3tgI6QeybYzRUKTfHyDUW5udffRvGvQFFpdzGzdKy1eHlG3Hv16VlC4afJyqeIN3WfkVU5DYMuzkGSbcJRf5lkfiDL4L6Npw8Id7D4ONeQ10yV7p2iTRvtnT4qPTz3dI31ksvvRJ9bJyEO+nS8KVC3dZ+RVTkNgxLugwvA12gr7/5Yzet9ZJskNPGSWdPk65aWLt92/PSJZ9t7pyszUVRkXSBLhFnWLcyqWpkj/Ru3QSoRmYSu7J08XlD5xt5oXT8RPJhZSDvSLpAF4l7PbWScJtNgNXHnXhOOrYjXl0kXBQd63SBLrP0xugyVgpOgDctlw4/5SXvys/Adm+7nxEXxEumf/yl6DJA3jGRqglFngAgMYmjCKLaMKi3W58cr5gnPXRb83EsW+3NhG7m3GG6vf2KoMhtyOzllBX5l0XiD74I4rTh29ukMaPrjitJfU9Kk8bXbh87V3prIP75J46T3vxp7bZvbpBuvGN40l16o3T/j+PXLdF+RVDkNmT2MoBhTrnYe61Pgj0jpJmXS6/tHX5MXIeO1PZcf/no8B6vxDVcdB+u6QJdrjrxubL08NZkCdfPWYu9db3VCZ6Ei27E8HITijwsIjG0VQTNtOFpY6VDT2UQTJ3e+cnWDUu0XxEUuQ3Dhpfp6QKQ5N1ZykrSyjXZ1L/i1sFrxgkTLpBn9HSbUOR/oUn8K7sI0mrDNJ4ElMUwMu2Xf0VuQ3q6AJpSWa9rpaGnEFVbtW74ttMX1B4HYAizlwHE8pu3/JPo2ntaHwuQV/R0AQBoEZIuAAAtQtIFAKBFsrqm2yfplxnV7Wfy4Dlbog0zC1v6/dqg5d+vxW1I+6WMv8HUFb0NW/39zgrakcmSoVYzs7JzrrDzJPl++cb3y7+if0e+X+swvAwAQIuQdAEAaJGiJN3vtTuAjPH98o3vl39F/458vxYpxDVdAADyoCg9XQAAOh5JFwCAFsl10jWzj5vZP5vZy2b25+2OJ21m9ldmdsDM/rHdsWTBzGaY2VNm9pKZvWhmX2h3TGkys9Fm9pyZvTD4/b7a7piyYGYjzOzvzezRdseSNjN7zcz+wcyeN7MUnrnUWcxsgpn9tZn9k5n9wsz+sN0xpcnMPjDYdpWfI2a2sq0x5fWarpmNkPT/SfqopD2S/k7SMufcS20NLEVmNlfSW5L+h3PunHbHkzYzO0PSGc65XWY2VtJOSVcUpQ3NW/1/inPuLTMbKWmbpC84555tc2ipMrPrJZUkjXPOLW53PGkys9cklZxzhbwxhpndLelnzrk7zewkSWOcc4V84vFgznhD0oXOuVbevKlGnnu6F0h62Tn3inPuXUkbJX2izTGlyjn3tKRD7Y4jK865Xzvndg2+PyrpF5KmtTeq9DjPW4MfRw7+5PNfuQHMbLqkyyTd2e5Y0BgzGy9prqS7JMk5
925RE+6gj0j6l3YmXCnfSXeapNerPu9Rgf6H3W3M7D2SPiRpR3sjSdfg0Ovzkg5I+rFzrlDfT9K3JH1J0r+2O5CMOEmbzWynmS1vdzApmynpoKT1g5cH7jSzU9odVIaWSrqv3UHkOemiIMzsVEkPSlrpnDvS7njS5Jw74Zw7T9J0SReYWWEuE5jZYkkHnHM72x1Lhi52zp0vaaGk/zx4yacoeiSdL+kvnXMfkvS2pMLNjZGkwaHzyyX9oN2x5DnpviFpRtXnQ6Y6tgAAAXNJREFU6YPbkCOD1zoflHSPc+5v2h1PVgaH7Z6S9PF2x5KiOZIuH7zuuVHSpWb2/faGlC7n3BuDrwckPSTvslZR7JG0p2r05a/lJeEiWihpl3Nuf7sDyXPS/TtJ7zezmYP/ilkqaVObY0IDBica3SXpF865te2OJ21m1mtmEwbfnyxv0t8/tTeq9DjnbnTOTXfOvUfe399PnXOfanNYqTGzUwYn+Glw2PVjkgqzksA5t0/S62b2gcFNH5FUiEmMPpapA4aWpewe7Zc559xxM7tO0hOSRkj6K+fci20OK1Vmdp+keZImm9keSV9xzt3V3qhSNUfS1ZL+YfC6pyStds79bRtjStMZku4enDX5e5IecM4VbllNgU2V9NDgI+h6JN3rnHu8vSGl7vOS7hnsuLwi6do2x5O6wX8wfVTSf2p3LFKOlwwBAJA3eR5eBgAgV0i6AAC0CEkXAIAWIekCANAiJF0AAFqEpAsAQIuQdAEAaJH/H3g7SUIqLC/qAAAAAElFTkSuQmCC\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + } + ], + "source": [ + "plot_NQueens(solution)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Lets' see if we can find a different solution." + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAd0AAAHwCAYAAADjD7WGAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8QZhcZAAAgAElEQVR4nO3df7QU5Z3v+8932Aii/BDYYASuwSQrZ80xYqRHM4NyiSEhIBi9Z24GrjFHc3M5N/cYQsTJjKyVFZN1EnNUIE7MnZwcHfCcUdGMY0SdKIkKBow4DaPMaGbuctRERH5sYQd0mwjMc/+o3dndvetXd1V1d1W/X2v1qu6qp576Ns/efPfz1FNV5pwTAADI3u+1OwAAALoFSRcAgBYh6QIA0CIkXQAAWoSkCwBAi5B0AQBoEZIuAAAtQtIFAKBFSLpAC5jZe83s78zssJntM7PbzKwnpPwEM/vLwbIDZvaPZvYfWxkzgPSRdIHW+H8lHZD0HknnSvpfJf0/fgXN7CRJP5V0pqQ/lDRe0p9KusnMVrQkWgCZIOkCrTFT0n3Oud845/ZJelTSvw8oe6Wk/0XS/+6ce8U5d8w596ikFZL+i5mdKklm5szs/ZWdzGyDmf2Xqs+Lzew5M+s3s6fN7JyqbWeY2f1mdtDMXqlO5mZ2g5ndZ2b/w8yOmtkLZlaq2v5nZvb64LZ/MbOPpfNPBBQfSRdoje9IWmpmY8xsmqSF8hKvn49L+rFz7u269fdLGiOv9xvKzD4s6a8k/SdJkyT9N0mbzGyUmf2epIckPS9pmqSPSVppZguqqrhU0kZJEyRtknTbYL0flHSNpD9wzo2VtEDSq1HxAPCQdIHWeEpez/aIpD2SypJ+FFB2sqQ36lc6545L6pPUG+N4yyX9N+fcDufcCefcnZJ+K+kjkv5AUq9z7hvOuXedcy9L+u+Sllbtv80593fOuROS/qekWYPrT0gaJen3zWykc+5V59y/xogHgEi6QOYGe5aPSvpbSafIS6qnSfqvAbv0yTv3W19Pz+C+fTEOe6akVYNDy/1m1i9phqQzBredUbdttaSpVfvvq3o/IGm0mfU4516StFLSDZIOmNlGMzsjRjwARNIFWmGivHO0tznnfuuce1PSekmLAsr/VNJCMzulbv1/kPSupB2DnwfkDTdXnF71/jVJ33TOTah6jXHO3TO47ZW6bWOdc0Hx1HDO3e2cu1Be8nYK/uMBQB2SLpAx51yfpFckfcHMesxsgqT/KGl3wC7/U94Q9A8HLzUaOXi+9S8k3eyc+/Vgueck/R9mNsLMPilvRnTFf5f0f5vZBeY5xcwuMbOxkp6VdHRwQtTJg/ufbWZ/EPVdzOyDZnaxmY2S9BtJ70j6t4b/UYAuRdIFWuN/k/RJSQclvSTpmKQv+xV0zv1W0nx5PdId8hLbo/ImY329quiXJC2R1C/pClWdI3bOlSX9X/ImQB0ePOZVg9tOSFos79KlV+QNV98u79KkKKMkfXtwn32Spki6PsZ+ACSZc67dMQAIYWYjJf1Y0uuSrnL80gK5RU8X6HDOuWPyzuf+q6QPtjkcAAnQ0wUAoEXo6QI
A0CKBN1xPYvLkye69731vFlV3hJ07d7Y7hEzNnj273SFkjjbMN9ov/4rehs4581ufyfByqVRy5XI59Xo7hZnvv2VhdMMph7Ta0KXwYz50V+P0FL0N+R3Mvy5oQ98vyPAy0KDrrvSSbRoJVxqq69or0qkPQOeip9uELvgLrd0hZK6ZNpw4TnrziQyCqTP1E9KBQ8nqKHob8juYf13Qhr5fMJNzukDRpNWrjWP/Zm+ZxbAzgPZieBmI0MqE2wnHBZAdki4Q4DdPtz/xubL0Jx9vbwwA0kPSBXy4sjTqpOT1XHNT8jo23tj+5A8gHZzTBeq8sz15HdXnY793n7dMmjh/87Q0+o+S1QGgvejpAnVGj4ou0ztfuuvH/tuCJkAlnRiVRs8bQHuRdIEqUb1RK3mvvn7pM19Nnkgr9VVeZ386WXwAOhtJFxgUldC+e6//+mYTr99+L7wcvR+JF8gvki4gqXdidJkVN2cfhxQviU+K87h5AB2HpAtIOrA5vbqCeqJp9lD7Hk+vLgCtw+xldL0/vXLovV8vs5IsXTn+ULIrS0cHpHFzpSNPSWPHxI9n/dfixbNymfSde+LXC6D96Omi6930JW8ZlFD3HBh6P2fW8O1BPdhKog1KuEH7XbXEW/5qn//2SpzrVvlvB9C5SLpAhBmLht5vu6M2WYYNGX/gcm856eLgMvV1VX8+c3FjcQLofCRddLWk51lfPxC87aXXvOWhI8FlwrbFwUxmIF9IukCERXOCt01fFLwtjrBe8OKLktUNoPOQdIFBAwG3f3zk1tbGUfHQOv/17zzd2jgApIeki641dVLt55NHecO1J1fdBjLO8O2Gh5o7/oNbo8tUH3/MaO/z6LrbQU6e0NzxAbQeSRdda99j/usHtkvHdnjv41widPXXh687fqL2c1//8DKXxZh9XDl+/xbp7W3+ZQ7+NLoeAJ2BpAv46BmRbP+TPlL7uXd+svrGn5psfwCdgaQLRIjT2126uvazc+HlP/eNdI4LIF9IukAK7m3wNpLrN2UTB4DOFivpmtknzexfzOwlM/vzrIMCWuHatfHLtrrX2cjxGvkeANorMuma2QhJ35O0UNLvS1pmZr+fdWBA1tZem259X7gxXrm0n1aU9vcAkJ04Pd3zJb3knHvZOfeupI2SPpVtWEDnWbwyfPv37/eWW3f5b9/0lLcMei5vRf2s5s9eEh0bgHyIk3SnSXqt6vOewXU1zGy5mZXNrHzw4MG04gPaZuYZtZ8fCbhkp9685f7rPxWzR1p//e6dPpckAcin1CZSOed+4JwrOedKvb29aVULtM3Pbh++buGK8H0mhtzWUZJO+2j49pVrwrcDyLc4Sfd1STOqPk8fXAfk2uSPhW+fNmX4ukcjbsF4OOIBBv1Hw7ff2sTzccPu3wygs8RJun8v6QNmNtPMTpK0VBIXPCD33vx1c/tlNZP58uua2y/pk4oAtE5PVAHn3HEzu0bSY5JGSPor59wLmUcGdJkfbWl3BACyFpl0Jck593eS/i7jWICOM3WitP9Q+45/wdntOzaA9HFHKnS1qKHifQ3eaarah94vzT9fet/05ut4ZkP4dm4VCeRLrJ4u0M1cOTi5LZqT7Hm7C66RNj8TfFwAxULSRddbtU5a8+XwMv1bpAnzvPf7N0tTJtZuv+oG6c6H4x9zzixp2x3SY7cNrXtlr3TWpd77OD3sL6Z8ZysA2TMX9TiUJpRKJVcuF/fPdDNrdwiZyuJnotPUt2GcXqWVhspt3CwtWx1evhF3f1NatmD4caLiCVL0NuR3MP+6oA19vyBJtwld8MPS7hAyV9+GkyfEexh83HOoS+ZKVy+R5s2WDh+Vfr5b+tZ66cWXo/eNk3AnXRx+qVDR25Dfwfzrgjb0/YIMLwOS+vqb33fTWi/JBjltnHTWNOmKhbXrtz0nXfT55o7JtblAPpF0gUFxhnUrk6pG9kjv1k2AamQmsStLF547dLyRF0jHTyQfVgbQ2Ui6QJW451M
rCbfZBFi934lnpWM74tVFwgXyjet0gTpLr48uY6XgBHjDcunwk17yrrwGtnvr/Yw4P14y/eOvRJcB0NmYSNWELpgA0O4QMhfVhkG93frkeNk86YFbmo9j2WpvJnQzxw5T9DbkdzD/uqANmb2cli74YWl3CJmL04Zvb5PGjK7bryT1PS5NGl+7fuxc6a2B+MefOE5684nadd/eIF1/2/Cku/R66d6fxK9bKn4b8juYf13QhsxeBhpxyoXesj4J9oyQZl4qvbq3+boPHantuf7y4eE9XolzuEDRcE4XiFCd+FxZenBrsoTr58zF3nW91QmehAsUD8PLTeiCYZF2h5C5ZtrwtLHSoSczCKZO7/xk1w1LxW9Dfgfzrwva0PcL0tMFYjp81Ot9rlyTTf0rbh48Z5ww4QLoXPR0m9AFf6G1O4TMpdWGaTwJKIth5KK3Ib+D+dcFbUhPF0hb5XpdKw09hajaqnXD152+oHY/AN2D2ctASn79ln8SXXtX62MB0Jno6QIA0CIkXQAAWoSkCwBAi5B0AQBokUwmUu3cubPQ08GLPp2/yG1XQRvmG+2Xf0Vuw1Ip+LIEZi93ihOHpecm1qxatU5a8+W6cufslUa+p3VxAQBSQ9Jtp53hf80OS7iStPuM2s+zi/vXIgAUDed0W23/zV6yjUi4sVXq2p/RvQkBAKkh6bbKsTe95LjnK9nUv+c6r/5j+7OpHwCQGMPLrZBWrzaO3ad7S4adAaDj0NPNWisTbiccFwAQiKSblV2j2p/4dpp0aGN7YwAA/A5JNws7TXLvJq7mmptSiOWVZe1P/gAASZzTTd+u0YmrqH5Szffu85aJn9u6a5R03m8TVgIASIKebtpcdGLrnS/d9WP/bUHPV0383NUUet4AgGRIummKGMatPLS8r1/6zFeTJ9LqB6FbSTr708niAwBki6SbloiE9t17/dc3m3j99nvh5Rg7kngBoG1Iumk4fiCyyIqbWxCHYibx432ZxwEAGI6km4bnp6ZWVdCEqcQTqao935tiZQCAuJi9nNQbQ9f1+PUyK8nSleMPJbuydHRAGjdXOvKUNHZM/HDWf23ofVg82rdOOt3viQoAgKzQ001q759JCk6oe6pGnufMGr49qAdbSbRBCTdov6uWeMtf7fPf/rs4X7/WvwAAIDMk3YzNWDT0ftsdtckybMj4A5d7y0kXB5epr6v685mLG4sTAJA9km4SCWcCvx4y/+ql17zloSPBZcK2xcJMZgBoKZJuxhbNCd42fVHwtjjCesGLL0pWNwAgfSTdlAxs91//yK2tjaPioXX+6995urVxAACGkHSbdax2ptLJo7xzqiePGloX5zKfDQ81d/gHt0aXqT7+mNHe59En1RU6drC5AAAADSPpNmv3e3xXD2yXju3w3se5ROjqrw9fd/xE7ee+/uFlLlsVXXfl+P1bpLe3BRTaPSW6IgBAKki6GegZkWz/kz5S+7l3frL6xp+abH8AQDpIuhmL09tdurr2s3Ph5T/3jXSOCwBorcika2Z/ZWYHzOyfWhFQN7p3c2Pl12/KJg4AQLbi9HQ3SPpkxnHkzrVr45dtda+zkeM18j0AAMlEJl3n3FOSDrUgllxZm/JdFL9wY7xyaT+tKO3vAQAIxjndFlm8Mnz79+/3llt3+W/f9JS3DHoub0X9rObPXhIdGwCgNVJLuma23MzKZpbmQ+hya+YZtZ8fCbpkp8685f7rPxWzR1p//e6dPpckAQDaI7Wk65z7gXOu5Jxj3qykn90+fN3CFeH7TAy5raMknfbR8O0r14RvBwC0F8PLzZoVfienaT73nHg04haMhyMeYNB/NHz7rfeEb/d1Tl8TOwEAmhHnkqF7JP1c0gfNbI+Z/Z/Zh5UDPZOb2i2rmcyXX9fkjiMnpRoHACBYT1QB59yyVgSCZH60pd0RAACiMLycoakT23v8C85u7/EBALVIuknMDr9f474G7zRV7UPvl+afL71vevN1PLMhokBE/AC
AdEUOLyMZVw4+j7toTrLn7S64Rtr8TPBxAQCdhaSb1PRbpD3hs5j6t0gT5nnv92+WptQNO191g3Tnw/EPOWeWtO0O6bHbhta9slc661Lvfawe9oy/iH9AAEAqzEU90qaZSs0KPW457N9sp0XuY6Wh3ufGzdKy1eHlG3H3N6VlC4YfJ1TI0LJZ9PfJuyx+7jtJ0duQ9su/IrdhqVRSuVz2bUSSbhOG/ZsdOxjrYfBxLxdaMle6eok0b7Z0+Kj0893St9ZLL74cI7Y4CfecvtBLhfiFz7+ityHtl39FbsOwpMvwchpG9ja966a1XpINcto46axp0hULa9dve0666PNNHpRrcwGgLUi6aZntIoeZK5OqRvZI79ZNgGrkphmuLF147lCvduQF0vETyYeVAQDZIummKUbilYYSbrN3p6re78Sz0rEdMesi4QJAW3GdbtpmRt8A2UrBSfKG5dLhJ71ea+U1sN1b72fE+TET7swfxigEAMgSE6maEPlvFtDbrU+Ol82THril+TiWrfZmQtfEFjTE3EAvl0kc+Vf0NqT98q/Ibcjs5ZTF+jfbNUZy79SsspLU97g0aXxt0bFzpbcG4h9/4jjpzSdq1317g3T9bT5Jd+Y90sSl8SsXv/BFUPQ2pP3yr8htyOzldjhvMIvW9Xp7RkgzL5Ve3dt81YeO1Paaf/nw8B6vJM7hAkCH4Zxu1qoSnytLD25NlnD9nLnYu663ppdLwgWAjsPwchOa+jc7dkja3YLrY885kOi6YYmhrSIoehvSfvlX5DYMG16mp9sqIyd6vc8Z67Kpf8atXv0JEy4AIDuc0221KSu9lxTrmt5IDCMDQG7Q022n2W7oNevwsM2r/DrF57xRux8AIDfo6XaKngnDkuiav25TLACATNDTBQCgRUi6AAC0CEkXAIAWyeSc7uzZs1Uux3nOXD4V/Rq6Il8/V0Eb5hvtl39Fb8Mg9HQBAGgRZi8DAHIr8MlqDWj22ebNoKcLAMiV664cet54Gip1XXtFOvWFyeTey6VSyXFON784n5R/RW9D2i//mmlDv8eaZmHqJ6QDh5LV4Zzj0X4AgHxKq1cbx/7BR6VmMezM8DIAoKO1MuFmfVySLgCgI/3m6fYl3ApXlv7k4+nVR9IFAHQcV5ZGnZS8nmtuSl7HxhvTS/6c0wUAdJR3tievo/p87Pfu85ZJE+dvnpZG/1GyOujpAgA6yuhR0WV650t3/dh/W9AEqKQTo9LoeZN0AQAdI6o3aiXv1dcvfearyRNppb7K6+xPJ4svCkkXANARohLad+/1X99s4vXb74WXo/dLknhJugCAtuudGF1mxc3ZxyHFS+KTxjdXN0kXANB2BzanV1dQTzTNy4/6Hm9uP2YvAwDa6k+vHHrv18usJEtXjj+U7MrS0QFp3FzpyFPS2DHx41n/tXjxrFwmfeee+PVK9HQBAG1205e8ZVBC3XNg6P2cWcO3B/VgK4k2KOEG7XfVEm/5q33+2ytxrlvlvz0MSRcA0NFmLBp6v+2O2mQZNmT8gcu95aSLg8vU11X9+czFjcUZB0kXANA2Sc+zvn4geNtLr3nLQ0eCy4Rti6PR+Em6AICOtmhO8Lbpi4K3xRHWC158UbK6/ZB0AQAdYSDg9o+P3NraOCoeWue//p2nm6+TpAsAaIupk2o/nzzKG649ueo2kHGGbzc81NzxH9waXab6+GNGe59H190OcvKE+Mck6QIA2mLfY/7rB7ZLx3Z47+NcInT114evO36i9nNf//Ayl8WYfVw5fv8W6e1t/mUO/jS6ngqSLgCg4/SMSLb/SR+p/dw7P1l9409Ntn8FSRcA0NHi9HaXrq797Fx4+c99I53jNoqkCwDIvXsbvI3k+k3ZxBElMuma2Qwze9LMXjSzF8zsS60IDABQbNeujV82i15nWsdr5HvE6ekel7TKOff7kj4i6T+b2e/HPwQAAMOtvTbd+r5wY7xyaT+tqJHvEZl0nXNvOOd2Db4/KukXkqY
1GxwAAM1YvDJ8+/fv95Zbd/lv3/SUtwx6Lm9F/azmz14SHVtcDZ3TNbP3SvqwpB0+25abWdnMygcPHkwnOgBA15p5Ru3nRwIu2ak3b7n/+k/F7JHWX797p88lSc2KnXTN7FRJ90ta6ZwbdrdK59wPnHMl51ypt7c3vQgBAF3pZ7cPX7dwRfg+E0Nu6yhJp300fPvKNeHbk4qVdM1spLyEe5dz7m+zDQkA0A0mfyx8+7Qpw9c9GnELxsMRDzDoPxq+/dYGn48rhd+/uV6c2csm6Q5Jv3DONTBHCwCAYG/+urn9sprJfPl1ze3XyJOK4vR050i6UtLFZvbc4Cvhcx0AAOgsP9qS/TF6ogo457ZJsuxDAQCg1tSJ0v5D7Tv+BWenWx93pAIAtE3UUPG+Bu80Ve1D75fmny+9b3rzdTyzIXx7o0PdkT1dAADayZWDk9uiOcmet7vgGmnzM8HHTRtJFwDQVqvWSWu+HF6mf4s0YZ73fv9macrE2u1X3SDd+XD8Y86ZJW27Q3rstqF1r+yVzrrUex+nh/3FJu5sZS7qUQxNKJVKrlzO4E+EDuFN6C6uLH4mOg1tmG+0X/7Vt2GcXqWVhspt3CwtWx1evhF3f1NatmD4caLiCeKc8/0hJek2gV/4/KMN8432y7/6Npw8Id7D4OOeQ10yV7p6iTRvtnT4qPTz3dK31ksvvhy9b5yEO+ni8EuFgpIuw8sAgLbr629+301rvSQb5LRx0lnTpCsW1q7f9px00eebO2Yj1+ZWI+kCADpCnGHdyqSqkT3Su3UToBqZSezK0oXnDh1v5AXS8RPJh5WjkHQBAB0j7vnUSsJtNgFW73fiWenYjnh1Jb0bFtfpAgA6ytLro8tYKTgB3rBcOvykl7wrr4Ht3no/I86Pl0z/+CvRZaIwkaoJTOLIP9ow32i//Itqw6Debn1yvGye9MAtzcexbLU3E7qZY4dh9nKK+IXPP9ow32i//IvThm9vk8aMrtuvJPU9Lk0aX7t+7FzprYH4x584Tnrzidp1394gXX/b8KS79Hrp3p/Er1ti9jIAIGdOudBb1ifBnhHSzEulV/c2X/ehI7U9118+PLzHK6X/RCPO6QIAOlp14nNl6cGtyRKunzMXe9f1Vif4LB4hyPByExjayj/aMN9ov/xrpg1PGysdejKDYOr0zk923bAUPLxMTxcAkAuHj3q9z5Vrsql/xc2D54wTJtww9HSbwF/Z+Ucb5hvtl39ptWEaTwLKYhiZni4AoHAq1+taaegpRNVWrRu+7vQFtfu1ErOXAQCF8Ou3/JPo2rtaH0sQeroAALQISRcAgBYh6QIA0CIkXQAAWiSTiVQ7d+4s9JT+ok/nL3LbVdCG+Ub75V+R27BUCp4STU8XQCwTxtY+Ks2VpWuvGL7u9EntjhToXFwyBCBQ1I0H1nx5+Lo3Hqv93OrrIIFORk8XQI3rrhzqtaahulcMdLtMbgNpZsUdrFexz0VInE8qgmba0O/5olmY+gnpwKFkddB++VfkNiyVSiqXyzxPF4C/tHq1cewffGYpw87oRgwvA12ulQm3E44LtBNJF+hSv3m6/YnPlaU/+Xh7YwBaiaQLdCFXlkadlLyea25KXsfGG9uf/IFW4Zwu0GXe2Z68jurzsd+7z1smTZy/eVoa/UfJ6gA6HT1doMuMHhVdpne+dNeP/bcFTYBKOjEqjZ430OlIukAXieqNVh7q3dcvfearyRNp9YPCrSSd/elk8QF5R9IFukRUQvvuvf7rm028fvu98HL0fiReFBlJF+gCvROjy6y4Ofs4pHhJfNL47OMA2oGkC3SBA5vTqyuoJ5pmD7Xv8fTqAjoJs5eBgvvTK4fe+/UyK8nSleMPJbuydHRAGjdXOvKUNHZM/HjWfy1ePCuXSd+5J369QB7Q0wUK7qYvecughLrnwND7ObOGbw/qwVYSbVDCDdrvqiXe8lf7/LdX4ly3yn87kGckXaDLzVg09H7bHbXJMmzI+AOXe8t
JFweXqa+r+vOZixuLEygCki5QYEnPs75+IHjbS695y0NHgsuEbYuDmcwoGpIu0OUWzQneNn1R8LY4wnrBiy9KVjeQRyRdoEsMBNz+8ZFbWxtHxUPr/Ne/83Rr4wBaiaQLFNTUSbWfTx7lDdeeXHUbyDjDtxseau74D26NLlN9/DGjvc+j624HOXlCc8cHOhFJFyiofY/5rx/YLh3b4b2Pc4nQ1V8fvu74idrPff3Dy1wWY/Zx5fj9W6S3t/mXOfjT6HqAvCDpAl2oZ0Sy/U/6SO3n3vnJ6ht/arL9gbwg6QJdLk5vd+nq2s/OhZf/3DfSOS5QNJFJ18xGm9mzZva8mb1gZj6DTQCK7N4GbyO5flM2cQB5F6en+1tJFzvnZkk6V9InzewjEfsAaLNr18Yv2+peZyPHa+R7AJ0uMuk6z1uDH0cOviIGlwC029pr063vCzfGK5f204rS/h5AO8U6p2tmI8zsOUkHJP3EObfDp8xyMyubGfeQAXJo8crw7d+/31tu3eW/fdNT3jLoubwV9bOaP3tJdGxAUcRKus65E865cyVNl3S+mZ3tU+YHzrmSc47pEUAOzDyj9vMjAZfs1Ju33H/9p2L2SOuv372TWSLoIg3NXnbO9Ut6UtInswkHQKv87Pbh6xauCN9nYshtHSXptI+Gb1+5Jnw7UHRxZi/3mtmEwfcnS/q4pH/OOjAAyUz+WPj2aVOGr3s04haMhyMeYNB/NHz7rU08Hzfs/s1A3sR5iP17JN1pZiPkJen7nHMPZxsWgKTe/HVz+2U1k/ny65rbL+mTioBOEpl0nXO7JX24BbEAKLAfbWl3BED7cUcqoItNndje418wbEomUGwkXaDAooaK9zV4p6lqH3q/NP986X3Tm6/jmQ3h27lVJIomzjldAAXmysHJbdGcZM/bXXCNtPmZ4OMC3YakCxTcqnXSmi+Hl+nfIk2Y573fv1maUjfsfNUN0p0NTJ+cM0vadof02G1D617ZK511qfc+Tg/7iynf2QroBOaiHhfSTKVmhb5NZBb/Zp3EzNodQua6rQ3j9CqtNFRu42Zp2erw8o24+5vSsgXDjxMVT5Bua78iKnIblkollctl30Yk6TahyD8sEr/wRVDfhpMnxHsYfNxzqEvmSlcvkebNlg4flX6+W/rWeunFl6P3jZNwJ10cfqlQt7VfERW5DcOSLsPLQBfo629+301rvSQb5LRx0lnTpCsW1q7f9px00eebOybX5qKoSLpAl4gzrFuZVDWyR3q3bgJUIzOJXVm68Nyh4428QDp+IvmwMpB3JF2gi8Q9n1pJuM0mwOr9TjwrHdsRry4SLoqO63SBLrP0+ugyVgpOgDcslw4/6SXvymtgu7fez4jz4yXTP/5KdBkg75hI1YQiTwCQmMRRBFFtGNTbrU+Ol82THril+TiWrfZmQjdz7DDd3n5FUOQ2ZPZyyor8wyLxC18Ecdrw7W3SmNF1+5WkvselSeNr14+dK701EP/4E8dJbz5Ru+7bG6TrbweDeHgAACAASURBVBuedJdeL937k/h1S7RfERS5DZm9DGCYUy70lvVJsGeENPNS6dW9zdd96Ehtz/WXDw/v8Uqcw0X34Zwu0OWqE58rSw9uTZZw/Zy52LuutzrBk3DRjRhebkKRh0UkhraKoJk2PG2sdOjJDIKp0zs/2XXDEu1XBEVuw7DhZXq6ACR5d5aykrRyTTb1r7h58JxxwoQL5Bk93SYU+S80ib+yiyCtNkzjSUBZDCPTfvlX5DakpwugKZXrda009BSiaqvWDV93+oLa/QAMYfYygFh+/ZZ/El17V+tjAfKKni4AAC1C0gUAoEVIugAAtEgm53Rnz56tcjmFaY8dqugzC4s8q7CCNsw32i//it6GQejpAgDQIsxeBoAi25lCj3J28XverUJPFwCKZv/NXrJNI+FKQ3Xtz+h2ZV2EpAsARXHsTS857vlKNvXvuc6r/9j+bOrvAgwvA0ARpNWrjWP36d6SYee
G0dMFgLxrZcLthOPmGEkXAPJq16j2J76dJh3a2N4YcoSkCwB5tNMk927iaq65KYVYXlnW/uSfE5zTBYC82TU6cRXVD6/43n3eMvGjHHeNks77bcJKio2eLgDkjYtObL3zpbt+7L8t6JGLiR/FmELPu+hIugCQJxHDuJXnGPf1S5/5avJEWv1sZCtJZ386WXzdjqQLAHkRkdC+e6//+mYTr99+L7wcY0cSbyCSLgDkwfEDkUVW3NyCOBQziR/vyzyOPCLpAkAePD81taqCJkwlnkhV7fneFCsrDmYvA0Cne2Pouh6/XmYlWbpy/KFkV5aODkjj5kpHnpLGjokfzvqvDb0Pi0f71kmnfzl+xV2Ani4AdLq9fyYpOKHuqRp5njNr+PagHmwl0QYl3KD9rlriLX+1z3/77+J8/Vr/Al2MpAsAOTdj0dD7bXfUJsuwIeMPXO4tJ10cXKa+rurPZy5uLE6QdAGgsyWcCfx6yPyrl17zloeOBJcJ2xYLM5lrkHQBIOcWzQneNn1R8LY4wnrBiy9KVnc3IukCQE4MbPdf/8itrY2j4qF1/uvfebq1ceQJSRcAOtWx2plKJ4/yzqmePGpoXZzLfDY81NzhH9waXab6+GNGe59Hn1RX6NjB5gIoIJIuAHSq3e/xXT2wXTq2w3sf5xKhq78+fN3xE7Wf+/qHl7lsVXTdleP3b5He3hZQaPeU6Iq6BEkXAHKoZ0Sy/U/6SO3n3vnJ6ht/arL9uwVJFwByLk5vd+nq2s/OhZf/3DfSOS5qxU66ZjbCzP7BzB7OMiAAQPru3dxY+fWbsomj2zXS0/2SpF9kFQgAoNa1a+OXbXWvs5HjNfI9ii5W0jWz6ZIukXR7tuEAACrWpnwXxS/cGK9c2k8rSvt75Fncnu53JH1F0r8FFTCz5WZWNrPywYNMDweAVlu8Mnz79+/3llt3+W/f9JS3DHoub0X9rObPXhIdGzyRSdfMFks64JzbGVbOOfcD51zJOVfq7eWRTgCQtZln1H5+JOiSnTrzlvuv/1TMHmn99bt3+lySBH9xerpzJF1qZq9K2ijpYjP760yjAgBE+pnPCb+FK8L3mRhyW0dJOu2j4dtXrgnfjnCRSdc5d71zbrpz7r2Slkp6wjn3mcwjA4BuNyv8VN00n3tOPBpxC8bDEQ8w6D8avv3We8K3+zqnr4mdionrdAGgU/VMbmq3rGYyX35dkzuOnJRqHHnW00hh59wWSVsyiQQA0NF+tKXdEeQfPV0AyLGpE9t7/AvObu/x84akCwCdbHb4/Rr3NXinqWofer80/3zpfdObr+OZDREFIuLvNg0NLwMAOo8rB5/HXTQn2fN2F1wjbX4m+LhoDEkXADrd9FukPeGzmPq3SBPmee/3b5am1A07X3WDdGcDd86fM0vadof02G1D617ZK511qfc+Vg97xl/EP2CXMBf1qIkmlEolVy4X908gM2t3CJnK4mei09CG+daV7bcz+jtbaaj3uXGztGx1ePlG3P1NadmC4ccJFTK03AVt6PsFSbpN6IIflnaHkDnaMN+6sv2OHYz1MPi4lwstmStdvUSaN1s6fFT6+W7pW+ulF1+OEV+c/97P6Qu9VKgL2tD3CzK8DAB5MLL52+tuWusl2SCnjZPOmiZdsbB2/bbnpIs+3+RBuTbXF0kXAPJitoscZq5MqhrZI71bNwGqkZtmuLJ04blDvdqRF0jHTyQfVu52JF0AyJMYiVcaSrjN3p2qer8Tz0rHdsSsi4Qbiut0ASBvZkbfANlKwUnyhuXS4Se9XmvlNbDdW+9nxPkxE+7MH8Yo1N2YSNWELpgA0O4QMkcb5hvtp8Debn1yvGye9MAtzceybLU3E7pa4BBzA73cLmhDZi+npQt+WNodQuZow3yj/QbtGiO5d2pWWUnqe1yaNL626Ni50lsD8WOYOE5684nadd/eIF1/m0/SnXmPNHFp/MrVFW3I7GUAKJTzBrNoXa+3Z4Q081Lp1b3NV33oSG2
v+ZcPD+/xSuIcboM4pwsAeVeV+FxZenBrsoTr58zF3nW9Nb1cEm7DGF5uQhcMi7Q7hMzRhvlG+wU4dkja3YLrY885kOi6Yakr2tD3C9LTBYCiGDnR633OWJdN/TNu9epPmHC7Ged0AaBopqz0XlKsa3ojMYycGnq6AFBks93Qa9bhYZtX+XWKz3mjdj+khp4uAHSLngnDkuiav25TLF2Kni4AAC1C0gUAoEVIugAAtEgm53R37txZ6GuwuAYy/2jDfKP98q/IbVgqBT8dgp4uAAAt0rGzl2M9KDlCs8+RBAAgCx3V073uyqFnO6ahUte1V6RTHwAASWRy72Uza6hSv0dIZWHqJ6QDh5LXU+RzERLnk4qg6G1I++VfkduwVCqpXC535qP90urVxrF/8LFUDDsDANqhrcPLrUy4nXBcAEB3a0vS/c3T7U98riz9ycfbGwMAoLu0POm6sjTqpOT1XHNT8jo23tj+5A8A6B4tPaf7zvbkdVSfj/3efd4yaeL8zdPS6D9KVgcAAFFa2tMdPSq6TO986a4f+28LmgCVdGJUGj1vAACitCzpRvVGreS9+vqlz3w1eSKt1Fd5nf3pZPEBAJBUS5JuVEL77r3+65tNvH77vfBy9H4kXgBAljJPur0To8usuDnrKDxxkvik8dnHAQDoTpkn3QOb06srqCeaZg+17/H06gIAoFqms5f/9Mqh9369zEqydOX4Q8muLB0dkMbNlY48JY0dEz+e9V+LF8/KZdJ37olfLwAAcWTa073pS94yKKHuOTD0fs6s4duDerCVRBuUcIP2u2qJt/zVPv/tlTjXrfLfDgBAEm29DeSMRUPvt91RmyzDhow/cLm3nHRxcJn6uqo/n7m4sTgBAEhDZkk36XnW1w8Eb3vpNW956EhwmbBtcTCTGQCQtrb2dBfNCd42fVHwtjjCesGLL0pWNwAAzWhJ0h0IuP3jI7e24ujDPbTOf/07T7c2DgBAd8kk6U6dVPv55FHecO3JVbeBjDN8u+Gh5o7/4NboMtXHHzPa+zy67naQkyc0d3wAAPxkknT3Pea/fmC7dGyH9z7OJUJXf334uuMnaj/39Q8vc1mM2ceV4/dvkd7e5l/m4E+j6wEAIK6Wn9PtGZFs/5M+Uvu5d36y+safmmx/AADiautEqji93aWraz87F17+c99I57gAAKQtVtI1s1fN7B/N7Dkza+nFNPc2eBvJ9ZuyiQMAgKQa6el+1Dl3rnMusp947dr4lba619nI8Rr5HgAARMlkeHnttenW94Ub45VL+2lFaX8PAEB3i5t0naTNZrbTzJb7FTCz5WZWbmb4efHK8O3fv99bbt3lv33TU94y6Lm8FfWzmj97SXRsAACkxVzUzCRJZjbNOfe6mU2R9BNJX3TOPRW4w04LrfSsS6VX9tauq1w3GzT8G/UkorDtQXXHuVbY92lEMf7N8szM2h1C5mjDfKP98q/IbVgqlVQul30bMVZP1zn3+uDygKQHJJ2fJKCf3T583cIV4ftMDLmtoySd9tHw7SvXhG8HACBrkUnXzE4xs7GV95I+IemfwvaZ/LHwOqdNGb7u0YhbMB6OeIBB/9Hw7bc28XzcsPs3AwDQqDgPsZ8q6YHB4Y4eSXc75x4N2+HNXzcXTFYzmS+/rrn9kj6pCACAapFJ1zn3siSfR8znx4+2tDsCAADaeEeqqRPbdWTPBWe39/gAgO6TWdKNGire1+Cdpqp96P3S/POl901vvo5nNoRv51aRAIC0xTmnm5mwy3wWzUn2vN0F10ibnwk+LgAArZZp0l21Tlrz5fAy/VukCfO89/s3S1Pqhp2vukG68+H4x5wzS9p2h/TYbUPrXtnrXRssxethfzHlO1sBACDFvDlGw5Xa0M0x4t6AolJu42Zp2erw8o24+5vSsgXDjxMVT5giX9QtcWF+ERS9DWm//CtyG4bdHCPzpDt5QryHwcc9h7pkrnT1EmnebOnwUennu6VvrZdefDl63zgJd9L
F0ZcKFfmHReIXvgiK3oa0X/4VuQ3Dkm7m53T7+pvfd9NaL8kGOW2cdNY06YqFteu3PSdd9Pnmjsm1uQCArLRkIlWcYd3KpKqRPdK7dROgGplJ7MrShecOHW/kBdLxE+kMKwMAkETLZi/HPZ9aSbjNJsDq/U48Kx3bEa8uEi4AIGstvTnG0uujy1gpOAHesFw6/KSXvCuvge3eej8jzo+XTP/4K9FlAABIKvOJVPWCerv1yfGyedIDtzQfw7LV3kzoZo4dpcgTACQmcRRB0duQ9su/IrdhW2cv+3l7mzRmdN0+JanvcWnS+Nr1Y+dKbw3EP/bEcdKbT9Su+/YG6frbhifdpddL9/4kft0VRf5hkfiFL4KityHtl39FbsO2zl72c8qF3rI+CfaMkGZeKr26d/g+cR06Uttz/eXDw3u8EudwAQCt17YHHki1ic+VpQe3Jku4fs5c7F3XW53gSbgAgHZoy/ByvdPGSoeeTD2MYXrnJ7tuuKLIwyISQ1tFUPQ2pP3yr8htGDa83NaebsXho17vc+WabOpfcfPgOeMUEi4AAM3qiJ6unzSeBJTVMHKR/0KT+Cu7CIrehrRf/hW5DTu+p+uncr2ulYaeQlRt1brh605fULsfAACdpK3P043r12/5J9G1d7U+FgAAmtWxPV0AAIqGpAsAQIuQdAEAaJFMzunOnj1b5XIK0487VNFnFhZ5VmEFbZhvtF/+Fb0Ng9DTBQCgRUi6AAC0SC4uGUJO7Uxh+Gh28YfZAHQPerpI1/6bvWSbRsKVhuran9E9QgGghUi6SMexN73kuOcr2dS/5zqv/mP7s6kfAFqA4WUkl1avNo7dp3tLhp0B5BA9XSTTyoTbCccFgARIumjOrlHtT3w7TTq0sb0xAEADSLpo3E6T3LuJq7nmphRieWVZ+5M/AMTEOV00ZtfoxFVUPzHqe/d5y8TPT941SjrvtwkrAYBs0dNFY1x0YuudL931Y/9tQc85Tvz84xR63gCQNZIu4osYxrWS9+rrlz7z1eSJtFJf5XX2p5PFBwDtRtJFPBEJ7bv3+q9vNvH67ffCyzF2JPEC6GAkXUQ7fiCyyIqbWxCHYibx432ZxwEAzSDpItrzU1OrKmjCVOKJVNWe702xMgBID7OXEe6Noet6/HqZlWTpyvGHkl1ZOjogjZsrHXlKGjsmfjjrvzb0Piwe7Vsnnf7l+BUDQAvQ00W4vX8mKTih7qkaeZ4za/j2oB5sJdEGJdyg/a5a4i1/tc9/++/ifP1a/wIA0EYkXSQyY9HQ+2131CbLsCHjD1zuLSddHFymvq7qz2cubixOAOgEJF0ESzgT+PWQ+VcvveYtDx0JLhO2LRZmMgPoMCRdJLJoTvC26YuCt8UR1gtefFGyugGgHUi6iGVgu//6R25tbRwVD63zX//O062NAwAaQdKFv2O1M5VOHuWdUz151NC6OJf5bHioucM/uDW6TPXxx4z2Po8+qa7QsYPNBQAAGSDpwt/u9/iuHtguHdvhvY9zidDVXx++7viJ2s99/cPLXLYquu7K8fu3SG9vCyi0e0p0RQDQIiRdNKxnRLL9T/pI7efe+cnqG39qsv0BoFViJV0zm2Bmf2Nm/2xmvzCzP8w6MORDnN7u0tW1n50LL/+5b6RzXADoNHF7urdKetQ59+8kzZL0i+xCQtHcu7mx8us3ZRMHALRbZNI1s/GS5kq6Q5Kcc+8653zOwqFIrl0bv2yre52NHK+R7wEAWYvT050p6aCk9Wb2D2Z2u5mdknFcaLO1Kd9F8Qs3xiuX9tOK0v4eAJBEnKTbI+k8SX/pnPuwpLcl/Xl9ITNbbmZlMysfPMhlGt1m8crw7d+/31tu3eW/fdNT3jLoubwV9bOaP3tJdGwA0CniJN09kvY45wYvFNHfyEvCNZxzP3DOlZxzpd5eHq1WdDPPqP38SNAlO3XmLfdf/6mYPdL663fv9LkkCQA6VWTSdc7tk/SamX1wcNXHJL2YaVToeD+7ffi
6hSvC95kYcltHSTrto+HbV64J3w4AnS7u83S/KOkuMztJ0suSrs4uJHSEWQdDHwY/zeeeE49G3ILxcMQDDPqPhm+/9Z7w7b7O6WtiJwDIRqyk65x7ThJXRnaTnslN7ZbVTObLr2tyx5GTUo0DAJLgjlTIhR9taXcEAJAcSRdNmzqxvce/4Oz2Hh8AGkXSRbDZ4fdr3Nfgnaaqfej90vzzpfdNb76OZzZEFIiIHwBaLe5EKsCXKwefx100J9nzdhdcI21+Jvi4AJA3JF2Em36LtCd8FlP/FmnCPO/9/s3SlLph56tukO58OP4h58yStt0hPXbb0LpX9kpnXeq9j9XDnvEX8Q8IAC1iLuqRL00olUquXC5uV8TM2h1Cpob9TOyM/r5WGup9btwsLVsdXr4Rd39TWrZg+HFCRQwtd10bFgztl39d0Ia+X5Ck24Qu+GGpXXHsYKyHwce9XGjJXOnqJdK82dLho9LPd0vfWi+9+HKM2OL8WJ3TF3mpUNe1YcHQfvnXBW3o+wUZXka0kc3f1nPTWi/JBjltnHTWNOmKhbXrtz0nXfT5Jg/KtbkAOhRJF/HMdpHDzJVJVSN7pHfrJkA1ctMMV5YuPHeoVzvyAun4iXSGlQGgnUi6iC9G4pWGEm6zd6eq3u/Es9KxHTHrIuEC6HBcp4vGzIy+AbKVgpPkDculw096vdbKa2C7t97PiPNjJtyZP4xRCADai4lUTeiCCQDhBQJ6u/XJ8bJ50gO3NB/HstXeTOia2IJ+rBrs5XZ9G+Yc7Zd/XdCGzF5OSxf8sEQX2jVGcu/UrLKS1Pe4NGl8bdGxc6W3BuIff+I46c0natd9e4N0/W0+SXfmPdLEpfErr8RKG+Ya7Zd/XdCGzF5Gis4bzKJ1vd6eEdLMS6VX9zZf9aEjtb3mXz48vMcriXO4AHKHc7pIpirxubL04NZkCdfPmYu963prerkkXAA5xPByE7pgWKTxnY4dkna34PrYcw4kum64gjbMN9ov/7qgDX2/ID1dpGPkRK/3OWNdNvXPuNWrP4WECwDtwjldpGvKSu8lxbqmNxLDyAAKhJ4usjPbDb1mHR62eZVfp/icN2r3A4ACoaeL1uiZMCyJrvnrNsUCAG1CTxcAgBYh6QIA0CIkXQAAWiSTc7o7d+4s9DVYRb+GrshtV0Eb5hvtl39FbsNSKfgpLfR0AQBoEWYvA+huXE+OFqKnC6D77L/ZS7ZpJFxpqK79a9KpD4VF0gXQPY696SXHPV/Jpv4913n1H9ufTf3IPYaXAXSHtHq1cew+3Vsy7Iw69HQBFF8rE24nHBcdi6QLoLh2jWp/4ttp0qGN7Y0BHYOkC6CYdprk3k1czTU3pRDLK8van/zRETinC6B4do1OXIVV3d/ge/d5S1dOWOmuUdJ5v01YCfKMni6A4nHRia13vnTXj/23WcANhYLWx5ZCzxv5RtIFUCwRw7hW8l59/dJnvpo8kVbqq7zO/nSy+FBsJF0AxRGR0L57r//6ZhOv334vvBxjRxJv1yLpAiiG4wcii6y4uQVxKGYSP96XeRzoPCRdAMXw/NTUqgqaMJV4IlW153tTrAx5wexlAPn3xtB1PX69zEqydOX4Q8muLB0dkMbNlY48JY0dEz+c9V8beh8Wj/atk07/cvyKkXv0dAHk394/kxScUPdUjTzPmTV8e1APtpJogxJu0H5XLfGWv9rnv/13cb5+rX8BFBZJF0DhzVg09H7bHbXJMmzI+AOXe8tJFweXqa+r+vOZixuLE8VH0gWQbwlnAr8eMv/qpde85aEjwWXCtsXCTOauQtIFUHiL5gRvm74oeFscYb3gxRclqxvFQ9IFUBgD2/3XP3Jra+OoeGid//p3nm5tHOgcJF0A+XWsdqbSyaO8c6onjxpaF+cynw0PNXf4B7dGl6k+/pjR3ufRJ9UVOnawuQCQOyRdAPm1+z2+qwe2S8d2eO/jXCJ09deHrzt+ovZzX//wMpetiq67cvz+LdLb2wI
K7Z4SXREKgaQLoJB6RiTb/6SP1H7unZ+svvGnJtsfxUDSBVB4cXq7S1fXfnYuvPznvpHOcdFdIpOumX3QzJ6reh0xs5WtCA4AWuXezY2VX78pmzhQbJFJ1zn3L865c51z50qaLWlA0gOZRwYAEa5dG79sq3udjRyvke+BfGt0ePljkv7VOffLLIIBgEasTfkuil+4MV65tJ9WlPb3QOdqNOkulXSP3wYzW25mZTNL8zkcAJCaxREnxr5/v7fcust/+6anvGXQc3kr6mc1f/aS6NjQHcxFzRaoFDQ7SdJeSf/eObc/omy8SnMq7r9ZXpkV/7Z0tGG+/a79Im6heNal0it76/Yd7BYEDf9GPYkobHtQ3bEeCTh76Gey6O0nFft3sFQqqVwu+zZiIz3dhZJ2RSVcAOgUP7t9+LqFK8L3mRhyW0dJOu2j4dtXrgnfju7WSNJdpoChZQBoi1nhd3Ka5nPPiUcjbsF4OOIBBv1Hw7ff2sz/kuf0NbET8ihW0jWzUyR9XNLfZhsOADSgZ3JTu2U1k/ny65rcceSkVONA5+qJU8g597YkfioAIMSPtrQ7AnQ67kgFoNCmTmzv8S84u73HR2ch6QLIt9nhs2D3NXinqWofer80/3zpfdObr+OZDREFIuJHscQaXgaAPAu7zGfRnGTP211wjbT5meDjAtVIugDyb/ot0p7wWUz9W6QJ87z3+zdLU+qGna+6Qbrz4fiHnDNL2naH9NhtQ+te2etdGyzF7GHP+Iv4B0QhxL45RkOVcnOMXOPC/Pwrehv6tl/EjTIkr7db6X1u3CwtWx1evhF3f1NatmD4cUIFDC0Xvf2kYv8Oht0cg6TbhCL/sEj8whdB0dvQt/2OHYz1MPi4lwstmStdvUSaN1s6fFT6+W7pW+ulF1+OEV+chHtOX+ClQkVvP6nYv4NhSZfhZQDFMLK36V03rfWSbJDTxklnTZOuWFi7fttz0kWfb/KgXJvblUi6AIpjtoscZq5MqhrZI71bNwGqkZtmuLJ04blDvdqRF0jHTyQbVkbxkXQBFEuMxCsNJdxm705Vvd+JZ6VjO2LWRcLtalynC6B4ZkbfANlKwUnyhuXS4Se9XmvlNbDdW+9nxPkxE+7MH8YohCJjIlUTijwBQGISRxEUvQ1jtV9Ab7c+OV42T3rgluZjWbbamwldLXCIOWYvt+jtJxX7d5DZyykr8g+LxC98ERS9DWO3364xknunZpWVpL7HpUnja4uOnSu9NRA/honjpDefqF337Q3S9bf5JN2Z90gTl8auu+jtJxX7d5DZywC603mDWbSu19szQpp5qfTqXp99Yjp0pLbX/MuHh/d4JXEOFzU4pwug+KoSnytLD25NlnD9nLnYu663ppdLwkUdhpebUORhEYmhrSIoehs23X7HDkm7W3B97DkHEl03XPT2k4r9Oxg2vExPF0D3GDnR633OWJdN/TNu9epPkHBRbJzTBdB9pqz0XlKsa3ojMYyMmOjpAuhus93Qa9bhYZtX+XWKz3mjdj8gJnq6AFDRM2FYEl3z122KBYVETxcAgBYh6QIA0CIkXQAAWiSrc7p9kn6ZUd1+Jg8esyXacA1dS79fG7T8+7W4DWm/lPE7mLqit2Grv9+ZQRsyuTlGq5lZ2TnX5AO6Oh/fL9/4fvlX9O/I92sdhpcBAGgRki4AAC1SlKT7g3YHkDG+X77x/fKv6N+R79cihTinCwBAHhSlpwsAQMcj6QIA0CK5Trpm9kkz+xcze8nM/rzd8aTNzP7KzA6Y2T+1O5YsmNkMM3vSzF40sxfM7EvtjilNZjbazJ41s+cHv9/X2x1TFsxshJn9g5k93O5Y0mZmr5rZP5rZc2ZWjt4jX8xsgpn9jZn9s5n9wsz+sN0xpcnMPjjYdpXXETNb2daY8npO18xGSPr/JH1c0h5Jfy9pmXPuxbYGliIzmyvpLUn/wzl3drvjSZuZvUfSe5xzu8xsrKSdki4rShuad/X/Kc65t8xspKR
tkr7knHumzaGlysyulVSSNM45t7jd8aTJzF6VVHLOFfLGGGZ2p6SfOeduN7OTJI1xzvW3O64sDOaM1yVd4Jxr5c2bauS5p3u+pJeccy87596VtFHSp9ocU6qcc09JOtTuOLLinHvDObdr8P1RSb+QNK29UaXHed4a/Dhy8JXPv3IDmNl0SZdIur3dsaAxZjZe0lxJd0iSc+7doibcQR+T9K/tTLhSvpPuNEmvVX3eowL9h91tzOy9kj4saUd7I0nX4NDrc5IOSPqJc65Q30/SdyR9RdK/tTuQjDhJm81sp5ktb3cwKZsp6aCk9YOnB243s1PaHVSGlkq6p91B5DnpoiDM7FRJ90ta6Zw70u540uScO+GcO1fSdEnnm1lhThOY2WJJB5xzO9sdS4YudM6dJ2mhpP88eMqnKHoknSfpL51zH5b0tqTCzY2RpMGh80sl/bDd9TB9QgAAAXtJREFUseQ56b4uaUbV5+mD65Ajg+c675d0l3Pub9sdT1YGh+2elPTJdseSojmSLh0877lR0sVmVqhHvjvnXh9cHpD0gLzTWkWxR9KeqtGXv5GXhItooaRdzrn97Q4kz0n37yV9wMxmDv4Vs1TSpjbHhAYMTjS6Q9IvnHNr2x1P2sys18wmDL4/Wd6kv39ub1Tpcc5d75yb7px7r7zfvyecc59pc1ipMbNTBif4aXDY9ROSCnMlgXNun6TXzOyDg6s+JqkQkxh9LFMHDC1L2T3aL3POueNmdo2kxySNkPRXzrkX2hxWqszsHknzJE02sz2Svuacu6O9UaVqjqQrJf3j4HlPSVrtnPu7NsaUpvdIunNw1uTvSbrPOVe4y2oKbKqkBwYfQdcj6W7n3KPtDSl1X5R012DH5WVJV7c5ntQN/sH0cUn/qd2xSDm+ZAgAgLzJ8/AyAAC5QtIFAKBFSLoAALQISRcAgBYh6QIA0CIkXQAAWoSkCwBAi/z/WKZTYdgmYdwAAAAASUVORK5CYII=\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + } + ], + "source": [ + "eight_queens = NQueensCSP(8)\n", + "solution = min_conflicts(eight_queens)\n", + "plot_NQueens(solution)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The solution is a bit different this time. \n", + "Running the above cell several times should give you different valid solutions.\n", + "
    \n", + "In the `search.ipynb` notebook, we will see how NQueensProblem can be solved using **heuristic search methods** such as `uniform_cost_search` and `astar_search`." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Helper Functions\n", "\n", - "We will now implement a few helper functions that will help us visualize the Coloring Problem. We will make some modifications to the existing Classes and Functions for additional book keeping. To begin we modify the **assign** and **unassign** methods in the **CSP** to add a copy of the assignment to the **assignment_history**. We call this new class **InstruCSP**. This will allow us to see how the assignment evolves over time." + "We will now implement a few helper functions that will allow us to visualize the Coloring Problem; we'll also make a few modifications to the existing classes and functions for additional record keeping. To begin, we modify the **assign** and **unassign** methods in the **CSP** in order to add a copy of the assignment to the **assignment_history**. We name this new class **InstruCSP**; it will allow us to see how the assignment evolves over time. " ] }, { "cell_type": "code", - "execution_count": 5, - "metadata": { - "collapsed": true - }, + "execution_count": 15, + "metadata": {}, "outputs": [], "source": [ "import copy\n", @@ -256,15 +1231,13 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Next, we define **make_instru** which takes an instance of **CSP** and returns a **InstruCSP** instance. " + "Next, we define **make_instru** which takes an instance of **CSP** and returns an instance of **InstruCSP**." 
] }, { "cell_type": "code", - "execution_count": 6, - "metadata": { - "collapsed": true - }, + "execution_count": 16, + "metadata": {}, "outputs": [], "source": [ "def make_instru(csp):\n", @@ -275,15 +1248,13 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We will now use a graph defined as a dictonary for plotting purposes in our Graph Coloring Problem. The keys are the nodes and their corresponding values are the nodes they are connected to." + "We will now use a graph defined as a dictionary for plotting purposes in our Graph Coloring Problem. The keys are the nodes and their values are the nodes they are connected to." ] }, { "cell_type": "code", - "execution_count": 7, - "metadata": { - "collapsed": true - }, + "execution_count": 17, + "metadata": {}, "outputs": [], "source": [ "neighbors = {\n", @@ -315,29 +1286,434 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Now we are ready to create an InstruCSP instance for our problem. We are doing this for an instance of **MapColoringProblem** class which inherits from the **CSP** Class. This means that our **make_instru** function will work perfectly for it." + "Now we are ready to create an InstruCSP instance for our problem. We are doing this for an instance of the **MapColoringCSP** class, which inherits from the **CSP** class. This means that our **make_instru** function will work perfectly for it." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 18, + "metadata": {}, + "outputs": [], + "source": [ + "coloring_problem = MapColoringCSP('RGBY', neighbors)" + ] + }, + { + "cell_type": "code", + "execution_count": 19, + "metadata": {}, + "outputs": [], + "source": [ + "coloring_problem1 = make_instru(coloring_problem)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### CONSTRAINT PROPAGATION\n", + "Algorithms that solve CSPs have a choice between searching and doing _constraint propagation_, a specific type of inference.\n", + "The constraints can be used to reduce the number of legal values for a variable, which in turn can reduce the legal values for some other variable, and so on. \n", + "
    \n", + "Constraint propagation tries to enforce _local consistency_.\n", + "Consider each variable as a node in a graph and each binary constraint as an arc.\n", + "Enforcing local consistency causes inconsistent values to be eliminated throughout the graph, \n", + "much like the `GraphPlan` algorithm in planning, where mutex links are removed from a planning graph.\n", + "There are different types of local consistencies:\n", + "1. Node consistency\n", + "2. Arc consistency\n", + "3. Path consistency\n", + "4. K-consistency\n", + "5. Global constraints\n", + "\n", + "Refer to __Section 6.2__ of the book for details.\n", + "
    " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## AC-3\n", + "Before we dive into AC-3, we need to know what _arc-consistency_ is.\n", + "
    \n", + "A variable $X_i$ is __arc-consistent__ with respect to another variable $X_j$ if for every value in the current domain $D_i$ there is some value in the domain $D_j$ that satisfies the binary constraint on the arc $(X_i, X_j)$.\n", + "
    \n", + "A network is arc-consistent if every variable is arc-consistent with every other variable.\n", + "
    \n", + "\n", + "AC-3 is an algorithm that enforces arc consistency.\n", + "After applying AC-3, either every arc is arc-consistent, or some variable has an empty domain, indicating that the CSP cannot be solved.\n", + "Let's see how `AC3` is implemented in the module." + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def AC3(csp, queue=None, removals=None, arc_heuristic=dom_j_up):\n",
    +       "    """[Figure 6.3]"""\n",
    +       "    if queue is None:\n",
    +       "        queue = {(Xi, Xk) for Xi in csp.variables for Xk in csp.neighbors[Xi]}\n",
    +       "    csp.support_pruning()\n",
    +       "    queue = arc_heuristic(csp, queue)\n",
    +       "    while queue:\n",
    +       "        (Xi, Xj) = queue.pop()\n",
    +       "        if revise(csp, Xi, Xj, removals):\n",
    +       "            if not csp.curr_domains[Xi]:\n",
    +       "                return False\n",
    +       "            for Xk in csp.neighbors[Xi]:\n",
    +       "                if Xk != Xj:\n",
    +       "                    queue.add((Xk, Xi))\n",
    +       "    return True\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(AC3)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "`AC3` also employs a helper function `revise`." + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def revise(csp, Xi, Xj, removals):\n",
    +       "    """Return true if we remove a value."""\n",
    +       "    revised = False\n",
    +       "    for x in csp.curr_domains[Xi][:]:\n",
    +       "        # If Xi=x conflicts with Xj=y for every possible y, eliminate Xi=x\n",
    +       "        if all(not csp.constraints(Xi, x, Xj, y) for y in csp.curr_domains[Xj]):\n",
    +       "            csp.prune(Xi, x, removals)\n",
    +       "            revised = True\n",
    +       "    return revised\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(revise)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "`AC3` maintains a queue of arcs to consider which initially contains all the arcs in the CSP.\n", + "An arbitrary arc $(X_i, X_j)$ is popped from the queue and $X_i$ is made _arc-consistent_ with respect to $X_j$.\n", + "
    \n", + "If, in doing so, $D_i$ is left unchanged, the algorithm just moves on to the next arc, \n", + "but if the domain $D_i$ is revised, we add all the neighboring arcs $(X_k, X_i)$ to the queue.\n", + "
    \n", + "We repeat this process; if at any point the domain $D_i$ is reduced to nothing, we know the whole CSP has no consistent solution and `AC3` can immediately return failure.\n", + "
    \n", + "Otherwise, we keep removing values from the domains of variables until the queue is empty.\n", + "We finally get the arc-consistent CSP which is faster to search because the variables have smaller domains." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's see how `AC3` can be used.\n", + "
    \n", + "We'll first define the required variables." + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": {}, + "outputs": [], + "source": [ + "neighbors = parse_neighbors('A: B; B: ')\n", + "domains = {'A': [0, 1, 2, 3, 4], 'B': [0, 1, 2, 3, 4]}\n", + "constraints = lambda X, x, Y, y: x % 2 == 0 and (x + y) == 4 and y % 2 != 0\n", + "removals = []" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We'll now define a `CSP` object." + ] + }, + { + "cell_type": "code", + "execution_count": 23, + "metadata": {}, + "outputs": [], + "source": [ + "csp = CSP(variables=None, domains=domains, neighbors=neighbors, constraints=constraints)" + ] + }, + { + "cell_type": "code", + "execution_count": 24, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "False" + ] + }, + "execution_count": 24, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "AC3(csp, removals=removals)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This configuration is inconsistent." ] }, { "cell_type": "code", - "execution_count": 8, - "metadata": { - "collapsed": true - }, + "execution_count": 25, + "metadata": {}, "outputs": [], "source": [ - "coloring_problem = MapColoringCSP('RGBY', neighbors)" + "constraints = lambda X, x, Y, y: (x % 2) == 0 and (x + y) == 4\n", + "removals = []\n", + "csp = CSP(variables=None, domains=domains, neighbors=neighbors, constraints=constraints)" ] }, { "cell_type": "code", - "execution_count": 9, - "metadata": { - "collapsed": true - }, - "outputs": [], + "execution_count": 26, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "True" + ] + }, + "execution_count": 26, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ - "coloring_problem1 = make_instru(coloring_problem)" + "AC3(csp,removals=removals)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This configuration is consistent." 
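To make the pruning concrete, here is a minimal, standalone sketch of the AC-3 idea on plain Python dictionaries, run on the consistent configuration above. This is our own simplified helper (the names `ac3`, `domains`, `neighbors` and `constraint` are ours), not the module's `AC3`:

```python
# A minimal standalone sketch of AC-3 (our own helper, not the module's AC3).
# domains maps each variable to its candidate values, neighbors maps each
# variable to its adjacent variables, and constraint(X, x, Y, y) is True
# when X=x is compatible with Y=y.

def ac3(domains, neighbors, constraint):
    queue = {(xi, xj) for xi in domains for xj in neighbors[xi]}
    while queue:
        xi, xj = queue.pop()
        # Keep only values of xi that have some support in xj's domain.
        supported = [x for x in domains[xi]
                     if any(constraint(xi, x, xj, y) for y in domains[xj])]
        if len(supported) < len(domains[xi]):
            domains[xi] = supported
            if not supported:
                return False          # empty domain: the CSP is unsolvable
            queue |= {(xk, xi) for xk in neighbors[xi] if xk != xj}
    return True

# The consistent configuration from above: x must be even and x + y must be 4.
domains = {'A': [0, 1, 2, 3, 4], 'B': [0, 1, 2, 3, 4]}
neighbors = {'A': ['B'], 'B': ['A']}
constraint = lambda X, x, Y, y: x % 2 == 0 and x + y == 4
print(ac3(domains, neighbors, constraint))  # True
print(domains)                              # {'A': [0, 2, 4], 'B': [0, 2, 4]}
```

The odd values disappear from both domains because they can never participate in an even sum of 4, which is exactly the pruning the module's `AC3` performs through `revise`.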
] }, { @@ -346,15 +1722,13 @@ "source": [ "## BACKTRACKING SEARCH\n", "\n", - "For solving a CSP the main issue with Naive search algorithms is that they can continue expanding obviously wrong paths. In backtracking search, we check constraints as we go. Backtracking is just the above idea combined with the fact that we are dealing with one variable at a time. Backtracking Search is implemented in the repository as the function **backtracking_search**. This is the same as **Figure 6.5** in the book. The function takes as input a CSP and few other optional parameters which can be used to further speed it up. The function returns the correct assignment if it satisfies the goal. We will discuss these later. Let us solve our **coloring_problem1** with **backtracking_search**." + "The main issue with using naive search algorithms to solve a CSP is that they can continue to expand obviously wrong paths. In **backtracking search**, by contrast, we check the constraints as we go and deal with only one variable at a time. Backtracking search is implemented in the repository as the function **backtracking_search**. This is the same as **Figure 6.5** in the book. The function takes as input a CSP and a few other optional parameters which can be used to speed it up further; we will discuss these later. The function returns the correct assignment if it satisfies the goal. For now, let us solve our **coloring_problem1** with **backtracking_search**." 
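For intuition, the core of this idea can be sketched as a bare-bones recursive function. The sketch below is our own simplification (static variable order, no ordering heuristics, no inference), not the module's `backtracking_search`, and the tiny map-coloring instance is self-contained:

```python
def backtrack(assignment, variables, domains, neighbors, constraint):
    """Assign one variable at a time, checking constraints as we go."""
    if len(assignment) == len(variables):
        return assignment
    var = next(v for v in variables if v not in assignment)  # static order
    for value in domains[var]:
        # Only try values consistent with the neighbors assigned so far.
        if all(constraint(var, value, n, assignment[n])
               for n in neighbors[var] if n in assignment):
            assignment[var] = value
            result = backtrack(assignment, variables, domains, neighbors, constraint)
            if result is not None:
                return result
            del assignment[var]  # undo and try the next value
    return None

# Tiny map-coloring instance: adjacent regions must get different colors.
neighbors = {'WA': ['NT', 'SA'], 'NT': ['WA', 'SA'], 'SA': ['WA', 'NT']}
domains = {v: ['R', 'G', 'B'] for v in neighbors}
different = lambda X, x, Y, y: x != y
solution = backtrack({}, list(neighbors), domains, neighbors, different)
print(solution)  # {'WA': 'R', 'NT': 'G', 'SA': 'B'}
```

Checking constraints before recursing is what lets backtracking abandon an obviously wrong path immediately instead of expanding it further.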
] }, { "cell_type": "code", - "execution_count": 10, - "metadata": { - "collapsed": true - }, + "execution_count": 27, + "metadata": {}, "outputs": [], "source": [ "result = backtracking_search(coloring_problem1)" @@ -362,7 +1736,7 @@ }, { "cell_type": "code", - "execution_count": 11, + "execution_count": 28, "metadata": {}, "outputs": [ { @@ -391,7 +1765,7 @@ " 20: 'B'}" ] }, - "execution_count": 11, + "execution_count": 28, "metadata": {}, "output_type": "execute_result" } @@ -409,7 +1783,7 @@ }, { "cell_type": "code", - "execution_count": 12, + "execution_count": 29, "metadata": {}, "outputs": [ { @@ -418,7 +1792,7 @@ "21" ] }, - "execution_count": 12, + "execution_count": 29, "metadata": {}, "output_type": "execute_result" } @@ -431,12 +1805,12 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Now let us check the total number of assignments and unassignments which is the length ofour assignment history." + "Now, let us check the total number of assignments and unassignments, which would be the length of our assignment history. We can see it by using the command below. " ] }, { "cell_type": "code", - "execution_count": 13, + "execution_count": 30, "metadata": {}, "outputs": [ { @@ -445,7 +1819,7 @@ "21" ] }, - "execution_count": 13, + "execution_count": 30, "metadata": {}, "output_type": "execute_result" } @@ -458,34 +1832,357 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Now let us explore the optional keyword arguments that the **backtracking_search** function takes. These optional arguments help speed up the assignment further. Along with these, we will also point out to methods in the CSP class that help make this work. \n", + "Now let us explore the optional keyword arguments that the **backtracking_search** function takes. These optional arguments help speed up the assignment further. Along with these, we will also point out the methods in the CSP class that help to make this work. 
 \n", "\n", - "The first of these is **select_unassigned_variable**. It takes in a function that helps in deciding the order in which variables will be selected for assignment. We use a heuristic called Most Restricted Variable which is implemented by the function **mrv**. The idea behind **mrv** is to choose the variable with the fewest legal values left in its domain. The intuition behind selecting the **mrv** or the most constrained variable is that it allows us to encounter failure quickly before going too deep into a tree if we have selected a wrong step before. The **mrv** implementation makes use of another function **num_legal_values** to sort out the variables by a number of legal values left in its domain. This function, in turn, calls the **nconflicts** method of the **CSP** to return such values.\n" + "The first one is **select_unassigned_variable**. It takes, as a parameter, a function that decides the order in which the variables will be selected for assignment. We use a heuristic called Minimum Remaining Values, which is implemented by the function **mrv**. The idea behind **mrv** is to choose the variable with the fewest legal values left in its domain. The intuition behind selecting the **mrv** or the most constrained variable is that it allows us to encounter failure quickly before going too deep into a tree if we have selected a wrong step before. The **mrv** implementation makes use of another function **num_legal_values** to sort out the variables by the number of legal values left in its domain. This function, in turn, calls the **nconflicts** method of the **CSP** to return such values." ] }, { "cell_type": "code", - "execution_count": null, + "execution_count": 31, "metadata": {}, - "outputs": [], + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def mrv(assignment, csp):\n",
    +       "    """Minimum-remaining-values heuristic."""\n",
    +       "    return argmin_random_tie(\n",
    +       "        [v for v in csp.variables if v not in assignment],\n",
    +       "        key=lambda var: num_legal_values(csp, var, assignment))\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ "psource(mrv)" ] }, { "cell_type": "code", - "execution_count": null, + "execution_count": 32, "metadata": {}, - "outputs": [], + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def num_legal_values(csp, var, assignment):\n",
    +       "    if csp.curr_domains:\n",
    +       "        return len(csp.curr_domains[var])\n",
    +       "    else:\n",
    +       "        return count(csp.nconflicts(var, val, assignment) == 0\n",
    +       "                     for val in csp.domains[var])\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ "psource(num_legal_values)" ] }, { "cell_type": "code", - "execution_count": null, + "execution_count": 33, "metadata": {}, - "outputs": [], + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
        def nconflicts(self, var, val, assignment):\n",
    +       "        """Return the number of conflicts var=val has with other variables."""\n",
    +       "\n",
    +       "        # Subclasses may implement this more efficiently\n",
    +       "        def conflict(var2):\n",
    +       "            return (var2 in assignment and\n",
    +       "                    not self.constraints(var, val, var2, assignment[var2]))\n",
    +       "\n",
    +       "        return count(conflict(v) for v in self.neighbors[var])\n",
    +       "
    \n", + "\n", + "\n" ], + "text/plain": [ + "" ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ "psource(CSP.nconflicts)" ] @@ -494,14 +2191,119 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Another ordering related parameter **order_domain_values** governs the value ordering. Here we select the Least Constraining Value which is implemented by the function **lcv**. The idea is to select the value which rules out the fewest values in the remaining variables. The intuition behind selecting the **lcv** is that it leaves a lot of freedom to assign values later. The idea behind selecting the mrc and lcv makes sense because we need to do all variables but for values, we might better try the ones that are likely. So for vars, we face the hard ones first.\n" + "Another ordering related parameter, **order_domain_values**, governs the value ordering. Here we select the Least Constraining Value, which is implemented by the function **lcv**. The idea is to select the value which rules out the fewest values in the remaining variables. The intuition behind selecting the **lcv** is that it allows a lot of freedom to assign values later. Choosing **mrv** for variables and **lcv** for values makes sense: every variable must eventually be assigned, so we face the hard ones first, whereas for values we only need one that works, so we try the most promising ones first." ] }, { "cell_type": "code", - "execution_count": null, + "execution_count": 34, "metadata": {}, - "outputs": [], + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def lcv(var, assignment, csp):\n",
    +       "    """Least-constraining-values heuristic."""\n",
    +       "    return sorted(csp.choices(var),\n",
    +       "                  key=lambda val: csp.nconflicts(var, val, assignment))\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ "psource(lcv)" ] @@ -510,88 +2312,86 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Finally, the third parameter **inference** can make use of one of the two techniques called Arc Consistency or Forward Checking. The details of these methods can be found in the **Section 6.3.2** of the book. In short the idea of inference is to detect the possible failure before it occurs and to look ahead to not make mistakes. **mac** and **forward_checking** implement these two techniques. The **CSP** methods **support_pruning**, **suppose**, **prune**, **choices**, **infer_assignment** and **restore** help in using these techniques. You can know more about these by looking up the source code." + "Finally, the third parameter **inference** can make use of one of the two techniques called Arc Consistency or Forward Checking. The details of these methods can be found in the **Section 6.3.2** of the book. In short the idea of inference is to detect the possible failure before it occurs and to look ahead to not make mistakes. **mac** and **forward_checking** implement these two techniques. The **CSP** methods **support_pruning**, **suppose**, **prune**, **choices**, **infer_assignment** and **restore** help in using these techniques. You can find out more about these by looking up the source code." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Now let us compare the performance with these parameters enabled vs the default parameters. We will use the Graph Coloring problem instance usa for comparison. We will call the instances **solve_simple** and **solve_parameters** and solve them using backtracking and compare the number of assignments." + "Now let us compare the performance with these parameters enabled vs the default parameters. We will use the Graph Coloring problem instance 'usa' for comparison. 
We will call the instances **solve_simple** and **solve_parameters** and solve them using backtracking and compare the number of assignments." ] }, { "cell_type": "code", - "execution_count": 14, - "metadata": { - "collapsed": true - }, + "execution_count": 35, + "metadata": {}, "outputs": [], "source": [ - "solve_simple = copy.deepcopy(usa)\n", - "solve_parameters = copy.deepcopy(usa)" + "solve_simple = copy.deepcopy(usa_csp)\n", + "solve_parameters = copy.deepcopy(usa_csp)" ] }, { "cell_type": "code", - "execution_count": 16, + "execution_count": 36, "metadata": {}, "outputs": [ { "data": { "text/plain": [ - "{'AL': 'B',\n", - " 'AR': 'B',\n", - " 'AZ': 'R',\n", - " 'CA': 'Y',\n", - " 'CO': 'R',\n", - " 'CT': 'R',\n", - " 'DC': 'B',\n", - " 'DE': 'B',\n", - " 'FL': 'G',\n", - " 'GA': 'R',\n", - " 'IA': 'B',\n", - " 'ID': 'R',\n", - " 'IL': 'G',\n", - " 'IN': 'R',\n", - " 'KA': 'B',\n", - " 'KY': 'B',\n", - " 'LA': 'G',\n", - " 'MA': 'G',\n", - " 'MD': 'G',\n", - " 'ME': 'R',\n", - " 'MI': 'B',\n", + "{'SD': 'R',\n", " 'MN': 'G',\n", - " 'MO': 'R',\n", - " 'MS': 'R',\n", - " 'MT': 'G',\n", - " 'NC': 'B',\n", " 'ND': 'B',\n", + " 'MT': 'G',\n", + " 'IA': 'B',\n", + " 'WI': 'R',\n", " 'NE': 'G',\n", - " 'NH': 'B',\n", - " 'NJ': 'G',\n", - " 'NM': 'B',\n", + " 'MO': 'R',\n", + " 'IL': 'G',\n", + " 'WY': 'B',\n", + " 'ID': 'R',\n", + " 'KA': 'B',\n", + " 'UT': 'G',\n", " 'NV': 'B',\n", - " 'NY': 'B',\n", - " 'OH': 'G',\n", " 'OK': 'G',\n", + " 'CO': 'R',\n", " 'OR': 'G',\n", - " 'PA': 'R',\n", - " 'RI': 'B',\n", - " 'SC': 'G',\n", - " 'SD': 'R',\n", + " 'KY': 'B',\n", + " 'AZ': 'R',\n", + " 'CA': 'Y',\n", + " 'IN': 'R',\n", + " 'OH': 'G',\n", + " 'WA': 'B',\n", + " 'MI': 'B',\n", + " 'AR': 'B',\n", + " 'NM': 'B',\n", " 'TN': 'G',\n", " 'TX': 'R',\n", - " 'UT': 'G',\n", + " 'MS': 'R',\n", + " 'AL': 'B',\n", " 'VA': 'R',\n", - " 'VT': 'R',\n", - " 'WA': 'B',\n", - " 'WI': 'R',\n", " 'WV': 'Y',\n", - " 'WY': 'B'}" + " 'PA': 'R',\n", + " 'LA': 'G',\n", + " 'GA': 'R',\n", 
+ " 'MD': 'G',\n", + " 'NC': 'B',\n", + " 'DC': 'B',\n", + " 'DE': 'B',\n", + " 'SC': 'G',\n", + " 'FL': 'G',\n", + " 'NJ': 'G',\n", + " 'NY': 'B',\n", + " 'MA': 'R',\n", + " 'CT': 'G',\n", + " 'RI': 'B',\n", + " 'VT': 'G',\n", + " 'NH': 'B',\n", + " 'ME': 'R'}" ] }, - "execution_count": 16, + "execution_count": 36, "metadata": {}, "output_type": "execute_result" } @@ -603,16 +2403,16 @@ }, { "cell_type": "code", - "execution_count": 17, + "execution_count": 37, "metadata": {}, "outputs": [ { "data": { "text/plain": [ - "460302" + "49" ] }, - "execution_count": 17, + "execution_count": 37, "metadata": {}, "output_type": "execute_result" } @@ -623,7 +2423,7 @@ }, { "cell_type": "code", - "execution_count": 18, + "execution_count": 38, "metadata": {}, "outputs": [ { @@ -632,7 +2432,7 @@ "49" ] }, - "execution_count": 18, + "execution_count": 38, "metadata": {}, "output_type": "execute_result" } @@ -647,24 +2447,142 @@ "source": [ "## TREE CSP SOLVER\n", "\n", - "The `tree_csp_solver` function (**Figure 6.11** in the book) can be used to solve problems whose constraint graph is a tree. Given a CSP, with `neighbors` forming a tree, it returns an assignement that satisfies the given constraints. The algorithm works as follows:\n", + "The `tree_csp_solver` function (**Figure 6.11** in the book) can be used to solve problems whose constraint graph is a tree. Given a CSP, with `neighbors` forming a tree, it returns an assignment that satisfies the given constraints. The algorithm works as follows:\n", "\n", - "First it finds the *topological sort* of the tree. This is an ordering of the tree where each variable/node comes after its parent in the tree. The function that accomplishes this is `topological_sort`, which builds the topological sort using the recursive function `build_topological`. That function is an augmented DFS, where each newly visited node of the tree is pushed on a stack. 
The stack in the end holds the variables topologically sorted.\n", + "First it finds the *topological sort* of the tree. This is an ordering of the tree where each variable/node comes after its parent in the tree. The function that accomplishes this is `topological_sort`; it builds the topological sort using the recursive function `build_topological`. That function is an augmented DFS (Depth First Search), where each newly visited node of the tree is pushed on a stack. The stack in the end holds the variables topologically sorted.\n", "\n", - "Then the algorithm makes arcs between each parent and child consistent. *Arc-consistency* between two variables, *a* and *b*, occurs when for every possible value of *a* there is an assignment in *b* that satisfies the problem's constraints. If such an assignment cannot be found, then the problematic value is removed from *a*'s possible values. This is done with the use of the function `make_arc_consistent` which takes as arguments a variable `Xj` and its parent, and makes the arc between them consistent by removing any values from the parent which do not allow for a consistent assignment in `Xj`.\n", + "Then the algorithm makes arcs between each parent and child consistent. *Arc-consistency* between two variables, *a* and *b*, occurs when for every possible value of *a* there is an assignment in *b* that satisfies the problem's constraints. If such an assignment cannot be found, the problematic value is removed from *a*'s possible values. This is done with the use of the function `make_arc_consistent`, which takes as arguments a variable `Xj` and its parent, and makes the arc between them consistent by removing any values from the parent which do not allow for a consistent assignment in `Xj`.\n", "\n", "If an arc cannot be made consistent, the solver fails. 
If every arc is made consistent, we move to assigning values.\n", "\n", - "First we assign a random value to the root from its domain and then we start assigning values to the rest of the variables. Since the graph is now arc-consistent, we can simply move from variable to variable picking any remaining consistent values. At the end we are left with a valid assignment. If at any point though we find a variable where no consistent value is left in its domain, the solver fails.\n", + "First we assign a random value to the root from its domain and then we assign values to the rest of the variables. Since the graph is now arc-consistent, we can simply move from variable to variable picking any remaining consistent values. At the end we are left with a valid assignment. If at any point though we find a variable where no consistent value is left in its domain, the solver fails.\n", "\n", - "The implementation of the algorithm:" + "Run the cell below to see the implementation of the algorithm:" ] }, { "cell_type": "code", - "execution_count": null, + "execution_count": 39, "metadata": {}, - "outputs": [], + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def tree_csp_solver(csp):\n",
    +       "    """[Figure 6.11]"""\n",
    +       "    assignment = {}\n",
    +       "    root = csp.variables[0]\n",
    +       "    X, parent = topological_sort(csp, root)\n",
    +       "\n",
    +       "    csp.support_pruning()\n",
    +       "    for Xj in reversed(X[1:]):\n",
    +       "        if not make_arc_consistent(parent[Xj], Xj, csp):\n",
    +       "            return None\n",
    +       "\n",
    +       "    assignment[root] = csp.curr_domains[root][0]\n",
    +       "    for Xi in X[1:]:\n",
    +       "        assignment[Xi] = assign_value(parent[Xi], Xi, csp, assignment)\n",
    +       "        if not assignment[Xi]:\n",
    +       "            return None\n",
    +       "    return assignment\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ "psource(tree_csp_solver)" ] @@ -673,19 +2591,17 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We will now use the above function to solve a problem. More specifically, we will solve the problem of coloring the map of Australia. At our disposal we have two colors: Red and Blue. As a reminder, this is the graph of Australia:\n", + "We will now use the above function to solve a problem. More specifically, we will solve the problem of coloring Australia's map. We have two colors at our disposal: Red and Blue. As a reminder, this is the graph of Australia:\n", "\n", "`\"SA: WA NT Q NSW V; NT: WA Q; NSW: Q V; T: \"`\n", "\n", - "Unfortunately as you can see the above is not a tree. If, though, we remove `SA`, which has arcs to `WA`, `NT`, `Q`, `NSW` and `V`, we are left with a tree (we also remove `T`, since it has no in-or-out arcs). We can now solve this using our algorithm. Let's define the map coloring problem at hand:" + "Unfortunately, as you can see, the above is not a tree. However, if we remove `SA`, which has arcs to `WA`, `NT`, `Q`, `NSW` and `V`, we are left with a tree (we also remove `T`, since it has no in-or-out arcs). We can now solve this using our algorithm. Let's define the map coloring problem at hand:" ] }, { "cell_type": "code", - "execution_count": 19, - "metadata": { - "collapsed": true - }, + "execution_count": 40, + "metadata": {}, "outputs": [], "source": [ "australia_small = MapColoringCSP(list('RB'),\n", @@ -696,19 +2612,19 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We will input `australia_small` to the `tree_csp_solver` and we will print the given assignment." + "We will input `australia_small` to the `tree_csp_solver` and print the given assignment." 
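As an aside, the topological-sort step that `tree_csp_solver` relies on is easy to sketch on its own. The helper below is our own illustrative version (iterative, unlike the module's recursive `build_topological`), run on the reduced Australia tree:

```python
def topological_sort_tree(root, neighbors):
    """Return (order, parent): the tree's nodes ordered so that each
    node appears after its parent, plus the parent of each node."""
    order, parent = [], {root: None}
    stack = [root]
    while stack:
        node = stack.pop()
        order.append(node)           # record the node on first visit
        for child in neighbors[node]:
            if child not in parent:  # not yet visited
                parent[child] = node
                stack.append(child)
    return order, parent

# The reduced Australia graph from above, with SA and T removed (now a tree).
tree = {'WA': ['NT'], 'NT': ['WA', 'Q'], 'Q': ['NT', 'NSW'],
        'NSW': ['Q', 'V'], 'V': ['NSW']}
order, parent = topological_sort_tree('NT', tree)
print(order)  # ['NT', 'Q', 'NSW', 'V', 'WA']
```

Because every node appears after its parent, sweeping the list backwards makes each parent-child arc consistent before the parent itself is processed, which is exactly the order `tree_csp_solver` needs.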
] }, { "cell_type": "code", - "execution_count": 20, + "execution_count": 41, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ - "{'Q': 'R', 'NT': 'B', 'NSW': 'B', 'WA': 'R', 'V': 'R'}\n" + "{'NT': 'R', 'Q': 'B', 'NSW': 'R', 'V': 'B', 'WA': 'B'}\n" ] } ], @@ -730,15 +2646,13 @@ "source": [ "## GRAPH COLORING VISUALIZATION\n", "\n", - "Next, we define some functions to create the visualisation from the assignment_history of **coloring_problem1**. The reader need not concern himself with the code that immediately follows as it is the usage of Matplotib with IPython Widgets. If you are interested in reading more about these visit [ipywidgets.readthedocs.io](http://ipywidgets.readthedocs.io). We will be using the **networkx** library to generate graphs. These graphs can be treated as the graph that needs to be colored or as a constraint graph for this problem. If interested you can read a dead simple tutorial [here](https://www.udacity.com/wiki/creating-network-graphs-with-python). We start by importing the necessary libraries and initializing matplotlib inline.\n" + "Next, we define some functions to create the visualisation from the assignment_history of **coloring_problem1**. The readers need not concern themselves with the code that immediately follows as it is the usage of Matplotib with IPython Widgets. If you are interested in reading more about these, visit [ipywidgets.readthedocs.io](http://ipywidgets.readthedocs.io). We will be using the **networkx** library to generate graphs. These graphs can be treated as graphs that need to be colored or as constraint graphs for this problem. If interested you can check out a fairly simple tutorial [here](https://www.udacity.com/wiki/creating-network-graphs-with-python). 
We start by importing the necessary libraries and initializing matplotlib inline.\n" ] }, { "cell_type": "code", - "execution_count": 21, - "metadata": { - "collapsed": true - }, + "execution_count": 42, + "metadata": {}, "outputs": [], "source": [ "%matplotlib inline\n", @@ -752,23 +2666,21 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The ipython widgets we will be using require the plots in the form of a step function such that there is a graph corresponding to each value. We define the **make_update_step_function** which return such a function. It takes in as inputs the neighbors/graph along with an instance of the **InstruCSP**. This will be more clear with the example below. If this sounds confusing do not worry this is not the part of the core material and our only goal is to help you visualize how the process works." + "The ipython widgets we will be using require the plots in the form of a step function such that there is a graph corresponding to each value. We define the **make_update_step_function** which returns such a function. It takes in as inputs the neighbors/graph along with an instance of the **InstruCSP**. The example below will elaborate it further. If this sounds confusing, don't worry. This is not part of the core material and our only goal is to help you visualize how the process works." 
] }, { "cell_type": "code", - "execution_count": 22, - "metadata": { - "collapsed": true - }, + "execution_count": 43, + "metadata": {}, "outputs": [], "source": [ "def make_update_step_function(graph, instru_csp):\n", " \n", + " #define a function to draw the graphs\n", " def draw_graph(graph):\n", - " # create networkx graph\n", + " \n", " G=nx.Graph(graph)\n", - " # draw graph\n", " pos = nx.spring_layout(G,k=0.15)\n", " return (G, pos)\n", " \n", @@ -787,11 +2699,11 @@ " nx.draw(G, pos, node_color=colors, node_size=500)\n", "\n", " labels = {label:label for label in G.node}\n", - " # Labels shifted by offset so as to not overlap nodes.\n", + " # Labels shifted by offset so that nodes don't overlap\n", " label_pos = {key:[value[0], value[1]+0.03] for key, value in pos.items()}\n", " nx.draw_networkx_labels(G, label_pos, labels, font_size=20)\n", "\n", - " # show graph\n", + " # display the graph\n", " plt.show()\n", "\n", " return update_step # <-- this is a function\n", @@ -815,15 +2727,13 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Finally let us plot our problem. We first use the function above to obtain a step function." + "Finally let us plot our problem. We first use the function below to obtain a step function." ] }, { "cell_type": "code", - "execution_count": 23, - "metadata": { - "collapsed": true - }, + "execution_count": 44, + "metadata": {}, "outputs": [], "source": [ "step_func = make_update_step_function(neighbors, coloring_problem1)" @@ -833,15 +2743,13 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Next we set the canvas size." + "Next, we set the canvas size." ] }, { "cell_type": "code", - "execution_count": 24, - "metadata": { - "collapsed": true - }, + "execution_count": 45, + "metadata": {}, "outputs": [], "source": [ "matplotlib.rcParams['figure.figsize'] = (18.0, 18.0)" @@ -851,36 +2759,38 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Finally our plot using ipywidget slider and matplotib. 
You can move the slider to experiment and see the coloring change. It is also possible to move the slider using arrow keys or to jump to the value by directly editing the number with a double click. The **Visualize Button** will automatically animate the slider for you. The **Extra Delay Box** allows you to set time delay in seconds upto one second for each time step." + "Finally, our plot using the ipywidget slider and matplotlib. You can move the slider to experiment and see the colors change. It is also possible to move the slider using arrow keys or to jump to the value by directly editing the number with a double click. The **Visualize Button** will automatically animate the slider for you. The **Extra Delay Box** allows you to set a time delay in seconds (up to one second) for each time step." ] }, { "cell_type": "code", - "execution_count": 25, + "execution_count": 46, "metadata": {}, "outputs": [ { "data": { "image/png": "
AwMMCVK1dw4sSJX/7eCGmMGIbBsWPH\n0Lt3b9y6dQs3btzAnj170KZNG0RGRqJPnz7IyspCamoqVqxYgWbNmrEdmTRSenp62LFjB4yNjVFY\nWMh2HNIE3bhxA7q6urCzs8PevXshLi7OdiTyC5SVlaGuro6IiAi2o5BGYvXq1TAyMoKenh7bUQgh\nTQAVOInA4HA4uH79epMtcIqJieHo0aMIDg6uU4N3cXFxhISEQFxcHBYWFqisrATw77AmZWVlaGho\n4MGDBz+8h6SkJERFv345kJCQqJlO/bN7ENKUfHkzv2vXLgQHB+PEiRNQVlZGTk4OjIyMsGrVKgQE\nBCA0NJT6z5EGMXv2bEyYMAHm5uY1/w4Q0hBCQ0NhYmICf39/ODg4sB2H1JOdnR0OHDjAdgzSCCQn\nJ+P48ePU8oAQ0mCowEkERnZ2Nlq2bIkOHTqwHYU17dq1w7Fjx2BlZYWnT5/W+joJCQmEhoaisrIS\n06dP59pO3qVLF5SXl/9Snurq6ppiq7q6+i/dg5DG5PHjx5gyZQomT56MBQsW4NatWxgyZAiKi4vh\n6uoKfX19jBo1CmlpaTA0NGQ7Lmlitm3bBmlpaSxatAjUYp3wG8Mw2LRpE1xcXBAbG4uxY8eyHYnw\ngImJCR49eoT09HS2oxAhVlVVBRsbG+zYsQNt27ZlOw4hpImgAicRGE21/+Z/6evrY+nSpZg8eTLX\nlvOfkZSURHh4OD59+oSZM2eiuroaACAiIvLTLepfvH37FuvXr8e6deuwYMEC9O7dGxcuXMC0adNg\nbGz8S98PIY3Bhw8fsHz5cmhpaUFNTQ05OTmYMWMGREREcOzYMaioqODly5dIT0/HkiVLqM8UYYWY\nmBhCQkIQHx8Pb29vtuOQRqyiogKzZ89GZGQkkpKS0K9fP7YjER4RFxfHvHnzaBUnqZf9+/ejVatW\nmDFjBttRCCFNCE1RJwJjxowZGDJkCObNm8d2FNYxDIPx48ejW7du8PDwqNO1ZWVlMDExgaKiIgID\nAyEmJobJkycjLCzsu1PUv8jOzoaKikrNfxcREYGzszO2bNlCBRvSJFVVVcHPzw8bNmzAuHHjsHHj\nRigqKgIA0tPT4eDggA8fPsDLy4v6SxGB8ejRI+jq6uLIkSMYMWIE23FII1NYWIiJEyeibdu2OHz4\nMGRkZNiORHjs+fPnUFNTQ15eHg2LInWWn5+P/v374/r16+jVqxfbcQghTQit4CQCg1Zw/n8iIiI4\ndOgQoqKiEB4eXqdrpaSkEBUVhfz8fMybNw8cDqfWKzh79+4NhmFQVVWFvLw8uLu7w8/PD0OGDMG7\nd+9+5VshRCgxDIOzZ89CXV0dJ06cQExMDPz9/aGoqIj379/D0dERw4cPx5QpU5CcnEzFTSJQunXr\nhtDQUEyfPh25ublsxyGNSG5uLgYNGgQdHR2Eh4dTcbOR6tChAwwMDHD06FG2oxAh5OjoiIULF1Jx\nkxDS4KjASQRCfn4+iouL6R/C/9GmTRuEhYXBzs6uzgN+pKWlER0djX/++Qd2dnbo0aNHna4XExND\np06dsHjxYvj6+iIpKalO090JEWb37t3DqFGj4OzsjJ07d+LixYvo168fOBwOgoKCoKKigrKyMmRm\nZsLOzg5iYmJsRybkK0OHDsXmzZthbGyMoqIituOQRuDq1asYMmQIXFxcsGPHjm8OJySNx5dhQ7TZ\nj9TFmTNncO/ePaxYsYLtKISQJoj+MiECISEhAfr6+hAREWE7ikDR1NSEm5sbzM3NUVZWVqdrZWRk\ncObMGWRkZCAyMhIAfmmyrpGREQAgLi6uztcSIkwKCgowb948jBw5EuPHj8e9e/cwduxYiIiIICUl\nBXp6evDx8UF0dDR8fX0hLy/PdmRCfmju3LkwMjKChYUF1/A5Qurq0KFDMDc3x5EjR6iVUBMxfPhw\nlJaW4saNG2xHIUKipK
QE9vb28Pb2RvPmzdmOQwhpgqjASQRCfHw89PX12Y4hkOzs7KCiooJFixbV\n+VpZWVmcO3euZovir2wzf/78OYB/m84T0hiVlZVh8+bNUFVVRZs2bZCTk4OFCxdCQkIChYWFsLW1\nxdixYzFv3jwkJiZCU1OT7ciE1NquXbtq+ikTUlccDgerV6+Gm5sbrl69Sj1dmxBRUVHY2trSsCFS\naxs2bICuri69ThBCWEMFTiIQqP/m94mIiMDPzw/x8fEIDg6u8/UtW7bEzp07Afw7FOVbW43u3LlT\nM3X9fxUXF2Px4sUAgLFjx9b52YQIMg6HgyNHjqBXr164e/cubt26hR07dqB169aorq6Gj48PVFRU\n0KxZM2RnZ8Pa2pq2ZBKhIy4ujmPHjiEmJgZ+fn5sxyFCpKysDNOmTcPly5eRlJTENYSQNA2zZ89G\ndHQ03r59y3YUIuDS09MRGBiIPXv2sB2FENKE0RR1wrr3799DSUkJ7969o0ndP5Ceng5DQ0NcuXIF\nqqqqPz0/MjKyZmt6QUEBYmJiICoqir59+0JDQwPy8vLYtWsXAMDU1BTXr1+Hrq4uOnXqBGlpaTx7\n9gznzp3D+/fvoauri5iYGLRo0YKv3yMhDSU+Ph5OTk4QFRXFnj17uIYEJSYmwt7eHrKysti3bx/U\n1dVZTEoIbzx48AD6+voIDQ2FgYEB23GIgHv9+jXGjx+PLl26IDAwkLabNmGzZs2Cqqoqli1bxnYU\nIqA4HA709fUxa9Ys2NjYsB2HENKEUYGTsO7s2bPYvXs3Ll26xHYUgRcUFITt27fj9u3bPy02rl+/\nHm5ubt893rlzZzx58gTAvw3B//77b9y6dQuvXr1CaWkp2rRpA3V1dUyePBnW1ta0RZ00Cg8fPsTy\n5ctx+/ZtbNu2DVOmTKlZlVlQUABXV1fExsZi586dsLCwoL7ApFG5dOkSpk+fjsTERHTr1o3tOERA\n3b9/H+PGjcPMmTOxfv16eh1s4pKSkmBpaYnc3FzaxUC+yc/PD0FBQUhISKCfEUIIq6jASVi3YsUK\nSEpK/rAYR/6/OXPmoKysDEePHq3Tm47y8nK0adMG2dnZGDNmDKZOnYrVq1fzMSkhdZecnIwLFy7g\n6tWrePToEaqrq9GmTRsMHjwYQ4YMgbGxMaSkpOp836KiImzatAmHDh2Cs7MzHB0da+5TWVmJ/fv3\nY/PmzbC2tsbq1ashKyvL62+NEIHg7e2N/fv348aNG2jZsiXbcYiAuXjxIqZPn47du3djxowZbMch\nAoBhGGhoaGDbtm0YNWoU23GIgHn9+jVUVVURGxtLO14IIayjAidh3R9//IF169ZRQ+paKisrw6BB\ng2BnZwdbW9s6XduvXz/4+/tDSUkJBgYGsLa2houLC5+SElJ7J06cwMqVK5Gfn4/Pnz+jsrKS67iI\niAhatGgBhmEwd+5cuLm51ao4U1lZCR8fH2zcuBETJkzAhg0boKCgUHP8ypUrcHBwQPv27bF37170\n7t2b598bIYJm4cKFePLkCU6dOgUxMTG24xAB4efnh7Vr1+L48eMYMmQI23GIAPHz88PZs2drWh8R\n8sWMGTOgqKiIHTt2sB2FEEKowEnYVV5eDnl5eRQUFFB/xzrIzc2Fnp4ezp8/j4EDB9b6utmzZ0NX\nVxfz58/HixcvMHToUCxYsABLlizhY1pCvq+wsBAzZ85EXFwcSktLa3VN8+bN0aJFCxw7dgzDhw//\n5jkMwyA6OhrLli1Dly5dsHv3bq7etfn5+Vi6dCmSkpLg7u4OU1NT2oZJmozKykoYGRmhf//+Nb2Y\nSdNVXV2N5cuXIzo6GqdPn4aysjLbkYiAKS4uRufOnZGWlgYlJSW24xABcenSJcyZMwf379+HjIwM\n23EIIYSmqBN2JScnQ0VFhYqbddSzZ094e3vD3NwcRUVFtb5OQ0MDqampAID27dvj8uXL
8PLywr59\n+/gVlZDvevHiBTQ0NBAbG1vr4ibw7wcjb9++hbGxMQ4fPvzV8dTUVAwfPhwrVqyAp6cnYmJiaoqb\nFRUV2Lp1K/r3749evXohMzMTEyZMoOImaVIkJCRw/PhxREVFITAwkO04hEUlJSUwMzNDSkoKbty4\nQcVN8k0tWrTAtGnT8Ndff7EdhQiI8vJy2NnZYd++fVTcJIQIDCpwElbFx8dDX1+f7RhCydzcHOPG\njYOVlRVquxB7wIABuHPnTs1/V1JSwuXLl7Fnzx74+PjwKyohXyktLYW+vj5evHiBz58//9I9ysrK\nYGNjgwsXLgD4t2BqbW2NMWPGYPLkybh79y5Gjx5dc/758+ehpqaGpKQk3Lp1C25ubpCWlubJ90OI\nsJGTk0N0dDSWL1+OhIQEtuMQFrx48QJDhgxB69atERMTAzk5ObYjEQFma2sLf3//r1rIkKZp27Zt\nUFVVhbGxMdtRCCGkBhU4CasSEhLwxx9/sB1DaO3cuRMvXrzAnj17anV+v379kJGRgaqqqpqvde7c\nGZcuXcKWLVvg7+/Pr6iEcFm2bBkKCgq4fhZ/RVlZGSwsLODq6go1NTUoKCggJycHtra2EBcXBwA8\nfvwYpqamcHBwgIeHB6KiomiCNCEAevfujeDgYJibm+PJkydsxyENKC0tDYMGDYKZmRkCAwMhUfeV\nvQAAIABJREFUKSnJdiQi4Pr27QtlZWVERUWxHYWwLCcnB15eXti7dy/bUQghhAv14CSsqa6uhry8\nPLKzs7mGfpC6ycvLg7a2Nk6cOAE9Pb2fnq+srIzIyEj07duX6+sPHjyAoaEhNm3ahFmzZvErLiFI\nT0/HoEGD6rQt/We6d++O2NhYdOnSpeZrZWVl2L59O7y8vODs7AwnJyc0a9aMZ88kpLHw9PREQEAA\nrl+/DllZWbbjED47c+YMZs+ejf3792Py5MlsxyFC5NixY/Dz88Ply5fZjkJYwjAMRowYAWNjYzg6\nOrIdhxBCuNAKTsKa+/fvo127dlTcrKfOnTvj4MGDsLCwwJs3b356voaGBtc29S+UlZURGxuLlStX\n4ujRo/yISgiAf1ceV1RU8PSeL168QNu2bQH8+8d3ZGQk+vTpg6ysLKSmpmLFihVU3CTkOxYtWgQd\nHR1YWlqCw+GwHYfwCcMw2Lt3L+bNm4fo6GgqbpI6mzhxIjIzM5Gdnc12FMKSo0ePoqioCPb29mxH\nIYSQr1CBk7CG+m/yztixY2FpaQlLS0tUV1f/8NwBAwbUDBr6r169euHixYtYtmwZQkND+RGVNHEV\nFRUICwv76c9pXYmKiiIsLAw5OTkwMjLCqlWrEBAQgNDQUJr4SshPiIiIYP/+/Xj//j1Wr17NdhzC\nB1VVVXBwcICvry8SExMxaNAgtiMRISQpKQlra2vq295EvXv3DsuWLYOvr29NGyBCCBEkVOAkrKH+\nm7y1ceNGlJeXY/PmzT8870cFTgDo06cPYmJi4OjoiIiICF7HJE1ceno6X3q9lZSUYPfu3dDX18eo\nUaOQlpYGQ0NDnj+HkMZKUlISERERCA0NxZEjR9iOQ3jo48ePMDExQW5uLhITE7laeRBSV/Pnz8fh\nw4d52maGCAdXV1eYmZlBS0uL7SiEEPJNVOAkrGAYhlZw8pi4uDiOHTsGHx8fXLp06bvnfSlw/qj9\nrpqaGs6dO4cFCxZQM3nCU3fu3Kn3YKHvefbsGdLT07FkyRJISEjw5RmENGby8vI4deoUnJyckJSU\nxHYcwgNPnz6Fvr4+OnXqhDNnzqBVq1ZsRyJCrkuXLhg8eDCOHTvGdhTSgK5fv44zZ878dCEFIYSw\niQqchBV5eXmorq5G9+7d2Y7SqCgqKuLIkSOwtLTEixcvvnlOu3bt0KJFCzx+/PiH9+rfvz/Onj2L\n+fPn48yZM/yIS5qgd+/e8bz/5hfNmjXD77//zpd7
E9JU9O3bFwcPHoSZmRmePXvGdhxSD7dv38bg\nwYNhZWWFAwcO0Ac/hGfs7Oxw4MABtmOQBlJZWQlbW1u4u7vThySEEIFGBU7Cii+rN0VERNiO0ugY\nGhpiwYIFsLCw+O5KuZ9tU/9i4MCBOHXqFKysrBATE8PrqKQJEhUV5dvvvago/ZNGCC+MGzcOS5Ys\nwfjx41FSUsJ2HPILIiIiMGbMGHh7e2PJkiX09xbhqdGjR+PNmzdITk5mOwppAHv27EHHjh1hbm7O\ndhRCCPkhejdIWEH9N/lr1apVkJaW/u6wiO9NUv8WHR0dREZGYsaMGT/c+k5IbXTs2BFSUlJ8uXeL\nFi1QWFjIl3sT0tQ4OztDXV0ds2bNosnqQoRhGOzYsQOLFy9GTEwMxo8fz3Yk0giJiYnBxsaGVnE2\nAU+ePMHOnTuxf/9++qCEECLwRJgfNeIjhE/69OmDI0eOQENDg+0ojdbbt2+hoaEBb29vjBs3rubr\npaWl2LZtG44dO4a+ffuirKwMLVu2hI6ODjQ1NaGnp/fNyYjx8fEwMzPD8ePHYWBg0IDfCRF2VVVV\nuHPnDuLi4hAdHY2EhAS+PKdDhw74+PEj2rVrBy0tLWhra0NLSwsaGhqQlpbmyzMJacwqKipgaGiI\nESNGwM3Nje045CcqKythZ2eHlJQUREdHo2PHjmxHIo3Y69ev0atXLzx69Aht2rRhOw7hA4ZhYGxs\nDD09PaxYsYLtOIQQ8lNU4CQN7u3bt+jevTsKCwu/WUgjvJOYmIgJEybg5s2bkJCQwObNm3Ho0CGI\nioqiuLiY61xJSUk0a9YMEhISsLe3h7OzM1q2bMl1zpUrVzBlyhScOHGCBkSR76qqqkJqairi4uJw\n5coVXL9+HZ07d4aBgQGGDh2KefPmoaioiKfPlJWVRUhICIyMjJCTk4Nbt27h9u3buHXrFu7fv4+e\nPXvWFD21tbXRt29fev0hpBZevXoFHR0dbN++HVOmTGE7DvmOoqIiTJo0CTIyMggJCUGLFi3YjkSa\ngKlTp2LQoEFYvHgx21EIH0RERGDt2rVITU2FpKQk23EIIeSnqMBJGlxUVBS8vb2pp2MD2b17N7y8\nvPDmzRt8/vwZlZWVP72mefPmaNGiBUJCQjBy5EiuY7GxsZg2bRqioqIwePBgfsUmQqSqqgppaWk1\nBc2EhAR06tQJBgYGGDZsGIYMGQJ5efma893c3LBt2zaUl5fzLIO8vDwKCgogJib21bGKigrcvXsX\nt27dqil8Pnv2DP37969Z5amtrY1u3brR9itCvuHu3bsYMWIEzp07B01NTbbjkP94+PAhxo0bh9Gj\nR2PXrl3ffB0khB+uXbsGGxsbZGZm0r+fjczHjx/Rt29fhISEUFsxQojQoAInaXDLli1Dq1atvtsf\nkvAOwzCwtbVFQEAAqqur63y9lJQUtm/fDgcHB66vnz9/HjNnzsTp06ehra3Nq7hESFRXV39V0OzY\nsSNXQfO333777vXZ2dlQU1P77hCsupKRkcHmzZvrtILkw4cPSE5OrlnleevWLZSVlXEVPLW0tKCg\noMCTjIQIu8jISDg4OODmzZto374923HI/7l+/TomTZqENWvWYMGCBWzHIU0MwzBQU1PDvn37MGzY\nMLbjEB5avHgxiouLERAQwHYUQgipNSpwkgY3aNAgbNu2jfo4NgBnZ2f4+PigtLT0l+8hJSWFAwcO\nYNasWVxfP336NObMmYNz585RL9VGrrq6Gnfv3q0paMbHx6NDhw5cBc127dr99D4MwyAkJARLly5F\n3759cePGjXr9bAL/Tk5XV1dHcnJyvVctvXjxoqbgefv2bdy+fRstW7bkKnoOHDgQsrKy9XoOIcJq\ny5YtiIyMxNWrV/k2LIzUXkhICBwdHREcHIzRo0ezHYc0Ufv378fVq1dx/PhxtqMQHklJScHYsWNx\n//59tG3blu04
hBBSa1TgJA2qtLQUv/32G968eUNDP/js6tWrMDIyQllZWb3vJSMjg4yMDHTp0oXr\n65GRkbC1tUVMTAz69etX7+cQwVBdXY179+5xFTQVFRW5Cpp1Xdn46NEj2NnZoaCgAH5+ftDW1oa5\nuTnOnTv3y0VOERERtG7dGsnJyejWrdsv3eNHOBwO/vnnH66i5927d9GlS5eaXp5aWlpQV1en3lSk\nSWAYBpaWluBwOAgJCaEtqSxhGAYbN27EwYMHER0dDTU1NbYjkSbs48eP6Ny5MzIzM6GoqMh2HFJP\n1dXV0NHRgYODw1eLGwghRNBRgZM0qCtXrmDlypW4ceMG21EataqqKnTq1AkvX77kyf3ExMSgq6uL\na9eufXUsPDwcDg4OuHjxIlRVVXnyPNKwOBzOVwVNBQWFmoLm0KFDf3mrdmVlJfbs2YOdO3fCxcUF\nS5YsgYSERM2x4cOHIyEhAXX9p0hSUhKysrK4du0a+vTp80vZfkVlZSXS09O5trY/evQIampqXEOM\nlJWVISoq2mC5CGkoZWVlMDAwgLGxMbWaYUFFRQXmzp2LnJwcnDp1Cr///jvbkQiBjY0NlJSU6DWh\nEdi3bx8iIiJw5coV+hCLECJ0qMBJGtTGjRvx6dMn7Nixg+0ojdqJEycwe/ZsfPr0iWf3lJKSQkpK\nClRUVL469vfff8PZ2RmXLl365nEiWDgcDtLT02sKmteuXUO7du24Cpq8eNN88+ZNzJ8/H7///jsO\nHDjAtcoyPz8fS5cuxY0bN2BkZIQjR46gsrISnz9//ul9ZWRkMGzYMBw8ePCHvT4bSnFxMe7cucM1\nub2oqAiamppc29s7dOjAdlRCeOLly5fQ0dGBh4cHJk6cyHacJuPt27eYMGECFBQUEBwcTDthiMBI\nS0uDiYkJHj16BHFxcbbjkF/0/Plz9O/fH/Hx8ejduzfbcQghpM6owEka1J9//gl7e3uYmJiwHaVR\n09PTQ2JiIk/vKS4uDhsbG3h5eX3z+OHDh7FixQpcvnwZPXv25OmzSf1wOBxkZGQgLi4OcXFxuHr1\nKuTl5WFgYFDzH15uK/v48SNWrVqF8PBw7N69G1OnTq1ZBVBRUQF3d3fs2rULCxcuxPLlyyEtLY3n\nz5/Dw8MDvr6+AP7dIvVl67q4uDhkZGRQXl4OfX19uLq6YsSIETzLyw+vX79GcnIy1+R2SUlJrlWe\nmpqaaN26NdtRCfklKSkpGD16NC5cuIABAwawHafRy8nJwdixY2Fubo7NmzfTCnEicAYPHgxXV1eM\nHz+e7SjkF5mbm6N3797YuHEj21EIIeSXUIGTNJiqqirIycnh8ePH1LCajzgcDqSlpVFRUcHze/fs\n2RM5OTnfPX7w4EGsW7cOcXFx6N69O8+fT2qHw+Hg/v37XAVNOTk5roImv6Ygf5m0/Oeff2Lnzp2Q\nk5OrOXb+/HksWrQIKioqcHd3/2bfzM+fPyM9PR3JycnIzMyEj48P3Nzc0L9/f2hqakJeXp4vufmN\nYRg8efKEa5XnnTt30KFDB65Vnv3790fz5s3ZjktIrYSFhWHp0qW4efMmbZXmoytXrsDCwgJbt26F\ntbU123EI+abg4GCEhITg/PnzbEchv+Ds2bNYtGgR0tPTaYgcIURoUYGTNJiUlBTMnDkT9+/fZztK\no5adnQ1NTU2UlJTw/N4SEhIoKSmp6aH4LX5+fti8eTPi4uLQtWtXnmcgX+NwOMjMzOQqaLZu3Zqr\noMnv7dH5+flwcHBAZmYmfH19YWBgUHPs8ePHWLJkCe7fvw9PT0+MGTOmVvcsLi6GgoICX36WBUFV\nVRWysrK4VnlmZ2dDRUWFa4iRiopKvSfEE8Ivbm5uOH/+PK5cuULFeT4IDAyEq6srjh07hmHDhrEd\nh5DvKi8vh5KSEpKSkuhDbiFTWlqKvn37ws/PDyNHjmQ7DiGE/DIqcJIG4+npia
ysLPj4+LAdpVG7\nevUqxo8fjw8fPvD83s2bN8ezZ89+uopu//792LVrF65evYpOnTrxPEdTxzDMVwXNli1bchU0O3bs\n2CBZqqurceDAAbi5uWHBggVYsWJFTZGjrKwM27dvh5eXF5ydneHk5IRmzZrV+t4VFRWQlZWtVV/O\nxqK0tBRpaWlcQ4wKCgowcOBAru3tnTp1oub/RCBwOBxYWFigefPmOHToEP1c8giHw8GqVatw/Phx\nnDlzhvrhEaGwdOlSiIqKUq99IePq6oqnT58iJCSE7SiEEFIvVOAkDWbSpEkwNTWFpaUl21Eatbi4\nOJiamvKtwJmXl4d27dr99FxPT0/s27cPcXFxDVZsa6wYhkFWVlZNQTMuLg6ysrJcBU0lJaUGz3Xv\n3j3Mnz8fEhIS8PX1rZlmzjAMoqKisGTJEmhra2PXrl2/lI/D4UBMTAwcDqdJF03evXtX08/z9u3b\nuHnzJjgcDtcqTy0tLaHdvk+EX2lpKf744w9MmTIFLi4ubMcRemVlZZg5cyZevnyJyMhI+t0mQuPB\ngwfQ09PD06dPaUW3kMjIyIChoSHu3btHrUYIIUKPCpykQTAMA0VFRdy8eROdO3dmO06jlpWVBW1t\nbRQXF/P83hISEiguLoakpGStzt+1axf8/PwQFxfHt56PjRHDMMjOzuYqaMrIyHAVNNlcGVtaWooN\nGzYgICAAW7ZswZw5c2oGXuTk5GDx4sV49uwZ9u3bB0NDw3o9S0xMDBUVFTSV9X8wDIP8/HyuVZ4p\nKSmQl5fnWuU5YMAAyMjIsB2XNBH5+fkYNGgQvL29aZBgPbx69QomJibo0aMHAgICqEhEhM6ff/6J\nmTNn0oIGIcDhcDBkyBBYWlrC1taW7TiEEFJvVOAkDeLBgwcwNDTE06dPm/RKrIZQXV0NGRkZvgwZ\nUlZWRm5ubp2u2bp1K4KDgxEXFwcFBQWeZ2oMGIZBTk4OV0FTSkqKq6ApKB8MXLhwAXZ2dtDS0oKH\nh0fNp/3FxcXYtGkTAgICsHLlStjb2/+wV2ttNW/eHEVFRdTw/ic4HA5ycnK4hhhlZGRAWVmZa4hR\n3759efL/F0K+5datWxg3bhwuXboENTU1tuMInYyMDIwbNw5WVlZYu3Yt/b1EhNLJkyexa9cuXL9+\nne0o5Cf8/f3h7++PxMTEmg+qCSFEmFGBkzSIwMBAXLx4kXq7NJDBgwcjKSmJp/cUFxfHvHnz4O3t\nXedrN2zYgNDQUMTFxeG3337jaS5hxDAMcnNzuQqakpKSGDZsWE1Bs0uXLmzH5PL69Ws4OTkhISEB\nBw4cgJGREYB/v5fQ0FAsW7YMhoaG2L59O0+3OMnKyuL58+do2bIlz+7ZVFRUVODevXtcQ4zy8vLQ\nv39/rqJn9+7dqZBCeCYkJASrVq3CrVu3Gvz1/syZM/D09ERmZiYKCwuhqKiIgQMHwsnJCYMHD27Q\nLHUVExODGTNmwN3dHdOnT2c7DiG/rKqqCl26dMHZs2ehrq7OdhzyHa9fv4aqqiouXryIfv36sR2H\nEEJ4ggqcpEFYW1tDU1MTCxYsYDtKkxAeHg4rKyueblOXkpJCcnJyTZ/FulqzZg1OnTqFy5cvo23b\ntjzLJQwYhsGDBw+4Cpri4uJfFTQFscjEMAyCgoKwfPlyzJw5E25ubjXbntPT0+Hg4IAPHz7Ay8sL\nenp6PH++nJwcHjx40OR+Zvjl48ePSElJqSl63rp1C6WlpdDU1OTq6Ul9uEh9rFq1CteuXcOlS5dq\n3dKkvpYvX44dO3agbdu2MDU1hby8PP755x+cOnUKVVVVCA4OFtgts18GtYWHh0NfX5/tOITUm5ub\nGwoKCnDgwAG2o5DvmDlzJtq1a4ddu3axHYUQQniGCpykQfTs2RMRERG0Za2BVFZWQklJCa9eveLJ\n/cTExKCjo1Ov7UYMw2DFihW4cOECLl26hD
Zt2vAkmyBiGAb//PMPV0FTVFSUq6DZtWtXgSxo/q/c\n3FzY2Njg06dP8PPzg4aGBgDg/fv3WL9+PUJCQuDm5ob58+dDTEyMLxkUFBRw9+5dKrjx0cuXL2u2\ntX/5v7KyslwFz4EDB9IqWlJrHA4HZmZmkJOTg7+/P99f6woKCtChQwf89ttvuHfvHtcgvCtXrsDQ\n0BBdu3bFo0eP+Jqjrqqrq7Fs2TKcPXsWZ86cQffu3dmORAhPPH/+HKqqqnj69ClkZWXZjkP+4/Ll\ny7CyssL9+/fRokULtuMQQgjPUIGT8F1BQQFUVFRQWFhI/V0a0OXLlzFu3DiUlZXV+17S0tJIT09H\nt27d6nUfhmGwdOlSXLt2DRcvXkTr1q3rnU0QMAyDhw8fchU0AXAVNLt16ybwBc0vPn/+jO3bt8PT\n0xOrV6+Gvb09xMXFweFwEBwcjBUrVsDExASbN2/m+3RfJSUlJCYmsjIlvqn6UqD/36JnWloaOnfu\nzFX0VFdXR7NmzdiOSwRUcXEx9PX1MWvWLCxZsoSvz7p58yYGDRoEExMTREVFfXW8ZcuWYBgGnz59\n4muOuiguLsa0adNQXFyMiIiIRv2hH2mazMzMMGLECNjZ2bEdhfyPiooKqKurY+fOnTQQjhDS6FCB\nk/BdREQEAgMDcfr0abajNDmLFi1CQEAASktLf/keIiIi0NfXx+XLl3kyyZphGDg6OuLmzZu4cOGC\nUK4KYxgGjx8/xpUrV2oKmhwOh6ugKax9DRMSEjB//nx0794d+/fvr5nWnpKSAnt7ezAMAy8vL2hq\najZInm7duuHixYu0solllZWVyMjI4Jrc/s8//0BNTY1rcnvPnj3pgyxSIy8vD4MHD0ZAQEBN315+\nePfuHRQVFSEnJ4f09HSuD16uXbuGoUOHwtTUFCdPnuRbhrrIz8+HsbExNDQ0cODAgQbbxk9IQ4qN\njYWTkxPu3r0rlH8PNVYbNmxAamqqwLweEkIIL1GBk/Cdo6Mjfv/9d7i6urIdpcnhcDiYO3cujh8/\njpKSkjpfLy0tXdNLTVxcHKGhoTX9F+uDYRgsXLgQ9+7dw/nz5wV+ewzDMHjy5AlXQbOqqoqroNmj\nRw+h/gP+/fv3WL58OU6fPg1PT0+YmZlBREQEhYWFWLVqFaKiorBlyxbMmjWrQQtYvXr1QlRUFHr3\n7t1gzyS1U1JSgjt37nBtbS8sLKzp5/ml8NmhQweh/t0g9ZOYmAhTU1PExcX9cg/n2vDw8ICTkxPk\n5eVhamqKtm3b4uHDhzh16hSGDBmCI0eOcG1dZ0tqaipMTExgb28PFxcX+t0gjRaHw4GKigoOHjzI\nlx7dpO4ePHiAwYMHIzU1lXbGEEIaJSpwEr7T1NSEp6cn/XHDEoZh8Ndff8HJyQkVFRWoqqr66TXN\nmjWDjIwMjhw5AiMjI1RWVmL+/Pm4f/8+Tp8+zZM3iRwOBzY2NsjNzcXZs2d5UjjlpSdPniAuLq6m\nqPn582eugqaysnKjeGPKMAyOHz+OJUuWYPz48di6dStat26N6upq/PXXX1i3bh0sLCzg5ubGSksB\nVVVV/P3339S/V0i8efMGycnJXEOMJCQkuFZ5ampq0nbcJubQoUPYuHEjbt68ydeBYZGRkbC2tkZR\nUVHN13r06AE3NzdMmzaNb8+trVOnTmHOnDk4cOAAJk2axHYcQvjO3d0dKSkpOHLkCNtRmjyGYTBy\n5EiMGTMGTk5ObMchhBC+oAIn4atPnz5BUVERhYWF1KuNZU+fPsWGDRsQEhICCQkJlJSUoLq6uua4\nqKgoJCQk0Lx5c9ja2sLV1ZWroMUwDNatW4eQkBCcP38ePXr0qHcmDoeDOXPm4OnTpzh9+jSkpKTq\nfc9flZeXx1XQLC8v5ypo9uzZs1EUNP9XXl4eFixYgLy8PPj5+UFXVxfAvyuu7O3tISsri3379kFd\nXZ21jA
MGDEBAQEDNgCMiXBiGQV5eHtcqzzt37kBRUZFrlWf//v1Z/f0n/Ofi4oLbt2/jwoULkJCQ\n4Pn9d+zYgZUrV2LRokWwt7fH77//juzs7JrhdsuWLcOOHTt4/tzaYBgGHh4e2LVrF06ePAltbW1W\nchDS0N69e4fu3bsjNzcXv/32G9txmrSjR49i165duH37Nk9aThFCiCCiAifhq4sXL2Ljxo24du0a\n21HI//n06ROuXLmCW7duITU1FWVlZZCVlUXLli2RlZWFxMTEH/YD8/X1xfr16xEZGQkdHZ1656mu\nrsasWbPw5s0bREVFoXnz5vW+Z208ffqUq6BZWloKAwODmqJmr169Gl1B84uqqirs3bsXW7ZswZIl\nS7Bs2TJISkqioKAArq6uiI2Nxc6dO2FhYcH6/wba2trYt28fT37WiGCorq5GVlZWzQrP27dvIysr\nC7179+YaYtSnTx+IiYmxHZfwSHV1NUxNTdGhQwccOHCAp68tcXFxGDZsGCZMmIATJ05wHSstLUXP\nnj3x8uVLPHjwoN7D8uqqqqoKDg4OSEhIwOnTp9G5c+cGfT4hbLOysoKKigpcXFzYjtJkFRUVoU+f\nPoiKiqIPWAghjRp9fEP4Kj4+Hvr6+mzHIP9DVlYWJiYmX01OLCwsRNeuXX/aX9HGxgbt27fHuHHj\nEBgYiHHjxtUrj5iYGIKCgmBpaQkzMzOcOHGCL6t9nz17xlXQLC4urilouri4oHfv3qwX8xpCSkoK\n5s+fj1atWuHGjRtQVlZGZWUlPDw8sHnzZlhbWyMrKwuysrJsRwUASEhIoLKyku0YhIfExMSgqqoK\nVVVVWFtbAwDKysqQlpaG27dv4/Lly9i2bRtevnwJDQ0Nru3tnTt3bhK/p42RmJgYjh49Cl1dXezf\nvx/29vY8u/eXIYbDhg376pi0tDS0tbVx8uRJpKamNmiB88OHD5g8eTJERUVx/fp1oRyqR0h92dnZ\nwcLCAkuXLqUhdCxxdXXFxIkTqbhJCGn0qMBJ+CohIQHLli1jOwaphbZt26Jz585ITU2FlpbWD881\nNjbGmTNnMH78eLi5uWH+/Pn1era4uDgOHz6MqVOnYvLkyQgLC6v3VNn8/HyugubH/8fencfVmPf/\nA3+1LxQlO9lSWijt0nLKHSGy1ISxTIMWS2EYkXXGkhhLosi+NHVXliIxbbSniBQRIWur9r3z+2O+\nc373GVvLqavl/Xw8PB73fc65rut1Zkad63U+S0kJp9Bcu3YtFBUVu1RRUlZWhi1btuDixYtwc3PD\nwoULwcfHh8jISKxcuRIDBgxAdHR0u9vMhwrOrkFMTAzjxo3DuHHjOI8VFRVx1vP08fGBk5MT6urq\nuEZ5amlp0bTHDkRSUhJBQUHQ09ODgoICTE1NeXLe6upqAH+vAfsl/zzelruVv3z5Eubm5jAyMsKh\nQ4doSijpsrS0tCAlJYWbN29i8uTJTMfpcuLi4nDt2jVkZGQwHYUQQlodTVEnraampga9evVCTk4O\nI5uTkKZbvnw5hg8fjl9++aVRr8/KyoKZmRnmzp2L3377rcWFYU1NDaysrCAoKAhfX98mrdP29u1b\nrkKzuLgYRkZGnCnnSkpKXarQ/F/Xr1/HsmXLYGRkhD/++AO9e/fGmzdvsHbtWiQkJODAgQOYMWNG\nu/znY2pqinXr1mHixIlMRyEMY7PZePv2LWctz6SkJCQnJ6NXr15cozzV1dXb3aZlhNudO3dgZWWF\n6OhoyMvLt/h8//3vf2FtbY2+ffsiJSUFAwcO5Dx348YNTJ06FSIiInjz5k2rbnL0j8TERMycORPr\n16+Ho6Nju/zZSkhbOnHiBIKCghAUFMR0lC6ltrYWGhoacHFxgbW1NdNxCCGk1VHBSVon/rcmAAAg\nAElEQVRNYmIi7OzskJqaynQU0kh+fn7w8fHB1atXG31Mbm4uzM3Noays
jOPHj7d484jq6mrMmjUL\n3bt3x8WLF7866uXdu3eIiorilJpFRUWfFZpdfSrU+/fv4eTkhHv37sHLywv/+c9/UF1djQMHDmDf\nvn1Yvnw51q9fD3FxcaajftXUqVPh4ODQ4qUQSOfU0NCAp0+fcm1ilJaWBjk5Oa5NjFRUVFplYxvS\nfCdOnMDevXuRkJAAKSmpFp2roaEBkyZNQlhYGCQkJDBz5kz069cPjx8/xrVr1zib/Dg5OfEo/df5\n+/tj2bJlOHXqFKZNm9bq1yOkIygvL4esrCzu378PWVlZpuN0GXv37kVYWBhCQ0PpixZCSJdABSdp\nNfv27cPLly/h4eHBdBTSSO/fv4eysjLy8/ObVA6Wl5fD2toadXV18Pf3b/H6jVVVVbCwsICMjAzO\nnTsHAQEBvHv3Drdv3+YUmgUFBVyFprKycpcvNP/R0NCA48ePY/PmzbC1tcWmTZsgJiaG0NBQODo6\nQlFREQcOHGjzzTaaY8aMGVi0aBFmzpzJdBTSQVRXVyMtLY1rE6OXL19CVVWVa3q7nJwc3fAxbPXq\n1Xj06BFu3LjR4inctbW1OHLkCHx9fZGRkYGKigpIS0tDW1sbjo6OrT4KnM1mw9XVFUePHkVQUBDG\njh3bqtcjpKNxdHSEpKQkduzYwXSULuHVq1fQ0NBAYmIiRowYwXQcQghpE1RwkkZhs9k4ceIETpw4\ngfT0dLDZbCgqKmLJkiWwtbX9YrE0Y8YMzJ07l6ZEdDDy8vIIDAzE6NGjm3RcXV0dli1bhpSUFFy/\nfh39+vVrUY4XL17AwsICNTU1YLPZyM/P5yo0VVRUqND8gvT0dNja2nJKztGjRyM7OxurV69Geno6\nDh06hClTpjAds9GsrKxgZWWFH374gekopAMrKSnBvXv3uErP0tJSzjqe/5Se/fv3Zzpql1JXVwdz\nc3PIy8vD3d2d6TjNVlNTA3t7e6SmpiI4OJhrijwh5G8ZGRmYMGECXr161abr4XZFbDYb06dPh66u\nLlxcXJiOQwghbYbaAdIo8+fPh62tLV6+fIm5c+diyZIlqKiogIODA3766afPXs9msxETE0M7qHdA\nhoaGuHPnTpOPExQUxLFjx2BhYQE9PT1kZmY26fgPHz7Az88PDg4OGDVqFDQ1NTFkyBDU19dDRUUF\nubm5uHz5MhwdHTFmzBgqN/+lqqoKmzdvBovFwo8//ojY2FjIyclh27Zt0NLSgo6ODh49etShyk2A\nNhkivCEpKQkWi4Vff/0VAQEBePXqFTIyMrBixQrw8/Pj6NGjUFZWxuDBgzF79my4uroiIiICJSUl\nTEfv1AQFBeHn54e//voLx44dYzpOsxQWFmLSpEkoKChAdHQ0lZuEfIWSkhIUFBRw5coVpqN0epcv\nX8bz589po1dCSJdDWzqS77p8+TJ8fHwwbNgwJCUlQUZGBsDfIxZmz56N8+fPY8aMGZg1axbnmCdP\nnkBSUpI+6HdAhoaGuHbtGpYvX97kY/n4+LBlyxYMHjwYRkZGuHTpEvT09L742o8fP3JNOf/w4QMM\nDQ3BYrFgZ2eH0aNHQ0BAAGVlZZg8eTJWrlyJo0eP0pTSL4iMjOT8M0tNTcWAAQNw9epVrF69Gtra\n2rh//z4GDx7MdMxmoYKTtJZ+/fph2rRpnHUS2Ww2nj9/zlnLc/PmzXjw4AEGDx7Mmdqura2NMWPG\nQEREhOH0nUePHj0QFBQEfX19yMvLw9jYmOlIjZaVlYWpU6fC3Nwcbm5uEBAQYDoSIe2ag4MDPD09\naVZGKyotLYWTkxMuXrxII2UJIV0OTVEn37Vw4UKcP38eHh4en5VeqampGDt2LIyNjREREcF53Nvb\nG9HR0Th37lxbxyUt9PLlS+jq6uL9+/ctKhNv3LiBhQsX4vjx45g5cyZyc3O5Cs3379/DwMAALBYL\nxsbGGDNmzFdvDktLSzFx4kRoamrC
3d2dSs7/U1BQgLVr1yI8PBweHh6YPn06MjMz4eTkhJycHBw+\nfBgmJiZMx2yRJUuWQEdHB0uXLmU6CumCamtrkZ6ezrVz+7Nnz6CiosK1iZGCggKNKm+hiIgIzJs3\nD7GxsR1ivbjo6GhYWVlh27ZtsLe3ZzoOIR1CTU0NZGVlERkZCUVFRabjdEqrVq1CSUkJTp06xXQU\nQghpc/RpnHzXhw8fAOCLG5L881h0dDRqamo4j0dHR8PAwKBtAhKeGjJkCISFhfHs2bMWnUdTUxNr\n167F/Pnz0b9/f8jLy+PcuXMYPnw4Lly4gPz8fAQFBWHNmjUYO3bsN0e+SEhIIDQ0FImJifjll1/Q\n1b+XYbPZuHDhApSVlSEpKYn09HSYmJjA2dkZ+vr6mDRpElJTUzt8uQnQCE7CLCEhIaipqWHp0qXw\n9vbGgwcPkJeXh/3792P48OEIDQ2Fubk5pKSkOH8HAwMDkZOT0+V/TjWViYkJtm7dimnTpqG4uJjp\nON904cIFzJ49G+fOnaNyk5AmEBYWxuLFi+Hl5cV0lE7p3r17+PPPP+Hm5sZ0FEIIYQRNUSff9c+U\n9Ozs7M+ee/HiBYC/Nwp48eIFRo0aBQCIiYnBhg0b2i4k4Rk+Pj7OOpzy8vKNPi4/P58zQjMqKgo5\nOTnQ19eHo6MjfHx8sGDBAri6ujZ7lFOPHj1w8+ZN/Oc//4GzszNcXV275EjO58+fw8HBAbm5uQgO\nDoampib8/Pywbt06mJiYIC0trcUbPLUnVHCS9qZbt27Q19fnWmM6Pz8fycnJSEpKwunTp+Hg4AAB\nAQHOCE9tbW1oampCWlqaweTtn4ODAx49eoS5c+ciODi43U35ZrPZ2LZtG86dO4fIyEgoKyszHYmQ\nDsfW1hbq6urYtWsXunXrxnScTqO+vh52dnZwdXXl3LsRQkhXQyM4yXdNnToVALB//34UFhZyHq+t\nrcXWrVs5/7+oqAgA8PbtW5SWlnLKTtLxNGajofz8fFy6dImz6c+IESNw6tQpyMrK4vTp08jPz8e1\na9ewe/dupKSkIDo6GgsXLuQa6dtUUlJSuHXrFm7evInNmzd3qRFStbW1cHV1hY6ODiZOnIjk5GSI\niorC2NgYe/bsga+vL86ePdupyk2ACk7SMcjIyMDMzAxbtmzBtWvX8PHjRyQkJGDBggUoKSnBrl27\nMGTIEIwcORI//vgjDh48iLi4OFRWVjIdvd05ePAgampq8OuvvzIdhUtVVRV+/PFH3Lx5EwkJCVRu\nEtJMQ4YMgZ6eHnx9fZmO0ql4enpCXFz8i5u/EkJIV0FrcJLvqq+vx9SpU3Hz5k307dsXFhYWEBUV\nRVhYGN6/fw8JCQm8fv0aCQkJ0NHRgZ+fH/7880/aJbEDy8zMxKRJk/Dy5UvOYwUFBbhz5w5nhObL\nly8xfvx4zhqaY8eOhaDg1weFV1RUYN68eSgrK0NgYCB69OjR7Hx5eXkwNjaGlZUVV8neWSUkJMDW\n1hYDBw7E0aNHISUlhW3btsHHxwfbt2+Hra1tuxvpxCsbNmyAhIQENm7cyHQUQlqkvr4eT5484azl\neffuXWRkZEBBQYEzylNLSwtKSkrf/FnaFRQWFkJXVxfOzs74+eefmY6DvLw8zJgxAwMHDsTZs2ch\nJibGdCRCOrSQkBBs2bIFycnJTEfpFN69ewdVVVXcuXOH1jYlhHRpNIKTfJeAgACCg4Ph6uqK3r17\n4+zZszh79ixGjhyJuLg4SEhIAAD69OkDgNbf7Azk5eVRUVGB48ePY9WqVVBTU8OwYcNw/PhxDBgw\nAMePH0dBQQFCQkLw66+/QktL67s35OLi4ggMDIS8vDwMDQ3x7t27Zufr3bs3wsPD4evri127djX7\nPO1dSUkJVqxYgZkzZ2LDhg24du0abt++DUVFRVRWViIjI4MzFbazohGcpLMQEBCAsrIybGxs4Onp\n
ieTkZBQWFsLT0xMqKiqIioqClZUVpKSkYGhoiF9++QV+fn7Izs7uUqPVAUBaWhpBQUFwdnZGTEwM\no1keP34MXV1dsFgs+Pr6UrlJCA9MmjQJhYWFuHv3LtNROoVVq1bBzs6Oyk1CSJdHIzhJi1RVVaFH\njx6QlJREXl4eAEBNTQ3Hjh2Djo4Ow+lIUxQVFXGN0Hz06BEUFRUxb948sFgsaGhoQEhIqMXXYbPZ\n2LNnD7y8vBASEgIlJaVmn+v9+/dgsVhYsmQJ1q1b1+Js7cnly5excuVKmJmZwc3NDdnZ2VixYgXY\nbDY8PDygqanJdMQ28fvvv6O6uho7duxgOgohbeLTp0+c9Tzv3r2LxMRE1NTUcI3y1NLS4nyp2Jnd\nunULixYtQnx8PIYOHdrm1w8PD8e8efOwZ88emvZJCI/t2bMHmZmZtNt3C924cQMrVqzAo0eP6AsY\nQkiXRwUnaZEzZ87AxsYGK1euhLu7Oz59+oTBgwejoKAAwsLCTMcj3/Dp0yeuQjMrKwvjxo0Di8UC\ni8VCYmIi0tPT4e3t3SrXP3/+PNauXQt/f38YGho2+zxv376FkZERVqxYgVWrVvEwITPevHmDFStW\n4MmTJzh+/DiUlZXh4uKCq1evYteuXVi0aFGzN2rqiFxdXVFUVIQ9e/YwHYUQxrx9+xZ3797lTG9P\nTk6GlJQU1yZG6urq6N69O9NRec7d3R0nTpxAbGwsZ8ZIWzhx4gRcXFzg5+cHFovVZtclpKvIy8uD\nvLw8Xrx4ASkpKabjdEgVFRVQUVGBp6cnJk2axHQcQghhXNde5Ik0WklJCSQlJbkeS01Nxbp16yAl\nJQVnZ2cAQHx8PLS0tKjcbIc+ffqE6OhoTqH59OlTTqH5z4jA//33Ji4uDk9Pz1bLs2DBAvTr1w+W\nlpY4cuQIrKysmnWegQMHIiIiAiwWC0JCQli+fDmPk7aN+vp6HD16FNu3b8eKFSvg4+ODc+fOwcrK\nCnPmzMHjx4/Rs2dPpmO2OZqiTsjfP+cGDhyIGTNmAAAaGhrw7NkzzijPgIAAPHz4ECNGjOCM8tTW\n1sbo0aN5MvKeSStXrsSjR48wf/58XL58udW/4GloaMCGDRtw6dIlREdHQ15evlWvR0hX1bt3b0yZ\nMgVnz57tFF9QM2HHjh3Q1tamcpMQQv4PFZykUUxNTSEmJgYVFRVISEjg8ePHuH79OsTExBAcHIwB\nAwYA+Hv9TX19fYbTEgAoLi7mKjQzMzM564i5u7t/t4hWUVFBbm4uPnz40Go7c5uamuLWrVswNzfH\n27dvm/0BV1ZWFhERETAyMoKQkBBsbW15nLR1PXjwALa2thAVFUVMTAwKCwuhr68PCQkJ/PXXXxgz\nZgzTERlDBSchn+Pn54eCggIUFBSwYMECAEBNTQ3S0tKQlJSExMREeHh4IDs7G2PGjOGa3i4nJ9eh\nRoHz8fHBw8MDEydOhIuLC3bv3t1q16qoqMCCBQuQl5eH+Ph4yMjItNq1CCGAg4MDFi9eDCcnJ/Dx\n8TEdp0P5Z5bVw4cPmY5CCCHtBhWcpFEsLS3h6+uLCxcuoLKyEgMHDoStrS02bNiAQYMGcV4XExOD\nzZs3M5i06yopKeEqNJ88eQIdHR2wWCwcPHgQ2traTRpZKyAgAH19fURHRzd7dGVjqKmpITY2FmZm\nZsjJycHevXubdfM9dOhQREREwNjYGIKCgu1i593vqaiowPbt23H69Gns3r0bkydPxsaNGxEWFoa9\ne/dizpw5Xf4DPxWchDSOsLAwNDQ0oKGhAQcHBwBAaWkpUlJScPfuXVy5cgUuLi4oLi7mrOP5T/HZ\nv39/htN/m7CwMAICAqCjowMlJSVOqfslZWVlKCkpgaCgIGRkZBr9++T9+/eYPn06FBUV4ePjAxER\nEV7FJ4R8xfjx4yEsLIyIiAhMmDCB6TgdRkNDA+zt7bF9+/Z2//
ObEELaEq3BSXimuroavXr1wvv3\n79t0nayuqqSkBDExMZxC8/Hjx9DW1uasoamtrd3iG7S9e/fi9evXOHz4cLOOLygowOXLl3H9+nWk\npaXh7du3EBYWxujRo2FjYwMbGxvOzWdhYSEsLCwwcOBA2Nvbw83NDQkJCaisrMTIkSPx888/Y+XK\nld/dMfzp06cwMTHBrl27sHDhwmblbgs3b96Eg4MDdHV14ebmhoCAAOzcuRM///wzNm3aRH+H/s+J\nEycQHx+PkydPMh2FkE7h48ePuHv3LteanmJiYlxT2zU1NdGjRw+mo34mPT0dxsbGCAoKgq6uLoC/\nb/Rv3boFT09PJCYmoqCgAEJCQmhoaAAAjBo1CpaWlrC1tf3qxkwPHz7EtGnTsHTpUri4uHT5L5YI\naUtHjx5FREQEAgICmI7SYZw8eRLHjx9HXFzcdz8XE0JIV0IFJ+GZ2NhYODk5ITk5mekonVJpaSlX\noZmenv5ZoSkqKsrTayYlJWHp0qV48OBBs4738vKCg4MD+vfvD2NjY8jKyuLjx4+4dOkSiouLMXv2\nbPj7+3NuJquqqmBiYoL4+Hh069YN1tbWkJaWRnBwMDIzM2FpaQl/f//vXvfJkycwMTHBvn37MG/e\nvGZlby25ublYvXo14uLi4OnpCREREaxcuRIDBgyAu7s7Ro0axXTEduXs2bMIDw/HuXPnmI5CSKfE\nZrORnZ3NKTvv3r2L+/fvY9CgQVxT21VVVXn+O6Y5rl+/DltbW8THx+Px48ewsbFBaWkpysrKvnqM\nqKgo2Gw2Fi5ciP3793NtxhQSEoJFixbB3d0dc+fObYu3QAj5HyUlJRgyZAjS09M5S16Rr8vLy4OK\nigpu3rwJNTU1puMQQki7QgUn4RlXV1d8+PABBw8eZDpKp1BWVsZVaD569AhaWlqcQlNHR6fVbzZr\na2vRq1cvvHz5EtLS0k0+PiIiAuXl5Zg6dSrXNMEPHz5AW1sbOTk5CAgIwOzZswH8/SFXTk4OBQUF\nGDp0KKKiojB48GCu4vPPP//EnDlzvnvt9PR0/Oc//4G7u3urTrFvLDabjVOnTmHDhg1YtGgRli5d\nii1btiAhIQEHDhzAjBkzaNTQF/z555+4evUqfH19mY5CSJdRV1eH9PR0rlGeT58+hbKyMtfUdgUF\nBUZGD+3Zswd79uxBVVUVKisrG32cqKgoevTogevXr0NDQwNHjhzBjh07EBgYCD09vVZMTAj5Fnt7\newwYMABbtmxhOkq7t2jRIsjIyOCPP/5gOgohhLQ7tAYn4ZmYmBjY2NgwHaPDKisrQ1xcHCIjIxEV\nFYW0tDRoamqCxWLB1dUVurq6bT56RkhICLq6uoiNjcW0adOafLyJickXH+/Xrx/s7e3h4uKCqKgo\nTsEZEBCAvLw8LFy4EKNHj4aenh5CQkIwevRo7NixAxMmTICnp2ejCk5lZWWEhoZi0qRJEBQUxMyZ\nM5ucn1cyMzNhZ2eH8vJyXLt2DREREdDT08Py5ctx6tQpiIuLM5atvaM1OAlpe4KCglBVVYWqqiqW\nLFkC4O81g+/fv4+kpCTcunULO3bsQG5uLjQ0NLhGeg4ePLhVv6ypra1FeHg4SkpKUF9f36Rjq6qq\nUFVVBSMjI0ydOhVpaWmIjY3F8OHDWyktIaQxHBwcYG5ujo0bN0JQkG5PvyYyMhKRkZHIyMhgOgoh\nhLRL9BuE8ERDQwNiY2NpnbwmKC8v5yo0Hz58CA0NDbBYLOzatQu6uroQExNjOiYMDQ1x+/btZhWc\n3yIkJAQAXB9kIyIiAABmZmaYO3cuBg4ciAkTJsDPzw+GhoYQFxdHXFwcqqurG7W+qKqqKkJCQjB5\n8mQICgry/D18T3V1Nfbs2QN3d3ds3rwZcnJymD9/PhQVFZGUlEQ31Y1ABSch7YO4uDjGjx+P8ePH\ncx4rKChAcnIykpKScObMGS
xbtgx8fHxcozy1tLSaNQPga1avXo3Y2Ngml5v/q7y8HIGBgcjIyKCf\nw4S0A6qqqhg8eDCuXbuGGTNmMB2nXaquroaDgwPc3d25ltkghBDy/1HBSXgiPT0dvXv3Rt++fZmO\n0m5VVFRwFZoPHjyAuro6WCwWduzYAV1d3XY5ks/Q0BBr167l6Tnr6uo4ayqamZlxHs/MzAQAyMvL\nAwDmzp2Lfv36wdraGocOHcKwYcOQnp6OFy9eQFFRsVHXUldXx7Vr1zB16lScPXsWkydP5ul7+Zro\n6GjY2tpCXl4eV69exd69e+Hh4YFDhw5hypQpbZKhM6CCk5D2q1evXpg0aRImTZoE4O+lOHJycjhr\nebq6uiIlJQV9+vTh2sRo7Nixzfp9FxMTg1OnTjVpWvrX8PPzw8nJCSEhIbQ8CCHtgIODAzw9Pang\n/Ao3NzcoKCjQPx9CCPkGKjgJT0RHR0NfX5/pGO1KRUUF4uPjOYVmamoq1NTUYGxsjN9++w3jxo1r\nl4Xmv2lrayM9PR2lpaU829nb2dkZjx49wpQpUzg3xgBQXFwMAFy79xobGyM8PJxrHc9Pnz416Xpa\nWloICgrC9OnTceHCBUycOJEH7+LLioqKsH79eoSEhGDv3r148uQJLCws8Msvv8DPz6/FO9t3NVRw\nEtJx8PHxQVZWFrKysrC0tAQA1NfXIzMzk7OWp4+PD9LT0yEvL881ylNZWfm7U1Pt7e15Um4Cf091\nj46ORmxsLH1+IaQdsLKywpo1a5CVlQU5OTmm47Qrz549w6FDh3Dv3j2moxBCSLtGBSfhiZiYGJia\nmjIdg1GVlZVcheb9+/ehqqoKY2NjbNu2DePGjUO3bt2YjtlkoqKi0NDQQHx8PE+KQXd3d/zxxx8Y\nNWoUzp8/36hjRo8ejdjYWCgoKABAs6Ym6urq4vLly5g5cyZ8fX2/uj5oc7HZbPj5+WHNmjWYMWMG\n9uzZg40bN0JbWxv379/H4MGDeXq9roIKTkI6NgEBASgpKUFJSQk//fQTgL/Xwnzw4AHu3r2LO3fu\nYN++fXjz5g3Gjh3LNb192LBhnNGV9+/fR3Z2Nk+zVVRUYN++fVRwEtIOiIqK4qeffsKxY8ewd+9e\npuO0G2w2G8uWLcOGDRsgKyvLdBxCCGnXaBd10mJsNhuysrKIiIjAyJEjmY7TZiorK5GQkMApNO/d\nu4cxY8bA2NgYLBYLenp6HbLQ/JJNmzYBAHbs2NGi83h4eGDlypVQUlJCeHg4+vXrx/W8lpYWkpOT\nkZycDA0Njc+OV1RUxJMnT2BqaoqrV682a43S27dvw8rKCgEBATA0NGz2e/lfL1++xLJly5CTkwMX\nFxecOXMGOTk5OHz4MM+L1K4mNjYW69atQ1xcHNNRCCGt6NOnT0hJSeFMb09KSkJVVRWn8Hz48CGC\ngoLQ0NDA0+sKCQmhrKwMwsLCPD0vIaTpsrKyMG7cOOTk5LT5xprtlY+PD9zc3JCcnEwbMBFCyHfw\nMx2AdHyvX79GbW1tp59OUlVVhaioKGzbtg1GRkbo3bs3Nm7ciLq6OmzatAkfPnxAXFwcdu7cCVNT\n005TbgKAkZER7ty506JzHDx4ECtXroSKigoiIyM/KzcBcEZoPn369LPn6urq8Pr1awgKCqJHjx4w\nNTVFYWFhk3MYGRnB19cXlpaWiI2Nbfob+Vemffv2QVNTE9ra2jAzM8PKlSsxadIkpKamUrnJAzSC\nk5CuoWfPnpgwYQI2bNiAS5cu4c2bN3j48CHs7e1RV1eH8PBwnpebwN+jxtLT03l+XkJI08nJyUFd\nXR3+/v5MR2kXioqK8Msvv8DLy4vKTUIIaQQqOEmL/bP+ZmdbpL+qqgq3b9/G9u3bwWKxICMjA2dn\nZ1RXV2Pjxo348OED4uPjsWvXLkycOLFT72g4btw43Lt3D1VVVc06fs+ePVi9ejXU1NQQGRmJ
Pn36\nfPF1/xSCoaGhnz13584dVFRUQE9PD35+ftDV1cX48ePx6tWrJucxMTHBhQsXMHPmTCQkJDT5eABI\nTk6GtrY2QkNDsWnTJpw8eRK5ublIS0vD6tWrObvEk5ahgpOQrmvAgAGwsLDAzp07W+0zBpvNpoKT\nkHZk2bJl8PT0ZDpGu7Bx40bMmDEDurq6TEchhJAOgQpO0mIxMTEwMDBgOkaLVVdX486dO/jtt99g\nbGwMGRkZ/Prrr6isrISzszPev3+PhIQE7N69G5MmTerUhea/de/eHcrKykhKSmrysb///jucnZ2h\noaGB8PBwyMjIfPW1lpaWkJGRga+vL5KTkzmPV1VVcabJOzg4gJ+fH/v27YO9vT3Gjx+P1NTUJuea\nOHEizpw5AwsLC65rfU9ZWRlWr14Nc3NzzJ49G7W1tTh79ix8fX1x9uzZL45MJc0nLCyMmpoapmMQ\nQhhWXV3dKuetr69HeXl5q5ybENJ0U6dORU5ODh48eMB0FEYlJCTg6tWr2L17N9NRCCGkw6Cx7qTF\noqOjsXTpUqZjNFl1dTWSkpIQFRWFyMhIJCUlQUlJCcbGxvj1118xfvx4SEpKMh2z3TA0NMTt27eb\ntG7l2bNnsWXLFggICMDAwADu7u6fvWbo0KGcjSckJSXh7e0NS0tLsFgszJkzB9LS0ggKCkJmZiYs\nLS1hbW3NOdbJyQkDBw6EqakpfHx8mrzR1ZQpU3DixAlMnToVoaGhGDt27DdfHxwcjBUrVmD8+PGw\nsLDAoUOHsH37dtja2kJAQKBJ1yaNQyM4CSHA3z8LWqPk5Ofnh4iICM/PSwhpHkFBQdja2sLT0xNe\nXl5Mx2FEbW0t7Ozs8Mcff6Bnz55MxyGEkA6DCk7SIgUFBcjJyYGqqirTUb6rpqbms0Jz1KhRYLFY\nWLt2LfT19anQ/AZDQ8MvFpTf8s+Ot/X19Th48OAXX2NkZMQpOAFgxowZuH37Nnbu3InAwEBUVVVB\nTk4O+/fvh6Oj42fTFC0tLdG3b19YWlpi3759WLBgQZMyTps2DZ6enpg8eTJu3bqFMWPGfPaa9+/f\nw9HREampqbC2tsb58+cxffp0ZGRkfHNEKmk5KjgJIQAwbNgwpKWl8fy89fX16MELenEAACAASURB\nVN69O9hsdqdbaoeQjmrJkiVQUlKCm5tbl/xsfujQIfTt2xdz5sxhOgohhHQotIs6aZSXL1/i0qVL\niIqKQlpaGiorKyEiIoJevXqhtLQU169fh7y8PNMxudTU1ODu3bucQjMxMREKCgpgsVgwNjaGvr4+\nevTowXTMDqOoqAiysrIoLCxsl+tLZmRkYMqUKbCzs4Ozs3OTb1T9/PywevVqhIWFQUlJCQDQ0NCA\nY8eOYcuWLZg+fToePXoEPj4+eHh4QFNTszXeBvmXN2/eQEdHB2/fvmU6CiGEQQ4ODjh27Bh4/bGV\nj4+Ps7SIgYEB54+KigqNzCeEQVZWVmCxWFi+fDnTUdrUq1evoKGhgYSEhE6/gSshhPAajeAk35SW\nlgZHR0ckJCSAzWZ/Nj3s9evXEBAQgKqqKlRVVXHo0CHo6OgwkrWmpgbJycmIiopCVFQUEhISMHLk\nSLBYLKxatQr6+vo0zaMFpKSkMHz4cNy7d4+xf8ffoqSkhLi4OEyePBk5OTk4fPhwk25Ora2tUVdX\nB1NTU4SHh6Ourg52dnaora2FsbExQkJCsGvXLixatAj8/LR8cVuhEZyEkIqKCnTv3h18fHw8LzgN\nDAwQFRWF7OxsREdHIzo6GocPH0Zubi709PQ4haempiZNZSekDTk4OMDR0RHLli3rMqOr2Ww2Vq5c\nCScnJyo3CSGkGWgEJ/mihoYG/Pbbb3Bzc0NVVVWjbyjExMRgZ2cHNze3Vh/lV1tby1VoxsfHQ05O\njjNC08DAgApNHlu5ciVkZWWxbt06pqN8VUlJCWbNmoVu
6emYNm0aoqKiUKNGDQwZMgSJiYmi84jUVuvWrTng1FHbt29Ht27dsHbtWgwcOFB0DpUA\nb1InIlI/HHASkUZycHBAcXExD3hXY5GRkTA1NYWnp6folDcKCgrCsmXL8PDhQ9EpGk1PTw/t27fH\nzp07kZKSgho1asDHxwcNGzbE8uXL8eTJE9GJRGqlRYsWOHHiBPLy8kSnkAotWLAA/v7+OHDgANq2\nbSs6h0pIJpPh/PnzojOIiOgvOOAkIo0kkUh4DqeaCwkJwbhx49R2+52dnR06d+6MRYsWiU7RGtWq\nVcOkSZOQlpaGWbNm4cCBA7CxscHgwYPx+++/c8U1EYDy5cujTp06iI+PF51CKvDy5Uv4+/tj5cqV\nSEhIQL169UQnUSl4tYKTf68REakPDjiJSGNxwKm+Tp8+jfT0dPj4+IhOeauJEyciLCyMqwxLmVQq\nRdu2bbFjxw5cuHAB9vb26NWrFxo0aIClS5fi8ePHohOJhOI2dd2Ql5eH7t27IykpCceOHUONGjVE\nJ1EpqVKlCoqLi3Hnzh3RKURE9CcOOIlIY3HAqb5CQkIwZswYtb8ZtmbNmmjbti2WLFkiOkVrVa1a\nFUFBQbhy5Qp++OEHHDlyBLa2thg0aBBOnjzJ1S+kk7y9vREdHS06g5QoJycHrVq1grGxMfbv3w9z\nc3PRSVSKJBIJz+EkIlIzEgW/syAiDVVcXIxKlSohOTkZ1atXF51Df8rIyECDBg1w7do1lCtXTnTO\nO128eBEtWrRAWloaTE1NRefohDt37mDt2rUIDw+HsbEx/Pz80LdvXw4ASGcUFhbC0tISaWlpsLS0\nFJ1DpSwtLQ3t27eHj48PZsyYAamUa0q00ahRo2Bvb4+xY8eKTiEiInAFJxFpMKlUCg8PD8TFxYlO\nob9YuHAhBgwYoBHDTQBwdnaGp6cnli1bJjpFZ3zyySf45ptvcPnyZfz44484duwYbG1t4evri4SE\nBK7qJK2nr68PuVyOQ4cOiU6hUnby5Ek0b94cgYGBmDVrFoebWowrOImI1Av/xiUijebp6clt6mrk\n8ePHWLNmDUaPHi065YNMnjwZISEheP78uegUnSKVSuHl5YXNmzfjypUrcHV1ha+vL2QyGRYtWsQb\n7kmrcZu69tm9ezc6deqEiIgIDBkyRHQOKZmrqysHnEREaoQDTiLSaDyHU71ERESgffv2GneRQp06\nddC0aVNERESITtFZlSpVwrhx45CamoqwsDCcOHECdnZ26N+/P44dO8ZVnaR1vL29cfDgQf6/rSWW\nLFmCYcOGYd++fejUqZPoHFIBV1dXXLhwAS9fvhSdQkRE4BmcRKThioqKYGFhwXPM1EBhYSHs7e2x\ne/duNGjQQHTOB0tMTESXLl1w9epVGBoais4h/N8lHevWrUN4eDikUin8/PzQr18/WFhYiE4jKjGF\nQgErKyscPXoUNWvWFJ1DH6m4uBjffvst9uzZg3379sHOzk50EqmQra0toqOj4ejoKDqFiEjncQUn\nEWm0MmXKwM3NjedwqoGtW7fC0dFRI4ebANCwYUPUq1cPq1evFp1Cf7K0tERAQAAuXryIZcuW4fTp\n03BwcEDfvn0RGxvLlW+k0SQSCbepa7j8/Hz06dMHx48fR3x8PIebOojncBIRqQ8OOIlI43GbungK\nhQIhISEYN26c6JQSCQ4Oxpw5c1BQUCA6hf5CIpFALpdjw4YNSE9PR+PGjTF06FA4OzsjNDQUOTk5\nohOJPsqrbeqkeR48eIC2bduiuLgYBw8e5MpyHcVzOImI1AcHnESk8TjgFO/IkSPIy8tD+/btRaeU\nSJMmTeDk5IR169aJTqH/ULFiRYwZMwYpKSlYuXIlkpKS4OjoiN69e+PIkSNc1UkapVWrVjhy5AiK\niopEp9AHuH79Otzd3dG4cWNs3ryZx5roMK7gJCJSHxxwEpHGa9SoEVJTU/
H48WPRKTorJCQEAQEB\nkEo1/6+VKVOmYPbs2Rw4qDmJRAJ3d3esXbsW165dg5ubG0aNGgUnJyf88MMPuHfvnuhEoneqUqUK\nrK2tkZiYKDqF3lNiYiLc3d0xfPhwzJ8/Xyv+3qOPJ5PJcP78edEZREQEDjiJSAsYGBigcePGSEhI\nEJ2iky5evIjExET069dPdEqpaN68OWrUqIGNGzeKTqH3VKFCBYwaNQrnzp3D2rVrceHCBdSsWRM9\ne/bEoUOHUFxcLDqR6D9xm7rmiIqKQrt27RAWFoZRo0aJziE14OTkhOvXryM/P190ChGRzuOAk4i0\ngqenJ7epCxIaGorhw4dr1Ra94OBgzJw5Ey9fvhSdQh9AIpGgWbNmWL16Na5fvw65XI6xY8fi008/\nxdy5c3Hnzh3RiUT/0rp1aw44NUBERAQGDhyIPXv2oFu3bqJzSE2ULVsWDg4OuHjxougUIiKdxwEn\nEWkFnsMpxp07d7B9+3YMGzZMdEqpatmyJSwtLbF161bRKfSRzM3NMWLECCQlJeHnn3/G5cuXUatW\nLXTv3h0HDx7kqk5SG3K5HGfOnMHTp09Fp9AbKBQKTJ48GXPnzkVcXByaNWsmOonUDM/hJCJSDxxw\nEpFWaNq0KZKSkvD8+XPRKTplyZIl6NmzJypVqiQ6pVRJJBJMmTIFM2bM4CBMw0kkEjRp0gQrV67E\n9evX4eXlhW+++QaOjo6YPXs2srOzRSeSjjMxMUGjRo34Qzo1VFBQgP79+yM6OhrHjx9HzZo1RSeR\nGuI5nERE6oEDTiLSCsbGxqhTpw5OnDghOkVnPH/+HMuWLcPYsWNFpyhFmzZtYGJigl9++UV0CpWS\n8uXLY9iwYThz5gy2bt2Ka9euwdnZGV9++SV+++03DrNJGG5TVz+PHz9G+/btkZubi8OHD2vdD/Ko\n9HAFJxGReuCAk4i0Brepq9batWvRrFkzODk5iU5RColEguDgYMyYMQMKhUJ0DpUiiUSCRo0aITw8\nHJmZmWjbti0mTpwIe3t7zJw5E7du3RKdSDrG29sb0dHRojPoT1lZWWjevDlq166NHTt2wNjYWHQS\nqTFXV1cOOImI1AAHnESkNTjgVJ3i4mIsWLAAgYGBolOUqlOnTpBIJPj1119Fp5CSmJmZwc/PD4mJ\nidixYweysrLg6uqKbt26Yd++fbxoilSiYcOGuHnzJm7fvi06ReclJSXBzc0Nvr6+WLRoEfT09EQn\nkZqzsbHBkydP8PDhQ9EpREQ6jQNOItIa7u7uOHXqFAoKCkSnaL1ff/0V5ubm8PDwEJ2iVK9WcU6b\nNo2rOHVAw4YNsWzZMmRmZqJjx4747rvvYG9vj2nTpuHGjRui80iL6enpoWXLllzFKdjBgwfh7e2N\nkJAQBAYGQiKRiE4iDSCVSuHi4sJzOImIBOOAk4i0Rvny5fHpp5/i9OnTolO03vz583Xmm7+uXbvi\nxYsX2L9/v+gUUhFTU1N8/fXXOHXqFHbt2oXs7GzUqVMHn3/+Ofbu3ctVnaQU3KYu1po1a9C3b1/s\n2LEDPXr0EJ1DGobncBIRiccBJxFpFU9PT25TV7JTp04hKysLX375pegUlZBKpZg8eTJXceqo+vXr\n46effkJWVha6du2KGTNmwNbWFlOnTkVWVpboPNIi3t7eOHjwIP+cUTGFQoFp06bh+++/x9GjR7V+\nZwIpB8/hJCISjwNOItIqPIdT+UJCQuDv748yZcqITlEZHx8fPHz4EIcOHRKdQoKYmJhg4MCBOHHi\nBPbu3YucnBzUrVsXnTp1wp49e1BUVCQ6kTScvb09DAwMcOHCBdEpOqOwsBBff/019uzZg+PHj8PZ\n2Vl0EmkoruAkIhJPouCPiYlIi9y7dw+Ojo64f/++Tg3gVOXatWto1KgRrl+/DjMzM9E5KrV+/Xqs\nWLECMTExolNITTx79gzbtm1DeHg4Mj
IyMGjQIAwaNAg2Njai00hD+fn5wcXFBWPGjBGdovVyc3PR\nvXt36OnpYcuWLTA1NRWdRBosJycHjo6OePjwoU4c30NEpI64gpOItEqlSpVgZWWFpKQk0SlaaeHC\nhRg0aJDODTcBoHfv3rh58yYHnPSaiYkJfH19kZCQgP379+PRo0do0KABOnTogF27dqGwsFB0ImmY\nV9vUSblu3boFuVwOGxsb7N69m8NNKjFLS0sYGRnxQjoiIoE44CQircNt6srx8OFDrFu3DqNHjxad\nIkSZMmUwceJETJ8+XXQKqSGZTIZFixbhxo0b6N27N0JCQmBjY4PJkyfj2rVrovNIQ3h5eSEuLg4F\nBQWiU7RWSkoK3Nzc0KNHDyxbtoy7PajU8BxOIiKxOOAkIq3DAadyhIeHo2PHjrCyshKdIky/fv1w\n9epVHD9+XHQKqSkjIyP069cPcXFxiI6OxrNnz9C4cWO0a9cOO3bs4KpOeisLCwvUrFkTJ0+eFJ2i\nlY4ePQovLy/MmDEDQUFB3EpMpYrncBIRicUBJxFpHblcjri4OBQXF4tO0RoFBQVYvHgxAgMDRacI\npa+vj6CgIK7ipPdSu3ZtLFiwADdu3EC/fv2waNEiWFtbIygoCGlpaaLzSE1xm7pybNy4ET169MCm\nTZvQt29f0TmkhWQyGc6fPy86g4hIZ3HASURap3r16jA3N8fFixdFp2iNLVu2oFatWqhXr57oFOF8\nfX1x7tw5nD59WnQKaQhDQ0N89dVXiImJwZEjR1BQUICmTZvC29sb27Zt43Zk+pvWrVtzwFmKFAoF\n5syZg6CgIBw+fBheXl6ik0hLcQUnEZFYvEWdiLTSwIED0bhxYwwbNkx0isZTKBSoX78+Zs+ejfbt\n24vOUQuLFy9GdHQ0du/eLTqFNFR+fj527tyJ8PBwXLhwAb6+vhg8eDAcHR1Fp5Fg+fn5qFSpEm7c\nuIHy5cuLztFoRUVFGDVqFBISEhAVFYXq1auLTiIt9vz5c1hYWODJkyfQ19cXnUNEpHO4gpOItBLP\n4Sw9hw4dQmFhIdq1ayc6RW18/fXX+P3335GUlCQ6hTSUoaEhevfujSNHjiA2NhbFxcVwc3NDq1at\nsGXLFrx48UJ0IgliaGgINzc3HDlyRHSKRnv27Bm6deuGtLQ0xMXFcbhJSmdsbAwrKytcuXJFdAoR\nkU7igJOItJJcLkdMTAy4SL3kQkJCEBAQwMsY/sLIyAjjxo3DjBkzRKeQFnBycsIPP/yArKws+Pn5\nITw8HNbW1hg/fjwuX74sOo8E4Db1krlz5w5atGiBSpUqITIyEuXKlROdRDqC53ASEYnDAScRaSU7\nOztIpVJe5FFC58+fx9mzZ/HVV1+JTlE7Q4YMQWxsLFJSUkSnkJYwMDBAz549cejQISQkJEBPTw9y\nuRwtW7bEpk2bkJ+fLzqRVMTb2xvR0dGiMzRSamoqmjVrhk6dOmHlypXcKkwqxXM4iYjE4YCTiLSS\nRCLhNvVSEBoaihEjRsDQ0FB0itoxMTHB2LFjMXPmTNEppIUcHR0xZ84cZGZmYsSIEVi1ahWsra0R\nGBiIS5cuic4jJatTpw4ePnyIzMxM0SkaJT4+Hp6enggODsZ3333HnQekcq6urhxwEhEJwgEnEWkt\nDjhLJjs7Gzt37uRFTW8xYsQIREdHIzU1VXQKaamyZcvCx8cHBw8exIkTJ1C2bFm0bNkScrkcP//8\nM1d1aimpVIpWrVpxm/oH2L59O7p164Z169ZhwIABonNIR3EFJxGROBxwEpHW4oCzZMLCwtCnTx9Y\nWFiITlFbZmZmGDVqFGbNmiU6hXSAg4MDZs+ejczMTPj7+2P9+vWwsrLC2LFjceHCBdF5VMq4Tf39\nLViwAP7+/jhw4ADatGkjOod0mKOjI27fvo1nz56JTiEi0jkSBW/gICItpVAoULlyZZw5cwbW1tai\ncz
TKs2fPYGtri+PHj8PR0VF0jlp79OgRHB0dcerUKdjb24vOIR1z7do1rFy5EqtWrYK9vT38/PzQ\nvXt3GBkZiU6jEsrMzESjRo2QnZ0NqZRrEt7k5cuXCAwMRHR0NKKiolCjRg3RSURo0KABli1bhs8+\n+0x0ChGRTuG/lohIa706hzMuLk50isZZs2YNmjdvzuHmezA3N8fw4cMxe/Zs0Smkg+zs7DBjxgxk\nZGRg3Lhx2Lx5M6ysrDB69Gje5KvhatSogQoVKiApKUl0ilrKy8tD9+7dkZSUhGPHjnG4SWqD53AS\nEYnBAScRaTVuU/9wL1++xIIFCzBu3DjRKRrD398fv/zyCzIyMkSnkI7S19dH165dERUVhTNnzsDc\n3Bxt27aFm5sb1qxZg+fPn4tOpI/AbepvlpOTAy8vLxgbG2P//v0wNzcXnUT0Gs/hJCISgwNOItJq\ncrkcMTExojM0yu7du2FpaQk3NzfRKRqjYsWKGDx4MObOnSs6hQg2NjaYNm0aMjIy8O2332L79u2w\ntrbGyJEjuRpQw3h7e/OioX+4evUq3Nzc0LJlS6xbtw4GBgaik4j+RiaTcQU9EZEAPIOTiLTay5cv\nYWFhgcuXL6Ny5cqiczSCu7s7/P390b17d9EpGuXu3buoVasWzp07h+rVq4vOIfqbzMxMrFq1CitX\nrkT16tXh5+eHnj17wsTERHQavcXjx49hZWWFe/fuwdDQUHSOcCdPnkTXrl0xdepUDBkyRHQO0Rvd\nvHkTDRo0wJ07d0SnEBHpFK7gJCKtpqenB3d3d57D+Z6OHz+O27dvo1u3bqJTNE7lypUxYMAAzJs3\nT3QK0b/UqFEDU6dOxbVr1zB58mTs2rUL1tbWGD58OM6ePSs6j/5D+fLlIZPJEB8fLzpFuN27d6Nz\n585YsWIFh5uk1qpVq4aCggLcvXtXdAoRkU7hgJOItB7P4Xx/ISEh8Pf3R5kyZUSnaKRx48Zh/fr1\nyM7OFp1C9EZlypRBp06dsGfPHiQnJ6Nq1aro0qULPvvsM6xYsQJPnz4VnUj/wG3qwJIlSzBs2DBE\nRUWhY8eOonOI3koikfAcTiIiATjgJCKtxwHn+0lPT8fRo0cxcOBA0Skaq2rVqujbty9CQkJEpxC9\nk5WVFYKDg5Geno7vv/8ekZGRsLa2xpAhQ5CYmCg6j/7UunVrnR1wFhcX45tvvsHixYsRHx+PRo0a\niU4iei88h5OISPV4BicRab2CggJYWFggKyuLN62+xejRo2FiYoLZs2eLTtFoN27cQJ06dZCamopK\nlSqJziH6ILdu3cLq1asREREBCwsL+Pn5oXfv3ihXrpzoNJ1VWFgIS0tLpKWlwdLSUnSOyuTn58PX\n1xc3b97Erl27YGFhITqJ6L0tW7YMp0+fxooVK0SnEBHpDK7gJCKtV7ZsWTRp0oRnmL3FgwcPsGHD\nBowaNUp0isazsrJCjx49sGDBAtEpRB+sWrVqmDRpEtLS0jBr1iwcOHAANjY2GDx4MH7//Xfw5+Kq\np6+vD7lcjsOHD4tOUZkHDx6gbdu2KC4uxsGDBzncJI3j6urKLepERCrGAScR6QRuU3+75cuXo3Pn\nzqhWrZroFK3w7bffYvny5Xjw4IHoFKKPoqenh7Zt22LHjh24cOEC7O3t0bNnTzRo0ABLly7F48eP\nRSfqFF3apn79+nW4u7ujcePG2Lx5M2+PJ43k6uqKlJQUFBcXi04hItIZHHASkU6Qy+WIiYkRnaGW\nCgoKEBYWhsDAQNEpWsPW1hZdu3bFokWLRKcQlVjVqlURFBSEq1evYt68eTh8+DBsbW0xaNAgnDx5\nkqs6VeDVRUPa/t86MTER7u7uGD58OObPnw+plN+qkGYyNzdHxYoVcf36ddEpREQ6g/9qICKd0KRJ\nE5w7d443BL/Bpk2b4OLigjp16ohO0SpBQUEICwvjSjfSGlKpFN7e
3ti2bRsuXbqETz/9FF999RXq\n1auHJUuW4NGjR6ITtZazszMKCwuRlpYmOkVpoqKioWIe2wAAIABJREFU0K5dO4SFhfG4FNIKvEmd\niEi1OOAkIp1gZGSE+vXr48SJE6JT1IpCoUBISAhXbyqBo6Mj2rdvj7CwMNEpRKXuk08+wYQJE3D5\n8mWEhoYiNjYWtra2GDBgAI4fP671Kw1VTSKRvNc29UOHDqFbt26oUqUKDAwMUK1aNbRt2xZRUVEq\nKv04ERERGDRoEH799Vd069ZNdA5RqeA5nEREqsUBJxHpDJ7D+W+vtjy2adNGdIpWmjRpEhYuXIjc\n3FzRKURKIZVK0apVK2zZsgWXL19G7dq10b9/f8hkMixatAgPHz4Unag1vL29ER0d/Z+//80336B1\n69Y4ffo0Pv/8cwQGBqJjx464d+8ejh49qrrQD6BQKDB58mTMmzcPcXFxaNq0qegkolLDFZxERKol\nUfBH7ESkI/bv3485c+ao7Td6IrRt2xa9e/eGr6+v6BSt1atXLzRo0ADffPON6BQilVAoFIiJiUF4\neDiioqLw+eefw8/PD+7u7pBIJKLzNFZ2djZq166Ne/fuQU9P72+/FxERAT8/P/zvf/9DeHg4ypYt\n+7ffLywshL6+vipz36mgoACDBg3ClStX8Ouvv6JSpUqik4hKVVJSEvr06YOUlBTRKUREOoEDTiLS\nGbm5uahatSru378PAwMD0TnCJScno127drh27Rr/eyjRuXPn4O3tjfT0dBgbG4vOIVKpnJwcrFu3\nDuHh4ZBKpfDz80P//v1RsWJF0WkaSSaTYcWKFWjSpMnrX3vx4gWsra1hZGSEK1eu/Gu4qY4eP36M\nL774AmZmZti4cSP/bCSt9OLFC5ibm+PRo0f8dxYRkQpwizoR6QwzMzM4Ozvj999/F52iFkJDQzFy\n5Ej+o1vJZDIZ3N3dER4eLjqFSOUsLS0REBCAixcvYtmyZTh9+jTs7e3Rt29fxMbG8qzOD/SmbeoH\nDx7EvXv38MUXX0AqlSIyMhJz587FwoULcfz4cUGl/y0rKwvNmzdH7dq1sWPHDg43SWsZGBjA3t4e\nly5dEp1CRKQTOOAkIp3Cczj/z61bt7Bnzx4MHTpUdIpOmDx5Mn744Qfk5+eLTiESQiKRQC6XY8OG\nDUhLS0OjRo0wdOhQODs7IzQ0FDk5OaITNYK3t/e/Lhp69UM7Q0ND1K9fH506dcK3334Lf39/uLm5\nwdPTE/fu3ROR+y9JSUlwc3ODr68vFi1a9K+t9kTahhcNERGpDgecRKRT5HI5YmJiRGcIt3jxYnz1\n1VfcJqoi9evXR4MGDbBy5UrRKUTCWVhYwN/fHykpKVixYgXOnj0LR0dH9OnTB0ePHuWqzreQy+VI\nTEzE06dPX//a3bt3AQA//PADJBIJ4uLikJubi+TkZLRp0waxsbHo3r27qOTXDh48CG9vb4SEhCAw\nMJDnsZJOkMlkOH/+vOgMIiKdwAEnEemU5s2b4/jx4ygqKhKdIszTp08REREBf39/0Sk6JTg4GHPn\nzsWLFy9EpxCpBYlEgubNm2PdunVIT09H06ZNMXLkSDg5OWH+/Plqs+pQnZiYmKBhw4aIi4t7/WvF\nxcUAgDJlymDPnj1o3rw5TE1NIZPJsHPnTlhZWSEmJkbodvU1a9agX79+2LFjB3r06CGsg0jVeJM6\nEZHqcMBJRDrFwsICNjY2+OOPP0SnCLN69Wq0aNECDg4OolN0ymeffYbatWtj7dq1olOI1E7FihUx\nevRonDt3DmvWrMH58+dRs2ZN9OrVC4cPH349xKN/b1M3NzcH8H8rxW1tbf/2scbGxmjbti0A4NSp\nUyprfEWhUGDatGmYNm0ajh49Cg8PD5U3EInEAScRkepwwElEOkeXz+F8+fIlFixYgMDAQNEpOik4\nOBizZ89GYWGh6BQitSSRSODm
5oY1a9bg+vXr8PDwgL+/P5ycnDBv3rzX27F1WevWrf824HRycgLw\n/wed/1ShQgUAQF5envLj/qKwsBBff/019uzZg4SEBNSqVUul7ydSB7a2tnjw4AEePXokOoWISOtx\nwElEOkeXB5w7d+5ElSpV0KxZM9EpOsnd3R329vb4+eefRacQqT1zc3OMGDECSUlJ2LBhA1JTU+Hk\n5ITu3bvj4MGDOruqs1GjRrh58yays7MBAK1atYJEIsGFCxfe+N/k1fl/dnZ2KmvMzc1F586dkZ2d\njaNHj6JKlSoqezeROpFKpXBxcUFKSoroFCIirccBJxHpHLlcjri4OJ385jgkJATjxo0TnaHTgoOD\nMXPmTJ0+B5boQ0gkEjRp0gQrV67E9evX4eXlhfHjx8PR0RGzZ89+PejTFXp6emjZsiWio6MBADY2\nNujcuTMyMzOxcOHCv33sgQMH8Ntvv8Hc3Bzt2rVTSd+tW7cgl8thY2OD3bt3w9TUVCXvJVJX3KZO\nRKQaHHASkc6pWrUqLC0tde6n6QkJCbh37x66dOkiOkWneXp6okqVKtiyZYvoFCKNU758eQwbNgx/\n/PEHtmzZgvT0dDg7O+PLL7/Eb7/9pjM/uPrnNvUlS5bA2toaAQEBaN26NcaPHw8fHx906NABenp6\nWLFiBcqXL6/0rpSUFLi5uaFHjx5YtmwZypQpo/R3Eqk7DjiJiFSDA04i0km6uE19/vz58Pf3h56e\nnugUnSaRSF6v4tSVYQxRaZNIJGjcuDEiIiKQkZGBNm3aICgoCA4ODpg5cyZu3bolOlGpXl00pFAo\nAABWVlZITEzEyJEjceXKFSxcuBBHjx5F586dER8fjy+//FLpTUeOHIGXlxdmzJiBoKAgSCQSpb+T\nSBO4urpywElEpAISxat/GRER6ZB169Zh79692Lp1q+gUlbh69SqaNWuG69evw8TERHSOzlMoFGjW\nrBkCAwPRvXt30TlEWiMxMRHh4eHYunUrWrRoAT8/P7Rp00brfrCjUChgb2+PyMhI1K5dW3QONm7c\nCH9/f2zevBleXl6ic4jUyt27d1GrVi3cv3+fg38iIiXiCk4i0kmvVnDqys94fvzxR/j5+XG4qSZe\nreKcPn06V3ESlaKGDRti+fLlyMzMRIcOHTBlyhTY29tj+vTpuHnzpui8UiORSP61TV0EhUKBOXPm\nICgoCIcPH+Zwk+gNKleuDH19fa1fWU5EJBoHnESkk2xsbFC2bFlcuXJFdIrS3b9/Hz///DNGjhwp\nOoX+okOHDtDX18eePXtEpxBpHTMzMwwePBi///47du3ahdu3b0Mmk6FLly6IjIzEy5cvRSeW2Ktt\n6qIUFRVh+PDh2Lx5MxISEuDq6iqshUjd8RxOIiLl44CTiHSSRCLRmXM4ly1bhm7duqFq1aqiU+gv\n/rqKU1dWEhOJUL9+ffz000/IzMxEly5dMG3aNNja2mLq1KnIysoSnffRvLy8EBcXh8LCQpW/+9mz\nZ+jWrRvS0tIQGxuL6tWrq7yBSJPwHE4iIuXjgJOIdJYuDDhfvHiBsLAwBAQEiE6hN/j8889RWFiI\nqKgo0SlEWs/U1BQDBw7EyZMnsXfvXuTk5KBu3bro1KkT9uzZg6KiItGJH8TS0hKOjo44ceKESt97\n584dtGjRApUqVUJkZCTKlSun0vcTaSKZTIbz58+LziAi0moccBKRzvL09NT6AefPP/+MunXrcuug\nmpJKpZg8eTJXcRKpWN26dREWFoasrCz4+Phgzpw5sLW1xZQpU5CRkSE6772pept6amoqmjVrhk6d\nOmHlypXQ19dX2buJNBm3qBMRKR8HnESksz799FPk5eVp1DezH0KhUCA0NBTjxo0TnUJv8eWXX+LJ\nkyeIjo4WnUKkc0xMTODr64uEhATs27cPjx49QoMGDdChQwfs2rVLyPbvD+Ht7a2yPzuOHTsGT0
9P\nBAcH47vvvuNt0EQfwMXFBZcuXdK4leJERJqEA04i0lmvzuGMi4sTnaIUv/32G/T09NCqVSvRKfQW\nenp6mDRpEqZNm8ZVnEQCyWQyLFq0CFlZWejVqxfmz58PGxsbTJ48GdevXxed90bu7u44d+4cHj9+\nrNT3bNu2DV988QXWrVuHAQMGKPVdRNrIxMQEVatWxdWrV0WnEBFpLQ44iUinyeVyxMTEiM5Qivnz\n5yMwMJCrbDRAz549kZ2drbX/LxJpEmNjY/Tv3x/Hjh3DwYMH8fTpUzRq1Ajt2rXDL7/8olarOg0N\nDdGsWTMcOXJEKc9/tRNg7NixOHDgANq0aaOU9xDpAm5TJyJSLg44iUinaetFQ2fPnsXFixfRq1cv\n0Sn0HsqUKYNJkyZh+vTpolOI6C9cXFzw448/IisrC3379sWPP/6IGjVqYOLEiUhPTxedB0B529Rf\nvnwJf39/rFq1CgkJCahXr16pv4NIl/CiISIi5eKAk4h0mqurK+7evYvs7GzRKaUqNDQUo0ePRtmy\nZUWn0Hv66quvcO3aNcTHx4tOIaJ/MDIyQt++fREbG4vDhw8jPz8fTZo0QZs2bbBt2zYUFBQIa2vd\nunWpXzSUl5eH7t2749y5czh27Bhq1KhRqs8n0kVcwUlEpFwccBKRTtPT00Pz5s216hzOGzduYO/e\nvfDz8xOdQh9AX18f3377LVdxEqk5Z2dnhIaGIisrC76+vvjpp59gbW2NCRMmCDlfr27dunj48CEy\nMzNL5Xk5OTnw8vKCsbEx9u3bB3Nz81J5LpGuc3V15YCTiEiJOOAkIp2nbdvUFy9ejH79+qFChQqi\nU+gD/e9//8OFCxdw6tQp0SlE9A6Ghobo06cPjhw5gtjYWBQXF8PNzQ2tW7fGli1b8OLFC5V0SKVS\ntGrVqlS2qV+9ehVubm5o2bIl1q9fDwMDg1IoJCIAqFmzJm7evIlnz56JTiEi0koccBKRzvP09NSa\nAWdubi5WrlwJf39/0Sn0EQwMDDBhwgSu4iTSME5OTvjhhx+QlZWFwYMHIzw8HNbW1hg/fjwuX76s\n9Pd7e3uXeJv6yZMn4eHhgcDAQMyaNYsX1BGVMn19fXz66ae4ePGi6BQiIq3EAScR6bz69evj2rVr\nePDggeiUElu1ahW8vLxgZ2cnOoU+0qBBg3DmzBn88ccfolOI6AMZGBigZ8+eOHToEOLj4yGVSuHh\n4YGWLVti06ZNSlvV2bhxY0RGRmLUqFFo3rw5GjZsCA8PDwQEBGDr1q148uTJWz9/9+7d6Ny5M1as\nWIEhQ4YopZGIeA4nEZEySRQKhUJ0BBGRaG3atMGoUaPQuXNn0SkfraioCI6OjtiyZQuaNGkiOodK\nYMGCBTh27Bh27NghOoWISqigoAC7d+9GeHg4zp49i/79+2Pw4MGoVatWiZ997do1TJo0CTt37nw9\nPP3rP+0lEglMTU1RVFSEXr16Yfr06ahevfrfnhEWFoZZs2Zhz549aNSoUYmbiOi/zZ07F3fu3EFo\naKjoFCIircMVnERE+L9zOGNiYkRnlMgvv/wCa2trDje1wJAhQxAfH89VHkRaoGzZsujevTsOHjyI\nEydOoGzZsmjRogU8PT3x888/Iz8//4OfqVAosHjxYri6umLr1q3Iz8+HQqHAP9ctKBQK5ObmIi8v\nD+vXr0etWrWwatUqKBQKFBcXY/z48QgLC0N8fDyHm0QqwBWcRETKwxWcREQAYmNjMW7cOI293EWh\nUKBJkyaYOHEiunbtKjqHSsG8efNw5swZbN68WXQKEZWygoIC/PrrrwgPD0diYiL69euHwYMHo3bt\n2u/83OLiYgwaNAhbt27F8+fPP/jdxsbGGDhwIO7evYtbt25h165dsLCw+Jgvg4g+UFZWFj777DPc\nvn1bdAoRkdbhgJOICEB+fj4sLCyQnZ0NMzMz0TkfLC4uDg
MHDsSlS5egp6cnOodKQW5uLhwcHBAb\nG1sqW1mJSD1du3YNK1aswOrVq+Hg4AA/Pz/4+PjAyMjojR/v7++PiIiIjxpuviKVSuHs7IzTp0/D\n0NDwo59DRB9GoVCgQoUKuHr1KiwtLUXnEBFpFW5RJyICYGhoiIYNG+L48eOiUz5KSEgIAgICONzU\nImZmZhgzZgxmzZolOoWIlMjOzg4zZ85ERkYGAgMDsXHjRlhbW2PMmDE4f/783z42JiamxMNN4P9W\ngaanp+PChQsleg4RfRiJRAJXV1duUyciUgIOOImI/iSXyxEbGys644NdvnwZCQkJ+N///ic6hUrZ\nyJEjERUVhatXr4pOISIl09fXR9euXbFv3z6cPn0a5cqVQ9u2beHm5oY1a9YgNzcXffr0KfFw85W8\nvDz07t37X+d2EpFy8RxOIiLl4ICTiOhPnp6eGjngXLBgAYYMGQJjY2PRKVTKypcvjxEjRmD27Nmi\nU4hIhWxtbTF9+nRkZGTg22+/xfbt21GtWjXcu3evVN9z8+ZNHDt2rFSfSURvJ5PJ/rU6m4iISo5n\ncBIR/enp06eoUqUKcnJyNOZMspycHNSsWROXLl3CJ598IjqHlODBgweoWbMmEhMTYWtrKzqHiARx\nc3Mr9WNUJBIJvvjiC2zfvr1Un0tE/y0uLg7ffPONxh6LRESkrriCk4joT6ampnBxcdGom9SXLl2K\nL7/8ksNNLVaxYkUMGTIEc+bMEZ1CRIIoFAokJycr5blcwUmkWq6urkhJSUFxcbHoFCIircIBJxHR\nX8jlcsTExIjOeC/5+flYsmQJAgICRKeQko0dOxZbt27FjRs3RKcQkQBZWVlKG4Y8ePAAjx8/Vsqz\niejfKlSogHLlyiEjI0N0ChGRVuGAk4joLzTpoqENGzagYcOGqF27tugUUrJKlSph0KBBmDdvnugU\nIhLg3r170NfXV8qzDQwMkJOTo5RnE9Gb8RxOIqLSxwEnEdFfNG/eHCdOnEBhYaHolLcqLi5GaGgo\nAgMDRaeQigQGBmLDhg24ffu26BQiUjGJRKLRzyeiv+NN6kREpY8DTiKiv6hQoQLs7e1x5swZ0Slv\ntW/fPhgYGKBly5aiU0hFqlSpgn79+mH+/PmiU4hIxapVq4YXL14o5dn5+fmoXLmyUp5NRG/m6urK\nAScRUSnjgJOI6B80YZt6SEgIxo0bx1U3Ouabb77B6tWrcffuXdEpRKRCVapUgZGRkdKebWpqqpRn\nE9GbcQUnEVHp44CTiOgfPD091XrA+ccff+DKlSvo0aOH6BRSserVq6NXr14IDQ0VnUJEKubp6Vnq\nP9TS09ND69atS/WZRPRuzs7OSEtLQ0FBgegUIiKtwQEnEdE/eHh44NixY3j58qXolDcKCQnB6NGj\nlXbhBKm3CRMmICIiAvfv3xedQkQqNHbsWJiYmJTqM4uLi2FjY8MhC5GKGRoawtbWFqmpqaJTiIi0\nBgecRET/8Mknn+CTTz5Ry9sts7KyEBUVhcGDB4tOIUFsbGzQrVs3LFy4UHQKEamQXC5HtWrVSu15\nenp6cHZ2xvHjx+Hg4ICFCxfi2bNnpfZ8Ino7nsNJRFS6OOAkInoDuVyOmJgY0Rn/smjRIvj6+sLc\n3Fx0CgkUFBSEn376CY8ePRKdQkQqsmPHDty/fx9lypQplecZGBjg119/xW+//YZdu3YhLi4O9vb2\nmDFjBh4+fFgq7yCi/8ZzOImIShcHnEREb6COFw09efIEq1atwpgxY0SnkGAODg7o2LEjFi9eLDqF\niJQsJycHPXv2xKRJk/Drr79i3rx5MDY2LtEzjY2NsWTJEtjb2wMAGjZsiO3btyMmJgZpaWlwdHTE\nhAkTkJ2dXRpfAhG9gUwmU8vdQkREmooDTiKiN3g14FQoFKJTXluxYgW8vb1hY2MjOoXUwMSJE7Fo\n0SLk5uaKTiEiJfnll1
8gk8lgbW2Ns2fPolmzZhg7diyCgoI+eshpZGSEuXPnwtfX91+/V6tWLaxe\nvRp//PEH8vLyULt2bQwfPhzXrl0r4VdCRP/EFZxERKVLolCn796JiNSIra0t9u/fj1q1aolOQVFR\nERwcHLBjxw40atRIdA6piT59+qBu3bqYMGGC6BQiKkX379/HyJEjcfr0aaxZswbu7u7/+pioqCj0\n69cPz58/R35+/jufaWRkhPLly2PTpk1o0aLFe3XcvXsXCxcuxPLly9G+fXt8++23cHFx+dAvh4je\n4OXLlyhXrhxu376NcuXKic4hItJ4XMFJRPQf1Gmb+vbt22Fra8vhJv3NpEmTEBoayotBiLTIrl27\nIJPJUKVKFSQlJb1xuAkAHTp0QFpaGoKCgmBhYQEzMzMYGRn97WPKlCkDMzMzfPLJJ/juu+9w5cqV\n9x5uAkDlypUxc+ZMpKWlwcXFBa1atULXrl1x8uTJknyJRIT/u+irdu3a3KZORFRKuIKTiOg/rFy5\nEkeOHMGGDRuEdigUCjRu3BhTpkzB559/LrSF1I+Pjw/c3NwQEBAgOoWISuD+/fsYPXo0Tp48idWr\nV8PDw+O9P7eoqAinT59GYmIi/vjjDzx//hw5OTm4ffs2Vq1ahYYNG0IqLfm6hry8PKxatQo//PAD\nHBwcEBQUhFatWkEikZT42US6aODAgWjatCn8/PxEpxARaTwOOImI/sOVK1fg5eWFzMxMod+8xcTE\nwM/PDxcvXiyVb1BJu5w9e/b1Sq5/rt4iIs2wZ88eDBs2DD4+Ppg1axZMTExK/MyLFy+iS5cuuHz5\ncikU/l1hYSE2bdqEOXPmwNTUFEFBQejSpQv/jiL6QAsWLEB6ejovDSQiKgX8VwgR0X9wdHREUVER\nMjIyhHaEhIQgICCA3zjSG9WrVw+NGzfGihUrRKcQ0Qd6+PAh+vfvj7Fjx2Ljxo1YuHBhqQw3AcDe\n3h6ZmZkoLCwslef9lb6+Pvr374/z588jKCgIs2bNgqurK9atW6eU9xFpK1dXV140RERUSvjdMhHR\nf5BIJJDL5YiJiRHWcOnSJZw8eRL9+/cX1kDqLzg4GPPmzcOLFy9EpxDRe9q7dy9kMhnKly+P5ORk\neHp6lurzDQwMYGVlhfT09FJ97l9JpVJ069YNp06dwqJFi7B27VrUrFkTS5YsQV5entLeS6QtXt2k\nzk2VREQlxwEnEdFbiL5oaMGCBRg2bBi3HtNbNWrUCDKZDGvWrBGdQkTv8OjRI/j6+mL06NHYsGED\nFi9eXGqrNv/p008/VcoW9X+SSCRo3bo1Dh06hC1btuDgwYOws7PDnDlz8PjxY6W/n0hTffLJJ5BK\npcjOzhadQkSk8TjgJCJ6C5EDznv37mHbtm0YPny4kPeTZgkODsbs2bO5PZRIjUVFRUEmk8HExATJ\nyckfdKP5x1DVgPOvmjRpgl27diE6Ohrnz5+Hg4MDJk2ahLt376q0g0gTSCSS16s4iYioZDjgJCJ6\nCxcXF9y/fx+3bt1S+bt/+ukn+Pj4oHLlyip/N2meZs2awdHREevXrxedQkT/8OjRIwwcOBAjRozA\n2rVrsWTJEpiamir9vSIGnK+4urpiw4YNOHXqFB48eIBatWph9OjRyMzMFNJDpK54DicRUenggJOI\n6C2kUik8PDwQFxen0vfm5eXhp59+QkBAgErfS5ptypQpmDVrFoqKikSnENGf9u/fD5lMBgMDAyQn\nJ8PLy0tl73ZyckJqaqrK3vcm9vb2WLp0KVJSUmBkZIT69etjwIABuHTpktAuInXBFZxERKWDA04i\nonfw9PRU+Tb19evX47PPPkOtWrVU+l7SbHK5HNWrV8emTZtEpxDpvMePH+Prr7/G0KFDsXr1aixd\nuhRmZmYqbRC5gvOfqlatirlz5+Lq1auwt7eHp6cnfHx8kJiYKDqNSCiZTIbz58+LziAi
0ngccBIR\nvYOqz+EsLi5GaGgoAgMDVfZO0h7BwcGYOXMmXr58KTqFSGcdOHAAMpkMenp6SE5ORuvWrYV0VK9e\nHY8ePUJubq6Q979JhQoVEBwcjPT0dHh4eKBr165o27YtYmJieJM06SQXFxdcvHiRf28TEZUQB5xE\nRO9Qr149ZGZm4v79+yp5X2RkJExNTeHp6amS95F2adWqFSpUqIDt27eLTiHSOU+ePIGfnx8GDx6M\nFStWYPny5ShXrpywHqlUipo1a+LKlSvCGv6LiYkJxowZg7S0NPTs2RODBw+Gu7s79u7dy0En6RQz\nMzNUrlwZaWlpolOIiDQaB5xERO9QpkwZ2NnZoX///vDw8EC5cuUgkUjQt2/f//ycFy9eYMmSJfjs\ns89gaWkJU1NTODs7Y/To0cjIyHjr+0JCQhAYGAiJRFLaXwrpAIlEgilTpmD69OkoLi4WnUOkM6Kj\noyGTyaBQKJCcnIw2bdqITgKgHudwvk3ZsmUxcOBAXLx4Ef7+/ggODkbdunWxadMmnidMOoPncBIR\nlRwHnERE7+HOnTuIiorC2bNnUb169bd+bFFREVq1aoWRI0ciNzcXvXv3xtChQ1G5cmUsXrwYdevW\nxYULF974uadPn0Z6ejp8fHyU8WWQjmjXrh2MjIywa9cu0SlEWi83NxdDhw7FwIEDsXz5ckRERKB8\n+fKis15Tp3M430ZPTw89evTAmTNnMG/ePCxduhROTk4IDw/HixcvROcRKRXP4SQiKjkOOImI3kNQ\nUBBcXFzw5MkTLF269K0fu3PnTsTHx6NVq1ZISUnB4sWLMX/+fMTExGDKlCl4/Pgx5s+f/8bPDQkJ\ngb+/P/T19ZXxZZCOkEgkmDx5MmbMmMGtnkRKdOjQIchkMhQWFuLcuXNo166d6KR/0ZQB5ysSiQTt\n2rVDbGws1q5di927d8Pe3h4hISF4+vSp6DwipeAKTiKikuOAk4joPQwZMgTXr19/r4sa0tPTAQAd\nO3aEVPr3P2a7dOkCALh3796/Pi8zMxMHDhzA119/XQrFpOs+//xzFBcXIzIyUnQKkdZ5+vQphg8f\nDl9fX/z0009YuXKlWq3a/CtNG3D+VfPmzREZGYnIyEj8/vvvsLOzw9SpU1V2JjaRqri6unLASURU\nQhxwEhG9BwMDAzRq1AgJCQnv/FgXFxcAwL59+/51BuLevXsB4I036i5cuBADBgwQeiEFaY9Xqzin\nTZvGVZxEpejIkSOQyWTIy8vDuXPn0KFDB9Hq8WGpAAAgAElEQVRJb/Xpp58iNTVVo/8cqFevHjZv\n3oyEhATcvHkTNWvWRGBgIG7evCk6jahUODk5ITMzE3l5eaJTiIg0FgecRETvydPTE7Gxse/8uI4d\nO+KLL77AwYMHIZPJMGbMGIwfPx5eXl6YMWMGRo0ahREjRvztcx4/fozVq1dj9OjRysonHfTFF1/g\n2bNnOHDggOgUIo339OlTjBw5Ev369UNYWBhWr14Nc3Nz0VnvVLFiRRgYGODOnTuiU0qsZs2aiIiI\nQHJyMhQKBWQyGfz8/HD16lXRaUQloq+vj5o1a+LixYuiU4iINBYHnERE70kul7/XgFMikWD79u34\n7rvvkJqaikWLFmH+/Pk4cuQI5HI5+vTpgzJlyvztcyIiItC+fXvUqFFDWfmkg6RSKVdxEpWCmJgY\n1K1bF7m5uTh37hw6duwoOumDaPI29TexsrJCaGgoLl++jKpVq6JZs2bo3bs3kpKSRKcRfTSew0lE\nVDIccBIRvaemTZvi7Nmz77zNNT8/Hz179kRISAiWLFmC27dv4/Hjx4iKikJGRgbkcjl27979+uML\nCwuxcOFCBAYGKvtLIB3Uo0cP5OTk4MiRI6JTiDTOs2fPMGbMGPTp0wc//vgj1q5diwoVKojO+mDa\nNuB8xdLSEt9//z3S09PRsGFDtG/fHp06dUJ8fLzo
NKIPxnM4iYhKhgNOIqL3ZGJiAplMhgsXLrz1\n4+bMmYNt27Zh5syZGDJkCKpUqYJy5cqhffv22L59OwoLCzFmzJjXH79161Y4OjqiQYMGyv4SSAfp\n6elh4sSJmD59uugUIo0SFxeHunXr4v79+zh37hw6d+4sOumjOTk5ITU1VXSG0piZmWHcuHFIT09H\n586d0a9fP3h6emL//v1cvU4agys4iYhKhgNOIqIPIJfL37kF7tVFQi1btvzX79WtWxcVKlRARkYG\n7t+/D4VCgZCQEIwbN04pvUQA0KdPH2RmZiIuLk50CpHae/78OcaOHft6Jf6GDRtQsWJF0Vkloq0r\nOP/J0NAQQ4YMweXLlzFkyBCMHz8eDRs2xLZt2/Dy5UvReURvJZPJcP78edEZREQaiwNOIqIPIJfL\nkZyc/NaPebWF/d69e2/8vdzcXABA2bJlcfToUeTl5aF9+/alH0v0J319fQQFBXEVJ9E7xMfHo169\nerhz5w7OnTuHLl26iE4qFboy4HylTJky6NOnD5KSkvD9998jNDQUtWvXxqpVq1BQUCA6j+iNatSo\ngadPn+LBgweiU4iINBIHnEREH8Dd3f2dN1x6eHgAAGbNmvWv8zqnTp2KoqIiNG7cGGZmZggJCUFA\nQACkUv5xTMrVv39/pKam4uTJk6JTiNROXl4eAgMD4ePjg7lz52Ljxo2wsLAQnVVqHBwccO3aNRQV\nFYlOUSmpVIrOnTsjISEBy5cvx5YtW+Do6IiFCxfi2bNnovOI/kYikcDFxYXb1ImIPpJEwYNpiIje\nadeuXdi1axcA4JdffkFubi7s7e1fDzMtLS0xf/58AMDNmzfRtGlT3LhxA7a2tmjXrh2MjIwQHx+P\nU6dOwcjICIcOHYK5uTlatmyJ69evw9DQUNjXRrpj6dKliIyMfH2MAhEBCQkJGDBgAOrXr4+wsDBY\nWlqKTlIKOzs7REdHw8HBQXSKUKdPn8bs2bNx7NgxjBo1CiNGjNDIi6NIOw0ZMgQymQwjR44UnUJE\npHG4ZIiI6D2cPXsWa9euxdq1a19vMU9PT3/9a9u3b3/9sdWrV8eZM2cQGBgIQ0NDrF69GmFhYcjO\nzoavry/OnDmDZs2aITQ0FMOHD+dwk1RmwIABOHv2LBITE0WnEAmXl5eH8ePH48svv8SsWbOwefNm\nrR1uAv+3TV2bLxp6X40aNcKOHTtw9OhRXL16FY6OjpgwYQKys7NFpxHxHE4iohLgCk4iog/0yy+/\nYOXKlYiMjPzoZ9y5cwe1atXC5cuXUalSpVKsI3q7hQsX4ujRo9i5c6foFCJhTpw4AV9fX9SpUwdL\nlizRiT+HR40aBQcHB/j7+4tOUSsZGRmvL5Pq1asXxo8fDzs7O9FZpKNiYmIwceJExMfHi04hItI4\nXMFJRPSBPDw8EB8fX6IbWZcsWYJevXrpxDfVpF4GDx6MEydOvPOyLCJtlJ+fjwkTJqBr166YPn06\ntm7dqjN/DuvaRUPvy8bGBosWLcKlS5dgbm6Oxo0bo1+/fkhJSRGdRjrI1dUV58+fB9cgERF9OA44\niYg+UKVKlVCtWjUkJSV91Oc/f/4cy5Ytw9ixY0u5jOjdjI2NERgYiBkzZohOIVKpU6dOoUGDBkhL\nS0NycjK6d+8uOkmlnJycOOB8i8qVK2PWrFlIS0uDi4sLWrVqha5du/JiNlIpCwsLmJiYIDMzU3QK\nEZHG4YCT/h97dx5Xc9r/D/x12qgQBimyVFooWiwtypCdqcGUGTMY+zqjZClLjKXIOjEyGUszYzuW\nsSVrEUmFtBJlX8LYaa/z+2O+t9899wxaTl2nzuv5eNz/cLo+L3PP6PQ61/W+iKgMnJ2dERUVVaav\n/fXXX2Fvbw8TExM5pyIqmfHjx+P06dO4cuWK6ChEFS4vLw++vr747LPP4Ofnh127dqFRo0aiY1U6\nzuAsGR0dHfj4
+ODGjRvo3r07PDw84OLigpMnT3JXHVUKzuEkIiobFpxERGVQ1oKzuLgYK1euxLRp\n0yogFVHJ1KpVC1OmTMHixYtFRyGqUPHx8bCxsUF6ejqSkpLw5ZdfQiKRiI4lhIGBAf7880+8fftW\ndJQqQUtLC5MnT0ZGRgaGDRuGyZMno1OnTti3bx+Ki4tFx6NqzNLSEsnJyaJjEBFVOSw4iYjKwMnJ\nCVFRUaXezXHw4EHUrVsXnTt3rqBkRCUzefJkHD16FNevXxcdhUju8vLyMHv2bPTv3x+zZ8/Gnj17\noKurKzqWUKqqqjAyMkJGRoboKFWKuro6hg8fjtTUVPj6+mLx4sWwtLTEb7/9hoKCAtHxqBqysLBg\nwUlEVAYsOImIysDAwAB16tQp9RHfFStWwNvbW2l3EJHiqFOnDiZPngx/f3/RUYjk6uLFi2jfvj1S\nU1ORmJiIIUOG8O/c/8M5nGWnoqKCAQMGIC4uDqtXr8bmzZvRqlUr/PTTT8jJyREdj6oR7uAkIiob\nFpxERGXUpUuXUh1Tj4uLw507dzBo0KAKTEVUct9//z0OHDiAmzdvio5CVG75+fmYO3cu+vTpg5kz\nZ+KPP/5A48aNRcdSKJzDWX4SiQQ9evRAREQEduzYgWPHjsHQ0BBLly7Fq1evRMejaqB169a4fv06\ndwgTEZUSC04iojIq7RzOFStWwNPTE2pqahWYiqjk6tWrhwkTJmDJkiWioxCVy6VLl9C+fXskJiYi\nMTER33zzDXdt/gsTExPu4JQjOzs77N+/H8ePH0dycjIMDQ0xZ84cPHnyRHQ0qsI0NTXRrFkz/rdK\nRFRKLDiJiMrI2dkZp0+fLtEczlu3buHkyZMYNWpUJSQjKjlPT0/s2rULd+7cER2FqNTy8/Mxb948\n9O7dG9OmTcP+/fuhp6cnOpbCYsFZMSwsLPD7778jLi4OT58+hampKaZMmcK/V6nMOIeTiKj0WHAS\nEZWRoaEhAODGjRsffe3q1asxcuRI1K5du6JjEZVKgwYNMHr0aAQGBoqOQlQqly9fRseOHXHx4kVc\nvnwZw4YN467NjzA1NUV6enqpL8ijkjE0NERwcDBSU1NRo0YNWFtbY+TIkRwLQKXGOZxERKXHgpOI\nqIwkEkmJjqm/ePECv/76K77//vtKSkZUOt7e3ti2bRsePHggOgrRRxUUFOCHH35Ajx494OnpiYMH\nD0JfX190rCrhk08+gUQiwZ9//ik6SrWmp6eHwMBAZGRkoGXLlnBycoK7uzsuXbokOhpVEZaWlkhJ\nSREdg4ioSmHBSURUDiUpOENCQtCvXz80bdq0klIRlY6uri6GDx+OZcuWiY5C9EFJSUno1KkTYmNj\nkZCQgG+//Za7NktBIpHwmHolqlevHubOnYubN2/C0dERrq6u6N27d4nH25Dy4g5OIqLSk8j43ZWI\nqMxSU1Ph6uqK+Ph43Lx5E4WFhdDR0YGxsTHU1NSQn58PQ0NDHDp0CFZWVqLjEr3XgwcPYGFhgatX\nr6JRo0ai4xD9TUFBAZYsWYKgoCAsXboUI0aMYLFZRsOGDUPXrl0xYsQI0VGUTl5eHn7//XcsWbIE\njRo1gq+vL/r168d/l+kfioqKUKdOHWRlZXG8ERFRCXEHJxFRGSUmJiIwMBA3b95E48aN0a1bN/Tq\n1QsdOnSAtrY2rKysMGHCBLRq1YrlJik8fX19fPXVV1ixYoXoKER/k5ycDDs7O0RHR+PSpUsYOXIk\nC6FyMDU15Q5OQWrUqIFRo0bh6tWrmDJlCubOnQsrKyts374dhYWFouORAlFVVYWZmRlSU1NFRyEi\nqjJYcBIRldL9+/fh4uICe3t7bN26FTKZDAUFBXj16hVevnyJN2/eID8/H4mJidiyZQvOnz+PTZs2\n8TgaKbyZM2diw4YNnM9HCqGwsBCLFy9Gt27dMGHCBISHh8PAwEB0rCrPxMSEl9
4IpqqqCg8PD1y6\ndAlLlixBcHAwzMzMEBISgry8PNHxSEFwDicRUemw4CQiKoWwsDCYmZkhKioKOTk5KCoq+uDri4uL\nkZubi++//x49e/bE27dvKykpUek1a9YMX3zxBVavXi06Cim51NRU2Nvb4/Tp07h48SJGjx7NXZty\nwhmcikMikaBPnz6IiorC5s2bsW/fPhgaGmLFihV48+aN6HgkGOdwEhGVDgtOIqIS2r9/P9zd3fHm\nzZtSHyV7+/Ytzp49C2dnZ2RnZ1dQQqLy8/HxQXBwMJ4/fy46CimhwsJCBAQE4NNPP8WYMWNw9OhR\nNGvWTHSsasXY2BiZmZkf/YCOKpeTkxMOHz6MQ4cOIS4uDoaGhpg/fz6ePn0qOhoJwoKTiKh0WHAS\nEZXAtWvXMGTIEOTk5JR5jdzcXKSlpWHMmDFyTEYkX4aGhnB1dUVQUJDoKKRk0tLS4ODggJMnT+LC\nhQsYO3Ysd21WAG1tbTRs2BB3794VHYX+hbW1NXbu3Ino6Gjcv38frVq1gre3N+7fvy86GlUyCwsL\nJCcnc8QREVEJseAkIvqIoqIiDB48GLm5ueVeKzc3F/v27cORI0fkkIyoYsyaNQtr167Fq1evREch\nJVBYWIilS5fC2dkZI0eOxPHjx9G8eXPRsao1zuFUfK1atcKGDRuQlJSE4uJiWFpaYuzYscjIyBAd\njSqJnp4eiouL8fjxY9FRiIiqBBacREQfcfjwYWRkZKC4uFgu62VnZ+O7777jJ/KksFq1aoWePXvi\np59+Eh2FqrmrV6+ic+fOOHr0KOLj4zF+/Hju2qwEnMNZdTRt2hSrVq3CtWvX0LhxY9jb2+Orr75C\nUlKS6GhUwSQSCY+pExGVAgtOIqKPWLp0qdyH/T98+BCxsbFyXZNInmbPno1Vq1bxoguqEEVFRVi2\nbBk6d+6MYcOG4cSJE2jZsqXoWEqDBWfV06BBAyxYsAA3btyAjY0Nevfujf79++PcuXOio1EFYsFJ\nRFRyLDiJiD7gzZs3iIuLk/u62dnZ2Llzp9zXJZKX1q1b49NPP8X69etFR6FqJj09HU5OTggLC0Nc\nXBwmTpwIFRW+Ja1MLDirrtq1a2P69Om4ceMG+vfvj2+++QZdunTB0aNHeTKkGvrPHE4iIvo4vpsk\nIvqAy5cvQ1NTU+7rymQynDlzRu7rEsnTnDlzsGLFinJdrkX0H0VFRVixYgUcHR0xZMgQREREwNDQ\nUHQspWRqasoZnFVczZo1MX78eFy7dg3jxo3DtGnTYGtri927d6OoqEh0PJITS0tLpKSkiI5BRFQl\nsOAkIvqAq1evorCwsELWzszMrJB1ieSlbdu2sLOzw4YNG0RHoSru2rVrcHZ2xv79+xEbG4vJkydz\n16ZAzZs3R1ZWFj+8qAbU1NQwZMgQJCYmYv78+Vi+fDlat26NTZs2IT8/X3Q8KicLCwukpaXJbQ48\nEVF1xneWREQfkJubW2FvKvmDB1UFc+bMQWBgIHJzc0VHoSqouLgYq1evhoODAwYPHoxTp07ByMhI\ndCylp6amhpYtW/KDtmpERUUFrq6uiImJwfr167Fjxw4YGxsjKCgI2dnZouNRGdWpUwcNGjTAjRs3\nREchIlJ4LDiJiD5AS0sLqqqqFbJ2jRo1KmRdInmytbVFu3btsHnzZtFRqIrJyMhAly5dsGfPHpw/\nfx7ff/89d20qEM7hrJ4kEgm6du2KY8eOYe/evTh9+jRatmyJxYsX48WLF6LjURlwDicRUcnwXSYR\n0Qe0adOmwgpOU1PTClmXSN7mzp2LJUuWcNcxlUhxcTGCgoJgZ2eHQYMG4dSpUzA2NhYdi/4H53BW\nf+3bt8eePXtw6tQpXL9+HUZGRvDx8UFWVpboaFQKnMNJRFQyLDiJiD6gbdu2FTajrEmTJiyMqEqw\ns7ODqakpfv31V9FRSMFlZmaia9eu2LlzJ8
6dOwdPT88K+5CIyoc7OJWHubk5tmzZgkuXLuHt27do\n3bo1Jk2ahFu3bomORiVgaWnJHZxERCXAgpOI6AM0NTXx6aefyn1ddXV1ZGZmQk9PDyNGjEB4eDjL\nTlJoc+fORUBAQIVdukVVW3FxMdauXYtOnTrBzc0NUVFRMDExER2LPoAFp/Jp3rw51qxZgytXrkBH\nRwe2trYYNmwYUlNTRUejD2DBSURUMiw4iYg+YsaMGdDW1pbrmmZmZkhISEBSUhKsrKywaNEi6Onp\nYdSoUTh69CgKCgrk+jyi8nJycoKBgQG2bdsmOgopmBs3bsDFxQVbt25FdHQ0pk6dyl2bVQALTuWl\nq6sLf39/3LhxA+bm5nBxccGAAQMQFxcnOhr9C1NTU9y6dYuX/RERfQQLTiKij3BxcYGNjQ3U1NTk\nsp6mpiaCg4MB/HVMfcqUKYiOjsbly5dhYWGB+fPnQ09PD2PGjMHx48e5Y44Uhp+fHxYvXoyioiLR\nUUgBFBcXY926dejYsSP69euHs2fPcrZwFaKrq4v8/Hw8e/ZMdBQSREdHB76+vu8+pHB3d0f37t1x\n8uRJyGQy0fHo/2hoaMDIyAhXrlwRHYWISKGx4CQi+giJRIKtW7eiZs2a5V5LU1MTI0eOhKOj4z9+\nz8DAAF5eXoiJicHFixdhZmaGOXPmQF9fH+PGjcPJkydZdpJQXbt2RYMGDSCVSkVHIcFu3bqFHj16\n4Ndff8WZM2cwbdo07tqsYiQSCXdxEgBAS0sLkydPRkZGBoYOHYrJkyfDzs4O+/btQ3Fxseh4BF40\nRERUEiw4iYhKwMDAAIcOHYKWllaZ19DU1ISjoyNWrVr10dc2b94c3t7eiI2NRVxcHIyNjeHj44Mm\nTZpgwoQJiIyM5C46qnQSiQRz587FokWL+EOvkpLJZFi/fj06dOiAXr164ezZszA3Nxcdi8qIBSf9\nN3V1dQwfPhypqamYOXMmFi1aBEtLS/z2228cnSMY53ASEX0cC04iohLq0qULjh07hvr166NGjRql\n+lo1NTUMHDgQYWFhUFdXL9XXtmjRAtOnT0d8fDxiYmLQokULTJs2DU2aNMGkSZNw+vRplp1UaXr1\n6gVtbW3s3btXdBSqZLdv30bPnj2xadMmnD59GjNmzJDb6A4SgwUn/RsVFRUMHDgQ8fHxWL16NTZt\n2gQTExOsW7cOOTk5ouMpJQsLCxacREQfwYKTiKgUHB0dkZmZiUGDBqFGjRofLTpr166NBg0aQFtb\nG15eXtDQ0CjX8w0NDTFz5kxcvHgRZ8+eRdOmTeHp6YmmTZviu+++w5kzZ7izjirUf+/i5Iw25SCT\nyRASEoL27dvDxcUF586dQ+vWrUXHIjkwNTVFenq66BikoCQSCXr06IHIyEhs27YNR44cgaGhIZYu\nXYpXr16JjqdUuIOTiOjjWHASEZVS3bp1sXXrVmRkZGDq1KkwMzODuro6tLS0oK2tDQ0NDdSvXx+9\nevXC1q1bkZWVhaCgIIwZM0auMzSNjY3h6+uLhIQEnD59Go0bN8bkyZNhYGDw7uIilp1UEfr37w+J\nRIKDBw+KjkIV7M6dO+jVqxdCQkIQGRkJHx8f7tqsRriDk0rK3t4eBw4cwLFjx5CUlARDQ0PMmTMH\nT548ER1NKTRv3hyvXr3C8+fPRUchIlJYEhm3XxARlVtBQQGysrJQWFgIHR0d1K9f/2+/L5PJ0KNH\nD/Tt2xdTp06t0CxXr17Frl27IJVK8fz5c7i7u8PDwwOdOnWCigo/1yL52Lt3L/z9/REfHw+JRCI6\nDsmZTCbDxo0b4evrCy8vLx5Hr6Zev34NXV1dvHnzht8fqFQyMzOxbNkySKVSDB06FNOmTYOBgYHo\nWNWavb09AgMD4eTkJDoKEZFCYsFJRFRJMjIyYGdnhwsXLqBFixaV8sy0tDTs2rULO3fuxJs3b96V\nnR07dm
nWXLliE3NxdBQUGiU4hITmQyGX75\n5RfExMSUDzTv3LmD9u3blw80bW1tUadOnX+8VmFhIZo3b47Hjx/Lpa169epYt24dfHx8/vDPnz9/\njqNHjyI8PByRkZGwsbFB//790b9/fy51JFJCFy9exNSpUxEfHy86hQS4ePEihg4dimHDhuHbb7/l\nXZtUKTjgJCIi+q6b268AACAASURBVEDHjx/HpEmTkJKSAn19fdE5CmFvb4+FCxeiZ8+eolOI6AOV\nlJQgKSnpDwPN0tJS2Nvblw80raysPvgk27Nnz6Jfv34fvYxcS0sLnTp1QnR09N/eMV5YWIjTp08j\nPDwcR44cgZGREfr37w8PDw+Ym5vzbnMiJVBUVIT69evj7t27+PTTT0XnUCX5fUl6UFAQtm3bxi2O\nqFJxwElERPQRvvzySxgZGWHFihWiU+Tu2bNnaNGiBXJzc1GtWjXROURUQXl5eYiLiysfaF6+fBlN\nmzYtH2ja29vD2NhYroPAhQsXYu3atR885NTS0kK9evWQkJAAQ0PDCn9dSUkJoqOjER4ejvDwcGhr\na5cPOzt37gwtLa0P6iGij9enTx989dVXGDBggOgUqgRPnjyBl5cX8vPz8eOPP6Jx48aik0jDcMBJ\nRET0EXJzc2FpaYnTp0+jbdu2onPkas+ePdi3bx8OHTokOoWI/sb9+/fLh5kxMTG4efMmbGxsyoeZ\ndnZ2Cr+DSiaTYfHixVizZs17Dzn19PRQv359XLx48aOWm8tkMiQmJpYPO3NycuDu7g4PDw90796d\nH9QQVbJ169bh9u3b2Lx5s+gUUrALFy5g2LBh8PLyQkBAALS1tUUnkQbigJOIiOgjbd26FaGhoYiN\njVWru4WGDRuGrl27Yty4caJTiOj/lZaWIjk5+Q8DzdevX5cfBGRvbw9ra2vo6uoK6YuKisKXX36J\n/Pz8fzx4SEtLCzo6OvD29sbatWvlvtVHVlZW+SFFycnJ6NGjBzw8PNCnTx/Url1brs9FRH+WlJSE\nQYMG4ebNm6JTSEHKysqwfPlybNy4Edu3b0evXr1EJ5EG44CTiIjoI5WVlcHJyQkDBw6Er6+v6By5\nKC0thYGBAa5fv85TL4kEys/P/9Ny888+++wPA00TExOl2nfy9evX2Lt3L1auXIl79+6hWrVqKCkp\nQVlZGapWrQqZTIbS0lIMHToUU6dOhbm5ucKbHj9+jCNHjiA8PBznz5+Hra0tPDw84O7u/l5L4omo\n4srKytCwYUPEx8fzMDA19PjxYwwfPhxv3rzB3r170ahRI9FJpOE44CQiIpKDjIwMODg4ICEhQS0G\ngrGxsRg/fjySkpJEpxBplIcPHyI6Orp8oJmRkQErK6vygaadnR3q1asnOrPCnj17hoSEBNy9excl\nJSWoU6cOrKysIJVKhd3xnp+fjxMnTiA8PBzHjh2Dqalp+b6dUqlUSBORuhoyZAh69uyJUaNGiU4h\nOTp37hyGDx+OkSNHYvHixVySTkqBA04iIiI58ff3R0JCAsLDw5XqbqoPMW/ePMhkMixbtkx0CpHa\nKi0tRVpaWvnJ5jExMcjPz4ednV35QNPGxoZ7RypQUVERzp8/X75vZ61atcqHne3bt0eVKlVEJxKp\ntNDQUJw/fx67du0SnUJyUFpaimXLlmHz5s3YuXMnevToITqJqBwHnERERHLy9u1bWFlZYenSpfD0\n9BSd81GsrKywceNGdOnSRXQKkdooKCjAlStXygeacXFxaNCgAezt7csHmlKpVOU/IFFVZWVliI+P\nLx92vnr1Cp9//jk8PDzQrVs36OjoiE4kUjl3796FnZ0dsrOz+bNNxeXm5mL48OEoLi7Gnj17uL0H\nKR0OOImIiOTo4sWL+PLLL5GWlqayh1g8ePAAbdu2RW5uLpccEX2ER48elS81j46ORnp6Otq0aVM+\n0LSzs0ODBg1EZ9I7ZGZm
lg87MzMz0bt3b3h4eKBXr16oUaOG6DwildGiRQscOXKkUvbbJcWIiorC\n8OHDMXr0aCxatIi/H5JS4oCTiIhIznx8fFC1alVs2rRJdMoHCQ0NRVRUFPbs2SM6hUhllJWVIT09\n/Q8DzZcvX8LOzq58oNm+fXvo6emJTqUPkJ2djcOHDyM8PByxsbHo2rUrPDw80K9fPw6pif6Bj48P\nLCws4OfnJzqF3lNpaSmWLFmCLVu2YOfOnXB1dRWdRPROHHASERHJ2YsXL2Bubo6ff/4Ztra2onPe\nW//+/TFw4EAMHz5cdAqR0iosLMTVq1fLB5qxsbGoW7fuH5abt2rVins4qqFXr17h2LFjCA8Px8mT\nJ2FpaVm+b2eLFi1E5xEpnX379mH37t04fPiw6BR6Dzk5ORg2bBjKysqwZ88efPbZZ6KTiP4WB5xE\nREQK8NNPP+Hbb7/FtWvXVGrftrdv36JBgwbIyspSqZOaiRQtNze3fJgZExODlJQUWFhYwN7evvxP\nw4YNRWdSJXv79i0iIyMRHh6OQ4cOwcDAoHzYaWVlxT0HiQA8efIEJiYmePr0KZc2q4jIyEh4eXnB\nx8cHCxYsgJaWlugkon/EAScREZECyGQy9O3bF/b29pg7d67onAo7ffo0Fi1ahNjYWNEpRMKUlZUh\nIyPjDwPNp0+fwtbWtnyY2bFjR+jr64tOJSVSWlqKuLg4hIeH4+DBgyguLi4fdnbp0oWDHdJoVlZW\nCA4OVsmVLZqktLQUAQEBCA0NRVhYGJydnUUnEVUYB5xEREQK8ssvv8DGxgaXLl2CiYmJ6JwKmTJl\nCurXr4958+aJTiGqNG/evPnTcvPatWv/4e5Mc3NzLjenCpPJZEhLSys/pOjevXvo27cvPDw84Orq\nyuE4aZwZM2agTp06mD9/vugUeodHjx5h2LBhkEgk2L17N1clkMrhgJOIiEiB1q9fj6NHj+LMmTMq\nsVTRxMQEP/30E9q1ayc6hUhhnjx58oe7M5OSkmBmZvaHgaahoaHoTFIj9+/fx6FDhxAeHo74+Hh0\n794dHh4ecHNzQ926dUXnESnc8ePHsXLlSpw7d050Cv2F06dPw9vbG+PGjcP8+fO5JJ1UEgecRERE\nClRSUoJOnTrBz88P3t7eonP+1s2bN+Hk5IQHDx6oxDCWqCJkMhlu3ryJ6Ojo8oFmTk7On5ab16hR\nQ3QqaYjnz5/j6NGjCA8PR2RkJGxsbNC/f3/0798fTZs2FZ1HpBD5+flo2LAhcnNzUb16ddE59P9K\nSkrg7++Pbdu2YdeuXXBychKdRPTBOOAkIiJSsISEBPTu3RupqamoX7++6Jx3CgwMRFpaGkJDQ0Wn\nEH2wt2/f4tq1a+UDzdjYWOjr65efbG5vbw8LCwvenUJKobCwEKdPn0Z4eDiOHDkCIyOj8n07zc3N\n+WETqZWuXbti3rx56Nmzp+gUApCdnY2hQ4eiatWq2LVrFwwMDEQnEX0UDjiJiIgqwfTp0/HkyRP8\n8MMPolPeydXVFRMmTICHh4foFKIKe/bsGWJjY8sHmtevX4dUKv3DQLNx48aiM4n+UUlJCaKjo8v3\n7dTW1i4fdnbu3JlDeVJ5/v7+KCgowKpVq0SnaLxTp07B29sbEyZMwNy5c/nzhdQCB5xERESVID8/\nHxYWFggNDYWrq6vonD/57bffYGhoiOzsbNSsWVN0DtFfkslkuH37dvlS8+joaDx8+BCdOnUqH2h2\n6tSJ/w2TypPJZEhMTCwfdubk5MDd3R0eHh7o3r07qlWrJjqR6L3FxMTAz88P165dE52isUpKSrB4\n8WLs2LEDu3btgqOjo+gkIrnhgJOIiKiSHDt2DL6+vkhJSVG6E3TDw8OxadMmnD59WnQKUbmioiIk\nJCT84UAgHR0d2Nvblw80LS0toa2tLTqVSKGysrLKDylKTk5Gjx494OHhgT59+qB27dqi84
gqpLi4\nGPXq1cOdO3d4uJYADx8+xJdffolq1aohLCyMS9JJ7XDASUREVImGDBmC5s2bY/ny5aJT/mDs2LEw\nNzfHlClTRKf8H3t3Hl51feaN/w4EgYRNEFFA2QRU9gIKRCKCWsAF0iqCCl0c5/GxLtWqta2dqW3V\nTtW6zDN1qdYpUUGx9ACK6IAVBaSKIipIlIIi4kKVfQlL8vtjan6lorKc5HtO8npdl/+Qc+7zhusC\nw5v78/1Qg61duzbmzZtXUWa+/PLLcdRRR+1WaLqEhZru448/jmnTpkUqlYrZs2dH//79o6ioKM48\n88xo2bJl0vHgS51++unx7W9/O84666yko9QoM2bMiO985ztxySWXxI9+9KOoVatW0pEg7RScAFCF\nPvzww+jevXvMnDkzunfvnnSciPjfo5CtW7eOP//5z9GpU6ek41BDlJeXx/Lly3fbznz33XfjuOOO\nqyg0+/XrF40aNUo6KmSsTZs2xYwZMyKVSsX06dOjU6dOFc/t7Ny5c9Lx4HNuu+22KCkpibvvvjvp\nKDXCzp0746c//WkUFxfHww8/HIWFhUlHgkqj4ASAKnbffffF7373u5g3b16lP9T9wQcfjLFjx0ZE\nxO9+97v4l3/5l8+95tVXX42zzz473n777UrNQs22Y8eOWLhw4W6FZq1atSouAjrhhBOiR48ejpvD\nftq+fXvMnj274rmdjRo1qig7+/TpY2OLjPD666/HN77xDd9zVIFVq1bFmDFjIj8/P4qLi6N58+ZJ\nR4JKpeAEgCpWVlYWgwYNilGjRsUll1xSaZ/z3nvvRbdu3WLXrl2xadOmLyw4b7jhhlizZk3cfvvt\nlZaFmmfdunXxwgsvVJSZCxYsiHbt2lUUmgUFBdG2bdvIyclJOipUO2VlZbFgwYKKsnP9+vUxYsSI\nKCoqihNPPDEOOuigpCNSQ5WXl8dhhx0WL774YrRp0ybpONXWk08+Gd/5znfi8ssvjx/+8If+gYMa\nQcEJAAl48803Y+DAgbFw4cI44ogj0j6/vLw8TjnllFixYkV84xvfiFtuueULC84BAwbEz372szj1\n1FPTnoOaoby8PN59992YM2dORaG5fPny6Nu3b0WZ2b9//2jSpEnSUaFGKikpqSg7S0pKYtiwYVFU\nVBRDhw6NBg0aJB2PGmbMmDFxyimnxHe/+92ko1Q7O3bsiJ/+9Kfx0EMPxcMPPxwDBw5MOhJUGQUn\nACTk+uuvj4ULF0YqlUr77DvuuCOuuOKKePbZZ+OZZ56J66+/fo8F59/+9rfo0KFDfPzxx1G3bt20\n56B62rlzZyxatGi3QnPXrl0VFwEVFBREr169ok6dOklHBf7J6tWrY+rUqZFKpWLevHlRWFgYRUVF\nccYZZ8Shhx6adDxqgPvvvz9mzZoVDz/8cNJRqpX33nsvRo8eHY0aNYrx48c7kk6NY08ZABJy7bXX\nRklJSfzpT39K69w333wzrr322rj88su/8mHyTz31VJx00knKTb7Uhg0b4umnn45/+7d/iyFDhsTB\nBx8c3/rWt2LJkiVx+umnx3PPPRcffPBBPPbYY3HFFVfEcccdp9yEDNWyZcu46KKLYsaMGfHee+/F\neeedF08//XR06tQpBg4cGLfeemssX7486ZhUY0OGDIlnnnkm7Fqlz+OPPx59+vSJM888M5544gnl\nJjWSp7gDQELq1q0b99xzT5x77rkxePDgaNy48QHP3LlzZ4wdOzaOPPLIuPHGG7/y9U888UScdtpp\nB/y5VC8rV66s2MycM2dOLFu2LHr37h0FBQVx5ZVXRv/+/aNp06ZJxwQOUOPGjWPMmDExZsyYKC0t\njVmzZkUqlYr+/ftHixYtKi4p6tmzp+flkjZt27aNBg0axOLFi6Nr165Jx8lqO3bsiJ/85CcxceLE\nmDx5chQUFCQdCRKj4ASABBUWFsbw4cPjxz/+cfzXf/
3XAc/7+c9/HgsXLow5c+ZE/fr1v/S1O3fu\njKeeeip+/etfH/Dnkr127doVr7322m6FZmlpacVx8/PPPz++9rWvuZQEqrm6devG8OHDY/jw4XHX\nXXfF/PnzI5VKxdlnnx07duyoKDtPOOGEyM3110gOzJAhQ2LmzJkKzgOwcuXKGD16dBx88MHxyiuv\nxCGHHJJ0JEiUI+oAkLD/+I//iD/96U/xwgsvHNCcv/zlL3HjjTfGD37wg+jfv/9evf6II46I1q1b\nH9Dnkl02bdoUM2fOjOuvvz5OPfXUaNq0aZx77rmxaNGi+PrXvx7PPPNMfPTRRzF58uT4wQ9+EP36\n9VNuQg1Tu3btKCgoiJtvvjnefvvtiiOvV111VRx22GHx7W9/O6ZMmRJbtmxJOipZ6uSTT45Zs2Yl\nHSNrTZs2Lfr27RtFRUUxbdo05SaES4YAICM88sgj8ctf/jJeeeWV/Xp24c6dO6NLly5Ru3btWLhw\n4W7P1PzZz362x0uGfvzjH0dOTk7ccMMNafk5kJlWrVpVsZ05d+7cWLp0afTq1atiQ3PAgAHRrFmz\npGMCWWLlypUxZcqUSKVSsWDBghg8eHAUFRXFaaed5s8S9tpnlxz+7W9/88zmfbBjx4740Y9+FJMm\nTYoJEybEgAEDko4EGUPBCQAZoLy8PE4//fQ44YQT4kc/+tE+v3/dunVx8MEH79VrL7/88rj99tuj\nR48e8dvf/tbzmqqRXbt2xeLFi3e73XzTpk1RUFBQUWj27t3bpVJAWnz66afx+OOPRyqVilmzZkXv\n3r1j5MiRMXLkyDjyyCOTjkeG+9rXvhb/7//9PyXdXnr33XfjnHPOiebNm8d///d/+wcF+CcKTgDI\nEO+880706dMnXnjhhejYseM+vXfr1q1x6aWX7vFrr7zySixcuDBOOOGE6Ny5c5xyyikxYMCA6NWr\nV3z00UdRu3btdMQnAZs3b44XX3yxotCcP39+tGjRYrdCs1OnTi4HASrdli1b4n/+538ilUrFtGnT\nok2bNhXP7ezSpYs/h/icq6++Oho2bBj/9m//lnSUjDdlypT413/917jmmmviyiuv9PsJ9kDBCQAZ\n5De/+U088cQTMXPmzLR987qnI+r33ntvzJ49Ox566KG0fAZV44MPPqi4CGju3LmxZMmS6NGjR0Wh\nOWDAgGjevHnSMYEabufOnTFnzpxIpVKRSqUiNze3ouzs16+ff1gjIiJmzJgRN910U8yePTvpKBlr\n+/btce2118bkyZNj4sSJ0a9fv6QjQcZy/R0AZJDLLrssHnrooSguLo5x48ZV2uc88cQTcc4551Ta\nfA5cWVlZLFmyZLdCc/369TFgwIAoKCiI3/zmN9GnT5+oX79+0lEBdpObmxuDBg2KQYMGxW233Rav\nvvpqpFKpuPjii+PDDz+MM888M4qKimLw4MFRr169pOOSkIEDB8ZZZ50Vmzdvjvz8/KTjZJx33nkn\nzjnnnGjRokW88sor0bRp06QjQUZzizoAZJDc3Nz43e9+F1dffXWsWbOmUj5j27Zt8ec//zm+/vWv\nV8p89s+WLVti9uzZceONN8bw4cOjWbNmUVRUFC+88EIUFhbG448/HmvWrIlp06bFtddeGwMHDlRu\nAhkvJycnevXqFddff30sWrQo5s2bF8ccc0z86le/isMOOyxGjRoVEyZMiPXr1ycdlSqWn58fvXv3\njueffz7pKBknlUrFcccdF6NHj44pU6YoN2EvOKIOABnoBz/4QaxZsybGjx+f9tlPP/10XH/99TF3\n7ty0z2bvffTRRxUXAc2ZMyfeeOON6NatWxQUFFT816JFi6RjAlSajz/+OKZNmxapVCpmz54d/fv3\nj6KiojjzzDOjZcuWScejCvziF7+IDRs2xM0335x0lIywffv2uOaaa2LKlCkxceLEOP7445OOBFlD\nwQkAGWjTpk3RtW
vXuO++++Lkk09O6+zLL788WrRoET/+8Y/TOpcvVlZWFkuXLt2t0Pzkk08qjpsX\nFBRE3759Iy8vL+moAInYtGlTzJgxI1KpVEyfPj06depU8dzOzp07Jx2PSjJv3rz43ve+FwsXLkw6\nSuKWL18e55xzTrRq1SoeeOCBOPjgg5OOBFlFwQkAGWr69Olx2WWXxWuvvZa24qu8vDw6duwYjz32\nWPTs2TMtM/m8bdu2xUsvvVRRaM6bNy8aN25ccbN5QUFBHHvssVGrlqcFAfyz7du3x+zZsysuKWrU\nqFFF2dmnTx9/dlYjO3bsiObNm8eyZcvikEMOSTpOYiZPnhwXXXRR/OQnP4nLLrvMLemwHxScAJDB\nRo8eHe3atYubbropLfNKSkpiyJAh8d577/nmOY3WrFlTUWbOnTs3Fi1aFMcee+xuhebhhx+edEyA\nrFNWVhYLFiyoKDvXr18fI0aMiKKiojjxxBPjoIMOSjoiB+iMM86IsWPHxqhRo5KOUuVKS0vj6quv\njscffzweeeSR6Nu3b9KRIGspOAEgg3344YfRvXv3mDlzZnTv3v2A5912223x5ptvxr333puGdDVT\neXl5lJSU7FZofvTRR9GvX7+KQvO4445zIyxAJSgpKakoO0tKSmLYsGFRVFQUQ4cOjQYNGiQdj/1w\nxx13xOOPPx5HH310vPrqq7Fo0aLYuHFjnHfeefHggw9+4fvmzZsXv/zlL2P+/PmxdevW6NixY3z3\nu9+NSy+9NGrXrl2FP4P9s3z58hg1alQceeSR8fvf/z6aNGmSdCTIagpOAMhwv/vd7+L++++PuXPn\nHvA37CeffHJccsklMXLkyDSlq/5KS0tjwYIFux03z8/Pj4KCgopCs0uXLlnxlymA6mT16tUxderU\nSKVSMW/evCgsLIyioqI444wz4tBDD006HnvpjTfeiN69e8f27dujQYMG0bp161i6dOmXFpxTpkyJ\nb37zm1GvXr0455xzomnTpjFt2rQoKSmJs846KyZNmlTFP4t989hjj8XFF18c1113XVx66aVO1UAa\nKDgBIMOVlZXFoEGDYtSoUXHJJZfs95yNGzdGy5Yt44MPPrDl8iU++eSTmDdvXsyZMyfmzp0br776\nanTu3Hm3QrNVq1ZJxwTgH6xfvz6mT58eqVQqnnrqqejWrVvFczvbt2+fdDy+RHl5eTRr1iwee+yx\nOOmkk2L27Nlx0kknfWHBuWHDhjjqqKNi/fr1MXfu3OjTp09E/O/zrwcPHhwvvPBCTJgwIUaPHl3V\nP5WvtG3btrjqqqviySefjIkTJzqSDmmUm3QAAODL1apVK+65554oLCyMkSNHRuvWrfdrzsyZM6N/\n//7KzX9QXl4ey5Ytq7jZfO7cubF69eo4/vjjo6CgIK6//vo4/vjj/ZoBZLjGjRvHmDFjYsyYMVFa\nWhqzZs2KVCoV/fv3jxYtWlSUnT179rQtl2FycnJi2LBhsXz58hg8ePBXvv6xxx6LNWvWxLhx4yrK\nzYiIevXqxS9/+csYMmRI3HXXXRlXcC5btixGjRoV7du3j5dfftmRdEgz188BQBY45phj4nvf+15c\neuml+z3jiSeeiNNOOy2NqbLP9u3bY/78+XHrrbdGUVFRHHbYYTFkyJB46qmnomfPnjFhwoT49NNP\n4+mnn45///d/jyFDhig3AbJM3bp1Y/jw4XHvvffG6tWr46677oqtW7fG2WefHW3bto3LL788nn32\n2di5c2fSUfm7IUOGxKxZs/bqtc8880xERAwdOvRzXyssLIy8vLyYN29elJaWpjXjgXj00UdjwIAB\nccEFF8SkSZOUm1AJHFEHgCxRWloaPXr0iJtuuimKior26b3l5eXRqlWrmD17dnTs2LGSEmaetWvX\nxrx58yo2NF955ZXo2LFjxc3mBQUFceSRRyYdE4AqUF5eHosXL664pOidd96J008/
PYqKiuKUU06J\nvLy8pCPWWCtXroy+ffvGBx98EM8999yXHlHv27dvLFiwIBYsWBC9e/f+3Ne7du0aixcvjiVLlsQx\nxxxTFfG/0LZt2+LKK6+Mp556Kh599NE95gXSwxF1AMgSdevWjXvvvTfOO++8GDJkSDRq1Giv37tw\n4cJo0KBBtS43y8vLY/ny5RWXAc2ZMyfee++9OO6446KgoCCuu+666Nev3z79ugFQfeTk5ETXrl2j\na9eucd1118XKlStjypQpceedd8a4ceNi8ODBUVRUFKeddlo0a9Ys6bg1ypFHHhmNGjWKN9544ytf\nu379+oj438cS7MlnP75u3br0BdwPb7/9dowaNSo6duwYr7zyyhfmBdLDEXUAyCKFhYUxdOjQ+PGP\nf7xP75s+fXq1O56+Y8eOePHFF+O2226Ls846Kw4//PAoLCyMJ554Irp06RLjx4+PTz/9NGbOnBnX\nX399nHrqqcpNACoceeSRcemll8asWbNixYoVUVRUFKlUKtq3bx+DBw+OO++8M1auXJl0zBrj5JNP\n3utj6plu4sSJMWDAgLjwwgvjkUceUW5CFbDBCQBZ5te//nV06dIlzjvvvOjfv/9eveeJJ56In//8\n55WcrHKtW7cuXnjhhYoNzQULFkT79u2joKAgioqK4pZbbok2bdq4PAKAfda0adMYN25cjBs3LrZs\n2RL/8z//E6lUKn7+859HmzZtKi4p6tKli//PVJIhQ4bEAw88EL169frS131WFn62yfnPPvvxJJ5z\nuXXr1rjiiiti1qxZ8fTTT3/lzwVIHwUnAGSZgw8+OH7zm9/Ev/7rv8Yrr7wSderU2e3r5eXlsW3b\ntsjJyYm6devG3/72t1iyZEkUFhYmlHjflZeXxzvvvFNRZs6dOzdWrFgRffv2jYKCgvjhD38Y/fr1\n85B+ANIuLy8vRowYESNGjIidO3fGnDlzIpVKxemnnx65ubkVZWe/fv2idu3aScetNk466aS44IIL\n4oorrvjS13Xu3DkWLFgQb7311ueeablz585YsWJF5ObmRvv27Ssz7ue89dZbMWrUqDj66KPj5Zdf\ndmoEqpgj6gCQhc4555w44ogj4pZbbomIiJKSkrjyyiuje/fuUb9+/WjYsGE0aNAgGjZsGP369YuW\nLVvGp59+mnDqL7Zz585YsGBB3HHHHTFq1Kho3bp1DBgwIKZMmRKdO3eO+++/Pz799NN45pln4he/\n+EUMHTpUuQlApcvNzY1BgwbF7bffHitWrIhJkyZFfn5+XHzxxdGyZcu48MILY/r06bFt27ako2a9\nZs2axVFHHRVvvvnml75u8ODBERExY8aMz33tueeeiy1btsSAAQOibt26lZJzTyZMmBAFBQVx0UUX\nxYQJE5Sbe9TqOQAAIABJREFUkAC3qANAlnrnnXeiV69ecdRRR8XixYtj586dsWPHjj2+tk6dOlGr\nVq0YOXJk/Pa3v42mTZtWcdrdbdiwYbfj5i+99FIceeSRccIJJ1Tcbt6uXTvHAAHIWH/9619jypQp\nkUql4rXXXotTTz01ioqKYvjw4Z65uJ+uueaa+Pjjj+MPf/jDF96ivmHDhujQoUNs2LAh5s6dG336\n9ImI/72xfPDgwfHCCy/EhAkTYvTo0ZWed+vWrfH9738//vznP8ejjz4aPXv2rPTPBPZMwQkAWere\ne++NSy655AtLzT056KCDIi8vLyZNmhQnn3xyJabb3cqVK2POnDkVheayZcuid+/eFYVm//794+CD\nD66yPACQTh9//HFMmzYtUqlUzJ49O/r37x9FRUVx5plnRsuWLZOOl/FSqVSkUqlYvXp1vPTSS7Fu\n3bpo3759DBw4MCIiDjnkkIpTK5+9/qyzzop69erF6NGjo2nTpjF16tQoKSmJs846Kx599NFK/0fS\nkpKSGDVqVBx77LFxzz332NqEhCk4ASAL3XDD
DXHjjTfGli1b9uv99evXj4kTJ8aZZ56Z5mT/e9z8\n9ddf363Q3L59exQUFFQUmr169YqDDjoo7Z8NAEnbtGlTzJgxI1KpVEyfPj06depU8dzOzp07Jx0v\nI/3sZz+L66+//gu/3qZNm3jnnXd2+7G5c+fGDTfcEC+88EJs27YtjjrqqPjud78bl112WaU/G/Wh\nhx6K73//+3HDDTfEhRde6MQJZAAFJwBkmT/+8Y8Vt7weiLy8vJg/f35069btgOZs3Lgx/vKXv8Tc\nuXNjzpw58eKLL0arVq12KzQ7dOjgm38Aapzt27fH7NmzKzYUGzVqVFF29unTJ2rVci3GPzvppJPi\nmmuuiWHDhiUd5XO2bNkSl112WTz//PPx6KOPRo8ePZKOBPydghMAssiaNWuiY8eOsX79+gOelZOT\nE506dYrXX3/9czexf5lVq1ZVbGbOmTMn3nrrrejVq1dFodm/f/9o1qzZAecDgOqkrKwsFixYUFF2\nrl+/PkaMGBFFRUVx4oknOtnwd7/85S9j7dq1ceuttyYdZTdLly6Ns88+O7p37x533313NGzYMOlI\nwD9QcAJAFvm///f/xu9///vYvn17Wubl5+fHHXfcERdccMEev75r16544403dis0t2zZUnERUEFB\nQfTu3btKbyoFgOqgpKSkouwsKSmJYcOGRVFRUQwdOjQaNGiQdLzEzJ8/Py666KJ49dVXk45Sobi4\nOK688sq46aab4oILLnAqBTKQghMAssTmzZvj0EMPPeCj6f/sqKOOirfeeitycnJi8+bNFcfN586d\nG/Pnz4/DDjtst0KzU6dOvrEHgDRavXp1TJ06NVKpVMybNy8KCwujqKgozjjjjDj00EOTjleldu7c\nGYcccki89dZbif/ct2zZEpdeemnMnTs3Hn300ejevXuieYAvpuAEgCzx2GOPxXe/+93YuHFjWufW\nrVs3Ro0aFW+++WYsWbIkevbsWVFmDhgwIJo3b57WzwMAvtj69etj+vTpkUql4qmnnopu3bpVPLez\nffv2ScerEiNGjIhzzz03zjnnnMQyLFmyJEaNGhW9evWKu+66q0Zv1UI2yE06AACwd+bNmxebNm1K\n+9ydO3fG9u3b47bbbos+ffpEvXr10v4ZAMDeady4cYwZMybGjBkTpaWlMWvWrEilUtG/f/9o0aJF\nRdnZs2fPanuiYsiQITFz5szECs4//OEPcdVVV8V//Md/xHe+851q++sM1YkNTgDIEgUFBTFv3rxK\nmX3FFVfEb37zm0qZDQAcuF27dsX8+fMjlUrFn/70p9ixY0dF2XnCCSdEbm712V9avHhxnHHGGbF8\n+fIq/dzNmzfHJZdcEvPnz49JkyZF165dq/Tzgf1XK+kAAMDe2bBhQ6XNXrt2baXNBgAOXO3ataOg\noCBuvvnmePvtt+OJJ56I5s2bx1VXXRWHHXZYfPvb344pU6ak/VndSTj22GNj69atVVpwLl68OI47\n7rgoKyuLl156SbkJWUbBCQBZojJvKncsHQCyR05OTnTt2jWuu+66WLBgQbzyyivRu3fvuPPOO+Pw\nww+PoqKiGD9+fHzyySdJR90vOTk5MWTIkJg1a1aVfN5///d/x6BBg+Lqq6+OP/zhD563CVlIwQkA\nWaJbt26VMrd+/fqVNhsAqHxHHnlkXHrppTFr1qxYsWJFFBUVRSqVivbt28fgwYPjzjvvjJUrVyYd\nc5+cfPLJlV5wbt68Ob71rW/Fr3/963j22Wfj29/+dqV+HlB5FJwAkCUKCgoiPz8/7XPr1KkTvXv3\nTvtcAKDqNW3aNMaNGxeTJ0+ODz74IC6//PJYuHBhfO1rX4vevXvHL37xi3jjjTci06/j+GyDs6ys\nrFLmv/HGG9GnT5+oVatWvPTSS9GlS5dK+RygarhkCACyxPvvvx9HHXVUbNu2La1zmzRpEh9//HHU\nqVMnrXMB
gMyxc+fOmDNnTqRSqUilUpGbm1txSVG/fv2idu3aSUf8nM6dO8ejjz4aPXr0SNvM8vLy\n+P3vfx/XXntt3HLLLfGtb30rbbOB5NjgBIAs0apVqygsLEzrzLp168b3vvc95SYAVHO5ubkxaNCg\nuP3222PFihUxadKkyM/Pj4svvjhatmwZF154YUyfPj3t/5B6IIYMGRIzZ85M27xNmzbFuHHj4rbb\nbovZs2crN6EascEJAFlk4cKFUVBQEFu3bk3LvEaNGsWyZcuiefPmaZkHAGSfv/71rzFlypRIpVLx\n2muvxamnnhpFRUUxfPjwaNy4cWK5Jk+eHPfdd19Mnz79gGe9/vrrcfbZZ0dBQUH853/+Z+Tl5aUh\nIZApbHACQBbp1atXXH755Wn5pjwvLy/uv/9+5SYA1HAdOnSIK6+8Mp577rl466234utf/3o8/PDD\nccQRR8TXv/71uPvuu2P16tVVmqm8vDzy8/Nj5syZ0bdv32jWrFk0atQomjdvHieeeGL8+7//e7z5\n5pt7Nee+++6LwYMHx09+8pO4//77lZtQDdngBIAss2PHjhg6dGi88MIL+73JmZeXFxdccEHceeed\naU4HAFQXmzZtihkzZkQqlYrp06dHp06dKp7b2blz50r73CeffDIuu+yy+OCDD2Lz5s17fE1ubm7U\nqVMnunTpEnfddVf06dPnc6/ZuHFjXHTRRfHaa6/FpEmT4uijj660zECyFJwAkIVKS0vj7LPPjmee\neeYLv/H/IvXr149LL700fvWrX0VOTk4lJQQAqpPt27fH7NmzKy4patSoUUXZ+dlt5Adq8+bN8S//\n8i8xderU2LJly16/r379+nHJJZfETTfdVHFZ0qJFi2LUqFFRWFgYd9xxh61NqOYUnACQpcrLy+Oh\nhx6Kiy++OHbs2PGVlwI0bNgw8vPzY+LEiXHiiSdWUUoAoLopKyuLBQsWVJSd69evjxEjRkRRUVGc\neOKJcdBBB+3zzI0bN8bAgQOjpKRkvy46ysvLiyFDhsQf//jHeOCBB+InP/lJ3H777XHeeeft8ywg\n+yg4ASDLbdy4MU477bRYsmRJrF+/PvLy8io2M8vKymLbtm3Ro0ePuPrqq2PkyJH79ZcOAIAvUlJS\nUlF2lpSUxLBhw6KoqCiGDh0aDRo0+Mr3l5eXx0knnRTz58+P0tLS/c6Rl5cXhx9+eOTl5cWkSZMq\n9Rg9kFkUnACQ5bZv3x6tWrWKBQsWxCGHHBKvvfZa/O1vf4ucnJxo2bJldO3aVakJAFSJ1atXx9Sp\nUyOVSsW8efOisLAwioqK4owzzohDDz10j++555574gc/+ME+P3ZnT2rXrh2PP/54DB069IBnAdlD\nwQkAWS6VSsXtt98ezz77bNJRAAAqrF+/PqZPnx6pVCqeeuqp6NatW8VzO9u3bx8RERs2bIiWLVum\npdz8zBFHHBHvvvuuZ41DDZKbdAAA4MCMHz8+xo4dm3QMAIDdNG7cOMaMGRNjxoyJ0tLSmDVrVqRS\nqejfv3+0aNEiRo4cGdu3b0/7565duzaeeeaZGDJkSNpnA5nJBicAZLFPP/002rdvH++++240btw4\n6TgAAF9p165dMX/+/EilUnHHHXfEjh070v4ZRUVFMXny5LTPBTKTghMAsthdd90Vs2fPjokTJyYd\nBQBgn5SWlkbDhg0rpeA8/PDDY/Xq1WmfC2SmWkkHAAD2X3FxcYwbNy7pGAAA+2zp0qVRr169Spm9\nZs2atD7XE8hsCk4AyFLLli2L5cuXx6mnnpp0FACAfbZu3bqoVatyaok6derEhg0bKmU2kHkUnACQ\npYqLi2PMmDGRm+vOQAAg+1Tm9zBlZWW+R4IaxO92AMhC5eXlUVxcHI899ljSUQAA9kvbtm2jtLS0\n0uY3a9as0mYDmcUGJwBkoblz50b9+vWjV69eSUcBANgvLVu2jLp161bK7M
6dO1fa8Xcg8/jdDgBZ\n6LPLhXJycpKOAgCwX3JycuKUU05JexFZr169+OY3v5nWmUBmyykvLy9POgQAsPe2bdsWrVq1ikWL\nFkXr1q2TjgMAsN/mz58fJ598clpvPK9Xr16sWLEiDjvssLTNBDKbDU4AyDKPP/549OrVS7kJAGS9\n448/Prp16xa1a9dOy7x69erF6NGjlZtQwyg4ASDLFBcXx9ixY5OOAQBwwHJycuLhhx9O27M4GzZs\nGHfccUdaZgHZQ8EJAFlkzZo1MXv27PjGN76RdBQAgLRo165dPPDAA1G/fv0DmpOfnx9Tp06NRo0a\npSkZkC0UnACQRR555JE4/fTTo2HDhklHAQBIm1GjRsV9990X9evX3+dLFHNzc6NBgwbx5JNPRr9+\n/SopIZDJFJwAkEXGjx/veDoAUC2de+658eKLL0bnzp2jQYMGe/We/Pz8KCgoiKVLl8bAgQMrOSGQ\nqdyiDgBZoqSkJE466aRYuXJl5ObmJh0HAKBS7Ny5M6ZMmRK/+tWvYtGiRZGXlxfbt2+PXbt2RW5u\nbuTm5sbWrVtj0KBBcfXVV8fJJ5+8z1ufQPWi4ASALHHdddfFtm3b4pZbbkk6CgBAlVi3bl0sXLgw\nli5dGqWlpZGfnx9dunSJnj17Rl5eXtLxgAyh4ASALFBWVhbt2rWLqVOnRo8ePZKOAwAAkDE8gxMA\nssDzzz8fTZo0UW4CAAD8EwUnAGQBlwsBAADsmSPqAJDhtm7dGq1atYo33ngjWrZsmXQcAACAjGKD\nEwAy3NSpU6Nv377KTQAAgD1QcAJAhnM8HQAA4Is5og4AGeyjjz6Ko48+OlatWhX5+flJxwEAAMg4\nNjgBIINNmDAhzjzzTOUmAADAF1BwAkAGKy4ujnHjxiUdAwAAIGMpOAEgQy1evDg++uijGDRoUNJR\nAAAAMpaCEwAyVHFxcZx//vlRu3btpKMAAABkLJcMAUAG2rVrV7Rt2zZmzJgRXbp0SToOAABAxrLB\nCQAZ6Nlnn43mzZsrNwEAAL6CghMAMpDLhQAAAPaOI+oAkGE2b94crVu3jqVLl0aLFi2SjgMAAJDR\nbHACQIZJpVIxYMAA5SYAAMBeUHACQIYpLi6OsWPHJh0DAAAgKziiDgAZ5IMPPohjjz02Vq9eHfXr\n1086DgAAQMazwQkAGeThhx+Ob3zjG8pNAACAvaTgBIAMMn78eMfTAQAA9oGCEwAyxGuvvRbr1q2L\nwsLCpKMAAABkDQUnAGSI4uLiOP/886NWLf97BgAA2FsuGQKADLBr16444ogj4plnnomjjz466TgA\nAABZw4oIAGSAWbNmRevWrZWbAAAA+0jBCQAZwOVCAAAA+8cRdQBI2MaNG+OII46It99+O5o3b550\nHAAAgKxigxMAEjZ58uQoLCxUbgIAAOwHBScAJKy4uDjGjRuXdAwAAICs5Ig6ACRo1apV0aNHj3j/\n/fejXr16SccBAADIOjY4ASBBDz30UHzzm99UbgIAAOwnBScAJKS8vDzGjx/veDoAAMABUHACQEIW\nLlwYW7dujYKCgqSjAAAAZC0FJwAkpLi4OMaOHRs5OTlJRwEAAMhaLhkCgATs3LkzWrduHc8//3x0\n7Ngx6TgAAABZywYnACTg6aefjnbt2ik3AQAADpCCEwASUFxc7HIhAACANHBEHQCq2Pr166NNmzbx\n17/+NZo1a5Z0HAAAgKxmgxMAqtgf//jHGDx4sHITAAAgDRScAFDFPrs9HQAAgAPniDoAVKF33303\nevfuHe+//37UrVs36TgAAABZzwYnAFShhx56KEaNGqXcBAAASBMFJwBUkfLy8hg/frzj6QAAAGmk\n4ASAKrJgwYLYtWtX9OvXL+koAAAA1YaCEwCqyGfbmzk5OUlHAQAAqDZcMgQAVWDHjh3RqlWrmD9/\nfrRv3z7pOAAAANWGDU4AqAIzZsyIzp
07KzcBAADSTMEJAFXA5UIAAACVwxF1AKhka9eujXbt2sWK\nFSvi4IMPTjoOAABAtWKDEwAq2aRJk+KUU05RbgIAAFQCBScAVLLi4uIYN25c0jEAAACqJUfUAaAS\nLV++PPr16xfvv/9+1KlTJ+k4AAAA1Y4NTgCoRA8++GCcc845yk0AAIBKYoMTACpJeXl5dOrUKR5+\n+OHo27dv0nEAAACqJRucAFBJ5s+fH7Vr144+ffokHQUAAKDaUnACQCX57HKhnJycpKMAAABUW46o\nA0AlKC0tjVatWsXLL78cbdq0SToOAABAtWWDEwAqwfTp06Nr167KTQAAgEqm4ASASvDZ8XQAAAAq\nlyPqAJBmn3zySXTo0CFWrlwZjRo1SjoOAABAtWaDEwDS7NFHH41hw4YpNwEAAKqAghMA0mz8+PEx\nduzYpGMAAADUCI6oA0Aavf322zFw4MBYtWpV5ObmJh0HAACg2rPBCQBp9OCDD8aYMWOUmwAAAFXE\nBicApEl5eXl06NAhHnvssfja176WdBwAAIAawQYnAKTJ3LlzIy8vL3r16pV0FAAAgBpDwQkAafLZ\n5UI5OTlJRwEAAKgxHFEHgDTYtm1btGrVKhYtWhStW7dOOg4AAECNYYMTANLg8ccfj169eik3AQAA\nqpiCEwDS4LPj6QAAAFQtR9QB4ACtWbMmOnbsGO+99140bNgw6TgAAAA1ig1OADhAEydOjNNPP125\nCQAAkAAFJwAcoOLi4hg3blzSMQAAAGokBScAHIClS5fGqlWrYsiQIUlHAQAAqJEUnABwAIqLi+Pc\nc8+N2rVrJx0FAACgRnLJEADsp7KysmjXrl1MmzYtunfvnnQcAACAGskGJwDsp+eeey6aNGmi3AQA\nAEiQghMA9pPLhQAAAJLniDoA7IctW7ZEq1atYsmSJXH44YcnHQcAAKDGssEJAPth6tSpcdxxxyk3\nAQAAEqbgBID94Hg6AABAZnBEHQD20UcffRRHH310rFq1KvLz85OOAwAAUKPZ4ASAfTRhwoQYMWKE\nchMAACADKDgBYB+NHz8+xo4dm3QMAAAAQsEJAPtk8eLF8fHHH8egQYOSjgIAAEAoOAFgnxQXF8f5\n558ftWvXTjoKAAAA4ZIhANhru3btijZt2sRTTz0VXbp0SToOAAAAYYMTAPbas88+Gy1atFBuAgAA\nZBAFJwDsJZcLAQAAZB5H1AFgL2zevDlat24dS5cujRYtWiQdBwAAgL+zwQkAeyGVSsWAAQOUmwAA\nABlGwQkAe8HxdAAAgMzkiDoAfIXVq1dH165d4/3334/69esnHQcAAIB/YIMTAL7Cww8/HEVFRcpN\nAACADKTgBICvUFxcHOPGjUs6BgAAAHug4ASAL7Fo0aJYt25dDBw4MOkoAAAA7IGCEwC+RHFxcZx/\n/vlRq5b/ZQIAAGQilwwBwBfYuXNnHHnkkfHMM8/E0UcfnXQcAAAA9sA6CgB8gVmzZkXr1q2VmwAA\nABlMwQkAX8DlQgAAAJnPEXUA2IONGzfGEUccEcuWLYtDDjkk6TgAAAB8ARucALAHkydPjsLCQuUm\nAABAhlNwAsAeOJ4OAACQHRxRB4B/smrVqujRo0e8//77Ua9evaTjAAAA8CVscALAP3nooYfirLPO\nUm4CAABkAQUnAPyD8vLyGD9+fIwdOzbpKAAAAOwFBScA/IOFCxfG1q1bo6CgIOkoAAAA7AUFJwD8\ng+Li4hg7dmzk5OQkHQUAAIC94JIhAPi7nTt3RqtWrWLOnDnRsWPHpOMAAACwF2xwAsDfPf3009Gh\nQwflJgAAQBZRcALA37lcCAAAIPs4og4AEbF+/fpo06ZNLF++PJo2bZp0HAAAAPaSDU4AiIjHHnss\nBg8erNwEAADIMgpOAIj///Z0AAAAsosj6gDUeO+++2707t073n///ahbt27ScQAAANgHNjgBqPEe\nfP
DBGDVqlHITAAAgCyk4AajRysvLo7i4OMaNG5d0FAAAAPaDghOAGu2ll16KsrKyOP7445OOAgAA\nwH5QcAJQoxUXF8f5558fOTk5SUcBAABgP7hkCIAaa/v27dG6deuYP39+tG/fPuk4AAAA7AcbnADU\nWDNmzIjOnTsrNwEAALKYghOAGsvlQgAAANnPEXUAaqS1a9dG27Zt4913340mTZokHQcAAID9ZIMT\ngBpp0qRJceqppyo3AQAAspyCE4AayfF0AACA6sERdQBqnOXLl0e/fv3i/fffjzp16iQdBwAAgANg\ngxOAGufBBx+M0aNHKzcBAACqARucANQo5eXl0bFjx5gwYUL07ds36TgAAAAcIBucAGSNtm3bRk5O\nzh7/O+yww/Zqxvz58yM3Nzf69OlTyWkBAACoCrlJBwCAfdG4ceP4/ve//7kfb9CgwV69f/z48TFu\n3LjIyclJdzQAAAAS4Ig6AFmjbdu2ERHxzjvv7Nf7S0tLo1WrVvHyyy9HmzZt0hcMAACAxDiiDkCN\nMX369OjWrZtyEwAAoBpxRB2ArFJaWhoPPvhgrFy5MvLz86N79+5RWFgYtWvX/sr3jh8/PsaOHVsF\nKQEAAKgqjqgDkDXatm0b77777ud+vF27dvHAAw/EiSee+IXv/eSTT6JDhw6xcuXKaNSoUWXGBAAA\noAo5og5A1vjOd74Ts2bNig8//DA2b94cr7/+evyf//N/4p133olhw4bFokWLvvC9jzzySAwbNky5\nCQAAUM3Y4AQg61111VVx6623xsiRI+NPf/rTHl/Tv3//+OlPfxrDhw+v4nQAAABUJgUnAFlv2bJl\n0bFjx2jatGl88sknn/v622+/HQMHDoxVq1ZFbq7HTwMAAFQnjqgDkPWaN28eERGbN2/e49eLi4tj\nzJgxyk0AAIBqyN/0AMh68+fPj4iI9u3bf+5rZWVlUVxcHJMnT67qWAAAAFQBG5wAZIU333xzjxua\n77zzTlxyySUREXH++ed/7utz586N/Pz86NmzZ6VnBAAAoOrZ4AQgKzzyyCNx6623RmFhYbRp0yYa\nNmwYf/3rX+OJJ56Ibdu2xfDhw+Oqq6763PuKi4tj7NixkZOTk0BqAAAAKptLhgDICrNnz4677747\nFi5cGB9++GFs3rw5mjRpEj179oyxY8fuscTctm1btGrVKhYtWhStW7dOKDkAAACVScEJQLU1adKk\nuOeee2LmzJlJRwEAAKCSeAYnANVWcXFxjBs3LukYAAAAVCIbnABUS2vWrImOHTvGqlWrokGDBknH\nAQAAoJLY4ASgWpo4cWKcfvrpyk0AAIBqTsEJQLU0fvx4x9MBAABqAAUnANXO0qVL4/33348hQ4Yk\nHQUAAIBKpuAEoNopLi6O8847L2rXrp10FAAAACqZS4YAqFbKysqiXbt2MW3atOjevXvScQAAAKhk\nNjgBqFaee+65OPjgg5WbAAAANYSCE4BqZfz48TF27NikYwAAAFBFHFEHoNrYsmVLtGrVKpYsWRKH\nH3540nEAAACoAjY4Aag2pk6dGscff7xyEwAAoAZRcAJQbTieDgAAUPM4og5AVtm2bVssWrQoVq1a\nFWVlZdG0adPo1atXbN++PY455phYtWpV5OfnJx0TAACAKpKbdAAA+CqlpaUxefLkuPnmm+P111+P\nvLy8iq/l5OTE1q1bo169etGhQ4fYsmWLghMAAKAGscEJQEZ77rnnYvTo0bFx48bYtGnTl762bt26\nUbt27bjxxhvj0ksvjVq1PIkFAACgulNwApCRysvL47rrrovbbrsttm7duk/vzc/Pj169esWTTz4Z\nDRo0qKSEAAAAZAIFJwAZ6Zprronf/va3sXnz5v16f926daNLly7x/PPP73akHQAAgOrF2T0AMs7U\nqVPjv/7rv/a73Iz43+d2LlmyJC6//PI0JgMAACDT2OAEIKOsXbs2
OnToEGvXrk3LvPr168eTTz4Z\nJ554YlrmAQAAkFlscAKQUe66667Ytm1b2uZt3bo1rr766rTNAwAAILPY4AQgY5SVlcXhhx8eH3/8\ncVrn1q9fP15++eU45phj0joXAACA5NngBCBjLFmyJLZs2ZL2ubt27Yonn3wy7XMBAABInoITgIzx\n8ssvV8rc7du3x7PPPlspswEAAEiWghOAjPHWW2/Fpk2bKmV2SUlJpcwFAAAgWQpOADJGOi8X+mfb\nt2+vtNkAAAAkR8EJQMZo0qRJ1K5du1JmN2zYsFLmAgAAkCwFJwAZo0ePHpGfn18ps/v27VspcwEA\nAEiWghOAjNGnT58oLS1N+9z8/PwYOHBg2ucCAACQPAUnABmjZcuW0a1bt7TP3bVrV4wcOTLtcwEA\nAEieghOAjPLDH/4wGjRokLZ5derUiW9+85vRpEmTtM0EAAAgc+SUl5eXJx0CAD5TVlYW/fr1i1de\neSV27dp1wPMaNGgQb731Vhx++OFpSAcAAECmscEJQEapVatWTJw4MerXr3/As/Ly8uLuu+9WbgIA\nAFRjCk4AMk779u1j2rRpkZeXt98z8vLy4uqrr47zzjsvjckAAADINI6oA5Cx/vKXv8SIESNiw4YN\nsXWJjvv4AAAEhElEQVTr1r16T+3ataNu3bpx8803x8UXX1zJCQEAAEiaDU4AMtbxxx8fy5Yti299\n61tRr169L93orFOnTtSrVy8KCgritddeU24CAADUEDY4AcgK69atiz/84Q8xZcqUWLRoUXz66acR\nEVG/fv045phjYsiQIXHhhRdGx44dE04KAABAVVJwApCVysvLo7y8PGrVchgBAACgJlNwAgAAAABZ\ny9oLAAAAAJC1FJwAAAAAQNZScAIAAAAAWUvBCQAAAABkLQUnAAAAAJC1FJwAAAAAQNZScAIAAAAA\nWUvBCQAAAABkLQUnAAAAAJC1FJwAAAAAQNZScAIAAAAAWUvBCQAAAABkLQUnAAAAAJC1FJwAAAAA\nQNZScAIAAAAAWUvBCQAAAABkLQUnAAAAAJC1FJwAAAAAQNZScAIAAAAAWUvBCQAAAABkLQUnAAAA\nAJC1FJwAAAAAQNZScAIAAAAAWUvBCQAAAABkLQUnAAAAAJC1FJwAAAAAQNZScAIAAAAAWUvBCQAA\nAABkLQUnAAAAAJC1FJwAAAAAQNZScAIAAAAAWUvBCQAAAABkLQUnAAAAAJC1FJwAAAAAQNZScAIA\nAAAAWUvBCQAAAABkLQUnAAAAAJC1FJwAAAAAQNZScAIAAAAAWUvBCQAAAABkLQUnAAAAAJC1FJwA\nAAAAQNZScAIAAAAAWUvBCQAAAABkLQUnAAAAAJC1FJwAAAAAQNZScAIAAAAAWUvBCQAAAABkLQUn\nAAAAAJC1FJwAAAAAQNZScAIAAAAAWUvBCQAAAABkLQUnAAAAAJC1FJwAAAAAQNZScAIAAAAAWUvB\nCQAAAABkLQUnAAAAAJC1FJwAAAAAQNZScAIAAAAAWUvBCQAAAABkLQUnAAAAAJC1FJwAAAAAQNZS\ncAIAAAAAWUvBCQAAAABkLQUnAAAAAJC1FJwAAAAAQNZScAIAAAAAWUvBCQAAAABkLQUnAAAAAJC1\nFJwAAAAAQNZScAIAAAAAWUvBCQAAAABkLQUnAAAAAJC1FJwAAAAAQNZScAIAAAAAWUvBCQAAAABk\nLQUnAAAAAJC1FJwAAAAAQNZScAIAAAAAWUvBCfD/tWMHJAAAAACC/r9uR6AzBAAAALYEJwAAAACw\nJTgBAAAAgC3BCQAAAABsCU4AAAAAYEtwAgAAAABbghMAAAAA2BKcAAAAAMCW4AQAAAAAtgQnAAAA\nALAlOAEAAACALcEJAAAAAGwJTgAAAABgS3ACAAAAAFuCEwAAAADYEpwAAAAAwJbgBAAAAAC2BCcA\nAAAAsCU4AQAAAIAtwQkAAAAA
bAlOAAAAAGBLcAIAAAAAW4ITAAAAANgSnAAAAADAluAEAAAAALYE\nJwAAAACwJTgBAAAAgC3BCQAAAABsCU4AAAAAYEtwAgAAAABbghMAAAAA2BKcAAAAAMCW4AQAAAAA\ntgQnAAAAALAlOAEAAACALcEJAAAAAGwJTgAAAABgS3ACAAAAAFuCEwAAAADYEpwAAAAAwJbgBAAA\nAAC2BCcAAAAAsCU4AQAAAIAtwQkAAAAAbAV+Oilx9KZ6ggAAAABJRU5ErkJggg==\n", + "application/vnd.jupyter.widget-view+json": { + "model_id": "1882dd95ddd0465c8ec91d93a8a7224f", + "version_major": 2, + "version_minor": 0 + }, "text/plain": [ - "" + "interactive(children=(IntSlider(value=0, description='iteration', max=20), Output()), _dom_classes=('widget-in…" ] }, "metadata": {}, "output_type": "display_data" }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "Widget Javascript not detected. It may not be installed or enabled properly.\n" - ] - }, { "data": { "application/vnd.jupyter.widget-view+json": { - "model_id": "f9d8fbb23a9f446585fac31b107eb123" - } + "model_id": "3967e7c0226d434e8c08c7f4a59e2b2a", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + "interactive(children=(ToggleButton(value=False, description='Visualize'), ToggleButtons(description='Extra Del…" + ] }, "metadata": {}, "output_type": "display_data" @@ -896,7 +2806,7 @@ "\n", "visualize_callback = make_visualize(iteration_slider)\n", "\n", - "visualize_button = widgets.ToggleButton(desctiption = \"Visualize\", value = False)\n", + "visualize_button = widgets.ToggleButton(description = \"Visualize\", value = False)\n", "time_select = widgets.ToggleButtons(description='Extra Delay:',options=['0', '0.1', '0.2', '0.5', '0.7', '1.0'])\n", "\n", "a = widgets.interactive(visualize_callback, Visualize = visualize_button, time_step=time_select)\n", @@ -909,33 +2819,25 @@ "source": [ "## N-QUEENS VISUALIZATION\n", "\n", - "Just like the Graph Coloring Problem we will start with defining a few helper functions to help us visualize the assignments as they evolve over time. 
The **make_plot_board_step_function** behaves similar to the **make_update_step_function** introduced earlier. It initializes a chess board in the form of a 2D grid with alternating 0s and 1s. This is used by **plot_board_step** function which draws the board using matplotlib and adds queens to it. This function also calls the **label_queen_conflicts** which modifies the grid placing 3 in positions in a position where there is a conflict."
+    "Just like the Graph Coloring Problem, we will start by defining a few helper functions to help us visualize the assignments as they evolve over time. The **make_plot_board_step_function** behaves similarly to the **make_update_step_function** introduced earlier. It initializes a chess board in the form of a 2D grid with alternating 0s and 1s. This grid is used by the **plot_board_step** function, which draws the board using matplotlib and adds queens to it. This function also calls **label_queen_conflicts**, which modifies the grid, placing a 3 in any position where there is a conflict."
   ]
  },
  {
   "cell_type": "code",
-   "execution_count": 26,
-   "metadata": {
-    "collapsed": true
-   },
+   "execution_count": 47,
+   "metadata": {},
   "outputs": [],
   "source": [
    "def label_queen_conflicts(assignment,grid):\n",
    "    ''' Mark grid with queens that are under conflict. 
'''\n",
    "    for col, row in assignment.items(): # check each queen for conflict\n",
-    "        row_conflicts = {temp_col:temp_row for temp_col,temp_row in assignment.items() \n",
-    "                           if temp_row == row and temp_col != col}\n",
-    "        up_conflicts = {temp_col:temp_row for temp_col,temp_row in assignment.items() \n",
-    "                           if temp_row+temp_col == row+col and temp_col != col}\n",
-    "        down_conflicts = {temp_col:temp_row for temp_col,temp_row in assignment.items() \n",
-    "                           if temp_row-temp_col == row-col and temp_col != col}\n",
+    "        conflicts = {temp_col:temp_row for temp_col,temp_row in assignment.items() \n",
+    "                     if (temp_row == row and temp_col != col)\n",
+    "                     or (temp_row+temp_col == row+col and temp_col != col)\n",
+    "                     or (temp_row-temp_col == row-col and temp_col != col)}\n",
    "        \n",
-    "        # Now marking the grid.\n",
-    "        for col, row in row_conflicts.items():\n",
-    "            grid[col][row] = 3\n",
-    "        for col, row in up_conflicts.items():\n",
-    "            grid[col][row] = 3\n",
-    "        for col, row in down_conflicts.items():\n",
+    "        # Place a 3 in positions where there is a conflict\n",
+    "        for col, row in conflicts.items():\n",
    "            grid[col][row] = 3\n",
    "\n",
    "    return grid\n",
@@ -979,15 +2881,13 @@
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-    "Now let us visualize a solution obtained via backtracking. We use of the previosuly defined **make_instru** function for keeping a history of steps."
+    "Now let us visualize a solution obtained via backtracking. We make use of the previously defined **make_instru** function for keeping a history of steps."
]
  },
  {
   "cell_type": "code",
-   "execution_count": 27,
-   "metadata": {
-    "collapsed": true
-   },
+   "execution_count": 48,
+   "metadata": {},
   "outputs": [],
   "source": [
    "twelve_queens_csp = NQueensCSP(12)\n",
@@ -997,10 +2897,8 @@
  },
  {
   "cell_type": "code",
-   "execution_count": 28,
-   "metadata": {
-    "collapsed": true
-   },
+   "execution_count": 49,
+   "metadata": {},
   "outputs": [],
   "source": [
    "backtrack_queen_step = make_plot_board_step_function(backtracking_instru_queen) # Step Function for Widgets"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-    "Now finally we set some matplotlib parameters to adjust how our plot will look. The font is necessary because the Black Queen Unicode character is not a part of all fonts. You can move the slider to experiment and observe the how queens are assigned. It is also possible to move the slider using arrow keys or to jump to the value by directly editing the number with a double click.The **Visualize Button** will automatically animate the slider for you. The **Extra Delay Box** allows you to set time delay in seconds upto one second for each time step.\n"
+    "Finally, we set some matplotlib parameters to adjust how our plot will look. The font is necessary because the Black Queen Unicode character is not a part of all fonts. You can move the slider to experiment and observe how the queens are assigned. It is also possible to move the slider using arrow keys or to jump to the value by directly editing the number with a double click. The **Visualize Button** will automatically animate the slider for you. The **Extra Delay Box** allows you to set a time delay of up to one second for each time step."
] }, { "cell_type": "code", - "execution_count": 29, + "execution_count": 50, "metadata": {}, "outputs": [ { "data": { - "image/png": "iVBORw0KGgoAAAANSUhEUgAAAcgAAAHICAYAAADKoXrqAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAADS1JREFUeJzt3X+s3Xddx/H3ub0EcG3XIbNb13a7c2WBEu0ipugdGw7Y\nJiJXwBglzsSwoP6hy4KJidH9wz9qjCZLFgyJE1EIMAZchoQEXDR4cez3j24r2+ytZZWBMaa9t/f2\ndrf36x+3vfOmr5wfzf1yjvHx+OcmJ5/evvP+55nPOd97b6dpmgIA1hsb9gAAMIoEEgACgQSAQCAB\nIBBIAAgEEgACgQSAQCABIBBIAAjGBzk8PTM7Ur92Z2pyYtgjrDM9MzvsEc5hR93ZT2921J399DZq\nO6qqTj+H3CABIBBIAAh+6IF86eiRevBfvlGLC/M/7P8aAPo20GeQg/qv/3ypFubnatfEnqqq+t6L\nh+v233xPnVxcqDe8aV/92ce/UFVVp5aW6sjsc7VrYk+9+tWvaXMkAOhLazfIRx/45/rwL19Xv3vL\nTXXPJ++qqqqjRw7VycWFqqp68d9fqNPLy/XyqaX6yIfeW79/61R95EPvrVNLS22NBAB9ay2QTz7y\nrTp9ermqqh6eub+qqt7yszfUB275naqq+uidn65N4+P10tEj9d3Dz1dV1YuHX6j/eHH0nsAC4P+f\nDQ1k0zRrN8Sff9+v1959+6uq6n0f/PDamZ2XX1VVVbvPvO2684qrau++/TW2aVP93M3vr8uvvLqq\nyk0SgKHasED+4KWj9du/8vb64M0/Wfd88q7avmNXffTOT9XY2Pr/YvHE3OrXMyHtdDp1weYt9dbr\nbqrb/ujP6+TiQv3Bb32gfvVde+uuP/3DjRoPAAayYYF88Jtfr+9/77u1cvp0fe2Ln1r95mNjdcHm\nrXXg8W+vnVtcOFFVtXbTXFlZqWeeeKgu2bGrqqoOPvVIfefpx2plZaW+ft9nPO0KwFBsWCCv2X9d\nbXvd66uq6sapX1t7fcvWbXXgsXMDuXQmkIdfeLbm547V9ktXA7nnTfvq4u07amxsrK6/6ZfqtT+y\neaNGBIC+bVggL9t9Zd39pQfqp37m7fXjV7957fUtF15URw59p+aPH6uqqoUzN8Kzb7EeeOyBqqra\nftlqIJtmpebnjteffOzzdfsf/8VGjQcAA9nQh3TGxsbqrdffVF/4+79ae23z1gtX30Z98qGq+t9v\nsa5+PXu73L5jd1VVffkzf11bt72u3rB330aOBgAD2fAf8/jpyXfUwaceqYMHHq2qqi1bL6qqV0J4\n9jPFk4sLa58/jm3aVBdv31Hzx4/VP9z7t3XtO35ho8cCgIFseCC3XfT6unrvNXXv332sqqq2XLit\nqqqePvOgzuKJVwJ5+N8O1vzcsfrRiy+p8fFX1Zc/d3ctnJiva294z0aPBQADaeUXBex/24318Lfu\nryOHnlu7Qc6+8GwtnJhb9xTr2c8fL9mxu+bnjtdXPv+J2nnFVTWx541tjAUAfWsnkNe9q5qmqS9+\n+uO1eeuFVVW1cvp0PfPEQ+s+gzz7tuuPXbqz7vvc3bUwP1fX3uDtVQCGr5VAXrrzitp1xZ765jfu\nq6WTi2uvH3j8wVeeYl04Uc88/mBVrf4oyFfu+URVVb3tnb/YxkgAMJDWfhfrm6/ZX8vLL9f9X713\n7bWnH/v22kM6B596tObnVn/046GZf6wT88fr4u076rLdV7Y1EgD0rbU/d7VpfPVbn/1F5FVVh557\nuppmpaqqDjz+wNrrR48cOvNvXtXWOAAwkFb/HuQbf+It9e7339LX2eXl5frs39zZ5jgA0Ld
WA7n8\n8qk6fuy/+zp79k9jAcAoaDWQzz/7ZD3/7JN9n7/ksstbnAYA+tfaQzoA8H+ZQAJA0OpbrNffOFW3\n3/GXfZ09tbRUv/cbN7c5DgD0rdVA/us/fa2eeHim7/Ovee0FLU4DAP1rLZC33nZH3XrbHW19ewBo\nlc8gASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgKDTNM0g5wc63Lbpmdlhj7DO1OTEsEc4hx11\nZz+92VF39tPbCO6o0885N0gACAQSAAKBBIBAIAEgEEgACAQSAAKBBIBAIAEgEEgACAQSAAKBBIBA\nIAEgEEgACAQSAAKBBIBAIAEgEEgACAQSAAKBBIBAIAEgEEgACAQSAAKBBIBAIAEgEEgACAQSAAKB\nBIBAIAEgEEgACAQSAAKBBIBAIAEgEEgACAQSAAKBBIBgfJDD0zOzbc1xXqYmJ4Y9wjqjtp8qO+rF\nfnqzo+7sp7dR21G/3CABIBBIAAgEEgACgQSAQCABIBBIAAgEEgACgQSAQCABIBBIAAgEEgACgQSA\nQCABIBBIAAgEEgACgQSAQCABIBBIAAgEEgACgQSAQCABIBBIAAgEEgACgQSAQCABIBBIAAgEEgAC\ngQSAQCABIBBIAAgEEgACgQSAQCABIBBIAAgEEgACgQSAoNM0zSDnBzrctumZ2WGPsM7U5MSwRziH\nHXVnP73ZUXf209sI7qjTzzk3SAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEE\ngEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIA\nAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgGB8kMPTM7NtzXFepiYnhj3COqO2nyo7\n6sV+erOj7uynt1HbUb/cIAEgEEgACAQSAAKBBIBAIAEgEEgACAQSAAKBBIBAIAEgEEgACAQSAAKB\nBIBAIAEgEEgACAQSAAKBBIBAIAEgEEgACAQSAAKBBIBAIAEgEEgACAQSAAKBBIBAIAEgEEgACAQS\nAAKBBIBAIAEgEEgACAQSAAKBBIBAIAEgEEgACAQSAIJO0zSDnB/ocNumZ2aHPcI6U5MTwx7hHHbU\nnf30Zkfd2U9vI7ijTj/n3CABIBBIAAgEEgACgQSAQCABIBBIAAgEEgACgQSAQCABIBBIAAgEEgAC\ngQSAQCABIBBIAAgEEgACgQSAQCABIBBIAAgEEgACgQSAQCABIBBIAAgEEgACgQSAQCABIBBIAAgE\nEgACgQSAQCABIBBIAAgEEgACgQSAQCABIBBIAAgEEgCC8UEOT8/MtjXHeZmanBj2COuM2n6q7KgX\n++nNjrqzn95GbUf9coMEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIA\nAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAI\nBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAgk7TNIOcH+hw26ZnZoc9wjpTkxPDHuEc\ndtSd/fRmR93ZT28juKNOP+fcIAEgEEgACAQSAAKBBIBAIAEgEEgACAQSAAKBBIBAIAEgEEgACAQS\nAAKBBIBAIAEgEEgACAQSAAKBBIBAIAEgEEgACAQSAAKBBIBAIAEgEEgACAQSAAKBBIBAIAEgEEgA\nCAQSAAKBBIBAIAEgEEgACAQSAAKBBIBAIAEgEEgACAQSAILxQQ5Pz8y2Ncd5mZqcGPYI64zafqrs\nqBf76c2OurOf3kZtR/1ygwSAQCABIBBIAAgEEgACgQSAQCABIBBIAAgEEgACgQSAQCABIBBIAAgE\nEgACgQSAQCABIBBIAAgEEgACgQSAQCABIBBIAAgEEgA
CgQSAQCABIBBIAAgEEgACgQSAQCABIBBI\nAAgEEgACgQSAQCABIBBIAAgEEgACgQSAQCABIBBIAAg6TdMMcn6gw22bnpkd9gjrTE1ODHuEc9hR\nd/bTmx11Zz+9jeCOOv2cc4MEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAI\nBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQ\nSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAIxgc5PD0z29Yc52VqcmLYI6wzavupsqNe\n7Kc3O+rOfnobtR31yw0SAAKBBIBAIAEgEEgACAQSAAKBBIBAIAEgEEgACAQSAAKBBIBAIAEgEEgA\nCAQSAAKBBIBAIAEgEEgACAQSAAKBBIBAIAEgEEgACAQSAAKBBIBAIAEgEEgACAQSAAKBBIBAIAEg\nEEgACAQSAAKBBIBAIAEgEEgACAQSAAKBBIBAIAEgEEgACDpN0wxyfqDDbZuemR32COtMTU4Me4Rz\n2FF39tObHXVnP72N4I46/ZxzgwSAQCABIBBIAAgEEgACgQSAQCABIBBIAAgEEgACgQSAQCABIBBI\nAAgEEgACgQSAQCABIBBIAAgEEgACgQSAQCABIBBIAAgEEgACgQSAQCABIBBIAAgEEgACgQSAQCAB\nIBBIAAgEEgACgQSAQCABIBBIAAgEEgACgQSAQCABIBBIAAg6TdMMewYAGDlukAAQCCQABAIJAIFA\nAkAgkAAQCCQABAIJAIFAAkAgkAAQCCQABP8DCNiNomYWeDEAAAAASUVORK5CYII=\n", + "application/vnd.jupyter.widget-view+json": { + "model_id": "582e8f9b8d2e4a31aa7d45de68fd5b7c", + "version_major": 2, + "version_minor": 0 + }, "text/plain": [ - "" + "interactive(children=(IntSlider(value=0, description='iteration', max=473, step=0), Output()), _dom_classes=('…" ] }, "metadata": {}, "output_type": "display_data" }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "Widget Javascript not detected. 
It may not be installed or enabled properly.\n" - ] }, { "data": { "application/vnd.jupyter.widget-view+json": { - "model_id": "e0cf790018f34082961a812b9bc7eb81" - } + "model_id": "bb0f50b970764cb4bbebeb69cd4fbd19", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + "interactive(children=(ToggleButton(value=False, description='Visualize'), ToggleButtons(description='Extra Del…" + ] }, "metadata": {}, "output_type": "display_data" @@ -1055,7 +2955,7 @@ "\n", "visualize_callback = make_visualize(iteration_slider)\n", "\n", - "visualize_button = widgets.ToggleButton(desctiption = \"Visualize\", value = False)\n", + "visualize_button = widgets.ToggleButton(description = \"Visualize\", value = False)\n", "time_select = widgets.ToggleButtons(description='Extra Delay:',options=['0', '0.1', '0.2', '0.5', '0.7', '1.0'])\n", "\n", "a = widgets.interactive(visualize_callback, Visualize = visualize_button, time_step=time_select)\n", @@ -1071,10 +2971,8 @@ }, { "cell_type": "code", - "execution_count": 30, - "metadata": { - "collapsed": true - }, + "execution_count": 51, + "metadata": {}, "outputs": [], "source": [ "conflicts_instru_queen = make_instru(twelve_queens_csp)\n", @@ -1083,10 +2981,8 @@ }, { "cell_type": "code", - "execution_count": 31, - "metadata": { - "collapsed": true - }, + "execution_count": 52, + "metadata": {}, "outputs": [], "source": [ "conflicts_step = make_plot_board_step_function(conflicts_instru_queen)" @@ -1096,36 +2992,38 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The visualization has same features as the above. But here it also highlights the conflicts by labeling the conflicted queens with a red background." + "This visualization has the same features as the one above; however, this one also highlights the conflicts by labeling the conflicted queens with a red background."
] }, { "cell_type": "code", - "execution_count": 32, + "execution_count": 53, "metadata": {}, "outputs": [ { "data": { - "image/png": "iVBORw0KGgoAAAANSUhEUgAAAcgAAAHICAYAAADKoXrqAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAADStJREFUeJzt3V1s3Xd9x/HvcYx4aJKmQJc2TdK6a6ggaEs1prC5tKyM\ntjDAPAltaJ00UbHtYqsqJk2att5ws03TJlWqmJDWMTYQUAozBYQEVJvAUPr8kLahLXEWmtFtmqbE\njh2njv+7SOLuKB+dh0j2Oep5vW4sHf0sff29eev3P8d2q2maAgDajQ16AAAYRgIJAIFAAkAgkAAQ\nCCQABAIJAIFAAkAgkAAQCCQABOP9HJ6emR2qP7szNTkx6BHaTM/MDnqEs9hRZ/bTnR11Zj/dDduO\nqqrVyyE3SAAIBBIAAoEEoM0Lhw/V/d//Ti0uzA96lIHq6z1IAF5e/ue/X6iF+bnaMbGrqqp+9vzB\nuvV331PHFxfqDW/aU3/16a9UVdWJpaU6NPtM7ZjYVa985asGOfK6cYMEGFEP3/dv9fEPX1N/eNMN\ndddn76iqqsOHDtTxxYWqqnr+35+rk8vL9eKJpfrEx95Xf3zzVH3iY++rE0tLgxx73QgkwIh6/KEf\n1MmTy1VV9eDMvVVV9ZZfva4+dNMfVFXVJ2//fG0YH68XDh+qnx58tqqqnj/4XP3H88P3Sdm1IJAA\nI6RpmtUb4rs+8Nu1e8/eqqr6wEc/vnpm+6VXVFXVztOPXbdfdkXt3rO3xjZsqF+78YN16eVXVlW9\n7G+SAgkwIv7rhcP1+x95e330xl+suz57R23dtqM+efvnamysPQWLx+ZOfT0d0larVedt3FRvveaG\nuuXP/rqOLy7Un/zeh+o337m77vjLP133n2O9CCTAiLj/e9+u//zZT2vl5Mn61lc/V1VVY2Njdd7G\nzbXv0R+tnltcOFZVtXrTXFlZqacee6Au2rajqqr2P/FQ/fjJR2plZaW+fc8XXrafdhVIgBFx1d5r\nastrX19VVddP/dbq65s2b6l9j5wdyKXTgTz43NM1P3ektl58KpC73rSnLty6rcbGxuraG95fr37N\nxvX6EdaVQAKMiEt2Xl53/st99Uu/8vb6+SvfvPr6pvMvqEMHflzzR49UVdXC6RvhmUes+x65r6qq\ntl5yKpBNs1Lzc0frLz715br1z/9mPX+EdSWQACNkbGys3nrtDfWVf/671dc2bj7/1GPUxx+oqv//\niPXU1zO3y63bdlZV1de+8Pe1ectr6w2796zn6OtOIAFGzC9PvqP2P/FQ7d/3cFVVbdp8QVW9FMIz\n7ykeX1xYff9xbMOGunDrtpo/eqS+cfc/1tXv+I3BDL+OBBJgxGy54PV15e6r6u5/+lRVVW06f0tV\nVT15+oM6i8deCuTBn+yv+bkj9boLL6rx8VfU1750Zy0cm6+rr3vPYIZfRwIJMIL2vu36evAH99ah\nA8+s3iBnn3u6Fo7NtX2K9cz7jxdt21nzc0fr61/+TG2/7Iqa2PXGgc2+XgQSYATtvead1TRNffXz\nn66Nm8+vqqqVkyfrqcceaHsP8sxj15+7eHvd86U7a2F+rq6+7uX/eLVKIAFG0sXbL6sdl+2q733n\nnlo6vrj6+r5H73/pU6wLx+qpR++vqlO/CvL1uz5TVVVv+/X3rvu8gyCQACPqzVftreXlF+veb969\n+tqTj/xo9UM6+594uObnTv3qxwMz361j80frwq3b6pKdlw9k3vXm310BjKgN46cScOYPkVdVHXjm\nyWqalaqq2vfofauvHz504PT3vGIdJxwsgQQYYW/8hbfUuz94U09nl5eX64v/cPsaTzQ8BBJ
ghC2/\neKKOHvnfns6e+ddYo0IgAUbYs08/Xs8+/XjP5y+65NI1nGa4+JAOAAQCCQCBR6wAI+za66fq1tv+\ntqezJ5aW6o9+58Y1nmh4CCTACPvhv36rHntwpufzr3r1eWs4zXARSIARdfMtt9XNt9w26DGGlvcg\nASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgKDVNE0/5/s6vNamZ2YHPUKbqcmJQY9wFjvqzH66\ns6PO7Ke7IdxRq5dzbpAAEAgkAAQCCQCBQAJAIJAAEAgkAAQCCQCBQAJAIJAAEAgkAAQCCQCBQAJA\nIJAAEAgkAAQCCQCBQAJAIJAAEAgkAAQCCQCBQAJAIJAAEAgkAAQCCQCBQAJAIJAAEAgkAAQCCQCB\nQAJAIJAAEAgkAAQCCQCBQAJAIJAAEAgkAAQCCQDBeD+Hp2dm12qOczI1OTHoEdoM236q7Kgb++nO\njjqzn+6GbUe9coMEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEE\ngEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIA\nAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAglbTNP2c7+vwWpuemR30CG2mJicGPcJZ7Kgz\n++nOjjqzn+6GcEetXs65QQJAIJAAEAgkAAQCCQCBQAJAIJAAEAgkAAQCCQCBQAJAIJAAEAgkAAQC\nCQCBQAJAIJAAEAgkAAQCCQCBQAJAIJAAEAgkAAQCCQCBQAJAIJAAEAgkAAQCCQCBQAJAIJAAEAgk\nAAQCCQCBQAJAIJAAEAgkAAQCCQCBQAJAIJAAEAgkAATj/RyenpldqznOydTkxKBHaDNs+6myo27s\npzs76sx+uhu2HfXKDRIAAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAI\nBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQ\nSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAgASBoNU3Tz/m+Dq+16ZnZQY/QZmpyYtAjnMWOOrOf\n7uyoM/vpbgh31OrlnBskAAQCCQCBQAJAIJAAEAgkAAQCCQCBQAJAIJAAEAgkAAQCCQCBQAJAIJAA\nEAgkAAQCCQCBQAJAIJAAEAgkAAQCCQCBQAJAIJAAEAgkAAQCCQCBQAJAIJAAEAgkAAQCCQCBQAJA\nIJAAEAgkAAQCCQCBQAJAIJAAEAgkAAQCCQCBQAJAMN7P4emZ2bWa45xMTU4MeoQ2w7afKjvqxn66\ns6PO7Ke7YdtRr9wgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAg\nASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEE\ngEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgKDVNE0/5/s6vNamZ2YHPUKbqcmJQY9wFjvq\nzH66s6PO7Ke7IdxRq5dzbpAAEAgkAAQCCQCBQAJAIJAAEAgkAAQCCQCBQAJAIJAAEAgkAAQCCQCB\nQAJAIJAAEAgkAAQCCQCBQAJAIJAAEAgkAAQCCQCBQAJAIJAAEAgkAAQCCQCBQAJAIJAAEAgkAAQC\nCQCBQAJAIJAAEAgkAAQCCQCBQAJAIJAAEAgkAAQCCQDBeD+Hp2dm12qOczI1OTHoEdoM236q7Kgb\n++nOjjqzn+6GbUe9coMEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIA\nAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAAoE
EgEAgASAQSAAIBBIAAoEEgEAgASAQSAAI\nBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAIWk3T9HO+r8NrbXpmdtAjtJmanBj0CGexo87s\npzs76sx+uhvCHbV6OecGCQCBQAJAIJAAEAgkAAQCCQCBQAJAIJAAEAgkAAQCCQCBQAJAIJAAEAgk\nAAQCCQCBQAJAIJAAEAgkAAQCCQCBQAJAIJAAEAgkAAQCCQCBQAJAIJAAEAgkAAQCCQCBQAJAIJAA\nEAgkAAQCCQCBQAJAIJAAEAgkAAQCCQCBQAJAIJAAEIz3c3h6Znat5jgnU5MTgx6hzbDtp8qOurGf\n7uyoM/vpbth21Cs3SAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQ\nSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAg\nASAQSAAIBBIAAoEEgEAgASAQSAAIBBIAAoEEgEAgASBoNU3Tz/m+Dq+16ZnZQY/QZmpyYtAjnMWO\nOrOf7uyoM/vpbgh31OrlnBskAAQCCQCBQAJAIJAAEAgkAAQCCQCBQAJAIJAAEAgkAAQCCQCBQAJA\nIJAAEAgkAAQCCQCBQAJAIJAAEAgkAAQCCQCBQAJAIJAAEAgkAAQCCQCBQAJAIJAAEAgkAAQCCQCB\nQAJAIJAAEAgkAAQCCQCBQAJAIJAAEAgkAAQCCQCBQAJA0GqaZtAzAMDQcYMEgEAgASAQSAAIBBIA\nAoEEgEAgASAQSAAIBBIAAoEEgEAgASD4Pz4ojaLlZaEKAAAAAElFTkSuQmCC\n", + "application/vnd.jupyter.widget-view+json": { + "model_id": "409c4961f6e04fbea5d07a01cb1797ea", + "version_major": 2, + "version_minor": 0 + }, "text/plain": [ - "" + "interactive(children=(IntSlider(value=0, description='iteration', max=27, step=0), Output()), _dom_classes=('w…" ] }, "metadata": {}, "output_type": "display_data" }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "Widget Javascript not detected. 
It may not be installed or enabled properly.\n" - ] - }, { "data": { "application/vnd.jupyter.widget-view+json": { - "model_id": "a61406396a92432d9f8f40c6f7a52d3e" - } + "model_id": "a55b1b50a9a44085a484b357aa26b50f", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + "interactive(children=(ToggleButton(value=False, description='Visualize'), ToggleButtons(description='Extra Del…" + ] }, "metadata": {}, "output_type": "display_data" @@ -1138,12 +3036,19 @@ "\n", "visualize_callback = make_visualize(iteration_slider)\n", "\n", - "visualize_button = widgets.ToggleButton(desctiption = \"Visualize\", value = False)\n", + "visualize_button = widgets.ToggleButton(description = \"Visualize\", value = False)\n", "time_select = widgets.ToggleButtons(description='Extra Delay:',options=['0', '0.1', '0.2', '0.5', '0.7', '1.0'])\n", "\n", "a = widgets.interactive(visualize_callback, Visualize = visualize_button, time_step=time_select)\n", "display(a)" ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] } ], "metadata": { @@ -1162,13 +3067,18 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.5.3" - }, - "widgets": { - "state": {}, - "version": "1.1.1" + "version": "3.6.8" + }, + "pycharm": { + "stem_cell": { + "cell_type": "raw", + "source": [], + "metadata": { + "collapsed": false + } + } } }, "nbformat": 4, "nbformat_minor": 1 -} +} \ No newline at end of file diff --git a/csp.py b/csp.py index 9e933c266..46ae07dd5 100644 --- a/csp.py +++ b/csp.py @@ -1,14 +1,17 @@ -"""CSP (Constraint Satisfaction Problems) problems and solvers. (Chapter 6).""" - -from utils import argmin_random_tie, count, first -import search - -from collections import defaultdict -from functools import reduce +"""CSP (Constraint Satisfaction Problems) problems and solvers. 
(Chapter 6)""" import itertools -import re import random +import re +import string +from collections import defaultdict, Counter +from functools import reduce +from operator import eq, neg + +from sortedcontainers import SortedSet + +import search +from utils import argmin_random_tie, count, first, extend class CSP(search.Problem): @@ -24,9 +27,9 @@ class CSP(search.Problem): In the textbook and in most mathematical definitions, the constraints are specified as explicit pairs of allowable values, but the formulation here is easier to express and more compact for - most cases. (For example, the n-Queens problem can be represented - in O(n) space using this notation, instead of O(N^4) for the - explicit representation.) In terms of describing the CSP as a + most cases (for example, the n-Queens problem can be represented + in O(n) space using this notation, instead of O(n^4) for the + explicit representation). In terms of describing the CSP as a problem, that's all there is. However, the class also supports data structures and methods that help you @@ -50,13 +53,12 @@ class CSP(search.Problem): def __init__(self, variables, domains, neighbors, constraints): """Construct a CSP problem. 
If variables is empty, it becomes domains.keys().""" + super().__init__(()) variables = variables or list(domains.keys()) - self.variables = variables self.domains = domains self.neighbors = neighbors self.constraints = constraints - self.initial = () self.curr_domains = None self.nassigns = 0 @@ -74,21 +76,22 @@ def unassign(self, var, assignment): def nconflicts(self, var, val, assignment): """Return the number of conflicts var=val has with other variables.""" + # Subclasses may implement this more efficiently def conflict(var2): - return (var2 in assignment and - not self.constraints(var, val, var2, assignment[var2])) + return var2 in assignment and not self.constraints(var, val, var2, assignment[var2]) + return count(conflict(v) for v in self.neighbors[var]) def display(self, assignment): """Show a human-readable representation of the CSP.""" # Subclasses can print in a prettier way, or display with a GUI - print('CSP:', self, 'with assignment:', assignment) + print(assignment) # These methods are for the tree and graph-search interface: def actions(self, state): - """Return a list of applicable actions: nonconflicting + """Return a list of applicable actions: non-conflicting assignments to an unassigned variable.""" if len(state) == len(self.variables): return [] @@ -153,35 +156,186 @@ def conflicted_vars(self, current): return [var for var in self.variables if self.nconflicts(var, current[var], current) > 0] + # ______________________________________________________________________________ -# Constraint Propagation with AC-3 +# Constraint Propagation with AC3 + + +def no_arc_heuristic(csp, queue): + return queue + +def dom_j_up(csp, queue): + return SortedSet(queue, key=lambda t: neg(len(csp.curr_domains[t[1]]))) -def AC3(csp, queue=None, removals=None): + +def AC3(csp, queue=None, removals=None, arc_heuristic=dom_j_up): """[Figure 6.3]""" if queue is None: - queue = [(Xi, Xk) for Xi in csp.variables for Xk in csp.neighbors[Xi]] + queue = {(Xi, Xk) for Xi in 
csp.variables for Xk in csp.neighbors[Xi]} csp.support_pruning() + queue = arc_heuristic(csp, queue) + checks = 0 while queue: (Xi, Xj) = queue.pop() - if revise(csp, Xi, Xj, removals): + revised, checks = revise(csp, Xi, Xj, removals, checks) + if revised: if not csp.curr_domains[Xi]: - return False + return False, checks # CSP is inconsistent for Xk in csp.neighbors[Xi]: - if Xk != Xi: - queue.append((Xk, Xi)) - return True + if Xk != Xj: + queue.add((Xk, Xi)) + return True, checks # CSP is satisfiable -def revise(csp, Xi, Xj, removals): +def revise(csp, Xi, Xj, removals, checks=0): """Return true if we remove a value.""" revised = False for x in csp.curr_domains[Xi][:]: # If Xi=x conflicts with Xj=y for every possible y, eliminate Xi=x - if all(not csp.constraints(Xi, x, Xj, y) for y in csp.curr_domains[Xj]): + # if all(not csp.constraints(Xi, x, Xj, y) for y in csp.curr_domains[Xj]): + conflict = True + for y in csp.curr_domains[Xj]: + if csp.constraints(Xi, x, Xj, y): + conflict = False + checks += 1 + if not conflict: + break + if conflict: csp.prune(Xi, x, removals) revised = True - return revised + return revised, checks + + +# Constraint Propagation with AC3b: an improved version +# of AC3 with double-support domain-heuristic + +def AC3b(csp, queue=None, removals=None, arc_heuristic=dom_j_up): + if queue is None: + queue = {(Xi, Xk) for Xi in csp.variables for Xk in csp.neighbors[Xi]} + csp.support_pruning() + queue = arc_heuristic(csp, queue) + checks = 0 + while queue: + (Xi, Xj) = queue.pop() + # Si_p values are all known to be supported by Xj + # Sj_p values are all known to be supported by Xi + # Dj - Sj_p = Sj_u values are unknown, as yet, to be supported by Xi + Si_p, Sj_p, Sj_u, checks = partition(csp, Xi, Xj, checks) + if not Si_p: + return False, checks # CSP is inconsistent + revised = False + for x in set(csp.curr_domains[Xi]) - Si_p: + csp.prune(Xi, x, removals) + revised = True + if revised: + for Xk in csp.neighbors[Xi]: + if Xk != Xj: + 
queue.add((Xk, Xi)) + if (Xj, Xi) in queue: + if isinstance(queue, set): + # or queue -= {(Xj, Xi)} or queue.remove((Xj, Xi)) + queue.difference_update({(Xj, Xi)}) + else: + queue.remove((Xj, Xi)) + # the elements in D_j which are supported by Xi are given by the union of Sj_p with the set of those + # elements of Sj_u which further processing will show to be supported by some vi_p in Si_p + for vj_p in Sj_u: + for vi_p in Si_p: + conflict = True + if csp.constraints(Xj, vj_p, Xi, vi_p): + conflict = False + Sj_p.add(vj_p) + checks += 1 + if not conflict: + break + revised = False + for x in set(csp.curr_domains[Xj]) - Sj_p: + csp.prune(Xj, x, removals) + revised = True + if revised: + for Xk in csp.neighbors[Xj]: + if Xk != Xi: + queue.add((Xk, Xj)) + return True, checks # CSP is satisfiable + + +def partition(csp, Xi, Xj, checks=0): + Si_p = set() + Sj_p = set() + Sj_u = set(csp.curr_domains[Xj]) + for vi_u in csp.curr_domains[Xi]: + conflict = True + # now, in order to establish support for a value vi_u in Di it seems better to try to find a support among + # the values in Sj_u first, because for each vj_u in Sj_u the check (vi_u, vj_u) is a double-support check + # and it is just as likely that any vj_u in Sj_u supports vi_u as it is that any vj_p in Sj_p does... + for vj_u in Sj_u - Sj_p: + # double-support check + if csp.constraints(Xi, vi_u, Xj, vj_u): + conflict = False + Si_p.add(vi_u) + Sj_p.add(vj_u) + checks += 1 + if not conflict: + break + # ... 
and only if no support can be found among the elements in Sj_u, should the elements vj_p in Sj_p be used + # for single-support checks (vi_u, vj_p) + if conflict: + for vj_p in Sj_p: + # single-support check + if csp.constraints(Xi, vi_u, Xj, vj_p): + conflict = False + Si_p.add(vi_u) + checks += 1 + if not conflict: + break + return Si_p, Sj_p, Sj_u - Sj_p, checks + + +# Constraint Propagation with AC4 + +def AC4(csp, queue=None, removals=None, arc_heuristic=dom_j_up): + if queue is None: + queue = {(Xi, Xk) for Xi in csp.variables for Xk in csp.neighbors[Xi]} + csp.support_pruning() + queue = arc_heuristic(csp, queue) + support_counter = Counter() + variable_value_pairs_supported = defaultdict(set) + unsupported_variable_value_pairs = [] + checks = 0 + # construction and initialization of support sets + while queue: + (Xi, Xj) = queue.pop() + revised = False + for x in csp.curr_domains[Xi][:]: + for y in csp.curr_domains[Xj]: + if csp.constraints(Xi, x, Xj, y): + support_counter[(Xi, x, Xj)] += 1 + variable_value_pairs_supported[(Xj, y)].add((Xi, x)) + checks += 1 + if support_counter[(Xi, x, Xj)] == 0: + csp.prune(Xi, x, removals) + revised = True + unsupported_variable_value_pairs.append((Xi, x)) + if revised: + if not csp.curr_domains[Xi]: + return False, checks # CSP is inconsistent + # propagation of removed values + while unsupported_variable_value_pairs: + Xj, y = unsupported_variable_value_pairs.pop() + for Xi, x in variable_value_pairs_supported[(Xj, y)]: + revised = False + if x in csp.curr_domains[Xi][:]: + support_counter[(Xi, x, Xj)] -= 1 + if support_counter[(Xi, x, Xj)] == 0: + csp.prune(Xi, x, removals) + revised = True + unsupported_variable_value_pairs.append((Xi, x)) + if revised: + if not csp.curr_domains[Xi]: + return False, checks # CSP is inconsistent + return True, checks # CSP is satisfiable + # ______________________________________________________________________________ # CSP Backtracking Search @@ -196,17 +350,16 @@ def 
first_unassigned_variable(assignment, csp): def mrv(assignment, csp): """Minimum-remaining-values heuristic.""" - return argmin_random_tie( - [v for v in csp.variables if v not in assignment], - key=lambda var: num_legal_values(csp, var, assignment)) + return argmin_random_tie([v for v in csp.variables if v not in assignment], + key=lambda var: num_legal_values(csp, var, assignment)) def num_legal_values(csp, var, assignment): if csp.curr_domains: return len(csp.curr_domains[var]) else: - return count(csp.nconflicts(var, val, assignment) == 0 - for val in csp.domains[var]) + return count(csp.nconflicts(var, val, assignment) == 0 for val in csp.domains[var]) + # Value ordering @@ -218,8 +371,8 @@ def unordered_domain_values(var, assignment, csp): def lcv(var, assignment, csp): """Least-constraining-values heuristic.""" - return sorted(csp.choices(var), - key=lambda val: csp.nconflicts(var, val, assignment)) + return sorted(csp.choices(var), key=lambda val: csp.nconflicts(var, val, assignment)) + # Inference @@ -230,6 +383,7 @@ def no_inference(csp, var, value, assignment, removals): def forward_checking(csp, var, value, assignment, removals): """Prune neighbor values inconsistent with var=value.""" + csp.support_pruning() for B in csp.neighbors[var]: if B not in assignment: for b in csp.curr_domains[B][:]: @@ -240,17 +394,16 @@ def forward_checking(csp, var, value, assignment, removals): return True -def mac(csp, var, value, assignment, removals): +def mac(csp, var, value, assignment, removals, constraint_propagation=AC3b): """Maintain arc consistency.""" - return AC3(csp, [(X, var) for X in csp.neighbors[var]], removals) + return constraint_propagation(csp, {(X, var) for X in csp.neighbors[var]}, removals) + # The search, proper -def backtracking_search(csp, - select_unassigned_variable=first_unassigned_variable, - order_domain_values=unordered_domain_values, - inference=no_inference): +def backtracking_search(csp, 
select_unassigned_variable=first_unassigned_variable, + order_domain_values=unordered_domain_values, inference=no_inference): """[Figure 6.5]""" def backtrack(assignment): @@ -273,12 +426,13 @@ def backtrack(assignment): assert result is None or csp.goal_test(result) return result + # ______________________________________________________________________________ -# Min-conflicts hillclimbing search for CSPs +# Min-conflicts Hill Climbing search for CSPs def min_conflicts(csp, max_steps=100000): - """Solve a CSP by stochastic hillclimbing on the number of conflicts.""" + """Solve a CSP by stochastic Hill Climbing on the number of conflicts.""" # Generate a complete assignment for all variables (probably with conflicts) csp.current = current = {} for var in csp.variables: @@ -298,8 +452,8 @@ def min_conflicts(csp, max_steps=100000): def min_conflicts_value(csp, var, current): """Return the value that will give var the least number of conflicts. If there is a tie, choose at random.""" - return argmin_random_tie(csp.domains[var], - key=lambda val: csp.nconflicts(var, val, current)) + return argmin_random_tie(csp.domains[var], key=lambda val: csp.nconflicts(var, val, current)) + # ______________________________________________________________________________ @@ -351,11 +505,11 @@ def topological_sort(X, root): def build_topological(node, parent, neighbors, visited, stack, parents): - """Builds the topological sort and the parents of each node in the graph""" + """Build the topological sort and the parents of each node in the graph.""" visited[node] = True for n in neighbors[node]: - if(not visited[n]): + if not visited[n]: build_topological(n, node, neighbors, visited, stack, parents) parents[node] = parent @@ -365,15 +519,15 @@ def build_topological(node, parent, neighbors, visited, stack, parents): def make_arc_consistent(Xj, Xk, csp): """Make arc between parent (Xj) and child (Xk) consistent under the csp's constraints, by removing the possible values of Xj that 
cause inconsistencies.""" - #csp.curr_domains[Xj] = [] + # csp.curr_domains[Xj] = [] for val1 in csp.domains[Xj]: - keep = False # Keep or remove val1 + keep = False # Keep or remove val1 for val2 in csp.domains[Xk]: if csp.constraints(Xj, val1, Xk, val2): # Found a consistent assignment for val1, keep it keep = True break - + if not keep: # Remove val1 csp.prune(Xj, val1, None) @@ -392,8 +546,9 @@ def assign_value(Xj, Xk, csp, assignment): # No consistent assignment available return None + # ______________________________________________________________________________ -# Map-Coloring Problems +# Map Coloring CSP Problems class UniversalDict: @@ -423,11 +578,10 @@ def MapColoringCSP(colors, neighbors): specified as a string of the form defined by parse_neighbors.""" if isinstance(neighbors, str): neighbors = parse_neighbors(neighbors) - return CSP(list(neighbors.keys()), UniversalDict(colors), neighbors, - different_values_constraint) + return CSP(list(neighbors.keys()), UniversalDict(colors), neighbors, different_values_constraint) -def parse_neighbors(neighbors, variables=[]): +def parse_neighbors(neighbors): """Convert a string of the form 'X: Y Z; Y: Z' into a dict mapping regions to neighbors. 
The syntax is a region name followed by a ':' followed by zero or more region names, followed by ';', repeated for @@ -445,27 +599,27 @@ def parse_neighbors(neighbors, variables=[]): return dic -australia = MapColoringCSP(list('RGB'), - 'SA: WA NT Q NSW V; NT: WA Q; NSW: Q V; T: ') - -usa = MapColoringCSP(list('RGBY'), - """WA: OR ID; OR: ID NV CA; CA: NV AZ; NV: ID UT AZ; ID: MT WY UT; - UT: WY CO AZ; MT: ND SD WY; WY: SD NE CO; CO: NE KA OK NM; NM: OK TX; - ND: MN SD; SD: MN IA NE; NE: IA MO KA; KA: MO OK; OK: MO AR TX; - TX: AR LA; MN: WI IA; IA: WI IL MO; MO: IL KY TN AR; AR: MS TN LA; - LA: MS; WI: MI IL; IL: IN KY; IN: OH KY; MS: TN AL; AL: TN GA FL; - MI: OH IN; OH: PA WV KY; KY: WV VA TN; TN: VA NC GA; GA: NC SC FL; - PA: NY NJ DE MD WV; WV: MD VA; VA: MD DC NC; NC: SC; NY: VT MA CT NJ; - NJ: DE; DE: MD; MD: DC; VT: NH MA; MA: NH RI CT; CT: RI; ME: NH; - HI: ; AK: """) - -france = MapColoringCSP(list('RGBY'), - """AL: LO FC; AQ: MP LI PC; AU: LI CE BO RA LR MP; BO: CE IF CA FC RA - AU; BR: NB PL; CA: IF PI LO FC BO; CE: PL NB NH IF BO AU LI PC; FC: BO - CA LO AL RA; IF: NH PI CA BO CE; LI: PC CE AU MP AQ; LO: CA AL FC; LR: - MP AU RA PA; MP: AQ LI AU LR; NB: NH CE PL BR; NH: PI IF CE NB; NO: - PI; PA: LR RA; PC: PL CE LI AQ; PI: NH NO CA IF; PL: BR NB CE PC; RA: - AU BO FC PA LR""") +australia_csp = MapColoringCSP(list('RGB'), """SA: WA NT Q NSW V; NT: WA Q; NSW: Q V; T: """) + +usa_csp = MapColoringCSP(list('RGBY'), + """WA: OR ID; OR: ID NV CA; CA: NV AZ; NV: ID UT AZ; ID: MT WY UT; + UT: WY CO AZ; MT: ND SD WY; WY: SD NE CO; CO: NE KA OK NM; NM: OK TX AZ; + ND: MN SD; SD: MN IA NE; NE: IA MO KA; KA: MO OK; OK: MO AR TX; + TX: AR LA; MN: WI IA; IA: WI IL MO; MO: IL KY TN AR; AR: MS TN LA; + LA: MS; WI: MI IL; IL: IN KY; IN: OH KY; MS: TN AL; AL: TN GA FL; + MI: OH IN; OH: PA WV KY; KY: WV VA TN; TN: VA NC GA; GA: NC SC FL; + PA: NY NJ DE MD WV; WV: MD VA; VA: MD DC NC; NC: SC; NY: VT MA CT NJ; + NJ: DE; DE: MD; MD: DC; VT: NH MA; MA: NH RI CT; CT: RI; ME: 
NH; + HI: ; AK: """) + +france_csp = MapColoringCSP(list('RGBY'), + """AL: LO FC; AQ: MP LI PC; AU: LI CE BO RA LR MP; BO: CE IF CA FC RA + AU; BR: NB PL; CA: IF PI LO FC BO; CE: PL NB NH IF BO AU LI PC; FC: BO + CA LO AL RA; IF: NH PI CA BO CE; LI: PC CE AU MP AQ; LO: CA AL FC; LR: + MP AU RA PA; MP: AQ LI AU LR; NB: NH CE PL BR; NH: PI IF CE NB; NO: + PI; PA: LR RA; PC: PL CE LI AQ; PI: NH NO CA IF; PL: BR NB CE PC; RA: + AU BO FC PA LR""") + # ______________________________________________________________________________ # n-Queens Problem @@ -478,12 +632,13 @@ def queen_constraint(A, a, B, b): class NQueensCSP(CSP): - """Make a CSP for the nQueens problem for search with min_conflicts. + """ + Make a CSP for the nQueens problem for search with min_conflicts. Suitable for large n, it uses only data structures of size O(n). Think of placing queens one per column, from left to right. That means position (x, y) represents (var, val) in the CSP. The main structures are three arrays to count queens that could conflict: - rows[i] Number of queens in the ith row (i.e val == i) + rows[i] Number of queens in the ith row (i.e. val == i) downs[i] Number of queens in the \ diagonal such that their (x, y) coordinates sum to i ups[i] Number of queens in the / diagonal @@ -502,26 +657,26 @@ def __init__(self, n): CSP.__init__(self, list(range(n)), UniversalDict(list(range(n))), UniversalDict(list(range(n))), queen_constraint) - self.rows = [0]*n - self.ups = [0]*(2*n - 1) - self.downs = [0]*(2*n - 1) + self.rows = [0] * n + self.ups = [0] * (2 * n - 1) + self.downs = [0] * (2 * n - 1) def nconflicts(self, var, val, assignment): """The number of conflicts, as recorded with each assignment. Count conflicts in row and in up, down diagonals. 
If there is a queen there, it can't conflict with itself, so subtract 3.""" n = len(self.variables) - c = self.rows[val] + self.downs[var+val] + self.ups[var-val+n-1] + c = self.rows[val] + self.downs[var + val] + self.ups[var - val + n - 1] if assignment.get(var, None) == val: c -= 3 return c def assign(self, var, val, assignment): """Assign var, and keep track of conflicts.""" - oldval = assignment.get(var, None) - if val != oldval: - if oldval is not None: # Remove old val if there was one - self.record_conflict(assignment, var, oldval, -1) + old_val = assignment.get(var, None) + if val != old_val: + if old_val is not None: # Remove old val if there was one + self.record_conflict(assignment, var, old_val, -1) self.record_conflict(assignment, var, val, +1) CSP.assign(self, var, val, assignment) @@ -559,6 +714,7 @@ def display(self, assignment): print(str(self.nconflicts(var, val, assignment)) + ch, end=' ') print() + # ______________________________________________________________________________ # Sudoku @@ -584,7 +740,8 @@ def flatten(seqs): class Sudoku(CSP): - """A Sudoku problem. + """ + A Sudoku problem. The box grid is a 3x3 array of boxes, each a 3x3 array of cells. Each cell holds a digit in 1..9. In each box, all digits are different; the same for each row and column as a 9x9 grid. @@ -601,8 +758,9 @@ class Sudoku(CSP): . . 2 | 6 . 9 | 5 . . 8 . . | 2 . 3 | . . 9 . . 5 | . 1 . | 3 . . - >>> AC3(e); e.display(e.infer_assignment()) - True + >>> AC3(e) # doctest: +ELLIPSIS + (True, ...) 
+ >>> e.display(e.infer_assignment()) 4 8 3 | 9 2 1 | 6 5 7 9 6 7 | 3 4 5 | 8 2 1 2 5 1 | 8 7 6 | 4 9 3 @@ -617,7 +775,7 @@ class Sudoku(CSP): >>> h = Sudoku(harder1) >>> backtracking_search(h, select_unassigned_variable=mrv, inference=forward_checking) is not None True - """ # noqa + """ R3 = _R3 Cell = _CELL @@ -645,9 +803,12 @@ def show_cell(cell): return str(assignment.get(cell, '.')) def abut(lines1, lines2): return list( map(' | '.join, list(zip(lines1, lines2)))) + print('\n------+-------+------\n'.join( '\n'.join(reduce( abut, map(show_box, brow))) for brow in self.bgrid)) + + # ______________________________________________________________________________ # The Zebra Puzzle @@ -669,7 +830,7 @@ def Zebra(): Spaniard: Dog; Kools: Yellow; Chesterfields: Fox; Norwegian: Blue; Winston: Snails; LuckyStrike: OJ; Ukranian: Tea; Japanese: Parliaments; Kools: Horse; - Coffee: Green; Green: Ivory""", variables) + Coffee: Green; Green: Ivory""") for type in [Colors, Pets, Drinks, Countries, Smokes]: for A in type: for B in type: @@ -715,6 +876,7 @@ def zebra_constraint(A, a, B, b, recurse=0): (A in Smokes and B in Smokes)): return not same raise Exception('error') + return CSP(variables, domains, neighbors, zebra_constraint) @@ -728,3 +890,546 @@ def solve_zebra(algorithm=min_conflicts, **args): print(var, end=' ') print() return ans['Zebra'], ans['Water'], z.nassigns, ans + + +# ______________________________________________________________________________ +# n-ary Constraint Satisfaction Problem + +class NaryCSP: + """ + A nary-CSP consists of: + domains : a dictionary that maps each variable to its domain + constraints : a list of constraints + variables : a set of variables + var_to_const: a variable to set of constraints dictionary + """ + + def __init__(self, domains, constraints): + """Domains is a variable:domain dictionary + constraints is a list of constraints + """ + self.variables = set(domains) + self.domains = domains + self.constraints = constraints + 
self.var_to_const = {var: set() for var in self.variables} + for con in constraints: + for var in con.scope: + self.var_to_const[var].add(con) + + def __str__(self): + """String representation of CSP""" + return str(self.domains) + + def display(self, assignment=None): + """More detailed string representation of CSP""" + if assignment is None: + assignment = {} + print(assignment) + + def consistent(self, assignment): + """assignment is a variable:value dictionary + returns True if all of the constraints that can be evaluated + evaluate to True given assignment. + """ + return all(con.holds(assignment) + for con in self.constraints + if all(v in assignment for v in con.scope)) + + +class Constraint: + """ + A Constraint consists of: + scope : a tuple of variables + condition: a function that can be applied to a tuple of values + for the variables. + """ + + def __init__(self, scope, condition): + self.scope = scope + self.condition = condition + + def __repr__(self): + return self.condition.__name__ + str(self.scope) + + def holds(self, assignment): + """Returns the value of this Constraint evaluated in assignment. 
+ + precondition: all variables are assigned in assignment + """ + return self.condition(*tuple(assignment[v] for v in self.scope)) + + +def all_diff_constraint(*values): + """Returns True if all values are different, False otherwise""" + return len(values) == len(set(values)) + + +def is_word_constraint(words): + """Returns True if the letters concatenated form a word in words, False otherwise""" + + def isw(*letters): + return "".join(letters) in words + + return isw + + +def meet_at_constraint(p1, p2): + """Returns a function that is True when the words meet at the positions (p1, p2), False otherwise""" + + def meets(w1, w2): + return w1[p1] == w2[p2] + + meets.__name__ = "meet_at(" + str(p1) + ',' + str(p2) + ')' + return meets + + +def adjacent_constraint(x, y): + """Returns True if x and y are adjacent numbers, False otherwise""" + return abs(x - y) == 1 + + +def sum_constraint(n): + """Returns a function that is True when the sum of all values is n, False otherwise""" + + def sumv(*values): + return sum(values) == n + + sumv.__name__ = str(n) + "==sum" + return sumv + + +def is_constraint(val): + """Returns a function that is True when x is equal to val, False otherwise""" + + def isv(x): + return val == x + + isv.__name__ = str(val) + "==" + return isv + + +def ne_constraint(val): + """Returns a function that is True when x is not equal to val, False otherwise""" + + def nev(x): + return val != x + + nev.__name__ = str(val) + "!=" + return nev + + +def no_heuristic(to_do): + return to_do + + +def sat_up(to_do): + return SortedSet(to_do, key=lambda t: 1 / len([var for var in t[1].scope])) + + +class ACSolver: + """Solves a CSP with arc consistency and domain splitting""" + + def __init__(self, csp): + """a CSP solver that uses arc consistency + * csp is the CSP to be solved + """ + self.csp = csp + + def GAC(self, orig_domains=None, to_do=None, arc_heuristic=sat_up): + """ + Makes this CSP arc-consistent using Generalized Arc Consistency + orig_domains: 
is the original domains + to_do : is a set of (variable,constraint) pairs + returns the reduced domains (an arc-consistent variable:domain dictionary) + """ + if orig_domains is None: + orig_domains = self.csp.domains + if to_do is None: + to_do = {(var, const) for const in self.csp.constraints for var in const.scope} + else: + to_do = to_do.copy() + domains = orig_domains.copy() + to_do = arc_heuristic(to_do) + checks = 0 + while to_do: + var, const = to_do.pop() + other_vars = [ov for ov in const.scope if ov != var] + new_domain = set() + if len(other_vars) == 0: + for val in domains[var]: + if const.holds({var: val}): + new_domain.add(val) + checks += 1 + # new_domain = {val for val in domains[var] + # if const.holds({var: val})} + elif len(other_vars) == 1: + other = other_vars[0] + for val in domains[var]: + for other_val in domains[other]: + checks += 1 + if const.holds({var: val, other: other_val}): + new_domain.add(val) + break + # new_domain = {val for val in domains[var] + # if any(const.holds({var: val, other: other_val}) + # for other_val in domains[other])} + else: # general case + for val in domains[var]: + holds, checks = self.any_holds(domains, const, {var: val}, other_vars, checks=checks) + if holds: + new_domain.add(val) + # new_domain = {val for val in domains[var] + # if self.any_holds(domains, const, {var: val}, other_vars)} + if new_domain != domains[var]: + domains[var] = new_domain + if not new_domain: + return False, domains, checks + add_to_do = self.new_to_do(var, const).difference(to_do) + to_do |= add_to_do + return True, domains, checks + + def new_to_do(self, var, const): + """ + Returns new elements to be added to to_do after assigning + variable var in constraint const. 
+ """ + return {(nvar, nconst) for nconst in self.csp.var_to_const[var] + if nconst != const + for nvar in nconst.scope + if nvar != var} + + def any_holds(self, domains, const, env, other_vars, ind=0, checks=0): + """ + Returns True if Constraint const holds for an assignment + that extends env with the variables in other_vars[ind:] + env is a dictionary + Warning: this has side effects and changes the elements of env + """ + if ind == len(other_vars): + return const.holds(env), checks + 1 + else: + var = other_vars[ind] + for val in domains[var]: + # env = dict_union(env, {var:val}) # no side effects + env[var] = val + holds, checks = self.any_holds(domains, const, env, other_vars, ind + 1, checks) + if holds: + return True, checks + return False, checks + + def domain_splitting(self, domains=None, to_do=None, arc_heuristic=sat_up): + """ + Return a solution to the current CSP or False if there are no solutions + to_do is the list of arcs to check + """ + if domains is None: + domains = self.csp.domains + consistency, new_domains, _ = self.GAC(domains, to_do, arc_heuristic) + if not consistency: + return False + elif all(len(new_domains[var]) == 1 for var in domains): + return {var: first(new_domains[var]) for var in domains} + else: + var = first(x for x in self.csp.variables if len(new_domains[x]) > 1) + if var: + dom1, dom2 = partition_domain(new_domains[var]) + new_doms1 = extend(new_domains, var, dom1) + new_doms2 = extend(new_domains, var, dom2) + to_do = self.new_to_do(var, None) + return self.domain_splitting(new_doms1, to_do, arc_heuristic) or \ + self.domain_splitting(new_doms2, to_do, arc_heuristic) + + +def partition_domain(dom): + """Partitions domain dom into two""" + split = len(dom) // 2 + dom1 = set(list(dom)[:split]) + dom2 = dom - dom1 + return dom1, dom2 + + +class ACSearchSolver(search.Problem): + """A search problem with arc consistency and domain splitting + A node is a CSP""" + + def __init__(self, csp, arc_heuristic=sat_up): + self.cons = 
ACSolver(csp) + consistency, self.domains, _ = self.cons.GAC(arc_heuristic=arc_heuristic) + if not consistency: + raise Exception('CSP is inconsistent') + self.heuristic = arc_heuristic + super().__init__(self.domains) + + def goal_test(self, node): + """Node is a goal if all domains have 1 element""" + return all(len(node[var]) == 1 for var in node) + + def actions(self, state): + var = first(x for x in state if len(state[x]) > 1) + neighs = [] + if var: + dom1, dom2 = partition_domain(state[var]) + to_do = self.cons.new_to_do(var, None) + for dom in [dom1, dom2]: + new_domains = extend(state, var, dom) + consistency, cons_doms, _ = self.cons.GAC(new_domains, to_do, self.heuristic) + if consistency: + neighs.append(cons_doms) + return neighs + + def result(self, state, action): + return action + + +def ac_solver(csp, arc_heuristic=sat_up): + """Arc consistency (domain splitting interface)""" + return ACSolver(csp).domain_splitting(arc_heuristic=arc_heuristic) + + +def ac_search_solver(csp, arc_heuristic=sat_up): + """Arc consistency (search interface)""" + from search import depth_first_tree_search + solution = None + try: + solution = depth_first_tree_search(ACSearchSolver(csp, arc_heuristic=arc_heuristic)).state + except Exception: # no solution found, or the CSP is inconsistent + return solution + if solution: + return {var: first(solution[var]) for var in solution} + + +# ______________________________________________________________________________ +# Crossword Problem + + +csp_crossword = NaryCSP({'one_across': {'ant', 'big', 'bus', 'car', 'has'}, + 'one_down': {'book', 'buys', 'hold', 'lane', 'year'}, + 'two_down': {'ginger', 'search', 'symbol', 'syntax'}, + 'three_across': {'book', 'buys', 'hold', 'land', 'year'}, + 'four_across': {'ant', 'big', 'bus', 'car', 'has'}}, + [Constraint(('one_across', 'one_down'), meet_at_constraint(0, 0)), + Constraint(('one_across', 'two_down'), meet_at_constraint(2, 0)), + Constraint(('three_across', 'two_down'), meet_at_constraint(2, 2)), + Constraint(('three_across', 'one_down'), 
meet_at_constraint(0, 2)), + Constraint(('four_across', 'two_down'), meet_at_constraint(0, 4))]) + +crossword1 = [['_', '_', '_', '*', '*'], + ['_', '*', '_', '*', '*'], + ['_', '_', '_', '_', '*'], + ['_', '*', '_', '*', '*'], + ['*', '*', '_', '_', '_'], + ['*', '*', '_', '*', '*']] + +words1 = {'ant', 'big', 'bus', 'car', 'has', 'book', 'buys', 'hold', + 'lane', 'year', 'ginger', 'search', 'symbol', 'syntax'} + + +class Crossword(NaryCSP): + + def __init__(self, puzzle, words): + domains = {} + constraints = [] + for i, line in enumerate(puzzle): + scope = [] + for j, element in enumerate(line): + if element == '_': + var = "p" + str(j) + str(i) + domains[var] = list(string.ascii_lowercase) + scope.append(var) + else: + if len(scope) > 1: + constraints.append(Constraint(tuple(scope), is_word_constraint(words))) + scope.clear() + if len(scope) > 1: + constraints.append(Constraint(tuple(scope), is_word_constraint(words))) + puzzle_t = list(map(list, zip(*puzzle))) + for i, line in enumerate(puzzle_t): + scope = [] + for j, element in enumerate(line): + if element == '_': + scope.append("p" + str(i) + str(j)) + else: + if len(scope) > 1: + constraints.append(Constraint(tuple(scope), is_word_constraint(words))) + scope.clear() + if len(scope) > 1: + constraints.append(Constraint(tuple(scope), is_word_constraint(words))) + super().__init__(domains, constraints) + self.puzzle = puzzle + + def display(self, assignment=None): + for i, line in enumerate(self.puzzle): + puzzle = "" + for j, element in enumerate(line): + if element == '*': + puzzle += "[*] " + else: + var = "p" + str(j) + str(i) + if assignment is not None: + if isinstance(assignment[var], set) and len(assignment[var]) == 1: + puzzle += "[" + str(first(assignment[var])).upper() + "] " + elif isinstance(assignment[var], str): + puzzle += "[" + str(assignment[var]).upper() + "] " + else: + puzzle += "[_] " + else: + puzzle += "[_] " + print(puzzle) + + +# 
______________________________________________________________________________ +# Kakuro Problem + + +# difficulty 0 +kakuro1 = [['*', '*', '*', [6, ''], [3, '']], + ['*', [4, ''], [3, 3], '_', '_'], + [['', 10], '_', '_', '_', '_'], + [['', 3], '_', '_', '*', '*']] + +# difficulty 0 +kakuro2 = [ + ['*', [10, ''], [13, ''], '*'], + [['', 3], '_', '_', [13, '']], + [['', 12], '_', '_', '_'], + [['', 21], '_', '_', '_']] + +# difficulty 1 +kakuro3 = [ + ['*', [17, ''], [28, ''], '*', [42, ''], [22, '']], + [['', 9], '_', '_', [31, 14], '_', '_'], + [['', 20], '_', '_', '_', '_', '_'], + ['*', ['', 30], '_', '_', '_', '_'], + ['*', [22, 24], '_', '_', '_', '*'], + [['', 25], '_', '_', '_', '_', [11, '']], + [['', 20], '_', '_', '_', '_', '_'], + [['', 14], '_', '_', ['', 17], '_', '_']] + +# difficulty 2 +kakuro4 = [ + ['*', '*', '*', '*', '*', [4, ''], [24, ''], [11, ''], '*', '*', '*', [11, ''], [17, ''], '*', '*'], + ['*', '*', '*', [17, ''], [11, 12], '_', '_', '_', '*', '*', [24, 10], '_', '_', [11, ''], '*'], + ['*', [4, ''], [16, 26], '_', '_', '_', '_', '_', '*', ['', 20], '_', '_', '_', '_', [16, '']], + [['', 20], '_', '_', '_', '_', [24, 13], '_', '_', [16, ''], ['', 12], '_', '_', [23, 10], '_', '_'], + [['', 10], '_', '_', [24, 12], '_', '_', [16, 5], '_', '_', [16, 30], '_', '_', '_', '_', '_'], + ['*', '*', [3, 26], '_', '_', '_', '_', ['', 12], '_', '_', [4, ''], [16, 14], '_', '_', '*'], + ['*', ['', 8], '_', '_', ['', 15], '_', '_', [34, 26], '_', '_', '_', '_', '_', '*', '*'], + ['*', ['', 11], '_', '_', [3, ''], [17, ''], ['', 14], '_', '_', ['', 8], '_', '_', [7, ''], [17, ''], '*'], + ['*', '*', '*', [23, 10], '_', '_', [3, 9], '_', '_', [4, ''], [23, ''], ['', 13], '_', '_', '*'], + ['*', '*', [10, 26], '_', '_', '_', '_', '_', ['', 7], '_', '_', [30, 9], '_', '_', '*'], + ['*', [17, 11], '_', '_', [11, ''], [24, 8], '_', '_', [11, 21], '_', '_', '_', '_', [16, ''], [17, '']], + [['', 29], '_', '_', '_', '_', '_', ['', 7], '_', '_', [23, 14], 
'_', '_', [3, 17], '_', '_'], + [['', 10], '_', '_', [3, 10], '_', '_', '*', ['', 8], '_', '_', [4, 25], '_', '_', '_', '_'], + ['*', ['', 16], '_', '_', '_', '_', '*', ['', 23], '_', '_', '_', '_', '_', '*', '*'], + ['*', '*', ['', 6], '_', '_', '*', '*', ['', 15], '_', '_', '_', '*', '*', '*', '*']] + + +class Kakuro(NaryCSP): + + def __init__(self, puzzle): + variables = [] + for i, line in enumerate(puzzle): + # print line + for j, element in enumerate(line): + if element == '_': + var1 = str(i) + if len(var1) == 1: + var1 = "0" + var1 + var2 = str(j) + if len(var2) == 1: + var2 = "0" + var2 + variables.append("X" + var1 + var2) + domains = {} + for var in variables: + domains[var] = set(range(1, 10)) + constraints = [] + for i, line in enumerate(puzzle): + for j, element in enumerate(line): + if element != '_' and element != '*': + # down - column + if element[0] != '': + x = [] + for k in range(i + 1, len(puzzle)): + if puzzle[k][j] != '_': + break + var1 = str(k) + if len(var1) == 1: + var1 = "0" + var1 + var2 = str(j) + if len(var2) == 1: + var2 = "0" + var2 + x.append("X" + var1 + var2) + constraints.append(Constraint(x, sum_constraint(element[0]))) + constraints.append(Constraint(x, all_diff_constraint)) + # right - line + if element[1] != '': + x = [] + for k in range(j + 1, len(puzzle[i])): + if puzzle[i][k] != '_': + break + var1 = str(i) + if len(var1) == 1: + var1 = "0" + var1 + var2 = str(k) + if len(var2) == 1: + var2 = "0" + var2 + x.append("X" + var1 + var2) + constraints.append(Constraint(x, sum_constraint(element[1]))) + constraints.append(Constraint(x, all_diff_constraint)) + super().__init__(domains, constraints) + self.puzzle = puzzle + + def display(self, assignment=None): + for i, line in enumerate(self.puzzle): + puzzle = "" + for j, element in enumerate(line): + if element == '*': + puzzle += "[*]\t" + elif element == '_': + var1 = str(i) + if len(var1) == 1: + var1 = "0" + var1 + var2 = str(j) + if len(var2) == 1: + var2 = "0" + var2 + 
var = "X" + var1 + var2 + if assignment is not None: + if isinstance(assignment[var], set) and len(assignment[var]) == 1: + puzzle += "[" + str(first(assignment[var])) + "]\t" + elif isinstance(assignment[var], int): + puzzle += "[" + str(assignment[var]) + "]\t" + else: + puzzle += "[_]\t" + else: + puzzle += "[_]\t" + else: + puzzle += str(element[0]) + "\\" + str(element[1]) + "\t" + print(puzzle) + + +# ______________________________________________________________________________ +# Cryptarithmetic Problem + +# [Figure 6.2] +# T W O + T W O = F O U R +two_two_four = NaryCSP({'T': set(range(1, 10)), 'F': set(range(1, 10)), + 'W': set(range(0, 10)), 'O': set(range(0, 10)), 'U': set(range(0, 10)), 'R': set(range(0, 10)), + 'C1': set(range(0, 2)), 'C2': set(range(0, 2)), 'C3': set(range(0, 2))}, + [Constraint(('T', 'F', 'W', 'O', 'U', 'R'), all_diff_constraint), + Constraint(('O', 'R', 'C1'), lambda o, r, c1: o + o == r + 10 * c1), + Constraint(('W', 'U', 'C1', 'C2'), lambda w, u, c1, c2: c1 + w + w == u + 10 * c2), + Constraint(('T', 'O', 'C2', 'C3'), lambda t, o, c2, c3: c2 + t + t == o + 10 * c3), + Constraint(('F', 'C3'), eq)]) + +# S E N D + M O R E = M O N E Y +send_more_money = NaryCSP({'S': set(range(1, 10)), 'M': set(range(1, 10)), + 'E': set(range(0, 10)), 'N': set(range(0, 10)), 'D': set(range(0, 10)), + 'O': set(range(0, 10)), 'R': set(range(0, 10)), 'Y': set(range(0, 10)), + 'C1': set(range(0, 2)), 'C2': set(range(0, 2)), 'C3': set(range(0, 2)), + 'C4': set(range(0, 2))}, + [Constraint(('S', 'E', 'N', 'D', 'M', 'O', 'R', 'Y'), all_diff_constraint), + Constraint(('D', 'E', 'Y', 'C1'), lambda d, e, y, c1: d + e == y + 10 * c1), + Constraint(('N', 'R', 'E', 'C1', 'C2'), lambda n, r, e, c1, c2: c1 + n + r == e + 10 * c2), + Constraint(('E', 'O', 'N', 'C2', 'C3'), lambda e, o, n, c2, c3: c2 + e + o == n + 10 * c3), + Constraint(('S', 'M', 'O', 'C3', 'C4'), lambda s, m, o, c3, c4: c3 + s + m == o + 10 * c4), + Constraint(('M', 'C4'), eq)]) diff --git 
a/deep_learning4e.py b/deep_learning4e.py new file mode 100644 index 000000000..9f5b0a8f7 --- /dev/null +++ b/deep_learning4e.py @@ -0,0 +1,584 @@ +"""Deep learning. (Chapter 20)""" + +import random +import statistics + +import numpy as np +from keras import Sequential, optimizers +from keras.layers import Embedding, SimpleRNN, Dense +from keras.preprocessing import sequence + +from utils4e import (conv1D, gaussian_kernel, element_wise_product, vector_add, random_weights, + scalar_vector_product, map_vector, mean_squared_error_loss) + + +class Node: + """ + A single unit of a layer in a neural network + :param weights: weights between parent nodes and current node + :param value: value of current node + """ + + def __init__(self, weights=None, value=None): + self.value = value + self.weights = weights or [] + + +class Layer: + """ + A layer in a neural network based on a computational graph. + :param size: number of units in the current layer + """ + + def __init__(self, size): + self.nodes = np.array([Node() for _ in range(size)]) + + def forward(self, inputs): + """Define the operation to get the output of this layer""" + raise NotImplementedError + + +class Activation: + + def function(self, x): + raise NotImplementedError + + def derivative(self, x): + raise NotImplementedError + + def __call__(self, x): + return self.function(x) + + +class Sigmoid(Activation): + + def function(self, x): + return 1 / (1 + np.exp(-x)) + + def derivative(self, value): + return value * (1 - value) + + +class ReLU(Activation): + + def function(self, x): + return max(0, x) + + def derivative(self, value): + return 1 if value > 0 else 0 + + +class ELU(Activation): + + def __init__(self, alpha=0.01): + self.alpha = alpha + + def function(self, x): + return x if x > 0 else self.alpha * (np.exp(x) - 1) + + def derivative(self, value): + return 1 if value > 0 else self.alpha * np.exp(value) + + +class LeakyReLU(Activation): + + def __init__(self, alpha=0.01): + self.alpha = alpha + + 
def function(self, x): + return max(x, self.alpha * x) + + def derivative(self, value): + return 1 if value > 0 else self.alpha + + +class Tanh(Activation): + + def function(self, x): + return np.tanh(x) + + def derivative(self, value): + return 1 - (value ** 2) + + +class SoftMax(Activation): + + def function(self, x): + return np.exp(x) / np.sum(np.exp(x)) + + def derivative(self, x): + return np.ones_like(x) + + +class SoftPlus(Activation): + + def function(self, x): + return np.log(1. + np.exp(x)) + + def derivative(self, x): + return 1. / (1. + np.exp(-x)) + + +class Linear(Activation): + + def function(self, x): + return x + + def derivative(self, x): + return np.ones_like(x) + + +class InputLayer(Layer): + """1D input layer. Layer size is the same as input vector size.""" + + def __init__(self, size=3): + super().__init__(size) + + def forward(self, inputs): + """Take each value of the inputs to each unit in the layer.""" + assert len(self.nodes) == len(inputs) + for node, inp in zip(self.nodes, inputs): + node.value = inp + return inputs + + +class OutputLayer(Layer): + """1D softmax output layer in 19.3.2.""" + + def __init__(self, size=3): + super().__init__(size) + + def forward(self, inputs, activation=SoftMax): + assert len(self.nodes) == len(inputs) + res = activation().function(inputs) + for node, val in zip(self.nodes, res): + node.value = val + return res + + +class DenseLayer(Layer): + """ + 1D dense layer in a neural network. 
+ :param in_size: (int) input vector size + :param out_size: (int) output vector size + :param activation: (Activation object) activation function + """ + + def __init__(self, in_size=3, out_size=3, activation=Sigmoid): + super().__init__(out_size) + self.out_size = out_size + self.inputs = None + self.activation = activation() + # initialize weights + for node in self.nodes: + node.weights = random_weights(-0.5, 0.5, in_size) + + def forward(self, inputs): + self.inputs = inputs + res = [] + # get the output value of each unit + for unit in self.nodes: + val = self.activation.function(np.dot(unit.weights, inputs)) + unit.value = val + res.append(val) + return res + + +class ConvLayer1D(Layer): + """ + 1D convolution layer in a neural network. + :param kernel_size: convolution kernel size + """ + + def __init__(self, size=3, kernel_size=3): + super().__init__(size) + # init convolution kernel as gaussian kernel + for node in self.nodes: + node.weights = gaussian_kernel(kernel_size) + + def forward(self, features): + # each node in layer takes a channel in the features + assert len(self.nodes) == len(features) + res = [] + # compute the convolution output of each channel, store it in node.value + for node, feature in zip(self.nodes, features): + out = conv1D(feature, node.weights) + res.append(out) + node.value = out + return res + + +class MaxPoolingLayer1D(Layer): + """ + 1D max pooling layer in a neural network. 
+ :param kernel_size: max pooling area size + """ + + def __init__(self, size=3, kernel_size=3): + super().__init__(size) + self.kernel_size = kernel_size + self.inputs = None + + def forward(self, features): + assert len(self.nodes) == len(features) + res = [] + self.inputs = features + # do max pooling for each channel in features + for i in range(len(self.nodes)): + feature = features[i] + # get the max value in a kernel_size * kernel_size area + out = [max(feature[i:i + self.kernel_size]) + for i in range(len(feature) - self.kernel_size + 1)] + res.append(out) + self.nodes[i].value = out + return res + + +class BatchNormalizationLayer(Layer): + """Batch normalization layer.""" + + def __init__(self, size, eps=0.001): + super().__init__(size) + self.eps = eps + # self.weights = [beta, gamma] + self.weights = [0, 0] + self.inputs = None + + def forward(self, inputs): + # mean value of inputs + mu = sum(inputs) / len(inputs) + # standard error of inputs + stderr = statistics.stdev(inputs) + self.inputs = inputs + res = [] + # get normalized value of each input + for i in range(len(self.nodes)): + val = [(inputs[i] - mu) * self.weights[0] / np.sqrt(self.eps + stderr ** 2) + self.weights[1]] + res.append(val) + self.nodes[i].value = val + return res + + +def init_examples(examples, idx_i, idx_t, o_units): + """Init examples from dataset.examples.""" + + inputs, targets = {}, {} + for i, e in enumerate(examples): + # input values of e + inputs[i] = [e[i] for i in idx_i] + + if o_units > 1: + # one-hot representation of e's target + t = [0 for i in range(o_units)] + t[e[idx_t]] = 1 + targets[i] = t + else: + # target value of e + targets[i] = [e[idx_t]] + + return inputs, targets + + +def stochastic_gradient_descent(dataset, net, loss, epochs=1000, l_rate=0.01, batch_size=1, verbose=False): + """ + Gradient descent algorithm to update the learnable parameters of a network. 
+ :return: the updated network + """ + examples = dataset.examples # init data + + for e in range(epochs): + total_loss = 0 + random.shuffle(examples) + weights = [[node.weights for node in layer.nodes] for layer in net] + + for batch in get_batch(examples, batch_size): + inputs, targets = init_examples(batch, dataset.inputs, dataset.target, len(net[-1].nodes)) + # compute gradients of weights + gs, batch_loss = BackPropagation(inputs, targets, weights, net, loss) + # update weights with gradient descent + weights = [x + y for x, y in zip(weights, [np.array(tg) * -l_rate for tg in gs])] + total_loss += batch_loss + + # update the weights of network each batch + for i in range(len(net)): + if weights[i].size != 0: + for j in range(len(weights[i])): + net[i].nodes[j].weights = weights[i][j] + + if verbose: + print("epoch:{}, total_loss:{}".format(e + 1, total_loss)) + + return net + + +def adam(dataset, net, loss, epochs=1000, rho=(0.9, 0.999), delta=1 / 10 ** 8, + l_rate=0.001, batch_size=1, verbose=False): + """ + [Figure 19.6] + Adam optimizer to update the learnable parameters of a network. + Required parameters are similar to gradient descent. 
+ :return: the updated network + """ + examples = dataset.examples + + # init s, r and t + s = [[[0] * len(node.weights) for node in layer.nodes] for layer in net] + r = [[[0] * len(node.weights) for node in layer.nodes] for layer in net] + t = 0 + + # repeat until convergence + for e in range(epochs): + # total loss of each epoch + total_loss = 0 + random.shuffle(examples) + weights = [[node.weights for node in layer.nodes] for layer in net] + + for batch in get_batch(examples, batch_size): + t += 1 + inputs, targets = init_examples(batch, dataset.inputs, dataset.target, len(net[-1].nodes)) + + # compute gradients of weights + gs, batch_loss = BackPropagation(inputs, targets, weights, net, loss) + + # update s, r, s_hat and r_hat + s = vector_add(scalar_vector_product(rho[0], s), + scalar_vector_product((1 - rho[0]), gs)) + r = vector_add(scalar_vector_product(rho[1], r), + scalar_vector_product((1 - rho[1]), element_wise_product(gs, gs))) + s_hat = scalar_vector_product(1 / (1 - rho[0] ** t), s) + r_hat = scalar_vector_product(1 / (1 - rho[1] ** t), r) + + # rescale r_hat + r_hat = map_vector(lambda x: 1 / (np.sqrt(x) + delta), r_hat) + + # delta weights + delta_theta = scalar_vector_product(-l_rate, element_wise_product(s_hat, r_hat)) + weights = vector_add(weights, delta_theta) + total_loss += batch_loss + + # update the weights of network each batch + for i in range(len(net)): + if weights[i]: + for j in range(len(weights[i])): + net[i].nodes[j].weights = weights[i][j] + + if verbose: + print("epoch:{}, total_loss:{}".format(e + 1, total_loss)) + + return net + + +def BackPropagation(inputs, targets, theta, net, loss): + """ + The back-propagation algorithm for multilayer networks, run on a single batch to calculate gradients of theta. + :param inputs: a batch of inputs in an array. Each input is an iterable object + :param targets: a batch of targets in an array. 
Each target is an iterable object + :param theta: parameters to be updated + :param net: a list of predefined layer objects representing their linear sequence + :param loss: a predefined loss function taking array of inputs and targets + :return: gradients of theta, loss of the input batch + """ + + assert len(inputs) == len(targets) + o_units = len(net[-1].nodes) + n_layers = len(net) + batch_size = len(inputs) + + gradients = [[[] for _ in layer.nodes] for layer in net] + total_gradients = [[[0] * len(node.weights) for node in layer.nodes] for layer in net] + + batch_loss = 0 + + # iterate over each example in batch + for e in range(batch_size): + i_val = inputs[e] + t_val = targets[e] + + # forward pass and compute batch loss + for i in range(1, n_layers): + layer_out = net[i].forward(i_val) + i_val = layer_out + batch_loss += loss(t_val, layer_out) + + # initialize delta + delta = [[] for _ in range(n_layers)] + + previous = np.array([layer_out[i] - t_val[i] for i in range(o_units)]) + h_layers = n_layers - 1 + + # backward pass + for i in range(h_layers, 0, -1): + layer = net[i] + derivative = np.array([layer.activation.derivative(node.value) for node in layer.nodes]) + delta[i] = previous * derivative + # pass to layer i-1 in the next iteration + previous = np.matmul([delta[i]], theta[i])[0] + # compute gradient of layer i + gradients[i] = [scalar_vector_product(d, net[i].inputs) for d in delta[i]] + + # add gradient of current example to batch gradient + total_gradients = vector_add(total_gradients, gradients) + + return total_gradients, batch_loss + + +def get_batch(examples, batch_size=1): + """Split examples into multiple batches""" + for i in range(0, len(examples), batch_size): + yield examples[i: i + batch_size] + + +class NeuralNetworkLearner: + """ + Simple dense multilayer neural network. 
+ :param hidden_layer_sizes: size of hidden layers in the form of a list + """ + + def __init__(self, dataset, hidden_layer_sizes, l_rate=0.01, epochs=1000, batch_size=10, + optimizer=stochastic_gradient_descent, loss=mean_squared_error_loss, verbose=False, plot=False): + self.dataset = dataset + self.l_rate = l_rate + self.epochs = epochs + self.batch_size = batch_size + self.optimizer = optimizer + self.loss = loss + self.verbose = verbose + self.plot = plot + + input_size = len(dataset.inputs) + output_size = len(dataset.values[dataset.target]) + + # initialize the network + raw_net = [InputLayer(input_size)] + # add hidden layers + hidden_input_size = input_size + for h_size in hidden_layer_sizes: + raw_net.append(DenseLayer(hidden_input_size, h_size)) + hidden_input_size = h_size + raw_net.append(DenseLayer(hidden_input_size, output_size)) + self.raw_net = raw_net + + def fit(self, X, y): + self.learned_net = self.optimizer(self.dataset, self.raw_net, loss=self.loss, epochs=self.epochs, + l_rate=self.l_rate, batch_size=self.batch_size, verbose=self.verbose) + return self + + def predict(self, example): + n_layers = len(self.learned_net) + + layer_input = example + layer_out = example + + # get the output of each layer by forward passing + for i in range(1, n_layers): + layer_out = self.learned_net[i].forward(np.array(layer_input).reshape((-1, 1))) + layer_input = layer_out + + return layer_out.index(max(layer_out)) + + +class PerceptronLearner: + """ + Simple perceptron neural network. 
+ """ + + def __init__(self, dataset, l_rate=0.01, epochs=1000, batch_size=10, optimizer=stochastic_gradient_descent, + loss=mean_squared_error_loss, verbose=False, plot=False): + self.dataset = dataset + self.l_rate = l_rate + self.epochs = epochs + self.batch_size = batch_size + self.optimizer = optimizer + self.loss = loss + self.verbose = verbose + self.plot = plot + + input_size = len(dataset.inputs) + output_size = len(dataset.values[dataset.target]) + + # initialize the network, add dense layer + self.raw_net = [InputLayer(input_size), DenseLayer(input_size, output_size)] + + def fit(self, X, y): + self.learned_net = self.optimizer(self.dataset, self.raw_net, loss=self.loss, epochs=self.epochs, + l_rate=self.l_rate, batch_size=self.batch_size, verbose=self.verbose) + return self + + def predict(self, example): + layer_out = self.learned_net[1].forward(np.array(example).reshape((-1, 1))) + return layer_out.index(max(layer_out)) + + +def keras_dataset_loader(dataset, max_length=500): + """ + Helper function to load keras datasets. + :param dataset: keras data set type + :param max_length: max length of each input sequence + """ + # init dataset + (X_train, y_train), (X_val, y_val) = dataset + if max_length > 0: + X_train = sequence.pad_sequences(X_train, maxlen=max_length) + X_val = sequence.pad_sequences(X_val, maxlen=max_length) + return (X_train[10:], y_train[10:]), (X_val, y_val), (X_train[:10], y_train[:10]) + + +def SimpleRNNLearner(train_data, val_data, epochs=2, verbose=False): + """ + RNN example for text sentimental analysis. + :param train_data: a tuple of (training data, targets) + Training data: ndarray taking training examples, while each example is coded by embedding + Targets: ndarray taking targets of each example. 
Each target is mapped to an integer
+    :param val_data: a tuple of (validation data, targets)
+    :param epochs: number of epochs
+    :param verbose: verbosity mode
+    :return: a keras model
+    """
+
+    total_inputs = 5000
+    input_length = 500
+
+    # init data
+    X_train, y_train = train_data
+    X_val, y_val = val_data
+
+    # init the sequential network (embedding layer, rnn layer, dense layer)
+    model = Sequential()
+    model.add(Embedding(total_inputs, 32, input_length=input_length))
+    model.add(SimpleRNN(units=128))
+    model.add(Dense(1, activation='sigmoid'))
+    model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
+
+    # train the model
+    model.fit(X_train, y_train, validation_data=(X_val, y_val), epochs=epochs, batch_size=128, verbose=verbose)
+
+    return model
+
+
+def AutoencoderLearner(inputs, encoding_size, epochs=200, verbose=False):
+    """
+    Simple example of a linear autoencoder that learns to reproduce its input.
+    :param inputs: a batch of input data in np.ndarray type
+    :param encoding_size: int, the size of the encoding layer
+    :param epochs: number of epochs
+    :param verbose: verbosity mode
+    :return: a keras model
+    """
+
+    # init data
+    input_size = len(inputs[0])
+
+    # init model
+    model = Sequential()
+    model.add(Dense(encoding_size, input_dim=input_size, activation='relu', kernel_initializer='random_uniform',
+                    bias_initializer='ones'))
+    model.add(Dense(input_size, activation='relu', kernel_initializer='random_uniform', bias_initializer='ones'))
+
+    # update model with sgd
+    sgd = optimizers.SGD(lr=0.01)
+    model.compile(loss='mean_squared_error', optimizer=sgd, metrics=['accuracy'])
+
+    # train the model
+    model.fit(inputs, inputs, epochs=epochs, batch_size=10, verbose=verbose)
+
+    return model
diff --git a/games.ipynb b/games.ipynb
index f1ff58a94..edf955be8 100644
--- a/games.ipynb
+++ b/games.ipynb
@@ -82,7 +82,7 @@
 },
 {
 "cell_type": "code",
-    "execution_count": 2,
+    "execution_count": null,
 "metadata": {
 "collapsed": true
 },
@@ 
-135,11 +135,18 @@ }, { "cell_type": "code", - "execution_count": 3, + "execution_count": 4, "metadata": { "collapsed": true }, - "outputs": [], + "outputs": [ + { + "output_type": "stream", + "text": "\u001b[1;32mclass\u001b[0m \u001b[0mTicTacToe\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mGame\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[1;34m\"\"\"Play TicTacToe on an h x v board, with Max (first player) playing 'X'.\n A state has the player to move, a cached utility, a list of moves in\n the form of a list of (x, y) positions, and a board, in the form of\n a dict of {(x, y): Player} entries, where Player is 'X' or 'O'.\"\"\"\u001b[0m\u001b[1;33m\n\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[1;32mdef\u001b[0m \u001b[0m__init__\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mself\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mh\u001b[0m\u001b[1;33m=\u001b[0m\u001b[1;36m3\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mv\u001b[0m\u001b[1;33m=\u001b[0m\u001b[1;36m3\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mk\u001b[0m\u001b[1;33m=\u001b[0m\u001b[1;36m3\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[0mself\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mh\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mh\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[0mself\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mv\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mv\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[0mself\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mk\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mk\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[0mmoves\u001b[0m \u001b[1;33m=\u001b[0m \u001b[1;33m[\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mx\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0my\u001b[0m\u001b[1;33m)\u001b[0m \u001b[1;32mfor\u001b[0m \u001b[0mx\u001b[0m \u001b[1;32min\u001b[0m \u001b[0mrange\u001b[0m\u001b[1;33m(\u001b[0m\u001b[1;36m1\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mh\u001b[0m \u001b[1;33m+\u001b[0m \u001b[1;36m1\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\n\u001b[0m 
\u001b[1;32mfor\u001b[0m \u001b[0my\u001b[0m \u001b[1;32min\u001b[0m \u001b[0mrange\u001b[0m\u001b[1;33m(\u001b[0m\u001b[1;36m1\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mv\u001b[0m \u001b[1;33m+\u001b[0m \u001b[1;36m1\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m]\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[0mself\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0minitial\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mGameState\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mto_move\u001b[0m\u001b[1;33m=\u001b[0m\u001b[1;34m'X'\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mutility\u001b[0m\u001b[1;33m=\u001b[0m\u001b[1;36m0\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mboard\u001b[0m\u001b[1;33m=\u001b[0m\u001b[1;33m{\u001b[0m\u001b[1;33m}\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mmoves\u001b[0m\u001b[1;33m=\u001b[0m\u001b[0mmoves\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\n\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[1;32mdef\u001b[0m \u001b[0mactions\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mself\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mstate\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[1;34m\"\"\"Legal moves are any square not yet taken.\"\"\"\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[1;32mreturn\u001b[0m \u001b[0mstate\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mmoves\u001b[0m\u001b[1;33m\n\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[1;32mdef\u001b[0m \u001b[0mresult\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mself\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mstate\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mmove\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[1;32mif\u001b[0m \u001b[0mmove\u001b[0m \u001b[1;32mnot\u001b[0m \u001b[1;32min\u001b[0m \u001b[0mstate\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mmoves\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[1;32mreturn\u001b[0m \u001b[0mstate\u001b[0m \u001b[1;31m# Illegal move has no effect\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[0mboard\u001b[0m \u001b[1;33m=\u001b[0m 
\u001b[0mstate\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mboard\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mcopy\u001b[0m\u001b[1;33m(\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[0mboard\u001b[0m\u001b[1;33m[\u001b[0m\u001b[0mmove\u001b[0m\u001b[1;33m]\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mstate\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mto_move\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[0mmoves\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mlist\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mstate\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mmoves\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[0mmoves\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mremove\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mmove\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[1;32mreturn\u001b[0m \u001b[0mGameState\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mto_move\u001b[0m\u001b[1;33m=\u001b[0m\u001b[1;33m(\u001b[0m\u001b[1;34m'O'\u001b[0m \u001b[1;32mif\u001b[0m \u001b[0mstate\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mto_move\u001b[0m \u001b[1;33m==\u001b[0m \u001b[1;34m'X'\u001b[0m \u001b[1;32melse\u001b[0m \u001b[1;34m'X'\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m,\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[0mutility\u001b[0m\u001b[1;33m=\u001b[0m\u001b[0mself\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mcompute_utility\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mboard\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mmove\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mstate\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mto_move\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m,\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[0mboard\u001b[0m\u001b[1;33m=\u001b[0m\u001b[0mboard\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mmoves\u001b[0m\u001b[1;33m=\u001b[0m\u001b[0mmoves\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\n\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[1;32mdef\u001b[0m \u001b[0mutility\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mself\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mstate\u001b[0m\u001b[1;33m,\u001b[0m 
\u001b[0mplayer\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[1;34m\"\"\"Return the value to player; 1 for win, -1 for loss, 0 otherwise.\"\"\"\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[1;32mreturn\u001b[0m \u001b[0mstate\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mutility\u001b[0m \u001b[1;32mif\u001b[0m \u001b[0mplayer\u001b[0m \u001b[1;33m==\u001b[0m \u001b[1;34m'X'\u001b[0m \u001b[1;32melse\u001b[0m \u001b[1;33m-\u001b[0m\u001b[0mstate\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mutility\u001b[0m\u001b[1;33m\n\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[1;32mdef\u001b[0m \u001b[0mterminal_test\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mself\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mstate\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[1;34m\"\"\"A state is terminal if it is won or there are no empty squares.\"\"\"\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[1;32mreturn\u001b[0m \u001b[0mstate\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mutility\u001b[0m \u001b[1;33m!=\u001b[0m \u001b[1;36m0\u001b[0m \u001b[1;32mor\u001b[0m \u001b[0mlen\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mstate\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mmoves\u001b[0m\u001b[1;33m)\u001b[0m \u001b[1;33m==\u001b[0m \u001b[1;36m0\u001b[0m\u001b[1;33m\n\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[1;32mdef\u001b[0m \u001b[0mdisplay\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mself\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mstate\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[0mboard\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mstate\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mboard\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[1;32mfor\u001b[0m \u001b[0mx\u001b[0m \u001b[1;32min\u001b[0m \u001b[0mrange\u001b[0m\u001b[1;33m(\u001b[0m\u001b[1;36m1\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mself\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mh\u001b[0m \u001b[1;33m+\u001b[0m \u001b[1;36m1\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\n\u001b[0m 
\u001b[1;32mfor\u001b[0m \u001b[0my\u001b[0m \u001b[1;32min\u001b[0m \u001b[0mrange\u001b[0m\u001b[1;33m(\u001b[0m\u001b[1;36m1\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mself\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mv\u001b[0m \u001b[1;33m+\u001b[0m \u001b[1;36m1\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[0mprint\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mboard\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mget\u001b[0m\u001b[1;33m(\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mx\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0my\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m,\u001b[0m \u001b[1;34m'.'\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mend\u001b[0m\u001b[1;33m=\u001b[0m\u001b[1;34m' '\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[0mprint\u001b[0m\u001b[1;33m(\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\n\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[1;32mdef\u001b[0m \u001b[0mcompute_utility\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mself\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mboard\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mmove\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mplayer\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[1;34m\"\"\"If 'X' wins with this move, return 1; if 'O' wins return -1; else return 0.\"\"\"\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[1;32mif\u001b[0m \u001b[1;33m(\u001b[0m\u001b[0mself\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mk_in_row\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mboard\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mmove\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mplayer\u001b[0m\u001b[1;33m,\u001b[0m \u001b[1;33m(\u001b[0m\u001b[1;36m0\u001b[0m\u001b[1;33m,\u001b[0m \u001b[1;36m1\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m)\u001b[0m \u001b[1;32mor\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[0mself\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mk_in_row\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mboard\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mmove\u001b[0m\u001b[1;33m,\u001b[0m 
\u001b[0mplayer\u001b[0m\u001b[1;33m,\u001b[0m \u001b[1;33m(\u001b[0m\u001b[1;36m1\u001b[0m\u001b[1;33m,\u001b[0m \u001b[1;36m0\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m)\u001b[0m \u001b[1;32mor\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[0mself\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mk_in_row\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mboard\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mmove\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mplayer\u001b[0m\u001b[1;33m,\u001b[0m \u001b[1;33m(\u001b[0m\u001b[1;36m1\u001b[0m\u001b[1;33m,\u001b[0m \u001b[1;33m-\u001b[0m\u001b[1;36m1\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m)\u001b[0m \u001b[1;32mor\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[0mself\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mk_in_row\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mboard\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mmove\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mplayer\u001b[0m\u001b[1;33m,\u001b[0m \u001b[1;33m(\u001b[0m\u001b[1;36m1\u001b[0m\u001b[1;33m,\u001b[0m \u001b[1;36m1\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[1;32mreturn\u001b[0m \u001b[1;33m+\u001b[0m\u001b[1;36m1\u001b[0m \u001b[1;32mif\u001b[0m \u001b[0mplayer\u001b[0m \u001b[1;33m==\u001b[0m \u001b[1;34m'X'\u001b[0m \u001b[1;32melse\u001b[0m \u001b[1;33m-\u001b[0m\u001b[1;36m1\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[1;32melse\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[1;32mreturn\u001b[0m \u001b[1;36m0\u001b[0m\u001b[1;33m\n\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[1;32mdef\u001b[0m \u001b[0mk_in_row\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mself\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mboard\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mmove\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mplayer\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mdelta_x_y\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[1;34m\"\"\"Return true if there is a line through move on board for player.\"\"\"\u001b[0m\u001b[1;33m\n\u001b[0m 
\u001b[1;33m(\u001b[0m\u001b[0mdelta_x\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mdelta_y\u001b[0m\u001b[1;33m)\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mdelta_x_y\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[0mx\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0my\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mmove\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[0mn\u001b[0m \u001b[1;33m=\u001b[0m \u001b[1;36m0\u001b[0m \u001b[1;31m# n is number of moves in row\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[1;32mwhile\u001b[0m \u001b[0mboard\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mget\u001b[0m\u001b[1;33m(\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mx\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0my\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m)\u001b[0m \u001b[1;33m==\u001b[0m \u001b[0mplayer\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[0mn\u001b[0m \u001b[1;33m+=\u001b[0m \u001b[1;36m1\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[0mx\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0my\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mx\u001b[0m \u001b[1;33m+\u001b[0m \u001b[0mdelta_x\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0my\u001b[0m \u001b[1;33m+\u001b[0m \u001b[0mdelta_y\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[0mx\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0my\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mmove\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[1;32mwhile\u001b[0m \u001b[0mboard\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mget\u001b[0m\u001b[1;33m(\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mx\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0my\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m)\u001b[0m \u001b[1;33m==\u001b[0m \u001b[0mplayer\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[0mn\u001b[0m \u001b[1;33m+=\u001b[0m \u001b[1;36m1\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[0mx\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0my\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mx\u001b[0m \u001b[1;33m-\u001b[0m \u001b[0mdelta_x\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0my\u001b[0m \u001b[1;33m-\u001b[0m \u001b[0mdelta_y\u001b[0m\u001b[1;33m\n\u001b[0m 
\u001b[0mn\u001b[0m \u001b[1;33m-=\u001b[0m \u001b[1;36m1\u001b[0m \u001b[1;31m# Because we counted move itself twice\u001b[0m\u001b[1;33m\n\u001b[0m \u001b[1;32mreturn\u001b[0m \u001b[0mn\u001b[0m \u001b[1;33m>=\u001b[0m \u001b[0mself\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mk\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n", + "metadata": {}, + "execution_count": 4 + } + ], "source": [ "%psource TicTacToe" ] @@ -210,7 +217,7 @@ "\n", "\n", "\n", - "The states are represented wih capital letters inside the triangles (eg. \"A\") while moves are the labels on the edges between states (eg. \"a1\"). Terminal nodes carry utility values. Note that the terminal nodes are named in this example 'B1', 'B2' and 'B2' for the nodes below 'B', and so forth.\n", + "The states are represented with capital letters inside the triangles (eg. \"A\") while moves are the labels on the edges between states (eg. \"a1\"). Terminal nodes carry utility values. Note that the terminal nodes are named in this example 'B1', 'B2' and 'B2' for the nodes below 'B', and so forth.\n", "\n", "We will model the moves, utilities and initial state like this:" ] @@ -285,7 +292,9 @@ { "cell_type": "code", "execution_count": null, - "metadata": {}, + "metadata": { + "collapsed": true + }, "outputs": [], "source": [ "psource(Fig52Game.actions)" @@ -318,7 +327,9 @@ { "cell_type": "code", "execution_count": null, - "metadata": {}, + "metadata": { + "collapsed": true + }, "outputs": [], "source": [ "psource(Fig52Game.result)" @@ -351,7 +362,9 @@ { "cell_type": "code", "execution_count": null, - "metadata": {}, + "metadata": { + "collapsed": true + }, "outputs": [], "source": [ "psource(Fig52Game.utility)" @@ -386,7 +399,9 @@ { "cell_type": "code", "execution_count": null, - "metadata": {}, + "metadata": { + "collapsed": true + }, "outputs": [], "source": [ "psource(Fig52Game.terminal_test)" @@ -419,7 +434,9 @@ { "cell_type": "code", "execution_count": null, - "metadata": {}, + "metadata": { + 
"collapsed": true + }, "outputs": [], "source": [ "psource(Fig52Game.to_move)" @@ -452,7 +469,9 @@ { "cell_type": "code", "execution_count": null, - "metadata": {}, + "metadata": { + "collapsed": true + }, "outputs": [], "source": [ "psource(Fig52Game)" @@ -473,7 +492,7 @@ }, { "cell_type": "code", - "execution_count": 9, + "execution_count": 2, "metadata": {}, "outputs": [ { @@ -506,7 +525,7 @@ "" ] }, - "execution_count": 9, + "execution_count": 2, "metadata": {}, "output_type": "execute_result" } @@ -527,7 +546,9 @@ { "cell_type": "code", "execution_count": null, - "metadata": {}, + "metadata": { + "collapsed": true + }, "outputs": [], "source": [ "psource(minimax_decision)" @@ -616,469 +637,9 @@ }, { "cell_type": "code", - "execution_count": 3, + "execution_count": null, "metadata": {}, - "outputs": [ - { - "data": { - "text/html": [ - "\n", - "\n", - "
    \n", - "\n", - "
    \n", - "\n", - "\n" - ], - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "data": { - "text/html": [ - "" - ], - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], + "outputs": [], "source": [ "minimax_viz = Canvas_minimax('minimax_viz', [randint(1, 50) for i in range(27)])" ] @@ -1091,7 +652,7 @@ "\n", "## Overview\n", "\n", - "While *Minimax* is great for computing a move, it can get tricky when the number of games states gets bigger. The algorithm needs to search all the leaves of the tree, which increase exponentially to its depth.\n", + "While *Minimax* is great for computing a move, it can get tricky when the number of game states gets bigger. The algorithm needs to search all the leaves of the tree, which increase exponentially to its depth.\n", "\n", "For Tic-Tac-Toe, where the depth of the tree is 9 (after the 9th move, the game ends), we can have at most 9! terminal states (at most because not all terminal nodes are at the last level of the tree; some are higher up because the game ended before the 9th move). This isn't so bad, but for more complex problems like chess, we have over $10^{40}$ terminal nodes. Unfortunately we have not found a way to cut the exponent away, but we nevertheless have found ways to alleviate the workload.\n", "\n", @@ -1122,7 +683,7 @@ }, { "cell_type": "code", - "execution_count": 10, + "execution_count": 3, "metadata": {}, "outputs": [ { @@ -1161,7 +722,7 @@ "" ] }, - "execution_count": 10, + "execution_count": 3, "metadata": {}, "output_type": "execute_result" } @@ -1271,469 +832,9 @@ }, { "cell_type": "code", - "execution_count": 3, + "execution_count": null, "metadata": {}, - "outputs": [ - { - "data": { - "text/html": [ - "\n", - "\n", - "
    \n", - "\n", - "
    \n", - "\n", - "\n" - ], - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "data": { - "text/html": [ - "" - ], - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], + "outputs": [], "source": [ "alphabeta_viz = Canvas_alphabeta('alphabeta_viz', [randint(1, 50) for i in range(27)])" ] @@ -1755,6 +856,9 @@ "## alphabeta_player\n", "The `alphabeta_player`, on the other hand, calls the `alphabeta_search` function, which returns the best move in the current game state. Thus, the `alphabeta_player` always plays the best move given a game state, assuming that the game tree is small enough to search entirely.\n", "\n", + "## minimax_player\n", + "The `minimax_player`, on the other hand calls the `minimax_search` function which returns the best move in the current game state.\n", + "\n", "## play_game\n", "The `play_game` function will be the one that will actually be used to play the game. You pass as arguments to it an instance of the game you want to play and the players you want in this game. Use it to play AI vs AI, AI vs human, or even human vs human matches!" 
] @@ -2557,9 +1661,9 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.5.3" + "version": "3.8.2-final" } }, "nbformat": 4, "nbformat_minor": 1 -} +} \ No newline at end of file diff --git a/games.py b/games.py index 00a2c33d3..d22b2e640 100644 --- a/games.py +++ b/games.py @@ -1,18 +1,23 @@ -"""Games, or Adversarial Search (Chapter 5)""" +"""Games or Adversarial Search (Chapter 5)""" -from collections import namedtuple +import copy +import itertools import random +from collections import namedtuple -from utils import argmax +import numpy as np + +from utils import vector_add -infinity = float('inf') GameState = namedtuple('GameState', 'to_move, utility, board, moves') +StochasticGameState = namedtuple('StochasticGameState', 'to_move, utility, board, moves, chance') + # ______________________________________________________________________________ -# Minimax Search +# MinMax Search -def minimax_decision(state, game): +def minmax_decision(state, game): """Given a state in a game, calculate the best move by searching forward all the way to the terminal states. 
[Figure 5.3]""" @@ -21,7 +26,7 @@ def minimax_decision(state, game): def max_value(state): if game.terminal_test(state): return game.utility(state, player) - v = -infinity + v = -np.inf for a in game.actions(state): v = max(v, min_value(game.result(state, a))) return v @@ -29,29 +34,69 @@ def max_value(state): def min_value(state): if game.terminal_test(state): return game.utility(state, player) - v = infinity + v = np.inf for a in game.actions(state): v = min(v, max_value(game.result(state, a))) return v - # Body of minimax_decision: - return argmax(game.actions(state), - key=lambda a: min_value(game.result(state, a))) + # Body of minmax_decision: + return max(game.actions(state), key=lambda a: min_value(game.result(state, a))) + # ______________________________________________________________________________ -def alphabeta_search(state, game): +def expect_minmax(state, game): + """ + [Figure 5.11] + Return the best move for a player after dice are thrown. The game tree + includes chance nodes along with min and max nodes. 
+ """ + player = game.to_move(state) + + def max_value(state): + v = -np.inf + for a in game.actions(state): + v = max(v, chance_node(state, a)) + return v + + def min_value(state): + v = np.inf + for a in game.actions(state): + v = min(v, chance_node(state, a)) + return v + + def chance_node(state, action): + res_state = game.result(state, action) + if game.terminal_test(res_state): + return game.utility(res_state, player) + sum_chances = 0 + num_chances = len(game.chances(res_state)) + for chance in game.chances(res_state): + res_state = game.outcome(res_state, chance) + util = 0 + if res_state.to_move == player: + util = max_value(res_state) + else: + util = min_value(res_state) + sum_chances += util * game.probability(chance) + return sum_chances / num_chances + + # Body of expect_minmax: + return max(game.actions(state), key=lambda a: chance_node(state, a), default=None) + + +def alpha_beta_search(state, game): """Search game to determine best action; use alpha-beta pruning. As in [Figure 5.7], this version searches all the way to the leaves.""" player = game.to_move(state) - # Functions used by alphabeta + # Functions used by alpha_beta def max_value(state, alpha, beta): if game.terminal_test(state): return game.utility(state, player) - v = -infinity + v = -np.inf for a in game.actions(state): v = max(v, min_value(game.result(state, a), alpha, beta)) if v >= beta: @@ -62,7 +107,7 @@ def max_value(state, alpha, beta): def min_value(state, alpha, beta): if game.terminal_test(state): return game.utility(state, player) - v = infinity + v = np.inf for a in game.actions(state): v = min(v, max_value(game.result(state, a), alpha, beta)) if v <= alpha: @@ -70,9 +115,9 @@ def min_value(state, alpha, beta): beta = min(beta, v) return v - # Body of alphabeta_cutoff_search: - best_score = -infinity - beta = infinity + # Body of alpha_beta_search: + best_score = -np.inf + beta = np.inf best_action = None for a in game.actions(state): v = min_value(game.result(state, a), 
best_score, beta) @@ -82,20 +127,19 @@ def min_value(state, alpha, beta): return best_action -def alphabeta_cutoff_search(state, game, d=4, cutoff_test=None, eval_fn=None): +def alpha_beta_cutoff_search(state, game, d=4, cutoff_test=None, eval_fn=None): """Search game to determine best action; use alpha-beta pruning. This version cuts off search and uses an evaluation function.""" player = game.to_move(state) - # Functions used by alphabeta + # Functions used by alpha_beta def max_value(state, alpha, beta, depth): if cutoff_test(state, depth): return eval_fn(state) - v = -infinity + v = -np.inf for a in game.actions(state): - v = max(v, min_value(game.result(state, a), - alpha, beta, depth + 1)) + v = max(v, min_value(game.result(state, a), alpha, beta, depth + 1)) if v >= beta: return v alpha = max(alpha, v) @@ -104,23 +148,20 @@ def max_value(state, alpha, beta, depth): def min_value(state, alpha, beta, depth): if cutoff_test(state, depth): return eval_fn(state) - v = infinity + v = np.inf for a in game.actions(state): - v = min(v, max_value(game.result(state, a), - alpha, beta, depth + 1)) + v = min(v, max_value(game.result(state, a), alpha, beta, depth + 1)) if v <= alpha: return v beta = min(beta, v) return v - # Body of alphabeta_cutoff_search starts here: + # Body of alpha_beta_cutoff_search starts here: # The default test cuts off at depth d or at a terminal state - cutoff_test = (cutoff_test or - (lambda state, depth: depth > d or - game.terminal_test(state))) + cutoff_test = (cutoff_test or (lambda state, depth: depth > d or game.terminal_test(state))) eval_fn = eval_fn or (lambda state: game.utility(state, player)) - best_score = -infinity - beta = infinity + best_score = -np.inf + beta = np.inf best_action = None for a in game.actions(state): v = min_value(game.result(state, a), best_score, beta, 1) @@ -129,6 +170,7 @@ def min_value(state, alpha, beta, depth): best_action = a return best_action + # 
______________________________________________________________________________
 # Players for Games
 
@@ -139,21 +181,33 @@ def query_player(game, state):
     game.display(state)
     print("available moves: {}".format(game.actions(state)))
     print("")
-    move_string = input('Your move? ')
-    try:
-        move = eval(move_string)
-    except NameError:
-        move = move_string
+    move = None
+    if game.actions(state):
+        move_string = input('Your move? ')
+        try:
+            move = eval(move_string)
+        except NameError:
+            move = move_string
+    else:
+        print('no legal moves: passing turn to next player')
     return move
 
 
 def random_player(game, state):
     """A player that chooses a legal move at random."""
-    return random.choice(game.actions(state))
+    return random.choice(game.actions(state)) if game.actions(state) else None
+
+
+def alpha_beta_player(game, state):
+    return alpha_beta_search(state, game)
+
+def minmax_player(game, state):
+    return minmax_decision(state, game)
 
-def alphabeta_player(game, state):
-    return alphabeta_search(state, game)
+
+def expect_minmax_player(game, state):
+    return expect_minmax(state, game)
 
 # ______________________________________________________________________________
@@ -208,6 +262,38 @@ def play_game(self, *players):
         return self.utility(state, self.to_move(self.initial))
 
 
+class StochasticGame(Game):
+    """A stochastic game includes uncertain events which influence
+    the moves of players at each state. 
To create a stochastic game, subclass + this class and implement chances and outcome along with the other + unimplemented game class methods.""" + + def chances(self, state): + """Return a list of all possible uncertain events at a state.""" + raise NotImplementedError + + def outcome(self, state, chance): + """Return the state which is the outcome of a chance trial.""" + raise NotImplementedError + + def probability(self, chance): + """Return the probability of occurrence of a chance.""" + raise NotImplementedError + + def play_game(self, *players): + """Play an n-person, move-alternating stochastic game.""" + state = self.initial + while True: + for player in players: + chance = random.choice(self.chances(state)) + state = self.outcome(state, chance) + move = player(self, state) + state = self.result(state, move) + if self.terminal_test(state): + self.display(state) + return self.utility(state, self.to_move(self.initial)) + + class Fig52Game(Game): """The game represented in [Figure 5.2]. Serves as a simple test case.""" @@ -240,7 +326,7 @@ def to_move(self, state): class Fig52Extended(Game): """Similar to Fig52Game but bigger. Useful for visualisation""" - succs = {i:dict(l=i*3+1, m=i*3+2, r=i*3+3) for i in range(13)} + succs = {i: dict(l=i * 3 + 1, m=i * 3 + 2, r=i * 3 + 3) for i in range(13)} utils = dict() def actions(self, state): @@ -261,6 +347,7 @@ def terminal_test(self, state): def to_move(self, state): return 'MIN' if state in {1, 2, 3} else 'MAX' + class TicTacToe(Game): """Play TicTacToe on an h x v board, with Max (first player) playing 'X'. 
    A state has the player to move, a cached utility, a list of moves in

@@ -341,4 +428,163 @@ def __init__(self, h=7, v=6, k=4):

     def actions(self, state):
         return [(x, y) for (x, y) in state.moves
-                if y == 1 or (x, y - 1) in state.board]
+                if x == self.h or (x + 1, y) in state.board]
+
+class Gomoku(TicTacToe):
+    """Also known as Five in a row."""
+
+    def __init__(self, h=15, v=16, k=5):
+        TicTacToe.__init__(self, h, v, k)
+
+
+class Backgammon(StochasticGame):
+    """A two player game where the goal of each player is to move all the
+    checkers off the board. The moves for each state are determined by
+    rolling a pair of dice."""
+
+    def __init__(self):
+        """Initial state of the game"""
+        point = {'W': 0, 'B': 0}
+        board = [point.copy() for index in range(24)]
+        board[0]['B'] = board[23]['W'] = 2
+        board[5]['W'] = board[18]['B'] = 5
+        board[7]['W'] = board[16]['B'] = 3
+        board[11]['B'] = board[12]['W'] = 5
+        self.allow_bear_off = {'W': False, 'B': False}
+        self.direction = {'W': -1, 'B': 1}
+        self.initial = StochasticGameState(to_move='W',
+                                           utility=0,
+                                           board=board,
+                                           moves=self.get_all_moves(board, 'W'), chance=None)
+
+    def actions(self, state):
+        """Return a list of legal moves for a state."""
+        player = state.to_move
+        moves = state.moves
+        if len(moves) == 1 and len(moves[0]) == 1:
+            return moves
+        legal_moves = []
+        for move in moves:
+            board = copy.deepcopy(state.board)
+            if self.is_legal_move(board, move, state.chance, player):
+                legal_moves.append(move)
+        return legal_moves
+
+    def result(self, state, move):
+        board = copy.deepcopy(state.board)
+        player = state.to_move
+        self.move_checker(board, move[0], state.chance[0], player)
+        if len(move) == 2:
+            self.move_checker(board, move[1], state.chance[1], player)
+        to_move = ('W' if player == 'B' else 'B')
+        return StochasticGameState(to_move=to_move,
+                                   utility=self.compute_utility(board, move, player),
+                                   board=board,
+                                   moves=self.get_all_moves(board, to_move), chance=None)
+
+    def utility(self, state, player):
+ """Return the value to player; 1 for win, -1 for loss, 0 otherwise.""" + return state.utility if player == 'W' else -state.utility + + def terminal_test(self, state): + """A state is terminal if one player wins.""" + return state.utility != 0 + + def get_all_moves(self, board, player): + """All possible moves for a player i.e. all possible ways of + choosing two checkers of a player from the board for a move + at a given state.""" + all_points = board + taken_points = [index for index, point in enumerate(all_points) + if point[player] > 0] + if self.checkers_at_home(board, player) == 1: + return [(taken_points[0],)] + moves = list(itertools.permutations(taken_points, 2)) + moves = moves + [(index, index) for index, point in enumerate(all_points) + if point[player] >= 2] + return moves + + def display(self, state): + """Display state of the game.""" + board = state.board + player = state.to_move + print("current state : ") + for index, point in enumerate(board): + print("point : ", index, " W : ", point['W'], " B : ", point['B']) + print("to play : ", player) + + def compute_utility(self, board, move, player): + """If 'W' wins with this move, return 1; if 'B' wins return -1; else return 0.""" + util = {'W': 1, 'B': -1} + for idx in range(0, 24): + if board[idx][player] > 0: + return 0 + return util[player] + + def checkers_at_home(self, board, player): + """Return the no. of checkers at home for a player.""" + sum_range = range(0, 7) if player == 'W' else range(17, 24) + count = 0 + for idx in sum_range: + count = count + board[idx][player] + return count + + def is_legal_move(self, board, start, steps, player): + """Move is a tuple which contains starting points of checkers to be + moved during a player's turn. An on-board move is legal if both the destinations + are open. A bear-off move is the one where a checker is moved off-board. 
+ It is legal only after a player has moved all his checkers to his home.""" + dest1, dest2 = vector_add(start, steps) + dest_range = range(0, 24) + move1_legal = move2_legal = False + if dest1 in dest_range: + if self.is_point_open(player, board[dest1]): + self.move_checker(board, start[0], steps[0], player) + move1_legal = True + else: + if self.allow_bear_off[player]: + self.move_checker(board, start[0], steps[0], player) + move1_legal = True + if not move1_legal: + return False + if dest2 in dest_range: + if self.is_point_open(player, board[dest2]): + move2_legal = True + else: + if self.allow_bear_off[player]: + move2_legal = True + return move1_legal and move2_legal + + def move_checker(self, board, start, steps, player): + """Move a checker from starting point by a given number of steps""" + dest = start + steps + dest_range = range(0, 24) + board[start][player] -= 1 + if dest in dest_range: + board[dest][player] += 1 + if self.checkers_at_home(board, player) == 15: + self.allow_bear_off[player] = True + + def is_point_open(self, player, point): + """A point is open for a player if the no. of opponent's + checkers already present on it is 0 or 1. 
A player can + move a checker to a point only if it is open.""" + opponent = 'B' if player == 'W' else 'W' + return point[opponent] <= 1 + + def chances(self, state): + """Return a list of all possible dice rolls at a state.""" + dice_rolls = list(itertools.combinations_with_replacement([1, 2, 3, 4, 5, 6], 2)) + return dice_rolls + + def outcome(self, state, chance): + """Return the state which is the outcome of a dice roll.""" + dice = tuple(map((self.direction[state.to_move]).__mul__, chance)) + return StochasticGameState(to_move=state.to_move, + utility=state.utility, + board=state.board, + moves=state.moves, chance=dice) + + def probability(self, chance): + """Return the probability of occurrence of a dice roll.""" + return 1 / 36 if chance[0] == chance[1] else 1 / 18 diff --git a/games4e.ipynb b/games4e.ipynb new file mode 100644 index 000000000..5b619f7ed --- /dev/null +++ b/games4e.ipynb @@ -0,0 +1,1667 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Game Tree Search\n", + "\n", + "We start with defining the abstract class `Game`, for turn-taking *n*-player games. We rely on, but do not define yet, the concept of a `state` of the game; we'll see later how individual games define states. For now, all we require is that a state has a `state.to_move` attribute, which gives the name of the player whose turn it is. (\"Name\" will be something like `'X'` or `'O'` for tic-tac-toe.) \n", + "\n", + "We also define `play_game`, which takes a game and a dictionary of `{player_name: strategy_function}` pairs, and plays out the game, on each turn checking `state.to_move` to see whose turn it is, and then getting the strategy function for that player and applying it to the game and the state to get a move." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": {}, + "outputs": [], + "source": [ + "from collections import namedtuple, Counter, defaultdict\n", + "import random\n", + "import math\n", + "import functools \n", + "cache = functools.lru_cache(10**6)" + ] + }, + { + "cell_type": "code", + "execution_count": 73, + "metadata": {}, + "outputs": [], + "source": [ + "class Game:\n", + " \"\"\"A game is similar to a problem, but it has a terminal test instead of \n", + " a goal test, and a utility for each terminal state. To create a game, \n", + " subclass this class and implement `actions`, `result`, `is_terminal`, \n", + " and `utility`. You will also need to set the .initial attribute to the \n", + " initial state; this can be done in the constructor.\"\"\"\n", + "\n", + " def actions(self, state):\n", + " \"\"\"Return a collection of the allowable moves from this state.\"\"\"\n", + " raise NotImplementedError\n", + "\n", + " def result(self, state, move):\n", + " \"\"\"Return the state that results from making a move from a state.\"\"\"\n", + " raise NotImplementedError\n", + "\n", + " def is_terminal(self, state):\n", + " \"\"\"Return True if this is a final state for the game.\"\"\"\n", + " return not self.actions(state)\n", + " \n", + " def utility(self, state, player):\n", + " \"\"\"Return the value of this final state to player.\"\"\"\n", + " raise NotImplementedError\n", + " \n", + "\n", + "def play_game(game, strategies: dict, verbose=False):\n", + " \"\"\"Play a turn-taking game. 
`strategies` is a {player_name: function} dict,\n",
+    "    where function(game, state) is used to get the player's move.\"\"\"\n",
+    "    state = game.initial\n",
+    "    while not game.is_terminal(state):\n",
+    "        player = state.to_move\n",
+    "        move = strategies[player](game, state)\n",
+    "        state = game.result(state, move)\n",
+    "        if verbose: \n",
+    "            print('Player', player, 'move:', move)\n",
+    "            print(state)\n",
+    "    return state"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "# Minimax-Based Game Search Algorithms\n",
+    "\n",
+    "We will define several game search algorithms. Each takes two inputs, the game we are playing and the current state of the game, and returns a `(value, move)` pair, where `value` is the utility that the algorithm computes for the player whose turn it is to move, and `move` is the move itself.\n",
+    "\n",
+    "First we define `minimax_search`, which exhaustively searches the game tree to find an optimal move (assuming both players play optimally), and `alphabeta_search`, which does the same computation, but prunes parts of the tree that could not possibly have an effect on the optimal move.
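Before looking at those implementations, the `(value, move)` convention can be sanity-checked on a tiny hand-built tree. This is a self-contained sketch: the nested-dict tree and the standalone `minimax` helper below are illustrative assumptions, separate from the `Game`-based versions defined in the notebook.

```python
import math

# Illustrative two-ply game tree: MAX moves at the root, MIN replies,
# and the leaves hold utilities from MAX's point of view.
tree = {'a': {'a1': 3, 'a2': 12}, 'b': {'b1': 2, 'b2': 8}}

def minimax(node, maximizing=True):
    """Return a (value, move) pair for the player to move at `node`."""
    if isinstance(node, (int, float)):  # a leaf: its utility, no move
        return node, None
    v, move = (-math.inf, None) if maximizing else (math.inf, None)
    for a, child in node.items():
        v2, _ = minimax(child, not maximizing)
        if (maximizing and v2 > v) or (not maximizing and v2 < v):
            v, move = v2, a
    return v, move

print(minimax(tree))  # (3, 'a'): MIN holds MAX to 3 in branch a, to 2 in b
```

The root value is 3 via move `'a'`: MIN would answer `'a'` with `'a1'` (3), which beats the 2 that MIN could force in branch `'b'`.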
" + ] + }, + { + "cell_type": "code", + "execution_count": 66, + "metadata": {}, + "outputs": [], + "source": [ + "def minimax_search(game, state):\n", + " \"\"\"Search game tree to determine best move; return (value, move) pair.\"\"\"\n", + "\n", + " player = state.to_move\n", + "\n", + " def max_value(state):\n", + " if game.is_terminal(state):\n", + " return game.utility(state, player), None\n", + " v, move = -infinity, None\n", + " for a in game.actions(state):\n", + " v2, _ = min_value(game.result(state, a))\n", + " if v2 > v:\n", + " v, move = v2, a\n", + " return v, move\n", + "\n", + " def min_value(state):\n", + " if game.is_terminal(state):\n", + " return game.utility(state, player), None\n", + " v, move = +infinity, None\n", + " for a in game.actions(state):\n", + " v2, _ = max_value(game.result(state, a))\n", + " if v2 < v:\n", + " v, move = v2, a\n", + " return v, move\n", + "\n", + " return max_value(state)\n", + "\n", + "infinity = math.inf\n", + "\n", + "def alphabeta_search(game, state):\n", + " \"\"\"Search game to determine best action; use alpha-beta pruning.\n", + " As in [Figure 5.7], this version searches all the way to the leaves.\"\"\"\n", + "\n", + " player = state.to_move\n", + "\n", + " def max_value(state, alpha, beta):\n", + " if game.is_terminal(state):\n", + " return game.utility(state, player), None\n", + " v, move = -infinity, None\n", + " for a in game.actions(state):\n", + " v2, _ = min_value(game.result(state, a), alpha, beta)\n", + " if v2 > v:\n", + " v, move = v2, a\n", + " alpha = max(alpha, v)\n", + " if v >= beta:\n", + " return v, move\n", + " return v, move\n", + "\n", + " def min_value(state, alpha, beta):\n", + " if game.is_terminal(state):\n", + " return game.utility(state, player), None\n", + " v, move = +infinity, None\n", + " for a in game.actions(state):\n", + " v2, _ = max_value(game.result(state, a), alpha, beta)\n", + " if v2 < v:\n", + " v, move = v2, a\n", + " beta = min(beta, v)\n", + " if v <= alpha:\n", + 
" return v, move\n", + " return v, move\n", + "\n", + " return max_value(state, -infinity, +infinity)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# A Simple Game: Tic-Tac-Toe\n", + "\n", + "We have the notion of an abstract game, we have some search functions; now it is time to define a real game; a simple one, tic-tac-toe. Moves are `(x, y)` pairs denoting squares, where `(0, 0)` is the top left, and `(2, 2)` is the bottom right (on a board of size `height=width=3`)." + ] + }, + { + "cell_type": "code", + "execution_count": 67, + "metadata": {}, + "outputs": [], + "source": [ + "class TicTacToe(Game):\n", + " \"\"\"Play TicTacToe on an `height` by `width` board, needing `k` in a row to win.\n", + " 'X' plays first against 'O'.\"\"\"\n", + "\n", + " def __init__(self, height=3, width=3, k=3):\n", + " self.k = k # k in a row\n", + " self.squares = {(x, y) for x in range(width) for y in range(height)}\n", + " self.initial = Board(height=height, width=width, to_move='X', utility=0)\n", + "\n", + " def actions(self, board):\n", + " \"\"\"Legal moves are any square not yet taken.\"\"\"\n", + " return self.squares - set(board)\n", + "\n", + " def result(self, board, square):\n", + " \"\"\"Place a marker for current player on square.\"\"\"\n", + " player = board.to_move\n", + " board = board.new({square: player}, to_move=('O' if player == 'X' else 'X'))\n", + " win = k_in_row(board, player, square, self.k)\n", + " board.utility = (0 if not win else +1 if player == 'X' else -1)\n", + " return board\n", + "\n", + " def utility(self, board, player):\n", + " \"\"\"Return the value to player; 1 for win, -1 for loss, 0 otherwise.\"\"\"\n", + " return board.utility if player == 'X' else -board.utility\n", + "\n", + " def is_terminal(self, board):\n", + " \"\"\"A board is a terminal state if it is won or there are no empty squares.\"\"\"\n", + " return board.utility != 0 or len(self.squares) == len(board)\n", + "\n", + " def display(self, board): 
print(board) \n",
+    "\n",
+    "\n",
+    "def k_in_row(board, player, square, k):\n",
+    "    \"\"\"True if player has k pieces in a line through square.\"\"\"\n",
+    "    def in_row(x, y, dx, dy): return 0 if board[x, y] != player else 1 + in_row(x + dx, y + dy, dx, dy)\n",
+    "    return any(in_row(*square, dx, dy) + in_row(*square, -dx, -dy) - 1 >= k\n",
+    "               for (dx, dy) in ((0, 1), (1, 0), (1, 1), (1, -1)))"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "States in tic-tac-toe (and other games) will be represented as a `Board`, which is a subclass of `defaultdict` that in general will consist of `{(x, y): contents}` pairs, for example `{(0, 0): 'X', (1, 1): 'O'}` might be the state of the board after two moves. Besides the contents of squares, a board also has some attributes: \n",
+    "- `.to_move` to name the player whose move it is; \n",
+    "- `.width` and `.height` to give the size of the board (both 3 in tic-tac-toe, but other numbers in related games);\n",
+    "- possibly other attributes, as specified by keywords. \n",
+    "\n",
+    "As a `defaultdict`, the `Board` class has a `__missing__` method, which returns `empty` for squares that have not been assigned but are within the `width` × `height` boundaries, or `off` otherwise. The class has a `__hash__` method, so instances can be stored in hash tables."
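The `__missing__` mechanism is easy to see in isolation. Below is a hypothetical mini-version of the idea (the names `Grid`, `empty`, and `off` are chosen for illustration; the real `Board` class is defined in the next cell):

```python
from collections import defaultdict

class Grid(defaultdict):
    """Toy board: unassigned in-bounds squares read as '.', out-of-bounds as '#'."""
    empty, off = '.', '#'

    def __init__(self, width=3, height=3):
        self.width, self.height = width, height

    def __missing__(self, loc):
        # Called by dict lookup whenever `loc` has no entry.
        x, y = loc
        in_bounds = 0 <= x < self.width and 0 <= y < self.height
        return self.empty if in_bounds else self.off

g = Grid()
g[0, 0] = 'X'
print(g[0, 0], g[1, 1], g[9, 9])  # X . #
```

Only assigned squares are stored; empty and off-board squares cost nothing, which keeps boards small and cheap to copy.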
+ ] + }, + { + "cell_type": "code", + "execution_count": 68, + "metadata": {}, + "outputs": [], + "source": [ + "class Board(defaultdict):\n", + " \"\"\"A board has the player to move, a cached utility value, \n", + " and a dict of {(x, y): player} entries, where player is 'X' or 'O'.\"\"\"\n", + " empty = '.'\n", + " off = '#'\n", + " \n", + " def __init__(self, width=8, height=8, to_move=None, **kwds):\n", + " self.__dict__.update(width=width, height=height, to_move=to_move, **kwds)\n", + " \n", + " def new(self, changes: dict, **kwds) -> 'Board':\n", + " \"Given a dict of {(x, y): contents} changes, return a new Board with the changes.\"\n", + " board = Board(width=self.width, height=self.height, **kwds)\n", + " board.update(self)\n", + " board.update(changes)\n", + " return board\n", + "\n", + " def __missing__(self, loc):\n", + " x, y = loc\n", + " if 0 <= x < self.width and 0 <= y < self.height:\n", + " return self.empty\n", + " else:\n", + " return self.off\n", + " \n", + " def __hash__(self): \n", + " return hash(tuple(sorted(self.items()))) + hash(self.to_move)\n", + " \n", + " def __repr__(self):\n", + " def row(y): return ' '.join(self[x, y] for x in range(self.width))\n", + " return '\\n'.join(map(row, range(self.height))) + '\\n'" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Players\n", + "\n", + "We need an interface for players. 
I'll represent a player as a `callable` that will be passed two arguments: `(game, state)` and will return a `move`.\n", + "The function `player` creates a player out of a search algorithm, but you can create your own players as functions, as is done with `random_player` below:" + ] + }, + { + "cell_type": "code", + "execution_count": 69, + "metadata": {}, + "outputs": [], + "source": [ + "def random_player(game, state): return random.choice(list(game.actions(state)))\n", + "\n", + "def player(search_algorithm):\n", + " \"\"\"A game player who uses the specified search algorithm\"\"\"\n", + " return lambda game, state: search_algorithm(game, state)[1]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Playing a Game\n", + "\n", + "We're ready to play a game. I'll set up a match between a `random_player` (who chooses randomly from the legal moves) and a `player(alphabeta_search)` (who makes the optimal alpha-beta move; practical for tic-tac-toe, but not for large games). The `player(alphabeta_search)` will never lose, but if `random_player` is lucky, it will be a tie." + ] + }, + { + "cell_type": "code", + "execution_count": 74, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Player X move: (0, 0)\n", + "X . .\n", + ". . .\n", + ". . .\n", + "\n", + "Player O move: (1, 1)\n", + "X . .\n", + ". O .\n", + ". . .\n", + "\n", + "Player X move: (1, 2)\n", + "X . .\n", + ". O .\n", + ". X .\n", + "\n", + "Player O move: (0, 1)\n", + "X . .\n", + "O O .\n", + ". X .\n", + "\n", + "Player X move: (2, 1)\n", + "X . .\n", + "O O X\n", + ". X .\n", + "\n", + "Player O move: (2, 0)\n", + "X . O\n", + "O O X\n", + ". X .\n", + "\n", + "Player X move: (2, 2)\n", + "X . O\n", + "O O X\n", + ". X X\n", + "\n", + "Player O move: (0, 2)\n", + "X . 
O\n", + "O O X\n", + "O X X\n", + "\n" + ] + }, + { + "data": { + "text/plain": [ + "-1" + ] + }, + "execution_count": 74, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "play_game(TicTacToe(), dict(X=random_player, O=player(alphabeta_search)), verbose=True).utility" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The alpha-beta player will never lose, but sometimes the random player can stumble into a draw. When two optimal (alpha-beta or minimax) players compete, it will always be a draw:" + ] + }, + { + "cell_type": "code", + "execution_count": 75, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Player X move: (0, 1)\n", + ". . .\n", + "X . .\n", + ". . .\n", + "\n", + "Player O move: (0, 0)\n", + "O . .\n", + "X . .\n", + ". . .\n", + "\n", + "Player X move: (2, 0)\n", + "O . X\n", + "X . .\n", + ". . .\n", + "\n", + "Player O move: (2, 1)\n", + "O . X\n", + "X . O\n", + ". . .\n", + "\n", + "Player X move: (1, 2)\n", + "O . X\n", + "X . O\n", + ". X .\n", + "\n", + "Player O move: (0, 2)\n", + "O . X\n", + "X . O\n", + "O X .\n", + "\n", + "Player X move: (1, 0)\n", + "O X X\n", + "X . O\n", + "O X .\n", + "\n", + "Player O move: (1, 1)\n", + "O X X\n", + "X O O\n", + "O X .\n", + "\n", + "Player X move: (2, 2)\n", + "O X X\n", + "X O O\n", + "O X X\n", + "\n" + ] + }, + { + "data": { + "text/plain": [ + "0" + ] + }, + "execution_count": 75, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "play_game(TicTacToe(), dict(X=player(alphabeta_search), O=player(minimax_search)), verbose=True).utility" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Connect Four\n", + "\n", + "Connect Four is a variant of tic-tac-toe, played on a larger (7 x 6) board, and with the restriction that in any column you can only play in the lowest empty square in the column." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 76, + "metadata": {}, + "outputs": [], + "source": [ + "class ConnectFour(TicTacToe):\n", + " \n", + " def __init__(self): super().__init__(width=7, height=6, k=4)\n", + "\n", + " def actions(self, board):\n", + " \"\"\"In each column you can play only the lowest empty square in the column.\"\"\"\n", + " return {(x, y) for (x, y) in self.squares - set(board)\n", + " if y == board.height - 1 or (x, y + 1) in board}" + ] + }, + { + "cell_type": "code", + "execution_count": 77, + "metadata": { + "scrolled": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Player X move: (2, 5)\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . X . . . .\n", + "\n", + "Player O move: (1, 5)\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". O X . . . .\n", + "\n", + "Player X move: (5, 5)\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". O X . . X .\n", + "\n", + "Player O move: (4, 5)\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". O X . O X .\n", + "\n", + "Player X move: (4, 4)\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . X . .\n", + ". O X . O X .\n", + "\n", + "Player O move: (2, 4)\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . O . X . .\n", + ". O X . O X .\n", + "\n", + "Player X move: (2, 3)\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . X . . . .\n", + ". . O . X . .\n", + ". O X . O X .\n", + "\n", + "Player O move: (1, 4)\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . X . . . .\n", + ". O O . X . .\n", + ". O X . O X .\n", + "\n", + "Player X move: (0, 5)\n", + ". . . . 
. . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . X . . . .\n", + ". O O . X . .\n", + "X O X . O X .\n", + "\n", + "Player O move: (5, 4)\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . X . . . .\n", + ". O O . X O .\n", + "X O X . O X .\n", + "\n", + "Player X move: (5, 3)\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . X . . X .\n", + ". O O . X O .\n", + "X O X . O X .\n", + "\n", + "Player O move: (6, 5)\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . X . . X .\n", + ". O O . X O .\n", + "X O X . O X O\n", + "\n", + "Player X move: (1, 3)\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". X X . . X .\n", + ". O O . X O .\n", + "X O X . O X O\n", + "\n", + "Player O move: (6, 4)\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". X X . . X .\n", + ". O O . X O O\n", + "X O X . O X O\n", + "\n", + "Player X move: (5, 2)\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . X .\n", + ". X X . . X .\n", + ". O O . X O O\n", + "X O X . O X O\n", + "\n", + "Player O move: (0, 4)\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . X .\n", + ". X X . . X .\n", + "O O O . X O O\n", + "X O X . O X O\n", + "\n", + "Player X move: (0, 3)\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . X .\n", + "X X X . . X .\n", + "O O O . X O O\n", + "X O X . O X O\n", + "\n", + "Player O move: (0, 2)\n", + ". . . . . . .\n", + ". . . . . . .\n", + "O . . . . X .\n", + "X X X . . X .\n", + "O O O . X O O\n", + "X O X . O X O\n", + "\n", + "Player X move: (0, 1)\n", + ". . . . . . .\n", + "X . . . . . .\n", + "O . . . . X .\n", + "X X X . . X .\n", + "O O O . X O O\n", + "X O X . O X O\n", + "\n", + "Player O move: (0, 0)\n", + "O . . . . . .\n", + "X . . . . . .\n", + "O . . . . X .\n", + "X X X . . X .\n", + "O O O . X O O\n", + "X O X . O X O\n", + "\n", + "Player X move: (5, 1)\n", + "O . . . . . .\n", + "X . . . . X .\n", + "O . . . . 
X .\n", + "X X X . . X .\n", + "O O O . X O O\n", + "X O X . O X O\n", + "\n", + "Player O move: (4, 3)\n", + "O . . . . . .\n", + "X . . . . X .\n", + "O . . . . X .\n", + "X X X . O X .\n", + "O O O . X O O\n", + "X O X . O X O\n", + "\n", + "Player X move: (6, 3)\n", + "O . . . . . .\n", + "X . . . . X .\n", + "O . . . . X .\n", + "X X X . O X X\n", + "O O O . X O O\n", + "X O X . O X O\n", + "\n", + "Player O move: (5, 0)\n", + "O . . . . O .\n", + "X . . . . X .\n", + "O . . . . X .\n", + "X X X . O X X\n", + "O O O . X O O\n", + "X O X . O X O\n", + "\n", + "Player X move: (3, 5)\n", + "O . . . . O .\n", + "X . . . . X .\n", + "O . . . . X .\n", + "X X X . O X X\n", + "O O O . X O O\n", + "X O X X O X O\n", + "\n", + "Player O move: (1, 2)\n", + "O . . . . O .\n", + "X . . . . X .\n", + "O O . . . X .\n", + "X X X . O X X\n", + "O O O . X O O\n", + "X O X X O X O\n", + "\n", + "Player X move: (1, 1)\n", + "O . . . . O .\n", + "X X . . . X .\n", + "O O . . . X .\n", + "X X X . O X X\n", + "O O O . X O O\n", + "X O X X O X O\n", + "\n", + "Player O move: (3, 4)\n", + "O . . . . O .\n", + "X X . . . X .\n", + "O O . . . X .\n", + "X X X . O X X\n", + "O O O O X O O\n", + "X O X X O X O\n", + "\n" + ] + }, + { + "data": { + "text/plain": [ + "-1" + ] + }, + "execution_count": 77, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "play_game(ConnectFour(), dict(X=random_player, O=random_player), verbose=True).utility" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Transposition Tables\n", + "\n", + "By treating the game tree as a tree, we can arrive at the same state through different paths, and end up duplicating effort. In state-space search, we kept a table of `reached` states to prevent this. For game-tree search, we can achieve the same effect by applying the `@cache` decorator to the `min_value` and `max_value` functions. 
We'll use the suffix `_tt` to indicate a function that uses these transposition tables."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 11,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "def minimax_search_tt(game, state):\n",
+    "    \"\"\"Search game to determine best move; return (value, move) pair.\"\"\"\n",
+    "\n",
+    "    player = state.to_move\n",
+    "\n",
+    "    @cache\n",
+    "    def max_value(state):\n",
+    "        if game.is_terminal(state):\n",
+    "            return game.utility(state, player), None\n",
+    "        v, move = -infinity, None\n",
+    "        for a in game.actions(state):\n",
+    "            v2, _ = min_value(game.result(state, a))\n",
+    "            if v2 > v:\n",
+    "                v, move = v2, a\n",
+    "        return v, move\n",
+    "\n",
+    "    @cache\n",
+    "    def min_value(state):\n",
+    "        if game.is_terminal(state):\n",
+    "            return game.utility(state, player), None\n",
+    "        v, move = +infinity, None\n",
+    "        for a in game.actions(state):\n",
+    "            v2, _ = max_value(game.result(state, a))\n",
+    "            if v2 < v:\n",
+    "                v, move = v2, a\n",
+    "        return v, move\n",
+    "\n",
+    "    return max_value(state)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "For alpha-beta search, we can still use a cache, but it should be based just on the state, not on whatever values alpha and beta have."
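A small self-contained illustration of why (the `value_full_key` function is a hypothetical stand-in, not the notebook's search): a cache keyed on `(state, alpha, beta)` re-evaluates the same state whenever the bounds differ, whereas a state-only key would hit after the first call.

```python
from functools import lru_cache

calls = 0

@lru_cache(maxsize=None)
def value_full_key(state, alpha, beta):
    # Stand-in for a real evaluation; just count how often it runs.
    global calls
    calls += 1
    return state

# The same state looked up under three different (alpha, beta) windows:
for alpha in (-1, 0, 1):
    value_full_key(42, alpha, +1)

print(calls)  # 3: each window is a distinct cache entry
```

Since alpha and beta vary from path to path, a full-key cache rarely hits; keying on the state alone trades a little precision for far more reuse.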
+ ] + }, + { + "cell_type": "code", + "execution_count": 79, + "metadata": {}, + "outputs": [], + "source": [ + "def cache1(function):\n", + " \"Like lru_cache(None), but only considers the first argument of function.\"\n", + " cache = {}\n", + " def wrapped(x, *args):\n", + " if x not in cache:\n", + " cache[x] = function(x, *args)\n", + " return cache[x]\n", + " return wrapped\n", + "\n", + "def alphabeta_search_tt(game, state):\n", + " \"\"\"Search game to determine best action; use alpha-beta pruning.\n", + " As in [Figure 5.7], this version searches all the way to the leaves.\"\"\"\n", + "\n", + " player = state.to_move\n", + "\n", + " @cache1\n", + " def max_value(state, alpha, beta):\n", + " if game.is_terminal(state):\n", + " return game.utility(state, player), None\n", + " v, move = -infinity, None\n", + " for a in game.actions(state):\n", + " v2, _ = min_value(game.result(state, a), alpha, beta)\n", + " if v2 > v:\n", + " v, move = v2, a\n", + " alpha = max(alpha, v)\n", + " if v >= beta:\n", + " return v, move\n", + " return v, move\n", + "\n", + " @cache1\n", + " def min_value(state, alpha, beta):\n", + " if game.is_terminal(state):\n", + " return game.utility(state, player), None\n", + " v, move = +infinity, None\n", + " for a in game.actions(state):\n", + " v2, _ = max_value(game.result(state, a), alpha, beta)\n", + " if v2 < v:\n", + " v, move = v2, a\n", + " beta = min(beta, v)\n", + " if v <= alpha:\n", + " return v, move\n", + " return v, move\n", + "\n", + " return max_value(state, -infinity, +infinity)" + ] + }, + { + "cell_type": "code", + "execution_count": 81, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 593 ms, sys: 52 ms, total: 645 ms\n", + "Wall time: 655 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "O X X\n", + "X O O\n", + "O X X" + ] + }, + "execution_count": 81, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time 
play_game(TicTacToe(), {'X':player(alphabeta_search_tt), 'O':player(minimax_search_tt)})"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 82,
+   "metadata": {},
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "CPU times: user 3.07 s, sys: 30.7 ms, total: 3.1 s\n",
+      "Wall time: 3.15 s\n"
+     ]
+    },
+    {
+     "data": {
+      "text/plain": [
+       "O X X\n",
+       "X O O\n",
+       "O X X"
+      ]
+     },
+     "execution_count": 82,
+     "metadata": {},
+     "output_type": "execute_result"
+    }
+   ],
+   "source": [
+    "%time play_game(TicTacToe(), {'X':player(alphabeta_search), 'O':player(minimax_search)})"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "# Heuristic Cutoffs"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 63,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "def cutoff_depth(d):\n",
+    "    \"\"\"A cutoff function that searches to depth d.\"\"\"\n",
+    "    return lambda game, state, depth: depth > d\n",
+    "\n",
+    "def h_alphabeta_search(game, state, cutoff=cutoff_depth(6), h=lambda s, p: 0):\n",
+    "    \"\"\"Search game to determine best action; use alpha-beta pruning.\n",
+    "    This version cuts off search using the cutoff test, and evaluates cut-off states with h.\"\"\"\n",
+    "\n",
+    "    player = state.to_move\n",
+    "\n",
+    "    @cache1\n",
+    "    def max_value(state, alpha, beta, depth):\n",
+    "        if game.is_terminal(state):\n",
+    "            return game.utility(state, player), None\n",
+    "        if cutoff(game, state, depth):\n",
+    "            return h(state, player), None\n",
+    "        v, move = -infinity, None\n",
+    "        for a in game.actions(state):\n",
+    "            v2, _ = min_value(game.result(state, a), alpha, beta, depth+1)\n",
+    "            if v2 > v:\n",
+    "                v, move = v2, a\n",
+    "                alpha = max(alpha, v)\n",
+    "            if v >= beta:\n",
+    "                return v, move\n",
+    "        return v, move\n",
+    "\n",
+    "    @cache1\n",
+    "    def min_value(state, alpha, beta, depth):\n",
+    "        if game.is_terminal(state):\n",
+    "            return game.utility(state, player), None\n",
+    "        if cutoff(game, state, depth):\n",
+    "            return h(state, player), 
None\n", + " v, move = +infinity, None\n", + " for a in game.actions(state):\n", + " v2, _ = max_value(game.result(state, a), alpha, beta, depth + 1)\n", + " if v2 < v:\n", + " v, move = v2, a\n", + " beta = min(beta, v)\n", + " if v <= alpha:\n", + " return v, move\n", + " return v, move\n", + "\n", + " return max_value(state, -infinity, +infinity, 0)" + ] + }, + { + "cell_type": "code", + "execution_count": 54, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 367 ms, sys: 7.9 ms, total: 375 ms\n", + "Wall time: 375 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "O X X\n", + "X O O\n", + "O X X" + ] + }, + "execution_count": 54, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time play_game(TicTacToe(), {'X':player(h_alphabeta_search), 'O':player(h_alphabeta_search)})" + ] + }, + { + "cell_type": "code", + "execution_count": 60, + "metadata": { + "scrolled": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . X .\n", + "\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . X O\n", + "\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . X\n", + ". . . . . X O\n", + "\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . O X\n", + ". . . . . X O\n", + "\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . O X\n", + ". . . . X X O\n", + "\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . O .\n", + ". . . . . O X\n", + ". . . . X X O\n", + "\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . O .\n", + ". . . . X O X\n", + ". . . . 
X X O\n", + "\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . O .\n", + ". . . . X O X\n", + ". O . . X X O\n", + "\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . O .\n", + ". X . . X O X\n", + ". O . . X X O\n", + "\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . O O\n", + ". X . . X O X\n", + ". O . . X X O\n", + "\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . X O O\n", + ". X . . X O X\n", + ". O . . X X O\n", + "\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . X O O\n", + ". X . . X O X\n", + ". O . O X X O\n", + "\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . X .\n", + ". . . . X O O\n", + ". X . . X O X\n", + ". O . O X X O\n", + "\n", + ". . . . . . .\n", + ". . . . . O .\n", + ". . . . . X .\n", + ". . . . X O O\n", + ". X . . X O X\n", + ". O . O X X O\n", + "\n", + ". . . . . . .\n", + ". . . . . O .\n", + ". . . . . X .\n", + ". X . . X O O\n", + ". X . . X O X\n", + ". O . O X X O\n", + "\n", + ". . . . . . .\n", + ". . . . . O .\n", + ". . . . . X O\n", + ". X . . X O O\n", + ". X . . X O X\n", + ". O . O X X O\n", + "\n", + ". . . . . . .\n", + ". . . . . O .\n", + ". X . . . X O\n", + ". X . . X O O\n", + ". X . . X O X\n", + ". O . O X X O\n", + "\n", + ". . . . . . .\n", + ". . . . . O .\n", + ". X . . . X O\n", + ". X . . X O O\n", + ". X . . X O X\n", + ". O O O X X O\n", + "\n", + ". . . . . . .\n", + ". . . . . O .\n", + ". X . . . X O\n", + ". X . . X O O\n", + ". X . . X O X\n", + "X O O O X X O\n", + "\n", + ". . . . . . .\n", + ". . . . . O O\n", + ". X . . . X O\n", + ". X . . X O O\n", + ". X . . X O X\n", + "X O O O X X O\n", + "\n", + ". . . . . . X\n", + ". . . . . O O\n", + ". X . . . X O\n", + ". X . . X O O\n", + ". X . . X O X\n", + "X O O O X X O\n", + "\n", + ". . . . . . X\n", + ". . . . . O O\n", + ". X . . . X O\n", + ". X . . X O O\n", + "O X . . 
X O X\n", + "X O O O X X O\n", + "\n", + ". . . . . X X\n", + ". . . . . O O\n", + ". X . . . X O\n", + ". X . . X O O\n", + "O X . . X O X\n", + "X O O O X X O\n", + "\n", + ". . . . . X X\n", + ". . . . . O O\n", + ". X . . . X O\n", + ". X . . X O O\n", + "O X . O X O X\n", + "X O O O X X O\n", + "\n", + ". . . . . X X\n", + ". . . . . O O\n", + ". X . . . X O\n", + ". X . X X O O\n", + "O X . O X O X\n", + "X O O O X X O\n", + "\n", + ". . . . . X X\n", + ". . . . . O O\n", + ". X . . O X O\n", + ". X . X X O O\n", + "O X . O X O X\n", + "X O O O X X O\n", + "\n", + ". . . . . X X\n", + ". . . . . O O\n", + ". X . X O X O\n", + ". X . X X O O\n", + "O X . O X O X\n", + "X O O O X X O\n", + "\n", + ". . . . . X X\n", + ". . . . O O O\n", + ". X . X O X O\n", + ". X . X X O O\n", + "O X . O X O X\n", + "X O O O X X O\n", + "\n", + ". . . . . X X\n", + ". . . X O O O\n", + ". X . X O X O\n", + ". X . X X O O\n", + "O X . O X O X\n", + "X O O O X X O\n", + "\n", + ". . . . O X X\n", + ". . . X O O O\n", + ". X . X O X O\n", + ". X . X X O O\n", + "O X . O X O X\n", + "X O O O X X O\n", + "\n", + ". . . X O X X\n", + ". . . X O O O\n", + ". X . X O X O\n", + ". X . X X O O\n", + "O X . O X O X\n", + "X O O O X X O\n", + "\n", + "CPU times: user 8.82 s, sys: 146 ms, total: 8.96 s\n", + "Wall time: 9.19 s\n" + ] + }, + { + "data": { + "text/plain": [ + "1" + ] + }, + "execution_count": 60, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time play_game(ConnectFour(), {'X':player(h_alphabeta_search), 'O':random_player}, verbose=True).utility" + ] + }, + { + "cell_type": "code", + "execution_count": 61, + "metadata": { + "scrolled": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . X .\n", + "\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . 
. . . O .\n", + ". . . . . X .\n", + "\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . O .\n", + ". . . . X X .\n", + "\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . O .\n", + ". . O . X X .\n", + "\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . X O .\n", + ". . O . X X .\n", + "\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . X O .\n", + ". . O O X X .\n", + "\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . X O .\n", + ". X O O X X .\n", + "\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". O . . X O .\n", + ". X O O X X .\n", + "\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". X . . . . .\n", + ". O . . X O .\n", + ". X O O X X .\n", + "\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". O . . . . .\n", + ". X . . . . .\n", + ". O . . X O .\n", + ". X O O X X .\n", + "\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". O . . . . .\n", + ". X . . . . .\n", + ". O . . X O .\n", + "X X O O X X .\n", + "\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". O . . . . .\n", + ". X . . . . .\n", + "O O . . X O .\n", + "X X O O X X .\n", + "\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". O . . . . .\n", + ". X . . X . .\n", + "O O . . X O .\n", + "X X O O X X .\n", + "\n", + ". . . . . . .\n", + ". . . . . . .\n", + ". O . . O . .\n", + ". X . . X . .\n", + "O O . . X O .\n", + "X X O O X X .\n", + "\n", + ". . . . . . .\n", + ". . . . X . .\n", + ". O . . O . .\n", + ". X . . X . .\n", + "O O . . X O .\n", + "X X O O X X .\n", + "\n", + ". . . . . . .\n", + ". . . . X . .\n", + ". O . . O . .\n", + ". X . . X O .\n", + "O O . . X O .\n", + "X X O O X X .\n", + "\n", + ". . . . . . .\n", + ". . . . X . .\n", + ". O . . O X .\n", + ". X . . 
X O .\n", + "O O . . X O .\n", + "X X O O X X .\n", + "\n", + ". . . . . . .\n", + ". . . . X . .\n", + ". O . . O X .\n", + ". X . . X O .\n", + "O O O . X O .\n", + "X X O O X X .\n", + "\n", + ". . . . . . .\n", + ". . . . X . .\n", + ". O . . O X .\n", + ". X . . X O .\n", + "O O O X X O .\n", + "X X O O X X .\n", + "\n", + ". . . . . . .\n", + ". . . . X O .\n", + ". O . . O X .\n", + ". X . . X O .\n", + "O O O X X O .\n", + "X X O O X X .\n", + "\n", + ". . . . . . .\n", + ". . . . X O .\n", + ". O . . O X .\n", + ". X . X X O .\n", + "O O O X X O .\n", + "X X O O X X .\n", + "\n", + ". . . . . . .\n", + ". . . . X O .\n", + ". O . O O X .\n", + ". X . X X O .\n", + "O O O X X O .\n", + "X X O O X X .\n", + "\n", + ". . . . . . .\n", + ". . . X X O .\n", + ". O . O O X .\n", + ". X . X X O .\n", + "O O O X X O .\n", + "X X O O X X .\n", + "\n", + ". . . O . . .\n", + ". . . X X O .\n", + ". O . O O X .\n", + ". X . X X O .\n", + "O O O X X O .\n", + "X X O O X X .\n", + "\n", + ". . . O . . .\n", + ". . . X X O .\n", + ". O . O O X .\n", + ". 
X X X X O .\n", + "O O O X X O .\n", + "X X O O X X .\n", + "\n", + "CPU times: user 12.1 s, sys: 237 ms, total: 12.4 s\n", + "Wall time: 12.9 s\n" + ] + }, + { + "data": { + "text/plain": [ + "1" + ] + }, + "execution_count": 61, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time play_game(ConnectFour(), {'X':player(h_alphabeta_search), 'O':player(h_alphabeta_search)}, verbose=True).utility" + ] + }, + { + "cell_type": "code", + "execution_count": 83, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Result states: 6,589; Terminal tests: 3,653; for alphabeta_search_tt\n", + "Result states: 25,703; Terminal tests: 25,704; for alphabeta_search\n", + "Result states: 4,687; Terminal tests: 2,805; for h_alphabeta_search\n", + "Result states: 16,167; Terminal tests: 5,478; for minimax_search_tt\n" + ] + } + ], + "source": [ + "class CountCalls:\n", + " \"\"\"Delegate all attribute gets to the object, and count them in ._counts\"\"\"\n", + " def __init__(self, obj):\n", + " self._object = obj\n", + " self._counts = Counter()\n", + " \n", + " def __getattr__(self, attr):\n", + " \"Delegate to the original object, after incrementing a counter.\"\n", + " self._counts[attr] += 1\n", + " return getattr(self._object, attr)\n", + " \n", + "def report(game, searchers):\n", + " for searcher in searchers:\n", + " game = CountCalls(game)\n", + " searcher(game, game.initial)\n", + " print('Result states: {:7,d}; Terminal tests: {:7,d}; for {}'.format(\n", + " game._counts['result'], game._counts['is_terminal'], searcher.__name__))\n", + " \n", + "report(TicTacToe(), (alphabeta_search_tt, alphabeta_search, h_alphabeta_search, minimax_search_tt))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Monte Carlo Tree Search" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "class Node:\n", + " def __init__(self, 
parent):\n", + " # placeholder skeleton; MCTS is developed in the scratch cells below\n", + " self.parent = parent\n", + "\n", + "def mcts(state, game, N=1000):\n", + " ..." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Heuristic Search Algorithms" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "t = CountCalls(TicTacToe())\n", + " \n", + "play_game(t, dict(X=minimax_player, O=minimax_player), verbose=True)\n", + "t._counts" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "for tactic in (three, fork, center, opposite_corner, corner, any):\n", + " for s in squares:\n", + " if tactic(board, s, player): return s\n", + " for s in squares:\n", + " if tactic(board, s, opponent): return s" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "\n", + "\n", + "def ucb(U, N, C=2**0.5, parentN=100):\n", + " return round(U/N + C * math.sqrt(math.log(parentN)/N), 2)\n", + "\n", + "{C: (ucb(60, 79, C), ucb(1, 10, C), ucb(2, 11, C)) \n", + " for C in (1.4, 1.5)}\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def ucb(U, N, parentN=100, C=2):\n", + " return U/N + C * math.sqrt(math.log(parentN)/N)\n", + "\n", + "\n", + "C = 1.4 \n", + "\n", + "class Node:\n", + " def __init__(self, name, children=(), U=0, N=0, parent=None, p=0.5):\n", + " self.__dict__.update(name=name, U=U, N=N, parent=parent, children=children, p=p)\n", + " for c in children:\n", + " c.parent = self\n", + " \n", + " def __repr__(self):\n", + " return '{}:{}/{}={:.0%}{}'.format(self.name, self.U, self.N, self.U/self.N, 
self.children)\n", + " \n", + "def select(n):\n", + " if n.children:\n", + " return select(max(n.children, key=ucb))\n", + " else:\n", + " return n\n", + " \n", + "def back(n, amount):\n", + " if n:\n", + " n.N += 1\n", + " n.U += amount\n", + " back(n.parent, 1 - amount)\n", + " \n", + " \n", + "def one(root): \n", + " n = select(root)\n", + " amount = int(random.uniform(0, 1) < n.p)\n", + " back(n, amount)\n", + " \n", + "def ucb(n): \n", + " return (float('inf') if n.N == 0 else\n", + " n.U / n.N + C * math.sqrt(math.log(n.parent.N)/n.N))\n", + "\n", + "\n", + "tree = Node('root', [Node('a', p=.8, children=[Node('a1', p=.05), \n", + " Node('a2', p=.25,\n", + " children=[Node('a2a', p=.7), Node('a2b')])]),\n", + " Node('b', p=.5, children=[Node('b1', p=.6,\n", + " children=[Node('b1a', p=.3), Node('b1b')]), \n", + " Node('b2', p=.4)]),\n", + " Node('c', p=.1)])\n", + "\n", + "for i in range(100):\n", + " one(tree); \n", + "for c in tree.children: print(c)\n", + "'select', select(tree), 'tree', tree\n", + "\n", + "\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "us = (100, 50, 25, 10, 5, 1)\n", + "infinity = float('inf')\n", + "\n", + "@lru_cache(None)\n", + "def f1(n, denom):\n", + " return (0 if n == 0 else\n", + " infinity if n < 0 or not denom else\n", + " min(1 + f1(n - denom[0], denom),\n", + " f1(n, denom[1:])))\n", + " \n", + "@lru_cache(None)\n", + "def f2(n, denom):\n", + " @lru_cache(None)\n", + " def f(n):\n", + " return (0 if n == 0 else\n", + " infinity if n < 0 else\n", + " 1 + min(f(n - d) for d in denom))\n", + " return f(n)\n", + "\n", + "@lru_cache(None)\n", + "def f3(n, denom):\n", + " return (0 if n == 0 else\n", + " infinity if n < 0 or not denom else\n", + " min(k + f2(n - k * denom[0], denom[1:]) \n", + " for k in range(1 + n // denom[0])))\n", + " \n", + "\n", + "def g(n, d=us): return f1(n, d), f2(n, d), f3(n, d)\n", + " \n", + "n = 12345\n", + "%time f1(n, us)\n", + 
"%time f2(n, us)\n", + "%time f3(n, us)\n", + " " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.7.2" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/games4e.py b/games4e.py new file mode 100644 index 000000000..aba5b0eb3 --- /dev/null +++ b/games4e.py @@ -0,0 +1,635 @@ +"""Games or Adversarial Search (Chapter 5)""" + +import copy +import itertools +import random +from collections import namedtuple + +import numpy as np + +from utils4e import vector_add, MCT_Node, ucb + +GameState = namedtuple('GameState', 'to_move, utility, board, moves') +StochasticGameState = namedtuple('StochasticGameState', 'to_move, utility, board, moves, chance') + + +# ______________________________________________________________________________ +# MinMax Search + + +def minmax_decision(state, game): + """Given a state in a game, calculate the best move by searching + forward all the way to the terminal states. 
[Figure 5.3]""" + + player = game.to_move(state) + + def max_value(state): + if game.terminal_test(state): + return game.utility(state, player) + v = -np.inf + for a in game.actions(state): + v = max(v, min_value(game.result(state, a))) + return v + + def min_value(state): + if game.terminal_test(state): + return game.utility(state, player) + v = np.inf + for a in game.actions(state): + v = min(v, max_value(game.result(state, a))) + return v + + # Body of minmax_decision: + return max(game.actions(state), key=lambda a: min_value(game.result(state, a))) + + +# ______________________________________________________________________________ + + +def expect_minmax(state, game): + """ + [Figure 5.11] + Return the best move for a player after dice are thrown. The game tree + includes chance nodes along with min and max nodes. + """ + player = game.to_move(state) + + def max_value(state): + v = -np.inf + for a in game.actions(state): + v = max(v, chance_node(state, a)) + return v + + def min_value(state): + v = np.inf + for a in game.actions(state): + v = min(v, chance_node(state, a)) + return v + + def chance_node(state, action): + res_state = game.result(state, action) + if game.terminal_test(res_state): + return game.utility(res_state, player) + sum_chances = 0 + num_chances = len(game.chances(res_state)) + for chance in game.chances(res_state): + res_state = game.outcome(res_state, chance) + util = 0 + if res_state.to_move == player: + util = max_value(res_state) + else: + util = min_value(res_state) + sum_chances += util * game.probability(chance) + return sum_chances / num_chances + + # Body of expect_min_max: + return max(game.actions(state), key=lambda a: chance_node(state, a), default=None) + + +def alpha_beta_search(state, game): + """Search game to determine best action; use alpha-beta pruning. 
+ As in [Figure 5.7], this version searches all the way to the leaves.""" + + player = game.to_move(state) + + # Functions used by alpha_beta + def max_value(state, alpha, beta): + if game.terminal_test(state): + return game.utility(state, player) + v = -np.inf + for a in game.actions(state): + v = max(v, min_value(game.result(state, a), alpha, beta)) + if v >= beta: + return v + alpha = max(alpha, v) + return v + + def min_value(state, alpha, beta): + if game.terminal_test(state): + return game.utility(state, player) + v = np.inf + for a in game.actions(state): + v = min(v, max_value(game.result(state, a), alpha, beta)) + if v <= alpha: + return v + beta = min(beta, v) + return v + + # Body of alpha_beta_search: + best_score = -np.inf + beta = np.inf + best_action = None + for a in game.actions(state): + v = min_value(game.result(state, a), best_score, beta) + if v > best_score: + best_score = v + best_action = a + return best_action + + +def alpha_beta_cutoff_search(state, game, d=4, cutoff_test=None, eval_fn=None): + """Search game to determine best action; use alpha-beta pruning. 
+ This version cuts off search and uses an evaluation function.""" + + player = game.to_move(state) + + # Functions used by alpha_beta + def max_value(state, alpha, beta, depth): + if cutoff_test(state, depth): + return eval_fn(state) + v = -np.inf + for a in game.actions(state): + v = max(v, min_value(game.result(state, a), alpha, beta, depth + 1)) + if v >= beta: + return v + alpha = max(alpha, v) + return v + + def min_value(state, alpha, beta, depth): + if cutoff_test(state, depth): + return eval_fn(state) + v = np.inf + for a in game.actions(state): + v = min(v, max_value(game.result(state, a), alpha, beta, depth + 1)) + if v <= alpha: + return v + beta = min(beta, v) + return v + + # Body of alpha_beta_cutoff_search starts here: + # The default test cuts off at depth d or at a terminal state + cutoff_test = (cutoff_test or (lambda state, depth: depth > d or game.terminal_test(state))) + eval_fn = eval_fn or (lambda state: game.utility(state, player)) + best_score = -np.inf + beta = np.inf + best_action = None + for a in game.actions(state): + v = min_value(game.result(state, a), best_score, beta, 1) + if v > best_score: + best_score = v + best_action = a + return best_action + + +# ______________________________________________________________________________ +# Monte Carlo Tree Search + + +def monte_carlo_tree_search(state, game, N=1000): + def select(n): + """select a leaf node in the tree""" + if n.children: + return select(max(n.children.keys(), key=ucb)) + else: + return n + + def expand(n): + """expand the leaf node by adding all its children states""" + if not n.children and not game.terminal_test(n.state): + n.children = {MCT_Node(state=game.result(n.state, action), parent=n): action + for action in game.actions(n.state)} + return select(n) + + def simulate(game, state): + """simulate the utility of current state by random picking a step""" + player = game.to_move(state) + while not game.terminal_test(state): + action = 
random.choice(list(game.actions(state))) + state = game.result(state, action) + v = game.utility(state, player) + return -v + + def backprop(n, utility): + """passing the utility back to all parent nodes""" + if utility > 0: + n.U += utility + # if utility == 0: + # n.U += 0.5 + n.N += 1 + if n.parent: + backprop(n.parent, -utility) + + root = MCT_Node(state=state) + + for _ in range(N): + leaf = select(root) + child = expand(leaf) + result = simulate(game, child.state) + backprop(child, result) + + max_state = max(root.children, key=lambda p: p.N) + + return root.children.get(max_state) + + +# ______________________________________________________________________________ +# Players for Games + + +def query_player(game, state): + """Make a move by querying standard input.""" + print("current state:") + game.display(state) + print("available moves: {}".format(game.actions(state))) + print("") + move = None + if game.actions(state): + move_string = input('Your move? ') + try: + move = eval(move_string) + except NameError: + move = move_string + else: + print('no legal moves: passing turn to next player') + return move + + +def random_player(game, state): + """A player that chooses a legal move at random.""" + return random.choice(game.actions(state)) if game.actions(state) else None + + +def alpha_beta_player(game, state): + return alpha_beta_search(state, game) + + +def expect_min_max_player(game, state): + return expect_minmax(state, game) + + +def mcts_player(game, state): + return monte_carlo_tree_search(state, game) + + +# ______________________________________________________________________________ +# Some Sample Games + + +class Game: + """A game is similar to a problem, but it has a utility for each + state and a terminal test instead of a path cost and a goal + test. To create a game, subclass this class and implement actions, + result, utility, and terminal_test. You may override display and + successors or you can inherit their default methods. 
You will also + need to set the .initial attribute to the initial state; this can + be done in the constructor.""" + + def actions(self, state): + """Return a list of the allowable moves at this point.""" + raise NotImplementedError + + def result(self, state, move): + """Return the state that results from making a move from a state.""" + raise NotImplementedError + + def utility(self, state, player): + """Return the value of this final state to player.""" + raise NotImplementedError + + def terminal_test(self, state): + """Return True if this is a final state for the game.""" + return not self.actions(state) + + def to_move(self, state): + """Return the player whose move it is in this state.""" + return state.to_move + + def display(self, state): + """Print or otherwise display the state.""" + print(state) + + def __repr__(self): + return '<{}>'.format(self.__class__.__name__) + + def play_game(self, *players): + """Play an n-person, move-alternating game.""" + state = self.initial + while True: + for player in players: + move = player(self, state) + state = self.result(state, move) + if self.terminal_test(state): + self.display(state) + return self.utility(state, self.to_move(self.initial)) + + +class StochasticGame(Game): + """A stochastic game includes uncertain events which influence + the moves of players at each state. 
To create a stochastic game, subclass + this class and implement chances and outcome along with the other + unimplemented game class methods.""" + + def chances(self, state): + """Return a list of all possible uncertain events at a state.""" + raise NotImplementedError + + def outcome(self, state, chance): + """Return the state which is the outcome of a chance trial.""" + raise NotImplementedError + + def probability(self, chance): + """Return the probability of occurrence of a chance.""" + raise NotImplementedError + + def play_game(self, *players): + """Play an n-person, move-alternating stochastic game.""" + state = self.initial + while True: + for player in players: + chance = random.choice(self.chances(state)) + state = self.outcome(state, chance) + move = player(self, state) + state = self.result(state, move) + if self.terminal_test(state): + self.display(state) + return self.utility(state, self.to_move(self.initial)) + + +class Fig52Game(Game): + """The game represented in [Figure 5.2]. Serves as a simple test case.""" + + succs = dict(A=dict(a1='B', a2='C', a3='D'), + B=dict(b1='B1', b2='B2', b3='B3'), + C=dict(c1='C1', c2='C2', c3='C3'), + D=dict(d1='D1', d2='D2', d3='D3')) + utils = dict(B1=3, B2=12, B3=8, C1=2, C2=4, C3=6, D1=14, D2=5, D3=2) + initial = 'A' + + def actions(self, state): + return list(self.succs.get(state, {}).keys()) + + def result(self, state, move): + return self.succs[state][move] + + def utility(self, state, player): + if player == 'MAX': + return self.utils[state] + else: + return -self.utils[state] + + def terminal_test(self, state): + return state not in ('A', 'B', 'C', 'D') + + def to_move(self, state): + return 'MIN' if state in 'BCD' else 'MAX' + + +class Fig52Extended(Game): + """Similar to Fig52Game but bigger. 
Useful for visualisation""" + + succs = {i: dict(l=i * 3 + 1, m=i * 3 + 2, r=i * 3 + 3) for i in range(13)} + utils = dict() + + def actions(self, state): + return sorted(list(self.succs.get(state, {}).keys())) + + def result(self, state, move): + return self.succs[state][move] + + def utility(self, state, player): + if player == 'MAX': + return self.utils[state] + else: + return -self.utils[state] + + def terminal_test(self, state): + return state not in range(13) + + def to_move(self, state): + return 'MIN' if state in {1, 2, 3} else 'MAX' + + +class TicTacToe(Game): + """Play TicTacToe on an h x v board, with Max (first player) playing 'X'. + A state has the player to move, a cached utility, a list of moves in + the form of a list of (x, y) positions, and a board, in the form of + a dict of {(x, y): Player} entries, where Player is 'X' or 'O'.""" + + def __init__(self, h=3, v=3, k=3): + self.h = h + self.v = v + self.k = k + moves = [(x, y) for x in range(1, h + 1) + for y in range(1, v + 1)] + self.initial = GameState(to_move='X', utility=0, board={}, moves=moves) + + def actions(self, state): + """Legal moves are any square not yet taken.""" + return state.moves + + def result(self, state, move): + if move not in state.moves: + return state # Illegal move has no effect + board = state.board.copy() + board[move] = state.to_move + moves = list(state.moves) + moves.remove(move) + return GameState(to_move=('O' if state.to_move == 'X' else 'X'), + utility=self.compute_utility(board, move, state.to_move), + board=board, moves=moves) + + def utility(self, state, player): + """Return the value to player; 1 for win, -1 for loss, 0 otherwise.""" + return state.utility if player == 'X' else -state.utility + + def terminal_test(self, state): + """A state is terminal if it is won or there are no empty squares.""" + return state.utility != 0 or len(state.moves) == 0 + + def display(self, state): + board = state.board + for x in range(1, self.h + 1): + for y in range(1, 
self.v + 1): + print(board.get((x, y), '.'), end=' ') + print() + + def compute_utility(self, board, move, player): + """If 'X' wins with this move, return 1; if 'O' wins return -1; else return 0.""" + if (self.k_in_row(board, move, player, (0, 1)) or + self.k_in_row(board, move, player, (1, 0)) or + self.k_in_row(board, move, player, (1, -1)) or + self.k_in_row(board, move, player, (1, 1))): + return +1 if player == 'X' else -1 + else: + return 0 + + def k_in_row(self, board, move, player, delta_x_y): + """Return true if there is a line through move on board for player.""" + (delta_x, delta_y) = delta_x_y + x, y = move + n = 0 # n is number of moves in row + while board.get((x, y)) == player: + n += 1 + x, y = x + delta_x, y + delta_y + x, y = move + while board.get((x, y)) == player: + n += 1 + x, y = x - delta_x, y - delta_y + n -= 1 # Because we counted move itself twice + return n >= self.k + + +class ConnectFour(TicTacToe): + """A TicTacToe-like game in which you can only make a move on the bottom + row, or in a square directly above an occupied square. Traditionally + played on a 7x6 board and requiring 4 in a row.""" + + def __init__(self, h=7, v=6, k=4): + TicTacToe.__init__(self, h, v, k) + + def actions(self, state): + return [(x, y) for (x, y) in state.moves + if y == 1 or (x, y - 1) in state.board] + + +class Backgammon(StochasticGame): + """A two player game where the goal of each player is to move all the + checkers off the board. 
The moves for each state are determined by + rolling a pair of dice.""" + + def __init__(self): + """Initial state of the game""" + point = {'W': 0, 'B': 0} + board = [point.copy() for index in range(24)] + board[0]['B'] = board[23]['W'] = 2 + board[5]['W'] = board[18]['B'] = 5 + board[7]['W'] = board[16]['B'] = 3 + board[11]['B'] = board[12]['W'] = 5 + self.allow_bear_off = {'W': False, 'B': False} + self.direction = {'W': -1, 'B': 1} + self.initial = StochasticGameState(to_move='W', + utility=0, + board=board, + moves=self.get_all_moves(board, 'W'), chance=None) + + def actions(self, state): + """Return a list of legal moves for a state.""" + player = state.to_move + moves = state.moves + if len(moves) == 1 and len(moves[0]) == 1: + return moves + legal_moves = [] + for move in moves: + board = copy.deepcopy(state.board) + if self.is_legal_move(board, move, state.chance, player): + legal_moves.append(move) + return legal_moves + + def result(self, state, move): + board = copy.deepcopy(state.board) + player = state.to_move + self.move_checker(board, move[0], state.chance[0], player) + if len(move) == 2: + self.move_checker(board, move[1], state.chance[1], player) + to_move = ('W' if player == 'B' else 'B') + return StochasticGameState(to_move=to_move, + utility=self.compute_utility(board, move, player), + board=board, + moves=self.get_all_moves(board, to_move), chance=None) + + def utility(self, state, player): + """Return the value to player; 1 for win, -1 for loss, 0 otherwise.""" + return state.utility if player == 'W' else -state.utility + + def terminal_test(self, state): + """A state is terminal if one player wins.""" + return state.utility != 0 + + def get_all_moves(self, board, player): + """All possible moves for a player i.e. 
all possible ways of + choosing two checkers of a player from the board for a move + at a given state.""" + all_points = board + taken_points = [index for index, point in enumerate(all_points) + if point[player] > 0] + if self.checkers_at_home(board, player) == 1: + return [(taken_points[0],)] + moves = list(itertools.permutations(taken_points, 2)) + moves = moves + [(index, index) for index, point in enumerate(all_points) + if point[player] >= 2] + return moves + + def display(self, state): + """Display state of the game.""" + board = state.board + player = state.to_move + print("current state : ") + for index, point in enumerate(board): + print("point : ", index, " W : ", point['W'], " B : ", point['B']) + print("to play : ", player) + + def compute_utility(self, board, move, player): + """If 'W' wins with this move, return 1; if 'B' wins return -1; else return 0.""" + util = {'W': 1, 'B': -1} + for idx in range(0, 24): + if board[idx][player] > 0: + return 0 + return util[player] + + def checkers_at_home(self, board, player): + """Return the no. of checkers at home for a player.""" + sum_range = range(0, 7) if player == 'W' else range(17, 24) + count = 0 + for idx in sum_range: + count = count + board[idx][player] + return count + + def is_legal_move(self, board, start, steps, player): + """Move is a tuple which contains starting points of checkers to be + moved during a player's turn. An on-board move is legal if both the destinations + are open. A bear-off move is the one where a checker is moved off-board. 
+ It is legal only after a player has moved all his checkers to his home.""" + dest1, dest2 = vector_add(start, steps) + dest_range = range(0, 24) + move1_legal = move2_legal = False + if dest1 in dest_range: + if self.is_point_open(player, board[dest1]): + self.move_checker(board, start[0], steps[0], player) + move1_legal = True + else: + if self.allow_bear_off[player]: + self.move_checker(board, start[0], steps[0], player) + move1_legal = True + if not move1_legal: + return False + if dest2 in dest_range: + if self.is_point_open(player, board[dest2]): + move2_legal = True + else: + if self.allow_bear_off[player]: + move2_legal = True + return move1_legal and move2_legal + + def move_checker(self, board, start, steps, player): + """Move a checker from starting point by a given number of steps""" + dest = start + steps + dest_range = range(0, 24) + board[start][player] -= 1 + if dest in dest_range: + board[dest][player] += 1 + if self.checkers_at_home(board, player) == 15: + self.allow_bear_off[player] = True + + def is_point_open(self, player, point): + """A point is open for a player if the no. of opponent's + checkers already present on it is 0 or 1. 
A player can
+        move a checker to a point only if it is open."""
+        opponent = 'B' if player == 'W' else 'W'
+        return point[opponent] <= 1
+
+    def chances(self, state):
+        """Return a list of all possible dice rolls at a state."""
+        dice_rolls = list(itertools.combinations_with_replacement([1, 2, 3, 4, 5, 6], 2))
+        return dice_rolls
+
+    def outcome(self, state, chance):
+        """Return the state which is the outcome of a dice roll."""
+        dice = tuple(map((self.direction[state.to_move]).__mul__, chance))
+        return StochasticGameState(to_move=state.to_move,
+                                   utility=state.utility,
+                                   board=state.board,
+                                   moves=state.moves, chance=dice)
+
+    def probability(self, chance):
+        """Return the probability of occurrence of a dice roll."""
+        return 1 / 36 if chance[0] == chance[1] else 1 / 18
diff --git a/gui/eight_puzzle.py b/gui/eight_puzzle.py
new file mode 100644
index 000000000..5733228d7
--- /dev/null
+++ b/gui/eight_puzzle.py
@@ -0,0 +1,151 @@
+import os.path
+import random
+import sys
+import time
+from functools import partial
+from tkinter import *
+
+sys.path.append(os.path.join(os.path.dirname(__file__), '..'))
+
+from search import astar_search, EightPuzzle
+
+root = Tk()
+
+state = [1, 2, 3, 4, 5, 6, 7, 8, 0]
+puzzle = EightPuzzle(tuple(state))
+solution = None
+
+b = [None] * 9
+
+
+# TODO: refactor into OOP, remove global variables
+
+def scramble():
+    """Scrambles the puzzle starting from the goal state"""
+
+    global state
+    global puzzle
+    possible_actions = ['UP', 'DOWN', 'LEFT', 'RIGHT']
+    scramble = []
+    for _ in range(60):
+        scramble.append(random.choice(possible_actions))
+
+    for move in scramble:
+        if move in puzzle.actions(state):
+            state = list(puzzle.result(state, move))
+            puzzle = EightPuzzle(tuple(state))
+            create_buttons()
+
+
+def solve():
+    """Solves the puzzle using astar_search"""
+
+    return astar_search(puzzle).solution()
+
+
+def solve_steps():
+    """Solves the puzzle step by step"""
+
+    global puzzle
+    global solution
+    global state
+    solution = solve()
+    print(solution)
+
+    for move in solution:
+        state = list(puzzle.result(state, move))
+        create_buttons()
+        root.update()
+        time.sleep(0.75)
+
+
+def exchange(index):
+    """Interchanges the position of the selected tile with the zero tile under certain conditions"""
+
+    global state
+    global solution
+    global puzzle
+    zero_ix = list(state).index(0)
+    actions = puzzle.actions(state)
+    current_action = ''
+    i_diff = index // 3 - zero_ix // 3
+    j_diff = index % 3 - zero_ix % 3
+    if i_diff == 1:
+        current_action += 'DOWN'
+    elif i_diff == -1:
+        current_action += 'UP'
+
+    if j_diff == 1:
+        current_action += 'RIGHT'
+    elif j_diff == -1:
+        current_action += 'LEFT'
+
+    if abs(i_diff) + abs(j_diff) != 1:
+        current_action = ''
+
+    if current_action in actions:
+        b[zero_ix].grid_forget()
+        b[zero_ix] = Button(root, text=f'{state[index]}', width=6, font=('Helvetica', 40, 'bold'),
+                            command=partial(exchange, zero_ix))
+        b[zero_ix].grid(row=zero_ix // 3, column=zero_ix % 3, ipady=40)
+        b[index].grid_forget()
+        b[index] = Button(root, text=None, width=6, font=('Helvetica', 40, 'bold'), command=partial(exchange, index))
+        b[index].grid(row=index // 3, column=index % 3, ipady=40)
+        state[zero_ix], state[index] = state[index], state[zero_ix]
+        puzzle = EightPuzzle(tuple(state))
+
+
+def create_buttons():
+    """Creates dynamic buttons"""
+
+    # TODO: Find a way to use grid_forget() with a for loop for initialization
+    b[0] = Button(root, text=f'{state[0]}' if state[0] != 0 else None, width=6, font=('Helvetica', 40, 'bold'),
+                  command=partial(exchange, 0))
+    b[0].grid(row=0, column=0, ipady=40)
+    b[1] = Button(root, text=f'{state[1]}' if state[1] != 0 else None, width=6, font=('Helvetica', 40, 'bold'),
+                  command=partial(exchange, 1))
+    b[1].grid(row=0, column=1, ipady=40)
+    b[2] = Button(root, text=f'{state[2]}' if state[2] != 0 else None, width=6, font=('Helvetica', 40, 'bold'),
+                  command=partial(exchange, 2))
+    b[2].grid(row=0, column=2, ipady=40)
+    b[3] = Button(root, 
text=f'{state[3]}' if state[3] != 0 else None, width=6, font=('Helvetica', 40, 'bold'), + command=partial(exchange, 3)) + b[3].grid(row=1, column=0, ipady=40) + b[4] = Button(root, text=f'{state[4]}' if state[4] != 0 else None, width=6, font=('Helvetica', 40, 'bold'), + command=partial(exchange, 4)) + b[4].grid(row=1, column=1, ipady=40) + b[5] = Button(root, text=f'{state[5]}' if state[5] != 0 else None, width=6, font=('Helvetica', 40, 'bold'), + command=partial(exchange, 5)) + b[5].grid(row=1, column=2, ipady=40) + b[6] = Button(root, text=f'{state[6]}' if state[6] != 0 else None, width=6, font=('Helvetica', 40, 'bold'), + command=partial(exchange, 6)) + b[6].grid(row=2, column=0, ipady=40) + b[7] = Button(root, text=f'{state[7]}' if state[7] != 0 else None, width=6, font=('Helvetica', 40, 'bold'), + command=partial(exchange, 7)) + b[7].grid(row=2, column=1, ipady=40) + b[8] = Button(root, text=f'{state[8]}' if state[8] != 0 else None, width=6, font=('Helvetica', 40, 'bold'), + command=partial(exchange, 8)) + b[8].grid(row=2, column=2, ipady=40) + + +def create_static_buttons(): + """Creates scramble and solve buttons""" + + scramble_btn = Button(root, text='Scramble', font=('Helvetica', 30, 'bold'), width=8, command=partial(init)) + scramble_btn.grid(row=3, column=0, ipady=10) + solve_btn = Button(root, text='Solve', font=('Helvetica', 30, 'bold'), width=8, command=partial(solve_steps)) + solve_btn.grid(row=3, column=2, ipady=10) + + +def init(): + """Calls necessary functions""" + + global state + global solution + state = [1, 2, 3, 4, 5, 6, 7, 8, 0] + scramble() + create_buttons() + create_static_buttons() + + +init() +root.mainloop() diff --git a/gui/genetic_algorithm_example.py b/gui/genetic_algorithm_example.py new file mode 100644 index 000000000..c987151c8 --- /dev/null +++ b/gui/genetic_algorithm_example.py @@ -0,0 +1,189 @@ +# A simple program that implements the solution to the phrase generation problem using +# genetic algorithms as given in the 
search.ipynb notebook.
+#
+# Type on the home screen to change the target phrase
+# Click on the slider to change genetic algorithm parameters
+# Click 'RUN' to run the algorithm with the specified variables
+# Displays best individual of the current generation
+# Displays a progress bar that indicates the amount of completion of the algorithm
+# Displays the first few individuals of the current generation

+import os.path
+import sys
+from tkinter import *
+from tkinter import ttk
+
+sys.path.append(os.path.join(os.path.dirname(__file__), '..'))
+
+import search
+
+LARGE_FONT = ('Verdana', 12)
+EXTRA_LARGE_FONT = ('Consolas', 36, 'bold')
+
+canvas_width = 800
+canvas_height = 600
+
+black = '#000000'
+white = '#ffffff'
+p_blue = '#042533'
+lp_blue = '#0c394c'
+
+# genetic algorithm variables
+# feel free to play around with these
+target = 'Genetic Algorithm'  # the phrase to be generated
+max_population = 100  # number of samples in each population
+mutation_rate = 0.1  # probability of mutation
+f_thres = len(target)  # fitness threshold
+ngen = 1200  # max number of generations to run the genetic algorithm
+
+generation = 0  # counter to keep track of generation number
+
+u_case = [chr(x) for x in range(65, 91)]  # list containing all uppercase characters
+l_case = [chr(x) for x in range(97, 123)]  # list containing all lowercase characters
+punctuations1 = [chr(x) for x in range(33, 48)]  # lists containing punctuation symbols
+punctuations2 = [chr(x) for x in range(58, 65)]
+punctuations3 = [chr(x) for x in range(91, 97)]
+numerals = [chr(x) for x in range(48, 58)]  # list containing numbers
+
+# extend the gene pool with the required lists and append the space character
+gene_pool = []
+gene_pool.extend(u_case)
+gene_pool.extend(l_case)
+gene_pool.append(' ')
+
+
+# callbacks to update global variables from the slider values
+def update_max_population(slider_value):
+    global max_population
+    max_population = slider_value
+
+
+def update_mutation_rate(slider_value):
+    global 
mutation_rate + mutation_rate = slider_value + + +def update_f_thres(slider_value): + global f_thres + f_thres = slider_value + + +def update_ngen(slider_value): + global ngen + ngen = slider_value + + +# fitness function +def fitness_fn(_list): + fitness = 0 + # create string from list of characters + phrase = ''.join(_list) + # add 1 to fitness value for every matching character + for i in range(len(phrase)): + if target[i] == phrase[i]: + fitness += 1 + return fitness + + +# function to bring a new frame on top +def raise_frame(frame, init=False, update_target=False, target_entry=None, f_thres_slider=None): + frame.tkraise() + global target + if update_target and target_entry is not None: + target = target_entry.get() + f_thres_slider.config(to=len(target)) + if init: + population = search.init_population(max_population, gene_pool, len(target)) + genetic_algorithm_stepwise(population) + + +# defining root and child frames +root = Tk() +f1 = Frame(root) +f2 = Frame(root) + +# pack frames on top of one another +for frame in (f1, f2): + frame.grid(row=0, column=0, sticky='news') + +# Home Screen (f1) widgets +target_entry = Entry(f1, font=('Consolas 46 bold'), exportselection=0, foreground=p_blue, justify=CENTER) +target_entry.insert(0, target) +target_entry.pack(expand=YES, side=TOP, fill=X, padx=50) +target_entry.focus_force() + +max_population_slider = Scale(f1, from_=3, to=1000, orient=HORIZONTAL, label='Max population', + command=lambda value: update_max_population(int(value))) +max_population_slider.set(max_population) +max_population_slider.pack(expand=YES, side=TOP, fill=X, padx=40) + +mutation_rate_slider = Scale(f1, from_=0, to=1, orient=HORIZONTAL, label='Mutation rate', resolution=0.0001, + command=lambda value: update_mutation_rate(float(value))) +mutation_rate_slider.set(mutation_rate) +mutation_rate_slider.pack(expand=YES, side=TOP, fill=X, padx=40) + +f_thres_slider = Scale(f1, from_=0, to=len(target), orient=HORIZONTAL, label='Fitness threshold', + 
command=lambda value: update_f_thres(int(value))) +f_thres_slider.set(f_thres) +f_thres_slider.pack(expand=YES, side=TOP, fill=X, padx=40) + +ngen_slider = Scale(f1, from_=1, to=5000, orient=HORIZONTAL, label='Max number of generations', + command=lambda value: update_ngen(int(value))) +ngen_slider.set(ngen) +ngen_slider.pack(expand=YES, side=TOP, fill=X, padx=40) + +button = ttk.Button(f1, text='RUN', + command=lambda: raise_frame(f2, init=True, update_target=True, target_entry=target_entry, + f_thres_slider=f_thres_slider)).pack(side=BOTTOM, pady=50) + +# f2 widgets +canvas = Canvas(f2, width=canvas_width, height=canvas_height) +canvas.pack(expand=YES, fill=BOTH, padx=20, pady=15) +button = ttk.Button(f2, text='EXIT', command=lambda: raise_frame(f1)).pack(side=BOTTOM, pady=15) + + +# function to run the genetic algorithm and update text on the canvas +def genetic_algorithm_stepwise(population): + root.title('Genetic Algorithm') + for generation in range(ngen): + # generating new population after selecting, recombining and mutating the existing population + population = [ + search.mutate(search.recombine(*search.select(2, population, fitness_fn)), gene_pool, mutation_rate) for i + in range(len(population))] + # genome with the highest fitness in the current generation + current_best = ''.join(max(population, key=fitness_fn)) + # collecting first few examples from the current population + members = [''.join(x) for x in population][:48] + + # clear the canvas + canvas.delete('all') + # displays current best on top of the screen + canvas.create_text(canvas_width / 2, 40, fill=p_blue, font='Consolas 46 bold', text=current_best) + + # displaying a part of the population on the screen + for i in range(len(members) // 3): + canvas.create_text((canvas_width * .175), (canvas_height * .25 + (25 * i)), fill=lp_blue, + font='Consolas 16', text=members[3 * i]) + canvas.create_text((canvas_width * .500), (canvas_height * .25 + (25 * i)), fill=lp_blue, + font='Consolas 16', 
text=members[3 * i + 1])
+            canvas.create_text((canvas_width * .825), (canvas_height * .25 + (25 * i)), fill=lp_blue,
+                               font='Consolas 16', text=members[3 * i + 2])
+
+        # displays current generation number
+        canvas.create_text((canvas_width * .5), (canvas_height * 0.95), fill=p_blue, font='Consolas 18 bold',
+                           text=f'Generation {generation}')
+
+        # displays blue bar that indicates current maximum fitness compared to maximum possible fitness
+        scaling_factor = fitness_fn(current_best) / len(target)
+        canvas.create_rectangle(canvas_width * 0.1, 90, canvas_width * 0.9, 100, outline=p_blue)
+        canvas.create_rectangle(canvas_width * 0.1, 90, canvas_width * 0.1 + scaling_factor * canvas_width * 0.8, 100,
+                                fill=lp_blue)
+        canvas.update()
+
+        # checks for completion
+        fittest_individual = search.fitness_threshold(fitness_fn, f_thres, population)
+        if fittest_individual:
+            break
+
+
+raise_frame(f1)
+root.mainloop()
diff --git a/gui/grid_mdp.py b/gui/grid_mdp.py
new file mode 100644
index 000000000..e60b49247
--- /dev/null
+++ b/gui/grid_mdp.py
@@ -0,0 +1,676 @@
+import os.path
+import sys
+import tkinter as tk
+import tkinter.messagebox
+from functools import partial
+from tkinter import ttk
+
+import matplotlib
+
+matplotlib.use('TkAgg')
+
+import matplotlib.animation as animation
+import numpy as np
+from matplotlib import pyplot as plt
+from matplotlib import style
+from matplotlib.backends.backend_tkagg import FigureCanvasTkAgg
+from matplotlib.figure import Figure
+from matplotlib.ticker import MaxNLocator
+
+sys.path.append(os.path.join(os.path.dirname(__file__), '..'))
+
+from mdp import *
+
+style.use('ggplot')
+
+fig = Figure(figsize=(20, 15))
+sub = fig.add_subplot(111)
+plt.rcParams['axes.grid'] = False
+
+WALL_VALUE = -99999.0
+TERM_VALUE = -999999.0
+
+black = '#000'
+white = '#fff'
+gray2 = '#222'
+gray9 = '#999'
+grayd = '#ddd'
+grayef = '#efefef'
+pblue = '#000040'
+green8 = '#008080'
+green4 = '#004040'
+
+cell_window_mantainer = None
+
+
+def extents(f):
+    """adjusts axis markers 
for heatmap""" + + delta = f[1] - f[0] + return [f[0] - delta / 2, f[-1] + delta / 2] + + +def display(gridmdp, _height, _width): + """displays matrix""" + + dialog = tk.Toplevel() + dialog.wm_title('Values') + + container = tk.Frame(dialog) + container.pack(side=tk.TOP, fill=tk.BOTH, expand=True) + + for i in range(max(1, _height)): + for j in range(max(1, _width)): + label = ttk.Label(container, text=f'{gridmdp[_height - i - 1][j]:.3f}', font=('Helvetica', 12)) + label.grid(row=i + 1, column=j + 1, padx=3, pady=3) + + dialog.mainloop() + + +def display_best_policy(_best_policy, _height, _width): + """displays best policy""" + dialog = tk.Toplevel() + dialog.wm_title('Best Policy') + + container = tk.Frame(dialog) + container.pack(side=tk.TOP, fill=tk.BOTH, expand=True) + + for i in range(max(1, _height)): + for j in range(max(1, _width)): + label = ttk.Label(container, text=_best_policy[i][j], font=('Helvetica', 12, 'bold')) + label.grid(row=i + 1, column=j + 1, padx=3, pady=3) + + dialog.mainloop() + + +def initialize_dialogbox(_width, _height, gridmdp, terminals, buttons): + """creates dialogbox for initialization""" + + dialog = tk.Toplevel() + dialog.wm_title('Initialize') + + container = tk.Frame(dialog) + container.pack(side=tk.TOP, fill=tk.BOTH, expand=True) + container.grid_rowconfigure(0, weight=1) + container.grid_columnconfigure(0, weight=1) + + wall = tk.IntVar() + wall.set(0) + term = tk.IntVar() + term.set(0) + reward = tk.DoubleVar() + reward.set(0.0) + + label = ttk.Label(container, text='Initialize', font=('Helvetica', 12), anchor=tk.N) + label.grid(row=0, column=0, columnspan=3, sticky='new', pady=15, padx=5) + label_reward = ttk.Label(container, text='Reward', font=('Helvetica', 10), anchor=tk.N) + label_reward.grid(row=1, column=0, columnspan=3, sticky='new', pady=1, padx=5) + entry_reward = ttk.Entry(container, font=('Helvetica', 10), justify=tk.CENTER, exportselection=0, + textvariable=reward) + entry_reward.grid(row=2, column=0, 
columnspan=3, sticky='new', pady=5, padx=50) + + rbtn_term = ttk.Radiobutton(container, text='Terminal', variable=term, value=TERM_VALUE) + rbtn_term.grid(row=3, column=0, columnspan=3, sticky='nsew', padx=160, pady=5) + rbtn_wall = ttk.Radiobutton(container, text='Wall', variable=wall, value=WALL_VALUE) + rbtn_wall.grid(row=4, column=0, columnspan=3, sticky='nsew', padx=172, pady=5) + + initialize_widget_disability_checks(_width, _height, gridmdp, terminals, label_reward, entry_reward, rbtn_wall, + rbtn_term) + + btn_apply = ttk.Button(container, text='Apply', + command=partial(initialize_update_table, _width, _height, gridmdp, terminals, buttons, + reward, term, wall, label_reward, entry_reward, rbtn_term, rbtn_wall)) + btn_apply.grid(row=5, column=0, sticky='nsew', pady=5, padx=5) + btn_reset = ttk.Button(container, text='Reset', + command=partial(initialize_reset_all, _width, _height, gridmdp, terminals, buttons, reward, + term, wall, label_reward, entry_reward, rbtn_wall, rbtn_term)) + btn_reset.grid(row=5, column=1, sticky='nsew', pady=5, padx=5) + btn_ok = ttk.Button(container, text='Ok', command=dialog.destroy) + btn_ok.grid(row=5, column=2, sticky='nsew', pady=5, padx=5) + + dialog.geometry('400x200') + dialog.mainloop() + + +def update_table(i, j, gridmdp, terminals, buttons, reward, term, wall, label_reward, entry_reward, rbtn_term, + rbtn_wall): + """functionality for 'apply' button""" + if wall.get() == WALL_VALUE: + buttons[i][j].configure(style='wall.TButton') + buttons[i][j].config(text='Wall') + label_reward.config(foreground='#999') + entry_reward.config(state=tk.DISABLED) + rbtn_term.state(['!focus', '!selected']) + rbtn_term.config(state=tk.DISABLED) + gridmdp[i][j] = WALL_VALUE + + elif wall.get() != WALL_VALUE: + if reward.get() != 0.0: + gridmdp[i][j] = reward.get() + buttons[i][j].configure(style='reward.TButton') + buttons[i][j].config(text=f'R = {reward.get()}') + + if term.get() == TERM_VALUE: + if (i, j) not in terminals: + 
terminals.append((i, j)) + rbtn_wall.state(['!focus', '!selected']) + rbtn_wall.config(state=tk.DISABLED) + + if gridmdp[i][j] < 0: + buttons[i][j].configure(style='-term.TButton') + + elif gridmdp[i][j] > 0: + buttons[i][j].configure(style='+term.TButton') + + elif gridmdp[i][j] == 0.0: + buttons[i][j].configure(style='=term.TButton') + + +def initialize_update_table(_width, _height, gridmdp, terminals, buttons, reward, term, wall, label_reward, + entry_reward, rbtn_term, rbtn_wall): + """runs update_table for all cells""" + + for i in range(max(1, _height)): + for j in range(max(1, _width)): + update_table(i, j, gridmdp, terminals, buttons, reward, term, wall, label_reward, entry_reward, rbtn_term, + rbtn_wall) + + +def reset_all(_height, i, j, gridmdp, terminals, buttons, reward, term, wall, label_reward, entry_reward, rbtn_wall, + rbtn_term): + """functionality for reset button""" + reward.set(0.0) + term.set(0) + wall.set(0) + gridmdp[i][j] = 0.0 + buttons[i][j].configure(style='TButton') + buttons[i][j].config(text=f'({_height - i - 1}, {j})') + + if (i, j) in terminals: + terminals.remove((i, j)) + + label_reward.config(foreground='#000') + entry_reward.config(state=tk.NORMAL) + rbtn_term.config(state=tk.NORMAL) + rbtn_wall.config(state=tk.NORMAL) + rbtn_wall.state(['!focus', '!selected']) + rbtn_term.state(['!focus', '!selected']) + + +def initialize_reset_all(_width, _height, gridmdp, terminals, buttons, reward, term, wall, label_reward, entry_reward, + rbtn_wall, rbtn_term): + """runs reset_all for all cells""" + + for i in range(max(1, _height)): + for j in range(max(1, _width)): + reset_all(_height, i, j, gridmdp, terminals, buttons, reward, term, wall, label_reward, entry_reward, + rbtn_wall, rbtn_term) + + +def external_reset(_width, _height, gridmdp, terminals, buttons): + """reset from edit menu""" + for i in range(max(1, _height)): + for j in range(max(1, _width)): + gridmdp[i][j] = 0.0 + buttons[i][j].configure(style='TButton') + 
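The `flatten_list` helper in grid_mdp.py flattens its per-cell boolean grids with `sum(_list, [])` before checking whether every cell is a wall or a terminal. A standalone sketch of that idiom follows; the grid and variable names here are illustrative, not part of the patch:

```python
import itertools

WALL_VALUE = -99999.0  # sentinel used by grid_mdp.py for wall cells


def flatten_list(_list):
    """Flatten a list of lists; sum(_list, []) concatenates the rows."""
    return sum(_list, [])


# a 2x3 grid in which every cell is a wall
grid = [[WALL_VALUE] * 3 for _ in range(2)]
bool_walls = [[cell == WALL_VALUE for cell in row] for row in grid]

# the all-walls check reduces to all() over the flattened grid
assert flatten_list(bool_walls) == [True] * 6
assert all(flatten_list(bool_walls))

# itertools.chain.from_iterable is the linear-time alternative to sum(..., [])
assert list(itertools.chain.from_iterable(bool_walls)) == flatten_list(bool_walls)
```

`sum(_list, [])` is quadratic in the number of rows because each addition copies the accumulator, so `itertools.chain.from_iterable` is the usual choice for large grids.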
buttons[i][j].config(text=f'({_height - i - 1}, {j})')
+
+
+def widget_disability_checks(i, j, gridmdp, terminals, label_reward, entry_reward, rbtn_wall, rbtn_term):
+    """checks for required state of widgets in dialog boxes"""
+
+    if gridmdp[i][j] == WALL_VALUE:
+        label_reward.config(foreground='#999')
+        entry_reward.config(state=tk.DISABLED)
+        rbtn_term.config(state=tk.DISABLED)
+        rbtn_wall.state(['!focus', 'selected'])
+        rbtn_term.state(['!focus', '!selected'])
+
+    if (i, j) in terminals:
+        rbtn_wall.config(state=tk.DISABLED)
+        rbtn_wall.state(['!focus', '!selected'])
+
+
+def flatten_list(_list):
+    """returns a flattened list"""
+    return sum(_list, [])
+
+
+def initialize_widget_disability_checks(_width, _height, gridmdp, terminals, label_reward, entry_reward, rbtn_wall,
+                                        rbtn_term):
+    """checks for required state of widgets when cells are initialized"""
+
+    bool_walls = [[False] * max(1, _width) for _ in range(max(1, _height))]
+    bool_terms = [[False] * max(1, _width) for _ in range(max(1, _height))]
+
+    for i in range(max(1, _height)):
+        for j in range(max(1, _width)):
+            if gridmdp[i][j] == WALL_VALUE:
+                bool_walls[i][j] = True
+
+            if (i, j) in terminals:
+                bool_terms[i][j] = True
+
+    if all(flatten_list(bool_walls)):
+        label_reward.config(foreground='#999')
+        entry_reward.config(state=tk.DISABLED)
+        rbtn_term.config(state=tk.DISABLED)
+        rbtn_wall.state(['!focus', 'selected'])
+        rbtn_term.state(['!focus', '!selected'])
+
+    if all(flatten_list(bool_terms)):
+        rbtn_wall.config(state=tk.DISABLED)
+        rbtn_wall.state(['!focus', '!selected'])
+        rbtn_term.state(['!focus', 'selected'])
+
+
+def dialogbox(i, j, gridmdp, terminals, buttons, _height):
+    """creates dialogbox for each cell"""
+    global cell_window_mantainer
+    if cell_window_mantainer is not None:
+        cell_window_mantainer.destroy()
+
+    dialog = tk.Toplevel()
+
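The `extents` helper defined near the top of grid_mdp.py computes the half-cell padding that `imshow` needs so each heatmap cell is centred on its integer coordinate. A self-contained check of that arithmetic, using plain lists instead of the numpy arrays the GUI passes:

```python
def extents(f):
    """Return [lo, hi] padded by half the tick spacing of f, as in grid_mdp.py."""
    delta = f[1] - f[0]
    return [f[0] - delta / 2, f[-1] + delta / 2]


# for ticks 0, 1, 2 the image extent runs from -0.5 to 2.5,
# so each unit-wide cell is centred on its integer coordinate
assert extents([0, 1, 2]) == [-0.5, 2.5]
# a spacing other than 1 is padded by half that spacing
assert extents([0, 2, 4]) == [-1.0, 5.0]
```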
cell_window_mantainer = dialog + dialog.wm_title(f'{_height - i - 1}, {j}') + + container = tk.Frame(dialog) + container.pack(side=tk.TOP, fill=tk.BOTH, expand=True) + container.grid_rowconfigure(0, weight=1) + container.grid_columnconfigure(0, weight=1) + + wall = tk.IntVar() + wall.set(gridmdp[i][j]) + term = tk.IntVar() + term.set(TERM_VALUE if (i, j) in terminals else 0.0) + reward = tk.DoubleVar() + reward.set(gridmdp[i][j] if gridmdp[i][j] != WALL_VALUE else 0.0) + + label = ttk.Label(container, text=f'Configure cell {_height - i - 1}, {j}', font=('Helvetica', 12), anchor=tk.N) + label.grid(row=0, column=0, columnspan=3, sticky='new', pady=15, padx=5) + label_reward = ttk.Label(container, text='Reward', font=('Helvetica', 10), anchor=tk.N) + label_reward.grid(row=1, column=0, columnspan=3, sticky='new', pady=1, padx=5) + entry_reward = ttk.Entry(container, font=('Helvetica', 10), justify=tk.CENTER, exportselection=0, + textvariable=reward) + entry_reward.grid(row=2, column=0, columnspan=3, sticky='new', pady=5, padx=50) + + rbtn_term = ttk.Radiobutton(container, text='Terminal', variable=term, value=TERM_VALUE) + rbtn_term.grid(row=3, column=0, columnspan=3, sticky='nsew', padx=160, pady=5) + rbtn_wall = ttk.Radiobutton(container, text='Wall', variable=wall, value=WALL_VALUE) + rbtn_wall.grid(row=4, column=0, columnspan=3, sticky='nsew', padx=172, pady=5) + + widget_disability_checks(i, j, gridmdp, terminals, label_reward, entry_reward, rbtn_wall, rbtn_term) + + btn_apply = ttk.Button(container, text='Apply', + command=partial(update_table, i, j, gridmdp, terminals, buttons, reward, term, wall, + label_reward, entry_reward, rbtn_term, rbtn_wall)) + btn_apply.grid(row=5, column=0, sticky='nsew', pady=5, padx=5) + btn_reset = ttk.Button(container, text='Reset', + command=partial(reset_all, _height, i, j, gridmdp, terminals, buttons, reward, term, wall, + label_reward, entry_reward, rbtn_wall, rbtn_term)) + btn_reset.grid(row=5, column=1, sticky='nsew', pady=5, 
padx=5) + btn_ok = ttk.Button(container, text='Ok', command=dialog.destroy) + btn_ok.grid(row=5, column=2, sticky='nsew', pady=5, padx=5) + + dialog.geometry('400x200') + dialog.mainloop() + + +class MDPapp(tk.Tk): + + def __init__(self, *args, **kwargs): + + tk.Tk.__init__(self, *args, **kwargs) + tk.Tk.wm_title(self, 'Grid MDP') + self.shared_data = { + 'height': tk.IntVar(), + 'width': tk.IntVar()} + self.shared_data['height'].set(1) + self.shared_data['width'].set(1) + self.container = tk.Frame(self) + self.container.pack(side='top', fill='both', expand=True) + self.container.grid_rowconfigure(0, weight=1) + self.container.grid_columnconfigure(0, weight=1) + + self.frames = {} + + self.menu_bar = tk.Menu(self.container) + self.file_menu = tk.Menu(self.menu_bar, tearoff=0) + self.file_menu.add_command(label='Exit', command=self.exit) + self.menu_bar.add_cascade(label='File', menu=self.file_menu) + + self.edit_menu = tk.Menu(self.menu_bar, tearoff=1) + self.edit_menu.add_command(label='Reset', command=self.master_reset) + self.edit_menu.add_command(label='Initialize', command=self.initialize) + self.edit_menu.add_separator() + self.edit_menu.add_command(label='View matrix', command=self.view_matrix) + self.edit_menu.add_command(label='View terminals', command=self.view_terminals) + self.menu_bar.add_cascade(label='Edit', menu=self.edit_menu) + self.menu_bar.entryconfig('Edit', state=tk.DISABLED) + + self.build_menu = tk.Menu(self.menu_bar, tearoff=1) + self.build_menu.add_command(label='Build and Run', command=self.build) + self.menu_bar.add_cascade(label='Build', menu=self.build_menu) + self.menu_bar.entryconfig('Build', state=tk.DISABLED) + tk.Tk.config(self, menu=self.menu_bar) + + for F in (HomePage, BuildMDP, SolveMDP): + frame = F(self.container, self) + self.frames[F] = frame + frame.grid(row=0, column=0, sticky='nsew') + + self.show_frame(HomePage) + + def placeholder_function(self): + """placeholder function""" + + print('Not supported yet!') + + def 
exit(self): + """function to exit""" + if tkinter.messagebox.askokcancel('Exit?', 'All changes will be lost'): + quit() + + def new(self): + """function to create new GridMDP""" + + self.master_reset() + build_page = self.get_page(BuildMDP) + build_page.gridmdp = None + build_page.terminals = None + build_page.buttons = None + self.show_frame(HomePage) + + def get_page(self, page_class): + """returns pages from stored frames""" + return self.frames[page_class] + + def view_matrix(self): + """prints current matrix to console""" + + build_page = self.get_page(BuildMDP) + _height = self.shared_data['height'].get() + _width = self.shared_data['width'].get() + print(build_page.gridmdp) + display(build_page.gridmdp, _height, _width) + + def view_terminals(self): + """prints current terminals to console""" + build_page = self.get_page(BuildMDP) + print('Terminals', build_page.terminals) + + def initialize(self): + """calls initialize from BuildMDP""" + + build_page = self.get_page(BuildMDP) + build_page.initialize() + + def master_reset(self): + """calls master_reset from BuildMDP""" + build_page = self.get_page(BuildMDP) + build_page.master_reset() + + def build(self): + """runs specified mdp solving algorithm""" + + frame = SolveMDP(self.container, self) + self.frames[SolveMDP] = frame + frame.grid(row=0, column=0, sticky='nsew') + self.show_frame(SolveMDP) + build_page = self.get_page(BuildMDP) + gridmdp = build_page.gridmdp + terminals = build_page.terminals + solve_page = self.get_page(SolveMDP) + _height = self.shared_data['height'].get() + _width = self.shared_data['width'].get() + solve_page.create_graph(gridmdp, terminals, _height, _width) + + def show_frame(self, controller, cb=False): + """shows specified frame and optionally runs create_buttons""" + if cb: + build_page = self.get_page(BuildMDP) + build_page.create_buttons() + frame = self.frames[controller] + frame.tkraise() + + +class HomePage(tk.Frame): + + def __init__(self, parent, controller): + 
"""HomePage constructor""" + + tk.Frame.__init__(self, parent) + self.controller = controller + frame1 = tk.Frame(self) + frame1.pack(side=tk.TOP) + frame3 = tk.Frame(self) + frame3.pack(side=tk.TOP) + frame4 = tk.Frame(self) + frame4.pack(side=tk.TOP) + frame2 = tk.Frame(self) + frame2.pack(side=tk.TOP) + + s = ttk.Style() + s.theme_use('clam') + s.configure('TButton', background=grayd, padding=0) + s.configure('wall.TButton', background=gray2, foreground=white) + s.configure('reward.TButton', background=gray9) + s.configure('+term.TButton', background=green8) + s.configure('-term.TButton', background=pblue, foreground=white) + s.configure('=term.TButton', background=green4) + + label = ttk.Label(frame1, text='GridMDP builder', font=('Helvetica', 18, 'bold'), background=grayef) + label.pack(pady=75, padx=50, side=tk.TOP) + + ec_btn = ttk.Button(frame3, text='Empty cells', width=20) + ec_btn.pack(pady=0, padx=0, side=tk.LEFT, ipady=10) + ec_btn.configure(style='TButton') + + w_btn = ttk.Button(frame3, text='Walls', width=20) + w_btn.pack(pady=0, padx=0, side=tk.LEFT, ipady=10) + w_btn.configure(style='wall.TButton') + + r_btn = ttk.Button(frame3, text='Rewards', width=20) + r_btn.pack(pady=0, padx=0, side=tk.LEFT, ipady=10) + r_btn.configure(style='reward.TButton') + + term_p = ttk.Button(frame3, text='Positive terminals', width=20) + term_p.pack(pady=0, padx=0, side=tk.LEFT, ipady=10) + term_p.configure(style='+term.TButton') + + term_z = ttk.Button(frame3, text='Neutral terminals', width=20) + term_z.pack(pady=0, padx=0, side=tk.LEFT, ipady=10) + term_z.configure(style='=term.TButton') + + term_n = ttk.Button(frame3, text='Negative terminals', width=20) + term_n.pack(pady=0, padx=0, side=tk.LEFT, ipady=10) + term_n.configure(style='-term.TButton') + + label = ttk.Label(frame4, text='Dimensions', font=('Verdana', 14), background=grayef) + label.pack(pady=15, padx=10, side=tk.TOP) + entry_h = tk.Entry(frame2, textvariable=self.controller.shared_data['height'], 
font=('Verdana', 10), width=3, + justify=tk.CENTER) + entry_h.pack(pady=10, padx=10, side=tk.LEFT) + label_x = ttk.Label(frame2, text='X', font=('Verdana', 10), background=grayef) + label_x.pack(pady=10, padx=4, side=tk.LEFT) + entry_w = tk.Entry(frame2, textvariable=self.controller.shared_data['width'], font=('Verdana', 10), width=3, + justify=tk.CENTER) + entry_w.pack(pady=10, padx=10, side=tk.LEFT) + button = ttk.Button(self, text='Build a GridMDP', command=lambda: controller.show_frame(BuildMDP, cb=True)) + button.pack(pady=10, padx=10, side=tk.TOP, ipadx=20, ipady=10) + button.configure(style='reward.TButton') + + +class BuildMDP(tk.Frame): + + def __init__(self, parent, controller): + + tk.Frame.__init__(self, parent) + self.grid_rowconfigure(0, weight=1) + self.grid_columnconfigure(0, weight=1) + self.frame = tk.Frame(self) + self.frame.pack() + self.controller = controller + + def create_buttons(self): + """creates interactive cells to build MDP""" + _height = self.controller.shared_data['height'].get() + _width = self.controller.shared_data['width'].get() + self.controller.menu_bar.entryconfig('Edit', state=tk.NORMAL) + self.controller.menu_bar.entryconfig('Build', state=tk.NORMAL) + self.gridmdp = [[0.0] * max(1, _width) for _ in range(max(1, _height))] + self.buttons = [[None] * max(1, _width) for _ in range(max(1, _height))] + self.terminals = [] + + s = ttk.Style() + s.theme_use('clam') + s.configure('TButton', background=grayd, padding=0) + s.configure('wall.TButton', background=gray2, foreground=white) + s.configure('reward.TButton', background=gray9) + s.configure('+term.TButton', background=green8) + s.configure('-term.TButton', background=pblue, foreground=white) + s.configure('=term.TButton', background=green4) + + for i in range(max(1, _height)): + for j in range(max(1, _width)): + self.buttons[i][j] = ttk.Button(self.frame, text=f'({_height - i - 1}, {j})', + width=int(196 / max(1, _width)), + command=partial(dialogbox, i, j, self.gridmdp, 
self.terminals, + self.buttons, _height)) + self.buttons[i][j].grid(row=i, column=j, ipady=int(336 / max(1, _height)) - 12) + + def initialize(self): + """runs initialize_dialogbox""" + + _height = self.controller.shared_data['height'].get() + _width = self.controller.shared_data['width'].get() + initialize_dialogbox(_width, _height, self.gridmdp, self.terminals, self.buttons) + + def master_reset(self): + """runs external reset""" + _height = self.controller.shared_data['height'].get() + _width = self.controller.shared_data['width'].get() + if tkinter.messagebox.askokcancel('Reset', 'Are you sure you want to reset all cells?'): + external_reset(_width, _height, self.gridmdp, self.terminals, self.buttons) + + +class SolveMDP(tk.Frame): + + def __init__(self, parent, controller): + + tk.Frame.__init__(self, parent) + self.grid_rowconfigure(0, weight=1) + self.grid_columnconfigure(0, weight=1) + self.frame = tk.Frame(self) + self.frame.pack() + self.controller = controller + self.terminated = False + self.iterations = 0 + self.epsilon = 0.001 + self.delta = 0 + + def process_data(self, terminals, _height, _width, gridmdp): + """preprocess variables""" + + flipped_terminals = [] + + for terminal in terminals: + flipped_terminals.append((terminal[1], _height - terminal[0] - 1)) + + grid_to_solve = [[0.0] * max(1, _width) for _ in range(max(1, _height))] + grid_to_show = [[0.0] * max(1, _width) for _ in range(max(1, _height))] + + for i in range(max(1, _height)): + for j in range(max(1, _width)): + if gridmdp[i][j] == WALL_VALUE: + grid_to_show[i][j] = 0.0 + grid_to_solve[i][j] = None + + else: + grid_to_show[i][j] = grid_to_solve[i][j] = gridmdp[i][j] + + return flipped_terminals, grid_to_solve, np.flipud(grid_to_show) + + def create_graph(self, gridmdp, terminals, _height, _width): + """creates canvas and initializes value_iteration_parameters""" + self._height = _height + self._width = _width + self.controller.menu_bar.entryconfig('Edit', state=tk.DISABLED) + 
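The `animate_graph` method in grid_mdp.py performs one value-iteration sweep per animation frame, applying the Bellman update U1[s] = R(s) + gamma * max_a sum(p * U[s']) and stopping once the largest change delta falls below epsilon * (1 - gamma) / gamma. The same update, stripped of the GUI and the `mdp` module, can be sketched on a made-up two-state chain (the states, rewards, and transition table below are illustrative assumptions, not part of the patch):

```python
# toy MDP: states 0 and 1; 'stay' keeps the state,
# 'go' moves 0 -> 1 with probability 0.9 (else stays); state 1 is absorbing
R = {0: 0.0, 1: 1.0}
T = {
    (0, 'stay'): [(1.0, 0)],
    (0, 'go'): [(0.9, 1), (0.1, 0)],
    (1, 'stay'): [(1.0, 1)],
    (1, 'go'): [(1.0, 1)],
}
gamma = 0.9
epsilon = 0.001

U1 = {0: 0.0, 1: 0.0}
while True:
    U = U1.copy()
    delta = 0  # reset each sweep, as value iteration requires
    for s in (0, 1):
        # Bellman update: immediate reward plus discounted best expected value
        U1[s] = R[s] + gamma * max(
            sum(p * U[s1] for p, s1 in T[(s, a)]) for a in ('stay', 'go'))
        delta = max(delta, abs(U1[s] - U[s]))
    if delta < epsilon * (1 - gamma) / gamma:
        break

# state 1 is absorbing with reward 1, so U(1) converges to 1 / (1 - gamma) = 10
assert abs(U1[1] - 10.0) < 0.1
assert U1[0] < U1[1]
```

Note that `delta` must be reset at the start of every sweep; otherwise the convergence test compares against the largest change ever seen rather than the most recent one.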
self.controller.menu_bar.entryconfig('Build', state=tk.DISABLED) + + self.terminals, self.gridmdp, self.grid_to_show = self.process_data(terminals, _height, _width, gridmdp) + self.sequential_decision_environment = GridMDP(self.gridmdp, terminals=self.terminals) + + self.initialize_value_iteration_parameters(self.sequential_decision_environment) + + self.canvas = FigureCanvasTkAgg(fig, self.frame) + self.canvas.get_tk_widget().pack(side=tk.TOP, fill=tk.BOTH, expand=True) + self.anim = animation.FuncAnimation(fig, self.animate_graph, interval=50) + self.canvas.draw() # FigureCanvasTkAgg has no show() in current Matplotlib; draw() renders the canvas + + def animate_graph(self, i): + """performs one sweep of value iteration and redraws the graph""" + + # cmaps to use: bone_r, Oranges, inferno, BrBG, copper + self.iterations += 1 + x_interval = max(2, len(self.gridmdp[0])) + y_interval = max(2, len(self.gridmdp)) + x = np.linspace(0, len(self.gridmdp[0]) - 1, x_interval) + y = np.linspace(0, len(self.gridmdp) - 1, y_interval) + + sub.clear() + sub.imshow(self.grid_to_show, cmap='BrBG', aspect='auto', interpolation='none', extent=extents(x) + extents(y), + origin='lower') + fig.tight_layout() + + U = self.U1.copy() + + for s in self.sequential_decision_environment.states: + self.U1[s] = self.R(s) + self.gamma * max( + [sum([p * U[s1] for (p, s1) in self.T(s, a)]) for a in self.sequential_decision_environment.actions(s)]) + self.delta = max(self.delta, abs(self.U1[s] - U[s])) + + self.grid_to_show = [[0.0] * max(1, self._width) for _ in range(max(1, self._height))] + for k, v in U.items(): + self.grid_to_show[k[1]][k[0]] = v + + # 'and' binds tighter than 'or', so parenthesize the convergence/iteration test + # to make the terminated flag guard both branches + if ((self.delta < self.epsilon * (1 - self.gamma) / self.gamma) or + self.iterations > 60) and not self.terminated: + self.terminated = True + display(self.grid_to_show, self._height, self._width) + + pi = best_policy(self.sequential_decision_environment, + value_iteration(self.sequential_decision_environment, .01)) + display_best_policy(self.sequential_decision_environment.to_arrows(pi), self._height, self._width) + + ax
= fig.gca() + ax.xaxis.set_major_locator(MaxNLocator(integer=True)) + ax.yaxis.set_major_locator(MaxNLocator(integer=True)) + + def initialize_value_iteration_parameters(self, mdp): + """initializes value_iteration parameters""" + self.U1 = {s: 0 for s in mdp.states} + self.R, self.T, self.gamma = mdp.R, mdp.T, mdp.gamma + + def value_iteration_metastep(self, mdp, iterations=20): + """runs value_iteration""" + + U_over_time = [] + U1 = {s: 0 for s in mdp.states} + R, T, gamma = mdp.R, mdp.T, mdp.gamma + + for _ in range(iterations): + U = U1.copy() + + for s in mdp.states: + U1[s] = R(s) + gamma * max([sum([p * U[s1] for (p, s1) in T(s, a)]) for a in mdp.actions(s)]) + + U_over_time.append(U) + return U_over_time + + +if __name__ == '__main__': + app = MDPapp() + app.geometry('1280x720') + app.mainloop() diff --git a/gui/romania_problem.py b/gui/romania_problem.py new file mode 100644 index 000000000..9ec94099d --- /dev/null +++ b/gui/romania_problem.py @@ -0,0 +1,672 @@ +from copy import deepcopy +from tkinter import * + +from search import * +from utils import PriorityQueue + +sys.path.append(os.path.join(os.path.dirname(__file__), '..')) + +root = None +city_coord = {} +romania_problem = None +algo = None +start = None +goal = None +counter = -1 +city_map = None +frontier = None +front = None +node = None +next_button = None +explored = None + + +def create_map(root): + """This function draws out the required map.""" + global city_map, start, goal + romania_locations = romania_map.locations + width = 750 + height = 670 + margin = 5 + city_map = Canvas(root, width=width, height=height) + city_map.pack() + + # Since lines have to be drawn between particular points, we need to list + # them separately + make_line( + city_map, + romania_locations['Arad'][0], + height - + romania_locations['Arad'][1], + romania_locations['Sibiu'][0], + height - + romania_locations['Sibiu'][1], + romania_map.get('Arad', 'Sibiu')) + make_line( + city_map, + 
romania_locations['Arad'][0], + height - + romania_locations['Arad'][1], + romania_locations['Zerind'][0], + height - + romania_locations['Zerind'][1], + romania_map.get('Arad', 'Zerind')) + make_line( + city_map, + romania_locations['Arad'][0], + height - + romania_locations['Arad'][1], + romania_locations['Timisoara'][0], + height - + romania_locations['Timisoara'][1], + romania_map.get('Arad', 'Timisoara')) + make_line( + city_map, + romania_locations['Oradea'][0], + height - + romania_locations['Oradea'][1], + romania_locations['Zerind'][0], + height - + romania_locations['Zerind'][1], + romania_map.get('Oradea', 'Zerind')) + make_line( + city_map, + romania_locations['Oradea'][0], + height - + romania_locations['Oradea'][1], + romania_locations['Sibiu'][0], + height - + romania_locations['Sibiu'][1], + romania_map.get('Oradea', 'Sibiu')) + make_line( + city_map, + romania_locations['Lugoj'][0], + height - + romania_locations['Lugoj'][1], + romania_locations['Timisoara'][0], + height - + romania_locations['Timisoara'][1], + romania_map.get('Lugoj', 'Timisoara')) + make_line( + city_map, + romania_locations['Lugoj'][0], + height - + romania_locations['Lugoj'][1], + romania_locations['Mehadia'][0], + height - + romania_locations['Mehadia'][1], + romania_map.get('Lugoj', 'Mehadia')) + make_line( + city_map, + romania_locations['Drobeta'][0], + height - + romania_locations['Drobeta'][1], + romania_locations['Mehadia'][0], + height - + romania_locations['Mehadia'][1], + romania_map.get('Drobeta', 'Mehadia')) + make_line( + city_map, + romania_locations['Drobeta'][0], + height - + romania_locations['Drobeta'][1], + romania_locations['Craiova'][0], + height - + romania_locations['Craiova'][1], + romania_map.get('Drobeta', 'Craiova')) + make_line( + city_map, + romania_locations['Pitesti'][0], + height - + romania_locations['Pitesti'][1], + romania_locations['Craiova'][0], + height - + romania_locations['Craiova'][1], + romania_map.get('Pitesti', 'Craiova')) + 
make_line( + city_map, + romania_locations['Rimnicu'][0], + height - + romania_locations['Rimnicu'][1], + romania_locations['Craiova'][0], + height - + romania_locations['Craiova'][1], + romania_map.get('Rimnicu', 'Craiova')) + make_line( + city_map, + romania_locations['Rimnicu'][0], + height - + romania_locations['Rimnicu'][1], + romania_locations['Sibiu'][0], + height - + romania_locations['Sibiu'][1], + romania_map.get('Rimnicu', 'Sibiu')) + make_line( + city_map, + romania_locations['Rimnicu'][0], + height - + romania_locations['Rimnicu'][1], + romania_locations['Pitesti'][0], + height - + romania_locations['Pitesti'][1], + romania_map.get('Rimnicu', 'Pitesti')) + make_line( + city_map, + romania_locations['Bucharest'][0], + height - + romania_locations['Bucharest'][1], + romania_locations['Pitesti'][0], + height - + romania_locations['Pitesti'][1], + romania_map.get('Bucharest', 'Pitesti')) + make_line( + city_map, + romania_locations['Fagaras'][0], + height - + romania_locations['Fagaras'][1], + romania_locations['Sibiu'][0], + height - + romania_locations['Sibiu'][1], + romania_map.get('Fagaras', 'Sibiu')) + make_line( + city_map, + romania_locations['Fagaras'][0], + height - + romania_locations['Fagaras'][1], + romania_locations['Bucharest'][0], + height - + romania_locations['Bucharest'][1], + romania_map.get('Fagaras', 'Bucharest')) + make_line( + city_map, + romania_locations['Giurgiu'][0], + height - + romania_locations['Giurgiu'][1], + romania_locations['Bucharest'][0], + height - + romania_locations['Bucharest'][1], + romania_map.get('Giurgiu', 'Bucharest')) + make_line( + city_map, + romania_locations['Urziceni'][0], + height - + romania_locations['Urziceni'][1], + romania_locations['Bucharest'][0], + height - + romania_locations['Bucharest'][1], + romania_map.get('Urziceni', 'Bucharest')) + make_line( + city_map, + romania_locations['Urziceni'][0], + height - + romania_locations['Urziceni'][1], + romania_locations['Hirsova'][0], + height - + 
romania_locations['Hirsova'][1], + romania_map.get('Urziceni', 'Hirsova')) + make_line( + city_map, + romania_locations['Eforie'][0], + height - + romania_locations['Eforie'][1], + romania_locations['Hirsova'][0], + height - + romania_locations['Hirsova'][1], + romania_map.get('Eforie', 'Hirsova')) + make_line( + city_map, + romania_locations['Urziceni'][0], + height - + romania_locations['Urziceni'][1], + romania_locations['Vaslui'][0], + height - + romania_locations['Vaslui'][1], + romania_map.get('Urziceni', 'Vaslui')) + make_line( + city_map, + romania_locations['Iasi'][0], + height - + romania_locations['Iasi'][1], + romania_locations['Vaslui'][0], + height - + romania_locations['Vaslui'][1], + romania_map.get('Iasi', 'Vaslui')) + make_line( + city_map, + romania_locations['Iasi'][0], + height - + romania_locations['Iasi'][1], + romania_locations['Neamt'][0], + height - + romania_locations['Neamt'][1], + romania_map.get('Iasi', 'Neamt')) + + for city in romania_locations.keys(): + make_rectangle( + city_map, + romania_locations[city][0], + height - + romania_locations[city][1], + margin, + city) + + make_legend(city_map) + + +def make_line(map, x0, y0, x1, y1, distance): + """This function draws out the lines joining various points.""" + map.create_line(x0, y0, x1, y1) + map.create_text((x0 + x1) / 2, (y0 + y1) / 2, text=distance) + + +def make_rectangle(map, x0, y0, margin, city_name): + """This function draws out rectangles for various points.""" + global city_coord + rect = map.create_rectangle( + x0 - margin, + y0 - margin, + x0 + margin, + y0 + margin, + fill="white") + if "Bucharest" in city_name or "Pitesti" in city_name or "Lugoj" in city_name \ + or "Mehadia" in city_name or "Drobeta" in city_name: + map.create_text( + x0 - 2 * margin, + y0 - 2 * margin, + text=city_name, + anchor=E) + else: + map.create_text( + x0 - 2 * margin, + y0 - 2 * margin, + text=city_name, + anchor=SE) + city_coord.update({city_name: rect}) + + +def make_legend(map): + rect1 
= map.create_rectangle(600, 100, 610, 110, fill="white") + text1 = map.create_text(615, 105, anchor=W, text="Un-explored") + + rect2 = map.create_rectangle(600, 115, 610, 125, fill="orange") + text2 = map.create_text(615, 120, anchor=W, text="Frontier") + + rect3 = map.create_rectangle(600, 130, 610, 140, fill="red") + text3 = map.create_text(615, 135, anchor=W, text="Currently Exploring") + + rect4 = map.create_rectangle(600, 145, 610, 155, fill="grey") + text4 = map.create_text(615, 150, anchor=W, text="Explored") + + rect5 = map.create_rectangle(600, 160, 610, 170, fill="dark green") + text5 = map.create_text(615, 165, anchor=W, text="Final Solution") + + +def tree_search(problem): + """ + Search through the successors of a problem to find a goal. + The global frontier should be an empty queue. + Don't worry about repeated paths to a state. [Figure 3.7] + This function has been changed to make it suitable for the Tkinter GUI. + """ + global counter, frontier, node + + if counter == -1: + frontier.append(Node(problem.initial)) + + display_frontier(frontier) + if counter % 3 == 0 and counter >= 0: + node = frontier.pop() + + display_current(node) + if counter % 3 == 1 and counter >= 0: + if problem.goal_test(node.state): + return node + frontier.extend(node.expand(problem)) + + display_frontier(frontier) + if counter % 3 == 2 and counter >= 0: + display_explored(node) + return None + + +def graph_search(problem): + """ + Search through the successors of a problem to find a goal. + The global frontier should be an empty queue. + If two paths reach a state, only use the first one. [Figure 3.7] + This function has been changed to make it suitable for the Tkinter GUI.
+ """ + global counter, frontier, node, explored + if counter == -1: + frontier.append(Node(problem.initial)) + explored = set() + + display_frontier(frontier) + if counter % 3 == 0 and counter >= 0: + node = frontier.pop() + + display_current(node) + if counter % 3 == 1 and counter >= 0: + if problem.goal_test(node.state): + return node + explored.add(node.state) + frontier.extend(child for child in node.expand(problem) + if child.state not in explored and + child not in frontier) + + display_frontier(frontier) + if counter % 3 == 2 and counter >= 0: + display_explored(node) + return None + + +def display_frontier(queue): + """This function marks the frontier nodes (orange) on the map.""" + global city_map, city_coord + qu = deepcopy(queue) + while qu: + node = qu.pop() + for city in city_coord.keys(): + if node.state == city: + city_map.itemconfig(city_coord[city], fill="orange") + + +def display_current(node): + """This function marks the currently exploring node (red) on the map.""" + global city_map, city_coord + city = node.state + city_map.itemconfig(city_coord[city], fill="red") + + +def display_explored(node): + """This function marks the already explored node (gray) on the map.""" + global city_map, city_coord + city = node.state + city_map.itemconfig(city_coord[city], fill="gray") + + +def display_final(cities): + """This function marks the final solution nodes (green) on the map.""" + global city_map, city_coord + for city in cities: + city_map.itemconfig(city_coord[city], fill="green") + + +def breadth_first_tree_search(problem): + """Search the shallowest nodes in the search tree first.""" + global frontier, counter, node + if counter == -1: + frontier = deque() + + if counter == -1: + frontier.append(Node(problem.initial)) + + display_frontier(frontier) + if counter % 3 == 0 and counter >= 0: + node = frontier.popleft() + + display_current(node) + if counter % 3 == 1 and counter >= 0: + if problem.goal_test(node.state): + return node + 
frontier.extend(node.expand(problem)) + + display_frontier(frontier) + if counter % 3 == 2 and counter >= 0: + display_explored(node) + return None + + +def depth_first_tree_search(problem): + """Search the deepest nodes in the search tree first.""" + # This search algorithm might not work in case of repeated paths. + global frontier, counter, node + if counter == -1: + frontier = [] # stack + + if counter == -1: + frontier.append(Node(problem.initial)) + + display_frontier(frontier) + if counter % 3 == 0 and counter >= 0: + node = frontier.pop() + + display_current(node) + if counter % 3 == 1 and counter >= 0: + if problem.goal_test(node.state): + return node + frontier.extend(node.expand(problem)) + + display_frontier(frontier) + if counter % 3 == 2 and counter >= 0: + display_explored(node) + return None + + +def breadth_first_graph_search(problem): + """[Figure 3.11]""" + global frontier, node, explored, counter + if counter == -1: + node = Node(problem.initial) + display_current(node) + if problem.goal_test(node.state): + return node + + frontier = deque([node]) # FIFO queue + + display_frontier(frontier) + explored = set() + if counter % 3 == 0 and counter >= 0: + node = frontier.popleft() + display_current(node) + explored.add(node.state) + if counter % 3 == 1 and counter >= 0: + for child in node.expand(problem): + if child.state not in explored and child not in frontier: + if problem.goal_test(child.state): + return child + frontier.append(child) + display_frontier(frontier) + if counter % 3 == 2 and counter >= 0: + display_explored(node) + return None + + +def depth_first_graph_search(problem): + """Search the deepest nodes in the search tree first.""" + global counter, frontier, node, explored + if counter == -1: + frontier = [] # stack + if counter == -1: + frontier.append(Node(problem.initial)) + explored = set() + + display_frontier(frontier) + if counter % 3 == 0 and counter >= 0: + node = frontier.pop() + + display_current(node) + if counter % 3 == 
1 and counter >= 0: + if problem.goal_test(node.state): + return node + explored.add(node.state) + frontier.extend(child for child in node.expand(problem) + if child.state not in explored and + child not in frontier) + + display_frontier(frontier) + if counter % 3 == 2 and counter >= 0: + display_explored(node) + return None + + +def best_first_graph_search(problem, f): + """Search the nodes with the lowest f scores first. + You specify the function f(node) that you want to minimize; for example, + if f is a heuristic estimate to the goal, then we have greedy best + first search; if f is node.depth then we have breadth-first search. + There is a subtlety: the line "f = memoize(f, 'f')" means that the f + values will be cached on the nodes as they are computed. So after doing + a best first search you can examine the f values of the path returned.""" + global frontier, node, explored, counter + + if counter == -1: + f = memoize(f, 'f') + node = Node(problem.initial) + display_current(node) + if problem.goal_test(node.state): + return node + frontier = PriorityQueue('min', f) + frontier.append(node) + display_frontier(frontier) + explored = set() + if counter % 3 == 0 and counter >= 0: + node = frontier.pop() + display_current(node) + if problem.goal_test(node.state): + return node + explored.add(node.state) + if counter % 3 == 1 and counter >= 0: + for child in node.expand(problem): + if child.state not in explored and child not in frontier: + frontier.append(child) + elif child in frontier: + if f(child) < frontier[child]: + del frontier[child] + frontier.append(child) + display_frontier(frontier) + if counter % 3 == 2 and counter >= 0: + display_explored(node) + return None + + +def uniform_cost_search(problem): + """[Figure 3.14]""" + return best_first_graph_search(problem, lambda node: node.path_cost) + + +def astar_search(problem, h=None): + """A* search is best-first graph search with f(n) = g(n)+h(n). 
+ You need to specify the h function when you call astar_search, or + else in your Problem subclass.""" + h = memoize(h or problem.h, 'h') + return best_first_graph_search(problem, lambda n: n.path_cost + h(n)) + + +# TODO: +# Remove redundant code. +# Make the interchangeability work between various algorithms at each step. +def on_click(): + """ + This function defines the action of the 'Next' button. + """ + global algo, counter, next_button, romania_problem, start, goal + romania_problem = GraphProblem(start.get(), goal.get(), romania_map) + if "Breadth-First Tree Search" == algo.get(): + node = breadth_first_tree_search(romania_problem) + if node is not None: + final_path = breadth_first_tree_search(romania_problem).solution() + final_path.append(start.get()) + display_final(final_path) + next_button.config(state="disabled") + counter += 1 + elif "Depth-First Tree Search" == algo.get(): + node = depth_first_tree_search(romania_problem) + if node is not None: + final_path = depth_first_tree_search(romania_problem).solution() + final_path.append(start.get()) + display_final(final_path) + next_button.config(state="disabled") + counter += 1 + elif "Breadth-First Graph Search" == algo.get(): + node = breadth_first_graph_search(romania_problem) + if node is not None: + final_path = breadth_first_graph_search(romania_problem).solution() + final_path.append(start.get()) + display_final(final_path) + next_button.config(state="disabled") + counter += 1 + elif "Depth-First Graph Search" == algo.get(): + node = depth_first_graph_search(romania_problem) + if node is not None: + final_path = depth_first_graph_search(romania_problem).solution() + final_path.append(start.get()) + display_final(final_path) + next_button.config(state="disabled") + counter += 1 + elif "Uniform Cost Search" == algo.get(): + node = uniform_cost_search(romania_problem) + if node is not None: + final_path = uniform_cost_search(romania_problem).solution() + final_path.append(start.get()) + 
display_final(final_path) + next_button.config(state="disabled") + counter += 1 + elif "A* - Search" == algo.get(): + node = astar_search(romania_problem) + if node is not None: + final_path = astar_search(romania_problem).solution() + final_path.append(start.get()) + display_final(final_path) + next_button.config(state="disabled") + counter += 1 + + +def reset_map(): + global counter, city_coord, city_map, next_button + counter = -1 + for city in city_coord.keys(): + city_map.itemconfig(city_coord[city], fill="white") + next_button.config(state="normal") + + +# TODO: Add more search algorithms in the OptionMenu +if __name__ == "__main__": + global algo, start, goal, next_button + root = Tk() + root.title("Road Map of Romania") + root.geometry("950x1150") + algo = StringVar(root) + start = StringVar(root) + goal = StringVar(root) + algo.set("Breadth-First Tree Search") + start.set('Arad') + goal.set('Bucharest') + cities = sorted(romania_map.locations.keys()) + algorithm_menu = OptionMenu( + root, + algo, "Breadth-First Tree Search", "Depth-First Tree Search", + "Breadth-First Graph Search", "Depth-First Graph Search", + "Uniform Cost Search", "A* - Search") + Label(root, text="\n Search Algorithm").pack() + algorithm_menu.pack() + Label(root, text="\n Start City").pack() + start_menu = OptionMenu(root, start, *cities) + start_menu.pack() + Label(root, text="\n Goal City").pack() + goal_menu = OptionMenu(root, goal, *cities) + goal_menu.pack() + frame1 = Frame(root) + next_button = Button( + frame1, + width=6, + height=2, + text="Next", + command=on_click, + padx=2, + pady=2, + relief=GROOVE) + next_button.pack(side=RIGHT) + reset_button = Button( + frame1, + width=6, + height=2, + text="Reset", + command=reset_map, + padx=2, + pady=2, + relief=GROOVE) + reset_button.pack(side=RIGHT) + frame1.pack(side=BOTTOM) + create_map(root) + root.mainloop() diff --git a/gui/tic-tac-toe.py b/gui/tic-tac-toe.py new file mode 100644 index 000000000..66d9d6e75 --- /dev/null +++ 
b/gui/tic-tac-toe.py @@ -0,0 +1,232 @@ +import os.path +from tkinter import * + +from games import minmax_decision, alpha_beta_player, random_player, TicTacToe +# "gen_state" can be used to generate a game state to apply the algorithm +from tests.test_games import gen_state + +sys.path.append(os.path.join(os.path.dirname(__file__), '..')) + +ttt = TicTacToe() +root = None +buttons = [] +frames = [] +x_pos = [] +o_pos = [] +count = 0 +sym = "" +result = None +choices = None + + +def create_frames(root): + """ + This function creates the necessary structure of the game. + """ + frame1 = Frame(root) + frame2 = Frame(root) + frame3 = Frame(root) + frame4 = Frame(root) + create_buttons(frame1) + create_buttons(frame2) + create_buttons(frame3) + buttonExit = Button( + frame4, height=1, width=2, + text="Exit", + command=lambda: exit_game(root)) + buttonExit.pack(side=LEFT) + frame4.pack(side=BOTTOM) + frame3.pack(side=BOTTOM) + frame2.pack(side=BOTTOM) + frame1.pack(side=BOTTOM) + frames.append(frame1) + frames.append(frame2) + frames.append(frame3) + for x in frames: + buttons_in_frame = [] + for y in x.winfo_children(): + buttons_in_frame.append(y) + buttons.append(buttons_in_frame) + buttonReset = Button(frame4, height=1, width=2, + text="Reset", command=lambda: reset_game()) + buttonReset.pack(side=LEFT) + + +def create_buttons(frame): + """ + This function creates the buttons to be pressed/clicked during the game. + """ + button0 = Button(frame, height=2, width=2, text=" ", + command=lambda: on_click(button0)) + button0.pack(side=LEFT) + button1 = Button(frame, height=2, width=2, text=" ", + command=lambda: on_click(button1)) + button1.pack(side=LEFT) + button2 = Button(frame, height=2, width=2, text=" ", + command=lambda: on_click(button2)) + button2.pack(side=LEFT) + + +# TODO: Add a choice option for the user. +def on_click(button): + """ + This function determines the action of any button. 
+ """ + global ttt, choices, count, sym, result, x_pos, o_pos + + if count % 2 == 0: + sym = "X" + else: + sym = "O" + count += 1 + + button.config( + text=sym, + state='disabled', + disabledforeground="red") # For cross + + x, y = get_coordinates(button) + x += 1 + y += 1 + x_pos.append((x, y)) + state = gen_state(to_move='O', x_positions=x_pos, + o_positions=o_pos) + try: + choice = choices.get() + if "Random" in choice: + a, b = random_player(ttt, state) + elif "Pro" in choice: + a, b = minmax_decision(state, ttt) + else: + a, b = alpha_beta_player(ttt, state) + except (ValueError, IndexError, TypeError) as e: + disable_game() + result.set("It's a draw :|") + return + if 1 <= a <= 3 and 1 <= b <= 3: + o_pos.append((a, b)) + button_to_change = get_button(a - 1, b - 1) + if count % 2 == 0: # Used again, will become handy when user is given the choice of turn. + sym = "X" + else: + sym = "O" + count += 1 + + if check_victory(button): + result.set("You win :)") + disable_game() + else: + button_to_change.config(text=sym, state='disabled', + disabledforeground="black") + if check_victory(button_to_change): + result.set("You lose :(") + disable_game() + + +# TODO: Replace "check_victory" by "k_in_row" function. +def check_victory(button): + """ + This function checks various winning conditions of the game. 
+ """ + # check if previous move caused a win on vertical line + global buttons + x, y = get_coordinates(button) + tt = button['text'] + if buttons[0][y]['text'] == buttons[1][y]['text'] == buttons[2][y]['text'] != " ": + buttons[0][y].config(text="|" + tt + "|") + buttons[1][y].config(text="|" + tt + "|") + buttons[2][y].config(text="|" + tt + "|") + return True + + # check if previous move caused a win on horizontal line + if buttons[x][0]['text'] == buttons[x][1]['text'] == buttons[x][2]['text'] != " ": + buttons[x][0].config(text="--" + tt + "--") + buttons[x][1].config(text="--" + tt + "--") + buttons[x][2].config(text="--" + tt + "--") + return True + + # check if previous move was on the main diagonal and caused a win + if x == y and buttons[0][0]['text'] == buttons[1][1]['text'] == buttons[2][2]['text'] != " ": + buttons[0][0].config(text="\\" + tt + "\\") + buttons[1][1].config(text="\\" + tt + "\\") + buttons[2][2].config(text="\\" + tt + "\\") + return True + + # check if previous move was on the secondary diagonal and caused a win + if x + y == 2 and buttons[0][2]['text'] == buttons[1][1]['text'] == buttons[2][0]['text'] != " ": + buttons[0][2].config(text="/" + tt + "/") + buttons[1][1].config(text="/" + tt + "/") + buttons[2][0].config(text="/" + tt + "/") + return True + + return False + + +def get_coordinates(button): + """ + This function returns the coordinates of the button clicked. + """ + global buttons + for x in range(len(buttons)): + for y in range(len(buttons[x])): + if buttons[x][y] == button: + return x, y + + +def get_button(x, y): + """ + This function returns the button memory location corresponding to a coordinate. + """ + global buttons + return buttons[x][y] + + +def reset_game(): + """ + This function will reset all the tiles to the initial null value. 
+ """ + global x_pos, o_pos, frames, count + + count = 0 + x_pos = [] + o_pos = [] + result.set("Your Turn!") + for x in frames: + for y in x.winfo_children(): + y.config(text=" ", state='normal') + + +def disable_game(): + """ + This function deactivates the game after a win, loss or draw. + """ + global frames + for x in frames: + for y in x.winfo_children(): + y.config(state='disabled') + + +def exit_game(root): + """ + This function will exit the game by killing the root. + """ + root.destroy() + + +if __name__ == "__main__": + global result, choices + + root = Tk() + root.title("TicTacToe") + root.geometry("150x200") # Improved the window geometry + root.resizable(0, 0) # To remove the maximize window option + result = StringVar() + result.set("Your Turn!") + w = Label(root, textvariable=result) + w.pack(side=BOTTOM) + create_frames(root) + choices = StringVar(root) + choices.set("Vs Pro") + menu = OptionMenu(root, choices, "Vs Random", "Vs Pro", "Vs Legend") + menu.pack() + root.mainloop() diff --git a/gui/tsp.py b/gui/tsp.py new file mode 100644 index 000000000..590fff354 --- /dev/null +++ b/gui/tsp.py @@ -0,0 +1,342 @@ +from tkinter import * +from tkinter import messagebox + +import utils +from search import * + +sys.path.append(os.path.join(os.path.dirname(__file__), '..')) + +distances = {} + + +class TSProblem(Problem): + """subclass of Problem to define various functions""" + + def two_opt(self, state): + """Neighbour generating function for Traveling Salesman Problem""" + neighbour_state = state[:] + left = random.randint(0, len(neighbour_state) - 1) + right = random.randint(0, len(neighbour_state) - 1) + if left > right: + left, right = right, left + neighbour_state[left: right + 1] = reversed(neighbour_state[left: right + 1]) + return neighbour_state + + def actions(self, state): + """action that can be executed in given state""" + return [self.two_opt] + + def result(self, state, action): + """result after applying the given action on the given 
state""" + return action(state) + + def path_cost(self, c, state1, action, state2): + """total distance for the Traveling Salesman to be covered if in state2""" + cost = 0 + for i in range(len(state2) - 1): + cost += distances[state2[i]][state2[i + 1]] + cost += distances[state2[0]][state2[-1]] + return cost + + def value(self, state): + """negated path cost of the given state, since local search maximizes value""" + return -1 * self.path_cost(None, None, None, state) + + +class TSPGui: + """GUI for the Traveling Salesman Problem, in which one can select cities, choose an + algorithm (simulated annealing, genetic algorithm or hill climbing) and tune its speed + and parameters. Distances between cities are the Euclidean distances between their + map coordinates. + """ + + def __init__(self, root, all_cities): + self.root = root + self.vars = [] + self.frame_locations = {} + self.calculate_canvas_size() + self.button_text = StringVar() + self.button_text.set("Start") + self.algo_var = StringVar() + self.all_cities = all_cities + self.frame_select_cities = Frame(self.root) + self.frame_select_cities.grid(row=1) + self.frame_canvas = Frame(self.root) + self.frame_canvas.grid(row=2) + Label(self.root, text="Map of Romania", font="Times 13 bold").grid(row=0, columnspan=10) + + def create_checkboxes(self, side=LEFT, anchor=W): + """Create checkboxes for selecting the cities included in the Traveling Salesman Problem""" + + row_number = 0 + column_number = 0 + + for city in self.all_cities: + var = IntVar() + var.set(1) + Checkbutton(self.frame_select_cities, text=city, variable=var).grid( + row=row_number, column=column_number, sticky=W) + + self.vars.append(var) + column_number += 1 + if column_number == 10: + column_number = 0 + row_number += 1 + + def create_buttons(self): + """Create the Start and Quit buttons""" + + Button(self.frame_select_cities, textvariable=self.button_text, + command=self.run_traveling_salesman).grid(row=5, column=4, sticky=E + W) + Button(self.frame_select_cities, text='Quit', command=self.on_closing).grid( + row=5, column=5, sticky=E + W) + + def
create_dropdown_menu(self): + """Create dropdown menu for algorithm selection""" + + choices = {'Simulated Annealing', 'Genetic Algorithm', 'Hill Climbing'} + self.algo_var.set('Simulated Annealing') + dropdown_menu = OptionMenu(self.frame_select_cities, self.algo_var, *choices) + dropdown_menu.grid(row=4, column=4, columnspan=2, sticky=E + W) + dropdown_menu.config(width=19) + + def run_traveling_salesman(self): + """Choose selected cities""" + + cities = [] + for i in range(len(self.vars)): + if self.vars[i].get() == 1: + cities.append(self.all_cities[i]) + + tsp_problem = TSProblem(cities) + self.button_text.set("Reset") + self.create_canvas(tsp_problem) + + def calculate_canvas_size(self): + """Width and height for canvas""" + + minx, maxx = sys.maxsize, -1 * sys.maxsize + miny, maxy = sys.maxsize, -1 * sys.maxsize + + for value in romania_map.locations.values(): + minx = min(minx, value[0]) + maxx = max(maxx, value[0]) + miny = min(miny, value[1]) + maxy = max(maxy, value[1]) + + # New locations squeezed to fit inside the map of romania + for name, coordinates in romania_map.locations.items(): + self.frame_locations[name] = (coordinates[0] / 1.2 - minx + + 150, coordinates[1] / 1.2 - miny + 165) + + canvas_width = maxx - minx + 200 + canvas_height = maxy - miny + 200 + + self.canvas_width = canvas_width + self.canvas_height = canvas_height + + def create_canvas(self, problem): + """creating map with cities""" + + map_canvas = Canvas(self.frame_canvas, width=self.canvas_width, height=self.canvas_height) + map_canvas.grid(row=3, columnspan=10) + current = Node(problem.initial) + map_canvas.delete("all") + self.romania_image = PhotoImage(file="../images/romania_map.png") + map_canvas.create_image(self.canvas_width / 2, self.canvas_height / 2, + image=self.romania_image) + cities = current.state + for city in cities: + x = self.frame_locations[city][0] + y = self.frame_locations[city][1] + map_canvas.create_oval(x - 3, y - 3, x + 3, y + 3, + fill="red", 
outline="red") + map_canvas.create_text(x - 15, y - 10, text=city) + + self.cost = StringVar() + Label(self.frame_canvas, textvariable=self.cost, relief="sunken").grid( + row=2, columnspan=10) + + self.speed = IntVar() + speed_scale = Scale(self.frame_canvas, from_=500, to=1, orient=HORIZONTAL, + variable=self.speed, label="Speed ----> ", showvalue=0, font="Times 11", + relief="sunken", cursor="gumby") + speed_scale.grid(row=1, columnspan=5, sticky=N + S + E + W) + + if self.algo_var.get() == 'Simulated Annealing': + self.temperature = IntVar() + temperature_scale = Scale(self.frame_canvas, from_=100, to=0, orient=HORIZONTAL, + length=200, variable=self.temperature, label="Temperature ---->", + font="Times 11", relief="sunken", showvalue=0, cursor="gumby") + temperature_scale.grid(row=1, column=5, columnspan=5, sticky=N + S + E + W) + self.simulated_annealing_with_tunable_T(problem, map_canvas) + elif self.algo_var.get() == 'Genetic Algorithm': + self.mutation_rate = DoubleVar() + self.mutation_rate.set(0.05) + mutation_rate_scale = Scale(self.frame_canvas, from_=0, to=1, orient=HORIZONTAL, + length=200, variable=self.mutation_rate, label='Mutation Rate ---->', + font='Times 11', relief='sunken', showvalue=0, cursor='gumby', resolution=0.001) + mutation_rate_scale.grid(row=1, column=5, columnspan=5, sticky='nsew') + self.genetic_algorithm(problem, map_canvas) + elif self.algo_var.get() == 'Hill Climbing': + self.no_of_neighbors = IntVar() + self.no_of_neighbors.set(100) + no_of_neighbors_scale = Scale(self.frame_canvas, from_=10, to=1000, orient=HORIZONTAL, + length=200, variable=self.no_of_neighbors, label='Number of neighbors ---->', + font='Times 11', relief='sunken', showvalue=0, cursor='gumby') + no_of_neighbors_scale.grid(row=1, column=5, columnspan=5, sticky='nsew') + self.hill_climbing(problem, map_canvas) + + def exp_schedule(k=100, lam=0.03, limit=1000): + """One possible schedule function for simulated annealing""" + + return lambda t: (k * np.exp(-lam * 
t) if t < limit else 0) + + def simulated_annealing_with_tunable_T(self, problem, map_canvas, schedule=exp_schedule()): + """Simulated annealing where temperature is taken as user input""" + + current = Node(problem.initial) + + while True: + T = schedule(self.temperature.get()) + if T == 0: + return current.state + neighbors = current.expand(problem) + if not neighbors: + return current.state + next = random.choice(neighbors) + delta_e = problem.value(next.state) - problem.value(current.state) + if delta_e > 0 or probability(np.exp(delta_e / T)): + map_canvas.delete("poly") + + current = next + self.cost.set("Cost = " + str('%0.3f' % (-1 * problem.value(current.state)))) + points = [] + for city in current.state: + points.append(self.frame_locations[city][0]) + points.append(self.frame_locations[city][1]) + map_canvas.create_polygon(points, outline='red', width=3, fill='', tag="poly") + map_canvas.update() + map_canvas.after(self.speed.get()) + + def genetic_algorithm(self, problem, map_canvas): + """Genetic Algorithm modified for the given problem""" + + def init_population(pop_number, gene_pool, state_length): + """initialize population""" + + population = [] + for i in range(pop_number): + population.append(utils.shuffled(gene_pool)) + return population + + def recombine(state_a, state_b): + """recombine two problem states""" + + start = random.randint(0, len(state_a) - 1) + end = random.randint(start + 1, len(state_a)) + new_state = state_a[start:end] + for city in state_b: + if city not in new_state: + new_state.append(city) + return new_state + + def mutate(state, mutation_rate): + """mutate problem states""" + + if random.uniform(0, 1) < mutation_rate: + sample = random.sample(range(len(state)), 2) + state[sample[0]], state[sample[1]] = state[sample[1]], state[sample[0]] + return state + + def fitness_fn(state): + """calculate fitness of a particular state""" + + fitness = problem.value(state) + return int((5600 + fitness) ** 2) + + current = 
Node(problem.initial)
+        population = init_population(100, current.state, len(current.state))
+        all_time_best = current.state
+        while True:
+            population = [mutate(recombine(*select(2, population, fitness_fn)), self.mutation_rate.get())
+                          for _ in range(len(population))]
+            # np.argmax does not accept a key function; use the builtin max instead
+            current_best = max(population, key=fitness_fn)
+            if fitness_fn(current_best) > fitness_fn(all_time_best):
+                all_time_best = current_best
+                self.cost.set("Cost = " + str('%0.3f' % (-1 * problem.value(all_time_best))))
+            map_canvas.delete('poly')
+            points = []
+            for city in current_best:
+                points.append(self.frame_locations[city][0])
+                points.append(self.frame_locations[city][1])
+            map_canvas.create_polygon(points, outline='red', width=1, fill='', tag='poly')
+            best_points = []
+            for city in all_time_best:
+                best_points.append(self.frame_locations[city][0])
+                best_points.append(self.frame_locations[city][1])
+            map_canvas.create_polygon(best_points, outline='red', width=3, fill='', tag='poly')
+            map_canvas.update()
+            map_canvas.after(self.speed.get())
+
+    def hill_climbing(self, problem, map_canvas):
+        """hill climbing where number of neighbors is taken as user input"""
+
+        def find_neighbors(state, number_of_neighbors=100):
+            """finds neighbors using two_opt method"""
+
+            neighbors = []
+            for i in range(number_of_neighbors):
+                new_state = problem.two_opt(state)
+                neighbors.append(Node(new_state))
+                state = new_state
+            return neighbors
+
+        current = Node(problem.initial)
+        while True:
+            neighbors = find_neighbors(current.state, self.no_of_neighbors.get())
+            # numpy has no argmax_random_tie; the builtin max is equivalent up to tie-breaking
+            neighbor = max(neighbors, key=lambda node: problem.value(node.state))
+            map_canvas.delete('poly')
+            points = []
+            for city in current.state:
+                points.append(self.frame_locations[city][0])
+                points.append(self.frame_locations[city][1])
+            map_canvas.create_polygon(points, outline='red', width=3, fill='', tag='poly')
+            neighbor_points = []
+            for city in neighbor.state:
+                neighbor_points.append(self.frame_locations[city][0])
+
neighbor_points.append(self.frame_locations[city][1]) + map_canvas.create_polygon(neighbor_points, outline='red', width=1, fill='', tag='poly') + map_canvas.update() + map_canvas.after(self.speed.get()) + if problem.value(neighbor.state) > problem.value(current.state): + current.state = neighbor.state + self.cost.set("Cost = " + str('%0.3f' % (-1 * problem.value(current.state)))) + + def on_closing(self): + if messagebox.askokcancel('Quit', 'Do you want to quit?'): + self.root.destroy() + + +if __name__ == '__main__': + all_cities = [] + for city in romania_map.locations.keys(): + distances[city] = {} + all_cities.append(city) + all_cities.sort() + + # distances['city1']['city2'] contains euclidean distance between their coordinates + for name_1, coordinates_1 in romania_map.locations.items(): + for name_2, coordinates_2 in romania_map.locations.items(): + distances[name_1][name_2] = np.linalg.norm( + [coordinates_1[0] - coordinates_2[0], coordinates_1[1] - coordinates_2[1]]) + distances[name_2][name_1] = np.linalg.norm( + [coordinates_1[0] - coordinates_2[0], coordinates_1[1] - coordinates_2[1]]) + + root = Tk() + root.title("Traveling Salesman Problem") + cities_selection_panel = TSPGui(root, all_cities) + cities_selection_panel.create_checkboxes() + cities_selection_panel.create_buttons() + cities_selection_panel.create_dropdown_menu() + root.protocol('WM_DELETE_WINDOW', cities_selection_panel.on_closing) + root.mainloop() diff --git a/gui/vacuum_agent.py b/gui/vacuum_agent.py new file mode 100644 index 000000000..b07dab282 --- /dev/null +++ b/gui/vacuum_agent.py @@ -0,0 +1,154 @@ +import os.path +from tkinter import * + +from agents import * + +sys.path.append(os.path.join(os.path.dirname(__file__), '..')) + +loc_A, loc_B = (0, 0), (1, 0) # The two locations for the Vacuum world + + +class Gui(Environment): + """This GUI environment has two locations, A and B. Each can be Dirty + or Clean. 
The agent perceives its location and the location's + status.""" + + def __init__(self, root, height=300, width=380): + super().__init__() + self.status = {loc_A: 'Clean', + loc_B: 'Clean'} + self.root = root + self.height = height + self.width = width + self.canvas = None + self.buttons = [] + self.create_canvas() + self.create_buttons() + + def thing_classes(self): + """The list of things which can be used in the environment.""" + return [Wall, Dirt, ReflexVacuumAgent, RandomVacuumAgent, + TableDrivenVacuumAgent, ModelBasedVacuumAgent] + + def percept(self, agent): + """Returns the agent's location, and the location status (Dirty/Clean).""" + return agent.location, self.status[agent.location] + + def execute_action(self, agent, action): + """Change the location status (Dirty/Clean); track performance. + Score 10 for each dirt cleaned; -1 for each move.""" + if action == 'Right': + agent.location = loc_B + agent.performance -= 1 + elif action == 'Left': + agent.location = loc_A + agent.performance -= 1 + elif action == 'Suck': + if self.status[agent.location] == 'Dirty': + if agent.location == loc_A: + self.buttons[0].config(bg='white', activebackground='light grey') + else: + self.buttons[1].config(bg='white', activebackground='light grey') + agent.performance += 10 + self.status[agent.location] = 'Clean' + + def default_location(self, thing): + """Agents start in either location at random.""" + return random.choice([loc_A, loc_B]) + + def create_canvas(self): + """Creates Canvas element in the GUI.""" + self.canvas = Canvas( + self.root, + width=self.width, + height=self.height, + background='powder blue') + self.canvas.pack(side='bottom') + + def create_buttons(self): + """Creates the buttons required in the GUI.""" + button_left = Button(self.root, height=4, width=12, padx=2, pady=2, bg='white') + button_left.config(command=lambda btn=button_left: self.dirt_switch(btn)) + self.buttons.append(button_left) + button_left_window = self.canvas.create_window(130, 
200, anchor=N, window=button_left) + button_right = Button(self.root, height=4, width=12, padx=2, pady=2, bg='white') + button_right.config(command=lambda btn=button_right: self.dirt_switch(btn)) + self.buttons.append(button_right) + button_right_window = self.canvas.create_window(250, 200, anchor=N, window=button_right) + + def dirt_switch(self, button): + """Gives user the option to put dirt in any tile.""" + bg_color = button['bg'] + if bg_color == 'saddle brown': + button.config(bg='white', activebackground='light grey') + elif bg_color == 'white': + button.config(bg='saddle brown', activebackground='light goldenrod') + + def read_env(self): + """Reads the current state of the GUI.""" + for i, btn in enumerate(self.buttons): + if i == 0: + if btn['bg'] == 'white': + self.status[loc_A] = 'Clean' + else: + self.status[loc_A] = 'Dirty' + else: + if btn['bg'] == 'white': + self.status[loc_B] = 'Clean' + else: + self.status[loc_B] = 'Dirty' + + def update_env(self, agent): + """Updates the GUI according to the agent's action.""" + self.read_env() + # print(self.status) + before_step = agent.location + self.step() + # print(self.status) + # print(agent.location) + move_agent(self, agent, before_step) + + +def create_agent(env, agent): + """Creates the agent in the GUI and is kept independent of the environment.""" + env.add_thing(agent) + # print(agent.location) + if agent.location == (0, 0): + env.agent_rect = env.canvas.create_rectangle(80, 100, 175, 180, fill='lime green') + env.text = env.canvas.create_text(128, 140, font="Helvetica 10 bold italic", text="Agent") + else: + env.agent_rect = env.canvas.create_rectangle(200, 100, 295, 180, fill='lime green') + env.text = env.canvas.create_text(248, 140, font="Helvetica 10 bold italic", text="Agent") + + +def move_agent(env, agent, before_step): + """Moves the agent in the GUI when 'next' button is pressed.""" + if agent.location == before_step: + pass + else: + if agent.location == (1, 0): + 
env.canvas.move(env.text, 120, 0) + env.canvas.move(env.agent_rect, 120, 0) + elif agent.location == (0, 0): + env.canvas.move(env.text, -120, 0) + env.canvas.move(env.agent_rect, -120, 0) + + +# TODO: Add more agents to the environment. +# TODO: Expand the environment to XYEnvironment. +if __name__ == "__main__": + root = Tk() + root.title("Vacuum Environment") + root.geometry("420x380") + root.resizable(0, 0) + frame = Frame(root, bg='black') + # reset_button = Button(frame, text='Reset', height=2, width=6, padx=2, pady=2, command=None) + # reset_button.pack(side='left') + next_button = Button(frame, text='Next', height=2, width=6, padx=2, pady=2) + next_button.pack(side='left') + frame.pack(side='bottom') + env = Gui(root) + agent = ReflexVacuumAgent() + create_agent(env, agent) + next_button.config(command=lambda: env.update_env(agent)) + root.mainloop() diff --git a/gui/xy_vacuum_environment.py b/gui/xy_vacuum_environment.py new file mode 100644 index 000000000..093abc6c3 --- /dev/null +++ b/gui/xy_vacuum_environment.py @@ -0,0 +1,191 @@ +import os.path +from tkinter import * + +from agents import * + +sys.path.append(os.path.join(os.path.dirname(__file__), '..')) + + +class Gui(VacuumEnvironment): + """This is a two-dimensional GUI environment. Each location may be + dirty, clean or can have a wall. The user can change these at each step. 
+ """ + xi, yi = (0, 0) + perceptible_distance = 1 + + def __init__(self, root, width=7, height=7, elements=None): + super().__init__(width, height) + if elements is None: + elements = ['D', 'W'] + self.root = root + self.create_frames() + self.create_buttons() + self.create_walls() + self.elements = elements + + def create_frames(self): + """Adds frames to the GUI environment.""" + self.frames = [] + for _ in range(7): + frame = Frame(self.root, bg='grey') + frame.pack(side='bottom') + self.frames.append(frame) + + def create_buttons(self): + """Adds buttons to the respective frames in the GUI.""" + self.buttons = [] + for frame in self.frames: + button_row = [] + for _ in range(7): + button = Button(frame, height=3, width=5, padx=2, pady=2) + button.config( + command=lambda btn=button: self.display_element(btn)) + button.pack(side='left') + button_row.append(button) + self.buttons.append(button_row) + + def create_walls(self): + """Creates the outer boundary walls which do not move.""" + for row, button_row in enumerate(self.buttons): + if row == 0 or row == len(self.buttons) - 1: + for button in button_row: + button.config(text='W', state='disabled', + disabledforeground='black') + else: + button_row[0].config( + text='W', state='disabled', disabledforeground='black') + button_row[len(button_row) - 1].config(text='W', + state='disabled', disabledforeground='black') + # Place the agent in the centre of the grid. 
+ self.buttons[3][3].config( + text='A', state='disabled', disabledforeground='black') + + def display_element(self, button): + """Show the things on the GUI.""" + txt = button['text'] + if txt != 'A': + if txt == 'W': + button.config(text='D') + elif txt == 'D': + button.config(text='') + elif txt == '': + button.config(text='W') + + def execute_action(self, agent, action): + """Determines the action the agent performs.""" + xi, yi = (self.xi, self.yi) + if action == 'Suck': + dirt_list = self.list_things_at(agent.location, Dirt) + if dirt_list: + dirt = dirt_list[0] + agent.performance += 100 + self.delete_thing(dirt) + self.buttons[xi][yi].config(text='', state='normal') + xf, yf = agent.location + self.buttons[xf][yf].config( + text='A', state='disabled', disabledforeground='black') + + else: + agent.bump = False + if action == 'TurnRight': + agent.direction += Direction.R + elif action == 'TurnLeft': + agent.direction += Direction.L + elif action == 'Forward': + agent.bump = self.move_to(agent, agent.direction.move_forward(agent.location)) + if not agent.bump: + self.buttons[xi][yi].config(text='', state='normal') + xf, yf = agent.location + self.buttons[xf][yf].config( + text='A', state='disabled', disabledforeground='black') + + if action != 'NoOp': + agent.performance -= 1 + + def read_env(self): + """Reads the current state of the GUI environment.""" + for i, btn_row in enumerate(self.buttons): + for j, btn in enumerate(btn_row): + if (i != 0 and i != len(self.buttons) - 1) and (j != 0 and j != len(btn_row) - 1): + agt_loc = self.agents[0].location + if self.some_things_at((i, j)) and (i, j) != agt_loc: + for thing in self.list_things_at((i, j)): + self.delete_thing(thing) + if btn['text'] == self.elements[0]: + self.add_thing(Dirt(), (i, j)) + elif btn['text'] == self.elements[1]: + self.add_thing(Wall(), (i, j)) + + def update_env(self): + """Updates the GUI environment according to the current state.""" + self.read_env() + agt = self.agents[0] + 
previous_agent_location = agt.location + self.xi, self.yi = previous_agent_location + self.step() + xf, yf = agt.location + + def reset_env(self, agt): + """Resets the GUI environment to the initial state.""" + self.read_env() + for i, btn_row in enumerate(self.buttons): + for j, btn in enumerate(btn_row): + if (i != 0 and i != len(self.buttons) - 1) and (j != 0 and j != len(btn_row) - 1): + if self.some_things_at((i, j)): + for thing in self.list_things_at((i, j)): + self.delete_thing(thing) + btn.config(text='', state='normal') + self.add_thing(agt, location=(3, 3)) + self.buttons[3][3].config( + text='A', state='disabled', disabledforeground='black') + + +def XYReflexAgentProgram(percept): + """The modified SimpleReflexAgentProgram for the GUI environment.""" + status, bump = percept + if status == 'Dirty': + return 'Suck' + + if bump == 'Bump': + value = random.choice((1, 2)) + else: + value = random.choice((1, 2, 3, 4)) # 1-right, 2-left, others-forward + + if value == 1: + return 'TurnRight' + elif value == 2: + return 'TurnLeft' + else: + return 'Forward' + + +class XYReflexAgent(Agent): + """The modified SimpleReflexAgent for the GUI environment.""" + + def __init__(self, program=None): + super().__init__(program) + self.location = (3, 3) + self.direction = Direction("up") + + +# TODO: Check the coordinate system. +# TODO: Give manual choice for agent's location. 
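The reflex program above maps a `(status, bump)` percept to an action: always `Suck` on a dirty tile, only turn after a bump, and otherwise mostly move forward. As a quick standalone check of that mapping (re-implementing the same decision logic outside the GUI, since this snippet does not import `agents.py`):

```python
import random

def xy_reflex_program(percept):
    """Same decision logic as XYReflexAgentProgram above, re-stated for a standalone check."""
    status, bump = percept
    if status == 'Dirty':
        return 'Suck'
    # After a bump only turning is possible; otherwise forward is twice as likely as either turn.
    value = random.choice((1, 2)) if bump == 'Bump' else random.choice((1, 2, 3, 4))
    if value == 1:
        return 'TurnRight'
    elif value == 2:
        return 'TurnLeft'
    return 'Forward'

assert xy_reflex_program(('Dirty', None)) == 'Suck'
assert xy_reflex_program(('Clean', 'Bump')) in ('TurnRight', 'TurnLeft')
assert xy_reflex_program(('Clean', None)) in ('TurnRight', 'TurnLeft', 'Forward')
```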
+if __name__ == "__main__": + root = Tk() + root.title("Vacuum Environment") + root.geometry("420x440") + root.resizable(0, 0) + frame = Frame(root, bg='black') + reset_button = Button(frame, text='Reset', height=2, + width=6, padx=2, pady=2) + reset_button.pack(side='left') + next_button = Button(frame, text='Next', height=2, + width=6, padx=2, pady=2) + next_button.pack(side='left') + frame.pack(side='bottom') + env = Gui(root) + agt = XYReflexAgent(program=XYReflexAgentProgram) + env.add_thing(agt, location=(3, 3)) + next_button.config(command=env.update_env) + reset_button.config(command=lambda: env.reset_env(agt)) + root.mainloop() diff --git a/images/-0.04.jpg b/images/-0.04.jpg new file mode 100644 index 000000000..3cf276421 Binary files /dev/null and b/images/-0.04.jpg differ diff --git a/images/-0.4.jpg b/images/-0.4.jpg new file mode 100644 index 000000000..b274d2ce3 Binary files /dev/null and b/images/-0.4.jpg differ diff --git a/images/-4.jpg b/images/-4.jpg new file mode 100644 index 000000000..79eefb0cd Binary files /dev/null and b/images/-4.jpg differ diff --git a/images/4.jpg b/images/4.jpg new file mode 100644 index 000000000..55e75001d Binary files /dev/null and b/images/4.jpg differ diff --git a/images/broxrevised.png b/images/broxrevised.png new file mode 100644 index 000000000..87051a383 Binary files /dev/null and b/images/broxrevised.png differ diff --git a/images/cake_graph.jpg b/images/cake_graph.jpg new file mode 100644 index 000000000..160a413ca Binary files /dev/null and b/images/cake_graph.jpg differ diff --git a/images/ensemble_learner.jpg b/images/ensemble_learner.jpg new file mode 100644 index 000000000..b1edd1ec5 Binary files /dev/null and b/images/ensemble_learner.jpg differ diff --git a/images/ge0.jpg b/images/ge0.jpg new file mode 100644 index 000000000..a70b18703 Binary files /dev/null and b/images/ge0.jpg differ diff --git a/images/ge1.jpg b/images/ge1.jpg new file mode 100644 index 000000000..624f16e25 Binary files /dev/null 
and b/images/ge1.jpg differ diff --git a/images/ge2.jpg b/images/ge2.jpg new file mode 100644 index 000000000..3a29f8f4c Binary files /dev/null and b/images/ge2.jpg differ diff --git a/images/ge4.jpg b/images/ge4.jpg new file mode 100644 index 000000000..b3a4b4acd Binary files /dev/null and b/images/ge4.jpg differ diff --git a/images/general_learning_agent.jpg b/images/general_learning_agent.jpg new file mode 100644 index 000000000..a8153bef8 Binary files /dev/null and b/images/general_learning_agent.jpg differ diff --git a/images/grid_mdp.jpg b/images/grid_mdp.jpg new file mode 100644 index 000000000..fa77fa276 Binary files /dev/null and b/images/grid_mdp.jpg differ diff --git a/images/grid_mdp_agent.jpg b/images/grid_mdp_agent.jpg new file mode 100644 index 000000000..3f247b6f2 Binary files /dev/null and b/images/grid_mdp_agent.jpg differ diff --git a/images/hillclimb-tsp.png b/images/hillclimb-tsp.png new file mode 100644 index 000000000..8446bbafc Binary files /dev/null and b/images/hillclimb-tsp.png differ diff --git a/images/knowledge_FOIL_grandparent.png b/images/knowledge_FOIL_grandparent.png new file mode 100644 index 000000000..dbc6e7729 Binary files /dev/null and b/images/knowledge_FOIL_grandparent.png differ diff --git a/images/knowledge_foil_family.png b/images/knowledge_foil_family.png new file mode 100644 index 000000000..356f22d8d Binary files /dev/null and b/images/knowledge_foil_family.png differ diff --git a/images/maze.png b/images/maze.png new file mode 100644 index 000000000..f3fcd1990 Binary files /dev/null and b/images/maze.png differ diff --git a/images/mdp-b.png b/images/mdp-b.png new file mode 100644 index 000000000..f21a3760c Binary files /dev/null and b/images/mdp-b.png differ diff --git a/images/mdp-c.png b/images/mdp-c.png new file mode 100644 index 000000000..1034079a2 Binary files /dev/null and b/images/mdp-c.png differ diff --git a/images/mdp-d.png b/images/mdp-d.png new file mode 100644 index 000000000..8ba7cf073 Binary files 
/dev/null and b/images/mdp-d.png differ diff --git a/images/model_based_reflex_agent.jpg b/images/model_based_reflex_agent.jpg new file mode 100644 index 000000000..b6c12ed09 Binary files /dev/null and b/images/model_based_reflex_agent.jpg differ diff --git a/images/model_goal_based_agent.jpg b/images/model_goal_based_agent.jpg new file mode 100644 index 000000000..93d6182b4 Binary files /dev/null and b/images/model_goal_based_agent.jpg differ diff --git a/images/model_utility_based_agent.jpg b/images/model_utility_based_agent.jpg new file mode 100644 index 000000000..693230c00 Binary files /dev/null and b/images/model_utility_based_agent.jpg differ diff --git a/images/pop.jpg b/images/pop.jpg new file mode 100644 index 000000000..52b3e3756 Binary files /dev/null and b/images/pop.jpg differ diff --git a/images/queen_s.png b/images/queen_s.png new file mode 100644 index 000000000..cc693102a Binary files /dev/null and b/images/queen_s.png differ diff --git a/images/random_forest.png b/images/random_forest.png new file mode 100644 index 000000000..e0ab1d658 Binary files /dev/null and b/images/random_forest.png differ diff --git a/images/refinement.png b/images/refinement.png new file mode 100644 index 000000000..8270d81d0 Binary files /dev/null and b/images/refinement.png differ diff --git a/images/romania_map.png b/images/romania_map.png new file mode 100644 index 000000000..426c76f1e Binary files /dev/null and b/images/romania_map.png differ diff --git a/images/simple_problem_solving_agent.jpg b/images/simple_problem_solving_agent.jpg new file mode 100644 index 000000000..80fb904b5 Binary files /dev/null and b/images/simple_problem_solving_agent.jpg differ diff --git a/images/simple_reflex_agent.jpg b/images/simple_reflex_agent.jpg new file mode 100644 index 000000000..74002a720 Binary files /dev/null and b/images/simple_reflex_agent.jpg differ diff --git a/images/stapler1-test.png b/images/stapler1-test.png new file mode 100644 index 000000000..e550d83f9 Binary 
files /dev/null and b/images/stapler1-test.png differ
diff --git a/improving_sat_algorithms.ipynb b/improving_sat_algorithms.ipynb
new file mode 100644
index 000000000..d461e99c4
--- /dev/null
+++ b/improving_sat_algorithms.ipynb
@@ -0,0 +1,2539 @@
+{
+ "cells": [
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "pycharm": {}
+ },
+ "source": [
+ "# Propositional Logic\n",
+ "---\n",
+ "# Improving Boolean Satisfiability Algorithms\n",
+ "\n",
+ "## Introduction\n",
+ "A propositional formula $\\Phi$ in *Conjunctive Normal Form* (CNF) is a conjunction of clauses $\\omega_j$, with $j \\in \\{1,...,m\\}$. Each clause is a disjunction of literals, and each literal is either a positive ($x_i$) or a negative ($\\lnot{x_i}$) propositional variable, with $i \\in \\{1,...,n\\}$. By denoting with $[\\lnot]$ the possible presence of $\\lnot$, we can formally define $\\Phi$ as:\n",
+ "\n",
+ "$$\\bigwedge_{j = 1,...,m}\\bigg(\\bigvee_{i \\in \\omega_j} [\\lnot] x_i\\bigg)$$\n",
+ "\n",
+ "The ***Boolean Satisfiability Problem*** (SAT) consists in determining whether there exists a truth assignment in $\\{0, 1\\}$ (or equivalently in $\\{True,False\\}$) for the variables in $\\Phi$."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 1,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "from logic import *"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## DPLL with Branching Heuristics\n",
+ "The ***Davis-Putnam-Logemann-Loveland*** (DPLL) algorithm is a *complete* (will answer SAT if a solution exists) and *sound* (it will not answer SAT for an unsatisfiable formula) procedure that combines *backtracking search* and *deduction* to decide satisfiability of propositional logic formulas in CNF. At each search step a variable and a propositional value are selected for branching purposes. With each branching step, two values can be assigned to a variable, either 0 or 1. 
Branching corresponds to assigning the chosen value to the chosen variable. Afterwards, the logical consequences of each branching step are evaluated. Each time an unsatisfied clause (i.e. a *conflict*) is identified, backtracking is executed. Backtracking corresponds to undoing branching steps until an unflipped branch is reached. When both values have been assigned to the selected variable at a branching step, backtracking will undo this branching step. If for the first branching step both values have been considered, and backtracking undoes this first branching step, then the CNF formula can be declared unsatisfiable. This kind of backtracking is called *chronological backtracking*.\n",
+ "\n",
+ "Essentially, `DPLL` is a backtracking depth-first search through partial truth assignments which uses a *splitting rule* to replace the original problem with two smaller subproblems, whereas the original Davis-Putnam procedure uses a variable elimination rule which replaces the original problem with one larger subproblem. Over the years, many heuristics have been proposed for choosing the splitting variable (which variable should be assigned a truth value next).\n",
+ "\n",
+ "Search algorithms that are based on a predetermined order of search are called static algorithms, whereas those that select the order at runtime are called dynamic. The first SAT search algorithm, the Davis-Putnam procedure, is a static algorithm. Static search algorithms are usually very slow in practice and for this reason perform worse than dynamic search algorithms. However, dynamic search algorithms are much harder to design, since they require a heuristic for determining the order of search. The fundamental element of such a heuristic is a branching strategy for selecting the next branching literal. 
This must not require a lot of time to compute and yet it must provide powerful insight into the problem instance.\n",
+ "\n",
+ "Two basic heuristics are applied to this algorithm with the potential of cutting the search space in half. These are the *pure literal rule* and the *unit clause rule*.\n",
+ "- the *pure literal* rule is applied whenever a variable appears with a single polarity in all the unsatisfied clauses. In this case, assigning a truth value to the variable so that all the involved clauses are satisfied is highly effective in the search;\n",
+ "- if some variable occurs in the current formula in a clause of length 1 then the *unit clause* rule is applied. Here, the literal is selected and assigned the truth value that satisfies its clause. The iterative application of the unit clause rule is commonly referred to as *Boolean Constraint Propagation* (BCP)."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 2,
+ "metadata": {},
+ "outputs": [
+ {
+ "data": {
+ "text/plain": [
+ "\u001b[0;32mdef\u001b[0m \u001b[0mdpll_satisfiable\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0ms\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mbranching_heuristic\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mno_branching_heuristic\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n",
+ "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"Check satisfiability of a propositional sentence.\u001b[0m\n",
+ "\u001b[0;34m This differs from the book code in two ways: (1) it returns a model\u001b[0m\n",
+ "\u001b[0;34m rather than True when it succeeds; this is more useful. 
(2) The\u001b[0m\n", + "\u001b[0;34m function find_pure_symbol is passed a list of unknown clauses, rather\u001b[0m\n", + "\u001b[0;34m than a list of all clauses and the model; this is more efficient.\u001b[0m\n", + "\u001b[0;34m >>> dpll_satisfiable(A |'<=>'| B) == {A: True, B: True}\u001b[0m\n", + "\u001b[0;34m True\u001b[0m\n", + "\u001b[0;34m \"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mdpll\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mconjuncts\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mto_cnf\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0ms\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mprop_symbols\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0ms\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m{\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mbranching_heuristic\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource dpll_satisfiable" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0mdpll\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclauses\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0msymbols\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mbranching_heuristic\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mno_branching_heuristic\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"See if the clauses are true in a partial model.\"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0munknown_clauses\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;31m# clauses with an unknown truth value\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mc\u001b[0m 
\u001b[0;32min\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mval\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mpl_true\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mval\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mval\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0;32mNone\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0munknown_clauses\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0munknown_clauses\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mP\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mvalue\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mfind_pure_symbol\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msymbols\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0munknown_clauses\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mP\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mdpll\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclauses\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mremove_all\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mP\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0msymbols\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m 
\u001b[0mextend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mmodel\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mP\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mvalue\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mbranching_heuristic\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mP\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mvalue\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mfind_unit_clause\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclauses\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mP\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mdpll\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclauses\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mremove_all\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mP\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0msymbols\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mextend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mmodel\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mP\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mvalue\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mbranching_heuristic\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mP\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mvalue\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mbranching_heuristic\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msymbols\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0munknown_clauses\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;34m(\u001b[0m\u001b[0mdpll\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclauses\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mremove_all\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mP\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0msymbols\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m 
\u001b[0mextend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mmodel\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mP\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mvalue\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mbranching_heuristic\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mor\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mdpll\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclauses\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mremove_all\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mP\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0msymbols\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mextend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mmodel\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mP\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0mvalue\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mbranching_heuristic\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource dpll" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Each of these branching heuristics is applied only after the *pure literal* and the *unit clause* heuristics have failed to select a splitting variable." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### MOMs" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "MOMs heuristics are simple, efficient and easy to implement. The goal of these heuristics is to prefer the literal having the ***Maximum number of Occurrences in the Minimum length clauses***. Intuitively, the literals belonging to the minimum length clauses are the most constrained literals in the formula. Branching on them will maximize the effect of BCP and the likelihood of hitting a dead end early in the search tree (for unsatisfiable problems).
Conversely, in the case of satisfiable formulas, branching on a highly constrained variable early in the tree will also increase the likelihood of a correct assignment of the remaining open literals.\n", + "The main disadvantage of MOMs heuristics is that their effectiveness depends heavily on the problem instance. It is easy to see that the ideal setting for these heuristics is considering the unsatisfied binary clauses." + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0mmin_clauses\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclauses\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mmin_len\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mmin\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mmap\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;32mlambda\u001b[0m \u001b[0mc\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mdefault\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;36m2\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mfilter\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;32mlambda\u001b[0m \u001b[0mc\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0;34m(\u001b[0m\u001b[0mmin_len\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mmin_len\u001b[0m \u001b[0;34m>\u001b[0m \u001b[0;36m1\u001b[0m \u001b[0;32melse\u001b[0m \u001b[0;36m2\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type":
"display_data" + } + ], + "source": [ + "%psource min_clauses" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0mmoms\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msymbols\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"\u001b[0m\n", + "\u001b[0;34m MOMS (Maximum Occurrence in clauses of Minimum Size) heuristic\u001b[0m\n", + "\u001b[0;34m Returns the literal with the most occurrences in all clauses of minimum size\u001b[0m\n", + "\u001b[0;34m \"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mscores\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mCounter\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0ml\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mc\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mmin_clauses\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclauses\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0ml\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mprop_symbols\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mmax\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msymbols\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mkey\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mlambda\u001b[0m \u001b[0msymbol\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0msymbol\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource moms" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Over the years, many types of MOMs heuristics have been proposed.\n", + "\n", + "***MOMSf*** chooses the variable $x$ that maximizes the function:\n", + "\n", + "$$[f(x) + f(\\lnot{x})] * 2^k + f(x) * f(\\lnot{x})$$\n", + "\n", + "where $f(x)$ is the number of occurrences of $x$ in the smallest unknown clauses and $k$ is a parameter." + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0mmomsf\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msymbols\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mk\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"\u001b[0m\n", + "\u001b[0;34m MOMS alternative heuristic\u001b[0m\n", + "\u001b[0;34m If f(x) the number of occurrences of the variable x in clauses with minimum size,\u001b[0m\n", + "\u001b[0;34m we choose the variable maximizing [f(x) + f(-x)] * 2^k + f(x) * f(-x)\u001b[0m\n", + "\u001b[0;34m Returns x if f(x) >= f(-x) otherwise -x\u001b[0m\n", + "\u001b[0;34m \"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mscores\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mCounter\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0ml\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mc\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mmin_clauses\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclauses\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0ml\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mdisjuncts\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mP\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mmax\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msymbols\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mkey\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mlambda\u001b[0m \u001b[0msymbol\u001b[0m\u001b[0;34m:\u001b[0m
\u001b[0;34m(\u001b[0m\u001b[0mscores\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0msymbol\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m~\u001b[0m\u001b[0msymbol\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mpow\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;36m2\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mk\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0msymbol\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m~\u001b[0m\u001b[0msymbol\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mP\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;32mTrue\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mP\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m>=\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m~\u001b[0m\u001b[0mP\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;32melse\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource momsf" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "***Freeman’s POSIT***
    [[1]](#cite-freeman1995improvements) version counts both the number of positive $x$ and negative $\\lnot{x}$ occurrences of a given variable $x$." + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0mposit\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msymbols\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"\u001b[0m\n", + "\u001b[0;34m Freeman's POSIT version of MOMs\u001b[0m\n", + "\u001b[0;34m Counts the positive x and negative x for each variable x in clauses with minimum size\u001b[0m\n", + "\u001b[0;34m Returns x if f(x) >= f(-x) otherwise -x\u001b[0m\n", + "\u001b[0;34m \"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mscores\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mCounter\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0ml\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mc\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mmin_clauses\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclauses\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0ml\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mdisjuncts\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mP\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mmax\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msymbols\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mkey\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mlambda\u001b[0m \u001b[0msymbol\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0msymbol\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m~\u001b[0m\u001b[0msymbol\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m 
\u001b[0mP\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;32mTrue\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mP\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m>=\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m~\u001b[0m\u001b[0mP\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;32melse\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource posit" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "***Zabih and McAllester’s*** [[2]](#cite-zabih1988rearrangement) version of the heuristic counts the negative occurrences $\\lnot{x}$ of each given variable $x$." + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0mzm\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msymbols\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"\u001b[0m\n", + "\u001b[0;34m Zabih and McAllester's version of MOMs\u001b[0m\n", + "\u001b[0;34m Counts the negative occurrences only of each variable x in clauses with minimum size\u001b[0m\n", + "\u001b[0;34m \"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mscores\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mCounter\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0ml\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mc\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mmin_clauses\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclauses\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0ml\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mdisjuncts\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0ml\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mop\u001b[0m \u001b[0;34m==\u001b[0m 
\u001b[0;34m'~'\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mmax\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msymbols\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mkey\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mlambda\u001b[0m \u001b[0msymbol\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m~\u001b[0m\u001b[0msymbol\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource zm" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### DLIS & DLCS" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Literal count heuristics count the number of unresolved clauses in which a given variable $x$ appears as a positive literal, $C_P$, and as a negative literal, $C_N$. These two numbers can either be considered individually or combined.\n", + "\n", + "The ***Dynamic Largest Individual Sum*** heuristic considers the values $C_P$ and $C_N$ separately: select the variable with the largest individual value and assign to it value true if $C_P \\geq C_N$, value false otherwise."
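As a rough, self-contained illustration of the literal-count idea (a sketch only, not the notebook's `Expr`-based `dlcs` shown below; it assumes a DIMACS-style encoding where clauses are lists of signed integers and `-x` denotes the negation of variable `x`):

```python
from collections import Counter

def dlcs_pick(clauses):
    """DLCS sketch on DIMACS-style clauses (signed ints; -x negates x).
    Picks the variable with the largest combined count C_P + C_N and
    assigns it True if C_P >= C_N, False otherwise."""
    counts = Counter(lit for clause in clauses for lit in clause)
    variables = {abs(lit) for clause in clauses for lit in clause}
    x = max(variables, key=lambda v: counts[v] + counts[-v])
    return x, counts[x] >= counts[-x]

# (x1 | ~x2) & (x1 | x3) & (~x1 | x2 | ~x3): x1 has the largest C_P + C_N
print(dlcs_pick([[1, -2], [1, 3], [-1, 2, -3]]))  # (1, True)
```

Restricting the same counts to a single polarity per literal gives the DLIS variant.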
+ ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0mdlis\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msymbols\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"\u001b[0m\n", + "\u001b[0;34m DLIS (Dynamic Largest Individual Sum) heuristic\u001b[0m\n", + "\u001b[0;34m Choose the variable and value that satisfies the maximum number of unsatisfied clauses\u001b[0m\n", + "\u001b[0;34m Like DLCS but we only consider the literal (thus Cp and Cn are individual)\u001b[0m\n", + "\u001b[0;34m \"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mscores\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mCounter\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0ml\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mc\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mclauses\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0ml\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mdisjuncts\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mP\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mmax\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msymbols\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mkey\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mlambda\u001b[0m \u001b[0msymbol\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0msymbol\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mP\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;32mTrue\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mP\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m>=\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m~\u001b[0m\u001b[0mP\u001b[0m\u001b[0;34m]\u001b[0m 
\u001b[0;32melse\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource dlis" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "***Dynamic Largest Combined Sum*** considers the values $C_P$ and $C_N$ combined: select the variable with the largest sum $C_P + C_N$ and assign to it value true if $C_P \\geq C_N$, value false otherwise." + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0mdlcs\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msymbols\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"\u001b[0m\n", + "\u001b[0;34m DLCS (Dynamic Largest Combined Sum) heuristic\u001b[0m\n", + "\u001b[0;34m Cp the number of clauses containing literal x\u001b[0m\n", + "\u001b[0;34m Cn the number of clauses containing literal -x\u001b[0m\n", + "\u001b[0;34m Here we select the variable maximizing Cp + Cn\u001b[0m\n", + "\u001b[0;34m Returns x if Cp >= Cn otherwise -x\u001b[0m\n", + "\u001b[0;34m \"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mscores\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mCounter\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0ml\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mc\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mclauses\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0ml\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mdisjuncts\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mP\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mmax\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msymbols\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mkey\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mlambda\u001b[0m 
\u001b[0msymbol\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0msymbol\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m~\u001b[0m\u001b[0msymbol\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mP\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;32mTrue\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mP\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m>=\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m~\u001b[0m\u001b[0mP\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;32melse\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource dlcs" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### JW & JW2" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Two branching heuristics were proposed by ***Jeroslow and Wang*** in [[3]](#cite-jeroslow1990solving).\n", + "\n", + "The *one-sided Jeroslow and Wang*’s heuristic computes:\n", + "\n", + "$$J(l) = \\sum_{l \\in \\omega \\land \\omega \\in \\phi} 2^{-|\\omega|}$$\n", + "\n", + "and selects the assignment that satisfies the literal with the largest value $J(l)$."
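The formula above can be sketched directly on a simplified DIMACS-style representation (an illustration of $J(l)$ only, not the notebook's `Expr`-based `jw` shown below; clauses are lists of signed integers, with `-x` the negation of `x`):

```python
from collections import Counter

def jw_scores(clauses):
    """One-sided Jeroslow-Wang sketch: for each literal l,
    J(l) = sum over clauses w containing l of 2**-|w|."""
    J = Counter()
    for clause in clauses:
        for lit in clause:
            J[lit] += 2.0 ** -len(clause)
    return J

J = jw_scores([[1, -2], [1, 3], [-1, 2, -3]])
best = max(J, key=J.get)  # literal with the largest J value
print(best, J[best])      # 1 0.5
```

Short clauses weigh exponentially more, so the heuristic favors literals that help satisfy the most constrained clauses first.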
+ ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0mjw\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msymbols\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"\u001b[0m\n", + "\u001b[0;34m Jeroslow-Wang heuristic\u001b[0m\n", + "\u001b[0;34m For each literal compute J(l) = \\sum{l in clause c} 2^{-|c|}\u001b[0m\n", + "\u001b[0;34m Return the literal maximizing J\u001b[0m\n", + "\u001b[0;34m \"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mscores\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mCounter\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mc\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0ml\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mprop_symbols\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0ml\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m+=\u001b[0m \u001b[0mpow\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;36m2\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m-\u001b[0m\u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mmax\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msymbols\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mkey\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mlambda\u001b[0m \u001b[0msymbol\u001b[0m\u001b[0;34m:\u001b[0m 
\u001b[0mscores\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0msymbol\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource jw" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The *two-sided Jeroslow and Wang*’s heuristic identifies the variable $x$ with the largest sum $J(x) + J(\\lnot{x})$, and assigns to $x$ value true, if $J(x) \\geq J(\\lnot{x})$, and value false otherwise." + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0mjw2\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msymbols\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"\u001b[0m\n", + "\u001b[0;34m Two Sided Jeroslow-Wang heuristic\u001b[0m\n", + "\u001b[0;34m Compute J(l) also counts the negation of l = J(x) + J(-x)\u001b[0m\n", + "\u001b[0;34m Returns x if J(x) >= J(-x) otherwise -x\u001b[0m\n", + "\u001b[0;34m \"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mscores\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mCounter\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mc\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0ml\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mdisjuncts\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0ml\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m+=\u001b[0m 
\u001b[0mpow\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;36m2\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m-\u001b[0m\u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mP\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mmax\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msymbols\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mkey\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mlambda\u001b[0m \u001b[0msymbol\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0msymbol\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m~\u001b[0m\u001b[0msymbol\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mP\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;32mTrue\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mP\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m>=\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m~\u001b[0m\u001b[0mP\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;32melse\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource jw2" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## CDCL with 1UIP Learning Scheme, 2WL Lazy Data Structure, VSIDS Branching Heuristic & Restarts\n", + "\n", + "The ***Conflict-Driven Clause Learning*** (CDCL) solver is an evolution of the *DPLL* algorithm that involves a number of additional key techniques:\n", + "\n", + "- non-chronological backtracking or *backjumping*;\n", + "- *learning* new *clauses* from conflicts during search by exploiting its structure;\n", + "- using *lazy data structures* for storing clauses;\n", + "- *branching heuristics* 
 with low computational overhead that receive feedback from search;\n", + "- periodically *restarting* search.\n", + "\n", + "The first difference between a DPLL solver and a CDCL solver is the introduction of *non-chronological backtracking*, or *backjumping*, when a conflict is identified. This requires an iterative implementation of the algorithm, because backtracking more than one level is possible only if the backtrack stack is managed explicitly." + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0mcdcl_satisfiable\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0ms\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mvsids_decay\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;36m0.95\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mrestart_strategy\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mno_restart\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"\u001b[0m\n", + "\u001b[0;34m >>> cdcl_satisfiable(A |'<=>'| B) == {A: True, B: True}\u001b[0m\n", + "\u001b[0;34m True\u001b[0m\n", + "\u001b[0;34m \"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mclauses\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mTwoWLClauseDatabase\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mconjuncts\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mto_cnf\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0ms\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0msymbols\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mprop_symbols\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0ms\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mscores\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mCounter\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mG\u001b[0m \u001b[0;34m=\u001b[0m
\u001b[0mnx\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mDiGraph\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mmodel\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m{\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mdl\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;36m0\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mconflicts\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;36m0\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mrestarts\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0msum_lbd\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;36m0\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mqueue_lbd\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mwhile\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mconflict\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0munit_propagation\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclauses\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0msymbols\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mG\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mdl\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mconflict\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mdl\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0;36m0\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mconflicts\u001b[0m \u001b[0;34m+=\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m 
\u001b[0mdl\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mlearn\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mlbd\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mconflict_analysis\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mG\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mdl\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mqueue_lbd\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlbd\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0msum_lbd\u001b[0m \u001b[0;34m+=\u001b[0m \u001b[0mlbd\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mbackjump\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msymbols\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mG\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mdl\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0madd\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlearn\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mupdate\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0ml\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0ml\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mdisjuncts\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlearn\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0msymbol\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0msymbol\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m*=\u001b[0m \u001b[0mvsids_decay\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mrestart_strategy\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mconflicts\u001b[0m\u001b[0;34m,\u001b[0m 
\u001b[0mrestarts\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mqueue_lbd\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0msum_lbd\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mbackjump\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msymbols\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mG\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mqueue_lbd\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mclear\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mrestarts\u001b[0m \u001b[0;34m+=\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0msymbols\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mdl\u001b[0m \u001b[0;34m+=\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0massign_decision_literal\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msymbols\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mG\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mdl\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource cdcl_satisfiable" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Clause Learning with 1UIP Scheme" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The second important difference between a DPLL solver and a CDCL solver is that the information about a conflict is reused by learning: if a 
conflicting clause is found, the solver derives a new clause from the conflict and adds it to the clauses database.\n", + "\n", + "Whenever a conflict is identified due to unit propagation, a conflict analysis procedure is invoked. As a result, one or more new clauses are learnt, and a backtracking decision level is computed. The conflict analysis procedure analyzes the structure of unit propagation and decides which literals to include in the learnt clause. The decision levels associated with assigned variables define a partial order of the variables. Starting from a given unsatisfied clause (represented in the implication graph with vertex $\\kappa$), the conflict analysis procedure visits variables implied at the most recent decision level (i.e. the current largest decision level), identifies the antecedents of visited variables, and keeps from the antecedents the literals assigned at decision levels less than the most recent decision level. The clause learning procedure used in CDCL can be defined by a sequence of selective resolution operations, each of which yields a new temporary clause. This process is repeated until the most recent decision variable is visited.\n", + "\n", + "The structure of implied assignments induced by unit propagation is a key aspect of the clause learning procedure. Moreover, the structure induced by unit propagation was further exploited with ***Unique Implication Points*** (UIPs). A UIP is a *dominator* in the implication graph and represents an alternative decision assignment at the current decision level that results in the same conflict. The main motivation for identifying UIPs is to reduce the size of learnt clauses. Clause learning could potentially stop at any UIP, but it is straightforward to see that the clause learnt at the first UIP has clear advantages. 
Considering the largest decision level of the literals of the clause learnt at each UIP, the clause learnt at the first UIP is guaranteed to contain the smallest one. This guarantees the highest backtrack jump in the search tree." + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0mconflict_analysis\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mG\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mdl\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mconflict_clause\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mnext\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mG\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mp\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m'K'\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m'antecedent'\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mp\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mG\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mpred\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m'K'\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mP\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mnext\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnode\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mnode\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mG\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mnodes\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m-\u001b[0m \u001b[0;34m'K'\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mG\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mnodes\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mnode\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m'dl'\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0mdl\u001b[0m \u001b[0;32mand\u001b[0m \u001b[0mG\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0min_degree\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnode\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m==\u001b[0m 
\u001b[0;36m0\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mfirst_uip\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mnx\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mimmediate_dominators\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mG\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mP\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m'K'\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mG\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mremove_node\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'K'\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mconflict_side\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mnx\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdescendants\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mG\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mfirst_uip\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mwhile\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0ml\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mprop_symbols\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mconflict_clause\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mintersection\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mconflict_side\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mantecedent\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mnext\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mG\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mp\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0ml\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m'antecedent'\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mp\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mG\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mpred\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0ml\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + 
"\u001b[0;34m\u001b[0m \u001b[0mconflict_clause\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mpl_binary_resolution\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mconflict_clause\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mantecedent\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# the literal block distance is calculated by taking the decision levels from variables of all\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# literals in the clause, and counting how many different decision levels were in this set\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mlbd\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0mG\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mnodes\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0ml\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m'dl'\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0ml\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mprop_symbols\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mconflict_clause\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mlbd\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcount\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mdl\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0;36m1\u001b[0m \u001b[0;32mand\u001b[0m \u001b[0mfirst_uip\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mprop_symbols\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mconflict_clause\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;36m0\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlbd\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0;36m1\u001b[0m \u001b[0;32melse\u001b[0m \u001b[0mheapq\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mnlargest\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;36m2\u001b[0m\u001b[0;34m,\u001b[0m 
\u001b[0mlbd\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mconflict_clause\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mset\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlbd\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource conflict_analysis" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0mpl_binary_resolution\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mci\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mcj\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mdi\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mdisjuncts\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mci\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mdj\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mdisjuncts\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mcj\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mdi\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0;34m~\u001b[0m\u001b[0mdj\u001b[0m \u001b[0;32mor\u001b[0m \u001b[0;34m~\u001b[0m\u001b[0mdi\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0mdj\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mpl_binary_resolution\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0massociate\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'|'\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mremove_all\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mdi\u001b[0m\u001b[0;34m,\u001b[0m 
\u001b[0mdisjuncts\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mci\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0massociate\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'|'\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mremove_all\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mdj\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mdisjuncts\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mcj\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0massociate\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'|'\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0munique\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mdisjuncts\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mci\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0mdisjuncts\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mcj\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource pl_binary_resolution" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0mbackjump\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msymbols\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mG\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mdl\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mdelete\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m{\u001b[0m\u001b[0mnode\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mnode\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mG\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mnodes\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mif\u001b[0m 
\u001b[0mG\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mnodes\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mnode\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m'dl'\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m>\u001b[0m \u001b[0mdl\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mG\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mremove_nodes_from\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mdelete\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mnode\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mdelete\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdel\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mnode\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0msymbols\u001b[0m \u001b[0;34m|=\u001b[0m \u001b[0mdelete\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource backjump" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### 2WL Lazy Data Structure" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Implementation issues for SAT solvers include the design of suitable data structures for storing clauses. The implemented data structures dictate the way BCP is implemented and have a significant impact on the run-time performance of the SAT solver. Recent state-of-the-art SAT solvers are characterized by very efficient data structures, intended to reduce the CPU time required per node in the search tree. Conversely, traditional SAT data structures are accurate, meaning that it is possible to know exactly the value of each literal in the clause. 
Examples of the most recent SAT data structures, which are not accurate and are therefore called lazy, include the watched literals used in Chaff.\n", + "\n", + "The more recent Chaff SAT solver [[4]](#cite-moskewicz2001chaff) proposed a new data structure, the ***2 Watched Literals*** (2WL), in which two references are associated with each clause. There is no order relation between the two references, so each of them may move in either direction. This lack of order has the key advantage that no literal references need to be updated when backtracking takes place: the pointers do not move, so no references to the just-assigned literals have to be kept. The drawback is that unit or unsatisfied clauses are identified only after traversing all of the clause's literals: each time a literal watched by one of the two pointers is assigned, the pointer has to move inwards, which causes the whole clause to be traversed when the clause becomes unit." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0munit_propagation\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclauses\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0msymbols\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mG\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mdl\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mcheck\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0mmodel\u001b[0m \u001b[0;32mor\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mget_first_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mget_second_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mw1\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0m_\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0minspect_literal\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclauses\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mget_first_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mw1\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mc\u001b[0m \u001b[0;32min\u001b[0m 
\u001b[0;34m(\u001b[0m\u001b[0mclauses\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mget_neg_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mw1\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mw1\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;32melse\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mget_pos_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mw1\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mw2\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0m_\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0minspect_literal\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclauses\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mget_second_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mw2\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mc\u001b[0m \u001b[0;32min\u001b[0m \u001b[0;34m(\u001b[0m\u001b[0mclauses\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mget_neg_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mw2\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mw2\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;32melse\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mget_pos_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mw2\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0munit_clause\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mwatching\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mw\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mp\u001b[0m \u001b[0;34m=\u001b[0m 
\u001b[0minspect_literal\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mwatching\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mG\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0madd_node\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mw\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mval\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mp\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mdl\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mdl\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mG\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0madd_edges_from\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mzip\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mprop_symbols\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m-\u001b[0m \u001b[0;34m{\u001b[0m\u001b[0mw\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mitertools\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcycle\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mw\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mantecedent\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0msymbols\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mremove\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mw\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mw\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mp\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mconflict_clause\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m 
\u001b[0mG\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0madd_edges_from\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mzip\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mprop_symbols\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mitertools\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcycle\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'K'\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mantecedent\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mwhile\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mbcp\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mc\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mfilter\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mcheck\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mget_clauses\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# we need only visit each clause when one of its two watched literals is assigned to 0 because, until\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# this happens, we can guarantee that there cannot be more than n-2 literals in the clause assigned to 0\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mfirst_watched\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mpl_true\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclauses\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mget_first_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m 
\u001b[0msecond_watched\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mpl_true\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclauses\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mget_second_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mfirst_watched\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0;32mNone\u001b[0m \u001b[0;32mand\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mget_first_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mget_second_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0munit_clause\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclauses\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mget_first_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mbcp\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mbreak\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32melif\u001b[0m \u001b[0mfirst_watched\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0;32mFalse\u001b[0m \u001b[0;32mand\u001b[0m \u001b[0msecond_watched\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mupdate_second_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mbcp\u001b[0m 
\u001b[0;34m=\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# if the only literal with a non-zero value is the other watched literal then\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0msecond_watched\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0;32mNone\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0;31m# if it is free, then the clause is a unit clause\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0munit_clause\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclauses\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mget_second_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mbcp\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mbreak\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0;31m# else (it is False) the clause is a conflict clause\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mconflict_clause\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32melif\u001b[0m \u001b[0msecond_watched\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0;32mFalse\u001b[0m \u001b[0;32mand\u001b[0m \u001b[0mfirst_watched\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mupdate_first_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m,\u001b[0m 
\u001b[0mmodel\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mbcp\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# if the only literal with a non-zero value is the other watched literal then\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mfirst_watched\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0;32mNone\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0;31m# if it is free, then the clause is a unit clause\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0munit_clause\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclauses\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mget_first_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mbcp\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mbreak\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0;31m# else (it is False) the clause is a conflict clause\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mconflict_clause\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0mbcp\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource unit_propagation" + ] + }, + 
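The watched-literal bookkeeping that `unit_propagation` relies on can be illustrated in isolation. The following is a minimal sketch, assuming integer literals (a positive `lit` means the variable `abs(lit)` is true) instead of the repository's `Expr` objects; the names `value` and `on_false_watch` are illustrative and are not part of `logic.py`:

```python
# A toy 2WL clause: a list of integer literals where positions 0 and 1
# are the two watched literals. This is a sketch of the idea only.

def value(lit, model):
    """True/False if the literal is assigned in model, None if its variable is free."""
    if abs(lit) not in model:
        return None
    return model[abs(lit)] if lit > 0 else not model[abs(lit)]

def on_false_watch(clause, idx, model):
    """React to the watched literal clause[idx] having just become False.
    Returns 'ok' (watch moved or clause satisfied), ('unit', lit) if the
    clause became unit with lit to propagate, or 'conflict'."""
    other = clause[1 - idx]
    if value(other, model) is True:
        return 'ok'  # clause already satisfied by the other watch
    for j in range(2, len(clause)):  # try to move the watch inwards
        if value(clause[j], model) is not False:
            clause[idx], clause[j] = clause[j], clause[idx]
            return 'ok'
    if value(other, model) is None:  # only the other watch is non-False
        return ('unit', other)
    return 'conflict'  # every literal in the clause is False
```

Because a watch only moves when its literal becomes False, each assignment inspects at most the two watched literals of a clause until the clause becomes unit or conflicting, and nothing needs to be undone when backjumping.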
{ + "cell_type": "code", + "execution_count": 9, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mclass\u001b[0m \u001b[0mTwoWLClauseDatabase\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m__init__\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__twl\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m{\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__watch_list\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mdefaultdict\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;32mlambda\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0mset\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mset\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mc\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mclauses\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0madd\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mc\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;32mNone\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mget_clauses\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m 
\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__twl\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mkeys\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mset_first_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mclause\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mnew_watching\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m>\u001b[0m \u001b[0;36m2\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__twl\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mnew_watching\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mset_second_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mclause\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mnew_watching\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m>\u001b[0m \u001b[0;36m2\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__twl\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m=\u001b[0m 
\u001b[0mnew_watching\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mget_first_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0;36m2\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mclause\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m>\u001b[0m \u001b[0;36m2\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__twl\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mclause\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mget_second_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m 
\u001b[0;34m==\u001b[0m \u001b[0;36m2\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mclause\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m>\u001b[0m \u001b[0;36m2\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__twl\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mclause\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mget_pos_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0ml\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__watch_list\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0ml\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mget_neg_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0ml\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m 
\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__watch_list\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0ml\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0madd\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mclause\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__twl\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__assign_watching_literals\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mw1\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mp1\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0minspect_literal\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mget_first_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mw2\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mp2\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0minspect_literal\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mget_second_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m 
\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__watch_list\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mw1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0madd\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mp1\u001b[0m \u001b[0;32melse\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__watch_list\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mw1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0madd\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mw1\u001b[0m \u001b[0;34m!=\u001b[0m \u001b[0mw2\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__watch_list\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mw2\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0madd\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mp2\u001b[0m \u001b[0;32melse\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__watch_list\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mw2\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0madd\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mremove\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mw1\u001b[0m\u001b[0;34m,\u001b[0m 
\u001b[0mp1\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0minspect_literal\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mget_first_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mw2\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mp2\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0minspect_literal\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mget_second_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdel\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__twl\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__watch_list\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mw1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdiscard\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mp1\u001b[0m \u001b[0;32melse\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__watch_list\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mw1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdiscard\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mw1\u001b[0m \u001b[0;34m!=\u001b[0m \u001b[0mw2\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m 
\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__watch_list\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mw2\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdiscard\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mp2\u001b[0m \u001b[0;32melse\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__watch_list\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mw2\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdiscard\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mupdate_first_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mclause\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# if a non-zero literal different from the other watched literal is found\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mfound\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mnew_watching\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__find_new_watching_literal\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mget_first_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mfound\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0;31m# then it will replace the watched literal\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m 
\u001b[0mw\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mp\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0minspect_literal\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mget_second_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__watch_list\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mw\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mremove\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mp\u001b[0m \u001b[0;32melse\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__watch_list\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mw\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mremove\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mset_second_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mnew_watching\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mw\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mp\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0minspect_literal\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnew_watching\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__watch_list\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mw\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0madd\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mp\u001b[0m \u001b[0;32melse\u001b[0m 
\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__watch_list\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mw\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0madd\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mupdate_second_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mclause\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# if a non-zero literal different from the other watched literal is found\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mfound\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mnew_watching\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__find_new_watching_literal\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mget_second_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mfound\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0;31m# then it will replace the watched literal\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mw\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mp\u001b[0m \u001b[0;34m=\u001b[0m 
\u001b[0minspect_literal\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mget_first_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__watch_list\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mw\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mremove\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mp\u001b[0m \u001b[0;32melse\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__watch_list\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mw\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mremove\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mset_first_watched\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mnew_watching\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mw\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mp\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0minspect_literal\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnew_watching\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__watch_list\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mw\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0madd\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mp\u001b[0m \u001b[0;32melse\u001b[0m 
\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__watch_list\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mw\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0madd\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m__find_new_watching_literal\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mclause\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mother_watched\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# if a non-zero literal different from the other watched literal is found\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m>\u001b[0m \u001b[0;36m2\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0ml\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mdisjuncts\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0ml\u001b[0m \u001b[0;34m!=\u001b[0m \u001b[0mother_watched\u001b[0m \u001b[0;32mand\u001b[0m \u001b[0mpl_true\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0ml\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# then it is 
returned\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0ml\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;32mNone\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m__assign_watching_literals\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mclause\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mNone\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m>\u001b[0m \u001b[0;36m2\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mmodel\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0;32mNone\u001b[0m \u001b[0;32mor\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mclause\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0mnext\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0ml\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0ml\u001b[0m 
\u001b[0;32min\u001b[0m \u001b[0mdisjuncts\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mpl_true\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0ml\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0;32mNone\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mnext\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0ml\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0ml\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mdisjuncts\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mclause\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mpl_true\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0ml\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource TwoWLClauseDatabase" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### VSIDS Branching Heuristic" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The early branching heuristics made use of all the information available from the data structures, namely the number of satisfied, unsatisfied and unassigned literals. These heuristics are updated during the search and also take into account the clauses that are learnt. \n", + "\n", + "More recently, a different kind of variable selection heuristic, referred to as ***Variable State Independent Decaying Sum*** (VSIDS), has been proposed by Chaff authors in [[4]](#cite-moskewicz2001chaff). One of the reasons for proposing this new heuristic was the introduction of lazy data structures, where the knowledge of the dynamic size of a clause is not accurate. Hence, the heuristics described above cannot be used. 
VSIDS selects the literal that appears most frequently over all the clauses, which means that one counter is required for each literal. Initially, all counters are set to zero. During the search, the counters only have to be updated when a new clause is recorded. The motivation was not so much to develop an accurate heuristic as to design a fast (but dynamically adapting) one. In fact, one of the key properties of this strategy is its very low overhead, which stems from being independent of the variable state." + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": { + "pycharm": {} + }, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0massign_decision_literal\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msymbols\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mG\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mdl\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m    \u001b[0mP\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mmax\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msymbols\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mkey\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mlambda\u001b[0m \u001b[0msymbol\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0msymbol\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m~\u001b[0m\u001b[0msymbol\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m    \u001b[0mvalue\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;32mTrue\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mP\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m>=\u001b[0m \u001b[0mscores\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m~\u001b[0m\u001b[0mP\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;32melse\u001b[0m 
\u001b[0;32mFalse\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m    \u001b[0msymbols\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mremove\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mP\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m    \u001b[0mmodel\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mP\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mvalue\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m    \u001b[0mG\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0madd_node\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mP\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mval\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mvalue\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mdl\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mdl\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource assign_decision_literal" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Restarts" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Solving NP-complete problems, such as SAT, naturally leads to heavy-tailed run times. To deal with this, SAT solvers frequently restart their search to avoid runs that take a disproportionately long time. 
What restarting means here is that the solver unsets all variables and starts the search using a different variable assignment order.\n", + "\n", + "While at first glance it might seem that restarts should be rare, and become rarer the longer the solver has been running, so that the SAT solver can actually finish solving the problem, the trend has been towards more aggressive (frequent) restarts.\n", + "\n", + "The reason why frequent restarts help solve problems faster is that while the solver does forget all current variable assignments, it does keep some information: specifically, it keeps learnt clauses, effectively sampling the search space, and it keeps the last assigned truth value of each variable, assigning it the same value the next time it is picked to be assigned." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Luby" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In this strategy, the number of conflicts between two restarts is based on the *Luby* sequence. The *Luby* restart sequence is interesting in that it was proven to be an optimal restart strategy for randomized search algorithms where the runs do not share information. While this is not true for SAT solving, as shown in [[5]](#cite-haim2014towards) and [[6]](#cite-huang2007effect), *Luby* restarts have been quite successful anyway.\n", + "\n", + "The exact description of *Luby* restarts is that the $i$-th restart happens after $u \\cdot Luby(i)$ conflicts, where $u$ is a constant and $Luby(i)$ is defined as:\n", + "\n", + "$$Luby(i) = \\begin{cases} \n", + "    2^{k-1} & i = 2^k - 1 \\\\\n", + "    Luby(i - 2^{k-1} + 1) & 2^{k-1} \\leq i < 2^k - 1\n", + "   \\end{cases}\n", + "$$\n", + "\n", + "A less exact but more intuitive description of the *Luby* sequence is that all numbers in it are powers of two, and after a number is seen for the second time, the next number is twice as big. 
The following are the first 16 numbers in the sequence:\n", + "\n", + "$$ (1,1,2,1,1,2,4,1,1,2,1,1,2,4,8,1,...) $$\n", + "\n", + "From the above, we can see that this restart strategy tends towards frequent restarts, but some runs are kept running for much longer, and there is no upper limit on the longest possible time between two restarts." + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0mluby\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mconflicts\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mrestarts\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mqueue_lbd\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0msum_lbd\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0munit\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;36m512\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# in the state-of-art tested with unit value 1, 2, 4, 6, 8, 12, 16, 32, 64, 128, 256 and 512\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m_luby\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mi\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mk\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mwhile\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mi\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0;34m(\u001b[0m\u001b[0;36m1\u001b[0m \u001b[0;34m<<\u001b[0m \u001b[0mk\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m-\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0;36m1\u001b[0m \u001b[0;34m<<\u001b[0m \u001b[0;34m(\u001b[0m\u001b[0mk\u001b[0m \u001b[0;34m-\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", 
+ "\u001b[0;34m\u001b[0m \u001b[0;32melif\u001b[0m \u001b[0;34m(\u001b[0m\u001b[0;36m1\u001b[0m \u001b[0;34m<<\u001b[0m \u001b[0;34m(\u001b[0m\u001b[0mk\u001b[0m \u001b[0;34m-\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m<=\u001b[0m \u001b[0mi\u001b[0m \u001b[0;34m<\u001b[0m \u001b[0;34m(\u001b[0m\u001b[0;36m1\u001b[0m \u001b[0;34m<<\u001b[0m \u001b[0mk\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m-\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0m_luby\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mi\u001b[0m \u001b[0;34m-\u001b[0m \u001b[0;34m(\u001b[0m\u001b[0;36m1\u001b[0m \u001b[0;34m<<\u001b[0m \u001b[0;34m(\u001b[0m\u001b[0mk\u001b[0m \u001b[0;34m-\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mk\u001b[0m \u001b[0;34m+=\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0munit\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0m_luby\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mrestarts\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mqueue_lbd\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource luby" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Glucose" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Glucose restarts were popularized by the *Glucose* solver, and it is an extremely aggressive, dynamic restart strategy. 
The idea behind it, described in [[7]](#ref-7), is that instead of waiting for a fixed number of conflicts, we restart when the most recently learnt clauses are, on average, bad.\n", + "\n", + "A bit more precisely, if there have been at least $X$ conflicts (and thus $X$ learnt clauses) since the last restart, and the average *Literal Block Distance* (LBD, a criterion to evaluate the quality of learnt clauses, as shown in [[8]](#ref-8)) of the last $X$ learnt clauses was at least $K$ times higher than the average LBD of all learnt clauses, it is time for another restart. Parameters $X$ and $K$ can be tweaked to achieve different restart frequencies, and they are usually kept quite small." + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0mglucose\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mconflicts\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mrestarts\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mqueue_lbd\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0msum_lbd\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mx\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;36m100\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mk\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;36m0.7\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# in the state-of-art tested with (x, k) as (50, 0.8) and (100, 0.7)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# if there were at least x conflicts since the last restart, and then the average LBD of the last\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# x learnt clauses was at least k times higher than the average LBD of all learnt clauses\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mqueue_lbd\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m>=\u001b[0m 
\u001b[0mx\u001b[0m \u001b[0;32mand\u001b[0m \u001b[0msum\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mqueue_lbd\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m/\u001b[0m \u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mqueue_lbd\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mk\u001b[0m \u001b[0;34m>\u001b[0m \u001b[0msum_lbd\u001b[0m \u001b[0;34m/\u001b[0m \u001b[0mconflicts\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource glucose" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "pycharm": {} + }, + "source": [ + "## Experimental Results" + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": {}, + "outputs": [], + "source": [ + "from csp import *" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Australia" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### CSP" + ] + }, + { + "cell_type": "code", + "execution_count": 23, + "metadata": {}, + "outputs": [], + "source": [ + "australia_csp = MapColoringCSP(list('RGB'), \"\"\"SA: WA NT Q NSW V; NT: WA Q; NSW: Q V; T: \"\"\")" + ] + }, + { + "cell_type": "code", + "execution_count": 24, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 154 µs, sys: 37 µs, total: 191 µs\n", + "Wall time: 194 µs\n" + ] + }, + { + "data": { + "text/plain": [ + "'AC3b with DOM J UP needs 72 consistency-checks'" + ] + }, + "execution_count": 24, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time _, checks = AC3b(australia_csp, arc_heuristic=dom_j_up)\n", + "f'AC3b with DOM J UP needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 25, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 263 µs, sys: 0 ns, total: 263 µs\n", + "Wall time: 268 
µs\n" + ] + }, + { + "data": { + "text/plain": [ + "{'Q': 'R', 'SA': 'G', 'NSW': 'B', 'NT': 'B', 'V': 'R', 'WA': 'R'}" + ] + }, + "execution_count": 25, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time backtracking_search(australia_csp, select_unassigned_variable=mrv, inference=forward_checking)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### SAT" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [], + "source": [ + "australia_sat = MapColoringSAT(list('RGB'), \"\"\"SA: WA NT Q NSW V; NT: WA Q; NSW: Q V; T: \"\"\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "##### DPLL" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 43.3 ms, sys: 0 ns, total: 43.3 ms\n", + "Wall time: 41.5 ms\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(australia_sat, branching_heuristic=no_branching_heuristic)" + ] + }, + { + "cell_type": "code", + "execution_count": 28, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 36.4 ms, sys: 0 ns, total: 36.4 ms\n", + "Wall time: 35.3 ms\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(australia_sat, branching_heuristic=moms)" + ] + }, + { + "cell_type": "code", + "execution_count": 29, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 36.1 ms, sys: 3.9 ms, total: 40 ms\n", + "Wall time: 39.2 ms\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(australia_sat, branching_heuristic=momsf)" + ] + }, + { + "cell_type": "code", + "execution_count": 30, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 45.2 ms, sys: 0 ns, total: 45.2 ms\n", + "Wall time: 44.2 
ms\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(australia_sat, branching_heuristic=posit)" + ] + }, + { + "cell_type": "code", + "execution_count": 31, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 31.2 ms, sys: 0 ns, total: 31.2 ms\n", + "Wall time: 30.5 ms\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(australia_sat, branching_heuristic=zm)" + ] + }, + { + "cell_type": "code", + "execution_count": 32, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 57 ms, sys: 0 ns, total: 57 ms\n", + "Wall time: 55.9 ms\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(australia_sat, branching_heuristic=dlis)" + ] + }, + { + "cell_type": "code", + "execution_count": 33, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 51.8 ms, sys: 0 ns, total: 51.8 ms\n", + "Wall time: 50.7 ms\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(australia_sat, branching_heuristic=dlcs)" + ] + }, + { + "cell_type": "code", + "execution_count": 34, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 40.6 ms, sys: 0 ns, total: 40.6 ms\n", + "Wall time: 39.3 ms\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(australia_sat, branching_heuristic=jw)" + ] + }, + { + "cell_type": "code", + "execution_count": 35, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 43.2 ms, sys: 1.81 ms, total: 45.1 ms\n", + "Wall time: 43.9 ms\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(australia_sat, branching_heuristic=jw2)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "##### CDCL" + ] + }, + { + "cell_type": "code", + "execution_count": 36, + "metadata": {}, + "outputs": [ 
+ { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 32.9 ms, sys: 16 µs, total: 33 ms\n", + "Wall time: 31.6 ms\n" + ] + } + ], + "source": [ + "%time model = cdcl_satisfiable(australia_sat)" + ] + }, + { + "cell_type": "code", + "execution_count": 37, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{NSW_B, NT_B, Q_G, SA_R, V_G, WA_G}" + ] + }, + "execution_count": 37, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "{var for var, val in model.items() if val}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### France" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### CSP" + ] + }, + { + "cell_type": "code", + "execution_count": 38, + "metadata": {}, + "outputs": [], + "source": [ + "france_csp = MapColoringCSP(list('RGBY'),\n", + " \"\"\"AL: LO FC; AQ: MP LI PC; AU: LI CE BO RA LR MP; BO: CE IF CA FC RA\n", + " AU; BR: NB PL; CA: IF PI LO FC BO; CE: PL NB NH IF BO AU LI PC; FC: BO\n", + " CA LO AL RA; IF: NH PI CA BO CE; LI: PC CE AU MP AQ; LO: CA AL FC; LR:\n", + " MP AU RA PA; MP: AQ LI AU LR; NB: NH CE PL BR; NH: PI IF CE NB; NO:\n", + " PI; PA: LR RA; PC: PL CE LI AQ; PI: NH NO CA IF; PL: BR NB CE PC; RA:\n", + " AU BO FC PA LR\"\"\")" + ] + }, + { + "cell_type": "code", + "execution_count": 39, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 599 µs, sys: 112 µs, total: 711 µs\n", + "Wall time: 716 µs\n" + ] + }, + { + "data": { + "text/plain": [ + "'AC3b with DOM J UP needs 516 consistency-checks'" + ] + }, + "execution_count": 39, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time _, checks = AC3b(france_csp, arc_heuristic=dom_j_up)\n", + "f'AC3b with DOM J UP needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 40, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + 
"output_type": "stream", + "text": [ + "CPU times: user 560 µs, sys: 0 ns, total: 560 µs\n", + "Wall time: 563 µs\n" + ] + }, + { + "data": { + "text/plain": [ + "{'NH': 'R',\n", + " 'NB': 'G',\n", + " 'CE': 'B',\n", + " 'PL': 'R',\n", + " 'BR': 'B',\n", + " 'IF': 'G',\n", + " 'PI': 'B',\n", + " 'BO': 'R',\n", + " 'CA': 'Y',\n", + " 'FC': 'G',\n", + " 'LO': 'R',\n", + " 'PC': 'G',\n", + " 'AU': 'G',\n", + " 'AL': 'B',\n", + " 'RA': 'B',\n", + " 'LR': 'R',\n", + " 'LI': 'R',\n", + " 'AQ': 'B',\n", + " 'MP': 'Y',\n", + " 'PA': 'G',\n", + " 'NO': 'R'}" + ] + }, + "execution_count": 40, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time backtracking_search(france_csp, select_unassigned_variable=mrv, inference=forward_checking)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### SAT" + ] + }, + { + "cell_type": "code", + "execution_count": 41, + "metadata": {}, + "outputs": [], + "source": [ + "france_sat = MapColoringSAT(list('RGBY'),\n", + " \"\"\"AL: LO FC; AQ: MP LI PC; AU: LI CE BO RA LR MP; BO: CE IF CA FC RA\n", + " AU; BR: NB PL; CA: IF PI LO FC BO; CE: PL NB NH IF BO AU LI PC; FC: BO\n", + " CA LO AL RA; IF: NH PI CA BO CE; LI: PC CE AU MP AQ; LO: CA AL FC; LR:\n", + " MP AU RA PA; MP: AQ LI AU LR; NB: NH CE PL BR; NH: PI IF CE NB; NO:\n", + " PI; PA: LR RA; PC: PL CE LI AQ; PI: NH NO CA IF; PL: BR NB CE PC; RA:\n", + " AU BO FC PA LR\"\"\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "##### DPLL" + ] + }, + { + "cell_type": "code", + "execution_count": 42, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 3.32 s, sys: 0 ns, total: 3.32 s\n", + "Wall time: 3.32 s\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(france_sat, branching_heuristic=no_branching_heuristic)" + ] + }, + { + "cell_type": "code", + "execution_count": 43, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + 
"output_type": "stream", + "text": [ + "CPU times: user 3.17 s, sys: 390 µs, total: 3.17 s\n", + "Wall time: 3.17 s\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(france_sat, branching_heuristic=moms)" + ] + }, + { + "cell_type": "code", + "execution_count": 44, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 3.49 s, sys: 0 ns, total: 3.49 s\n", + "Wall time: 3.49 s\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(france_sat, branching_heuristic=momsf)" + ] + }, + { + "cell_type": "code", + "execution_count": 45, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 3.5 s, sys: 0 ns, total: 3.5 s\n", + "Wall time: 3.5 s\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(france_sat, branching_heuristic=posit)" + ] + }, + { + "cell_type": "code", + "execution_count": 46, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 3 s, sys: 2.6 ms, total: 3.01 s\n", + "Wall time: 3.01 s\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(france_sat, branching_heuristic=zm)" + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 12.5 s, sys: 11.4 ms, total: 12.5 s\n", + "Wall time: 12.5 s\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(france_sat, branching_heuristic=dlis)" + ] + }, + { + "cell_type": "code", + "execution_count": 48, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 3.41 s, sys: 0 ns, total: 3.41 s\n", + "Wall time: 3.41 s\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(france_sat, branching_heuristic=dlcs)" + ] + }, + { + "cell_type": "code", + "execution_count": 49, + "metadata": {}, + "outputs": [ + 
{ + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 2.92 s, sys: 3.89 ms, total: 2.92 s\n", + "Wall time: 2.92 s\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(france_sat, branching_heuristic=jw)" + ] + }, + { + "cell_type": "code", + "execution_count": 50, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 3.71 s, sys: 0 ns, total: 3.71 s\n", + "Wall time: 3.73 s\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(france_sat, branching_heuristic=jw2)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "##### CDCL" + ] + }, + { + "cell_type": "code", + "execution_count": 51, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 159 ms, sys: 3.94 ms, total: 163 ms\n", + "Wall time: 162 ms\n" + ] + } + ], + "source": [ + "%time model = cdcl_satisfiable(france_sat)" + ] + }, + { + "cell_type": "code", + "execution_count": 52, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{AL_G,\n", + " AQ_G,\n", + " AU_R,\n", + " BO_G,\n", + " BR_Y,\n", + " CA_R,\n", + " CE_B,\n", + " FC_B,\n", + " IF_Y,\n", + " LI_Y,\n", + " LO_Y,\n", + " LR_G,\n", + " MP_B,\n", + " NB_R,\n", + " NH_G,\n", + " NO_Y,\n", + " PA_B,\n", + " PC_R,\n", + " PI_B,\n", + " PL_G,\n", + " RA_Y}" + ] + }, + "execution_count": 52, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "{var for var, val in model.items() if val}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### USA" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### CSP" + ] + }, + { + "cell_type": "code", + "execution_count": 53, + "metadata": {}, + "outputs": [], + "source": [ + "usa_csp = MapColoringCSP(list('RGBY'),\n", + " \"\"\"WA: OR ID; OR: ID NV CA; CA: NV AZ; NV: ID UT AZ; ID: MT WY UT;\n", + " UT: WY CO AZ; MT: ND SD WY; WY: SD NE 
CO; CO: NE KA OK NM; NM: OK TX AZ;\n", + " ND: MN SD; SD: MN IA NE; NE: IA MO KA; KA: MO OK; OK: MO AR TX;\n", + " TX: AR LA; MN: WI IA; IA: WI IL MO; MO: IL KY TN AR; AR: MS TN LA;\n", + " LA: MS; WI: MI IL; IL: IN KY; IN: OH KY; MS: TN AL; AL: TN GA FL;\n", + " MI: OH IN; OH: PA WV KY; KY: WV VA TN; TN: VA NC GA; GA: NC SC FL;\n", + " PA: NY NJ DE MD WV; WV: MD VA; VA: MD DC NC; NC: SC; NY: VT MA CT NJ;\n", + " NJ: DE; DE: MD; MD: DC; VT: NH MA; MA: NH RI CT; CT: RI; ME: NH;\n", + " HI: ; AK: \"\"\")" + ] + }, + { + "cell_type": "code", + "execution_count": 54, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 1.58 ms, sys: 17 µs, total: 1.6 ms\n", + "Wall time: 1.6 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "'AC3b with DOM J UP needs 1284 consistency-checks'" + ] + }, + "execution_count": 54, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time _, checks = AC3b(usa_csp, arc_heuristic=dom_j_up)\n", + "f'AC3b with DOM J UP needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 55, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 2.15 ms, sys: 0 ns, total: 2.15 ms\n", + "Wall time: 2.15 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "{'NM': 'R',\n", + " 'TX': 'G',\n", + " 'OK': 'B',\n", + " 'AR': 'R',\n", + " 'MO': 'G',\n", + " 'KA': 'R',\n", + " 'LA': 'B',\n", + " 'NE': 'B',\n", + " 'TN': 'B',\n", + " 'MS': 'G',\n", + " 'IA': 'R',\n", + " 'SD': 'G',\n", + " 'IL': 'B',\n", + " 'CO': 'G',\n", + " 'MN': 'B',\n", + " 'KY': 'R',\n", + " 'AL': 'R',\n", + " 'GA': 'G',\n", + " 'FL': 'B',\n", + " 'VA': 'G',\n", + " 'WI': 'G',\n", + " 'IN': 'G',\n", + " 'NC': 'R',\n", + " 'WV': 'B',\n", + " 'OH': 'Y',\n", + " 'PA': 'R',\n", + " 'MD': 'Y',\n", + " 'SC': 'B',\n", + " 'MI': 'R',\n", + " 'DC': 'R',\n", + " 'DE': 'G',\n", + " 'WY': 'R',\n", + " 'ND': 'R',\n", + " 'NJ': 
'B',\n", + " 'NY': 'G',\n", + " 'UT': 'B',\n", + " 'AZ': 'G',\n", + " 'ID': 'G',\n", + " 'MT': 'B',\n", + " 'NV': 'R',\n", + " 'CA': 'B',\n", + " 'OR': 'Y',\n", + " 'WA': 'R',\n", + " 'VT': 'R',\n", + " 'MA': 'B',\n", + " 'NH': 'G',\n", + " 'CT': 'R',\n", + " 'RI': 'G',\n", + " 'ME': 'R'}" + ] + }, + "execution_count": 55, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time backtracking_search(usa_csp, select_unassigned_variable=mrv, inference=forward_checking)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### SAT" + ] + }, + { + "cell_type": "code", + "execution_count": 56, + "metadata": {}, + "outputs": [], + "source": [ + "usa_sat = MapColoringSAT(list('RGBY'),\n", + " \"\"\"WA: OR ID; OR: ID NV CA; CA: NV AZ; NV: ID UT AZ; ID: MT WY UT;\n", + " UT: WY CO AZ; MT: ND SD WY; WY: SD NE CO; CO: NE KA OK NM; NM: OK TX AZ;\n", + " ND: MN SD; SD: MN IA NE; NE: IA MO KA; KA: MO OK; OK: MO AR TX;\n", + " TX: AR LA; MN: WI IA; IA: WI IL MO; MO: IL KY TN AR; AR: MS TN LA;\n", + " LA: MS; WI: MI IL; IL: IN KY; IN: OH KY; MS: TN AL; AL: TN GA FL;\n", + " MI: OH IN; OH: PA WV KY; KY: WV VA TN; TN: VA NC GA; GA: NC SC FL;\n", + " PA: NY NJ DE MD WV; WV: MD VA; VA: MD DC NC; NC: SC; NY: VT MA CT NJ;\n", + " NJ: DE; DE: MD; MD: DC; VT: NH MA; MA: NH RI CT; CT: RI; ME: NH;\n", + " HI: ; AK: \"\"\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "##### DPLL" + ] + }, + { + "cell_type": "code", + "execution_count": 57, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 46.2 s, sys: 0 ns, total: 46.2 s\n", + "Wall time: 46.2 s\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(usa_sat, branching_heuristic=no_branching_heuristic)" + ] + }, + { + "cell_type": "code", + "execution_count": 58, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 54.6 s, sys: 0 
ns, total: 54.6 s\n", + "Wall time: 54.6 s\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(usa_sat, branching_heuristic=moms)" + ] + }, + { + "cell_type": "code", + "execution_count": 59, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 44 s, sys: 0 ns, total: 44 s\n", + "Wall time: 44 s\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(usa_sat, branching_heuristic=momsf)" + ] + }, + { + "cell_type": "code", + "execution_count": 60, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 43.8 s, sys: 0 ns, total: 43.8 s\n", + "Wall time: 43.8 s\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(usa_sat, branching_heuristic=posit)" + ] + }, + { + "cell_type": "code", + "execution_count": 61, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 52.6 s, sys: 0 ns, total: 52.6 s\n", + "Wall time: 52.6 s\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(usa_sat, branching_heuristic=zm)" + ] + }, + { + "cell_type": "code", + "execution_count": 62, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 57 s, sys: 0 ns, total: 57 s\n", + "Wall time: 57 s\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(usa_sat, branching_heuristic=dlis)" + ] + }, + { + "cell_type": "code", + "execution_count": 63, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 43.8 s, sys: 0 ns, total: 43.8 s\n", + "Wall time: 43.8 s\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(usa_sat, branching_heuristic=dlcs)" + ] + }, + { + "cell_type": "code", + "execution_count": 64, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 53.3 s, sys: 3.82 ms, 
total: 53.3 s\n", + "Wall time: 53.3 s\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(usa_sat, branching_heuristic=jw)" + ] + }, + { + "cell_type": "code", + "execution_count": 65, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 44 s, sys: 3.99 ms, total: 44 s\n", + "Wall time: 44 s\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(usa_sat, branching_heuristic=jw2)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "##### CDCL" + ] + }, + { + "cell_type": "code", + "execution_count": 66, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 559 ms, sys: 0 ns, total: 559 ms\n", + "Wall time: 558 ms\n" + ] + } + ], + "source": [ + "%time model = cdcl_satisfiable(usa_sat)" + ] + }, + { + "cell_type": "code", + "execution_count": 67, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{AL_B,\n", + " AR_B,\n", + " AZ_R,\n", + " CA_B,\n", + " CO_R,\n", + " CT_Y,\n", + " DC_G,\n", + " DE_Y,\n", + " FL_Y,\n", + " GA_R,\n", + " IA_B,\n", + " ID_Y,\n", + " IL_G,\n", + " IN_R,\n", + " KA_G,\n", + " KY_B,\n", + " LA_G,\n", + " MA_G,\n", + " MD_R,\n", + " ME_G,\n", + " MI_G,\n", + " MN_Y,\n", + " MO_R,\n", + " MS_Y,\n", + " MT_B,\n", + " NC_B,\n", + " ND_G,\n", + " NE_Y,\n", + " NH_Y,\n", + " NJ_G,\n", + " NM_G,\n", + " NV_G,\n", + " NY_R,\n", + " OH_Y,\n", + " OK_Y,\n", + " OR_R,\n", + " PA_B,\n", + " RI_B,\n", + " SC_Y,\n", + " SD_R,\n", + " TN_G,\n", + " TX_R,\n", + " UT_B,\n", + " VA_Y,\n", + " VT_B,\n", + " WA_B,\n", + " WI_R,\n", + " WV_G,\n", + " WY_G}" + ] + }, + "execution_count": 67, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "{var for var, val in model.items() if val}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Zebra Puzzle" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + 
"#### CSP" + ] + }, + { + "cell_type": "code", + "execution_count": 76, + "metadata": {}, + "outputs": [], + "source": [ + "zebra_csp = Zebra()" + ] + }, + { + "cell_type": "code", + "execution_count": 77, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "{'Milk': 3, 'Norwegian': 1}\n" + ] + } + ], + "source": [ + "zebra_csp.display(zebra_csp.infer_assignment())" + ] + }, + { + "cell_type": "code", + "execution_count": 78, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 2.04 ms, sys: 4 µs, total: 2.05 ms\n", + "Wall time: 2.05 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "'AC3b with DOM J UP needs 737 consistency-checks'" + ] + }, + "execution_count": 78, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time _, checks = AC3b(zebra_csp, arc_heuristic=dom_j_up)\n", + "f'AC3b with DOM J UP needs {checks} consistency-checks'" + ] + }, + { + "cell_type": "code", + "execution_count": 71, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "{'Blue': 2, 'Milk': 3, 'Norwegian': 1}\n" + ] + } + ], + "source": [ + "zebra_csp.display(zebra_csp.infer_assignment())" + ] + }, + { + "cell_type": "code", + "execution_count": 72, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 2.13 ms, sys: 0 ns, total: 2.13 ms\n", + "Wall time: 2.14 ms\n" + ] + }, + { + "data": { + "text/plain": [ + "{'Milk': 3,\n", + " 'Blue': 2,\n", + " 'Norwegian': 1,\n", + " 'Coffee': 5,\n", + " 'Green': 5,\n", + " 'Ivory': 4,\n", + " 'Red': 3,\n", + " 'Yellow': 1,\n", + " 'Kools': 1,\n", + " 'Englishman': 3,\n", + " 'Horse': 2,\n", + " 'Tea': 2,\n", + " 'Ukranian': 2,\n", + " 'Spaniard': 4,\n", + " 'Dog': 4,\n", + " 'Japanese': 5,\n", + " 'Parliaments': 5,\n", + " 'LuckyStrike': 4,\n", + " 'OJ': 4,\n", + " 'Water': 1,\n", + " 'Chesterfields': 2,\n", 
+ " 'Winston': 3,\n", + " 'Snails': 3,\n", + " 'Fox': 1,\n", + " 'Zebra': 5}" + ] + }, + "execution_count": 72, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%time backtracking_search(zebra_csp, select_unassigned_variable=mrv, inference=forward_checking)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### SAT" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": {}, + "outputs": [], + "source": [ + "zebra_sat = associate('&', map(to_cnf, map(expr, filter(lambda line: line[0] not in ('c', 'p'), open('aima-data/zebra.cnf').read().splitlines()))))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "##### DPLL" + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 13min 6s, sys: 2.44 ms, total: 13min 6s\n", + "Wall time: 13min 6s\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(zebra_sat, branching_heuristic=no_branching_heuristic)" + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 15min 4s, sys: 22.4 ms, total: 15min 4s\n", + "Wall time: 15min 4s\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(zebra_sat, branching_heuristic=moms)" + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 22min 28s, sys: 40 ms, total: 22min 28s\n", + "Wall time: 22min 28s\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(zebra_sat, branching_heuristic=momsf)" + ] + }, + { + "cell_type": "code", + "execution_count": 18, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 22min 25s, sys: 36 ms, total: 22min 25s\n", + 
"Wall time: 22min 25s\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(zebra_sat, branching_heuristic=posit)" + ] + }, + { + "cell_type": "code", + "execution_count": 19, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 14min 52s, sys: 32 ms, total: 14min 52s\n", + "Wall time: 14min 52s\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(zebra_sat, branching_heuristic=zm)" + ] + }, + { + "cell_type": "code", + "execution_count": 19, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 2min 31s, sys: 9.87 ms, total: 2min 31s\n", + "Wall time: 2min 32s\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(zebra_sat, branching_heuristic=dlis)" + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 4min 27s, sys: 12 ms, total: 4min 27s\n", + "Wall time: 4min 27s\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(zebra_sat, branching_heuristic=dlcs)" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 6min 55s, sys: 39.2 ms, total: 6min 55s\n", + "Wall time: 6min 56s\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(zebra_sat, branching_heuristic=jw)" + ] + }, + { + "cell_type": "code", + "execution_count": 75, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 8min 57s, sys: 7.94 ms, total: 8min 57s\n", + "Wall time: 8min 57s\n" + ] + } + ], + "source": [ + "%time model = dpll_satisfiable(zebra_sat, branching_heuristic=jw2)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "##### CDCL" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + 
"metadata": { + "pycharm": {} + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "CPU times: user 1.64 s, sys: 0 ns, total: 1.64 s\n", + "Wall time: 1.64 s\n" + ] + } + ], + "source": [ + "%time model = cdcl_satisfiable(zebra_sat)" + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{Englishman_house2,\n", + " Englishman_milk,\n", + " Englishman_oldGold,\n", + " Englishman_redHouse,\n", + " Englishman_snails,\n", + " Japanese_coffee,\n", + " Japanese_greenHouse,\n", + " Japanese_house4,\n", + " Japanese_parliament,\n", + " Japanese_zebra,\n", + " Norwegian_fox,\n", + " Norwegian_house0,\n", + " Norwegian_kool,\n", + " Norwegian_water,\n", + " Norwegian_yellowHouse,\n", + " Spaniard_dog,\n", + " Spaniard_house3,\n", + " Spaniard_ivoryHouse,\n", + " Spaniard_luckyStrike,\n", + " Spaniard_orangeJuice,\n", + " Ukrainian_blueHouse,\n", + " Ukrainian_chesterfield,\n", + " Ukrainian_horse,\n", + " Ukrainian_house1,\n", + " Ukrainian_tea}" + ] + }, + "execution_count": 8, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "{var for var, val in model.items() if val and var.op.startswith(('Englishman', 'Japanese', 'Norwegian', 'Spaniard', 'Ukrainian'))}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## References\n", + "\n", + "[[1]](#ref-1) Freeman, Jon William. 1995. _Improvements to propositional satisfiability search algorithms_.\n", + "\n", + "[[2]](#ref-2) Zabih, Ramin and McAllester, David A. 1988. _A Rearrangement Search Strategy for Determining Propositional Satisfiability_.\n", + "\n", + "[[3]](#ref-3) Jeroslow, Robert G and Wang, Jinchang. 1990. _Solving propositional satisfiability problems_.\n", + "\n", + "[[4]](#ref-4) Moskewicz, Matthew W and Madigan, Conor F and Zhao, Ying and Zhang, Lintao and Malik, Sharad. 2001. 
_Chaff: Engineering an efficient SAT solver_.\n", + "\n", + "[[5]](#ref-5) Haim, Shai and Heule, Marijn. 2014. _Towards ultra rapid restarts_.\n", + "\n", + "[[6]](#ref-6) Huang, Jinbo and others. 2007. _The Effect of Restarts on the Efficiency of Clause Learning_.\n", + "\n", + "[[7]](#ref-7) Audemard, Gilles and Simon, Laurent. 2012. _Refining restarts strategies for SAT and UNSAT_.\n", + "\n", + "[[8]](#ref-8) Audemard, Gilles and Simon, Laurent. 2009. _Predicting learnt clauses quality in modern SAT solvers_." + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.7.3" + } + }, + "nbformat": 4, + "nbformat_minor": 4 +} diff --git a/index.ipynb b/index.ipynb index 2ae5742bb..f9da121f2 100644 --- a/index.ipynb +++ b/index.ipynb @@ -18,7 +18,7 @@ "\n", "3. [**Search**](./search.ipynb)\n", "\n", - "4. [**Search - 4th edition**](./search-4e.ipynb)\n", + "4. [**Search - 4th edition**](./search4e.ipynb)\n", "\n", "4. [**Games**](./games.ipynb)\n", "\n", diff --git a/intro.ipynb b/intro.ipynb index 738ffb53d..896ed9498 100644 --- a/intro.ipynb +++ b/intro.ipynb @@ -17,9 +17,9 @@ " \n", "## What version of Python?\n", " \n", - "The code is tested in Python [3.4](https://www.python.org/download/releases/3.4.3/) and [3.5](https://www.python.org/downloads/release/python-351/). If you try a different version of Python 3 and find a problem, please report it as an [Issue](https://github.com/aimacode/aima-python/issues). There is an incomplete [legacy branch](https://github.com/aimacode/aima-python/tree/aima3python2) for those who must run in Python 2. 
\n", + "The code is tested in Python [3.4](https://www.python.org/download/releases/3.4.3/) and [3.5](https://www.python.org/downloads/release/python-351/). If you try a different version of Python 3 and find a problem, please report it as an [Issue](https://github.com/aimacode/aima-python/issues).\n", " \n", - "We recommend the [Anaconda](https://www.continuum.io/downloads) distribution of Python 3.5. It comes with additional tools like the powerful IPython interpreter, the Jupyter Notebook and many helpful packages for scientific computing. After installing Anaconda, you will be good to go to run all the code and all the IPython notebooks. \n", + "We recommend the [Anaconda](https://www.anaconda.com/download/) distribution of Python 3.5. It comes with additional tools like the powerful IPython interpreter, the Jupyter Notebook and many helpful packages for scientific computing. After installing Anaconda, you will be good to go to run all the code and all the IPython notebooks. \n", "\n", "## IPython notebooks \n", " \n", @@ -62,7 +62,7 @@ "source": [ "From there, the notebook alternates explanations with examples of use. You can run the examples as they are, and you can modify the code cells (or add new cells) and run your own examples. If you have some really good examples to add, you can make a github pull request.\n", "\n", - "If you want to see the source code of a function, you can open a browser or editor and see it in another window, or from within the notebook you can use the IPython magic function `%psource` (for \"print source\") or the function `psource` from `notebook.py`. Also, if the algorithm has pseudocode, you can read it by calling the `pseudocode` function with input the name of the algorithm." 
+ "If you want to see the source code of a function, you can open a browser or editor and see it in another window, or from within the notebook you can use the IPython magic function `%psource` (for \"print source\") or the function `psource` from `notebook.py`. Also, if the algorithm has pseudocode available, you can read it by calling the `pseudocode` function with the name of the algorithm passed as a parameter." ] }, { diff --git a/ipyviews.py b/ipyviews.py index fbdc9a580..b304af7bb 100644 --- a/ipyviews.py +++ b/ipyviews.py @@ -6,7 +6,6 @@ import copy import __main__ - # ______________________________________________________________________________ # Continuous environment diff --git a/knowledge.ipynb b/knowledge.ipynb deleted file mode 100644 index 2ffb20362..000000000 --- a/knowledge.ipynb +++ /dev/null @@ -1,677 +0,0 @@ -{ - "cells": [ - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# KNOWLEDGE\n", - "\n", - "The [knowledge](https://github.com/aimacode/aima-python/blob/master/knowledge.py) module covers **Chapter 19: Knowledge in Learning** from Stuart Russel's and Peter Norvig's book *Artificial Intelligence: A Modern Approach*.\n", - "\n", - "Execute the cell below to get started." - ] - }, - { - "cell_type": "code", - "execution_count": 1, - "metadata": { - "collapsed": true - }, - "outputs": [], - "source": [ - "from knowledge import *\n", - "\n", - "from notebook import pseudocode, psource" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## CONTENTS\n", - "\n", - "* Overview\n", - "* Current-Best Learning\n", - "* Version-Space Learning" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## OVERVIEW\n", - "\n", - "Like the [learning module](https://github.com/aimacode/aima-python/blob/master/learning.ipynb), this chapter focuses on methods for generating a model/hypothesis for a domain. 
Unlike though the learning chapter, here we use prior knowledge to help us learn from new experiences and find a proper hypothesis.\n", - "\n", - "### First-Order Logic\n", - "\n", - "Usually knowledge in this field is represented as **first-order logic**, a type of logic that uses variables and quantifiers in logical sentences. Hypotheses are represented by logical sentences with variables, while examples are logical sentences with set values instead of variables. The goal is to assign a value to a special first-order logic predicate, called **goal predicate**, for new examples given a hypothesis. We learn this hypothesis by infering knowledge from some given examples.\n", - "\n", - "### Representation\n", - "\n", - "In this module, we use dictionaries to represent examples, with keys the attribute names and values the corresponding example values. Examples also have an extra boolean field, 'GOAL', for the goal predicate. A hypothesis is represented as a list of dictionaries. Each dictionary in that list represents a disjunction. Inside these dictionaries/disjunctions we have conjunctions.\n", - "\n", - "For example, say we want to predict if an animal (cat or dog) will take an umbrella given whether or not it rains or the animal wears a coat. The goal value is 'take an umbrella' and is denoted by the key 'GOAL'. An example:\n", - "\n", - "`{'Species': 'Cat', 'Coat': 'Yes', 'Rain': 'Yes', 'GOAL': True}`\n", - "\n", - "A hypothesis can be the following:\n", - "\n", - "`[{'Species': 'Cat'}]`\n", - "\n", - "which means an animal will take an umbrella if and only if it is a cat.\n", - "\n", - "### Consistency\n", - "\n", - "We say that an example `e` is **consistent** with an hypothesis `h` if the assignment from the hypothesis for `e` is the same as `e['GOAL']`. If the above example and hypothesis are `e` and `h` respectively, then `e` is consistent with `h` since `e['Species'] == 'Cat'`. 
For `e = {'Species': 'Dog', 'Coat': 'Yes', 'Rain': 'Yes', 'GOAL': True}`, the example is no longer consistent with `h`, since the value assigned to `e` is *False* while `e['GOAL']` is *True*." - ] - }, - { - "cell_type": "markdown", - "metadata": { - "collapsed": true - }, - "source": [ - "## CURRENT-BEST LEARNING\n", - "\n", - "### Overview\n", - "\n", - "In **Current-Best Learning**, we start with a hypothesis and we refine it as we iterate through the examples. For each example, there are three possible outcomes. The example is consistent with the hypothesis, the example is a **false positive** (real value is false but got predicted as true) and **false negative** (real value is true but got predicted as false). Depending on the outcome we refine the hypothesis accordingly:\n", - "\n", - "* Consistent: We do not change the hypothesis and we move on to the next example.\n", - "\n", - "* False Positive: We **specialize** the hypothesis, which means we add a conjunction.\n", - "\n", - "* False Negative: We **generalize** the hypothesis, either by removing a conjunction or a disjunction, or by adding a disjunction.\n", - "\n", - "When specializing and generalizing, we should take care to not create inconsistencies with previous examples. To avoid that caveat, backtracking is needed. Thankfully, there is not just one specialization or generalization, so we have a lot to choose from. We will go through all the specialization/generalizations and we will refine our hypothesis as the first specialization/generalization consistent with all the examples seen up to that point." 
- ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Pseudocode" - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "metadata": {}, - "outputs": [ - { - "data": { - "text/markdown": [ - "### AIMA3e\n", - "__function__ Current-Best-Learning(_examples_, _h_) __returns__ a hypothesis or fail \n", - " __if__ _examples_ is empty __then__ \n", - "   __return__ _h_ \n", - " _e_ ← First(_examples_) \n", - " __if__ _e_ is consistent with _h_ __then__ \n", - "   __return__ Current-Best-Learning(Rest(_examples_), _h_) \n", - " __else if__ _e_ is a false positive for _h_ __then__ \n", - "   __for each__ _h'_ __in__ specializations of _h_ consistent with _examples_ seen so far __do__ \n", - "     _h''_ ← Current-Best-Learning(Rest(_examples_), _h'_) \n", - "     __if__ _h''_ ≠ _fail_ __then return__ _h''_ \n", - " __else if__ _e_ is a false negative for _h_ __then__ \n", - "   __for each__ _h'_ __in__ generalizations of _h_ consistent with _examples_ seen so far __do__ \n", - "     _h''_ ← Current-Best-Learning(Rest(_examples_), _h'_) \n", - "     __if__ _h''_ ≠ _fail_ __then return__ _h''_ \n", - " __return__ _fail_ \n", - "\n", - "---\n", - "__Figure ??__ The current-best-hypothesis learning algorithm. It searches for a consistent hypothesis that fits all the examples and backtracks when no consistent specialization/generalization can be found. To start the algorithm, any hypothesis can be passed in; it will be specialized or generalized as needed." - ], - "text/plain": [ - "" - ] - }, - "execution_count": 2, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "pseudocode('Current-Best-Learning')" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Implementation\n", - "\n", - "As mentioned previously, examples are dictionaries (with keys the attribute names) and hypotheses are lists of dictionaries (each dictionary is a disjunction). 
Also, in the hypothesis, we denote the *NOT* operation with an exclamation mark (!).\n", - "\n", - "We have functions to calculate the list of all specializations/generalizations, to check if an example is consistent/false positive/false negative with a hypothesis. We also have an auxiliary function to add a disjunction (or operation) to a hypothesis, and two other functions to check consistency of all (or just the negative) examples.\n", - "\n", - "You can read the source by running the cell below:" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "psource(current_best_learning, specializations, generalizations)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "You can view the auxiliary functions in the [knowledge module](https://github.com/aimacode/aima-python/blob/master/knowledge.py). A few notes on the functionality of some of the important methods:" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "* `specializations`: For each disjunction in the hypothesis, it adds a conjunction for values in the examples encountered so far (if the conjunction is consistent with all the examples). It returns a list of hypotheses.\n", - "\n", - "* `generalizations`: It adds to the list of hypotheses in three phases. First it deletes disjunctions, then it deletes conjunctions and finally it adds a disjunction.\n", - "\n", - "* `add_or`: Used by `generalizations` to add an *or operation* (a disjunction) to the hypothesis. Since the last example is the problematic one which wasn't consistent with the hypothesis, it will model the new disjunction to that example. It creates a disjunction for each combination of attributes in the example and returns the new hypotheses consistent with the negative examples encountered so far. 
We do not need to check the consistency of positive examples, since they are already consistent with at least one other disjunction in the hypotheses' set, so this new disjunction doesn't affect them. In other words, if the value of a positive example is negative under the disjunction, it doesn't matter since we know there exists a disjunction consistent with the example." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Since the algorithm stops searching the specializations/generalizations after the first consistent hypothesis is found, usually you will get different results each time you run the code." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Examples\n", - "\n", - "We will take a look at two examples. The first is a trivial one, while the second is a bit more complicated (you can also find it in the book).\n", - "\n", - "First we have the \"animals taking umbrellas\" example. Here we want to find a hypothesis to predict whether or not an animal will take an umbrella. The attributes are `Species`, `Rain` and `Coat`. The possible values are `[Cat, Dog]`, `[Yes, No]` and `[Yes, No]` respectively. 
Below we give seven examples (with `GOAL` we denote whether an animal will take an umbrella or not):" - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "metadata": { - "collapsed": true - }, - "outputs": [], - "source": [ - "animals_umbrellas = [\n", - " {'Species': 'Cat', 'Rain': 'Yes', 'Coat': 'No', 'GOAL': True},\n", - " {'Species': 'Cat', 'Rain': 'Yes', 'Coat': 'Yes', 'GOAL': True},\n", - " {'Species': 'Dog', 'Rain': 'Yes', 'Coat': 'Yes', 'GOAL': True},\n", - " {'Species': 'Dog', 'Rain': 'Yes', 'Coat': 'No', 'GOAL': False},\n", - " {'Species': 'Dog', 'Rain': 'No', 'Coat': 'No', 'GOAL': False},\n", - " {'Species': 'Cat', 'Rain': 'No', 'Coat': 'No', 'GOAL': False},\n", - " {'Species': 'Cat', 'Rain': 'No', 'Coat': 'Yes', 'GOAL': True}\n", - "]" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Let our initial hypothesis be `[{'Species': 'Cat'}]`. That means every cat will be taking an umbrella. We can see that this is not true, but it doesn't matter since we will refine the hypothesis using the Current-Best algorithm. First, let's see how that initial hypothesis fares to have a point of reference." - ] - }, - { - "cell_type": "code", - "execution_count": 3, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "True\n", - "True\n", - "False\n", - "False\n", - "False\n", - "True\n", - "True\n" - ] - } - ], - "source": [ - "initial_h = [{'Species': 'Cat'}]\n", - "\n", - "for e in animals_umbrellas:\n", - " print(guess_value(e, initial_h))" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "We got 5/7 correct. Not terribly bad, but we can do better. Let's run the algorithm and see how that performs." 
- ] - }, - { - "cell_type": "code", - "execution_count": 4, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "True\n", - "True\n", - "True\n", - "False\n", - "False\n", - "False\n", - "True\n" - ] - } - ], - "source": [ - "h = current_best_learning(animals_umbrellas, initial_h)\n", - "\n", - "for e in animals_umbrellas:\n", - " print(guess_value(e, h))" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "We got everything right! Let's print our hypothesis:" - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "[{'Species': 'Cat', 'Rain': '!No'}, {'Coat': 'Yes', 'Rain': 'Yes'}, {'Coat': 'Yes'}]\n" - ] - } - ], - "source": [ - "print(h)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "If an example meets any of the disjunctions in the list, it will be `True`, otherwise it will be `False`.\n", - "\n", - "Let's move on to a bigger example, the \"Restaurant\" example from the book. The attributes for each example are the following:\n", - "\n", - "* Alternative option (`Alt`)\n", - "* Bar to hang out/wait (`Bar`)\n", - "* Day is Friday (`Fri`)\n", - "* Is hungry (`Hun`)\n", - "* How much does it cost (`Price`, takes values in [$, $$, $$$])\n", - "* How many patrons are there (`Pat`, takes values in [None, Some, Full])\n", - "* Is raining (`Rain`)\n", - "* Has made reservation (`Res`)\n", - "* Type of restaurant (`Type`, takes values in [French, Thai, Burger, Italian])\n", - "* Estimated waiting time (`Est`, takes values in [0-10, 10-30, 30-60, >60])\n", - "\n", - "We want to predict if someone will wait or not (Goal = WillWait). Below we show twelve examples found in the book." 
- ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "![restaurant](images/restaurant.png)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "With the function `r_example` we will build the dictionary examples:" - ] - }, - { - "cell_type": "code", - "execution_count": 6, - "metadata": { - "collapsed": true - }, - "outputs": [], - "source": [ - "def r_example(Alt, Bar, Fri, Hun, Pat, Price, Rain, Res, Type, Est, GOAL):\n", - " return {'Alt': Alt, 'Bar': Bar, 'Fri': Fri, 'Hun': Hun, 'Pat': Pat,\n", - " 'Price': Price, 'Rain': Rain, 'Res': Res, 'Type': Type, 'Est': Est,\n", - " 'GOAL': GOAL}" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "collapsed": true - }, - "source": [ - "In code:" - ] - }, - { - "cell_type": "code", - "execution_count": 7, - "metadata": { - "collapsed": true - }, - "outputs": [], - "source": [ - "restaurant = [\n", - " r_example('Yes', 'No', 'No', 'Yes', 'Some', '$$$', 'No', 'Yes', 'French', '0-10', True),\n", - " r_example('Yes', 'No', 'No', 'Yes', 'Full', '$', 'No', 'No', 'Thai', '30-60', False),\n", - " r_example('No', 'Yes', 'No', 'No', 'Some', '$', 'No', 'No', 'Burger', '0-10', True),\n", - " r_example('Yes', 'No', 'Yes', 'Yes', 'Full', '$', 'Yes', 'No', 'Thai', '10-30', True),\n", - " r_example('Yes', 'No', 'Yes', 'No', 'Full', '$$$', 'No', 'Yes', 'French', '>60', False),\n", - " r_example('No', 'Yes', 'No', 'Yes', 'Some', '$$', 'Yes', 'Yes', 'Italian', '0-10', True),\n", - " r_example('No', 'Yes', 'No', 'No', 'None', '$', 'Yes', 'No', 'Burger', '0-10', False),\n", - " r_example('No', 'No', 'No', 'Yes', 'Some', '$$', 'Yes', 'Yes', 'Thai', '0-10', True),\n", - " r_example('No', 'Yes', 'Yes', 'No', 'Full', '$', 'Yes', 'No', 'Burger', '>60', False),\n", - " r_example('Yes', 'Yes', 'Yes', 'Yes', 'Full', '$$$', 'No', 'Yes', 'Italian', '10-30', False),\n", - " r_example('No', 'No', 'No', 'No', 'None', '$', 'No', 'No', 'Thai', '0-10', False),\n", - " r_example('Yes', 'Yes', 'Yes', 'Yes', 
'Full', '$', 'No', 'No', 'Burger', '30-60', True)\n", - "]" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Say our initial hypothesis is that there should be an alternative option and let's run the algorithm." - ] - }, - { - "cell_type": "code", - "execution_count": 8, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "True\n", - "False\n", - "True\n", - "True\n", - "False\n", - "True\n", - "False\n", - "True\n", - "False\n", - "False\n", - "False\n", - "True\n" - ] - } - ], - "source": [ - "initial_h = [{'Alt': 'Yes'}]\n", - "h = current_best_learning(restaurant, initial_h)\n", - "for e in restaurant:\n", - " print(guess_value(e, h))" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The predictions are correct. Let's see the hypothesis that accomplished that:" - ] - }, - { - "cell_type": "code", - "execution_count": 9, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "[{'Res': '!No', 'Fri': '!Yes', 'Alt': 'Yes'}, {'Bar': 'Yes', 'Fri': 'No', 'Rain': 'No', 'Hun': 'No'}, {'Bar': 'No', 'Price': '$', 'Fri': 'Yes'}, {'Res': 'Yes', 'Price': '$$', 'Rain': 'Yes', 'Alt': 'No', 'Est': '0-10', 'Fri': 'No', 'Hun': 'Yes', 'Bar': 'Yes'}, {'Fri': 'No', 'Pat': 'Some', 'Price': '$$', 'Rain': 'Yes', 'Hun': 'Yes'}, {'Est': '30-60', 'Res': 'No', 'Price': '$', 'Fri': 'Yes', 'Hun': 'Yes'}]\n" - ] - } - ], - "source": [ - "print(h)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "It might be quite complicated, with many disjunctions if we are unlucky, but it will always be correct, as long as a correct hypothesis exists." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## VERSION-SPACE LEARNING\n", - "\n", - "### Overview\n", - "\n", - "**Version-Space Learning** is a general method of learning in logic based domains. 
We generate the set of all the possible hypotheses in the domain and then we iteratively remove hypotheses inconsistent with the examples. The set of remaining hypotheses is called **version space**. Because hypotheses are being removed until we end up with a set of hypotheses consistent with all the examples, the algorithm is sometimes called **candidate elimination** algorithm.\n", - "\n", - "After we update the set on an example, all the hypotheses in the set are consistent with that example. So, when all the examples have been parsed, all the remaining hypotheses in the set are consistent with all the examples. That means we can pick hypotheses at random and we will always get a valid hypothesis." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Pseudocode" - ] - }, - { - "cell_type": "code", - "execution_count": 3, - "metadata": {}, - "outputs": [ - { - "data": { - "text/markdown": [ - "### AIMA3e\n", - "__function__ Version-Space-Learning(_examples_) __returns__ a version space \n", - " __local variables__: _V_, the version space: the set of all hypotheses \n", - "\n", - " _V_ ← the set of all hypotheses \n", - " __for each__ example _e_ in _examples_ __do__ \n", - "   __if__ _V_ is not empty __then__ _V_ ← Version-Space-Update(_V_, _e_) \n", - " __return__ _V_ \n", - "\n", - "---\n", - "__function__ Version-Space-Update(_V_, _e_) __returns__ an updated version space \n", - " _V_ ← \\{_h_ ∈ _V_ : _h_ is consistent with _e_\\} \n", - "\n", - "---\n", - "__Figure ??__ The version space learning algorithm. It finds a subset of _V_ that is consistent with all the _examples_." 
- ], - "text/plain": [ - "" - ] - }, - "execution_count": 3, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "pseudocode('Version-Space-Learning')" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "collapsed": true - }, - "source": [ - "### Implementation\n", - "\n", - "The set of hypotheses is represented by a list and each hypothesis is represented by a list of dictionaries, each dictionary a disjunction. For each example in the given examples we update the version space with the function `version_space_update`. In the end, we return the version-space.\n", - "\n", - "Before we can start updating the version space, we need to generate it. We do that with the `all_hypotheses` function, which builds a list of all the possible hypotheses (including hypotheses with disjunctions). The function works like this: first it finds the possible values for each attribute (using `values_table`), then it builds all the attribute combinations (and adds them to the hypotheses set) and finally it builds the combinations of all the disjunctions (which in this case are the hypotheses build by the attribute combinations).\n", - "\n", - "You can read the code for all the functions by running the cells below:" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "psource(version_space_learning, version_space_update)" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "psource(all_hypotheses, values_table)" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "psource(build_attr_combinations, build_h_combinations)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Example\n", - "\n", - "Since the set of all possible hypotheses is enormous and would take a long time to generate, we will come up with another, even smaller domain. 
We will try and predict whether we will have a party or not given the availability of pizza and soda. Let's do it:" - ] - }, - { - "cell_type": "code", - "execution_count": 8, - "metadata": { - "collapsed": true - }, - "outputs": [], - "source": [ - "party = [\n", - " {'Pizza': 'Yes', 'Soda': 'No', 'GOAL': True},\n", - " {'Pizza': 'Yes', 'Soda': 'Yes', 'GOAL': True},\n", - " {'Pizza': 'No', 'Soda': 'No', 'GOAL': False}\n", - "]" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Even though it is obvious that no-pizza no-party, we will run the algorithm and see what other hypotheses are valid." - ] - }, - { - "cell_type": "code", - "execution_count": 12, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "True\n", - "True\n", - "False\n" - ] - } - ], - "source": [ - "V = version_space_learning(party)\n", - "for e in party:\n", - " guess = False\n", - " for h in V:\n", - " if guess_value(e, h):\n", - " guess = True\n", - " break\n", - "\n", - " print(guess)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The results are correct for the given examples. Let's take a look at the version space:" - ] - }, - { - "cell_type": "code", - "execution_count": 17, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "959\n", - "[{'Pizza': 'Yes'}, {'Soda': 'Yes'}]\n", - "[{'Pizza': 'Yes'}, {'Pizza': '!No', 'Soda': 'No'}]\n", - "True\n" - ] - } - ], - "source": [ - "print(len(V))\n", - "\n", - "print(V[5])\n", - "print(V[10])\n", - "\n", - "print([{'Pizza': 'Yes'}] in V)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "There are almost 1000 hypotheses in the set. You can see that even with just two attributes the version space in very large.\n", - "\n", - "Our initial prediction is indeed in the set of hypotheses. 
Also, the two other random hypotheses we got are consistent with the examples (since they both include the \"Pizza is available\" disjunction)." - ] - } - ], - "metadata": { - "kernelspec": { - "display_name": "Python 3", - "language": "python", - "name": "python3" - }, - "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", - "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.5.3" - } - }, - "nbformat": 4, - "nbformat_minor": 2 -} diff --git a/knowledge.py b/knowledge.py index 6fe09acd2..8c27c3eb8 100644 --- a/knowledge.py +++ b/knowledge.py @@ -1,20 +1,25 @@ -"""Knowledge in learning, Chapter 19""" +"""Knowledge in learning (Chapter 19)""" -from random import shuffle -from math import log -from utils import powerset from collections import defaultdict +from functools import partial from itertools import combinations, product +from random import shuffle + +import numpy as np + from logic import (FolKB, constant_symbols, predicate_symbols, standardize_variables, variables, is_definite_clause, subst, expr, Expr) - -# ______________________________________________________________________________ +from utils import power_set -def current_best_learning(examples, h, examples_so_far=[]): - """ [Figure 19.2] +def current_best_learning(examples, h, examples_so_far=None): + """ + [Figure 19.2] The hypothesis is a list of dictionaries, with each dictionary representing - a disjunction.""" + a disjunction. 
+ """ + if examples_so_far is None: + examples_so_far = [] if not examples: return h @@ -62,7 +67,7 @@ def generalizations(examples_so_far, h): hypotheses = [] # Delete disjunctions - disj_powerset = powerset(range(len(h))) + disj_powerset = power_set(range(len(h))) for disjs in disj_powerset: h2 = h.copy() for d in reversed(list(disjs)): @@ -73,7 +78,7 @@ def generalizations(examples_so_far, h): # Delete AND operations in disjunctions for i, disj in enumerate(h): - a_powerset = powerset(disj.keys()) + a_powerset = power_set(disj.keys()) for attrs in a_powerset: h2 = h[i].copy() for a in attrs: @@ -95,13 +100,13 @@ def generalizations(examples_so_far, h): def add_or(examples_so_far, h): - """Adds an OR operation to the hypothesis. The AND operations in the disjunction + """Add an OR operation to the hypothesis. The AND operations in the disjunction are generated by the last example (which is the problematic one).""" ors = [] e = examples_so_far[-1] attrs = {k: v for k, v in e.items() if k != 'GOAL'} - a_powerset = powerset(attrs.keys()) + a_powerset = power_set(attrs.keys()) for c in a_powerset: h2 = {} @@ -115,13 +120,16 @@ def add_or(examples_so_far, h): return ors + # ______________________________________________________________________________ def version_space_learning(examples): - """ [Figure 19.3] + """ + [Figure 19.3] The version space is a list of hypotheses, which in turn are a list - of dictionaries/disjunctions.""" + of dictionaries/disjunctions. 
+ """ V = all_hypotheses(examples) for e in examples: if V: @@ -135,9 +143,9 @@ def version_space_update(V, e): def all_hypotheses(examples): - """Builds a list of all the possible hypotheses""" + """Build a list of all the possible hypotheses""" values = values_table(examples) - h_powerset = powerset(values.keys()) + h_powerset = power_set(values.keys()) hypotheses = [] for s in h_powerset: hypotheses.extend(build_attr_combinations(s, values)) @@ -148,7 +156,7 @@ def all_hypotheses(examples): def values_table(examples): - """Builds a table with all the possible values for each attribute. + """Build a table with all the possible values for each attribute. Returns a dictionary with keys the attribute names and values a list with the possible values for the corresponding attribute.""" values = defaultdict(lambda: []) @@ -180,7 +188,7 @@ def build_attr_combinations(s, values): h = [] for i, a in enumerate(s): - rest = build_attr_combinations(s[i+1:], values) + rest = build_attr_combinations(s[i + 1:], values) for v in values[a]: o = {a: v} for r in rest: @@ -196,7 +204,7 @@ def build_h_combinations(hypotheses): """Given a set of hypotheses, builds and returns all the combinations of the hypotheses.""" h = [] - h_powerset = powerset(range(len(hypotheses))) + h_powerset = power_set(range(len(hypotheses))) for s in h_powerset: t = [] @@ -206,11 +214,12 @@ def build_h_combinations(hypotheses): return h + # ______________________________________________________________________________ def minimal_consistent_det(E, A): - """Returns a minimal set of attributes which give consistent determination""" + """Return a minimal set of attributes which give consistent determination""" n = len(A) for i in range(n + 1): @@ -220,7 +229,7 @@ def minimal_consistent_det(E, A): def consistent_det(A, E): - """Checks if the attributes(A) is consistent with the examples(E)""" + """Check if the attributes(A) is consistent with the examples(E)""" H = {} for e in E: @@ -231,16 +240,17 @@ def 
consistent_det(A, E): return True + # ______________________________________________________________________________ -class FOIL_container(FolKB): - """Holds the kb and other necessary elements required by FOIL""" +class FOILContainer(FolKB): + """Hold the kb and other necessary elements required by FOIL.""" - def __init__(self, clauses=[]): + def __init__(self, clauses=None): self.const_syms = set() self.pred_syms = set() - FolKB.__init__(self, clauses) + super().__init__(clauses) def tell(self, sentence): if is_definite_clause(sentence): @@ -248,10 +258,10 @@ def tell(self, sentence): self.const_syms.update(constant_symbols(sentence)) self.pred_syms.update(predicate_symbols(sentence)) else: - raise Exception("Not a definite clause: {}".format(sentence)) + raise Exception('Not a definite clause: {}'.format(sentence)) def foil(self, examples, target): - """Learns a list of first-order horn clauses + """Learn a list of first-order horn clauses 'examples' is a tuple: (positive_examples, negative_examples). positive_examples and negative_examples are both lists which contain substitutions.""" clauses = [] @@ -268,12 +278,11 @@ def foil(self, examples, target): return clauses def new_clause(self, examples, target): - """Finds a horn clause which satisfies part of the positive + """Find a horn clause which satisfies part of the positive examples but none of the negative examples. 
The horn clause is specified as [consequent, list of antecedents] - Return value is the tuple (horn_clause, extended_positive_examples)""" + Return value is the tuple (horn_clause, extended_positive_examples).""" clause = [target, []] - # [positive_examples, negative_examples] extended_examples = examples while extended_examples[1]: l = self.choose_literal(self.new_literals(clause), extended_examples) @@ -281,59 +290,71 @@ def new_clause(self, examples, target): extended_examples = [sum([list(self.extend_example(example, l)) for example in extended_examples[i]], []) for i in range(2)] - return (clause, extended_examples[0]) + return clause, extended_examples[0] def extend_example(self, example, literal): - """Generates extended examples which satisfy the literal""" + """Generate extended examples which satisfy the literal.""" # find all substitutions that satisfy literal for s in self.ask_generator(subst(example, literal)): s.update(example) yield s def new_literals(self, clause): - """Generates new literals based on known predicate symbols. - Generated literal must share atleast one variable with clause""" + """Generate new literals based on known predicate symbols. 
+ Generated literal must share at least one variable with clause""" share_vars = variables(clause[0]) for l in clause[1]: share_vars.update(variables(l)) - for pred, arity in self.pred_syms: new_vars = {standardize_variables(expr('x')) for _ in range(arity - 1)} for args in product(share_vars.union(new_vars), repeat=arity): if any(var in share_vars for var in args): - yield Expr(pred, *[var for var in args]) + # make sure we don't return an existing rule + if Expr(pred, *args) not in clause[1]: + yield Expr(pred, *[var for var in args]) def choose_literal(self, literals, examples): - """Chooses the best literal based on the information gain""" - def gain(l): - pre_pos = len(examples[0]) - pre_neg = len(examples[1]) - extended_examples = [sum([list(self.extend_example(example, l)) for example in - examples[i]], []) for i in range(2)] - post_pos = len(extended_examples[0]) - post_neg = len(extended_examples[1]) - if pre_pos + pre_neg == 0 or post_pos + post_neg == 0: - return -1 - - # number of positive example that are represented in extended_examples - T = 0 - for example in examples[0]: - def represents(d): - return all(d[x] == example[x] for x in example) - if any(represents(l_) for l_ in extended_examples[0]): - T += 1 - - return T * log((post_pos*(pre_pos + pre_neg) + 1e-4) / ((post_pos + post_neg)*pre_pos)) - - return max(literals, key=gain) + """Choose the best literal based on the information gain.""" + return max(literals, key=partial(self.gain, examples=examples)) + + def gain(self, l, examples): + """ + Find the utility of each literal when added to the body of the clause.
+ Utility function is: + gain(R, l) = T * (log_2 (post_pos / (post_pos + post_neg)) - log_2 (pre_pos / (pre_pos + pre_neg))) + + where: + + pre_pos = number of positive bindings of rule R (=current set of rules) + pre_neg = number of negative bindings of rule R + post_pos = number of positive bindings of rule R' (= R U {l} ) + post_neg = number of negative bindings of rule R' + T = number of positive bindings of rule R that are still covered + after adding literal l + + """ + pre_pos = len(examples[0]) + pre_neg = len(examples[1]) + post_pos = sum([list(self.extend_example(example, l)) for example in examples[0]], []) + post_neg = sum([list(self.extend_example(example, l)) for example in examples[1]], []) + if pre_pos + pre_neg == 0 or len(post_pos) + len(post_neg) == 0: + return -1 + # number of positive examples that are represented in extended_examples + T = 0 + for example in examples[0]: + represents = lambda d: all(d[x] == example[x] for x in example) + if any(represents(l_) for l_ in post_pos): + T += 1 + value = T * (np.log2(len(post_pos) / (len(post_pos) + len(post_neg)) + 1e-12) - + np.log2(pre_pos / (pre_pos + pre_neg))) + return value def update_examples(self, target, examples, extended_examples): - """Adds to the kb those examples what are represented in extended_examples - List of omitted examples is returned""" + """Add to the kb those examples that are represented in extended_examples + List of omitted examples is returned.""" uncovered = [] for example in examples: - def represents(d): - return all(d[x] == example[x] for x in example) + represents = lambda d: all(d[x] == example[x] for x in example) if any(represents(l) for l in extended_examples): self.tell(subst(example, target)) else: uncovered.append(example) @@ -346,7 +367,7 @@ def represents(d): def check_all_consistency(examples, h): - """Check for the consistency of all examples under h""" + """Check for the consistency of all examples under h.""" for e in examples: if not is_consistent(e, h): return False @@
-355,7 +376,7 @@ def check_all_consistency(examples, h): def check_negative_consistency(examples, h): - """Check if the negative examples are consistent under h""" + """Check if the negative examples are consistent under h.""" for e in examples: if e['GOAL']: continue @@ -367,7 +388,7 @@ def check_negative_consistency(examples, h): def disjunction_value(e, d): - """The value of example e under disjunction d""" + """The value of example e under disjunction d.""" for k, v in d.items(): if v[0] == '!': # v is a NOT expression @@ -381,7 +402,7 @@ def disjunction_value(e, d): def guess_value(e, h): - """Guess value of example e under hypothesis h""" + """Guess value of example e under hypothesis h.""" for d in h: if disjunction_value(e, d): return True @@ -390,20 +411,12 @@ def guess_value(e, h): def is_consistent(e, h): - return e["GOAL"] == guess_value(e, h) + return e['GOAL'] == guess_value(e, h) def false_positive(e, h): - if e["GOAL"] == False: - if guess_value(e, h): - return True - - return False + return guess_value(e, h) and not e['GOAL'] def false_negative(e, h): - if e["GOAL"] == True: - if not guess_value(e, h): - return True - - return False + return e['GOAL'] and not guess_value(e, h) diff --git a/knowledge_FOIL.ipynb b/knowledge_FOIL.ipynb new file mode 100644 index 000000000..4cefd7f69 --- /dev/null +++ b/knowledge_FOIL.ipynb @@ -0,0 +1,639 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# KNOWLEDGE\n", + "\n", + "The [knowledge](https://github.com/aimacode/aima-python/blob/master/knowledge.py) module covers **Chapter 19: Knowledge in Learning** from Stuart Russell's and Peter Norvig's book *Artificial Intelligence: A Modern Approach*.\n", + "\n", + "Execute the cell below to get started."
+ ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": {}, + "outputs": [], + "source": [ + "from knowledge import *\n", + "from notebook import psource" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## CONTENTS\n", + "\n", + "* Overview\n", + "* Inductive Logic Programming (FOIL)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## OVERVIEW\n", + "\n", + "Like the [learning module](https://github.com/aimacode/aima-python/blob/master/learning.ipynb), this chapter focuses on methods for generating a model/hypothesis for a domain; however, unlike the learning chapter, here we use prior knowledge to help us learn from new experiences and to find a proper hypothesis.\n", + "\n", + "### First-Order Logic\n", + "\n", + "Usually knowledge in this field is represented as **first-order logic**, a type of logic that uses variables and quantifiers in logical sentences. Hypotheses are represented by logical sentences with variables, while examples are logical sentences with set values instead of variables. The goal is to assign a value to a special first-order logic predicate, called **goal predicate**, for new examples given a hypothesis. We learn this hypothesis by inferring knowledge from some given examples.\n", + "\n", + "### Representation\n", + "\n", + "In this module, we use dictionaries to represent examples, with keys being the attribute names and values being the corresponding example values. Examples also have an extra boolean field, 'GOAL', for the goal predicate. A hypothesis is represented as a list of dictionaries. Each dictionary in that list represents a disjunction. Inside these dictionaries/disjunctions we have conjunctions.\n", + "\n", + "For example, say we want to predict if an animal (cat or dog) will take an umbrella given whether or not it rains or the animal wears a coat. The goal value is 'take an umbrella' and is denoted by the key 'GOAL'.
An example:\n", + "\n", + "`{'Species': 'Cat', 'Coat': 'Yes', 'Rain': 'Yes', 'GOAL': True}`\n", + "\n", + "A hypothesis can be the following:\n", + "\n", + "`[{'Species': 'Cat'}]`\n", + "\n", + "which means an animal will take an umbrella if and only if it is a cat.\n", + "\n", + "### Consistency\n", + "\n", + "We say that an example `e` is **consistent** with a hypothesis `h` if the assignment from the hypothesis for `e` is the same as `e['GOAL']`. If the above example and hypothesis are `e` and `h` respectively, then `e` is consistent with `h` since `e['Species'] == 'Cat'`. For `e = {'Species': 'Dog', 'Coat': 'Yes', 'Rain': 'Yes', 'GOAL': True}`, the example is no longer consistent with `h`, since the value assigned to `e` is *False* while `e['GOAL']` is *True*." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Inductive Logic Programming (FOIL)\n", + "\n", + "Inductive logic programming (ILP) combines inductive methods with the power of first-order representations, concentrating in particular on the representation of hypotheses as logic programs. The general knowledge-based induction problem is to solve the entailment constraint:

\n", + "$ Background ∧ Hypothesis ∧ Descriptions \vDash Classifications $\n", + "\n", + "for the __unknown__ $Hypothesis$, given the $Background$ knowledge and the examples described by $Descriptions$ and $Classifications$.\n", + "\n", + "\n", + "\n", + "The first approach to ILP works by starting with a very general rule and gradually specializing\n", + "it so that it fits the data.
    \n", + "This is essentially what happens in decision-tree learning, where a\n", + "decision tree is gradually grown until it is consistent with the observations.
    To do ILP we\n", + "use first-order literals instead of attributes, and the $Hypothesis$ is a set of clauses (set of first order rules, where each rule is similar to a Horn clause) instead of a decision tree.
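To make the rule-based hypothesis representation concrete, here is a minimal, self-contained sketch (plain Python with invented names; a toy stand-in for the full unification machinery used later):

```python
# A hypothesis as a set of first-order rules, each written as
# (consequent, [antecedent literals]).
hypothesis = [
    ('Parent(x, y)', ['Mother(x, y)']),
    ('Parent(x, y)', ['Father(x, y)']),
]

def covers(rule, facts, binding):
    """Toy coverage check: a rule covers a binding if every ground
    antecedent is a known fact after substituting x and y."""
    _, antecedents = rule
    ground = [a.replace('x', binding['x']).replace('y', binding['y'])
              for a in antecedents]
    return all(g in facts for g in ground)

facts = {'Mother(Elizabeth, Anne)', 'Father(Philip, Anne)'}
print(covers(hypothesis[0], facts, {'x': 'Elizabeth', 'y': 'Anne'}))  # True
```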
\n", + "\n", + "\n", + "The FOIL algorithm learns new rules, one at a time, until all the given positive examples are covered and none of the negative ones are.
\n", + "More precisely, FOIL contains an inner and an outer while loop.
    \n", + "- __outer loop__: (function __foil()__) add rules until all positive examples are covered.
\n", + " (each rule is a conjunction of literals, which are chosen inside the inner loop)\n", + " \n", + " \n", + "- __inner loop__: (function __new_clause()__) add new literals to the clause until it covers no negative examples, while still covering some positive examples.
\n", + " - In each iteration, we select/add the most promising literal, according to an estimate of its utility. (functions __new_literals()__ and __choose_literal()__)
\n", + " \n", + " - The evaluation function to estimate utility of adding literal $L$ to a set of rules $R$ is (function __gain()__) : \n", + " \n", + " $$ FoilGain(L,R) = t \big( \log_2{\frac{p_1}{p_1+n_1}} - \log_2{\frac{p_0}{p_0+n_0}} \big) $$\n", + " where: \n", + " \n", + " $p_0: \text{is the number of positive bindings of rule R } \\ n_0: \text{is the number of negative bindings of R} \\ p_1: \text{is the number of positive bindings of rule R'}\\ n_1: \text{is the number of negative bindings of R'}\\ t: \text{is the number of positive bindings of rule R that are still covered after adding literal L to R}$\n", + " \n", + " - Calculate the extended examples for the chosen literal (function __extend_example()__)
\n", + " (the set of examples created by extending example with each possible constant value for each new variable in literal)\n", + " \n", + "- Finally, the algorithm returns a disjunction of first-order rules (= conjunction of literals)\n", + "\n" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": true + }, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "
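As a quick numeric illustration of the FoilGain formula above, here is a standalone Python sketch; the binding counts are invented for the example:

```python
from math import log2

def foil_gain(t, p0, n0, p1, n1):
    """FoilGain(L, R) = t * (log2(p1 / (p1 + n1)) - log2(p0 / (p0 + n0))).
    A tiny epsilon guards against log2(0) when p1 == 0, as the module does."""
    eps = 1e-12
    return t * (log2(p1 / (p1 + n1) + eps) - log2(p0 / (p0 + n0)))

# A literal that keeps 8 of 12 positive bindings while eliminating
# all 4 negative ones produces a positive gain:
print(foil_gain(t=8, p0=12, n0=4, p1=8, n1=0))
```

A literal that discarded positives without removing negatives would instead score negative and not be chosen.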

    \n", + "\n", + "
    class FOIL_container(FolKB):\n",
    +       "    """Hold the kb and other necessary elements required by FOIL."""\n",
    +       "\n",
    +       "    def __init__(self, clauses=None):\n",
    +       "        self.const_syms = set()\n",
    +       "        self.pred_syms = set()\n",
    +       "        FolKB.__init__(self, clauses)\n",
    +       "\n",
    +       "    def tell(self, sentence):\n",
    +       "        if is_definite_clause(sentence):\n",
    +       "            self.clauses.append(sentence)\n",
    +       "            self.const_syms.update(constant_symbols(sentence))\n",
    +       "            self.pred_syms.update(predicate_symbols(sentence))\n",
    +       "        else:\n",
    +       "            raise Exception("Not a definite clause: {}".format(sentence))\n",
    +       "\n",
    +       "    def foil(self, examples, target):\n",
    +       "        """Learn a list of first-order horn clauses\n",
    +       "        'examples' is a tuple: (positive_examples, negative_examples).\n",
    +       "        positive_examples and negative_examples are both lists which contain substitutions."""\n",
    +       "        clauses = []\n",
    +       "\n",
    +       "        pos_examples = examples[0]\n",
    +       "        neg_examples = examples[1]\n",
    +       "\n",
    +       "        while pos_examples:\n",
    +       "            clause, extended_pos_examples = self.new_clause((pos_examples, neg_examples), target)\n",
    +       "            # remove positive examples covered by clause\n",
    +       "            pos_examples = self.update_examples(target, pos_examples, extended_pos_examples)\n",
    +       "            clauses.append(clause)\n",
    +       "\n",
    +       "        return clauses\n",
    +       "\n",
    +       "    def new_clause(self, examples, target):\n",
    +       "        """Find a horn clause which satisfies part of the positive\n",
    +       "        examples but none of the negative examples.\n",
    +       "        The horn clause is specified as [consequent, list of antecedents]\n",
    +       "        Return value is the tuple (horn_clause, extended_positive_examples)."""\n",
    +       "        clause = [target, []]\n",
    +       "        # [positive_examples, negative_examples]\n",
    +       "        extended_examples = examples\n",
    +       "        while extended_examples[1]:\n",
    +       "            l = self.choose_literal(self.new_literals(clause), extended_examples)\n",
    +       "            clause[1].append(l)\n",
    +       "            extended_examples = [sum([list(self.extend_example(example, l)) for example in\n",
    +       "                                      extended_examples[i]], []) for i in range(2)]\n",
    +       "\n",
    +       "        return (clause, extended_examples[0])\n",
    +       "\n",
    +       "    def extend_example(self, example, literal):\n",
    +       "        """Generate extended examples which satisfy the literal."""\n",
    +       "        # find all substitutions that satisfy literal\n",
    +       "        for s in self.ask_generator(subst(example, literal)):\n",
    +       "            s.update(example)\n",
    +       "            yield s\n",
    +       "\n",
    +       "    def new_literals(self, clause):\n",
    +       "        """Generate new literals based on known predicate symbols.\n",
+       "        Generated literal must share at least one variable with clause"""\n",
    +       "        share_vars = variables(clause[0])\n",
    +       "        for l in clause[1]:\n",
    +       "            share_vars.update(variables(l))\n",
    +       "        for pred, arity in self.pred_syms:\n",
    +       "            new_vars = {standardize_variables(expr('x')) for _ in range(arity - 1)}\n",
    +       "            for args in product(share_vars.union(new_vars), repeat=arity):\n",
    +       "                if any(var in share_vars for var in args):\n",
    +       "                    # make sure we don't return an existing rule\n",
+       "                    if Expr(pred, *args) not in clause[1]:\n",
    +       "                        yield Expr(pred, *[var for var in args])\n",
    +       "\n",
    +       "\n",
+       "    def choose_literal(self, literals, examples):\n",
+       "        """Choose the best literal based on the information gain."""\n",
+       "        return max(literals, key=partial(self.gain, examples=examples))\n",
+       "\n",
+       "    def gain(self, l, examples):\n",
    +       "        """\n",
    +       "        Find the utility of each literal when added to the body of the clause. \n",
    +       "        Utility function is: \n",
    +       "            gain(R, l) = T * (log_2 (post_pos / (post_pos + post_neg)) - log_2 (pre_pos / (pre_pos + pre_neg)))\n",
    +       "\n",
    +       "        where: \n",
    +       "        \n",
+       "            pre_pos = number of positive bindings of rule R (=current set of rules)\n",
    +       "            pre_neg = number of negative bindings of rule R \n",
+       "            post_pos = number of positive bindings of rule R' (= R U {l} )\n",
    +       "            post_neg = number of negative bindings of rule R' \n",
+       "            T = number of positive bindings of rule R that are still covered\n",
    +       "                after adding literal l \n",
    +       "\n",
    +       "        """\n",
    +       "        pre_pos = len(examples[0])\n",
    +       "        pre_neg = len(examples[1])\n",
+       "        post_pos = sum([list(self.extend_example(example, l)) for example in examples[0]], [])\n",
+       "        post_neg = sum([list(self.extend_example(example, l)) for example in examples[1]], [])\n",
+       "        if pre_pos + pre_neg == 0 or len(post_pos) + len(post_neg) == 0:\n",
    +       "            return -1\n",
+       "        # number of positive examples that are represented in extended_examples\n",
    +       "        T = 0\n",
    +       "        for example in examples[0]:\n",
    +       "            represents = lambda d: all(d[x] == example[x] for x in example)\n",
    +       "            if any(represents(l_) for l_ in post_pos):\n",
    +       "                T += 1\n",
+       "        value = T * (log(len(post_pos) / (len(post_pos) + len(post_neg)) + 1e-12, 2) - log(pre_pos / (pre_pos + pre_neg), 2))\n",
    +       "        return value\n",
    +       "\n",
    +       "\n",
    +       "    def update_examples(self, target, examples, extended_examples):\n",
+       "        """Add to the kb those examples that are represented in extended_examples\n",
    +       "        List of omitted examples is returned."""\n",
    +       "        uncovered = []\n",
    +       "        for example in examples:\n",
    +       "            represents = lambda d: all(d[x] == example[x] for x in example)\n",
    +       "            if any(represents(l) for l in extended_examples):\n",
    +       "                self.tell(subst(example, target))\n",
    +       "            else:\n",
    +       "                uncovered.append(example)\n",
    +       "\n",
    +       "        return uncovered\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(FOIL_container)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Example Family \n", + "Suppose we have the following family relations:\n", + "
    \n", + "![title](images/knowledge_foil_family.png)\n", + "
    \n", + "Given some positive and negative examples of the relation 'Parent(x,y)', we want to find a set of rules that satisfies all the examples.
    \n", + "\n", + "A definition of Parent is $Parent(x,y) \\Leftrightarrow Mother(x,y) \\lor Father(x,y)$, which is the result that we expect from the algorithm. " + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [], + "source": [ + "A, B, C, D, E, F, G, H, I, x, y, z = map(expr, 'ABCDEFGHIxyz')" + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [], + "source": [ + "small_family = FOIL_container([expr(\"Mother(Anne, Peter)\"),\n", + " expr(\"Mother(Anne, Zara)\"),\n", + " expr(\"Mother(Sarah, Beatrice)\"),\n", + " expr(\"Mother(Sarah, Eugenie)\"),\n", + " expr(\"Father(Mark, Peter)\"),\n", + " expr(\"Father(Mark, Zara)\"),\n", + " expr(\"Father(Andrew, Beatrice)\"),\n", + " expr(\"Father(Andrew, Eugenie)\"),\n", + " expr(\"Father(Philip, Anne)\"),\n", + " expr(\"Father(Philip, Andrew)\"),\n", + " expr(\"Mother(Elizabeth, Anne)\"),\n", + " expr(\"Mother(Elizabeth, Andrew)\"),\n", + " expr(\"Male(Philip)\"),\n", + " expr(\"Male(Mark)\"),\n", + " expr(\"Male(Andrew)\"),\n", + " expr(\"Male(Peter)\"),\n", + " expr(\"Female(Elizabeth)\"),\n", + " expr(\"Female(Anne)\"),\n", + " expr(\"Female(Sarah)\"),\n", + " expr(\"Female(Zara)\"),\n", + " expr(\"Female(Beatrice)\"),\n", + " expr(\"Female(Eugenie)\"),\n", + "])\n", + "\n", + "target = expr('Parent(x, y)')\n", + "\n", + "examples_pos = [{x: expr('Elizabeth'), y: expr('Anne')},\n", + " {x: expr('Elizabeth'), y: expr('Andrew')},\n", + " {x: expr('Philip'), y: expr('Anne')},\n", + " {x: expr('Philip'), y: expr('Andrew')},\n", + " {x: expr('Anne'), y: expr('Peter')},\n", + " {x: expr('Anne'), y: expr('Zara')},\n", + " {x: expr('Mark'), y: expr('Peter')},\n", + " {x: expr('Mark'), y: expr('Zara')},\n", + " {x: expr('Andrew'), y: expr('Beatrice')},\n", + " {x: expr('Andrew'), y: expr('Eugenie')},\n", + " {x: expr('Sarah'), y: expr('Beatrice')},\n", + " {x: expr('Sarah'), y: expr('Eugenie')}]\n", + "examples_neg = [{x: expr('Anne'), y: 
expr('Eugenie')},\n", + " {x: expr('Beatrice'), y: expr('Eugenie')},\n", + " {x: expr('Mark'), y: expr('Elizabeth')},\n", + " {x: expr('Beatrice'), y: expr('Philip')}]" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[[Parent(x, y), [Father(x, y)]], [Parent(x, y), [Mother(x, y)]]]\n" + ] + } + ], + "source": [ + "# run the FOIL algorithm \n", + "clauses = small_family.foil([examples_pos, examples_neg], target)\n", + "print (clauses)\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Indeed the algorithm returned the rule: \n", + "
$Parent(x,y) \Leftrightarrow Mother(x,y) \lor Father(x,y)$" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Suppose that we have some positive and negative examples of the relation 'Grandparent(x,y)' and we want to find a set of rules that satisfies them.
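Before running FOIL, the target relation itself can be sketched by composing Parent with itself (standalone Python; the pairs below are a subset of the family facts above):

```python
# Hypothetical sketch: Grandparent(g, c) holds when some p satisfies
# Parent(g, p) and Parent(p, c).
parent = {('Elizabeth', 'Anne'), ('Philip', 'Anne'),
          ('Elizabeth', 'Andrew'), ('Anne', 'Peter'),
          ('Anne', 'Zara'), ('Andrew', 'Beatrice')}
grandparent = {(g, c) for (g, p) in parent for (q, c) in parent if p == q}
print(sorted(grandparent))
```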
    \n", + "One possible set of rules for the relation $Grandparent(x,y)$ could be:
    \n", + "![title](images/knowledge_FOIL_grandparent.png)\n", + "
    \n", + "Or, if $Background$ included the sentence $Parent(x,y) \\Leftrightarrow [Mother(x,y) \\lor Father(x,y)]$ then: \n", + "\n", + "$$Grandparent(x,y) \\Leftrightarrow \\exists \\: z \\quad Parent(x,z) \\land Parent(z,y)$$\n" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[[Grandparent(x, y), [Parent(x, v_6), Parent(v_6, y)]]]\n" + ] + } + ], + "source": [ + "target = expr('Grandparent(x, y)')\n", + "\n", + "examples_pos = [{x: expr('Elizabeth'), y: expr('Peter')},\n", + " {x: expr('Elizabeth'), y: expr('Zara')},\n", + " {x: expr('Elizabeth'), y: expr('Beatrice')},\n", + " {x: expr('Elizabeth'), y: expr('Eugenie')},\n", + " {x: expr('Philip'), y: expr('Peter')},\n", + " {x: expr('Philip'), y: expr('Zara')},\n", + " {x: expr('Philip'), y: expr('Beatrice')},\n", + " {x: expr('Philip'), y: expr('Eugenie')}]\n", + "examples_neg = [{x: expr('Anne'), y: expr('Eugenie')},\n", + " {x: expr('Beatrice'), y: expr('Eugenie')},\n", + " {x: expr('Elizabeth'), y: expr('Andrew')},\n", + " {x: expr('Elizabeth'), y: expr('Anne')},\n", + " {x: expr('Elizabeth'), y: expr('Mark')},\n", + " {x: expr('Elizabeth'), y: expr('Sarah')},\n", + " {x: expr('Philip'), y: expr('Anne')},\n", + " {x: expr('Philip'), y: expr('Andrew')},\n", + " {x: expr('Anne'), y: expr('Peter')},\n", + " {x: expr('Anne'), y: expr('Zara')},\n", + " {x: expr('Mark'), y: expr('Peter')},\n", + " {x: expr('Mark'), y: expr('Zara')},\n", + " {x: expr('Andrew'), y: expr('Beatrice')},\n", + " {x: expr('Andrew'), y: expr('Eugenie')},\n", + " {x: expr('Sarah'), y: expr('Beatrice')},\n", + " {x: expr('Mark'), y: expr('Elizabeth')},\n", + " {x: expr('Beatrice'), y: expr('Philip')}, \n", + " {x: expr('Peter'), y: expr('Andrew')}, \n", + " {x: expr('Zara'), y: expr('Mark')},\n", + " {x: expr('Peter'), y: expr('Anne')},\n", + " {x: expr('Zara'), y: expr('Eugenie')}, ]\n", + "\n", + "clauses = 
small_family.foil([examples_pos, examples_neg], target)\n", + "\n", + "print(clauses)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Indeed the algorithm returned the rule: \n", + "
    $Grandparent(x,y) \\Leftrightarrow \\exists \\: v \\: \\: Parent(x,v) \\land Parent(v,y)$" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Example Network\n", + "\n", + "Suppose that we have the following directed graph and we want to find a rule that describes the reachability between two nodes (Reach(x,y)).
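Reachability of this kind can be checked directly with a small recursive helper, which is the relation FOIL is asked to recover here (standalone Python; the edge set mirrors the `Conn` facts used in this section):

```python
def reachable(conn, x, y, seen=None):
    """Reach(x, y) holds if Conn(x, y), or Conn(x, z) and Reach(z, y)."""
    if seen is None:
        seen = set()
    if (x, y) in conn:
        return True
    seen.add(x)  # avoid revisiting nodes (the graph here is acyclic anyway)
    return any(z not in seen and reachable(conn, z, y, seen)
               for (a, z) in conn if a == x)

conn = {('A', 'B'), ('A', 'D'), ('B', 'C'), ('D', 'C'), ('D', 'E'),
        ('E', 'F'), ('E', 'G'), ('G', 'I'), ('H', 'G'), ('H', 'I')}
print(reachable(conn, 'A', 'I'))  # True: A -> D -> E -> G -> I
print(reachable(conn, 'C', 'A'))  # False: C has no outgoing edges
```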
    \n", + "Such a rule could be recursive, since y can be reached from x if and only if there is a sequence of adjacent nodes from x to y: \n", + "\n", + "$$ Reach(x,y) \\Leftrightarrow \\begin{cases} \n", + " Conn(x,y), \\: \\text{(if there is a directed edge from x to y)} \\\\\n", + " \\lor \\quad \\exists \\: z \\quad Reach(x,z) \\land Reach(z,y) \\end{cases}$$\n" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": {}, + "outputs": [], + "source": [ + "\"\"\"\n", + "A H\n", + "|\\ /|\n", + "| \\ / |\n", + "v v v v\n", + "B D-->E-->G-->I\n", + "| / |\n", + "| / |\n", + "vv v\n", + "C F\n", + "\"\"\"\n", + "small_network = FOIL_container([expr(\"Conn(A, B)\"),\n", + " expr(\"Conn(A ,D)\"),\n", + " expr(\"Conn(B, C)\"),\n", + " expr(\"Conn(D, C)\"),\n", + " expr(\"Conn(D, E)\"),\n", + " expr(\"Conn(E ,F)\"),\n", + " expr(\"Conn(E, G)\"),\n", + " expr(\"Conn(G, I)\"),\n", + " expr(\"Conn(H, G)\"),\n", + " expr(\"Conn(H, I)\")])\n" + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[[Reach(x, y), [Conn(x, y)]], [Reach(x, y), [Reach(x, v_12), Reach(v_14, y), Reach(v_12, v_16), Reach(v_12, y)]], [Reach(x, y), [Reach(x, v_20), Reach(v_20, y)]]]\n" + ] + } + ], + "source": [ + "target = expr('Reach(x, y)')\n", + "examples_pos = [{x: A, y: B},\n", + " {x: A, y: C},\n", + " {x: A, y: D},\n", + " {x: A, y: E},\n", + " {x: A, y: F},\n", + " {x: A, y: G},\n", + " {x: A, y: I},\n", + " {x: B, y: C},\n", + " {x: D, y: C},\n", + " {x: D, y: E},\n", + " {x: D, y: F},\n", + " {x: D, y: G},\n", + " {x: D, y: I},\n", + " {x: E, y: F},\n", + " {x: E, y: G},\n", + " {x: E, y: I},\n", + " {x: G, y: I},\n", + " {x: H, y: G},\n", + " {x: H, y: I}]\n", + "nodes = {A, B, C, D, E, F, G, H, I}\n", + "examples_neg = [example for example in [{x: a, y: b} for a in nodes for b in nodes]\n", + " if example not in examples_pos]\n", + "clauses = 
small_network.foil([examples_pos, examples_neg], target)\n", + "\n", + "print(clauses)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The algorithm produced something close to the recursive rule: \n", + " $$ Reach(x,y) \Leftrightarrow [Conn(x,y)] \: \lor \: [\exists \: z \: \: Reach(x,z) \, \land \, Reach(z,y)]$$\n", + " \n", + "This happened because the example set is small. " + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.5" + }, + "pycharm": { + "stem_cell": { + "cell_type": "raw", + "source": [], + "metadata": { + "collapsed": false + } + } + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} \ No newline at end of file diff --git a/knowledge_current_best.ipynb b/knowledge_current_best.ipynb new file mode 100644 index 000000000..5da492cd0 --- /dev/null +++ b/knowledge_current_best.ipynb @@ -0,0 +1,662 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# KNOWLEDGE\n", + "\n", + "The [knowledge](https://github.com/aimacode/aima-python/blob/master/knowledge.py) module covers **Chapter 19: Knowledge in Learning** from Stuart Russell's and Peter Norvig's book *Artificial Intelligence: A Modern Approach*.\n", + "\n", + "Execute the cell below to get started."
+ ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": {}, + "outputs": [], + "source": [ + "from knowledge import *\n", + "\n", + "from notebook import pseudocode, psource" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## CONTENTS\n", + "\n", + "* Overview\n", + "* Current-Best Learning" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## OVERVIEW\n", + "\n", + "Like the [learning module](https://github.com/aimacode/aima-python/blob/master/learning.ipynb), this chapter focuses on methods for generating a model/hypothesis for a domain; however, unlike the learning chapter, here we use prior knowledge to help us learn from new experiences and find a proper hypothesis.\n", + "\n", + "### First-Order Logic\n", + "\n", + "Usually knowledge in this field is represented as **first-order logic**, a type of logic that uses variables and quantifiers in logical sentences. Hypotheses are represented by logical sentences with variables, while examples are logical sentences with set values instead of variables. The goal is to assign a value to a special first-order logic predicate, called **goal predicate**, for new examples given a hypothesis. We learn this hypothesis by inferring knowledge from some given examples.\n", + "\n", + "### Representation\n", + "\n", + "In this module, we use dictionaries to represent examples, with keys being the attribute names and values being the corresponding example values. Examples also have an extra boolean field, 'GOAL', for the goal predicate. A hypothesis is represented as a list of dictionaries. Each dictionary in that list represents a disjunction. Inside these dictionaries/disjunctions we have conjunctions.\n", + "\n", + "For example, say we want to predict if an animal (cat or dog) will take an umbrella given whether or not it rains or the animal wears a coat. The goal value is 'take an umbrella' and is denoted by the key 'GOAL'.
An example:\n", + "\n", + "`{'Species': 'Cat', 'Coat': 'Yes', 'Rain': 'Yes', 'GOAL': True}`\n", + "\n", + "A hypothesis can be the following:\n", + "\n", + "`[{'Species': 'Cat'}]`\n", + "\n", + "which means an animal will take an umbrella if and only if it is a cat.\n", + "\n", + "### Consistency\n", + "\n", + "We say that an example `e` is **consistent** with a hypothesis `h` if the assignment from the hypothesis for `e` is the same as `e['GOAL']`. If the above example and hypothesis are `e` and `h` respectively, then `e` is consistent with `h` since `e['Species'] == 'Cat'`. For `e = {'Species': 'Dog', 'Coat': 'Yes', 'Rain': 'Yes', 'GOAL': True}`, the example is no longer consistent with `h`, since the value assigned to `e` is *False* while `e['GOAL']` is *True*." + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "## CURRENT-BEST LEARNING\n", + "\n", + "### Overview\n", + "\n", + "In **Current-Best Learning**, we start with a hypothesis and we refine it as we iterate through the examples. For each example, there are three possible outcomes: the example is consistent with the hypothesis, the example is a **false positive** (real value is false but got predicted as true) and the example is a **false negative** (real value is true but got predicted as false). Depending on the outcome, we refine the hypothesis accordingly:\n", + "\n", + "* Consistent: We do not change the hypothesis and move on to the next example.\n", + "\n", + "* False Positive: We **specialize** the hypothesis, which means we add a conjunction.\n", + "\n", + "* False Negative: We **generalize** the hypothesis, either by removing a conjunction or a disjunction, or by adding a disjunction.\n", + "\n", + "When specializing or generalizing, we should make sure not to create inconsistencies with previous examples. To avoid this pitfall, backtracking is needed. 
Thankfully, there is usually more than one possible specialization or generalization, so we have plenty to choose from. We will go through all the specializations/generalizations and adopt the first one that is consistent with all the examples seen up to that point." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Pseudocode" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [ + { + "data": { + "text/markdown": [ + "### AIMA3e\n", + "__function__ Current-Best-Learning(_examples_, _h_) __returns__ a hypothesis or fail \n", + " __if__ _examples_ is empty __then__ \n", + "   __return__ _h_ \n", + " _e_ ← First(_examples_) \n", + " __if__ _e_ is consistent with _h_ __then__ \n", + "   __return__ Current-Best-Learning(Rest(_examples_), _h_) \n", + " __else if__ _e_ is a false positive for _h_ __then__ \n", + "   __for each__ _h'_ __in__ specializations of _h_ consistent with _examples_ seen so far __do__ \n", + "     _h''_ ← Current-Best-Learning(Rest(_examples_), _h'_) \n", + "     __if__ _h''_ ≠ _fail_ __then return__ _h''_ \n", + " __else if__ _e_ is a false negative for _h_ __then__ \n", + "   __for each__ _h'_ __in__ generalizations of _h_ consistent with _examples_ seen so far __do__ \n", + "     _h''_ ← Current-Best-Learning(Rest(_examples_), _h'_) \n", + "     __if__ _h''_ ≠ _fail_ __then return__ _h''_ \n", + " __return__ _fail_ \n", + "\n", + "---\n", + "__Figure ??__ The current-best-hypothesis learning algorithm. It searches for a consistent hypothesis that fits all the examples and backtracks when no consistent specialization/generalization can be found. To start the algorithm, any hypothesis can be passed in; it will be specialized or generalized as needed."
+ ], + "text/plain": [ + "" + ] + }, + "execution_count": 2, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "pseudocode('Current-Best-Learning')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Implementation\n", + "\n", + "As mentioned earlier, examples are dictionaries (with keys being the attribute names) and hypotheses are lists of dictionaries (each dictionary is a disjunction). Also, in the hypothesis, we denote the *NOT* operation with an exclamation mark (!).\n", + "\n", + "We have functions to calculate the list of all specializations/generalizations, and to check whether an example is consistent with, a false positive for, or a false negative for a hypothesis. We also have an auxiliary function to add a disjunction (or operation) to a hypothesis, and two other functions to check consistency of all (or just the negative) examples.\n", + "\n", + "You can read the source by running the cell below:" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": { + "collapsed": true + }, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def current_best_learning(examples, h, examples_so_far=None):\n",
    +       "    """ [Figure 19.2]\n",
    +       "    The hypothesis is a list of dictionaries, with each dictionary representing\n",
    +       "    a disjunction."""\n",
    +       "    if not examples:\n",
    +       "        return h\n",
    +       "\n",
    +       "    examples_so_far = examples_so_far or []\n",
    +       "    e = examples[0]\n",
    +       "    if is_consistent(e, h):\n",
    +       "        return current_best_learning(examples[1:], h, examples_so_far + [e])\n",
    +       "    elif false_positive(e, h):\n",
    +       "        for h2 in specializations(examples_so_far + [e], h):\n",
    +       "            h3 = current_best_learning(examples[1:], h2, examples_so_far + [e])\n",
    +       "            if h3 != 'FAIL':\n",
    +       "                return h3\n",
    +       "    elif false_negative(e, h):\n",
    +       "        for h2 in generalizations(examples_so_far + [e], h):\n",
    +       "            h3 = current_best_learning(examples[1:], h2, examples_so_far + [e])\n",
    +       "            if h3 != 'FAIL':\n",
    +       "                return h3\n",
    +       "\n",
    +       "    return 'FAIL'\n",
    +       "\n",
    +       "\n",
    +       "def specializations(examples_so_far, h):\n",
    +       "    """Specialize the hypothesis by adding AND operations to the disjunctions"""\n",
    +       "    hypotheses = []\n",
    +       "\n",
    +       "    for i, disj in enumerate(h):\n",
    +       "        for e in examples_so_far:\n",
    +       "            for k, v in e.items():\n",
    +       "                if k in disj or k == 'GOAL':\n",
    +       "                    continue\n",
    +       "\n",
    +       "                h2 = h[i].copy()\n",
    +       "                h2[k] = '!' + v\n",
    +       "                h3 = h.copy()\n",
    +       "                h3[i] = h2\n",
    +       "                if check_all_consistency(examples_so_far, h3):\n",
    +       "                    hypotheses.append(h3)\n",
    +       "\n",
    +       "    shuffle(hypotheses)\n",
    +       "    return hypotheses\n",
    +       "\n",
    +       "\n",
    +       "def generalizations(examples_so_far, h):\n",
    +       "    """Generalize the hypothesis. First delete operations\n",
    +       "    (including disjunctions) from the hypothesis. Then, add OR operations."""\n",
    +       "    hypotheses = []\n",
    +       "\n",
    +       "    # Delete disjunctions\n",
    +       "    disj_powerset = powerset(range(len(h)))\n",
    +       "    for disjs in disj_powerset:\n",
    +       "        h2 = h.copy()\n",
    +       "        for d in reversed(list(disjs)):\n",
    +       "            del h2[d]\n",
    +       "\n",
    +       "        if check_all_consistency(examples_so_far, h2):\n",
    +       "            hypotheses += h2\n",
    +       "\n",
    +       "    # Delete AND operations in disjunctions\n",
    +       "    for i, disj in enumerate(h):\n",
    +       "        a_powerset = powerset(disj.keys())\n",
    +       "        for attrs in a_powerset:\n",
    +       "            h2 = h[i].copy()\n",
    +       "            for a in attrs:\n",
    +       "                del h2[a]\n",
    +       "\n",
    +       "            if check_all_consistency(examples_so_far, [h2]):\n",
    +       "                h3 = h.copy()\n",
    +       "                h3[i] = h2.copy()\n",
    +       "                hypotheses += h3\n",
    +       "\n",
    +       "    # Add OR operations\n",
    +       "    if hypotheses == [] or hypotheses == [{}]:\n",
    +       "        hypotheses = add_or(examples_so_far, h)\n",
    +       "    else:\n",
    +       "        hypotheses.extend(add_or(examples_so_far, h))\n",
    +       "\n",
    +       "    shuffle(hypotheses)\n",
    +       "    return hypotheses\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(current_best_learning, specializations, generalizations)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can view the auxiliary functions in the [knowledge module](https://github.com/aimacode/aima-python/blob/master/knowledge.py). A few notes on the functionality of some of the important methods:" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "* `specializations`: For each disjunction in the hypothesis, it adds a conjunction for values in the examples encountered so far (if the conjunction is consistent with all the examples). It returns a list of hypotheses.\n", + "\n", + "* `generalizations`: It adds to the list of hypotheses in three phases. First it deletes disjunctions, then it deletes conjunctions and finally it adds a disjunction.\n", + "\n", + "* `add_or`: Used by `generalizations` to add an *or operation* (a disjunction) to the hypothesis. Since the last example is the problematic one which wasn't consistent with the hypothesis, it will model the new disjunction to that example. It creates a disjunction for each combination of attributes in the example and returns the new hypotheses consistent with the negative examples encountered so far. We do not need to check the consistency of positive examples, since they are already consistent with at least one other disjunction in the hypotheses' set, so this new disjunction doesn't affect them. In other words, if the value of a positive example is negative under the disjunction, it doesn't matter since we know there exists a disjunction consistent with the example." 
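To make the consistency checks used throughout this notebook concrete, here is a small, self-contained sketch of how a prediction and a consistency test could work under the dictionary representation. The helper names `matches`, `predict` and `consistent` are illustrative stand-ins, not the module's actual `guess_value`/`is_consistent`:

```python
def matches(e, disj):
    """True if example e satisfies every conjunct of one disjunction.
    A value prefixed with '!' means 'not equal to'."""
    for k, v in disj.items():
        if v.startswith('!'):
            if e.get(k) == v[1:]:
                return False
        elif e.get(k) != v:
            return False
    return True

def predict(e, h):
    """The hypothesis predicts True if any disjunction matches e."""
    return any(matches(e, disj) for disj in h)

def consistent(e, h):
    """An example is consistent when the prediction equals e['GOAL']."""
    return predict(e, h) == e['GOAL']

e = {'Species': 'Dog', 'Rain': 'Yes', 'Coat': 'Yes', 'GOAL': True}
h = [{'Species': 'Cat'}, {'Coat': 'Yes'}]
print(consistent(e, h))  # True: the {'Coat': 'Yes'} disjunction matches
```

A false positive is then simply `predict(e, h) and not e['GOAL']`, and a false negative is the reverse.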
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Since the algorithm stops searching the specializations/generalizations after the first consistent hypothesis is found, usually you will get different results each time you run the code." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Examples\n", + "\n", + "We will take a look at two examples. The first is a trivial one, while the second is a bit more complicated (you can also find it in the book).\n", + "\n", + "Earlier, we had the \"animals taking umbrellas\" example. Now we want to find a hypothesis to predict whether or not an animal will take an umbrella. The attributes are `Species`, `Rain` and `Coat`. The possible values are `[Cat, Dog]`, `[Yes, No]` and `[Yes, No]` respectively. Below we give seven examples (with `GOAL` we denote whether an animal will take an umbrella or not):" + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [], + "source": [ + "animals_umbrellas = [\n", + " {'Species': 'Cat', 'Rain': 'Yes', 'Coat': 'No', 'GOAL': True},\n", + " {'Species': 'Cat', 'Rain': 'Yes', 'Coat': 'Yes', 'GOAL': True},\n", + " {'Species': 'Dog', 'Rain': 'Yes', 'Coat': 'Yes', 'GOAL': True},\n", + " {'Species': 'Dog', 'Rain': 'Yes', 'Coat': 'No', 'GOAL': False},\n", + " {'Species': 'Dog', 'Rain': 'No', 'Coat': 'No', 'GOAL': False},\n", + " {'Species': 'Cat', 'Rain': 'No', 'Coat': 'No', 'GOAL': False},\n", + " {'Species': 'Cat', 'Rain': 'No', 'Coat': 'Yes', 'GOAL': True}\n", + "]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let our initial hypothesis be `[{'Species': 'Cat'}]`. That means every cat will be taking an umbrella. We can see that this is not true, but it doesn't matter since we will refine the hypothesis using the Current-Best algorithm. First, let's see how that initial hypothesis fares, so that we have a point of reference."
+ ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "True\n", + "True\n", + "False\n", + "False\n", + "False\n", + "True\n", + "True\n" + ] + } + ], + "source": [ + "initial_h = [{'Species': 'Cat'}]\n", + "\n", + "for e in animals_umbrellas:\n", + " print(guess_value(e, initial_h))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We got 5/7 correct. Not terribly bad, but we can do better. Let's now run the algorithm and see how it performs in comparison to our current result. " + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "True\n", + "True\n", + "True\n", + "False\n", + "False\n", + "False\n", + "True\n" + ] + } + ], + "source": [ + "h = current_best_learning(animals_umbrellas, initial_h)\n", + "\n", + "for e in animals_umbrellas:\n", + " print(guess_value(e, h))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We got everything right! Let's print our hypothesis:" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[{'Species': 'Cat', 'Rain': '!No'}, {'Species': 'Dog', 'Coat': 'Yes'}, {'Coat': 'Yes'}]\n" + ] + } + ], + "source": [ + "print(h)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "If an example meets any of the disjunctions in the list, it will be `True`, otherwise it will be `False`.\n", + "\n", + "Let's move on to a bigger example, the \"Restaurant\" example from the book. 
The attributes for each example are the following:\n", + "\n", + "* Alternative option (`Alt`)\n", + "* Bar to hang out/wait (`Bar`)\n", + "* Day is Friday (`Fri`)\n", + "* Is hungry (`Hun`)\n", + "* How much does it cost (`Price`, takes values in [$, $$, $$$])\n", + "* How many patrons are there (`Pat`, takes values in [None, Some, Full])\n", + "* Is raining (`Rain`)\n", + "* Has made reservation (`Res`)\n", + "* Type of restaurant (`Type`, takes values in [French, Thai, Burger, Italian])\n", + "* Estimated waiting time (`Est`, takes values in [0-10, 10-30, 30-60, >60])\n", + "\n", + "We want to predict if someone will wait or not (Goal = WillWait). Below we show twelve examples found in the book." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "![restaurant](images/restaurant.png)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "With the function `r_example` we will build the dictionary examples:" + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": {}, + "outputs": [], + "source": [ + "def r_example(Alt, Bar, Fri, Hun, Pat, Price, Rain, Res, Type, Est, GOAL):\n", + " return {'Alt': Alt, 'Bar': Bar, 'Fri': Fri, 'Hun': Hun, 'Pat': Pat,\n", + " 'Price': Price, 'Rain': Rain, 'Res': Res, 'Type': Type, 'Est': Est,\n", + " 'GOAL': GOAL}" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "In code:" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": {}, + "outputs": [], + "source": [ + "restaurant = [\n", + " r_example('Yes', 'No', 'No', 'Yes', 'Some', '$$$', 'No', 'Yes', 'French', '0-10', True),\n", + " r_example('Yes', 'No', 'No', 'Yes', 'Full', '$', 'No', 'No', 'Thai', '30-60', False),\n", + " r_example('No', 'Yes', 'No', 'No', 'Some', '$', 'No', 'No', 'Burger', '0-10', True),\n", + " r_example('Yes', 'No', 'Yes', 'Yes', 'Full', '$', 'Yes', 'No', 'Thai', '10-30', True),\n", + " r_example('Yes', 'No', 'Yes', 'No', 'Full', 
'$$$', 'No', 'Yes', 'French', '>60', False),\n", + " r_example('No', 'Yes', 'No', 'Yes', 'Some', '$$', 'Yes', 'Yes', 'Italian', '0-10', True),\n", + " r_example('No', 'Yes', 'No', 'No', 'None', '$', 'Yes', 'No', 'Burger', '0-10', False),\n", + " r_example('No', 'No', 'No', 'Yes', 'Some', '$$', 'Yes', 'Yes', 'Thai', '0-10', True),\n", + " r_example('No', 'Yes', 'Yes', 'No', 'Full', '$', 'Yes', 'No', 'Burger', '>60', False),\n", + " r_example('Yes', 'Yes', 'Yes', 'Yes', 'Full', '$$$', 'No', 'Yes', 'Italian', '10-30', False),\n", + " r_example('No', 'No', 'No', 'No', 'None', '$', 'No', 'No', 'Thai', '0-10', False),\n", + " r_example('Yes', 'Yes', 'Yes', 'Yes', 'Full', '$', 'No', 'No', 'Burger', '30-60', True)\n", + "]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Say our initial hypothesis is that there should be an alternative option and let's run the algorithm." + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "True\n", + "False\n", + "True\n", + "True\n", + "False\n", + "True\n", + "False\n", + "True\n", + "False\n", + "False\n", + "False\n", + "True\n" + ] + } + ], + "source": [ + "initial_h = [{'Alt': 'Yes'}]\n", + "h = current_best_learning(restaurant, initial_h)\n", + "for e in restaurant:\n", + " print(guess_value(e, h))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The predictions are correct. 
Let's see the hypothesis that accomplished that:" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[{'Alt': 'Yes', 'Type': '!Thai', 'Hun': '!No', 'Bar': '!Yes'}, {'Alt': 'No', 'Fri': 'No', 'Pat': 'Some', 'Price': '$', 'Type': 'Burger', 'Est': '0-10'}, {'Rain': 'Yes', 'Res': 'No', 'Type': '!Burger'}, {'Alt': 'No', 'Bar': 'Yes', 'Hun': 'Yes', 'Pat': 'Some', 'Price': '$$', 'Rain': 'Yes', 'Res': 'Yes', 'Est': '0-10'}, {'Alt': 'No', 'Bar': 'No', 'Pat': 'Some', 'Price': '$$', 'Est': '0-10'}, {'Alt': 'Yes', 'Hun': 'Yes', 'Pat': 'Full', 'Price': '$', 'Res': 'No', 'Type': 'Burger', 'Est': '30-60'}]\n" + ] + } + ], + "source": [ + "print(h)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "It might be quite complicated, with many disjunctions if we are unlucky, but it will always be correct, as long as a correct hypothesis exists." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.7" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/knowledge_version_space.ipynb b/knowledge_version_space.ipynb new file mode 100644 index 000000000..8c8ec29f5 --- /dev/null +++ b/knowledge_version_space.ipynb @@ -0,0 +1,1088 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# KNOWLEDGE\n", + "\n", + "The [knowledge](https://github.com/aimacode/aima-python/blob/master/knowledge.py) module covers **Chapter 19: Knowledge in Learning** from Stuart Russell's and Peter Norvig's book *Artificial
Intelligence: A Modern Approach*.\n", + "\n", + "Execute the cell below to get started." + ] + }, + { + "cell_type": "code", + "execution_count": 50, + "metadata": {}, + "outputs": [], + "source": [ + "from knowledge import *\n", + "\n", + "from notebook import pseudocode, psource" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## CONTENTS\n", + "\n", + "* Overview\n", + "* Version-Space Learning" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## OVERVIEW\n", + "\n", + "Like the [learning module](https://github.com/aimacode/aima-python/blob/master/learning.ipynb), this chapter focuses on methods for generating a model/hypothesis for a domain. Unlike the learning chapter, though, here we use prior knowledge to help us learn from new experiences and find a proper hypothesis.\n", + "\n", + "### First-Order Logic\n", + "\n", + "Usually knowledge in this field is represented as **first-order logic**, a type of logic that uses variables and quantifiers in logical sentences. Hypotheses are represented by logical sentences with variables, while examples are logical sentences with fixed values instead of variables. The goal is to assign a value to a special first-order logic predicate, called the **goal predicate**, for new examples given a hypothesis. We learn this hypothesis by inferring knowledge from some given examples.\n", + "\n", + "### Representation\n", + "\n", + "In this module, we use dictionaries to represent examples, with keys being the attribute names and values being the corresponding example values. Examples also have an extra boolean field, 'GOAL', for the goal predicate. A hypothesis is represented as a list of dictionaries. Each dictionary in that list represents a disjunction. Inside these dictionaries/disjunctions we have conjunctions.\n", + "\n", + "For example, say we want to predict if an animal (cat or dog) will take an umbrella given whether or not it rains and whether or not the animal wears a coat. 
The goal value is 'take an umbrella' and is denoted by the key 'GOAL'. An example:\n", + "\n", + "`{'Species': 'Cat', 'Coat': 'Yes', 'Rain': 'Yes', 'GOAL': True}`\n", + "\n", + "A hypothesis can be the following:\n", + "\n", + "`[{'Species': 'Cat'}]`\n", + "\n", + "which means an animal will take an umbrella if and only if it is a cat.\n", + "\n", + "### Consistency\n", + "\n", + "We say that an example `e` is **consistent** with a hypothesis `h` if the assignment from the hypothesis for `e` is the same as `e['GOAL']`. If the above example and hypothesis are `e` and `h` respectively, then `e` is consistent with `h` since `e['Species'] == 'Cat'`. For `e = {'Species': 'Dog', 'Coat': 'Yes', 'Rain': 'Yes', 'GOAL': True}`, the example is no longer consistent with `h`, since the value assigned to `e` is *False* while `e['GOAL']` is *True*." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## VERSION-SPACE LEARNING\n", + "\n", + "### Overview\n", + "\n", + "**Version-Space Learning** is a general method of learning in logic-based domains. We generate the set of all the possible hypotheses in the domain and then we iteratively remove hypotheses inconsistent with the examples. The set of remaining hypotheses is called the **version space**. Because hypotheses are being removed until we end up with a set of hypotheses consistent with all the examples, the algorithm is sometimes called the **candidate elimination** algorithm.\n", + "\n", + "After we update the set on an example, all the hypotheses in the set are consistent with that example. So, when all the examples have been processed, all the remaining hypotheses in the set are consistent with all the examples. That means we can pick hypotheses at random and we will always get a valid hypothesis."
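The core of this loop is simple filtering. Here is a toy sketch with a hand-enumerated hypothesis space (the real module generates the space automatically and also handles the '!' notation, which this sketch omits):

```python
def predict(e, h):
    """True if any disjunction of hypothesis h matches example e."""
    return any(all(e.get(k) == v for k, v in disj.items()) for disj in h)

examples = [
    {'Pizza': 'Yes', 'Soda': 'No', 'GOAL': True},
    {'Pizza': 'No', 'Soda': 'No', 'GOAL': False},
]

# A hand-picked candidate set, for illustration only.
V = [
    [{'Pizza': 'Yes'}],                    # party iff pizza
    [{'Soda': 'Yes'}],                     # party iff soda
    [{'Pizza': 'Yes'}, {'Soda': 'Yes'}],   # party iff pizza or soda
]

# Candidate elimination: keep only hypotheses consistent with each example.
for e in examples:
    V = [h for h in V if predict(e, h) == e['GOAL']]

print(V)  # the soda-only hypothesis is eliminated by the first example
```

Each surviving hypothesis in `V` agrees with every example, which is exactly the property the overview describes.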
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Pseudocode" + ] + }, + { + "cell_type": "code", + "execution_count": 32, + "metadata": {}, + "outputs": [ + { + "data": { + "text/markdown": [ + "### AIMA3e\n", + "__function__ Version-Space-Learning(_examples_) __returns__ a version space \n", + " __local variables__: _V_, the version space: the set of all hypotheses \n", + "\n", + " _V_ ← the set of all hypotheses \n", + " __for each__ example _e_ in _examples_ __do__ \n", + "   __if__ _V_ is not empty __then__ _V_ ← Version-Space-Update(_V_, _e_) \n", + " __return__ _V_ \n", + "\n", + "---\n", + "__function__ Version-Space-Update(_V_, _e_) __returns__ an updated version space \n", + " _V_ ← \\{_h_ ∈ _V_ : _h_ is consistent with _e_\\} \n", + "\n", + "---\n", + "__Figure ??__ The version space learning algorithm. It finds a subset of _V_ that is consistent with all the _examples_." + ], + "text/plain": [ + "" + ] + }, + "execution_count": 32, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "pseudocode('Version-Space-Learning')" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "### Implementation\n", + "\n", + "The set of hypotheses is represented by a list and each hypothesis is represented by a list of dictionaries, each dictionary a disjunction. For each example in the given examples we update the version space with the function `version_space_update`. In the end, we return the version-space.\n", + "\n", + "Before we can start updating the version space, we need to generate it. We do that with the `all_hypotheses` function, which builds a list of all the possible hypotheses (including hypotheses with disjunctions). 
The function works like this: first it finds the possible values for each attribute (using `values_table`), then it builds all the attribute combinations (and adds them to the hypotheses set) and finally it builds the combinations of all the disjunctions (which in this case are the hypotheses built by the attribute combinations).\n", + "\n", + "You can read the code for all the functions by running the cells below:" + ] + }, + { + "cell_type": "code", + "execution_count": 33, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def version_space_learning(examples):\n",
    +       "    """ [Figure 19.3]\n",
    +       "    The version space is a list of hypotheses, which in turn are a list\n",
    +       "    of dictionaries/disjunctions."""\n",
    +       "    V = all_hypotheses(examples)\n",
    +       "    for e in examples:\n",
    +       "        if V:\n",
    +       "            V = version_space_update(V, e)\n",
    +       "\n",
    +       "    return V\n",
    +       "\n",
    +       "\n",
    +       "def version_space_update(V, e):\n",
    +       "    return [h for h in V if is_consistent(e, h)]\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(version_space_learning, version_space_update)" + ] + }, + { + "cell_type": "code", + "execution_count": 34, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def all_hypotheses(examples):\n",
    +       "    """Build a list of all the possible hypotheses"""\n",
    +       "    values = values_table(examples)\n",
    +       "    h_powerset = powerset(values.keys())\n",
    +       "    hypotheses = []\n",
    +       "    for s in h_powerset:\n",
    +       "        hypotheses.extend(build_attr_combinations(s, values))\n",
    +       "\n",
    +       "    hypotheses.extend(build_h_combinations(hypotheses))\n",
    +       "\n",
    +       "    return hypotheses\n",
    +       "\n",
    +       "\n",
    +       "def values_table(examples):\n",
    +       "    """Build a table with all the possible values for each attribute.\n",
    +       "    Returns a dictionary with keys the attribute names and values a list\n",
    +       "    with the possible values for the corresponding attribute."""\n",
    +       "    values = defaultdict(lambda: [])\n",
    +       "    for e in examples:\n",
    +       "        for k, v in e.items():\n",
    +       "            if k == 'GOAL':\n",
    +       "                continue\n",
    +       "\n",
    +       "            mod = '!'\n",
    +       "            if e['GOAL']:\n",
    +       "                mod = ''\n",
    +       "\n",
    +       "            if mod + v not in values[k]:\n",
    +       "                values[k].append(mod + v)\n",
    +       "\n",
    +       "    values = dict(values)\n",
    +       "    return values\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(all_hypotheses, values_table)" + ] + }, + { + "cell_type": "code", + "execution_count": 35, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def build_attr_combinations(s, values):\n",
    +       "    """Given a set of attributes, builds all the combinations of values.\n",
    +       "    If the set holds more than one attribute, recursively builds the\n",
    +       "    combinations."""\n",
    +       "    if len(s) == 1:\n",
    +       "        # s holds just one attribute, return its list of values\n",
    +       "        k = values[s[0]]\n",
    +       "        h = [[{s[0]: v}] for v in values[s[0]]]\n",
    +       "        return h\n",
    +       "\n",
    +       "    h = []\n",
    +       "    for i, a in enumerate(s):\n",
    +       "        rest = build_attr_combinations(s[i+1:], values)\n",
    +       "        for v in values[a]:\n",
    +       "            o = {a: v}\n",
    +       "            for r in rest:\n",
    +       "                t = o.copy()\n",
    +       "                for d in r:\n",
    +       "                    t.update(d)\n",
    +       "                h.append([t])\n",
    +       "\n",
    +       "    return h\n",
    +       "\n",
    +       "\n",
    +       "def build_h_combinations(hypotheses):\n",
    +       "    """Given a set of hypotheses, builds and returns all the combinations of the\n",
    +       "    hypotheses."""\n",
    +       "    h = []\n",
    +       "    h_powerset = powerset(range(len(hypotheses)))\n",
    +       "\n",
    +       "    for s in h_powerset:\n",
    +       "        t = []\n",
    +       "        for i in s:\n",
    +       "            t.extend(hypotheses[i])\n",
    +       "        h.append(t)\n",
    +       "\n",
    +       "    return h\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(build_attr_combinations, build_h_combinations)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Example\n", + "\n", + "Since the set of all possible hypotheses is enormous and would take a long time to generate, we will come up with another, even smaller domain. We will try and predict whether we will have a party or not given the availability of pizza and soda. Let's do it:" + ] + }, + { + "cell_type": "code", + "execution_count": 36, + "metadata": {}, + "outputs": [], + "source": [ + "party = [\n", + " {'Pizza': 'Yes', 'Soda': 'No', 'GOAL': True},\n", + " {'Pizza': 'Yes', 'Soda': 'Yes', 'GOAL': True},\n", + " {'Pizza': 'No', 'Soda': 'No', 'GOAL': False}\n", + "]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Even though it is obvious that no-pizza no-party, we will run the algorithm and see what other hypotheses are valid." + ] + }, + { + "cell_type": "code", + "execution_count": 37, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "True\n", + "True\n", + "False\n" + ] + } + ], + "source": [ + "V = version_space_learning(party)\n", + "for e in party:\n", + " guess = False\n", + " for h in V:\n", + " if guess_value(e, h):\n", + " guess = True\n", + " break\n", + "\n", + " print(guess)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The results are correct for the given examples. 
Let's take a look at the version space:" + ] + }, + { + "cell_type": "code", + "execution_count": 38, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "959\n", + "[{'Pizza': 'Yes'}, {'Soda': 'Yes'}]\n", + "[{'Pizza': 'Yes'}, {'Pizza': '!No', 'Soda': 'No'}]\n", + "True\n" + ] + } + ], + "source": [ + "print(len(V))\n", + "\n", + "print(V[5])\n", + "print(V[10])\n", + "\n", + "print([{'Pizza': 'Yes'}] in V)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "There are almost 1000 hypotheses in the set. You can see that even with just two attributes the version space is very large.\n", + "\n", + "Our initial prediction is indeed in the set of hypotheses. Also, the two other random hypotheses we got are consistent with the examples (since they both include the \"Pizza is available\" disjunction)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Minimal Consistent Determination" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This algorithm is based on a straightforward attempt to find the simplest determination consistent with the observations. A determination P > Q says that if any examples match on P, then they must also match on Q. A determination is therefore consistent with a set of examples if every pair that matches on the predicates on the left-hand side also matches on the goal predicate."
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Pseudocode" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's look at the pseudocode for this algorithm:" + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "metadata": {}, + "outputs": [ + { + "data": { + "text/markdown": [ + "### AIMA3e\n", + "__function__ Minimal-Consistent-Det(_E_, _A_) __returns__ a set of attributes \n", + " __inputs__: _E_, a set of examples \n", + "     _A_, a set of attributes, of size _n_ \n", + "\n", + " __for__ _i_ = 0 __to__ _n_ __do__ \n", + "   __for each__ subset _Ai_ of _A_ of size _i_ __do__ \n", + "     __if__ Consistent-Det?(_Ai_, _E_) __then return__ _Ai_ \n", + "\n", + "---\n", + "__function__ Consistent-Det?(_A_, _E_) __returns__ a truth value \n", + " __inputs__: _A_, a set of attributes \n", + "     _E_, a set of examples \n", + " __local variables__: _H_, a hash table \n", + "\n", + " __for each__ example _e_ __in__ _E_ __do__ \n", + "   __if__ some example in _H_ has the same values as _e_ for the attributes _A_ \n", + "    but a different classification __then return__ _false_ \n", + "   store the class of _e_ in _H_, indexed by the values for attributes _A_ of the example _e_ \n", + " __return__ _true_ \n", + "\n", + "---\n", + "__Figure ??__ An algorithm for finding a minimal consistent determination." + ], + "text/plain": [ + "" + ] + }, + "execution_count": 47, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "pseudocode('Minimal-Consistent-Det')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can read the code for the above algorithm by running the cells below:" + ] + }, + { + "cell_type": "code", + "execution_count": 48, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def minimal_consistent_det(E, A):\n",
+       "    """Return a minimal set of attributes which gives a consistent determination."""\n",
    +       "    n = len(A)\n",
    +       "\n",
    +       "    for i in range(n + 1):\n",
    +       "        for A_i in combinations(A, i):\n",
    +       "            if consistent_det(A_i, E):\n",
    +       "                return set(A_i)\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(minimal_consistent_det)" + ] + }, + { + "cell_type": "code", + "execution_count": 49, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def consistent_det(A, E):\n",
+       "    """Check if the attributes (A) are consistent with the examples (E)."""\n",
    +       "    H = {}\n",
    +       "\n",
    +       "    for e in E:\n",
    +       "        attr_values = tuple(e[attr] for attr in A)\n",
    +       "        if attr_values in H and H[attr_values] != e['GOAL']:\n",
    +       "            return False\n",
    +       "        H[attr_values] = e['GOAL']\n",
    +       "\n",
    +       "    return True\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(consistent_det)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Example:" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We already know that no-pizza-no-party but we will still check it through the `minimal_consistent_det` algorithm." + ] + }, + { + "cell_type": "code", + "execution_count": 39, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "{'Pizza'}\n" + ] + } + ], + "source": [ + "print(minimal_consistent_det(party, {'Pizza', 'Soda'}))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can also check it on some other example. Let's consider the following example :" + ] + }, + { + "cell_type": "code", + "execution_count": 40, + "metadata": {}, + "outputs": [], + "source": [ + "conductance = [\n", + " {'Sample': 'S1', 'Mass': 12, 'Temp': 26, 'Material': 'Cu', 'Size': 3, 'GOAL': 0.59},\n", + " {'Sample': 'S1', 'Mass': 12, 'Temp': 100, 'Material': 'Cu', 'Size': 3, 'GOAL': 0.57},\n", + " {'Sample': 'S2', 'Mass': 24, 'Temp': 26, 'Material': 'Cu', 'Size': 6, 'GOAL': 0.59},\n", + " {'Sample': 'S3', 'Mass': 12, 'Temp': 26, 'Material': 'Pb', 'Size': 2, 'GOAL': 0.05},\n", + " {'Sample': 'S3', 'Mass': 12, 'Temp': 100, 'Material': 'Pb', 'Size': 2, 'GOAL': 0.04},\n", + " {'Sample': 'S4', 'Mass': 18, 'Temp': 100, 'Material': 'Pb', 'Size': 3, 'GOAL': 0.04},\n", + " {'Sample': 'S4', 'Mass': 18, 'Temp': 100, 'Material': 'Pb', 'Size': 3, 'GOAL': 0.04},\n", + " {'Sample': 'S5', 'Mass': 24, 'Temp': 100, 'Material': 'Pb', 'Size': 4, 'GOAL': 0.04},\n", + " {'Sample': 'S6', 'Mass': 36, 'Temp': 26, 'Material': 'Pb', 'Size': 6, 'GOAL': 0.05},\n", + "]\n", + "\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now, we check the `minimal_consistent_det` algorithm on the above 
example:" + ] + }, + { + "cell_type": "code", + "execution_count": 41, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "{'Temp', 'Material'}\n" + ] + } + ], + "source": [ + "print(minimal_consistent_det(conductance, {'Mass', 'Temp', 'Material', 'Size'}))" + ] + }, + { + "cell_type": "code", + "execution_count": 43, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "{'Temp', 'Size', 'Mass'}\n" + ] + } + ], + "source": [ + "print(minimal_consistent_det(conductance, {'Mass', 'Temp', 'Size'}))\n" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/learning.ipynb b/learning.ipynb index 55e80bb14..0cadd4e7b 100644 --- a/learning.ipynb +++ b/learning.ipynb @@ -12,12 +12,11 @@ { "cell_type": "code", "execution_count": 1, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "from learning import *\n", + "from probabilistic_learning import *\n", "from notebook import *" ] }, @@ -34,12 +33,10 @@ "* Plurality Learner\n", "* k-Nearest Neighbours\n", "* Decision Tree Learner\n", + "* Random Forest Learner\n", "* Naive Bayes Learner\n", "* Perceptron\n", - "* Learner Evaluation\n", - "* MNIST Handwritten Digits\n", - " * Loading and Visualising\n", - " * Testing" + "* Learner Evaluation" ] }, { @@ -127,7 +124,7 @@ "\n", "* **examples**: Holds the items of the dataset. Each item is a list of values.\n", "\n", - "* **attrs**: The indexes of the features (by default in the range of [0,f), where *f* is the number of features. 
For example, `item[i]` returns the feature at index *i* of *item*.\n", + "* **attrs**: The indexes of the features (by default in the range of [0,f), where *f* is the number of features). For example, `item[i]` returns the feature at index *i* of *item*.\n", "\n", "* **attrnames**: An optional list with attribute names. For example, `item[s]`, where *s* is a feature name, returns the feature of name *s* in *item*.\n", "\n", @@ -982,7 +979,7 @@ "\n", "$$I_G(p) = \\sum{p_i(1 - p_i)} = 1 - \\sum{p_i^2}$$\n", "\n", - "We select split which minimizes the Gini impurity in childre nodes.\n", + "We select a split which minimizes the Gini impurity in child nodes.\n", "\n", "#### Information Gain\n", "Information gain is based on the concept of entropy from information theory. Entropy is defined as:\n", @@ -999,9 +996,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "pseudocode(\"Decision Tree Learning\")" @@ -1018,9 +1013,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "psource(DecisionFork)" @@ -1068,13 +1061,132 @@ "source": [ "The implementation of `DecisionTreeLearner` provided in [learning.py](https://github.com/aimacode/aima-python/blob/master/learning.py) uses information gain as the metric for selecting which attribute to test for splitting. The function builds the tree top-down in a recursive manner. Based on the input it makes one of the four choices:\n", "
\n",
-        "1. If the input at the current step has no training data we return the mode of classes of input data recieved in the parent step (previous level of recursion).\n",
+        "1. If the input at the current step has no training data we return the mode of classes of input data received in the parent step (previous level of recursion).\n",
+        "2. If all values in training data belong to the same class it returns a `DecisionLeaf` whose class label is the class which all the data belongs to.\n",
+        "3. If the data has no attributes that can be tested we return the class with highest plurality value in the training data.\n",
+        "4. We choose the attribute which gives the highest amount of entropy gain and return a `DecisionFork` which splits based on this attribute. Each branch recursively calls `decision_tree_learning` to construct the sub-tree.\n", + "
    " ] }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Example\n", + "\n", + "We will now use the Decision Tree Learner to classify a sample with values: 5.1, 3.0, 1.1, 0.1." + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "setosa\n" + ] + } + ], + "source": [ + "iris = DataSet(name=\"iris\")\n", + "\n", + "DTL = DecisionTreeLearner(iris)\n", + "print(DTL([5.1, 3.0, 1.1, 0.1]))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As expected, the Decision Tree learner classifies the sample as \"setosa\" as seen in the previous section." + ] + }, + { + "attachments": { + "0_tG-IWcxL1jg7RkT0.png": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAABkAAAAQ+CAIAAAAI2WRoAACAAElEQVR42uzdWXBc153f8Xtv9+0daKyNfd9JAiDAfRFXUYslUrQ5sjmKPTXjmoekUnmYquQllarkKY+pSk1VqiaOS7HksRSNtZqyuO+kSAIkRRDggoVYiH1H79u9N0UcuQ0DIEVyTLkpfT/lkkGwAfS9BHD+/Tvn/I/ZMAwJAAAAAAAASFYKtwAAAAAAAADJjAALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAA
AAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJjQALAAAAAAAASY0ACwAAAAAAAEmNAAsAAAAAAABJzcwtAPBE4vG4Pk+W5WUfYBiGaZ6iEJE/AcMw4vG4pmlms9lkMj3s9i6l67okSfI8biMA4FmMUJqmieHmYQ+QZVkM/QxGT3RjdV3XNE3cvccfyo15kiQlai3xR24+gO82AiwAT0DX9Zs3b7a3t09MTIiaSRRMCZqmWa3WvLy8bdu2eTwe7tjjl7Bzc3OXL1++d+9ebW3t1q1bVVV9xIPFv4Wu64ZhBIPBeDxut9ttNpssy7FYTNM0k8lkNpvFvxHlLADgXzNCeb3etra2e/fuiahl2VHJ6XQ2NDSUlZXZbDZu2mOKRqMDAwM3btyw2Wzr1q3Lzs5+xJBtLBAKhcLhsKqqLpdLURRN06LRqCzLFotFZIgM/QC+kwiwADxZFdva2vrb3/62v79/aTIiZmjtdntdXV1VVdWj67Bn9PTEZKZhGGaz+XnJbkQleuXKlffee29oaOjgwYObNm16WIAlrtHr9Q4MDAwODs7NzU1PT4dCIbfb3djYWFtbOzo6ev369Wg0Wl9fX1pampKSwnw4AOBfY3Z29vTp0ydOnAiFQiaTadHclfijx+ORZTk3N9dqtX6bg05iUkfTNOUPnpcbOzg4eOzYsZMnT65atWrFihXZ2dkPu0ZZlqPR6OTk5P3796enp2dmZqanp1VVLSoqWrNmjclkunnz5v379ysrK8vKyjIzMy0WCzNYAL57CLAAPFmZODQ0dPfu3YmJibS0tJycHIvFsnBPga7r1nl/kZpJ1/X79+93dnZqmrZx48bU1FSTyZT8dzUSidy5c+fdd99taWlZv3796tWrH/G0Y7HY5OTkkSNHzp075/P53G737Oxsa2trNBr9u7/7u9LS0tnZ2RMnTrS0tNTU1Lzwwgtbt26tqKhwuVx89wIAnk44HO7r67t165bJZMrNzU1PT4/H4wtjLLEDLvH2t/z0/H7/vXv3RkZGysrKCgsLHQ5H8gc3uq77/f6jR49+8MEHZrO5pqYmIyNDlmWRVS2tvqLRaGtr69mzZ+/duycu8Nq1a8PDwzU1Nf/1v/7XoqKirq6uQ4cO2e32lStXvvDCCw0NDZmZmXzrAviOIcAC8ARkWdZ1PR6PWyyWlStX7tq1KzMzMx6PJwoswzBUVc3JycnPzxcV2LKl2DMSi8XOnDnz3nvv2Wy24uJip9P5XCw+Gh0dPXLk
yOXLl7Ozs/fs2VNfXy+Wjy2tX2VZHh8fP3LkyC9/+ctAILB37949e/ZMTU0Fg8H79++npqa6XK6amppXXnlF07SWlpa7d+92d3cfOHBg9erVTqeTmVgAwFMwDCMWi+m6XlZWtmfPnsbGxkgkkpi+SmwhXLlypd1u/5aHfl3XBwcHf/vb37a1tb311lvp6ekOhyP5b2koFPrqq6+OHj06PDz81ltvbdmyxe12Pyz+i0ajt2/ffv/997/88sumpqYdO3aIlW6XL1/OzMy02+1ZWVlbtmyJxWInTpw4dOjQvXv39u7d+8ILL2RnZ9OQFMB3CQEWgKcpZF0u1+rVq19//fWSkpJFXV3FNGyiBca3GZpEIpHLly+fOXOmpqYmEoks2uOQtHp6ev7lX/5lZmbm4MGD27dvF8v+l63+NU27devW//yf/3NoaOjNN988ePBgZWVlIBCIx+OTk5ONjY2qqtrt9pdffrm5ufmTTz55d55ok7Fu3ToCLADA079sMJuLioo2bdq0a9cuTdMWDbKiAdO3v38/Fovdv3//1KlTXV1dL7/8sqZpz8XN9Hq9n3322Z07d8rKyl566aWcnJxlb5ooBmZnZz/44IOjR4/m5OQcOHBgw4YNDodDVdXm5ma73Z6Xl6eqanV1dUFBQUNDw3vvvXfy5MnR0VFd1/ft20dLMgDfqZGIWwDgKciybLVa3W53SkrKwx7zsGXwi87KeXSZu/DxiS+96KMSn80wjEgkEg6HdV0XexkenWE94ukl/iqxjmzRBy66kKXPR6xHS/zVwy5zaGjo6tWrg4ODFRUVTU1NOTk5D3tukiTdu3fv4sWLvb29paWl69evLy8vt9lsqqq+8MILkUgksWXS6XQ6HI79+/fruv4//sf/OHnyZGZmZkZGRllZ2bJruwAAeJyh32KxOBwOsczq8Uf/RUP5N47+Dxv6F76R+Ftxhq9oar7oMywcr79x6F/6mIeVEIvO+1v0lL7xMg3DCIfDd+7cOX/+vMVi2bBhQ1VVlZi7Wpbf729razt79qymac3NzY2Njenp6ZIk1dbWFhUVybKcmpoqSZJlXnNzs6IoPp/vypUrH330UV5e3urVqx9RqgHA84VXMgCehujX/uh5zmUznVgsNjs7OzMzI0lSRkZGWlqa2WxeelxO4vG6rs/NzXm93kAgIFZ+ZWRkOJ1OkdQkSsxIJOLz+WZmZiKRiCRJYkXSxMSEOIxPVNuyLIdCIb/fL8tySkqKxWJZWlbG4/FAIBAOh61Wq8PhSCyGkiQpGAwGAgFFUVJTU1VVjcfj4ivabLa0tDSn07mwlWw0GhU9VlVVzcrKSklJWfYyJUlqa2u7cOGCoig7d+5cuXKlqqoP23mh6/qtW7cuXLgQj8dXrVpVXV1ttVoNwzCZTHl5eUsfX1paunfv3t7e3t/97neHDh0qKSnJy8ujGRYA4KktXXj1jaO/GMr9fv/c3Fw0GrXb7ampqXa7fdm1WuKTa5oWDof9fn8gEIhGoxaLJSUlxeVyicVEiQ+JxWJ+vz8YDPp8PvH+QCAwMzNjNpsNw7BYLE6nU1VVTdN8Pl8sFrPZbHa7fekZKeIolVAoJMuy3W63Wq2JObB4PO73+8UBNQ6HQ9Qbs7OzmqaJpyQ6FYjLjMfjwWBwZmYmFos5HI60tDS73b70OGBZlqenp1taWnp7e7du3bpt2za32/2w7leSJE1PT1+5cqWnp6esrKyhoUGkV6IiSozpiY91Op1r1649cODAxMRES0vLxx9/7PF4ampq+NYFQIAFAI/LMIyZmZm+vr7e3t6heZIkFRQUlJSUFBcXV1VVLZ0eFAftibP2RkdHZ2ZmDMPIyMgoLi4WpxympaWJEjMYDN69e/fKlSuTk5NdXV2SJM3MzBw6dOjGjRuSJKWlpdXU1DQ1Ndlstlu3bl2/fl3spysrK1vaJmNqaqq1tXVgYEBMWhYXF4svEQ6Hb968efXqVbFBz2Kx3Llz5/bt28PDw263
e9euXY2NjSLUm56e7unp6evrE5dptVrFNZaUlCS+YiJ30zStvb39xo0bmZmZO3bsKC4uftjR14ZhzM7Otre3d3R0SJJUXV2dk5MTj8dlWVZV9WHz2AUFBT//+c8HBgZOzdu+fXt5eTmLsAAAT+dJ9waGw+Hx8fHu7u6BgYHh4eFgMOh2u3NycsrLyysqKrKzs8XsTuLx4qA9MfqPjIyIY3YdDof4kBUrVng8HhFjxePxkZGRCxcujI6OdnV1TU5Oijbn8Xjc5XKpqlpcXLxmzZqcnJy5ubnz58+Pj4/X1tbW1dVlZWUtepLxeLyzs7OtrU1V1bq6upqaGjFY67o+Ojp66dIln88n3j8xMdHR0dHf36/r+qpVqzZs2JCamiryr+Hh4e7u7sHBweHh4VAolJaWVlxcXFFRUVpampGRkRh5RZw3ODjY2toqVlQ1NTUtnJNbRNO0+/fvt7a2zs7O5uTkFBQUiEkyRVFMJtOiZqPiDYvFsnXr1nv37vX39x87dmzLli2lpaXLTtoBAAEWACxTfs3Ozp45c+b999+/ePFiPB4XtVo8Hk9JSdm8efPf/d3frV+/XixiSpS8HR0dv/71r48ePSpmVsWEajQalWV5x44dP/3pT7dt2ybmIWdmZs6cOfOP//iPs7OzoVBIkqSJiYm3337bbDbrul5SUvL666/X1NRYrdZTp0797//9v1NTU//dv/t3WVlZSwOsvr4+cRpgQ0ODxWLJz88XT3V2dvbw4cP/9E//lJ6enp+fPzc39/bbb7e2tkqSlJKSkpqa2tDQIIrpc+fO/fM//3Nra6vJZBITs5FIJDs7WzxnkaMlbsvExERvb6/P56uvry8qKrLZbEtntjVNC4VCXq/3+rzp6WlRpI6NjXm9XlVVy8vLHQ7Hwi6tiVrWbrevWrWqoaHh6tWrd+7cuXjxYm5urthrAADAsyNWJHV3dx8+fPijjz4aHByUJMlkMomjYJqbm3/4wx/u3r1bzNyID9F1fXJy8vPPP//iiy/a29tDoZAYRsWJh9XV1T/60Y9ee+210tJSk8kUj8d7enr+6Z/+6fbt27FYTLSD/OKLL06dOiVJUmpq6o4dO/Lz83NycsbHx3/96193dHQcOHAgPT19aYAVDocvXLjwzjvv2Gy2gwcPinMMRZXS09Pzi1/8YnBw8Mc//nE4HD537pzoWWmz2X7wgx9UVlba7XZN027fvv3ZZ599/vnn4+PjsiwrihKNRh0OR1NT049+9KNt27bl5uYmTpUJBoN9fX2dnZ3Z2dnV1dWZmZlLh35R8EQikfHx8UuXLrW1tUUiEbPZLGbsZFnOysrKzs4WsdSizYyyLGdnZzc1NdXV1Z0/f/7LL79csWJFbW0t35MACLAAfE89bKHQsiYnJz/99NOPPvro2rVrJpOppqamsLBQluW7d+/29fWdPHlydnb23//7f//yyy8nkpfh4eEzZ86cPHlyeno6Nze3sLCwoKAgHo93dHT09PScPn06Go2mpKRs3LjR4XCYTCaXy5Wenq4oyuTkZCQSMZlMbrfbZrNFo9GMjIzMzEyTyWQYRiAQmJ2dFdXqot7zQjQanZvn9XrD4XCiphRbC6empjRNO3/+vFh+lT7PZrM5nU7DMPr7+z/55JMPP/ywq6vLarXW1tYWFxdHo9G2trbh4eFDhw7NzMz8x//4HxsbG8VMbDQa7ezsHBwcTElJWbFihViDtvSu+v3+a9euffnll2fPnm1ra4tGoyaT6dChQ5cvXzYMo7Cw8D/8h/9QW1u7sCPJwplYVVVrampKS0u7urpOnjy5e/duAiwAwL8+n1paGCz8YyQSuXnz5m9/+9vPPvvM6/WWlpZWV1enpKRMTEzcunWrvb3d6/UGg8Gf/vSnYpe9+JCenp4PP/zwzp07qampoiu5w+EYGRlpb2+/efPm3Nyc0+l89dVXxUnHDoejsLAwHA7Pzc0NDg4ahpGdnZ2Zmakoit1uz8nJETNGmqZ5vd65
ublgMLhs6wOxhGpubi4SiYRCoYXdtURV4PV6+/r6vF5vS0uLoii1tbUOhyMrK0uW5XA4fP369ffff//kyZORSKSysrKiosLlco2Ojt6+ffvSpUuzs7PBYHD//v3p6eniFvl8vuHh4XA4vGLFitzc3ESjroU3MB6PDw0NtbW1tbS0nD17dmJiQpblW7du/epXvxK7L3/wgx/s2bMnIyNj6R5M8Z7CwsLa2tpz58599dVX3d3d1dXVYkIOAAiwAHwfK1fRCEMshl9UOSVIkjQ3N/fll1++/fbbX331VWNj48GDB9etW5eWliYmWk+cOPGrX/3q7NmzBQUFubm5dXV1ou1UYF5VVdW//bf/tqqqyuPxiPVZ/f3977777vHjx8+dO1dTU1NVVeVwODIyMvbs2VNZWTk5OfnLX/7y6NGj+fn5//AP/1BeXm4ymZxOZ2FhocvlWlRtP6KLx9J4TrzHZDLNzc19+umn6enpBw8e3LVrl4iNysvLvV7v6dOnf/WrX7W3t7/44osHDhwQjSo0TRsYGPj888/FqUAVFRUOh6Ourk6EaFevXu3p6cnIyKivr1+4AG3Ra4D+/v7Ozs7u7u6pqSmz2ZyWlmaxWGKxmJiUNpvNyx6SnbiEFStWVFVVXbp06eLFi93d3R6Px2q18j0MAHjSoV9YthmWGCXFG2Lj27vvvvvJJ5/E4/F/82/+zYsvvlhSUmKxWLxeb29v77vvvnvhwoXPPvusqKho27ZtGRkZIrURPSh/+MMf7ty5Mz8/XzSd9Pv958+ff++991paWj777LPi4uL8/Hxx7t4//MM/TE9PX7t27f/+3/87NDS0b9++xGfLzMzMz89PPOFHD/2JcX/pRYmV1K2trTk5OdXV1Xv37s3LyxOt07Ozs4eHh99+++1Dhw5lZ2f/7d/+7a5du8SZgIFAoKWl5Z133mlra7Pb7QUFBdu3b7fb7YZhjI+P3717V5Kk5uZmEWAtjf9ER4K+vr6enp7R0VFN0zIyMsTK8VgsZrFYVFVNtN9a1GBLvJGfn9/U1JSRkdHR0dHS0rJ58+alaRcAEGAB+I4Xr2JtfCAQuH79+m9/+9vs7Gwxn5moaD0ez8qVK4uLi8XMZ2dn5+9+97tbt25VV1e/+eabf/VXf1VYWJj4hG63e2ho6Isvvrhw4cKKFSvKy8tFgOXxeHbu3Ll+/fqNGzdmZWUlAhoRFQ0PD4vlSHNzc2LnXdk8r9d7/Phx0aN969atDQ0NC/s9RaPRZcvExySeQzQanZmZ2b9//1tvvSVyKOHGjRuHDx8eGhpas2bNT37ykzfeeCM7OzuRH9lsNtGq4+jRo42NjeIDo9Fob2/v6OhoXV1dUVHRw0Ilu92+YsUKVVWnp6fHx8etVuuWLVtEiaxpWlpaWmFhoWhJ+7BjH0WvMYvFMjU1de/evfr6egIsAMATEbFUf3//6dOnZ2dnFy5SFif/VlRUVFdXi639fr//4sWLJ0+elCRp3759P/nJT8TGfDEqVVVVzc7ODg0NdXV1HTt2rLa2VkROYlP83/zN31RWVq5YsWLhwXxOp3NoaOjatWu3b9/u7+8XuxEz5oXDYU3TUlNTx8bGqqqqNm/enJOTs/Qcw8e5wIe9X3TyWrdu3Ztvvrlr1y5xgoosy16v98KFCxcvXkxLS9u/f/9PfvKT2traxEKn9PT0ubm5mZkZcQDL6tWrxeA7OTnZ398vRuelDUDFZzabzR6PZ926dRaL5f79+9PT0zU1Na+99lpjY6OmaYnuAQ972qLFe2FhYW5u7vj4eH9//+TkZKJzKAAQYAH4vtSvIsDy+Xznz5+/efOmqqqJFVhisX1jY+Pf/M3fZGZm2my2eDze1tZ28uTJWCz28ssv7927NzHZKJSVlb3++ustLS2dnZ1fffWVz+dzuVyyLHs8HhEAJXpGCKL/+tGjR8+cOTM2NhYMBpc+PfFRomfWn/0OmEymkpKSl156KZFeiZ2J7e3tFy9edDqdb731
1u7duxPplXgyK1eufO21127cuHHz5s3Ozk5xppLogeX3+20228IOr4u4XK61a9eWlpZeunRJzMHu3r17//79BQUFmqaJf46HFd+iDk5LS8vOzk5NTQ2FQiMjI4tuGgAA30hRFE3T7ty5MzU19fnnny8c+sUhfXv37hU763VdHx8fP3XqVH9//wsvvPDjH/84sbxaDFV2u33Tpk1ffvnlhx9+eOnSpR/96Edi+spqtZaVlZWWlirzFo5lOTk5VVVVNpttenp6cnIyFoslDhMU42Ci/dPCAVEsUPrXVz7xeDw1NXXTpk0bN24UIZT4/GNjY8ePH5+enn7ttdfeeOONysrKhQlRdnb2yy+/fOnSpSNHjrS1tY2MjIgF1F6vV2wJTE1NXRjSLXzmJpOpsLCwqKgoNTX1/fffN5vNq1at2rlz59q1a8UVPaKTQ2I2y+VyeTwei8UyOzvr9XqfRVEEAN8yAiwAT0asxpdl2Wq1OhyORTmRKEAT7xkdHb1z587c3FxqampBQYGqqqOjowur4Wg06nQ6rVZrJBIZnSfOJBKVq2EY4XA4EomEw+HoH4yPjwcCAbGCSeyhS9RwC/tW6LqeeKp/xgu3Wq2NjY35+fkL33///v329nafz1dSUlJeXi7L8tDQUOLJiMlbh8NhtVrD4fDo6OjIyEhhYaE4jVussXI6ncvOiyaefygUmpycDAaDdru9vLxcxHwLM69lrzRRwYuzw0ULMPFFAQB4inFQ0zRd1xPNpMSAK6IiMfDFYrH79+93dXXJspyfn5+VlTU3N+f3+xdVC1lZWSaTaWRkZGJiIhwOi37komFlPB4PBoOxWCwSiUSj0Vgs5vV6x8fHLRZLNBr1+/2hUCgRYC1abPVnj2lEVFdSUlJVVeVyuRLvDwQCfX19XV1dZrO5qKgoJSVlZmZm0RAsWmUpijI+Pi5yN4vFEppntVrtdvujh36xyGtgYEDTtLKysuzs7GU7Biw79Is5P7fbbTabvV6vz+cjwALwHUCABeBp6tfU1NTNmzfv3bs3KysrFoslqiJN0zIzM0WRZxjG1NTUyMiIOHbw6tWrfr8/Ho8vrLF0XZ+YmBgbGxMZzezsrKZpIhTTdd3v99+/f7+np2dgYKC/v1/UfxMTE2L7wKPr1D97oSYuXFXVysrKtLS0he8fGxsbHBzUdd3n8505c6a9vX1hm1ix7WJwcFAcIOjz+aanp/Pz8+PxeCLAstvtj+hjJdpgibvkcrkKCgqW7gF8dE5nsVicTufExMT09HQ4HObbGADwRHRdV1W1qqrqBz/4QVNTU6LZufiv2WwuKSnJyckR00vDw8Nzc3Nms3liYuL48eOqqi7sRSVG/87OznA4rKpqMBhM7PHXdT0ajU5OTt6/f394eHhoaGh0dHRiYmJ0dLSnp2d6elqcmiI6cH07HZ3E087IyEh0YRd8Pt/AwMDc3JwkSQMDA8eOHVu4Jj3ReaC3t1ckcYkSKBwOh0KhrKwsm822bICV6Mbl9/sHBgbEoYd5eXlL9xs+mqqqYtug3+8PBAIEWAC+AwiwADxNFWu1WsvLyzdt2lRcXByLxf7k14rZLI7IMQzD5/PNzs7GYrFoNHru3Lnr168vPQAoEolMTU2JJfqRSERM5AaDwdu3bx87duyrr76anJycnZ2dnp4Wh2SLY6T/UtduMpk8Hs/COVjDMGZmZsQBhRMTE7///e8XrUoT89IinhNT06J1iK7rsVhMrGWz2WyPmFaNxWL9/f1TU1OqqubMe6ImVmIS2OVyaZrGCiwAwFMwDENRlNzc3NWrV2/fvn3h3JUglmDLshyLxWZmZkKhUDAYvH79+uDg4ML1WYmAZmJiIhAIpKWlxWIxURtomjY5OXnhwoVLly51d3fPzgsGg6LHVigUUhTlW05hxNM2m82pqakpKSkLR+pgMCgOPvZ6vV9++eXt27cXXaZ48PDwsLgEsWZcXKboY2WxWB69omp8fLynp0fTtOx5C48bfhyq
qoou+IFAQKyA+9ZSPwB4RgiwADwNk8mUkpKSOu9hj9E0TZxILTYFuFyutLS0pQGWLMsFBQWyLDc0NGRkZIjytKOj47333vv444/7+/tzc3Pz8/Orqqqc81JTU1vmPaNLe0RtZxiG6BqbaOAq3i8uU1EUm82WmppqtVoXzsEKGRkZRUVFkiTV1tampqaKhh0mk0lc79LHLyS2Y8zMzLjd7ry8PLfbLT7q8ctQi8XicDhEpLgocAQA4DHHR1VVHQ7Hw47NTYz+oVAoHo8riqKqqs1mW/aAv9LS0pKSkvT09Ly8PNEKanJy8uTJk7/5zW+uXbsWi8VycnKysrLy8/MzMjI8Hs/ExMSxY8ei0ejSQuIxPV34JRZfu1wucTTNwqFZJFOKolgsFnGZS79E1R9kZmaKlWgPO/Fw6dcdHh7u6elRVbW4uPgpThAWM2Rihbt4qnwPA3jeEWABeEr6vG+snOx2u8Viyc3N/fu///sNGzaI5T/LFm0ZGRmlpaWyLE9OTn788ce/+c1vZFl+8cUXX3jhhTVr1lRVVYlWDrFY7L//9//+jQHWsuGOqBoXToQ+KVGOL/qcNpvN4XCoqlpTU/Of/tN/ysvLi0QiD+sdm5+fX1BQYJ4noi4xNfqI44FCodDg4KDX6xVZ3qJWtY9D0zSRWzH1CgD41/jG0dNkMtlsNtHl6vXXXz9w4IAIbpYtJEwmU1lZmcPhiMVira2t/+f//J/bt2+Xl5dv3bp17dq1lZWVmZmZdrvdMIzTp09fuXJlcnLySceyxGknYuPhw0qaR1yXOBlm0RhtsVjsdruqqgUFBQcPHty5c6fFYllaF4l1T06ns6ioSCyhEsuuRai0sK/C0g8cHx8fGBiwWq2FhYVut/tJzxA0DGPhQjm2EAL4DiDAAvCU9evjVEKpqamiZ4SmaXl5eU1NTY/4KJPJpKrq3Nzc2bNnL1++PDc3t23btv/8n/+ziK4cDocoWGdnZ58uhRGBmjj+z+v1LqoaxaSo6FIhtjE+fuGenp6emZkppmHLysrq6uoesTzKZDKJ5uuKooip0WAw6Pf7NU1btjYV+/7u37/v8/kaGhqKi4sfp4frIuFwOBAIyLLsdrufdAoXAIDHp6qqOIl4dnbW4XCIdcePGLjNZrOu6319fRcvXuzo6MjMzDx48OC+ffvS0tLEWTGir5ZoL7Ds8qtFGc2iIdhsNjscDkVRAoHA0i6Qop4JhULRaDSxWGxp/bD0nQ6Hw+PxqKoajUbT0tLq6upSUlIedpkL11w75kUikYddjhCNRkdGRoaGhqxWa0FBgdPpFBNjj18CxeNxn8+naVpqaqrL5XrS/AsAkhABFoBnRcQlHo9HnKg9ODgYDAbT0tIenQ1Fo9G7d++OjY2lpKTU1NTU19dnZGQsrOe6u7sXHmW4bIkpusAuzaFS5vn9/qmpqaWtoHRdn5qaGh4eFq2pvvHqEhsBMjMzPR6Ppmmjo6P9/f1lZWULu7wvvcbEEiqxbiscDnu93odVsbFYbGKeruu5ubnl5eVPGmCJ0tzv9yuKIl5U8M0JAHhGVFXNzs52uVxTU1NDQ0Nzc3Pp6emLFi8nhifRWkvTtJGRkb6+Pl3Xq6urm5qaiouLF47OPp9vYmJCdEx/WBAjji9cugbKYrGIBgVTU1M+n2/pR4XD4YmJiWAw+IiuCEs5nc68vDyn0zk4ODg0NOT3+zMyMh52quDCysFms7lcLr/f/4gVWLquz8zM3L9/f3R0NCcnJy8vT6zeeqIJvFgsNjs7G4/HU1JSXC6X6F3A9yeA55rCLQDw7OTk5Igz+wKBwOXLl2/evLlsrRaJRHw+n2jNLla8i9VMiUIwUfxFo9EzZ87cunVr+d9o8/v7zGZzPB6fmpqKRqOLmqmnp6eLJ3Pr1q1Fx11rmtbb29vW1jYxMfGIJf0LJXKowsLC2tpat9s9MzPzxRdfdHd3L1uMilavielfVVXz8/PT09NDodDo6GjiDKZFxP5BUXPn
5OQ83QqsWCwmunSxAgsA8EypqlpYWFhWVma1Wu/evXvhwoVAILD0YZqmBQKBubk5MeaK1uaJxUoLR3/DMPr7+69duxYOhxMNpBaOxWIFtyzLXq83FAot+kJWqzUrK0uW5Tt37ogjgxfWBn6//9atWx0dHYuqgm/kcDhKSkrKy8slSbp69Wpra+uiqkOIx+OBQMDn8yX+Ki0tTXQDGBsbW/bOiLsxPT09MjLi9/szMzPz8/OfYuyORqMzMzOxWMw1j/QKwHcAARaAZ0KsTrLZbKtXr960aZPdbr906dKHH37Y0tIiqlVd18XmuLa2tk8++eT48eP3798X2+iys7OdTqfP57t3797du3e9Xq+IYEZHR48fP/7FF1/09vYu/xtNUcQaK5/Pd+PGjbm5OV3XE71Lxf6+0tLScDjc0dFx7ty5e/fuifYQ0Wj03r17n3zyyenTpwOBwOM3OhWX6Xa7V69e3dzcrGnakSNHPv300xs3bni9XtEmTKRp169f//TTT8+ePTs+Pp4oqevr64uLi/1+f29v79KaO1Fbd3V1+f1+sVsh0ef+iUpYn88XCAQURXnYNDjw/eH1en//+983NzfLsvz3f//3o6Ojj7knGsDjDIsmkykvL2/Lli1lZWV9fX0fffTRmTNnRkZGwuFwfF4oFBoaGjp37tzRo0fb29sT8ysZGRnxeLy3t7ejo2NiYiIWi4n0p6Oj44svvhC9L5eOgCaTyeFwuN1uSZJ6enpGRkbE2ceBQEA0f3Q6neXl5SkpKQMDA5cuXbp+/brP54vH4+K0xNbW1o8//ljUA090mYqi5Ofnb9++PS8vr729/bPPPjt37tzExEQkEtE0TTzzgYGBM2fOHD9+vKurK9GJ0uPxVFRUSJLU1dU1NTW17OePx+NjY2MTExOGYXg8nry8vKcLsKamphIrsPjmBPC8Gx4eZgshgGciscOutrb2wIEDfX19N2/e/OCDD6ampt544438/HyLxRKLxYaGhs6fP3/y5MmGhoaf//znYsJ21apVpaWlN27caG1tfeedd1555ZXCwsJgMHjz5s33339/bGzMarX6/f6lX1RRlOLi4qKiou7u7qNHj5aUlBQXF4vSbcWKFTabbdWqVevXrz9//vzY2Nh7770XjUa3bt3qcDjm5ubOnz9/5MiRvr4+t9sdDoeXvppd9vVtYj6zoqJi7969AwMDd+7c+fWvfz06Ovraa6/l5eWJCrK3t/fs2bMtLS2bN2/2eDwlJSWSJNlstubm5vLy8nv37rW3t4tp2KXtLfx+/+3bt30+X15eXmZmplh+9fjzqIZhTExMjI6OhkIht9tdWFhIFYvvOUVRxMtdu92uKIo40oG1CcBjjimPTnvFj1JKSsqOHTs6Ojo+//zzixcvRiKRgYGBVatWOZ1OTdP8fv/du3dPnToVi8X27dtXW1vrcrny8/NramrcbndnZ+dHH31kGMaqVaskSZqenj5+/PiVK1fC4bDb7RYzQwubXplMprS0tJKSkmvXrl29erW2tla832w2V1ZWis2MjY2NK1as6O/vP3PmjMlkeuWVV3Jzc3Vd7+np+XKexWJxuVzLXt2y7xSX6XK5du/efenSpRMnThw/ftzr9b7xxhsVFRVOp1PMXXV2dp46dUpV1TfffLOmpkaEUB6Pp6qqymq1dnR0jI2NLTv0x2KxwcHB6elpm82Wn5+fmZn5pB2sxBOYnJyUZTk7O1ucX8w3MIDnmlm0RQSAx69cRe8GcZb2o1/yib91u92bN2/2er3/7//9v7a2tpMnT16+fNlsNotJVDEZa7VaS0pK8vLyFEWx2+1r16596aWXhoaGenp6Pv7442PHjomaLx6PO53OgwcP3r59+9NPP13akVRV1fXr19++fXt8fPz69ev/5b/8F1mWi4uLX3rppZKSEvHMt27d2tfX97vf/a6/v/9//a//9c4776iqKk4mqq2tra+vHxwcvHnzpsViWVjqidVkTqczJSVFrGBa
VG663e4XX3wxEAi89957Q0NDn3/++ZkzZ8xmc+LQw1gslpaWVlFRkZmZmXi2JfNisVh7e/vk5GRZWdnSW+r1ejs7O0OhUHl5eWFh4ZMWoLIsd80Tt2L16tVimppX7Pg+B1hWq1X0jY5EIgRYwDcSq5xSUlJE7Ps4rzHKysp+9rOfpaWlHT58uLOz8x//8R9dLpfD4dB13e/3i+XP69atExNXsizn5ORs3bq1q6vryJEj169f7+7uTk9PN5vNPp/PYrE0NTVlZ2efPHlycnJy4VokMVuWk5OzZcuW27dv37lz5+233/7oo4+sVuuaNWt+/vOfZ2dnm0ym8vLyH/7wh7Ozs3fu3Dl06NC5c+fcbrdhGH6/3+PxvPLKKzMzM19++aXT6RRPZuGFu1wut9u9bH93k8lUWlr6t3/7t+np6efPn79x48bdu3edTqfD4RA5XSwWM5lMmzdvzs/PT7zsEnsPCwsL29raent7w+GwxWJZ9MkjkUhvb+/o6KjT6SwuLn6K5VdjY2M3b94cHR31eDwrV670eDz8lgPwHRiMTP/tv/03bgSAxw+wZmZmdF0vKCjYuHFjTU2Nw+F49OPFYoeCgoKioqL09HRd18Ph8NzcXDAYVBQlIyOjvr5+//79r776akVFhcViEZ0sPB5Pbm6uWH/k8/nm5uZUVa2rq/vJT36yd+9ecd52Y2Pjtm3bEnmQKGRTU1PdbreqqqJXazweLysr27hxY11dnag+Re+JjIwMUVx6vd5AIJCWlrZmzZof//jHGzZsECcVNjQ0rFu3Li8vT2RkmqbNzc1pmrZixYqdO3d6PJ6Fy6BESw6Xy1VQUFBeXu5yucQWiZmZmXA4LNrZNjc3HzhwYPfu3YWFhRaLRbxgNpvNIyMjoi19eXl5cXHxoi4VsVjs1q1bH3zwQSgU2r59+44dO0TM9/j/XoZhHD58+MSJE5Ik7d69+9VXX01JSeHlOr7PYrHY+Pj4qVOnBgYGVq5cuXPnTrfbvbSxDoCEaDTq9XodDkdjY2N9fX12dvajhx4xwGVlZeXn54sG5CaTSZyHK9ZnlZeX79q167XXXmtqakpJSRH9xVNTU/Py8lJSUsSDxU78nJyc3bt379u3r7q6WpKkoqKiNWvW1NXVLZxFs1gsbrc7NTU1EomEw+FIJJKWltbQ0LB69er09PREG6z8/HyxBCwSiQQCAavVWllZ+eKLL+7bty8jI0OEbmvWrBGZWqK/eygUKigoWLNmTW1t7cIYa+FlFhcXZ2dn22w2k8kUDAZ9Pp8syxkZGVVVVS+++OIPfvADsfpMXKaiKNFodHBw8Pr16+np6aWlpSJlW3gDJyYmPv3005aWlszMzAMHDtTW1oom7o+vq6tL7NBsbm5+7bXXampqWIEF4HkXCoVYgQXgCSiK0tzcnJWVpWlaSUnJN25GE3WeyWTKycnZsWNHWVnZtm3bZmZmRFVqs9ncbndubm5FRYU4izqxSr+goODll18uLi7u6+ubnZ2NRqMul6usrEzUzWJFfUpKSk5OzqKv6HK5mpub09PTV69e7fV6TSZTcXFxXV2dqPwMw3A6neJkw/r6+tHRUdHxyuPxFBcXr1ixwmw2Z2ZmNjY2ZmZmFhcXJwpKp9O5YcOG3Nxci8VSUFCwcI9koty0WCwlJSXZ2dmVlZX9/f1z88SasrS0tNzc3MrKyqysrEV7AFesWLFmzZo7d+6cOnVqzZo1i+ZIp6en+/r6HvyyNpsLCgpyc3OfqAAV5w92dHT09PSsXLlyz5494p+MF+rgV5lYBCr2ItEAC3g0t9u9Y8eO+vr69PR0sTv+G4d+SZLsdnttbW12dnZDQ8PUvEAgIFZDZ2ZmFhYWilP8Fo7gInLasGHD2NiY1+s1m815eXnV1dVFRUWRSCQlJSUSiXg8HovFsvArqqpaVFT06quvFhUVjYyMxOPx9PR0UVqIB5jNZlGHlJaW9vf3z8zMiDMHCwoKysrKCgoKMjIy8vPzRVur
xFm9ZrO5sLDwhz/8YTgc9ng8Tqdz4eiZOMjF7XY3NDTk5uauWbNmdnZ2YmIiFAqJoT89Pb2oqEhEeAvPIvR4PM3NzZ988smtW7daW1urq6sXtqfUNG1qakrs/Xe5XBUVFU+UXhmGoWma6GlgMpnWrl1bXV1NegXgu4EAC8ATlERiG9rCw61FQfaNewllWU5JSVk179GfP/HyMjMzc+u8pY8snbf0o8QfU1NTG+ct+/kNw7BareXzln0ai56k+CiLxbLwQ8Q7FxWyohNHSkpK07xHXObCP9bU1GzZsuX48eNXr169fPlyeXn5wpnt+/fvX79+PRaLud3u8vJykX89zvop8ZjZ2dkTJ060trYqirJ27dotW7Y8esUc8H0gfqLFy0WxvZd7Ajz6R8bhcNTU1Dzp0C+mdvLmPWaZYbVaq+ctHd8dDodYTrVopBNfSBzsm5+f/7ChX7TKWj1v6ZcumLfoAsXJJ4u+6LJXarVai+Y9ToUjDiJsamratGlTa2vrmTNnNm3aVFNTk8iwQqFQd3e36PhZWloqeiA8/t7/eDze2dl55syZ/v7+2trazZs3P+KJAcBzRCGMB/BEJeyy7/xzLed5us+z9DjtRz/ySb/Kw676qZ//wkVboiJvbGx85ZVXJEk6ceLElStXRGcQ8Zjh4eE7d+7IslxVVVVSUiLmnB/zC0Wj0Y6Ojl/84hcjIyMbN27ctWuXw+EQdTzfzPg+M5lMIsAyDCPRA4vbAjzRyPg4I9Gi6aVl337E+PjoBy/ayP+wD3zY0P/opZePX9ssepj4nAt7zC/7DGVZzsvL279/f3FxcXt7++nTpycnJxNPKRQK3b17d25urrCwsL6+PrHa/XGekli9dfjw4fPnz6enp7/++uuVlZVi6yK/6AA878xmMwEWAPzFXgCIarKsrOyv//qva2trv/rqq8OHDw8ODopTloLBYE9PT3d3t8PhWL9+fVFR0eMUr6I1/uzs7Pnz5z/44IOOjo6SkpL9+/dv375dTOGyfxDfc2IVQ+IVHS/qgG9nyFv27UeMj08x7fSYX2Xh0q1ncZmJz7z0OST+yuVy7dy5c9euXWaz+YsvvmhrawuFQokmfe3t7eFwuKGhYdOmTWLu6tFPVWyIDofD/f39R44c+eKLL6LR6K5du/bt2yc6itL7EsB3A1sIAeAvXNDb7fYVK1a89dZbv/zlL2/cuNHS0pKXl6dp2pkzZ06fPu3z+dasWbNr167H3AKgadrk5OTZs2ffeeedzs7OnJycn//853v37s3MzFzUtwv43v7ciYNQJUladCQ/gKQaH5/1J392X+Ub14ObTKb09PSXXnrJ6/WeP3/+6tWr9fX1NpttcHDw2LFjN2/eTE9P3759+6ZNm0Q7zm98qqFQ6N69ex9++OHRo0dDodCrr7568ODB8vJy0ZCeoR/Ad6SE4y4AwF+QSJTsdvvOnTsdDsfs7GxGRsbFixevXLly9uzZvr6+1atX/+xnP2tsbBQl7KMTqEgk0t3d/cknnxw7dszv92/evHnXvEQjWwCJHliGYcRiMbYQAviLDP2SJFVWVv7oRz+qra1NT08fGRn5cl5ra2taWtru3bu3b9+ekZHxjZ9N1/Xp6enz588fPny4u7s7Ly9v8+bNO3furK2tFX21mLsC8N2gKAoBFgD8hV9Lix3dBQUF+/fvVxSlv7//97///ZEjR6ampurq6l566aV9+/ZlZWUtfPzDxOPx8fHxGzdu+P3+PXv2vPzyy9u2bRN93/+8DcuA55fJZLJarWJXTjQajcfj3BMA3/7QbxiGOHixvr4+EAjcvHnz4sWL169ft1gsu3fv3rdvn2ic/43xk67rPp+vu7u7p6enoqJiz54927ZtS0xckV4B+M4wm80EWACQXBWty+Wqqal58803MzMz6+vrxfFDj1mAqqpaWlr6s5/9zO12NzU1Wa3Wp2tdD3yHLeqyDAB/qUFfDO5ms9lut+fk5GzevLm5ubmsrKyysjIt
RXIJfqWgrBXm2nwJThHoL51agVQPAHzH3YdjCEvjLM3GBDops+WsXOuYuy31qGX0dGu0bR+0jYd1db2krOTlUlqQeRr7UiABAMAFBwpYAPAUw9bTEUNhFE1tXzPtnmodtI0HtVGrP7Vsj6aw1YJczkprBWWtpKRllqYeny4EHhQAvjPouf2IohFF4OkYn1b4Sk66UlL22/p+0zgemO3BdKDbew19JS9dWYqVUkI6xsoCzVD44j4CbMIXBxzHGYbBMMx1Xc/zYEGA7xoARAu5gJkTaBN7oM32mvpuQz3umqrlsDRZzogreXnjJHOWsnGepQnw/gDwNAOARz1ZURThOJZU2KTCLmeljbKy39QOO8ZeQxvo9mi3f9jWK3l5bR6QZ+NsTGBoCscwBG49AMBzARSwAOBpBrGuF2oTu6/bta6511AP2sZQn+EYmpCZq5X4akFeLynFtCjQ5Py2QIScu/4EywcAzyKQRVFU4iixnFgtxdRN56ij7xxrRx29OZx8tDP4aLdfSgknieVSbCklJGRG4mkQeX+RHhWUIIiFBhZcIQS+M64XGpY70GaNvrnbVA+aRle1UASNSfSdYro677oqZ0WJox5ruQLvDwBP3aqfCb0LLLE277bWLXevoR209VrbOB5M7u4NPj0YpRR2rShdWYoXk3w6xgkcRRGwHwHgogMFLAD4vkRR5IeR7fim7fXHs72mutvQW8PJ/LYgmk/wlZy0XlIqWSkb5xiaoD6Xjj6dpAbeEgCeXSB7KnKEIBSOJWVG5MiVnNwcTfab2n5Tr3eN7tjqqbPdpraSldZKympejkk0R5MUiWGQXgIA8Je9fxCGjhtObK+nzuYjUNVGf6JPXHR+W7CYEtZL8lpRScqsyFFfJXMJAMCzcP2fx9gn6S6OxXj6pWqympe649n+XJG2NZiOdftPhr3f1JcywsZSvJKT0jLLMgRN4Th4fwC4qEABCwC+L2GITCx3u6FuHY23j7XOaGpYLkcT+aSwWY7dqibLGVFgSYrCSRw7Fen5YnYNAMAzTzVPdl6EoShL4SyFywK1mpdHG/Zh2/hkr7/X0mtt46ht3N0fljPitUriWiVWTAkcTcIevdzgOE7T9EIDy3EcqCwAT25SwggxLa/WNT47GG831OPexLQcmiKycXZjKfbKenopLUocyVDEIhuOvtB1Dc8YADxbFrvsbMcxNEFTuDKf/X1nI3PY1j89GD08Hjf6k3rXvHcwLmeEK+X4jZXEUvokbsdOC2EAAFwsoIAFAN8+bH0kdzGdt1zVu+ZRxzhoaz11NrV9kSPWiqnVglzJydk4l1IYniExDIO8CAB+zED2UTQ7v1mA0iROEThD4jJP5ZJcsz/ZbWpHXX2o2Q+Pta42226olYy0nBNX8rIi0BR52jgBW/iSgWHYWQELNLCAJ3D+SBhFnhd2RlZzMNlrqkdds6taE8tlaeJmNVEtKNW8mI0LmTjH0ySOo2cqmSgKNgQAfnDvf9qIffpnksAJHOMZQmDJXIK/vhLbPtbqXbMzsg7aRm8822vqyxlxtSAvZcWUxJAUjoH3B4CLBBSwAODJQ9dTlVY/iEzL7WuzZt/cb+m7Ta03nnl+oIj01XJstSivFZVyVkzKDIZiX3aiAAD86LHsqco7icdJPC4xKzmpWpCPOsZhWz9o6e3x9JO96V5DK6b4a5X4ckbOJ/m4xHAMgSERbOZL9jyctcNEj4ZxAMAXvP+jhyMMw4nlD/RZezTdbagHLb0xmLheIPHUWlFZzcvrRaWck1IKS+DY57EDHF0BwAUK5k82JI6jMZGOiXQlJ67kpHpvstfUD9t6azC9fzjYb6i7TWm9qKzkpXyKS0kcxxBQxwKACwIUsADgyUPYyHVDbWr3xrPjnrlVG9d6hj5xUAxNKGw1J1cL8mpezsY5niXn00xQCFwB4KKWLT4fWhhFEU3iy1mpmORvriTqPXN3LvXaHk4OWnqta6YVdrUobyzFyhkxJtICS1EEDpPu
L8uTgC46ZEHEHfgabC8wpu5Qt+tdY6eh7jU1feJGCJKU2eWMuJwX1wpKISkILElgcHQFABfa5n8hE8axYlrMJoRrlXi9N9ltaEdtrTGYHrb1WltPyGy1KG2WYktZMSmzAktSJA5bGgB+XKCABQDfTBBGjhtMba8zsu4fjR7Wx83BZDrzCAJLx7jVorxeil1fjks8TZM4gX+ubRHNNSThDj0AXFjOisw4iuAUkSJxiafXi0prON0+Hu829N2G2hhMWsPJ/aNRNa9cWVJWC0o6xnE0DoHsZQiDCIJlWQzDbNt2HCcMQ/hMgUfG4cQ+uH5g2X57ON2qj7eP1UbPNGduhKCZGFfOijdWEhtFJSYxzFzm8nPvD6dXAPCcBAA4iuIYQhKMwFLVvNQbJ3eb2v3D0VHH6I2t3tjaOhqv5KUr5Xg1L2fjvMiSJPH5QAYAAH7oyA2WAAC+ImY9k7oIIscPhrq9czzemw8s62szzw9ZGrtRTayXYmsFJZ/kBZbgaPLLzuzza/cAAFxIPk8yHwm9MhTOUDjLEPkk//J65qit7TX1vZbW12Z/3u09rI9zSb6Ska4ux5dzksxTBH46qxADgZvnjTAMLcva39+fTCa5XK5QKPi+T1EUrMwL7v2jExDXD8a6vd/WdxpavWO0hhPXD3maWCvGrizH14pyIc5LAsXSBIGd9lydnVqBHQCA5yUAWDh/bC6OSZM4T5PZBH99OdEcTPda2nZ93Fdnd/dHOw0tLbOVvHx9OV7JS4pAk/Mjawz9gswWAADPGihgAQByXvoERdAgDG0v8LzA8cPx/L7AXkuvdYy+NnPcIC7Sm8uxzSWlnFNyMU4WKPpRFwacuALAcx/JPtrIFIFTAi7zVFykKzn5+kpir609PFI74+luXW32p4cdYzknrJdihaSg8BRF4gxJEAT6uTEBa/CcfOae50VRhGEYSZIgg/Xief9Tkat5QTOyvcD1Qtf3jal31DX2G9pR1+ipJ96fp4m1irK5FK/k5VyCUwSao/DHclbIYgHguXX+j3JjApMISmCJpMIsZYRry7H9lr59rB33zcOe0VVn9Z65lBaqeblSkBWe5ubDDXHs1PuD6weAZw0UsADgNIT1wkg1ZiPd1iauYbmW408td2DYrf6kO7Zsz1cEen0pdmM5Xi3I5awociSGovPzVhSmYgPA5YtlF/ta5CiRJ/MpoZwVq1llt6Hdrw0b/clW3W4OzKO2mU/xCZEVOFLmSEmgUzKrCDRL47CSzwU4jlMURRBEEASu68KCvIC4fmBMvKEx0yauPnUs25/anmrajf6kPbRmrs8zxNVy7NpKciUnLmWkuEjPVa7g4AoALmtSEGEoxjMUz5D5JF+aDyXcqqmfHg1b/cn28bjRMw86RrmppmROEWhZoBISG5eomECDQQCAZw0UsADwUpEXhMbU66vWw7p60Nb66sywvJnjz2zP8YPFaQxN4LmEcGsl+eqVdHoxYOh0qD7ErwBwOUHRMwk7lCLQbIwTaJKmMH3qqIZjWK4+cUfa8H5tzFC4yFIiR8ZFeq0UWy/IxbQg8zTIZDwHYRBBMAyzKGAtNLBgTV4I1z//FYTRZOZ1RtZ+S9s51vqqZVju1PZt1585PoohBIZRJJ6S2Gsr8TevZZMKQxH459EDHFwBwGX2/hGCoASOpGWWo4goQsaTmWo62sQ1Z55WH+/UVYElJY6SebqQEdaL8pVSLKkwLE1gYBwA4NlFbrAEwAuO7Qb1nvnZ4ejT/WFXtSYzNwijuQ47ShI4TmCPbpSgvdH0w+2e5wcvrSbzSZ6liUeX3sFFAcBljWJPE1U/iDqjyYOj8d390XHPcH0fx1GCwBY67iiKTG3PnLmd0fSoa97dZzdK8ivrmXJWElgShxrWBc5SzppnwZK/WESI7fo9dXZ3f/jZ4ajZn5qWE8wl/DEEJTCMZ8ho3mKNIIg6cT7eGXp+dKuaKGclliKwM9kbAAAurfdHoyjy/LA7nt47GN3dGzT6
E8cNTpw6Ph/9MjcB2tRRJ05zONk9Vu+lh69dSV+txGMiS+JgJADgmQAFLOAFDl+jKAiiw7b+j3ebD+vaSJ/589KVzNEphRU4ksCxIAgnM29s2vrEM2ee1dZV0+5r1s9vFZdzIk+TMEofAC57nhv5QXTcN/74oPvnh/2h6XhegGGIzFFxiZF5BifQKIymtjfSbW3qGFN3OnNH2mwwnv3V7eLmUlwWSAQsxcXNUlBsThiGvu/Dgrwg+EFU6xrv3+/dOxgO9ZkXhAiCCCyZlFmFpwkcC5FoOvNGhmNMHMv16z1zaMx6qvXT67mrywmegfgZAC4/nh92R9Y/3Wt/vNfvqzPfRwgMiYtMUqZ5hsRwzPcCbeqOjZlpeX3VUg3bmDqG5d25ksnGORw8PwA8A8ABAy8uQRjV+8b7Dzqf7A61iRsiSDbGVvNSNS9n4rzAEjiGBWFo2X5XnR51zIO21hvNOqOpG0RRhPwSLVaLCnV2lxAAgMtIFCLd8fSDh70PtnqtkRUhSIyny1lxvSTnEoLCUxiORVFkO97QsOs9c/tY646soWE7boiiGIGit9aTOIZhcBJ7McMggqAoCsfxhQYWXCF8ITZ1FI302e8/63y0M+hrMwxFYxJdzUsreTmX4EWWxjE0QiLL9gbarNYzDlpGZzQdqvY9b4hECElgN1aS0FkJAC+AobD/8KD7551ea2ihCKIIVLWgbC7FsnGWZ0kUwYIw0KdeZzTdqqv1jqFPnYO2EYQIjmM/v5XnaBKHIhYAPPXIDZYAeDEJw0ibuB/vDv+8MxjqM56h0gr70mri9nq6kpMk/gtj1I2Zd9w1U/vs/cPhcXcy0mcfPuwlZEaR6GyMB9cEAJc2fg2jie0+qI0/3O4fD6YkgWUUdrMcu7WWurIUiwn0eYkrx/Vbo0lSYu8eDOsdYzLz7h4MJIFIxdh8kscIkHW/cMkJiqJnIu5RFIGI+wvysetT52Fd/eDhYKhZBIHnE9xmJf7KemqtoMgCff5fW47f6JvZxOju7uCgbaimfW9/KHBkISHEJJrAMVhQALisTGbeTlN9/0G7NZjiGJpNcFfLidsbqatLisjR543K2HQyMe4jltg5VscTd7+lMxReSHBrxZjAkbCSAPB0wX/1q1/BKgAvplu6uz/4x0+azcFUZMmNJeX/eX3pZy8VlnMSTWKPTSKjCSwps5WcmJJZc+apE2die7bjiyxZSAoEDq0VAHA58YPws8PR7+6291sGTeGlDP83ryz9zZ3SlaXYQgVvIfK6qIXgGKoIzHJGSsdY3w/ViT21/anlRShazogsTTyShAcuBAsj7zjORx999MknnyAI8tprr92+fZuiKFicy0qEIK4XfLI/+IePm/XehCKxtaL89p2lt18uVrISQxOL274LBWckiggcS0rMclZKyaxhuVMnMKau5fgcQyRlhmMgNQWAy2kogiC8fzT6x09a+y0dQ7FKTn775eI7ry4tZ0WGIj5PE+ZSeSxNFFNCKS3gOGZanmY5E8ubzLxSRohL9Nz1g+8HgKcGnB0BL6hnUifOZ4ejvmqhKFrKiK9tZl5aTccEGn0k67uQbn80ZegkNZU46ko5dudKOp/gSQLrjq39ltFTrQCunADAJcVygnv7w+bA9IMgIdGvXsneXkulFfZM+zuK5qP05yYDRVEMRTmW3Cgpb17PLWdlhsJHpr3f0Dojy/PBUFzUSAjD5h9gBPcHLz1hiIwNZ6+hHbZ1AkfTMe71a9lb1WRcZHB8MTcsWuzrsx2NoihPk2sl5ac3cuWMgGOYPnHvHQxGhr0IEmBVAeDyJQqa6Rx1jHrPDKMoJtI3q4lb1VRMZMjTAYPoYu9H6GmaQBJYNsH99EbuylJM4ijL9Q/aer1nzhwflEYA4CmHbbAEwAuI6wZ91ap1dMvxRZ68Wo7frCYljjxzSGfTx873YaEoKnHU5lLseiXOUcTU9ht987hn+gHkPABw6aLXCAnCUDXtg5ZmTFyWIpaz0k+u
ZJMK8xWG4tGfFm0bPEut5KWX1pIxkXa8sKta9Z7peP68eA5cLFAUpWmaoijf92ezGdSwLjd+ELbH02Z/qk1cjiGuLClXy/GkwmDY2aZ+tJvPeX8ERUSWvFlNbi7H4xLlen69YzaH0+nMg/oVAFxCQqSnWce9iWo6KIKWc+Lmcjwd4zAMRSJ03nR1miCgyKnrRxCEIvB8Qri6rBRTPBIh2tQ96hjG1EOhggUATxUoYAEvIpOZ2x1bxswLoyitsOWsGBPo+Z2B6C91+aKPCloJma3mZUVgCBwbG057NPWDEEJYALh0hY1o5vi98VS3PC+M4iJdzogphSHxL1wx/pKhOM17BZas5uW0wlEkPrG97nhquwEs6kX8nOcyWDiOz+fSBtBQc7nx/bA3ssamjaFoTGRW8nJcYhbD8L/G+y9iA4ElK1kxG+dDBDVtvzuaqhMHHhcAuHyECDLQ7IFmoQjCs+RKVs7GOAI/PbD6ckEKPRUSQAkcLaWlclpkaSIIot7Ymsw8WE8AeLpAAQt4EZna/shwXC+MIiQuMekYSxJfl5Q+IkIihGeIbIKLSTRJYJbtq7obBBGcrQDAZatrIKjtBgPddtwAiaK4xOSSHEVi59uvvgaKwDNxJiHTNInbbqCajutBAeuiftan3XPR15xhAJcDPwjHhj2d+QSJxUU6n+RZhvjGz3zxVOAomlb4dJxDUcT1grFhz1NTKGEBwGUjiiLNdIyZhyKIyJL5FK8I1NcbijPfkZSZTIxjKQJBorHh2K4P6wkATxcoYAEvIo4fTmee70dIhEgsKbLkXIY9+iZ/tlDHQAWG4lmSQDHX86eOG0TzhmIAAC4XXhBOZp7nR1GECCyp8PRc9eqJ1FhRDOVoUmBImsB8P7RszwvASlzUSGgOdGC9CGmpF0RTJ3D8EMdQfr6pKQJDnuyCD4qiEk8qPIWhaBCGlh3M7wUDAHDpLEUUWY43swMMQzmGkDmKpvAntBIMRfAcRRJ4GCIzx3dB/hIAnnrYBksAvJieaX7tL0JQBMNRDF1sBPSb3NKjf4MiGIYgGBoiyPxtIjiCBYDLRxhFfhgubgjjGIZj6BPYCeRM1xlFUQzD5qPMkDACsecLCoqiFEURBBEEgeM4oIF1uT/tCJlv6jBEERSfj2f5VvI0c/Xm+XWh8MQ+RCE0YAHAZUwTECQIozCMUOQkRyBOMgX0iX0KgiHoXFZvbiXA9QPA0wYKWMCLCIGhNInNpWwixw0c/9u1RgQh4nlRGIY4hpE4jqCgzwgAl9JQYAxFEPO6leMFsycWsVq0aAVh5PlhEEbYfD4RhoGZuKDgOH7WgQWrcck/axRlSJzEsTCKHC+0vSAIv4X/d73Q9cIgRDDsZFOfauIAAHC5QBGEJnGKwCIE8fzQdv0nnNe0OCB3/fDk36MISWA4Brk2ADxlYFMBLyIMTSgSTRE4gqIj3Rnr9hOekETRSUY60mfqxPb9kKUIiafm+S0Kx7AAcMmg5yo5FIWjKDI27J46e/KjVD8IVcMZG7bjBjSJiSy5UH8HLmKu8mgmegSn5ZcdksBkgeYYwvNDbWL3VMt2/Sdsog6jaKjPBtqJHSAJTOQpliEREE0DgMuXHmOoLNAiRwZhaEydnjp7ci12feKOzZnt+jiGSBz9hHcPAQD4FjsUlgB4ARFYMi2xDImjEToyZ+3RdGL5T5K2oCiiTZx6z9BMxw1CkSNSCovPJxiBkDsAXCJOUlqGJtMxlqMJDMPGptMamMbU/cZ+jUUFxHaDes8c6LbjBTRFJGSWoQhY1gsIiqIEQSymEPq+DwWsy7ulTyBwLCkzAkcFQaiadqNnnuSlEfKNH3sYIa4XNgaTztiaV7fxpMxIHAWeHwAuoV9A0JjIyAKFIMjU8Ro9c2w40Tcairn/6GrT9nDiuD6KYCmF4Rlw/QDwlIECFvAiIrBkNiEoMkMSqGrae0210Tf8MPz61OUkvQmi9nD68FjVJy4aoUmZLab407H6sKwAcGly
3QhFUIQh8azCJ2WWIjFj6hx1jb2mZtnffAwbhOHIsB/Wx0Pd8oNI4clikmcoHJo1LmKigqI0TZMk6fu+4ziwIJc3I0XmBSw0n2BTCoNhqD51d4619nBqu8HXa1nNBXGC7ni639T6YwtDUEVkikkhLtKwpwHgMvoFJKPQmRhHYJjjBgdto9YzLdv/ekMRhpHtBIcto96f2F5IEmghxQscBesJAE8XKGABL+Rzj6IJib5ZScYExnaC7WPt/a1eazCx3fnx+1eVsaL56etxz/xwp7/X1LwwTCrMakEupgQcVDAA4NIFr4tyNseQt6qJbIyNkKjVn/7zvfZB25jOvHm5+6sDWdcPm/3Jh9u9z45G05kfF+hKXi5lBIrEUejuuZgeAcMWsmXhNx1jAM87OIZlYvxaQSmmBMcLD1rG+w+6+03Ndv/iAMqFdEBvZP3zvfbOsTpzfIkjb1TiuQSH4xiCgo47AFy+GABNyGy1IBVSPI6h7dH0493+dmM8c/zwK7uwIyQIo/HE+XC3f3d/0BtbNIUXU0I5KwqnHVhgJgDgqQFtjcALRxRFKIpKHHVrNXncM0zLHer2vf0BGiE/uZZdyYkMTWDo560SEYJEYeT60VFH/3C798nuQJ+4OIqsFaXN5VhMpDE4gQWAyxi/LmSwbq0mO6OparpT29+qjUkSd1x/vRQTWALHsEfDSU9y34V6a2NevfpwuzvQZhESlXPiy2uplMLh85lEYCwuuHeARbjcYBgqcNRGKdZTrd/dbU0d9+PdfhhFjh9eWYoxFL6YM3huUyOOF/TG0z9t9T7c7g11m6LwYlb8ydVMQmLmcw3nv2BnA8Dl8gYcTawWlM7IGhszw/K36yqJYWEUrRViEk+hj0aTR3PBgTCKRoZ9/2j0u3vtWlv3/TCX5F7ZSBXTAkngi7ZuyBUA4GkBBSzgBc1LKRIrJPnb66mRYR+09aFmf7DdNWfuekkpZcS0wrIUgeNoEEa2G4wMu9Eztxvqfksb6HYQRCiB0RTO0PMMFiJXALi86W5G4daKSq1rHnZ00/I+OxhNZm69Z1bzcibOCuxJyhtGiOP6+sRtjyb3a+peQ+uMLT8IWZoopoSVgsyQ+Nz4RHDb+AJ6BJIkcRwPwxA0sC55SjpPInEMKWWE2+up3aZ+3DfHpnNvfzi1vNZgupThUwrHsxSBnWSkthuMDac5MPdb2k5dG2q2GwRJhS1nxKWMyNAkshDLgU0NAJfOUGAomk3wV5aUrdrIsif61LlfG08dr9mfrhWVhMzwNInhiOeHE8vva9ZOQ92uq4cdYzEXgsAxniVpgjh7Q1hYAHhaQAELeJHTFkTiKJYmwxDxwsDWfNPqH/fNSk5ayogiR5IEPvdMbms4PWjqHdWauT5F4BSJ+WHYHc+O2kZcYJIKg6NwGxcALl0UO09OLWc+PjtCwjAMUGRszHTLPu5N6r3JcoZXJIbEsShCJjOvr84O23qta9qeP79hiEYRYrmBPnEkjhRxCkLYiwk5JwxDz/OggHXJvf7JVkZJHBNokmPIubRlOFAtY+rW+5PVglhMSXGJJggsDKOJ5bWH06O23hxMLNdHUWRxxzQIQn3iMhRJk+D6AeByGopwfqk8CCMExSLkxFCM9Jk+cTsjqzWYFFKizFMkgdmuPzac456529DGUxtFEAJDwwg1Z95BR88lOIbCWYoA5w8ATxEoYAEvIn4QqhNn91j9/f1OrWPQFM6QWISgrhf0xtZAnX20M8BxlJh3YPl+FIaRH4YIisQFOpfkOYpoDCa1rjmxPG3ivnk9l4mxNEVg4J8A4LIQRYgfhqph/+FB908Pe83hRGRpgSUtx5+5vj5x7u72Pz1AcQxbaFt5fhDMM1skQiWO4mgcQVB14n78sKOZ9s9fyl9bjisCS4Bk3sUDm7OYQhgEASzIpd3UCBIEkTlz95ranx/26h0DR9GExIQR4nnh2Jh9NLHv7o0IAsFQLApPLEAQRH4YIVEk85TEU7YXTCzv
g4c91w9+dj23UlBoct6HDQDAZTIUYajqzt39we8fdBo9k6ZIWaB9P5q5/lCfqaaNYT0Sx0kcdU6cRhiESBBGLIlLHJWKsV4QtYeT9z/tjnXbmHiby7GEdOL94RALAJ4KUMACXiynFEaRF4StwfTD3f77n7aa/YnI029ez5bTojHzdurjrjqbOZ5le+GpHgqK4yhN4iJHpRRuoyS/spHmGeJhXf3dvfZBS/v7D+p91frZzfyVcpyhHkniAADw/NqJ+c53vbDRN99/0PmnT9rmzC2lhVc20hsl5ahjbNfV9nA6nXm2E3ihh0ZIhKIYhlAEzjJESmY3SrGNJQVBkQ+3+vcOBx/vDsaG3RlN37iaT8dYmpyHsWApLgboow8jiiJov7rEhOGJ9x9osw+3e3/c6h20dIEjb6+lrlbixtTdO9Y64+nU8Wdu4M8iFAkXN4gpEudZIi4yV8qxmytx1XT/vN1/UB/9nz83htrspzfyL60mRe5z6SwAAJ73AMDxgoE++8ePW+/f74xNOybSb17PlVJCT51t1ccDbaZP3ZnjT0MfQZEoQkgcpSlcEahCSri1mlotyLYX/Gmr+/6DzoOj8UizX+2nX9vMltIiSxMYBpeOAeD7AgUs4AUiCKOp7e01tQ+2+/cOhmN9llK4n1zLvn4tm5IZxw+vLsXao2l3PB3pthtE8woWSpNoXGJyCSGX4LNxdnGKInE0QxHvs8SDmvrnnb4581wvWF9SYgJz4pogkAWA5zZ4DSPkxFA0tD88aN/bH05sb72ovHUzd2MlGZeYUlq8Uop1x1Z7OB0as5lzOryMwDGJpzJxvpwRMnEuLtIIgiYlNhljP9judcfWP33SHmr2qxuZjbKyUM4CQ3FBONPAcl3X8zyQLLlsm3r+a2p7x33zjw+6n+wN+9pM4si/ulV4eT1VSAmu718rJ9rjSXdkqaZtzTc1iqAkgSoik4tzuQSfiXNphXVcPy7SIk/8aau3VVdNy5u5/u3VVFJmCAIEMQHguQ8AZnaw19Y+eNj901bPmLrljPjmjdzttVRCYqa2d2U51hlaxz1TnTiuF8w3PMoyeEJiiikhnxSyMU7kyCCKBIZIyuyftrq1tvHP9zoD3f7JZvbqclzmKXwhiQnLDQDfFShgAS8KYRgNtNmDo9H7W93dpuZ64fqS8sp65uW1VC7JkTiGIEhSYpZzkjZxJjMvmA/KjSKEwFGeJWIizbM0TaDzsBZJyPTttRTHEDxL3d0bbtVGQRCOJ87t1VQ6BreEAOA5NhTa1P3scPjHB70H9ZHvR3c2Mj/ZTF9bScRFBkVQlsITEl3OSqrpmDPH86PFdGwMQ1iakAVaEWkKxxAURSOkWpA5hkjIzPv3O3tN7ff3u2PTVafOzWoyrYChuBCgKEpRFEmSURR5nuf7PqzJpUtKEdW0H9TUD7e6D2pjy/aXs+Lt9dQbm9l8kqdIPIqipMSVs6I+daYzz/HCxRhCDEd4hlB4RuRIat44ydL41eU4Q+EsTX60OzjsmEHYnEy9lzfSpbRAz2c1AADwnKJN3N2m+vvPOp8eDD0/vL6SeONq9mY1mZAYgsAElkzJXDnlrhXlqe0HQYTO6+MUgYscNU8TTszEXGcPLWcliacVgfq/d9t7Le3P23196mqm89J6spDkUShhAcD3AApYwIsQvSK2Fww068OH/Q93+ocdnaXwW9XE69cyNyqpuESjj3qmSAKLibQi0NHjGc7Zuernp/KKQF2rJDiaoCni/uHwYX00tb2Z499eSxVTPDW/7A5rDwDPEa4XDnXr/tH49/c7+02dprAblfjbrxRXC7LEnUmwoziGidxJLIsgwnlb8Wiu9iMzgSI4juaTgsBSLIWzDLFVUx/Wx4blWrZ/azWZjfMcDYbixwfH8YUGVjgHFuTyOP8oCsKoPbLuH44+3O4+rI0pCr9aib96JX1rNZWJcTi2SCRRAj/x6bJAIacV6S9v6tOvcAy5UlRYhmRp4qOdQXMwmdn+
1PbubGYqWUlgCZg0CgDPW5aABEGoTtwHh6P3t7pbtRGCIC+tpd68nt0sxxWBXqiDoHOtAEU8MRSPpwkLE/EoSDhJKHAsJTN31tMUiUsCtVUb7x6rk5k3tb1X1lPZhCCwBHh/APhuQAELuOzRK4I4XnDcMz/Y7v32w4Y2dWIi89pm5q9u5asFhSGwub+Z3xWc+x40Qr56ouDj4+9RBEUElry6HE8qbEykf/thfa+hjQzHmLi/fLmYTXD06eB88E8A8BwQBFFftT7c6f2fj5vNwVQRqNeuZn5xq1jNywRxYh3OrpWd2IkIwZ5k/liE4hgSE+mfXMumY5zCtT7ZG+w3tbFh9/XpW9cLK3mZIfH5W4Gh+DEdxeLDBXN9mQijyPXCkT77p0+av7/fGagWx5C311K/vF1cLSj8vNJ0flMjp2HAV3j/CD2fqEYsSVSykszTaYX9fz+oHbaM8Uczw/J+/lJ+oxSnSQweJQB4jux/EEaq6fzpYff/ftreaWgKT91cSb77xvJqUcYQDEORc4biL8b1J3nEF98WRVGJp9+4lsknuJTC/uF+p9Y2RqrVGk5/+XJhtaAILAmWAgC+A/ivfvUrWAXgsvqkMIy0iXtvf/Tbjxt/vN/1gqBSUP7mTvHNG7mltMjMbwQ8yltOI9O/mEV+6evoo4SHpYhsnGNowvXDgTbrjqeq6TAULgkUAaruAHDRDcXJb44f7jXH//hJ8//7tDPU7VJa+Pmtws9u5kspiaKw84nu6fZ/sm2NPgp8CQwVWHIpLQgC5XnhUJs1+tOh7oRhGJeY/5+9936u4kr7fTuszt27d07aSiCQhCQkISEMtknOvrfq/HT/0HurTp33nbEJxgShnAVCKO2cQ+d4S3tLMjPvzBjbIBTWR1WgmQKLvXqtJ/V6vk/zwiYsdn+iMofjlMvlycnJtbW1tra2b775JhAIYO9VnoScXGzbqcnG2m7l71N7k2u5mqx3hD13R9vujCS6oyJDgyPv/x+8/NH//4956f5fRVGUIrCAyHhY0rTdcsNIF5VyXUNQ18tTJIGhUKoZAjkNmYJuOtu5xuP55KP5VLashDz0nX1D0dYV9VD7rvkgS3gfd/+P/xNtvRpHEZRnyIiP9Xlow7DLDT1VlHNVDUXRpq2A3h8C+cPAAhbkDPsku1BTX67lfl1Kr+6Udcse6Qndvto23huOB3jynerVn/wZh38db6amAQ/N0YRuOfmqmi7Kkmph6L7Tokj84G4XfCoQyEkzFE1bIWnWRrL6aC45vZ6vy2Z33HN3JPHZlUhbaN9Q/M/q1R+0Ewd/nQCYh6P8AiVypGk52YqSr6rFuoahKEcTFIk3u5mgpThuHMepVqsvXrxYXl6ORqPff/99MBiEt7FO74lGmnMYsmVl8U3xyWJ6/k3Rsty+Dt8XV2OfXYklQnzTKf/5Q40evunCMJShgN9Diyzlum6+qmZKckUyEATxsCR9MKcB7iII5IQaC8d1Vd3aytQezaWm1/P5qtoR4b8cTnzWH+2KiYeSdn8lSzh4g4XjmMAQfoH2CaTlIsWamispFUk3LYcmAUcBDEdhxwYE8v7AFkLI2URSzdd7lcm13MKbUrmhxfzsyOXQrYFoe0hgaIAihy9G/pq3OPzr+x4q5GUmrkTCfmZqjX26lJ3dKGZKcras3LwSTUR4HIUtQhDISSxe1GXjxWp2cjW3tlOmKXBzMHprMHa53SdyxEFTwAcyFM2rXkhQZK73RcJetjPmmVrLvc3UKw19K1Of6I9c6fI3dbWgoThujspVrV5CuCCn+FEiiKLbbzO1FyvZ2Y18oaKJAvX5YGzscqgjIogciRy2BH6QXNF1XZElR3qCIR8TC7DPl7PrO5VSVc0UlZsDkd4OHwmgrDsEckKNRaWuLb0tPVvJLm6WCBy90R+9MRAZ7Ap4OBLHMRc5TBU+QJqw/5/xCdS13nDYz16IiZNr2a10PVeWt7P1if7oYJffy1OwfgWBvCewgAU5cxmp
6xZr2tp2aWotP7tRsF33QswzcSUydjncHhZaiq2tyPNDJYqthgIEQQWW6Gv30QRAUWz2dT5ZlB7Pp2zbuWFHEyGeoeBxg0BOCq6LWLaTLctLm6VHC6ntTE0UqGuXwzevRPs6fDQJfjMU7gfoBDq87rH/32JpoqddFAWSo4jp9dyrveqz5UxDNTTD6uv0hb0sfA973IkMih6JuMMa1qk90vvev9LQ3ySrz1ayS29LNVXviAhjl8NfDMbiIb556xo9ONIf8lAjNAW6IgIJMBLHni1nt7KNp0tpzbAQF+2KCVDmBgI5aVi2U6xpC28KT5cz6ztlniGHLgRuj7RdbvMKHHnwuukv9Wj8C1uBoQhFgAsx0ctRPAN+Bem1nfLMq3xDNhTVHLoYCHsZApa8IZD3ALYQQs4OjutatlNTjJdruZ9nkys7Fcd1+zt898faJ/oiUT+LN1MU9J0pIR8o+WkqYjWFMjAc9XBkPMAiKFJtGLmymqkojut6WIpjCADQDxI3QyCQvxhNGqadrSgvVnL/9XInXZQEjrwzkrg3mrgYFykCtErSB4YC/TAlknfNDoaiPE1E/ZxXoGuyXqpqmZJarKkEjvlFGkcxHIfqece3GRqNxuTk5MLCQiQS+fbbb6PRKI7D6ZCn6yEiluPUZH1+s/hgLjXzOm+7zsU28d5w4vZwW8zPARw7OoboBz7U+1sIwzCOJsJ+lmOImqQXalqyIBmWzTGEwBAtyQL4mCCQT28rHMey3VJdm1zNPpxLvk7WeIa8ORC9fy1xud3H0kRzlhOKfNCOvnfvYSEIylAg6mMFllIMq9Iw0mWlUFExHOUZkqUAhsHWYwjkd4AFLMgZwXFc1bDepuu/zKcez6eSBdnLUTeuRL673jHQHRB56h+rVx8e9Ld3LCjHECEv4/NQqmHnKkq6KNVkg8Axn0A3VTEQ9P1VoCEQyIe1FU3Zizep+sPZ5IvVTKGqdkaFu6OJLwbjiSBHEPhfFMd7n4pJq2eNInG/SId9LEViVVlvNh3L5ZpOU4BjAPFOyg35mLUPV5blly9fzs3N+Xy+77//PhaLAQAvzJ6W57f/pZv2Xl56sZr9+9TuTrZBEfhEf+TrsfbhnmBAoDG8WbP6eN6/aTJwDKNJEPDQIZGxbLdc15NFqVTXXQQNehmi5fuh54dAPp2pRxBEM52tXOPxQvLRQjpbUWM+5vZw7Kvx9kRYoA/U8T6u122NgCAA7hOoRISjSNCQjWxJThakhmJgGOZtJixQ2R0C+Q/AAhbkDPik/V8qDX1lu/xgNvlyPVeR9O6o58vh+JfD8e64h2MIDD0OfcQjXVgMRVmKCIqM30OZtpsuyqmCUmnojut4eYohAVQIhkA+WbVCNec2ig/nUzPred20L7d7vx3vGO8LR/wsAfD3njj0lwzFURmLBLiPp8I+lqcJSTUzJXUvL1VlHcNQn4em4MWNY9kSjUbjxYsXs7OzgUDgxx9/hAWsU/TsXBeRNXNlu/RoIf18OVOoqLEgd3ckcWsw2tvhEzkK/dgF6XcONYaiNIn7PXTQSxMAy5WVVEEq1jTDdASWFFgSg0kpBPJJbEUzWVB0a99WzKVerueqknGpzXv3WuLmYKwtwO97fwRFPr6tOBoLQzZrWCGR9rCkZtqpkpwpKcWqRuAozxIUAaC5gED+HTBEg5z6ANawnGxZWdwsTL8qvElWERQdvRSa6I9c6fRFfdzh6073eKSRj1JTDENFnhy6ECAJnKXw+Y3iRrLSUAxVt0Z6QvEAR0NJLAjkeONX23ZKNW11u/zTbHIrU6cIbORC8PPBaF+HX+TJAxNxXAHjgXKe61Ik3hEWWBIXOWpqPbe8VV7aLMmqJavmQJe/LcjB/qOPDd4ERVHHcWzbbslgwTU/8c7fNWynVFWXt8pT69n13appO1d7QmOXQiOXQkGRbo4QRY7zUbZ+FkcTrV4kEmDT6/l0QfppelfWzOu94c6IwNIEhsGtBYEcp61A7FaL8UZx
cjW3tltGUPTa5eCNK9Gh7mBQpJHDu5zHYyuOquoAxdoCPEcRHo70CdTqdnl9t6ybVrGmDfcE28MCTUE1AQjkX4Vt8AYW5PTiuI5q2Du5xrOlzK9L6Y1kTeTJ4Z7QV2PtwxeDPp4+vHflHvNgr5ZzQlAE4JhfoIJeGsPQasPYzUv5imLZDksDT7PTHephQSDHk1jajpsuSjOvcg/nU2vbFZEnJ65E7w63DV8KUSQ4ULI73rIFuh8zH/xAliaiAdbLkwiK1WR9r9BIFxXTclma4BjQUsWAz/Ej7Q1d16empl6+fCkIwvfff59IJAgCCm+fdHTD3itIU+v5h3N7r5JVisCHLga+vd5x7XLYL9DNguSxn+jDsfk4ioocGfIyFIFLmrGTkfJVRVatpusnSQIeZwjk2Ew84jhOrqIsbRb/z+Tuq90KTYLRy6H719qHLgREjjzo7D3e607v/iyGAiEvGxIZFENrsrWTq6eLsmk7LIWzNEGTUNYdAvlnYAELclqxHaehmFuZ2uOF1LPlTLmuh33srcHY3WuJi3EPQzarQ+ihZuux0/zJ+7/gGCow+4EsAfByQy/WtGxZNizXJzRl3XHYSwiBfOwKhaObTrYsP13KPJ5L7RXkgEh9ebXtznC8Ky4SB3rp6CcRnDrUit3/BWCY10PFAhyOoTXJTBflfFWTdZNjCI4mmv0EsKHgo9hqy7KmpqZevHjBsuwPP/yQSCRIkoRLfYJPtKvq1la28etS+sliOl9WfR5qoj9yfzRxpcPP0ADFWvnopzgvzR/Z8ussTQRFmqWJasMoN/RMWWmopsBTHpYEOAovVkAgx4Bh24WaOrmaezSb3MpJXo4c7wvfH2u/FBdZ6tMLerT+AQBDPTwV8rIMiVcaRrGm5kqapBoMhfsE+jCdgRYDAjkAFrAgpzJ4tR23XNceL6b/31/fzr8pGZY7cin4f93svjPcFg9yJDgOIcb3/KeiKIrjmIej4kGuPcI3FDNZkN8kq6WaShKYTzhoc4CeCQL5GDiOq+jW2m7l/3u29WQxXWzo3THx/7nX88VQLB7kSYAd+y2Nf2ko0ANhVxwXObIjKkR8rKpb6aL0Nl3fTtcdBPFyFEMBDEaxH8FK67r+/PnzFy9ecBz3448/dnR0wALWiXxSzRliritp1uOF1P9+vjO5mlN0q6/D/78+v3B3pK0zKrb6bU/Amd736RiGcgwR9bEXE17TdlrHOVWQMQzxCQxD4iisSUMgH8teIA7iWra7vlv5r5e7P88m0yW5Jy7+8FnXtxMd7UGeOjFytPvmCkMBjokc2RnxRAOc4yLpkvQmWd0rSHXV8HE0SxM4hsJkAQJpAQtYkFMWvLoIohn2Zqr2bDnzZDGdKso+gbo1EPtiKD7U7T9QbD2+TvbfD2GPylgUuZ+aChyFYWhF1tNFpSbrtuN6WJIiAAYHjkEgHzooRFykWNfmN4uP5lJLb0sYhl69ELx/LXHtUsgrUIdKNJ/+3B1dFG3ZCpoAHpYKijQBsLps5qpqsabKukUCXGQpHD+4XAIf8YfCNM2XL18+f/6cYZgff/yxs7MTFrBO5onWTHsv33ixmn28kNrO1HmWHLscvj/WPtTt94v0b9Xok/HwDqWaMZEnPRxJkXhDNrNlpVTXDMumSdzDkgdWCO42COSDJguOizQUY36j8HghOfu6ZNlOb7vvh8+6Ri+Hgx4aa8bcJ0Tr8N1MgQCYlycDzREudcUsVtV8RZE1iwSYhyUJgENjAYHAAhbklHmlfYckG8tvS08W05Or2UrD6IwIXwy1fTkcvxgXBZZoCsq4J21wR0sXA2kGsgEPHfDQJMAKVW2vIBeqqm27LAUElmxO2YZ+CQL5MNi2my0r069yj+dT67sVhgLjveG7I21XLwZFljqZNaBW/R1FUIrAgiIT9DIsTSiamSpI6aJckwySBM2SN9TQ+cBMTU09e/aMJMnvvvuus7OToii4wieNqmxsJKtPFtNPl9K5shILcLcG
Y3dG4n0dfp4hUOTgGuMJPM44hnkFKigyLAVqsrGXl7IVRTdsmgLNpBS+v4JAPmyy4JZq2tzrwk8ze6s7ZQRBr/WG7oy0jfWGvTx5dNxOmrlolbFIgPkEKuBlRJbQTDtVVDJFua4YOI5xDGi1PcJnDDnnwDlokFPijhBEN+18VX21U3m6nNnK1BzXHe4Jjl0ODV0IhnzMoT7MCR0e5R56S4YCPW1elgYEjk2/KmTK0qO5lKSaE2akO+LhGDifCAL5y8fNdXXT3s1LU2u5mdf5dFEKicy1y+GJ/kh3TKQprFW9OrHGoqndgxAA74x4eIbwcMTLtdybZHV2o6DqVkMx+jq8ES9LkXA64YdJGwAALdV213VbUwjhspyoI2FYdqWuL2+Vp1/l1naqumH1dfrHe0PDPaF4kDsQvERO4oFG3eYXghIAS4R4hgIEgb9Yyb3NVF+u5SXFUnXrcsLraSofwCcNgfx1DMvOlpW5jcKL5exOtuEVqMGLgVsDsf3YmwLIiTUWh/4IQfa9fyLIiyzp4SmBzb3eqyy9LTUUsyzpwxcCsQDb7NuAFgNyfoE3sCCnIXptzr/PVpQXK9n/ern7areC4/jwxeD/fbP7em9E5Ckc+612dTITOvS3j4LiKCLydMTPsRReqKrbuUayICmqFRQZkSMJgCKwOQgC+dPmwnUt206X5Aczqb9P72WKSsjL3B1t+26iozMqEi3VC/cTCTy/V/yKHJmy5kh+kAjxQZHRDTtdlnfy0l6+gaBIwENzLImh8OLGB8BxnJcvX/7yyy8kSf7www/d3d3wBtbJOc6O6xZr2sv13E/Te3MbRQRBe9t9/+uLi+O9kaCXBb+dgRP5xFq18ubvGIpwNBkLcAJHVmUjVZC3c/W6ZPgEysvTFIHDHQeB/EUs28mVlV+Xs//7+c5eXvLw1JdX4z9OdHbHPUyzenUwcvAEH7ajAIChiHiAawsKhmWnS8rbTD2dlwzLCftYlgY4jkIlAci5BRawICc9eEUQpC4bK9vlBzN7L9ZylYbRFuTvXUvcHo5faBNbrTQnSfXi94LZw9tYBMB9HirgoU3LrcpGtiQXqyqKISJPUgSADQUQyJ+wFw6C1BXj1W7tby/3Zjfyhmlf6fZ/c73jel80KDLNLt1ToB91dPZbZg3DUJ4lon7Oy9F1yShU1WxJKTV0HMf8PIlhKHwT+9cdzcuXL58+fUoQxHfffdfd3U3TNLTAn/yhuC4i6+brveqD2dSTpXSxqoW8zJdX27653nGxzcMzrREMp+BBoQefaP+fSuCYyFMhkSYAWq7rhZqaLsmWbft4kiIBBmcTQiB/Nl0wTGd1p/JoPvV8OSup5sWE57uJzhv90YifJQ4noJwCc/FOeyOGoRxNRHyswBGyYlUkLVdRMyUF4BhPEyQJB5lDzimwhRByot2RZbuVhr6wWXixkn2drFq2c7FNvHUldrUnEPdxgDhBKox/KJZtdbmHRZbqASxNepbIuY388lZJM23ddIZ7giEvAzBYw4JA/oC5cFy3UteWtsrPlzMrOxUcRa5dDt64EhvsDvgF+uBm02lTP3ddF0NRniYvxoGHI2kSn1zLvc3UXq7nG6pZl42BLl9IZAhoL/7s8jYHxeIEQeA47rquYRiO48CV+fTH2XErkr78tjS9nl/dLdcl82K7ON4bGbsUbA8LLd2o0+X9WyYIw1AvRwx0BxgKcDT5YjXzNl3TDMuy3NHLoURYoPY/GjzMEMj72YqDajdSl/X1verTpcziZkm37P4u/+cD0ZFLQb+Hac3vO3XJwtEAqM6ohyZxgSGm1vKv9spzrwuqblYl7erFYCzAEzi8iQ05d8AbWJCTi2bYqZI8tZ57NJ9e36kwFBjtCd4ZTVzvDQe9DI5jyIFe++mz2wey7ghKASzsY3wCBXC8Iul7ealYUx3HZUggChQcsQ2BvCe24+bKyvSr/OOF9NLbIksR1/vCX491XOn0ixyJtrRoTuGN+6MREAiKcAy5
by48+x+nVFNTBTlbVlzbpQicZw6koCF/eHmbzMzMPHnyBMfxr7766uLFiyzLQtv7aY9zqijPvi48mEuubJcxDB264L8zkpjoj8QDHMAw98T3Af3744y4KAIw1CtQER9LAlzSrGRBypZlw3IoAvdwzW5CuP0gkPfBdW3HLda1uY3iT9PJle0S4iIjl4Jfj3eM9IS8PNW8edU0GKfRXDRLdCiCMBQR9bF+kcYxrNzQ9wpyvqoZpk0CzMNTAM6AgpwzYAELckKD16pkrG5XniymnyylSzW9Lcx/MRS9M9I20BXgmOY1e/RIGv10+lz0YGQSjmE+gYoFOJYiFM3cy0t7eUlWLYbCGQq0Ilm4JSCQfx++uqpub2frT5bSj+ZTmaIS9rK3r8bvXktcjIt0a2RPcxrYaT1JRwJ/CEKReNjLRXwcTQFFtVIFaa/QqDYMADCOJkgCdiD9Saanp588eYJh2FdffdXT0wMLWJ/oLO8f57pivElWf1lIP1lMp4ty2MfcuBL9erz9Sqd/Px3FWgJ27ikVi3QPaukohqI8Q8b8nMCStuMmi9JevlFp6ASBcTSgSThrDAL5fe9vWM5uTnq+nHk4l9pI1kJe5rP+yP1ricHmJUcMO7nDnf6Axdg3FwhJ4EEvGw1wLE3ohp0pKbu5RqGi0eR+pkCTOIYhUEIXck6ABSzIibLTTb12x600tIXN4qO51MzrnGHal9q9968lbvRFEiGePCrouMipju7Qo0gWQTEMZWkQ8bMcTdRlI1NSchWlVNc8LMGzBIFjKLwgDIH8KxzH1Qz79V710ULqxUquJpmdUeH2SPz21ba2EA9w7ISPd3hfc3E4YBtDURxDBZaIBjiOJhTdyle0vbxUlfbzXoEhW1LQ0Fz8UWZnZ58+fYogyN27dy9dugQLWJ8kF7VstyYby1vln2eTs6/zDdm8EPfcHm774mqsIyQwzXsGp7EV6B9d//7XoVFCaQqP+lifQMmala+omZJcrGo0hXs5imjZL7gNIZB/YzEk1UwW5CcL6aZGnhrxs3dG4l9ejXdFxXem9J5y748cxTBN788QsQAnsKRmWvmKki7LpbqKYajAkRQBAA4vYkPOBbCABTk5rmj/F9tBsmXl6XLmwWzyTaqKo9hoT/Db8Y6hi8GASAOA//Yu5QxEdYd6PE1dDIwm8YCH9nCkZtj5irJbkGqyCTBU5CmGImA7IQTyT8ErgiCqbr7aq/zvF1uzG0VVt/u7fPeutX8+GPN56IPq1Rk6OEfarjiGshQIepmon7Fsu1jTUkW5UFVMy/Z5aIYELckPaDHen+Xl5cnJScMwJiYmLl26JAgCXL1jPs6O6+ar2uRa7uFMcmW75LjIQJfvh886r12KhLx0693VUSJ3Bs7yvmlqfhySwH0eys9RKILkK2qqKJVquuW4XoFiKBxD4VmGQP7JXux/6aa9ul3+75c70+uFmmJcjIvf3+gc7w3HgzxBYM26D3I2zs3R8X83WQh5GRRFi1UtWZDzFVU3bZ4mfDx1gmeyQiAfDFjAgpwUHNeVNWNlu/R4PjW5lsuU5bif+3wwdme4rbfTJzAEhmFn4CbFf85LKQL38lSrVFeqavmKkq+plu0wFBBYEnYHQSBHWLZbrGtT67mHc+mV7TIB0Ou9kXujiaELAZ9AYe+03Z29VL+VwzME8AmU38PwDCFrVqao5KpqraEDgLIMSUJl1z/C4uLis2fPdF2/efPmlStXPB4PXJPjSkURFEEkzdpIVp8upZ8tZVNFycfTN4dit4fjg10Br0Adzdk8SxsaPegORhDUJXDcw5F+kWZpoiaZuYqSq6iqbtEEzrMkHNEAgbzrAV0XqUj6/Ebh4dze3EYRQ9GrF4L3RttGe0JBkQE4hh526p69ZOFA2Z3Am8NMGZGn6oqRr2jpotxQDQRxWYakmjMXYTch5AwDpxBCTkQAa7tOoapuJGu/LO7noo7jXGrzTvSFxy+HYyH+lA4Q+XN5qYcjhy4EBZbk
aGJqPbeTlRTdrsrGnavx9ohAQ0ksCDQYrmuYdqooL2wWny9nt7L1gEiPXAx+Phi/2O5hSeKoyHMmPz76ThcVTYK+Tp/fQ4s89WI58zbTeLKYrslGTTKudPlDXhYADNqLP7S1DhMkF1ra41lw23Ebirm6XZpcyS5ul1XN7Ix4rveFx3rDiTBHAnC2vX/r7ZXrugxF9LT5PBzF0cTzlexWuvbLQqqhGLcMu6/DxzEE3JAQiOsipuXmK8rSVunZcnojWWNpMNIT/nww1tfp5Rnyt3c8yJkNAFofkKFAT0IMeBiWxqfXC6vb5ZdruVJNrcrmQJc/4mNJAoNGA3JWgTewIJ/cHSGaYaWK0uR67pe51Ku9Ck2CkZ7g12Md1y6Fwj4WQ5FTr8H4/nlpcz4RhqIelmoP8yTANdPKlOSdbMMwbYYEPEuQAIfXgyHnGUWz3mbrTxbTv8ynijU14mfvX0vcGU50RA8qvOeh+HD4CV0EQVkaRH2sz0MhCFpu6KminC4plu3SFOCZZgkLmovfY2VlZXJyUlGUiYmJK1euiKIIbezHz0Vd3XKyJWX2df6n2b2V7QqOI0MXA/dGEzcHoiEvS+DY+fH+rXYnhgJtQZ5nCNNyCzV1NyfVZIMmcY4hGBIcqg5AIOfUYmiGs5tr/LqcfbyQ2ss3on7u9nD8znDicoeXIUHrKJ0fc+G6CEXisQAf8FAYhtUVcy/XSJdkzbAYCnBMS0IXmgzIGQQWsCCf0hU5+97I3srWHy+kni5lknkpKNI3+iNfXWu/3OEVGAI7N/HrwZoczSfCUIbCgyLDsYSiWdmSlC4oddngacLDkQTAMHivAnL+sG3XsOyNVPXxQurFarYmGV1Rz5dD8S+vxsM+lnhHJefcWNH9z4ohKEXum4uIn0VRrNLQU0UpU1Qaih4SGYbGD+e2Qv4t6+vrL1++rNfrIyMjAwMDPp8PrtjHz0XtVF56tpp9NJfazUs8S4z3Rr4aa796IcizBH7OvH9rKjGKohTAAyItCpRpOvmqmipKlbqOY6hfpKGsO+S8mgvERVxNt5KFxoPZ5POVbL6qxv3s3dHEneFEIsQ1FXLR82Mx0N8GQSAEwHweOhbgMByvNPRcWd3LNzTDEhiCZ0kCx2DhG3L2gAUsyCcLXhEEqSnm8lb54Vxyai1fV4xEkP9qrOPzoVhbiKMp8I7o1XkxvQfzifa/cVEEo0jg42kvT+IYuldQcmWlIhkYinp5iqHgPSzI+bIZiIvWFWPhTenRXHLudd4wnb4O/1djifG+SMBD76e7Z1Uk7z9Gsc2F2f/cBMB4moj4GYrEFd3KVdRsWWloJo6gPEc0305Dg/FvWV9fn5ycrNVqIyMjQ0NDfr8fLtZHPc2yZr7aqTyYT75YyRZqaiIo3BmJ3xyIdsdElm61vjQvGZyjp3BwqwJFXQIAkSODIoNhaLGmpQpypaEbpiMKJEc3Z9nAwww5ZymDopvre9UHM8mZ14WGYlyIe+6PJa73hcM+9jfvf57OxaGYwMF0Qp4h/QIl8oSim/mymiupFVl3XTQg0gSOYRgK++IhZwmogQX5BH4IQRDdsLMVdWY9N7mW2842AEAn+iPXe8Ojl0IiTx2GuOc0RGuNEEFQF0dRn0BduxwOe1meoSbXcivb5YqkF+vqRH+kLcTRBIBlLMh5sBiW7Zbq6ouVzK9Lmb2CRBH451fDXw7FLyVEliaR8yCS9+/txdEwU4rE28MCz5CxAPtiNbewWXyxks0U5WRJvnY51BkWWpfUoL34l8lAa2Ecx2ltOchHOsu2g2TK8txG4eVq7k2qiqDI1QvBu6Ntg91+v0ChKNb6g+dwo7Y+sOuiGOryDNHf5RP5/aT08XxmN98oN7RyXbs1EO2OixxNHP15CORsGw3bcYs1beZ1bnI192qnSpD4jf7I50Ox/i6/l6OQs615+R5Go/XxCYC2h3m/h4742am13NSr/NxGIV9RMkV5rC/cFRWaL7FgDQtyRoA3sCCf
AFW31nerT5bSz5cyqZIcC7A3B2O3r8YHuv0ejmpGri6MzRAEbY1nwlFM4EifSDMkkDUzXZKyJUU3bJoEAktQBI7Ay8GQM41m2HuFxuRq7vF8ajcvhX3srcHYl1fjlzt8NAFQODUaeWcBXJcm8YBIh0SGo4lmO6GcKymqYREEzjMkFHb9l2xsbExPTxeLxcHBwZGRkUAgAFfpYyBr1ma69mwl/WQhvZ2tB0Tmel/47kjb8MWAZz8XRQ9vUaDn+CAfjihEUJ4h/B6GYwjTcrIVOVNSZNUicIxnSIaCE10gZx/bcVMFaWo9//NMcitT8wnUZwPRuyNtA11+jiYOJ3ij0PW36lgEjgZFNuRlKIBLqpErqztZSTdNADCOJhgKQKMBORvAAhbk+HBd17TdqqSv7VQezqderuU00+6Oej6/Gr89HG8PC2zTtqLwouuRWzp8s4RhKMcQER/DUEDT7VxFSRZlRTVIAggs2ZTFgPk75AziuK6kmJup6rPlzK9L6apsdESEz4dinw/FOiMCdf5Er94rmEVRAHCRp6J+FsdQw3QKVTVdVGqSgQOUIQFF4BiU0vlHtra2ZmZmstlsb2/v6OhoKBSC6/PBE9FyQ3+1V30wl5pay9clvT3E3xqKNe9RehkaoND9/2O81FoQhgRhHyMwpGk5xZq2l5dqioEiqCiQJNE6x3C5IGdv/+8fAVmzdnPy0+X08+VspqzEg9zNgdjt4fiFmEhT+FGpF4K07lY1jQaOoR6WDIo0QxGm5RRqarYkVyUDQxGaAjSFYygGbQbktAMLWJDj80ZNS6rNvs7/NL27ult2HOTqxcDX19sn+iNBkXl33hC0re9moy1JLAzFGAqP+TmRpzTDThfl3bxUaegcQ3hYiiYBXDbIGcNxXFW3V7ZLD2ZSz5Yzkmr2dvi+v9FxvS8S87FEcxwnTHf/2dI2pXRaUSxLgUSY9wmUaTm5srpXkDIlBUFdsTmqH8OgsutvvH37dmpqKpvN9vf3j42NhcNhuK8+4J50bKcmG7Ov8w9mk/MbBdtxe9v93020T/RH20Ic2ZyUed4kL3/f9TclsfbTThL3e+h4kDdNK19R9wpytixTFO7laIYCsBYNOXP5guu6iGpYb5LVv03tvFjLlWrqxTbP3dH2OyPxmJ8nCbRZuoL17ncsxjtNxVjT+8dDXMhLW5ZTqms7eSlbkhXdivo4hsJRaDUgpxxYwIIcjzdCdMPazNR/mU89WUjtFmQ/T98ait0Zaetr9wksefQWEZrUf+2WmguDIggJMC9P+z006iJlWc+WlVJNcxAn6GEoAmvOJnThKynIGbAYCIJUJX3mVe7hXGp1u4Si6NDFwLfjHQPdQR9P4TgG093/HMW67n4USwDMK9BBD0MCvFLX8hU5XVRkzfRwlEAT2ME7bLiGyObm5suXLzOZTH9///j4OCxgfcBc1LTdnWz9l8XULwuZrUyDZ4jxvvC9sUR/l9/LU2D/LCPwLP+Ls4we/IaiKIFjHEsEPQxBYHXZSJfkclXXTEvkKIEB8EIl5CxZDAdB6rK++Lb0cDY1t1E0Lbu/0//d9Y6RnlBAoADAjspWcNv/6zVsvvMmcUxgqViAIwmsIZuZspopSpphUyTgWYIAGFw+yOkFFrAgHzsLddFmIrr8tvTLUnpqLVeoaYkQf2ek7YvB2IWYh2cOqlfwRcrvOvWDGdsE7uUpv4ciAV5r6Ds5qVTTDNPhWIKnAY5h0KdDTvtWtx03XZSm1vKPFpIbyRpNEeO94XujicELAQ9Hwnk675n9uq6LYRhF4D6BCog0x5ANzciWlExZUTULQRAvRwKAw34CBEF2d3dnZ2f39vYuXrw4Pj4eiUTgBvvrB9l1kbpqrO1Uniylny1lcxU54mNvj8Q/H4r3tns9LHE0axgu1+94fwwlcMzLUwGRIUlc0+y9QiNTUjXDokj8IB2Frh9y+je7abulmjb9Ov/LfHptt4Ki7rXeyJ2RttGeoE+gocV4L+/v
oq3ZxCSBBTyMlyd5ltB0K11SsmW1KmsogrI0wVI4Cq0G5HQCC1iQj+uKWvIrc2+Kj+ZTcxtFB3F7O3x3R9o+H4q1BXkC/OaKoA39vXQUPYplCYD5BDrkZUgCL9XVbEnZyTVcF2EowNEEAUWaIac5VdNNO1WUnyxmniymtzKNkI++MRC7P5q40hmgSdxFXBSB1w3e12i0ZmLgOOrlqXiQ4yjCtp1CTd3ONUp1DccxisQZqimjd76XNJVKzc3NbW9vd3R0TExMxGIxuMf+IqbtlOra0mbp4Wxy7nVB0oxLCd+XQ7E7o21dEU+r4AJz0T/i/V0MQ0WODHlZD09UG3qmpKQKDdWwmhNdSAI0L2FDIKfXaFhOMi9NreUfz6fepKoiT17vj967lhi84G9N3oQW4/1MRksfoBUuIR6OjPhYD0e6CJotK3t5qVjXENflaIJtvvaGKwo5dcACFuTjJaJIQzE3ktVHC6lfFtJ7+UbIS392JXb/WvtwT1BsXaNoGVdoOv+wY9p34SwFwl7Wy5GO4+ZqarIglRs6hqGtUSPQI0FOHbbjNhTj9V718ULqyUJK1qyehHhvNPH5YLQjIuAHdzWhxfhDqe/hhHEUoQAeDTARP8eQoCoZmbK8malpuk0AjKFwEpzroWa7u7szMzNbW1tdXV03b96EBay/5v1dSbO2MrVny9lHc6m3mbqXp8b7wvevJUYvh/0C3aqzuPDF1R9b1QNJTIrEQyLj89AYhpQb+l5OytdUx3EZGrAUwHEMrhXkNBoNRbPeZuo/T+/9spiqSHpn1HN7uO3eaKI9zNMkHNjyJ9OFVgWcBHjQy3SEeQBQTbeSBWk336grJoFjNAVoAoPXCCCnC1jAgnwMP4TYtlOV9ZWt4s9zqdnXeUkx2sP83ZG2zwfjPXEPQ4HfrgFDg/nHXVJT2hXBsNZ8Is7noUzbKda1VF4q1TUXQXw8xZCtziC4vpDTYTYsxy3XtZWt8sP55OzrguMifZ2+byfaR3qCYR8L3hnyABfrD9mLVi9hqwGZBLiXpyJ+liKBolqZkpItK+W6DghUYAkCb+roncsF3tvbm5ub29zc7OzsvHHjRltbG9xpfy4LbZahzaWt0pPF9POVbLWhxwLs50Oxr661d8c8AkvireoVPMt/+CQjrosiqIuhKEngfp6MBljERcsNLV2UU0W5eaWC5FkCSmJBTp3RqMvm6nb58WJq+lVe1e3edu/d0cRnV6JhPwPHDf/F5W15fwLgIktG/RzHAFW3cs3ZxMWaiuMoSxMUCS9iQ04TsIAF+ShhVrGmTr3K/TSbWnpbdBxkoNv/3Xjn6OVw2M+0ZodBxda/tMLvfE+ReKs5yHLsbFndzUm5igowNBxgOZqABSzIadnUFUmfWsv9bXpvdbuMIMjNgejXY+1XOv0CRzUzXrQ1Jxqu1J+yya0LL/vpL45jHAPCXjYg0opq56rKbq5RrOmu4/p42sOR59Nm5HK5ubm5N2/eRKPRW7duJRIJuNn+TLKEIJJizr3O//fkzvLbkqyafR3+b6+3T1yJRv0sSbSKV1A04M8HV/tfzYwUAEzkqYiPBQAr1tWdnJQrq4btRP2syFFweSGnCFm1Zl8Xfp5JzmzkHccd7Ql9M9YxcikkChQORXI/hPdHmmoCGIpyDAj52KDIWI6bKkrJgpwry7ppB72MhyPhIkNOC7CABflwkWtTs10xrLfp2tPlzK9LmWSu4RPoG/2R28NtA11+r4fCof7ih/dMLsBxD0cKLEkRuKKZ2bJckXTDdEiAsXSroQAuOeSEmg3HQWzb3c03Xqxmfl3MbOfqokBf7wvfG030JLwcTWAYciSUB9frL5kL5Gg6IcJSQOTJgMhgGNpQjVxZzVdVRXMYGqdJvKlPdL6sRj6fX1xcXFtbC4fDX3zxRXt7OzSaf8T77/+imc5uvv58JftkMbOZrvMsMdYXuj0cH+oJBT1063If9P8fxPG7CIKhKIYhPEN6eYohCcOyCzW1
RVFZluV5vmw5lzgtPlzYg1nUqUn767qp8CRJEksV/WNMzKvYnm+Z//mspUnsycB9c2pN7Pj6TALu5jlUVZKWQztygkQR2PWGyNHUTw7iOzcNjAjEMdTzLeP7gwZPU0dd9+jS9uMMMlj3ghtkJwMvTovtltzUxfkwU3T3Lykm8NXGS9N7a/qfnzQQQbw5tY66jh/nkPUGAOBngQQWAKwyfpT3pyEmiE5NquvC/fqvX71YkkINXXi5bbQNYX7bZs28BByVu6co8cSNZn5KEKiuCabCUeT9Wcd8+BtJoo2G9HTT0CRmbEV/O5rMvHTu4sLjAm7fJSJJQRAoiiqKIsuWS2G6qLAbZOdDP0rL7ba61VJYmporp6Ml+pYILXbyuiY82zKebOhlhd+dWp97TglFWHd7zjpBMpqFcVrVdaFdkxiGvG/TuLaNpiG83K1tNGU/yg+7Tm8SwvO6ewMpq8oJ0qEVFRVeq0m6zH55O9HdG8biUxsa/2LbaBiC46fvz2eXY7+AGT8AAPystwZLAACr6r+WFZ47KCHCxFpdahgCXooA5yrkomhyf019umlQCB1dOseXrhuCTspdk+blYBZGaaGJTE3jqXk70j06jAs5DJ6ld1rq/rqWF+X709nACrIcOkyBO+LrFMJlayEMovx06A2mIUuj/TW1XROvu7LQci3gfIO/2kNauvC/X7brmtAd++/PrZEV5SUEo3dEWREzLx3ZYVFWDUPqmBJNkctgGwRxtcNvNKRvdmsST5/03MMLO86KqgLbuMujlgjifOLEXpSJPN0yJVlg7mM49d/9VQIRHEuvN+UX2wZNESc997jvJlkOZZsAAPwzkMACgJUljPORFXlhLvJMy5B0mSOWROt3fuPWqUkvd8y2KVh+8t8fR71JAI/s7h7BvKgpiPPzgZ+kRd0QO3WJXozQvr+A+O+hb014tVfTZK47CY4uXctPCAQPDbgbI1xSU7P95PDCdoLMVPjdjmrIHEJoOaO7xRrKEvN6r/ZkQy/K6vDc+vFklsB0/LuiqPDQCi0/E3i6ZfCywC6J2P/iO+gy9x/Pmut1aeYlb06nR5dOkoGMwF0+BcLx08txGES5ofLtmiTyDCbu8ei/lrYwZO71Xr1Tk6Ze8qnrjJwYqq8BAPhnIIEFACubobC8tDuXN2qaQl3jRZZekst6tNCamc+debVbY2jytO9+7jleOJ+3DtzFI8BFUTlBMnbjEuOGxtd1gVqKK/orBI7Z72i7a0peVp8v3ctJAIMqgbsxP47jaJrO8zxNU4zxcsT8RIXxyI4OL2wCEdtttWmKNEXOg77lzexSJDIU7tWOuVaTx3byl0+TkR0VMK/jTkjzsjcJgySvqfx6XWbpZRH7X3wHlqG2WsrzbVOR2O44/OFo4kYZPLU7o8LYDrKxExVVVVN4XeYYej6A8P62u2vDoKnNpvxsy+RY+mLkn/Z9GEQMAMA/AwksAFhRMGEF6WAWVVW1Zkr6/LoeLZOeAImIdk16tVtrm5IT5R8vnIuRv4STv1Y1VE/ycuzEbpTxDNU0BE1k0dLM46dJtN6QX26bisBcToLTvu9HGeSwgDsIojiO+6qBtTz7pR9l3WnQmwQiR+12VF1i/yHoW8rzBxMkQixNPt3QDzZ1gkCfus6nruOFkKe4dUqMo6QYWnGalnWNX6tJDIWWyTYwTZGKwLzYNrbnSlg/ntojK8pyyFXc0btZlNj2k5mXMhRZ10WJY7708d2vQOrVT1Phv9mrNXVh4iRHl64fZSW4hQAA/CSEhCUAgJV0UCqMLS+ZuTHL0GsNSebp5Yp35iPnOIbabCqv92ocQx713A8XdpQWGKR+74QoLi5HQZwUNU3o1GSGJpfKQEyVe7Zl7rbUMMkPu/bpwM8L8GKB22VRALiI5fDS7EQVri7HwVnPTfOibUobTVnkaYSWfCXnLiaJGobwesfcbEqOn/z1aNyd+JCJvm2ytBw7ke3HFEU0dKGh8dSXEYRL8pYtbGO7
rT7fNkSeOh96H87tmZfAs7sT/xDHWTG2YzfKVInfaogctxSn/7VEGkO92DSebmiYwCd952PXjhLIbAIA8D+ABBYArCBlhf04H1qhH2W6zG02ZJFnlsuFQlf/fZE8qK3VJCfI3p3Nzq7zFBDh3K7/ijEO4rw/C9OsbOpCpyYtT0C8uAZGBKqr/PdPG7LAnPTdd6ezMMkxZDeB2/aKSHJRq7o8lpYX1cUkuBj5JDmXbzdEEpHE0svCLRaQpaln28Y3ezWeo08G/vszy/YTeItvlSDJz4eeF16d/h1TFHmGXD5jIRHSRPbppr7X1tKseHMyOxv6Jcj838mr6UfZ0I6CODMUdr0h88wS5TcpCmkK+2zLWKtJo1n0/x+OZl58tWPApgEAwNcTBJYAAFaPsqqmTtSfhnFW1DV+s6kIHL1U33CeoLj6i8DRex3l+bYp8czxpfuXT2MXekxunzQvZ248dWOaRu2a2NR5tDQhznwW0tWXkQTm5Y6x2ZCDOH97Ojvpuyn0mAC36hKRJM/zNE1nWZYkSVne/xT3ssIzLznpe1aQNXRxf13TZe5BLOb1dHyE6prwcsd8uWNESf7D5+nZ0M+KCnJYt5KZmK+qH2WnAy9MioYhtmsSRZGLARnL9D2vftIU2utor/dqqsR0R/4Px5OxE4Fh3DZVRUyceGKHJEE0DammCRRJLsmlIZ6PIqZJcqejPdnQKwK/ObM+9xw/yggEk1wAAPjircESAMDqUVbV2EksPyFJsqbyqsTR1HKe/RghQhbYb/Zq223Zj/PDC7s/DWDa+m0TRvnAivyoUCWuZYiqxC2Tc3j9VViabBnS/rquCGx3HLw9tYI4h2cH3KLlIcTzPEmSZVnmeb4MknxFWfWn4cXIL6tqq6lsNGSRpx/Mgl6FowRFoq2m8r+eNxWRuRyHhxf21E2gkfD28MJ0bMcVJhoab6ocOb8twku1xaPrPzSJ219Xn22ZeVl9vLC746AowTJulwoTg2lo+ZkssGs1UWDpRVH8kuzAC/Ooa8L+hl7XhKkdvz+3B1YEDw4AgK9AAgsAVpA8r3rjwAkyVWRbNZGll03A/e/RDUEQDEXutNWnm5ous30rPOzaUQJ5iluOcOJ8aEVJWtQUztR4gaOXUA4aISTyzNNNfbsjx1lx1LWHsxCmmAG36xXNWZYdksBZXl6M/KkbKzyzt67pCvdVpesBrOaXXUWX2Wfb5l5Hwxi/P7VOBi7kKW7nSMVRks+81PISnqWapjy/nEDEMnacXh3/JIk6NfnPTxu6zE3s5HPPnXkJBtHuW9xSiKKsRnbshZkqset1iZ33DyICLdfRz1E7LXl/XScIdHzpng28NCugixAAgGtXDZYAAFaPOCsvJoEXpqbCbTRkhiKXc2DVvK0BIxKZCvty2zxY16O4eH8ys/x0fj8PzsqthMQYE16YDa2oKKu6xpsyh+aKJEsYElMk8WRde7FpCAx1NvI/9pwgLuAhArcaOy2PiHtZYi/MP/c9P8o6denFlsGz1D9+z4ex52DM0FRLF//0pFHXhcOu8+7EcsMMV7DD3zAVJpww600DO8g0iV2vSfpyVdf+j1dt8X6ZCvf9Xn23oxVl9e5sdtxz8hK0Dm/PQnCU5BM3jtNck9m1usTSS2ofnZr0zZ6pK9zAjg4v7JmfQl4TAIAFkMACgNVzUKqpG02cqKxwTRPbhkhSaMnDRRKhzYaycFYuJ+FfPo4Gs/DKwYXHefPBJFFWeOLEEycWeGazqUoCs6whMSYIJPL0wabxfNvM8vLN8WxkRyD0C9yWS0SSoiguNLDiOL53DawoKY77Tn8S0CS51VKebGjcMskt/6ZNXmSpFzvm000DEehj1/nh8yROQdLu5rd3y4tHVpSXZU3lTYVl6GU//SkK6Qr/es/oNKSzgf/mxLK9GDpMb4kkKc7Gnu0lPMe0TUmTWYpaxkgQEYTIU1st5bv9GkuTn7rO8aWT5SVkNgEAgAQWAKwgaVb1pqHtpwJH
t0xBEZm5m4iX2OfGCCFd4Q7W9b01PcrK/z4cnw68LK+gYvzmV7siorSYOLHlJbJAb7ZkRWSW1TCuLJemyM2G9GrXlHimOwo+XlhTF+QwgNuCYRiKopZEA8sN0vfn1sxL6jq/3VJlgSUfppIxxpikyMZczb1pCINp+OPxdOJG85mzwI1RVtXYikd2zNJUpy6rMjdPE6GlDfvnXwzxHP1iu7bb0aK0+Hhpf+g6kNy8JeKs6I4Dy08lnu6Yosyz5PKNqMTzUcSIIJua8P3TRk0Tpl5yeGGP7AgymwAAQAILAFaQNC8vx4E77yDomCLHUFeeAF7ya1hMkWitLn23X5MF9nPf/dh1pi5cw96Ga1g5QTpx4iQvdZlrm9carktpGItpVUiTuIN1bXdNi9L8h+PpxSSoYIgZcFt7EbEkClNlVU3c+KTvZXm51VLWG9LiOz7E3urFwnIsebChPt/USZI47rmH53YQw8zZmyQrqv4smrqJyFGbTenr5cTSNpxeF2EhomkIzzb1msoNrfDH46ntxxV0mN7s0T//maTlYBLGSWHIfFMXaRqh5dtSvt65Sjz9ZN3YX1MZivzYdT5dOhmkvAEAgAQWAKyYg4IJIoiz/jRKsrKuCm1TYGgSEcucv/rqsSBN4r7ZNZ9saFlevT+zP145K+DC3jBFhXtjf2RHDEV2TFGTWZpcXuPAV3ZBMDS5Xpf/dNCQePbjhXN4bjtBAhks4Da2IfoqpENVVS36B+/NzDDhx/nJwO1PI1GgD9a1jil+0Q1ED/R8okiyoQl/ftbYaaszL/1/D0e9aThPRsO7fBMLjLEXZv1ZGMW5LnNrDVm+bg/HS/61CYR4ln6+ZbzcNasSfzi1jnqeD8nNG19oXDlBcjkNMIHbpljXeXJ+wi7hlrL4ShRF1jX+u736ekM+H/lvjmczL4ZBLgAAQAILAFbKQ8nywvYSJ0gZhmwYgq7w5JJOIPoZKBLVVf7Vrm6q3GgWHXUdN8igCOtmKcpqOAtnbipydMsUWYoilrgp6es3UwTm2aa20RCDOD/uO2dDPwcvFrhxe0OI53mGYYqiSNP0HhNYGGHXT08HXpRmbVPaqCvy37XqHujiEouZs/vr2osdg+fJ0757dOlYfgJ7/E1YDFGWeObGMzemKbKhi6rA0vN5mni5z3+EEJ6f/g1NeL5ltGqS7afvTmZTN34w0zYfhoEQcVrOvHjqJixNNU3BULhr/xAtoVVc/4Um0e66erCulVV1NvRP+n6cQXspADx2IIEFAKvkoaAoKXuzyPZTRWA6pmgoHCIfRriz8FNFgXm6bux1tLIqP106R5d2lkO32E2SZtXQTpwokUVmrS6zNLn8hoEQYhiqbUrPtgxJYC7H4ceuE6U5JjCGUZXAjcbSHMfRNF2WZZZlZVneV6iZ5dXIibvjgERop6XUdZ6myBXYCUkStQ3x1Y6509KCOP/h8/Rk4OVlBe/xH7ZdosR4bMcTO+Y4aqclyzx9vX8+gO9+BcuQTzeMeYcp+njhnA/9BGqwb/QoDZJ8ZCe2l/Ic3TElXebJ5fYPMcYkiZqa+GRDa2ri1Enenk69IIWHCQCPHEhgAcBKeShhkvdnQZwVhsI1dEHkmIci+vt1HGFdE78/qNd1oTcN3hzPbC8pYOrcDVFVeOREYyfCmKhrQk3j6KVPYH1VIJYF5tV2bbulhEn+qWsPZlFRPtBmKuBBhHv43pSD5r1gFyN/6iaGzD3Z1Goqv8xKRr9pSRFCW035+/26xDOnA+/tyWzixJDB+uNrm+bl1E3cKJMFZr0p8xz1sGyGJNFmU369V28Y4tSNfzyenQ28Ck7/m7IQgrD8dGiFZVXVNcFQOHo+n3qZ0+ILMSyWIXfb6tMtHRPV0YVzOQ3itIAHCgCPGUhgAcAKebAEdsO0N4kQItbrkqnyFIkeUBH+dREWT393UH+yrldldXTpfLp0wjiHp3sjFGXVn4ZjJ+YYaq0uGjL/IPKbX6atk+tN6dWuqYjs
xSg47NpXXiyksIAbtTSWZSmKwhiXZXlfUwgxJgaz4LjnliXeaCobDUXk6dVY3sVfNJl7vmUcrGtJXr47my0qbcH8/ghVhS03GTsRQRA1TWgZIkNTD8uBQQTiGGq7rbzcMal5Eda7c8uPc1BzvyELIaZ20p9GJInWGqKhcD95K5fVLbz6gnVdeLFtNnRhZMfvzqxFeykAAI8WSGABwIqAMVEUeOYmg2nEs9RWS3koDspPwhuGJtea8otto21KEzf6y9Fk6iXwfG+EosSDWeQEqTTvIBB5et5B8ADCA3w9k4h5tWtutxQ/zj+c2RM7ygu4iQVucgvieZ5lWYIg7rGFsCiri1FwNnBlgX66pRsyRy3EllcFhqbaden7Jw1DZi/H4fszG6bO3cjePrZjjqHbhlBTeYZCD8qBWWhhETWF/36/3qlJMy9+d2b3ZgHoYN4IeVGNnXjqxlcuVk3SZe6B7MkEIhDPUAdr2rNNvcTVh3P7fOiXJYhLAMDjBRJYALA6hGkxcWLLj3mW3mwomsQ9PIUgTCCSYEhyf017sW1UFfHh3OpNgjjLwVX546R5MXWjJCs0ianrPEUtjoCHUIQ1/5I0Te62tYMNnWep7sg/7NpuBNV5wA0GS4hhGHIufX1fUwirCjtBejH2x05SU4Xnm6Yisau31DJHf7NX31vTyxIfXtife25WgDbz76coq7EdzfxE5OmmISoiR5IPSTTty7AZLAnM7rr2dEtnGPK07368cEC0+wYcK4z9OJu4UZwWssA0NUHiaYJ4QOaB2nXp2ZZpSFx/GnzquRMHUt4A8HiBBBYArAgVJmw/GVpRXlZ1latp/FwCAxEP6pIKo+tURdMUnm4ZNY2fOsnbs9nFKEBw2/bHyMtq6qVjO6FIsmmKDV2gHlpNB4mQKrJbTXm9Lnth9ubYmjgxQYB8DnDDwdI9fnqSlUd952LoEwTabCltU+Boat4qu1JmTlFkU+df7ZjrDWlsx2+PZ26YwUyG331yplk5tOMoKRo639D4xd3Eg6vaWyjOKzzzasfsmJIbph8v7JEdlTBz9o8uLGF5yXAaEhh1TElTuPlQiAe0JxOKwGw1la22WhTEUdc+7rswiRgAHi2QwAKAFaGq8NiOepOAQmijqagii4i5/PWDcmEX3xXPm8V228qzLYNExOG5/f7MClKotflDHmySFsNZNHVjjqY6ptzQReqB9ZhgEiGSvIrqX+2aLEN96tpHPdeJUkhhATcXKSGavh7flmXZXdawYIyr+aSwo647tGNFZPbXVE1mSRLNbyLQCoXTmCSRwNIvd4xv9moURR5eOu/PZk6QEpiAq4rfSl5UTpgOZkFRVhtNuV2TSBI9zLfv6ifHkE829GdbusDSp3333cksAtHuP0ZZ4ZmbDO2IJImtlqJK3GKcwoOo0fs6T6Ou89/u1Q2FvZyE704t20+hCAsAHieQwAKA1YgHrjsIhlbE0NRmQxEWor8PUzdlUWzVMsQ/HdTbNXFkRT98nvUmIWhh/JFFjdKiPw3ipNAktmmIsrCYUPmQmggWf2lo/De75lpdsoP0zfH0+NJFBEi5AzfkFZEkx3E0TZdlGUXRHctgFWU1ceLjnhsl+XpD2l/T2LkU90O7ifjldxljAiFirS5/u1fbacsjO/q/3w67oyAvKwT56N9IkpUjKxxbESKIrYbSrknkQzYXkkSmyn+3V3+yodt++tejydCK8qKE1Obv9Q/xXD0gcYNc5OmNpizNJ1QS+GF4iF+/pCqy3+7X9ta0fD784bBrpzD8AQAep6sGSwAAq+GgeGE2cdJ4Lm/UMOZdJw85viEQ4jl6s6m83q3xHH0x9o4v3SQtMRTb/E4LIfwov5wEmCA6ddFQuYVP+BDLOliaahnSs01d4OizoXd47gRJDnquwE0FSxRFLfSD8vyu7SpJi4uRN7BCjqW2mnLTFBZCdauk4P4PezxiaHKrJb/arbE09fnSPbp0nCCFN/m3EqbF5TSKskIR2ZrO
Cyz1UIprftaZuXoHSbSzpr7aNWWBuZyEb09nlp8SCC4qfg9Vhb0gmzhxUVWGwrUNgWPphWLDw4KmyJrKP9/STJUfWsmHU/tqu4CjHwAeH5DAAoCVSE8QeOLEYyskMNE0xJrK0fTDdvUQgUmETJX/89NmuyY4fnrYdcZOXJYliGH9vqjADbLuOEAEXmtIdY1bVC090IjAkLnn22ZTF+0g+9S1BtOwgotY4Ebfl0VcdMeZozApToeeG2S6wm01FU1kEbG6dSfzpuCmIX2zV2ubohWkb09nlxO/qDAEpb+JIMrOhn5eVA1D0GVuUX71QJOe6FrOHdUU4dmmsd1R/Cj7y6dJ/2qTB7v4PVQYT/xkaEe4Iuqa0DREliEfxvSWf4Jlqedb5nZHSfPic8/pTq7MHh4xADw2IIEFACvhoFTEyImGdoQQ6tRETeapBzWB6OdCmyvnSuLpJxv6sw1DEdnjvvPXo0kQF+DB/o5oPEqLmRfbfsJx9HpNNmT+If9CiGPJzYb8eq8mccz5KPjUdeKsADV34EZg5lRVlaZpdYeZ0bzAEyc56/tVhQ/WtL117Xo224rWnSy6IkmSWK9J3+/XFZ45Hfofzm3LgyKs37aQXph1x35Z4bYhmQq3CgaDrgyjbYrf7NYEjj4euIcX9sSNQeb/d1BWxNSObS8ROKqhCarI0uRDvQekEFqvyy+2jbYpDqzo/ZnlxRk8YgB4bEACCwBWIT2RFdXYjtwoVwSmY4o88+C7ThC6biVQJebljrnTVuZKWJPeJIALt99sIQTh+OnlJIjSoq7yTV0UOHphOg/TNq4MQ1e4b/dr6w3Ji7P3Z7P+LMiLEjpMgD9sXYjjOIZhyrKM4/guNbCcMDkduBM3VkR2b01br8urbc9ffztD4V/t1vbWtTgt3p3Zxz23KEGc+Vfu7TiI85EdO37CMVTLEGSBXZk30VD459vmRkPyw/z96exs6OUFGMZvJi2KwSz0oqyuC+26xNDUQobuIfq6JIkEnn6yrj/f0suy+nBu9yZhmpXwlAHgUQEJLAB48JQVEcbZcBalWVnX+bW6RJNoNfzXq58k2l/Tnu+YNIlOB977M8uL4MLtt7p9hB0kvUlQFOW8QI+lSfJrBcQD/H3QfFIVtbemHWzqCs+cDPx3p1YQF9BgAvyR6GgxmWuhgVVVVVHc6eyzsR1/uLDTrNxsyht1WeKZRSJ/xdccI4pE6w3pz08bNVW4GHlvT6eWl5QwI/9XrSBh+VlvGkRpYSp80xA4hlwBxbRF/ThDkxt16eVuTRaYs1FweG7bfoIhhfXb/EMcJ8VwFgVx3jbF9br8ZfowepA+4bw4f70uv9qpzZWwwh8/T0d2BA8aAB4VkMACgAdPUVSOn0zsuCyrhi526jJFr86rjQhkqtyzDX2zJbtB+uZkNrbjAoqwfpsLW9lBOnUTAqHF/MGF7/pAo5yF9CxFIl1in20Y2x115iZvTyzLiyuQwgL+SHT05S8keZ0FuJtG7ArjCuORHZ2PPIoi99e1hi5cWTom8EoLV18t8vz3UwT29V5tb01Ns/JT1/586SZQVfHrcPxkOAuzvGoaQk0VaIpcmZcRY0IV2Zfbxm5HTbLi44V9NvShOO+3+Yfl1ek/8ZKirBoG39SFB32/ieeWIYvMbkedjyOsfvw8vRj5BeS7AeAxAQksAHjwJHl5OQntIOU4ul2TDIWjyBV5tRfRI0lRW23lu4OmKrHnI//D+WzmJfDcf/UaEmGcT+zYCzNVYjs1WVz0Dz5YMRH01TBI9HRD+26/TlHobOR97NouVOcBfzhyZhiGpumqqpIkuZsEVlni/jQ66XuWl9U17mBN02R2YeuPpCuWZan1uapduyYNrOi/PozGTlxBquLf7+0EUZR46iYTJ2ZoslMXNYVdIcm0q6dPUeRGQ/lfz5tNQ7wYB29OpkGSl3BR8auJ0vxs5LthqohsWxcVkX7Y4hLzPZokUMsQ//ykoV35hMGnnjNxY7AKAHg8
QAILAB64f0cQUZKdDIMkLesq19SFlcleEV90GkiC0CXu2aa+0VTirHh7Yl1OgzsrjlgBM7H8pD8N4qyoqULLEDiW+uoKPmDDmDd8mSr/ZF1bq4uOn747syx3nnEAuwD+gGnxPE/TdFmWURTdjQZWhavLiX8x8rKsWGvIWy1FFphHdZAhArMU+XLbfL5tlCU+7Nqfew50i//CumHsx/nICv0o1yS2Y0rKqghgfXkXiXl1HvN6v7HVkuO0+NxzT/pOkkJ13q8lisvzvu+HmS6zdV1gaOqhd5heHfGIkEXmyYa23VairDjquufDoCjh4AeAxwIksADggTv+88nr3ZEfF2XTENumSJErdWe/KLehSLTZUL4/qMsCczrwPl3YTpDC/fyvxPbSiRtXGDd1wZA5miZXIMWzkMOgSNQyxRdbJs/SJ333ZOD6cUaAljvwx0yLnE9xvbOO1Ci5isyHs1gV2b22qkrsQo3rsSz4fOwsSRINnX+5bWw2ZNtN/nY07U4CGC36b6gq7AbJ0IrirKypwlpdUngGrdyCMTS5XhOfbxh1TRjOor98mlh+AlbxKwnTbGCFWXFlIabKr8Cucq2OilBN419sm6bKDabhxwvryieES00AeBxAAgsAHjCYwHlZWV4ydiKKJNqGUNd4cuXCnoW/UtP4b/ZqW00lSIoPF/aHcysDJaxftBCMK0xM3cT2M4GjOzVB5GlEIIxWwc9byAPpMvts21hvyLafLWYSERD1An9sw1mU+C24gzTEzE2Oe44f5y1T3F/Xrpt88aNa86sfLE3tr2mvdk2GoY669uG57YYZAUHpv6Cs8NRNpm5CEETLFAyFJylEEGiVapMXg+d4ln66qR+sa1levjmZnQ69IM7BAH6RJC9mbjpzY4Yim4agiuyK3G/OLVzkmadb+t6alhXlp659PvRg8gMAPBIggQUADxvHT3uTwAszVWRbpigLzKpe2zM02TGlbw/qusJejIK/fpo4QYor6Bf7BTcvL6qBFbphaircZlNl5gL/qyGtg+YxPktT2y3lxbbBMtSnrnPcc7OigptY4Hca1f/UwLqDIqysLC/Gfm8S0BS501HW6jLLkMRi2tajW3yiaQgvdmo7HdWP8zcns7OhC/LMP7+3E0RZXu3tMz+RBWarpQjz3vBFb/UqvY/zvRx36tI3e2bDEPqz8O3JbDALwQZ+ET/K+7PADTJFZNumKPP0ihj/3MJJhNqm+O1ezVD5y2n4/twKkwKOfgB4DEACCwAednrC8tPuNAjivKbyTV3kOZpYxQzWwiOXBeb1bm2npcRp/v7CPh16cV5Cv9i/ocI4jLOJnYRxYSj8ZkthGGrV3gGCqKv8NztGXeXGdnzUc0ZWVICeK/B7txqe51mWXWhg3UECK0nLk747ddOaxh1s6NJVkIkIAqFHufgcQ2815e8OaiJHf+65hxeOH+UV6B3+81phnBfVYBK6QabJ7E5L5Vn6H4dprpBVXP1Sisg83dCfbupFWb0/tS+GXgXXV79EEOXDWeTHuaZwrdqVf7ga5vF1kIvA0q93ze2m7IfZUdcdWmGaFfDcAWDlgQQWADzo4B27YTqdj1+pqYImcyS6LktZSRgKNXXh2ZZpqPzYig7PLC/MILT5N2R52Z9GdhCTFGkqgnk9oRKvUsS78GI3GsreusYw5OnA+/F4upjBD5YB/G6juoMxERjjrCiHVtSbhmVZtk1xp60yFPmYj7RFt/h3+42NlpLn5eG5ddx3sqKERsKf7u0ltoNs6iV5Ueoyv1aXWJZa5XAFoaYpPt8y6qpgBcmHrj2wwgr2+H+LF6UTJ0YEUVd4TWRXb5dmKHKtIe+ta5rCDazwb0dTJ8i+bCQAAKzuiQBLAAAP19lP83JsxbaXCjy9VhdVicWLKcOrGdpgAiFJYL7Zre111DSvPl445yM3y6HW5l+S5uXDeCKRAACAAElEQVTpyHPDTBXYTk0UWPrK61ut3iSEECKRKrPf7NZaujhxkh+O
J2M7hs4j4Hdb1FcR99vOYcVJcTrwRlasSNx2S+mYEk0/XscMzyczsDS1Xpe/3avXNeGk7747tWwvhYj0p3t7VvSn/sxLeIZuG4KusCs2v+WfkXlmp60+3zYJgvjUdT/33DQt7kao7iH6S3lRzpxk5sYCT683ZUPmV+6XJBCFJJ7ZX9P2OlqcFn/7POlNgqwo4fYKAFYbSGABwMN1UAgvzC7Gvu1nNZXfbCm6xF2f6isaVRIEQZNot6O82q21DLE7Cf77cDLzEzCGn3Pt5uFxVp70PC/MGxq3UZdoavXyV9cILL2YwU+R6NOF8+5sFkQ5dJcCv2Of4Thu0UIYhuGtthBijB0/m0uVp7sdda+tyvyix+eRRl/oi7qNyFH/+azxYscI4uLH4+n7c3selAJ/J06Lk4E3dRNd4TYaEsvS5OqWXy+EvRBCLUP6v162W4Y4mIZ/+zS9nAZwUfEv/cMo689Cy88Ukd1pK6bKr9jlJkaYnI8s2GmrfzqoKyJ73HPfn1sjK4KUJgCsNpDAAoAH7KD4UT6x47yoaopQU3iOpebD5dDq/sqYQIhn6b2O+nqvhjH+dOlejPwozeEO9ucWC/tRNrKioqwMhTM1niRXtkCPJJGh8M+3jLWa6ITZx3N74sbzAhowDOA3gBCi51RVlWXZ7SWwMCbSvBw5UXfsIwIdrKmdmoTmQ2Qfo3771/Wf/6QosqkJz7eMpiEMrfjtyXTmJUUJOay/7+5hkg9nUZyVuszWNYFGiEArazlfpNwJkacP1tWnG7rA0kc9+8OFHaVgFT9DhbHtZxM3LorKkDlT4Znr0s4VEhC43i2QKrH7a+p2S0nz8mPXORv6OYyoBoCVBhJYAPCAXVjLT6ZezLJUpy6JAr040lc4YP9am7Bel//0tN7Q+akTvz2dDacRZCn+mTAuRlY0dSOBpZqGqEksuaLC0IssFUmi/XXt2ZZJU+Rx3zvpezHouQK/c58hbl8DC7tBdjrwRnakSOyTTaNpil++wGM/2giC4Fjq6Yb+fMsoK/zp0j26dP0YXudrigpbXjpxEoogGoZoqPzCaFfYcha/GjW/qHi9X99qyyMrfn9qT524rOD8/ylVhUd2NLZjliE3GrIsMPM1xKuX4kSIINHVW/Byx1QktjsOPnZtJ0jBBgBghYEEFgA8UBf/yoWdOonlpRJP73UUiWcfR/CzGEdIbzaV/Q0NkcTbE+t46Gc5yLn+lCDKhrPQ8hORp1uGqIosSaIVzjiQiGgZ4sGGulaXvCj7cGZZXgJ6rsDvMKevGli3+kFjJ/p86eRF1TalliHxKy3C/dteZ4xJimzXxJe7ZqcmOUH634fDkRXdgbL+gyDNyoEVztyY56iOKRoyT6DHsiwsTR5saE82dISI85H/8cIO4xxM4ieUFR5Z4cSJeJbabMkiT//DUbmCXqEuc0/Wte2GkmXlp6573HfL+e4NmwUArCSQwAKABwnGOIizsROFaa4KzGZLER5N8DOXwyDrmvDdfqOmcN2Jf3hmDWYhXMP+j1UiCD/OR3acpKUh83WdZxhyhcO/hUgKQ5PbLfW7/TpFos9993TgBVC1AfymUGiugcUwTFVVcRyXt9O2hjGR5OXlNDwZeCxNPt1QdYWd90nBJkYs1NwRQbA09WzDeL1boxF6d+YcXTpelMECzadzFMNZ6EeZIrINXZB4ai5tiB/H6Y8MmX+2YazVpZkbvzmZjZ04B4m0f6CqcJwVYzsO4kIW2fWGIrD0am/bNEW2a9KfnzVVib2c+B8vbC/MKthOAWBFgQQWADxIygqP7HhsRxRCNZ1vGgLLkI8nwkSIEDnqyYa+t6aVJf506Xw4t+b9YuCv/N1CbD8Z2xFJki1TNFWevO4xWXGRlKYufH9Qr2v8zIvfn9mDWQAlWMBvgmEYmqbLskzT9JYyShjjsR2dDTw7SA1V2FvTFJFZ4dfzN7/O1y810TbF13vmWl2y/PjtqXXSdwtQtyEIP8oHs7CoqoYumCrP
0NRKdof97D5/9YZSaLstP9syEEKfe+7RheOEGVjFV/KyGtvxxE0ITNQ0vqkLLLPKF5yLs18V2e+f1jebSpyWR5fuxTjI8xJOfwBYSSCBBQAPkqLC3bE/dmJNYtfrssQx1CMLfBBCusy92q11auLYin48ng7nauVgGwvmPSZXLqzI02sNeT6BaOVN4uonz9GbTfnltsFQ1Ptz+2TgZwW0lwK/J06u5lMAbuPfLyt80vdOhx6JiJ220jQEhgJ/7J8fA8HQ5GZL+Xa/LrD0cc95e2p50WMf2VGUlROmIztmGWqrJZsKT6DHlfxEiKhpwqud2nZb8aLsv48m3bEP7aVfyfOqN4nGdsSy1JopqSJNrbaDuEhr0tRaXX6+pdc1vj8LP5zbUH8NAKsKOEwA8ADBRF6UIyuyvESV2ZYpMDRJIPSYfLcrT1Vg6f117WBdpyh0PvQ/de0oKTBUYc1XJ86KqROHcS7xdEsXVYl9JL84QoQsMs93zIYhDGfBcd+deQm0lwK/OhRCJEkuuvmKoriNkBhjnGRFbxpMrFjkme22okrcvIwArPR/Pot5UsaQuaeb+lZLiZL8w5l92nezvHy82zwmkqyYuYnjZyxDd2qSLrPokZ1uCCGGInfayqsdU+Co455zNvSDOIMXaEFalGMncoJMYOmGIXIMMx9RufpHP09TTzaNvXU1yYtPXXvixDCOEABWEkhgAcDDo6gqN0xHsyhJy6YmtHSRItGjGl216JUgSdTU+D8/baw3pYkb//XTdOomZVVB0ThBEG6QDmdhiXGnJpkqR5PX8/lXPvuACCSw9PNN42Bdz0v88cJ6dz5Lc7iJBX6tCfE8z3Fcnuee592GBlZeVAMrPO55YVpsNuRnm7oqMOha9wn46QPhOXqno/zvF21V4j5fOn/7PPWix6tugwk889Pu2M/KypS5pi5KPP3Y3tBFzc1GQ/rPZ42NuuxG6duT2WHXgRLshb5eEKdDO8yKsqZxbVO4Lr9Cq28VJImebWrf79dlnvl86b45vdorwCQAYPWABBYAPDDX5Pp6zYomXkKSaK0u1XWefHT9g9fpOo6hX+6YL7dqHEtfjL2jS8ebj0/Gj/J+HhN4QYWxFaQDK6JIcrdzLa+DMXoMSU48TwToEvdyx2gZwtRLPpzZMy+pKjy/poXkJvAL0DRNklfeUVmWN2swi38tycuTvj+yYoGldzta0xBphsKQvfrXKyYL7J+e1g82dEwQR5fOh3M7ySr8mF7nL5vX1S88deLuKGBItNmSdZlbFAw+NsNYVEq2DOnlnqnwbHcSfjiz/Siv8CPd5PH1y4LLqpq56WgWI4JomVJTF6jHZB4CS++11f11LYzz92f2yI7ysvriGcGGCgArAiSwAOABufJXp3BZYi/Mz8bB1IlZhmqZoibzaD70/RGuCUWRLUN8sqmu1UUvyv5yND4bu1GRlbh6jD0mmKhwFZeZl0b9mT/zE46lttqSyDPEo6nQQ/PqPIahdjvq3ppGVPhzz/nUdbwou05HgBcL/FJsfBsZAUwQJS7jMp34/scLywuzus7vdRSZZ9AXHWLgn58FQRAsRW405OfbetMQB7Pw/zscDW0/Ka/2efxYTn9c4iq92tvD3szrTUOOozZbsjy/nHi0yCL7Ystcq0tBmH04mx33nSjOK/woi7DnF1dplbtpeGF5EzeiKbKl84bMkeSjcpNRsya92DZ4njofep+6zsSJixLPZQRgjwWAFYGGJQCAh5CXIPKiCuPcj/Mozi+nweG5EyaFIjBRUoztWBULRWAZmrydyGtpHfrSzYKgCAN2LNaDYpwdnjlaDfv0VFdZnVUURlIYkUbUai/KvOSq8ovYzXw/D5wksL3sfT/NslJWsE9NBhnGiSbTEkcxK28fFcZZXgZx7oeZxDM8y4yd5L8+jKn5NEaBoyWeViVmPrcLil6An8mYUHOqqsrzm9ELL6oyLGInC530/7D3Ht5xXteh72lfL9MHGDSCVayiumRb
Vmzfl5ubm38q/9N76927HCdxZFuyrUKJnSABkqjTZ75eTnsLMwClxDYJXvktc4bfzzItL2GwhI19djt77zPyk3Rnnz7cCzNOFQdm1uAgQw1csomBIS7k/5ek5+WhsMZWPTkY5/efer/ffLzKlLKluYrrKrajGATNp/S4FCFNvCzwWeTnwSjI7u5nXpgbtkyU0UFGZFq2iWliDcDXKCLiXMQZDWJKqShZGkLh07b/+e0DxmSjapoasXVi6q+FmWeCJzwbZ4FHw4iFQz+5s0ujhKqGCPFon+rlxHFVy8DqdPhyrmMhEGc0yaiCsWuqAy/9aqMHATy7VFJVbBmkbGm6ijFGhesvKJhpigJWQcEMJOSUi+4w2dgdbe757VE8GKc9L6JCpJT/8V63PYxXF+yLa9WlmmWo5HWYJ+CHMqGDdPT14MGd0cN9vzcCgGqtIMC/ub/xZdouL7AVe+Vy6ezl6pmmUdHwPK+5zUXu5dGd0dat4cPdYG+cjMKBmu+uCNTw0OhXndv3ZfliZf1K5dyqtWgQDc2vfjAuopS2B/Hdp8On7WC3G1Euspzd3OofDKN6Sa+5xkrTurpea1QMS1dQUcQq+BM0TVNVVQgRx7EQ4oclVJJJNsiC+6PHtwcPN/3Hvk/j/QodrzFEd8TjX/Zv3M0W365fulI5XdFKKi6isv8sPcEHubcxfvrN4P6T8UEbC64uD33y/9zcMsfdcg2vWCvXKucvV8/U9RJGBM3Zjy+5n4d3Rlu3h4+e+rvjdBQMEd1d43yB4eHvhvceb5sXq4e2/bS9bGAVvQbNNhKAnDEvzB7ueI/2vXY/ao8iCUCS8y82ek874ULVbFbNMwvuuZVy1dU0Fc+ry5NAcskHmf/Y37vRv//E3/WyQejLdPuc4DWujP7gPdzf0s+WTl2vnT9ltUxFRxDAeaxhSSm5kEnGt/bHj/b9h9ujLOdcio3dUWcY1xy94mrNqnn1dPXUgl22dQXjwvUXFMwuRahUUPBKB2pSyFGYbe6N/3Cv86QdjMIsz1hOBZUCyMOIbbsbdEfxg6ejjW3vzbPVK6drCxUTo7mtUUwWPYhuMr432vqq8+3j0aNR1M9FypieW6kEQap0g7g97qbd4ZPHxv2blfX3m2+917xiYBWjeXtKWkqZ8OxG/8HX3dubw4eDqBvTmArKc5UbqZRDovl5sD+meHvw4LZ770rz8ieLb9f0MoJo/kRBmdjuBrceD+5ujQ6GYRBRymXOhBQgydnBIBqMU0UJ7j8lD56OLp+uXj/bWKyZCi5G6Qu+Y7pbB02GshljP6SAJYEMaLIZ7Hy2/9Xm4GEv7kWZzxhkoCLsHCDKQCcbDLvjzd3R1oPaxfcW37xWOatiBRWp1eTmJuXZ7eHWl92bj4Yb3aCT0DgXGnMyicsJfhqG/REVbeXJk/6Du7U3Pmq9fbly2iTafOTnQoqYp4/Gu3/sfnuvd3cQdmMWUcZYRoSWgfIQagGNO96B2B1t3HPuXWxc/vHCm0tm49D7z2+XjRAyztmD7eHtreHdp8OBl2WUU8YpF0KCKGW7NOqOU33Xu2sPTy85b51rXDpVLdvq/N1USCAjmt4dP77Ruf1g+KAfduI8mnh/xA0BQQ3oXh73/D22Pdi417t7feHKR823WmYNz1dP+nT5V075Tjf8+mFvY3d00IujlCY5ExxwLoYsC+J8r4/1XW9zd3xhrfzW2cb5lbKu4uL+qqBgRikKWAUFr3Ss1hsnNzd7Xz3o3d8eeXEuBDA1UnJUTcEAQspEENFRmA2CdBBmwyD1Y/r+xYXluo3ntA9LCLETd75s3/pj++ut4WacjIGkWNE1wzSXqWS+JJRBmOf5MAmG8XA/7Hipn7D0ncalhlHBEM1N0AYlGGT+rdHD/3j6uweDB140ACJDSFE12y47wJGcBgwkuUj9OPHififudeOu5OxHi28vWlUC58T+T19VT3P+cG/85f3u7a3hTjfMKMMYOpZasjSC0XQIN0jyUZAO
A9D3kr6fhgl790L97FIJwqIRq+A7pvrwA7f+8slU763+g9/uf3Gre9uLu4LlCtEU1TYbGJQiDhiDWZ4nCffGqdeJ+n7u5Ty/Vj3nKiZ4vRVSShCy+Pbg0b/t/O5+/94o7EqeYaIRSzGNFDIfaCAHKM3jJPWG8aCbDILMi2j4XvOqifWZrwBKMM7C28NHf9j/+tv+rVHQFjzHCBPVcV0bWoDTkIE4B0kQh3487Ib9g6ibZNGPl945466QubuqAccPcAyD7M7W4MuNzoMdb7LbSGgEl2xVVRDBSEiZZXwcUS/KB37S9+JxkCU5u3am1iybE685P6KIWfp1796nu7/f6D/oRW3IKUJYVW3LtoEhJIsoTHKRhEkYJKN21BsmQ8roe81rZ9wVDOelynlooWWS0a394A/3299s9HrjNMkYmWiFUcIIQcFlkvMgzvwoG4bZwEvHQZ5RcflUxdAJPF60V1BQUBSwCgoK/grVq76XfHG//ZubB48PfAhhzdYaVXOlbi9UDNPAUh4m7b1xcjCIDgaxH+cbO6MgzpOM/eKdlcWKieeutYRL3k4Gn+7+8bPd328PtxAEllFqmI2ms1gzm65mIwS5EEEWdMPOvrczSkZp6t3e/2aQDHOefbT4dtOozEd3g5RilId/7N78lyefPu7fpyzTFLNiLLecVsNeLOmugjGXIsriftQ98Pf7cS/Jwyf9jV+yPOP5x0vvrtmtuWn0iFO6ue//65e7t7YGfpwTglbKVqtuLdftiq2rCpJAppT3hsn+MOoMk1GQPT7w/SiL01wheKlmqUqxfqjgiOmzbpMXM/j/cVYV0eRG/86/Pfntzc63nGWEaFV7ccVdq9sLJc3FGOWchpnfCTqdoD1IBqPo4IvdxMsDAMT12kVbMcDcJNwvT8LTb/sb//L005sHX1OWKMSoOq0Fd3nRWXDVEsGYCeYlfjdst4P9QTL0o94X2e9Hmaci5Ur1vKuYM52UjvPwRv/eL5/8x6P+vTSPFUUrO0uLVrNht6pGRSFESB7lcT/qt4ODXtSN82BvuPmrPEx4op362YqzQCCaM+WRAIyC7MZG91++2tnrRSllJUuplYzFqrlUM21TVTDiQkYJ3emHB714ECRBwr7d7PtJnlL+waWFumPMjUQiltzoPfhfT/7tXucW46mClZrTaloLC06rrFeUyZ1NTJN+1Gv7+52om+ThzvDRr2jiZwE59cmy1dCwOheBkGRCPtz1P/1m75tHPS+iuopXm85Sw1pt2mVLxRgyJsOE7vWjzjDujOLOOAnvd8KUAiAvrlVsQylcXkFBUcAqKCj463jlJOc3Nnr/8c3+47YPAWjVzGuna9fPNdYXHecw3YYAwMnGH7bXDb/d6t/c7O/1or1+9B/f7tm68tPrSzVXn5urpcmNIxxl4eftbz7d/rTt7WIES0b94sLVDxbfPueu1vWyThQIoAQy5Xk7Ht7o3bvVu7U52AzS4f5w85dSKkj5eOndWc9tptKIWPJ1/96/Pfn0UftbgJBrlNdrF95aePPN2oVFo2oSHUEkgcw4HWX+reHmt93bD/v3+mGn7W3/8vG/SiD/8dQnVc1Fs9+SxrjYOvB/9dXOVw+7ccpcU11t2m+da1w5XWvVLENFCEEpgZByMkcQ3Xk8/PJBtzOK28Pk93c6QoJ/eH9tuW5DBCAobmJfdyCEZIKUklI6HSGcdvmd/JukLNvwtv/X43/f6N5hNLGMylrlzNuL199rXFkwqgbREAACyJhl+/HgVn/jq4OvHg8fxVlwv/0thtgk5qXyuna0dPm1I+N0y9/930/+/U77Zk5jyyidrpx7e/Gta9Xzq/aiPhmxFJPp6d2of3vw4MuDG09HW3HqPeze+V9EJ0i5XrugIjKLdv5Q6yT/dvDg37Z/c7d9gwtma+VTtbPXm9ferl9esKo2MSfLe0TG2SgP7o2efNO9vdG/1/EPRkH7s53PXdX5OflwyajP1ZiYlHHK7jwe/vuNvYd7Hoaw
UTKunK5dO1M7u+RWXUMhaDoVnzPR95Ktfe/m5uDe09HASx/tepwfOoCfv71iaGQOGm5SkW+Mn/6/j391t3OTsdTWy6vlU28vvnOtdmHVbpjEmF5NUcmGib/hbX/Vvf2gd6cftA/GTz/jOYD4H059vG634Ow/TpxzsdsPf3fr4I/3OinlpoHfWC5fPdu4cqrSqluacnhWxKElF30/3dr3P7u9v7nve1F2a3MAgSQYXl6vqQQVTVgFBbMF/ud//udCCgUFr1wET0W7H//qq93NPV9K0SwZn7y18sn1pTOtkmOpKsGTJS2QYKQruOzoK3WLEBzG1ItpkjLK2ELFqjiaSuanryTh+QPv6S+f/Hp3/ERKWTGb7698+I/rn1ypnKvpJYOoBGIMEYZYxYqrWMt2c9FuppL7eRDnQcbSVIplp1XXK7M+MCaAfOTv/Gb3D3c6tyjPHKN2rfX2P5z62UcL15pG1SSGgsiRKBCxFXPRrLWsRYLVQebFWZRmPofA1NxFs6ZgMtNVGynlIEg/v9P+4n43jKllkMvr1Z+/s/rOhXqrZhoamZwViKeHRcUVR1+sWrZBooSFCY0zFsS05urVkmEUGzEKJkPKg8Hg888/f/DgQa1W+8d//MdWqwVOnPVKICGAe1H3X3Z/d7d9M8lDU3cvN6/9z/Wfv9e8smQ2TKJNLBUmEGtYLSv2st109dIoD4M8zGiU0gQTo2HUKpr7uimklIfS24k6v977w632jSgbW3r5UvPq35/65MOFa0tW01Q0cmjcMEET6al2y2rWjIrPUj/zUxqmLJOInC6tWooxix2mTPKDePDr7d9907mRs9TUrMsL1//n6f/20eKbLbNuH9r279ycpRiLZnXFaSlY72bjjMYZjRLJDNVatZfm5vldCaQAcnPX/+z2wf3tIeeiXjbeu9j8b++sXFirVJzJu3IIoslfhCBLVxYqRnPShD4KsyRjXswYE6sLjmOqBMGZrtoIIHbCzq92Pr/b/iahoamXLjev/M8zf/9+88qqvWARfeL98cT7KxY51JAzpVWINI+GYeZlLBtTf9FaaJp1jcx8E9Y4yD79Zu/Gw94ozBxTubRe/cU7K+9eaDYrpqGSw6Ny6P2RSpBlKI2yWXU1xsU4yOOM+XFu6KTummVLLVx/QUFRwCooKPihDIPs64fdrza64zCvOvqb5+p//97qct1SVTwtN0y87VFbACHI1EjV0ZmUg3EyDjNKpa6RRtmoOPqcJDZAbkfd3+59datzI8kC26xdXnzzn0797HxpzVC0aesMlN+VYjBEpqKVNbeklQIa7wT7XOQRTRbsxSW7qePZjleYZL/e++qL/a9GUYdg9XLrrf9+6pPrtQuuamF4mLV9XxQIQh2rFc11dVdAtBfspTRIOCVEO1Nac4g104OEnMutg+DX3+zt9QKE4MXVysdvLr17oeFamoIRgHJ6Vp510BACLV2pOjrEcOCnXpClGSMKWqqZNdcootgCIUS/3//ss882Njbq9fo//dM/tVqt6UThidRDglTktwYbv9n+rB+2ESJrldP/48wv3qpfrGouhvg7SzUZEMQQWYpR0hxMNJ+GvaiT8zwU+YrbOmW3IHztnnvPBL07fPirnd8MggMgwRsLl/9u7adv1y/V9BKZNknKifwm0iMQm0Sv6K6umIPUa0fdnCdUsPXSqYpe0vDsDQdFNPld+9sv97/o+nuaoq9VL/yP9Z+927hcUm08VYZDLZw0JE92tWlYdVW7pLmQkHbUCTI/oKFG9NOlNVux5mPno+Ayy/kf73W+fNAdhZmpKR9cXvjxlda5lZKukkkzOjg6nVKCyZlRJyuQSqaa5HzopV5MGRclS1uuW4ZOZnq4MsijW4ONf9/5zTDsYkwuN67/bO3jd5tXKpqtoMlUzfSdQQkklAhADSmuajWNWgbEXtRNsiCjCVGtRavZNCozrRiUic1979ff7O72YozResv9Hx+uXTxVKTs6Pjor0wuFSUyIoEpw
1dV0hYQJ647jJGWAg5qrrS06xbsZBQWzRfH6UkHBK1epAQCMgvT242GYUILh2qLz9oXmwuQ6cZJEPbs+hJO/P4zdEEI117h2uvrGqYqhkZTyzX3vYBAxLsAP2EP8ColFyoOwfWdwN85DiNWz1bMfL7+/7i6qWAFyuoL7u5INPP6PQdSL5fW3m1fWK2cJxFEyfDTc3I26EogZrl4JPs6CzdFmL+pIiEtW40dL775ROW0q+tH+6e+L4hgV4zV78aPFt06VT6uKFaX+zvjJk2A/E/lMH5YgoTvdYLcTSAkaJeP6ucbFtbKlK3hacTg+K9PSw+SwQIRgxdGun65dPlWtOHrO+faB/7QTxBmVc3FYCn4geMJ0hJBS+rIjhKMseDR+Moo6nGU1o3GlfvlS5Yyjmgiio9oV/E92CgDgKtb7jStX65cdoy6A6AX72/5eP/PAa6aQEshBGjwZ7/W8fca5a9au1K9cr52vaDY6WrZ8LLPpnQWQCMKSYl+tnrvcuNRwWoLTUdR74D0ZZt7M+TggQUjje/0HvaiHECpbjQ+W3rlUPWsR/Zklh1N7diwDKaWC8JLV+Kh5/Xz9DVsrpam/N9554O0kLJ0PrWBcdEbJk3bQ82KV4LUF5+2ztdMtV8F4Wrn6rj8SHv0fCKGC8XLD/vBSc73lGgoO4nxjd9wdx5TPtplvx6OH461R0BaCNeyld1rXrtXO28RAcBofHunG9IBAeGhxEESLZu295rULtfOqYjBOH48ebwf7CctnWhZJRh/ve+1RnFNWcbTLp6oXlsuOoR2djWMzMVWMiWykrpKzy+W3zjWaZRMh2B7F291wHOagcP0FBUUBq6Cg4IeE8IyL3jh9cuBTKlxTPb9cunKqip9dvv4Jk+4AgBBYbTqX1yo1V+dCTtZVJnHGxFxkNYnI96L2QbDHBDN191L1wvuNKzrWji7Z/lQmk8geAWQS7UL17PXFNw3FZpxu+Ts7wT4TMyyVlGdPw04nOEhppGnWSuXMu/XLFcU+3hT251MjCaCO1bPu6qXmlbJR44IOot7D8dOEzXIBS8LeONlu++MoUxS8vui+sVZplM2jVxr//GE5lAZCsFUzr65XVhqWEGAQZk8Ov0lemJ8CCCHGGCE0LWBlWfZSOZ4AcpiOH3s7SRZBiFZKq282LldUFwEE5J9Zsja1VAThhl6+VD2/Wl4nkOR5tO3vPg0700e2Xh+ElJ20/zTcybIxRqjlrl6qnW/qFQQRkH/mPD/rjKur5Sv1N87XziGAMho/GG/20/HM/fiZoJ14uOdvR3mgKNayu/ZB4+qCXnm+ukopdaycchavN6417RYUYhAP7g0fBjSeD63IqXja8fcGUZzxkq1ePVM9s1yydGXyTOhfritLqSn43Er5/Eq5VjZyKg760V4vTDM2o8dqaoj2487m+GmexwiS09UzV6oXGnp56tf+vMub2BwE4Vln6c36pbLZABAM4u6OvzfMfTGzFkZKEMT0aScIYoYgXKqal09VHFPFaHp39WdcP5gY27Ktnl9xT7dcTcVhSvcHUbsfiaKAVVBQFLAKCgp+QLEGjIKsM4qTnAEIFirmSsO2DPL86/9pW5aq4FbNWl90ETx07QM/9aJMCjkHQhmk3l7YZSzFEC3aS0v2konV7zoZ/kJkP/2bulp6o7RuG1WAyTDp95JBPsttRyFN7ntPIhYhAOpG/Y3KWYsYAD6vSeTZzi8Fk4ul9ZpZR0SNaLIXHuQim2G9kLIzirvjBCFoGuT0UqnsaOg5OvE9xZAAtGrWatPGGHIuu6PED/Mihi2Yasj0IUIx4eQFLAkkk3yUeYNkwIFUdHfFXV53WkeTXBA831I1zeqV6jmD6FKKTjzYj3rytXN/sp+M2lFfQqQq5nppddGoPUvEnys9uW4vXqqcU1SLAbkTHHiZP3M/fsDiJ+F+QmMgRUkrna6cKmsOhC/Y2XQ0HA3xudJKy2lhxYhZshvsz00HVsbETjcMY6piVHONc8sl
QyXH/cXPjYoAIAitt5yFiiEAiFK6P4jjjM2uKCb18VEvHgCIdMNdL6029LKcnBz4vFKenMzQ4SW7tVZexRAymnTibjseCDmrl3mc83GYDbyMMeFa6nLTXqyaCMLn7yucFnwdUz23XDI1RUg5DrP2KOKF7y8omCmKAlZBwStHnFIvynN6GFjUy3qtpKNJpPaiPOrQaU8WVRoQwjRjQZwnGZsLvywDmviZJzjFAC7Zi/XJ7oYTyOQQSzGWrJqluQiRJI/jPKSCz25nQy5oJxqkNIUAVrTSGXdVQfiES6YJRKt2s6SXEVKoyIPc54LPdL4bxLkf5RhCSyOtmmnrCjiZViAIq65RL+kYQSGkH+VJzkARxRYcjyA9p6XxOQpJBQvzKM4DCaSlug2rVtdKJ1xFVFadVXtRIZqQMshCfwZLMD/0QEsZ5GGQ+xJAVbGWrIWaVjrhr6yiOi2rQYh+KL1knNJk5n78lOeDdJyxFABZVp01e1nH6rRv5IWfxRAvGrWaXsZYYSL3My9n2XxoBWV86KdpxhSMypa6VLU0hZxAlyZiQWChYlRsDUiZ5dwLMzazXekSSCp4kEdRHkgIDdVdMBqOYh5N0z73dEzyPVQ3KktmEwPEee7nYUDjmR0hlEzIKKVBkjMubENplg3HUtHJdgaaGl6oGCpBQkzi7TgvtgcUFMwWRQGroOCVi+AzyrOMUn4YZtm6YmjkaKfBCSoUCkamRhAClIs855TNQ2e0BCDneUwzIRgEoKw5lqJ/PzJ7/ocRhCrSNKJhgAXLGaNCzHBbGpMsoCHjFABpKXpdL6GTpcfTEM0kuq7oGBEhOaWzHbdJIJOMJSmDECgEuYaiEnTyB+N0FRu6gjGSQGY5o/MwblvwVwBjrKoqxng6Qsg5P7lCMsEznjOaQwA0RbcUk6CTPganIcVWTIyIhDIXWcZzCV63HVgiY2lGUwAhwaqrWgZ58YMb0+ZTBKCGVTLZY53ThAkKZm1UjAsW0phxKqXUiVZVHXy0lvtEElCxcujmEBFCMJpyMCcGjXMRJjRnDB+GN9gxTlSnON4WBg1NOQyKIGRcZDmf3ZBITjZgZjSlNIMAaUS3VVNBBMLJdPKLJQKMw49YCGIhBKVZPrMWRgLIxSRUpowLqSnY1Ccr0Z5VLl9k4XWdKJNoIaeHWlHcXRUUzBZFAaug4NXieP3kZGwATb0qfLnPHz1SdIKwd4bEMl3Oejw0AF9OoJP/lQCgaZgnIYCzLQzw3Zs58Hjx/0lkePz1RwuDIZTPv7mdicMCnoXv8OUS1um+XySfHZuCgsNjQgiZFrAYY3men7iANVm8JuGzhy8nioX+D3UagqOX9l43BzgZ4PxOEif7lcnj38DEosHJZqTZs2zff6MFTje1S3By2z55eA7K6V6178UBc3Amj/b3T38gdGK5HI3WQQkneyARBCd2l6+mvzteUY+m6v3MvJzkoEw0Axw95AmfPQUAZ9pWH++2+l4cdAJPLr/bJCAnDY6F8y8omDGKAlZBwSvnlDUFawqZrKIEScbSk082QUC5SHMmJCAIqio8/C5wDnIaoCJFVzSIsJQgyKP4ZbZ7CClzkeciF0IipGCsQDjDUiEQ26pJsAIATFg6zIKX6qJKeJazTEiBEMZYg7PtBaChEl1TgJSMyTCh9MTr+SUA2eTqlUkBAdBUTHDhEAvA9/e4v+QOrEOrghHWkIqJBgHMWZbQhE92TZ/k87lk8eHXUwSAAlUNaa9h+UrHmorVyTAmDfM44/RkdRgpgcx4xiUFAKhYw0h7+Zr239q2Y2wq+tRDpTwb5yGXJ+7+O3RzNGe5EAxCpBAVQjwf+oMRtHSiKFhImVARxvQluqikTDOWUSEgwBBqCjrhPO8rKgqIVaIRokogKM9iFnNx0lcV5eQFmIglQohDDcGqghU4q4ZCIgRVcmgsIIQZ40nGOZcnPO6cizRjlEsIoKIgXS1cf0HBjFEc2oKCVw5DI46pqAoCEgz8dBxk8qQR
LIgS2vMSKYGuE9tQdW1OQlhL0Uuqi5DCpdgPu4NkfPINNQlL+6kXZYEQTFdNQ7UIxgDO6gNfKlKbekVFmgRylHmPwz124iSHS7kf9f3MF5wSpNiqgyGe6XzXNVXXUqSUScrawyRK6ckTm2GQDMYJ4wAhcHhYVAyKi9iCP7GqJ88PIQAKwqZm6qoNIIyyoJ8MR1lwwnzbz8L9qMd4DgFyNMdRndetAwtCaKmWo9oISJrH3bg/ygNxst+TR6NeMpw89AEcvawT7Wi2bnZ+fA2pVb2kTv7Nx3mwEx1kPD+xbReDzPPSMec5QaqjuQpW5kMrCEEVW9MUTLnwo6w9SjJ60osKIURvnIyDbHpL4dgaUdDMujtIMDZVy1RsIEGc+Z2477OTvjUpgRym3kHY40KgQw2xHMWY3bs8gpClK46uYoKiOO97aZSyE+6GSCnvjZM85wgCSyOOpUFU+P6CglmiKGAVFLxqEfxhTt6oGLauAAk6o/hgEGc5e2EGJCXIc94bJXvdEEjgmGrVNRxThfMwGgUrirNo1VXF4ADsh7t74V7K8xOmhb10fH+0FSQDwWnFqDbMuobUySThTFawTKKddVcs1ZIA9uPh/cHDmJ10XXHC0vve40HUFyI3iblkL6iznOQgCOolo1EyJABhQrf2vaGXnXC/GeNytxvudEPOJcFooWK4llbEsAXf34GV53mapiccIZxOsigIV/Vyw6xhiPLUbwftvbh3ghLzodK248GD0eOUphDCplFrWfXXzv0BWNcqC2YdApDTeCfYb8cDeYKH0gSQO2H3obfNaEIgWnFaJc05+Tq8VwSL6KtW01ItCLGXeo9Gj8e5L8GJdnmlnG6Mt/fDA8EzkxgtZ8ki5nxohUrwcsN2LIUy0R8nj3ZHYXLSi4ok548P/M4wRpOHPhYqlnGCBfCvrmkCsK5XGmYVSplm3hNvp5MMxcmWneWS7gT7T/1tDqWiGE2r3jAqaAb70SY3ClDBqOxotZKuKSiI6f4g6vvJCb2/F2ZPDvw0ZxjCsq0tlI2iflVQMGPxfyGCgoJXDV0l9ZKxVLcIQV6YP9ofP9z3GH9BjMKFbI/ih3teZ5RABBplY6FiOLo68/WrSWODpRiLZrNk1CGCYTLaGG7eHDzKBX1hoMMkf+zvftW9GechRmjZXmqZzekQwYxuwjKwtu4sVc0awXqSB3ujx7dHW14eyRdIUeaCHcS92/17w3QAISrrpdPuioa1mdaOWklv1U1dxTnnTw78xwfeOExfmO8JIQd+urEz3u0FCANbV5YbdslSC+NTMOn4IKqqIoQYY5TSky9xn1qVqlpadZY1YkrJtoOdO4ONmKVyMuP2HOvt0XjLe7o1ekwFI6q56rRWrQX4mu0WhhA2zeqqs6woFpds29veHD/1WcKleI70pJQhSx6Mth4MHgrBNKJfKK3X9fLsuX6stsxG1WyoipHT6MDbvjl4OEg98aK9TVSwQTq61bt7EO4LKUq6e758yiT6/8lLmq8eqoJWGlbd1RUFeVF+98lwvx+nJ1i8neZspxtu7I77XqIgWHX1lYalazPZdDzd0w8AaBq1VXuJYIVxujV8vDXeDmnyfAWRUnLBd8LOveGDYdSRUtp6eclqNfQymsEo6KgqDYFlkOWGbRuEcXHQjzZ2Rxl9wUCllCBM2E43eHzgp5QbGlkom4tVExU7MAsKZoqigFVQ8Mp558O03NWvnK6aupIzvnXgf3mvM/RT+bzLJTmeBHb3tkdxSjWCzy2VWjWbEDTrHVgSSggggnDJWbxQO6djXXL6ZPz484Mvd6MuE8/LLbkUD/3db/p3dodbTFBVdc+UTi2bjZkOVgjCFc05VV6vmhUo2Cjq/+Hg681gJ33+sIkEnXj4h87N7dHjPA81xVwoLa87yxqe6aoNdC11peEsVi0hQW+c3NoaPNr1M/a8vUUSgCRjD56O7j8dD4IMQ7hUt1aalqkTWESxBWC6RvzQckopX2YH1hFV3blQPl22
8WHoxLX4lbKc5TnTJm/vgaa9zRY/Q8OMmxobBIOsFr1Q0OMRnEkisaQnErpYplge8RzFC9VoGWualk6rU5PSb99TfnsET41iRUYuL7frMeGp5xi3h6LptY1I6UoZLA2hSDGWFdOSKsbKKQ1TNM2yzGJkUm2rmq7n1fJcMX4xce343KkbC5+pmszz5sHgjkcD2wImN7QHAB4QSGABQK1ZngghZDRRRpM2M03l0rhUpHJpfXZGT6cxpiiGwbqml0p6IaeFZsWhT5X339FOfYzjUYwp5PayO/YYvvY819C45hkiMjPFwPAcw+fV8kIxIWrlvJSLFOILckGhNAPDq1gTVSmvlOLl1KXk6CfzZ0/Pn0lk57CuG432bf7+p1v2+YwOlmYRotZmpgv4qqpOlTKxgpW3ppViWsyU5FxRLoQL0aiUKWkyT7OYwpImi6qYVQqzhdiFxGcfzZ28GrmYLiYohLwW3+7G3YO+Xp/RueZfGYSQwHAGzqBQVDgfFtVyQcrHiwthMS1qUiVVhmVdKatiUS2HitGLCyMfh89eiJxLFKIYa3Vm79bAwP7GR33GOoamMXxhvpxUVb148eLw8LCqqv39/du2bYMEFgD3dcOnGJqpM9jLujJXiIhSQVSKsdJCXMzIWOdpRsOarCuiKi2I6ZHszFB0+MTcyYnkDUkpGnlTi7vryaY9m10beIZFZN4UtWYzbiiK4hFrEcxFTUmVUwUxJ6mlBTEZW+yJFBohBtGitni3z0q5kezsqdil384en0yOlaQ8y/BBe/NTLY9vdnUYaH7tJgKv/4AZ8TxlMuNySZ+dxqJI5dP6/JwemtUpGnGsrmlYFPVcTovHlIsX5A+OKMfew3OTWJGR2cIO7hGeeo7r7KIFHq11W11sDzRbx9uSciFRXpDkUlkuxMqJhJhRMGYRo1N6pfdfbKufZaaHIueGwqemUhOSUjZx5jbPpv1Nuzc72zmagZYAwAPCwiUAYG26vcraforn2fYO4aVXJURpF05RmqLH5vCpsjY+gux1lMmEGJaSJVzM42RCTyaoQh5pKuX2s9t3Cc+9xAYbqQewgQ3pRFnEbLA3KQ27Cqo4Er1UELPJQvSKWgpnpk6b3EbezCJW0eSyWk6Xk6lSsiDldF01G+u2B7bvDgw2mTy3xp2gUwZr0CAxxgxiAkbX/sbduqafmT9TKC1kxYWxWDmRC1+OXjLxJp4RMEXJmlSUCmkxnS4tlKQ8jeg6s297cHB3YMBvdD2YFolpivYbXHsD21Pl9MXIcLIQyYnJiVg5lZ+/FLtkEawcI9AIyZpckhfPLVVMleQchbHd7O6vH3iy8VGf0ckgBmP4wnyJ8TzPcVypVJJlWdd1uCAA3M8NH1MMQg7essPXlyglT4XPZvORfDk5FpeShfil2LCBNfGsAWO9rBRzUj5ZSuTKaUWVGEZodW/8Wuv+Hle7gSbpY7zGa/QwhWhkQcadvi1FpShpSiQzXRJzE4nPUvnYpdhFm8HOMwYdq7IqZcVsqrSQFtOqIjIM1+LcsK9pT4+rw8QYIHu1ygdQhmtrx3v367mcduIjnEpQqQVNHNYXIsoxNzJbKYbDmkoVSziTwsk4ziaRqlJGMzOwkzv4LNfbhwT+AQUnCCGHYN1Tv11UpdPhs9liJFdcuKFKC/nohcVo2SSwBk3XFtuqmEuWF/LiYlsVOEubZ+Nzbfu761oFhiXLH6E5APBA7h9wCQBYm27vVuTCOByGxx6nDAbFV69dPEPF5qlkXI9HqMWHH0TRDKWri69nGMQwlMGIGlu5PfuExw7w3ZsQzTygAIgc1swaNrs6WIY7YXRdjl+JZufypYVscWGCHkeIpWlG11UK6zTWEEYcb6p3tG7z9+2u395ma+AZbuncGQA+f6SIKcrI8D11bQaG95jdF6IX5rKzZTFXKKdD2RmMGBrRlZIoOqVrNNZpmjUb61rrWgd8fdv9fS2Vgr4Pok3ixUckzDNcmy3wQtuBoNl3LnZxJj1Z
KqXm5dxcZgYjGlXGV7GuUVijdZ1GtCDYGuxNO+q3Dfq2bLA13CrgimFX9S9vE+U4DnYhBODz3lHR4v9YxLRY6r/W/ITL4DwbOR/JzZXEbEjMzmZoCjGocsPUdYXGmMY6wwpuW32nq2tf06MDnm4jzd8skLXmt3tERlPoRrPvYNMeh2A7M39uJjOdLy7MS7m53MziudEsxpjCGtI1hCmGZhxW70ZX167A9m3ezXbeRCME0dEqI1IaGQz8lj5ksUouj3riEzw3RZUKeHxU165XomW6EjZjiqYrATOLGtrYwV3C/qf4zX201fKAIlJcaa0szWx0NBto3mVcbKvzlchktpydyUwvnhDNYErHuroYlmCKYXmXpX6rf+vO+oFtrm6B5W5lr6D3B+CBgAQWAGv3rFN9VDUYDdt3MC630tyijl7FsyE9HqHyOUpTFv+eZTArIJcLef10YzPbO8APbGebWhCNHtxkjephBZrd7Gg1sYZ6i/dGcjxSiCbFdE7M6rpKIYphOAaxVsHqNNYFLIFN7q7Nro6gyc0zPOmNIT4Da9ksSRiLcZs1aOZMXpNrJHljPj+fLCUzck5RpUox38VAlxcMNsHmN3sa7M2bnO1djja3wc4gmnowXxkyCkuGT9usAStn9JjdnyVHQ5nZhXI6K+Uktbx4bhTCDG1gbFbB6jG5gtZgp2tDn7vLY3CQ7BU80qybxy24CAB8jhxRJYNFIYZCLdaAiTN6TK6x1FgoG46XEhkxq2KZ0he/ZTRnMvKmOsHht/ha7a097o1djmYDzVWiqyXbZ6x1gESCt6DRxQcGvEb39eTIZHoyUUrmpGxZLVc+FiOGNXAmp+DwmDwtdS297q4Oe6OdM1fqKsBM21VHpJiiaKOR7+xEPK/UB9Vrl/W5GRwJ43QSKcrNh1SGoSwO2uujG5uZjT3c4E6urR2ZTLdFtg+mrdLN1oCRNXhMrhvJG6Hs3EJ5ISvlFF2hdB0vxsuCiTc5DQ6fydfiaBnwbW6zBXmaW9LzQ5MA4IGABBYAa/9AjigKmUx81yY2GFRnt6s3bqiTN3AiTqkS2eGEEkx0QzPb1s5u6GCDDchoQjRdiZ8e7HANOTuWZtttQZ+xrqeuYzIfDhWiiWJU0RSy+QrL8B6TO2gNtFqDTWafwPA0otes5gQAd4axCDEIBYxOu79/o6NlthANFSKRYqyslDDWK+OvjImz+MzuVntjiyVQx1srzRLdHCx9YA2TjKnTCHkMdTu9vRtsjTOFSKgQjRXjBTmPsU6yGhbe6jV5mm31TeaAy2A339okC7JX6wBToeu6qqqQxgLgc+UFSARCMX6j0+kf2GhfvNtP50PzhZiiiWSJLktzdqOj3uxvtTU0mr023szR7M1ZMQ80Brl5v0YewW73bGqz1U+5u6Zz4crdPlfZb26xJ7IZ7PVmX4u1odHicwpWspsHzLX5XAEzx3FtbYzfr/UNqJMT6uh1fT5ESSJpNBTDIa+faWnnunvYYANttZH67g+0e/29tmpy1gn9G+3NM4XITH4uWkxIahmTtsrwdoMjaAm02uobTF47byF1NmDlIABfxN0DrgIADyYSqozbKypWZUqRsaqTXXdvPrezLFXZmpBiWfQFPutWPwgvPn/rsq6qWNP0W89mlRwWgxiWZjmGIftbwyIo8ODb5M3+SKOwpqkK1jSs6hiTqS+IQjS92Cw5huMqK03Q7zfmL+QkKZ3SFF3TdE3Fmo516mbveevcaJalWZoiFd4f/OMWePAkSXrzzTf/7u/+bmJi4tvf/vaf//mfO51OuCwAfJ4baTXXo2Fd1VVFV1VdxbfmOdKL93eG7P3H3iqD/YWNoN3sUxbv9rqyeG6VnojUf6hM96URzS7e7TmWZmjIUqxRg7g5y1XTsKJQsoxV9XfRMkUhlqFYHgkCxTA358p9ISmi29pqpT2oGtbwrcjkd211ycbEMNoLwBcAZmAB8ADceqpe7MY4FnEsZTTdlipGv99PfmHP4dUP
QggxFG1g7l4F8+YLMQWDSeDBt0lS24ShEMNwPMXdrU2S0JA8ZnyR85sQomiKFmiaorl7/SAPbqULeAiNs5rxh6sBwOe/kVbnmtMI8TTH3+uOSn2BGYGbn1i52698bhRMs127BnHzyt+sDGtYPlquvpL6gmLSW22VJC6RQHMCfa/IhIKuH4AvAiSwAHigHeCSzreGLvzhZA5q+BkAePDfFVR7bPpQnhlqzErBF2Y9oWkaHlABeAChEVVznh/9wcZHcHNY2yv+h9iH3sxjQe8PwB9SbAaXAAAAAADgTizL0jStaZqiKKqqwgUBAAAAAHiIIIEFAAAAAHA7hBDP8yzLKooiiqIsy7CQEAAAAADgIYIEFgAAAADA7RBCDMPQNI0x1jRN13VIYAEAAAAAPESQwAIAAAAAuF01XUX+BbJXAAAAAAAPFySwAAAAAADuiJBomqvQdV2SJE3T4JoAAAAAADzM8AwuAQAAAADAnXie5zhO0zRJkqAGFgAAAADAwwUJLAAAAACAZaCK6n9CAgsAAAAA4CGCBBYAAAAAwDKqCSxcARcEAAAAAOAhggQWAAAAAMDtEEI8zwuCoOu6KIqqqsI1AQAAAAB4iCCBBQAAAACwDI7jeJ7HGCuKoqqqrutwTQAAAAAAHhZIYAEAAAAALBck0XR1CaGu67CKEAAAAADgYcZmcAkAAAAAAO5EamBhjDVNg+lXAAAAAAAPFySwAAAAAABuhxASBIHneV3XJUlSFAVmYAEAAAAAPESQwAIAAAAAWAbLstUaWLCEEAAAAADg4YIEFgAAAADAMsgSQvLvkL0CAAAAAHi4IIEFAAAAALAMhBCp444r4IIAAAAAADxEkMACAAAAAPg9GGOEkKGCpmlVVWVZhjruAAAAAAAPESSwAAAAAAB+D1k5yFYghHRd1zQNJmEBAAAAADxEkMACAAAAAFhGtQDW0v+ENBYAAAAAwEMBCSwAAAAAgOWCJJpmGIbUwKruQnhbVgsAAAAAAHxBsRlcAgAAAACAZYIkmmZZlqZpXdcVRYEaWAAAAAAADzM2g0sAAAAAAHAnsgshTdNLZ2ABAAAAAICHgoVLAMDawhhrFQghjuPuY7EJOYKiKEwF2cS9xvfquq6qqqZp5D/J+hfy3MUwDMvCVx4AAFahWvcKslcAALDmMTPGWFEUsmnGqiLe6hGqM2Q5jiPbbtT+dhJv67qOKsiIhaZpZPotLBgH4A8QPM2C9d8vVreOInVMbnsyIfVNPn8XRY6s63q5XJ6dnU0mk3a7fdOmTRzHrfY4sizPzc3duHHD7Xa3t7fX1dXV8kaySVYqlQqFQslkUpZlhJDP5/P7/alUqlAoOJ3Otra2+4sPAADgq4k8EcESQgDAV4GqqnfL1y9N8azJZ5Ex11QqNT4+zvN8W1ub0+m8jwA1lUqNjo6m0+murq7GxkZBEO55EPJ0UC6XY7FYKBQqFAoYY4vF4vV6bTZbOBw2mUwNDQ1Wq/X+hqIBAA8OJLDAeqbrejQanZ2dzefzLMtW50aRkRae5+12e3Nzs8fjWZOP0zRtbm7u5MmTp06dEkVx586dHR0d95HAyuVyZ8+efe2117q6ul555ZUaE1jZbPbq1atnzpyZmprK5/OZTAZj3N/fv3v37tnZ2eHhYZ7nn3rqqS1btvj9foZhoHkAAMDKyERajuN0XZckCRJYAIB1TFXV0dHRaDRK7n4kwUTuewgho9HY0NAQCATMZvOafFwul7t06dLp06cvXbrU1dV16NAhu92+2uwYxnh6evq1116bnZ09dOiQ2Wz2+/21pJwmJibOnTt3/fr1eDyeyWREUayrq9uyZcu2bds++OCDhYWF3t7eHTt2bN26led5aBsA/OGABBZYz0gC6+233x4eHiYzkjDGpFcjc6+cTmdPT8+2bds6Ojqam5vvO62j63o6nb506dK777770UcfSZLU29trtVrvb/1gNpu9cuXKW2+9FQqFdu/evWXLlpWPo2laJpM5
fPjwL3/5y2g0unHjRqPRmE6nR0ZGRFHcunWryWTKZDLDw8OXLl3av3//wYMHN23aZLVaoYUAAMDKyELs6iIXAABYrzRNGx4efv/996PRaHUeFikCiDEWBKGzs7Onp2fLli2dnZ02m+2+p2Kpqjo7O/vRRx+99dZbo6Ojdru9p6eHFM24j7B5bm7u3XffnZqa6ujoGBgY8Pl8K79FFMVr16699tprQ0NDDMN0dXUxDHPt2jWKonie3759O8uyZ86cGRoaOnny5Isvvrh7926/3w/LFwD4AwEJLLCeMQzjcrk4jhsfH5+amuJ5PhAI+Hw+g8Gg63o2m71+/fo777zT2dn5yiuvfOc733E6nfeRw8IYl8vloaGhn/3sZ8eOHWttbf3v//2/P/vss52dnaudfrW0ZjC65Z5vyefz586d+6d/+qcbN25885vf/LM/+zNBEE6fPv3GG2/Y7fbe3l6/39/Z2Xn48OGf//znP/nJT2ZmZr773e8ODAwYDAZoJAAAUPvdHi4CAGDdPhayrMfjEUXxxIkToiiaTCayrI+iKFmW5+fnT5w4YTQa9+7d+7/+1//as2fP/YWRuq7Pz8//9Kc//dWvfpXJZPbs2fOtb31r9+7ddrudjBbcRxms6vmvnAIjmbhwOPzTn/70yJEjwWDw+9///vPPP7+wsPDjH/84Fos9+uijmzdv7uvra2lp+eUvfzk0NHTx4sU//uM/fuGFF7xeL+SwAPiDuFPBJQDrGE3Tbrd769atQ0NDY2NjTqfz0KFDBw8edDgcGONisfjpp5/+53/+5/DwMMuygUDgwIED9xy3uVOxWDx9+vQ//MM/XL58ua+v73//7/+9a9cun89Hphyvticmr68xe0VePDk5+ZOf/GRsbGznzp3f+MY3GhoaOI574oknOjs7EULBYNBoNPb09LjdbqfT+dprrx07doxhGE3Tdu3aBZ0xAACsoFoDS5ZlSGABANZ32Nzf3z8wMHD06FFZljs7O//P//k//f39GGNVVcPh8I9+9KPz589//PHHRqOxo6OjqalptTEkWRvx7//+76+//jpN06+++urLL79MVi2QF9xHUIp+38ovTiQSH3/88bFjx3Rdf+yxx55++um6ujqbzfZ//+//FUXR6XQ6HA6WZV944YWmpqZ33nnntdde+8d//EeM8Te+8Y21KjkCAPg8IIEF1i2S3OE4zmAwkFySw+EYGBjYtWuXyWQinWhra2sikRiveP/993t7e1ebwFJVdXp6+j/+4z8+/vjjvr6+733vewcOHCBFKKupqAf6M2az2cuXLw8NDSmK0tfX19vbS8avHBXkcQtjbDQam5ubX331VZPJ9Dd/8zfvvPOO3W5vamqqr6+/j2liAADwVYAQEgSB53lVVUVRrG7wCgAA6zJsJtEjTdNms7m3t3dwcLCvr4+8IJ/PLywsKIpy4sSJs2fPzs7Oer1eo9G4qk/J5/NHjx598803s9ns888//z//5/9sbW0VBOEL+zHn5+fff//9ubm5Xbt2PfLII263m6zYaG9vX/oyt9u9Z88eo9EYj8ePHTv2i1/8wul0vvzyy1BDFoCHjoZLANbxgwdZz59MJjOZDEVRHo/H5/NVu0mapgOBQEtLi81mK5VK169fz+fzq/2UZDI5NDR05swZr9f71FNP7du3z263f2FzmnRdHxsbO336dC6X8/l8LS0tDodjadbstgyaz+fbt2/fM888gzH+6KOPfvvb35bLZWgqAACwQldCSsBUN7QFAID1GjanUql4PC5Jkslkam9vX1ov1Wg0Dg4OdnR0kO2GJicns9nsqj5C07RwOHz48OGxsbHdu3cfOnRow4YN1bD8C7jBFovFGzduDA8Pl8vlDRs2tLS0rPBiQRC6u7u/9a1vdXZ2Xr9+/dixYzMzM7IsQ1MB4OGCBBZY53RdTyQSqVRKEIRgMOh2u5cOnpC9CDmOI3XQ76NGbzQa/fDDD2Ox2K5du5544omGhgaWZZeG
AlX4FlVVC4VCJBKZm5tLJBL5fH61u7NXK2vKsnzlypWzZ8+qqhoMBl0ulyzLuVxOFMWlB1xavT4QCLz00kutra3Xr18/evQo2awQnsoAAGDlWy7cJwEA6/5eFw6H5+bmRFE0Go1NTU1LE1g0TdtsNrKIQVXVZDIpiuKqjp/P50dGRi5cuEBR1MGDBx9//HGyRfjKYbMkSYlEIhwORyKRdDpNJsOu6oZcvYGHw+Hh4eH5+XmyUkEQhHw+n8vlFEW584DkNXv37t22bRvG+OLFi2fOnCkUCtBOAHi4YAkhWOdEUYzFYgsLC4IgNDQ03FZvEmOcr2BZ1ul0rmoxHcZYluXJyckzZ86Uy+U9e/Z0d3ff8/XpdDoajU5OTo6OjhaLRbfbHQwGGxsbm5ubfT5fjTOTSXeeSqVGRkY+/PDDsbExVVXz+fzFixcLhYKu652dnbftM1gNCwwGQ3t7e2tr68WLF8fHx69evepwOGBHQgAAuBNCiK1QFEWSpFWNNAAAwJeLpmmRSCQWi1EUZbPZgsGg2Wyuri7EGM/MzESjUVIc0O/3k7+tXSgU+u1vfytJUn9/f7V+1t1WLWiaVigUMpnM1NTU1atXFxYWGIZpaGior69vbW1tbm4WBKHGFQ+apuUqjh079sknn8iyzHHc5OTkBx98YDQazWbzzp076+vrbzsa+U+WZfv6+trb26enp48ePTowMOB0Ou+j0jwAYK1AAgus8544mUyGQqFsNut0Ojds2LA0U0M2IoxGoySRtHnzZrvdXvvBMcYTExNDQ0OpVMrn83V1da1Q3FHTtHQ6/dlnn7333nujo6OqqnIcVyqVEomELMsdHR3PP//8q6++WmMCS5blqampt99++/jx4yRpRVb1HzlyRBAEk8n03HPP3TZuVkXTtMPh6O7uPn36dCgU+vTTTzdu3AgJLAAAuFO1BlaxWIQEFgBgfcMYz83Nzc/PC4JQX18fDAbJuC9J1qiqevXq1ZmZGYSQ3W5vbm62WCy1H1zX9cnJyWPHjomiODg42NraSlZn35kJIiO+MzMzJ06cOH78eCwWMxgMZKlELpdDCB04cOCP/uiP/H5/jVmkfD5P6macO3duZGREVVWe54eHh0OhEEVRDQ0NTU1NgUBg2ZPheX7Lli3t7e0XLlw4ffr02NhYW1sbVMIC4CGCBBZYz8hQUiKRUBTFbDZ3dXVVMzVkF8KhoaHLly9rmuZ0Op988slVVXDHGI+MjHz66ac0TW/ZssXv91cX8N85CzqbzX7wwQf/8i//Mj093dHRceDAgc2bN8disddff/3dd99lWXZVj0akOL3H4wkEAmNjYzRNC4Kwa9eujo4OTdNMJlN3dzeZ433nOZMJBaQzfvfdd0+fPv3ss8+2tLRAZwwAAHeiKzDGuq7DKkIAwDpGlhBGIhGr1drY2Gi1Wqs5JlKR4/jx46Ojow6Ho7e3t5reqlE0Gr169er09LTT6dy0aZPP57vbTke6rl+7du0Xv/jF0aNHyZbZzzzzDMdxIyMjP/rRj6amptrb21e1iTZN0xaLJRAI2Gw2jDEpbrVjxw6fzyfLst/v93g8dzsZhmGam5vb29tNJlMsFrty5crg4KDX64XWAsDDAgkssJ6pqhqJREiNSYfDQbof8lelUuns2bO//OUvr1y54nQ6d+zYMTg4SGZg1TgxGGOcTCbD4TBN042NjWQe9bJvVFX1xIkTP//5z8+cObNr167/+l//6/79+71ebzqdLhaLFy5caG5u3rhxY7V41r2/tyzr9/sPHjxosVjGxsZmZmbcbvfzzz+/e/duTdMYhnG5XMsmsMjp0TTd3NwcCAQ0TYtGo6lUSlVVSGABAMCyt02ydgamXwEA1jEyshsKhWKxWEcF2WEQIaSq6uzs7JEjR86dOyeK4ubNm59++umlO27XcvxwODw9Pa3rusPh8Hq9JpNp2TdqmjY1NXXkyJHDhw/n8/lvf/vbr7zySnd3N8uyHR0dH374oaZp
zc3NRqOxxs/FGJtMpq1btzY0NGSz2XPnzvE8v3v37pdffrm5uVlRFIPB4HK5aJq+WxdgMBg8Ho/T6VxYWJidnc3lcpDAAuAhggQWWM9UVZ2ZmclkMqRYez6fn5mZKRaLsizPzs6+/fbbx44doyjqwIEDL7zwQlNTE8/ztffEiqKUSqVisciyrN1uv9sewJqmxePx999//9SpU4FA4JVXXjlw4AAZ6jGZTNu2bdu3b19vb29ra2vtBbAYhrFYLIaKUqnE83xDQ0NHR0dLSwuZI1DdNmvZn4WmabfbTbJ1oijm83nYGx4AAJZ9dOE4jmVZXddlWYYcFgBgHcfMyWQyHo+rqmowGIxGYzgcJpXaM5nM6dOnX3/99Ww2u3nz5qeeemrv3r1k/WDt06AymUwymWQYxm63WyyWZRNGuq6rqnrq1KkjR45kMpknn3zy1Vdf3bJlCwloPR7PE0880dHRMTg4yPN87bdxlmVdLhdCiGEYURTdbvfWrVs3btxIcnD3/CkQQhaLxWazxWKxRCKx2tL1AIC1BQkssJ4pijIxMUHqPqZSqV/84heqqkaj0fn5+UgkkkwmEULPPvvs97///e3btxsMhqUZn+qWJct2sRhjUlpSFEWbzWa1Wu82f6pQKAwNDV24cAEh9Mgjjzz++ONkoSLG2GAw9Pb2/vVf/7UgCHa7vcYgoDrehTEmWxkajcbOzk6yOrJ6titk4hBCZrOZRB6appVKJVVVobUAAMCdSA0sVVXL5TLk+gEA65UkSdFoNJ/PUxSVTqfPnDkzNzeXy+USiUQkEpmampJlecuWLYcOHXrppZcCgcDdpiwti2yalM1myfDt3TZN0nU9n88fP378/Pnz27Zt+x//43+0trZWE0w2m+173/semVFV++pFEg8jhEj6SZZli8XS2dnpdDqXjhyvHDZbLBar1aqqaiaTKZfL0FoAeIgggQXWLbJV3+zsbCaTsdlsFouFFG7M5XI0Tff29nZ3d2/atKmzs5MsAKx2XRhjRVFIR+twOOrq6pbt0sgLZFmmaXrZzpgcMJFIvPnmm1euXOnu7n7mmWecTme1OyS1IUk+a1W7mZDslaqqsYqWlpa+vj6bzXbba1Z+JCNTxlRVTaVSsixDgwEAAAAA+GoqFovXr18nZTdIlYxCoUAm6dfV1fX392/atGnLli3Nzc1er5ekfla1GR8Z96Vp2mq13m3VQi6X++CDDy5fvsxx3MaNGzdv3ry0IAZCyOFw1DJSe2c8rOv62NjY3Nwcz/Mej8dmszEMs/QIKx/KbDZbrVZd1zOZDMTMADxckMAC65Ysy4lEYmFhQdO0np6el19+ORAIkJX8ZDV7S0tLIBCoJp7IlCtN0/L5/MTExMmTJ8fHx5977rmDBw/eeXBSKSCfz+u6zrKsyWRadgGgKIqjo6NXrlzJ5XLBYHDr1q1Lh4yqg0KrDQLI69PpdCwWU1XVarV2d3fXuJNgdYEhmVZAtkdUFAUaDAAA3Pnks7QGFhRxBwCsV8Vi8cqVK5lMxmw279y588UXX2RZVlEUhmGsVmtDQ0MwGHS5XLfFk7UfXxTFUqlE07TZbL7bqgWSwJqYmAgEAps2bVqaZloaM9/Hp+u6PjIyEgqFLBZLU1NTtbxXjR2BwWAwmUyapmWzWUmSoLUA8BBBAgusW4VCYXx8vFgsUhTV2dl56NChQCCwtDciO5gs7QKLxeLIyMiZM2fOnTt39uzZZDK5cePGZRNYZO4SWXlXPdSdr8nn8+Pj4/l83mg0+v3+pfmy2zrOVXXDZHXk9PR0JBKhKMpisTQ2NpLOuJZueOmDGSk3AIVdAABgWXyFpmnlchlulQCA9apUKo2OjuZyOb/fv3fv3q9//euCIFQ3166m8mucsnQnTdNUVV0hZiah+8jISDqd7u7u3rBhw9ICVbe9ZbWfjjEeHx8Ph8Mul6u9vX1V+yeSvQhJKk1RFOgIAHi4IIEF1q1cLjc6OloqlViWdVYwDHO3Ffuq
qpZKpYsXL77//vsnT56cmZmZn58nO6TcbZCH53kyBVrX9bv1Z5IkkfX2dXV1fr/farWutsddIQ4YHx+PRqMURdVV1L6JYfVHJgNrRqMRtiAEAIA7LS3irigK1MACAKxXpVJpbm6uXC77fL5gMEhufXe+jCRxSCFzo9FIxmVJil9RFI7jjEbjsikqlmU5jlt5Q4x0Op3NZhVFqaur8/l8axUzVyt85XK5lpaWtra2Ggd9q5QKsnwBYmYAHi5IYIF1K5vNXr58uVwu19fXkypXK2SjisXi8PDwP//zP0uS9F/+y3/hOO6v/uqv4vH43RaMIITsFQzDkIJZy9ZBF0UxFouJotjQ0FDd62RNyLJ87dq1UCjk9/vb29vJdsI1Tqgma2FIAMGyrMfjWe1IFAAAfEUsXegNVwMAsP5gjEulUigUSqfTmqb5fL6mpqY7s1ek1EYymRwZGbl27RrLsrt27ers7CQ7Jl24cCEejweDwYGBgebm5tt2RqqWkSLV3JctI1Uul0OhEKkibzAYzGbzqurEr0BV1VAolEgkSCX4jo6OVSWwyMZN2WyWYRi3273a5BcAYG1BAgusz54YIZTP56enp2VZbmtr83q9K+d3OI7z+XwvvfSS3W7v6OiYmJgwmUzVjQiXZTab7XY7mYRVLpeXTWDpui5JEsaYjDvd9ldkPEcQBJZlV5vbkmU5Go2m0+m2trbGxkZy8BoPQnriUqlEJkVbLBYYTQIAgDstXWyuaRrksAAA61I6nZ6YmCiXyzRN2+32urq6O5NHGONoNPree+/95je/mZ2dFUUxkUh8/etfLxQKP/vZzy5dupTJZKxW6yOPPPKnf/qnTU1Nt91LLRaL3W7Xdb1UKi1bepXEzGRyFltBbrkkeldVVRRFhBAJm1f105VKpfHxcbKJk8PhcDqd93GEUqnEMIzNZuN5HhoMAA8RJLDA+iRJUiQSmZubU1XV6/XW1dWtPDtJEITm5uZgMGgwGBBCU1NTK7yYHMpoNNpsNqPRqGlaoVBYtjOmaZpMpS6VSqQaV1U+nx8ZGRkbG9u1a9eyI10rUBQlFotFIpFisej1ehsbG1c7SJXL5cgYF/lBVtuRAwDAV0R1v4u7DVQAAMCXXTKZHB8fV1XVYrG4XK5l9wVSFOXtt9++cOGCy+USBOHw4cPHjh3L5XKiKGqaduDAgaGhofPnzxcKhVdeecXv9982cGu1Wh0Oh67r+XxeFMU7w3KO4+x2O3mXJEnkNdW/DYfDQ0NDFoulr6+vubl5VT9dsVi8ceNGLpezWq319fVk/UTtbyfjvrlcjmGYuro6WLUAwMMFT61gfYpEImNjY5lMRtd1i8WydBfeZdE0TaZcIYRIKmqFkXbS47Isa7fbXS5XIpGIRCJkQtNtLBZLe3u7yWSKx+OXLl06c+ZMIBBQVTWVSl26dOnUqVPlcnnjxo2r7YlLpdLk5GQymaQoyuv1BoPBVfXEuq6Hw2EyldrpdLrd7tuCDAAAAORuTyYCrFDrEAAAvtR0XY/H4xMTE6T4VHV5wZ33Q0EQBgYGWltbx8fHP/nkk5GREV3Xt2zZ8tJLL/X09Oi6fuPGjWKxWCqV7rxber3ehoYGTdNSqVQymSRLEJa+gOf5YDDo9/vn5uYmJydPnDjhcDgEQcjlctPT02fOnDl16tSePXs6OjpW+wOKojg9PV0oFDweT0NDw2rXAOq6nk6nk8kkx3H19fX3fKYAADxQkMAC61Amk7l48eKFCxdKpRLHcfl8fmFhgQy8LDtTqToKVHuhE/IWv9+/YcOGSCRy48aNVCrV3t5+28scDkd/f39ra+upU6c++eQTnud7enokSZqYmLhy5UqxWHz88cftdvtq1w8Wi8WZmZl8Pk9WPvr9/lXNwNJ1fWxsLBQKmUymDRs2BAIBWEIIAAAr3/Nh/SAAYP1RFCUcDl+8eHFyclJRFFVVSbKGlI5dGqCyLPv1r3+dYRgyoUnX9XK57PF4XnzxxcHBQZqm
DQaDIAgIIbPZfOfU/oaGBlKzNZPJzM7OZrNZj8dzWwBcX1//yCOPhMPhkZGR1157TZIko9EYiUROnDgxOzsbDAabmpqcTufS0L2Wu3c+n49EIuVy2e12k/r0tR8BY5zNZiORSCaTcblcGzZscDgctX86AGDNQQILrDeapn322Wfvvffe5cuXbTYbTdOzs7MnTpwIBALbtm1bNtFzH50QKYzS2dm5Z8+eEydOjI6Ozs3N9ff3Mwyz9GgGg6G3t/fFF1+UJGlqaurw4cMffPABKSfZ0dFx6NCh5557rrW1labpVfWFxWJxenq6XC6bzeb6+nq32117BXcyMXtkZGR2dtbj8Wzbto2EAgAAAO4ENbAAAOsVWdD30Ucfvf/++6Io2mw2TdMuX7586tQpj8ezdKoRxpimaZvNhhCKxWJTU1OiKHZ3d+/fv3/r1q0mk6lQKCSTyUKh0NjYaLfbqxWsqm+32WwbNmzo6uqanJy8du1aOBwm4evS83G5XP/tv/23bDZ79OjRkZGRUChE4naWZQcHB7/73e9u3bqVRK21Z69EUQyHw/Pz86qqOp3O6qqFGo9ANk0aHx+nKKqxsXHz5s0kgQWNB4CHBRJYYB3iOK67u7taoxEh1NDQwPP8mo+W1NfX9/X1+Xy+ubm5sbGxSCTS0NBQ/VvycS6X69ChQ21tbRcuXJiampJlmWyAsmPHjo6ODjL7abUjOYVCYXJyslgsut1uj8fDcVztb1dVNRKJXL9+PRaLDQ4O7t271+12Q5sBAIA7kSUzgiBomiaKItTAAgCsPx6P5/HHH9+5cycZTzUYDHcWcSd5fIZhdF3PZrM3btwolUoDAwOPPvoo2VswlUrNz88jhFpbW0kJrerYajXKbW1tPXDgwL/+67+eO3ducnKyp6fntoLoBoOhp6fn//2//zcwMHDt2rVkMokQqq+v37p1a09PT1dXl9lsXtX0K7J+MBKJZLNZiqL8fn9zc3PtdTMwxrIsX716dWpqyuPx7NixIxgMLv3RoPEA8MWDBBZYh88bLS0tLpeL7P1Hhs3JqNFabcdb/SCO41pbW/fu3XvkyJEPP/ywo6PjtqKVGGOO41paWvx+f39/fywW0zTNZDI1Nja6XK7qa1bVBeq6vrCwMDY2Vi6XW1pampqaalwASD4onU4fP358fHzcbrf39/e3t7eTrY6h5QAAwDJx0q3NsGRZhhpYAIB1FjObTKaBgYGNGzeSHbExxqSUOxn3XRqjkn9RFCUej8/OzjIMs2nTpvb2doRQuVy+ceNGOBy22+09PT2kzHk1tqweIRgMPv7440ePHh0dHR0eHt66dWtra2t1oSKZ5GU0Grdt29ba2jo3N5fP52madrlcLS0t1VTXasNmsmqhUCiYTKampqZAIFD744AkSaFQ6OTJk7Ozszt27Hj22WerqxYgewXAQwvM4BKA9YR0fu6KZfvpNR8wCQaDL7300vj4+JkzZ4LBYEtLy9IBpeonkl0OyabC1RMgf7Xa8ykUCuFwOBKJ6Lq+saL2ClalUunKlSu/+tWvYrHYzp07X3zxRYvFAt0wAACs8IBH7uRQBgsAsM5iZhKg+v3+FW6At/1JsVicn5/P5XIOh8Pr9ZJcVbFYPHfuXCgUampq2rJly5379JHPMpvNXV1du3fvnpmZOXLkiN/v/853vkOKbd32WXUV5JZbTW/d+bJa5HK569evFwqFjo6O9vZ2q9Va+xEWFhbefPPN06dPWyyW3bt3P/LIIyRsBgA8RDRcArBenzfutNrsVS0vttls27ZtO3jwoN1uP3369BtvvEG2cVk67lRNVNEV93cyVdFodGRkRNM0j8fT0dHh9XprGUrCGEuSdOnSpbfeeuv8+fOBQODpp5/euXMnbAYMAAArYCpgF0IAwPqLlqsB6t3C5jvfRSqaa5rW3Nxst9vJH4qiOD4+vrCwYLPZgsEgKVVBdrte+lkURbnd7meeeWZwcHBycvLw4cOnT5/OZrNLhweq
Savbwua7nc/KJEki+5IrirJz587Ozs6ln7Jy2LywsDA0NPTGG28oinLw4MEnnnjibptBAQC+SPAlBOutM77vv106wE6eUsi/4yXufAtFUT6f7zvf+c6uXbsikcjrr7/+3nvvVTvjpWNHqzqZu50eRVGjo6NDQ0MURT3yyCO9vb137vNyt58rHo//+te//rd/+7d8Pv/KK68899xzFouF1DuAlgMAAMveqAVBIAvDSQ0suGECAL7KYXMqlZqdnaVpure3t1oNg+xdWCwWBUHAGL/11lt///d/f/HixduOhjE2Go07d+48dOjQhg0bzp8//8Mf/nB8fLwada88uLuqyJncq+Px+NWrV+fm5hiG2bdv35YtW1aexlWNmXVdHxoa+vGPfzw2Ntbd3X3o0KFHHnmEVK2FZgPAwwVLCAH4Xad148aNhYWFcDj88ccfp9NpXddPnTrl9Xp9Pp/D4Vg61rS0K0UIeTyeP/mTP2loaHj//fd//vOfnz9//umnn3788cfr6+trX9+3tIOvbnRY7WJlWZ6fn7906dIbb7wxMzPT1dV16NCh/v7+e87kIlXbjx49+vHHH1+8eLG1tfWb3/zm888/HwgE7iMgAACArxSapsn2spC9AgCAWCw2OTkpCMLSBJbBYOjo6GhsbDx//vwPfvADp9O5Z8+e9vb2O0NciqLMZvP+/ftpmv71r3994cKFv/iLv9i9e/eBAwe2bt0qCMJ9nBIJmMnwM7lLa5qWyWRCodDhw4dff/11q9X65JNP9vf3G43Gex6tUCgMDw//5je/OX78eKFQePHFF7/5zW/u2rWLnBvEzAA8dJDAAuAmXdcnJiauX78ej8cXFha6urokSVIU5cyZMy6Xq7W11el03pbAqq7J53l+cHDQZDL5/f4TJ04MDw9LkiQIwvPPP19LZ3kbnudtNhtJnFXfHg6H33jjjY8++igcDnd3d7/wwguPPfZYtUDACrLZ7PHjx3/961/Pzs729/fv27fv4MGD1d0PoTMGAIC7PRTddoe85wQBAABY3wRBCAaDgUCgr6+vrq6O/KHNZjt48CCpNoUQ2rNnz/79+xsbG+98O9nKsLGx8Wtf+5rL5aqvrx8eHn7nnXfIZogtLS2rHfelKMpkMvl8vnQ6bbPZBEGgaTqbzQ4NDR07duzkyZP5fP655547dOhQY2PjbeW07qQoyszMzDvvvHPs2DGbzfbUU089+eST27Ztg9JXAPzhgAQWAL97VnG5XA0NDWTHQLI0T9M0RVFYlvV6vcuOCy3dJHjTpk3BYHD79u2nT5/O5/P3UaC9Ggds27btu9/9bnt7O+luyaYwkiT5fL7e3t4dO3bs3buXxA33LKel6zpCaOvWrY899tj+/fs7OzsNBgNsAAwAADV2DaQ70HWdbEQINVAAAF9Z7e3t3/rWt8gWhGazmfyhyWTauXOnyWSan5+32WyDg4N3jvhW76gkheRyuZ588smenp5Tp05dunTJ5XLdX5FBhFBra+srr7xCNgp0OBwIIU3TyuUyxphst71v375NmzYZjcYad09yu90vvfRSf3//1q1bvV4v6QIgbAbgDyUwg/nwAFSXyiuKomnanUPuZBUJx3F3e26p9mok0yTLsqZpZL+V1T7qkBF+chCGYXieJ0eQZTmVSpEowWAwkO2Na+lQVVUtl8vkRxAEoVozC3piAAC4p0Kh8Jd/+Zd/+7d/GwwGf/jDHz711FOw9wUA4CtL0zRVVUlmf+muRCR2JYOmHMetEGEujZkxxrIsS5LEMIzBYCDrtVcbNmuaRgJvEuXSNK2qajabLZfLBoPBZDLxPE+OfM/Ql/wUpVKpGjOTIBxiZgD+cMAMLACoasaK5/l7dpMrVGQnf8tV3N/qvOrQEM/zt3X/HMd5vV6Sh1r65/f8CJZlqzOfl54n9MQAAFALshMWeUyCjQgBAF9ZZAHg0lV+SzfdJvtdrBww/3/2/uxJjrS8G/5ry6WytqzKqsral16k0WgG
pMHAgOH58bA8/E4cENhhO8In9omP/d/4yD70CQGBD/Dywhv45YGXYWAYRsMw6rX2JZfa91yq3iAvVO6n1dJIGo3Uan0/diiaGrVayq6uuu8rr/t73b8W5TiOlt9PMKSbfrPPcXbh7fV6o9FoJBLZDjF8lCLU9k8LhULnPgtrZoDLAwUsgMfwiONazt5ceqz3vHN/wtk/52wn1xO8u5/9GG/DAABP9sqMqwEAeDH8yIXxRy41H1Qe+jjL5nNr3e3K+RFXv9uFN0pXAJcZchwAPsE3+Cd+2zsXHnzuz3myd3fktQMAPJazt/cpA4uOmQMAwFNcMD/FP+fRq2xYMwO8iFDAAvhE9jy4UQ8AcAU2V9sTLrZtL5dLy7JwWQAAsFQGgOcCRwgBnqb1ej2dTnVd93q9gUCA53mGYSgDEndyAABeOB6PZ5v5gh0XAMDHQdO9l8vldDq1bTsQCEQikbPhWQAAD4cCFsDTtFqt7t69+4Mf/CAQCOzv72ezWUmSRFEMBAI0xJDGoOBQPQDAi2L7io0EdwCAx7JerzebzdphGMZkMun3+61W64MPPlitVtevX//KV74Si8VwoQDgEaGABfA0jcfjd9999z/+4z9ms1kkEgmHw/F4PJfLZTKZtCOTycTj8e18k+2+CPUsAIDLaTtbdrVaIQMLAOBBtj2qm3sWi4Wqqu12u+VoOFRV1XXd7/d/7nOfu3XrFgpYAPDoUMACeMrv3B6PRxTFwWBweHg4nU49Ho8kSYlEIplMplKprCORSMTj8Wg0GovFRFHked7n81Fz1v0J7gAA8Lx4PB7WsVqtDMNABhYAwNl179lfbdu2LGs6nQ4ciqJ0u916vd5sNtvtdqfTUVV1MBi43e5wOFwsFlmWxTUEgMeCAhbA03wXj0QiX/7yl2VZbrVaiqPb7fb7/clk0mg0Dg4OVquV1+sNhUKyLFM9K51OJxx02DAcDgcCAUEQOI7DJQUAeI4v6ds+WRrHTqdgcGUAALbm8/l0Oh2NRsPhUNM0RVGo36rdbtdqNV3Xl8slwzBBx6uvvhqPx7PZrCzL+Xx+d3c3kUhsX29xMQHgI6GABfA08Ty/t7dXKpUsy7JtezKZNJvNer3e6XS63W6n02k2m71ebz6fN5vNSqVimuZ6vRZFMZVK0UnD7WFDSZL8DkEQ/H4/x3G0gwIAgGdgu5ui+EKEuAMAUJTVfD6fzWbz+Xw8HiuK0mw2G40GrXibzeZoNHK5XCzL8jwvimIkEslms/l8PucolUq5XC4QCPjuwbEDAHh0KGABPOXdzvbN2OVyhUKhZDL52muvWZa1Wq2m06mmaZQFcHx8XKlUTk9PG40Glbfef/99n8/HMEw4HE4kEqlUqlAoFIvFvb293d3dVCrFcZzX66WBhgjPAgB4Ni/sLMvSkCxkYAHAS2VbtafjgZvNZj6fa5p2fHx8cHBwdHRUq9U6nY6u64vFgu7d2rYdCoVyudze3t6nPvWpUqmUcoii6Pf76eWUYZizy1e0XwHAo0MBC+ATfNffhqfQI4lEIp1OL5fL+Xw+HA4nkwkFBDQdrVZLVdVerzcYDFRVPTw8DDnC4XA0GqXwLIqBTyaTsViMDhv6fL6zMw2xAgAAeIrcbjdtt6iAhQwsALjaa9ftB5vNxjTN6XTa7/d1XW+1WnTbtdVq9fv9wWAwHA7n87lpmoIgpFKpdDqdy+UKjnQ6HY1GJUkKh8OCIPA8T6+iZ78Q1q4A8ARQwAL4BLc95x7xeDyCIxaL5XI5etAwDF3Xu92uqqqapnW73Xa7rSgKVbL6/X6tVpvNZvRZsiwnk0lKhU+n06lUipKzJEmKRqOBQOBcEjyWBQAAH5PHQWdncDUA4Cq5f27garXq9XqapumOTqdDd1gVRWk0GoqizGazQCBAU7ZjsZgkSXmHLMuZTCaVSiWTSZ7nL/xaWJ0CwMeEAhbAc1grnH3bZlmWQq+284YHg4Gu65qmbacOd7tdXddHo5GiKKenp6vV
yuPx0ElDCoPP5XLFYpE6tKlpKxgMchzHsizCBQAAPo5tAQsZWABwxVakpmkahrFYLCaTydChaRplXLRarU6no2naeDz2er2CIEQikRs3bsRisXQ6XS6Xi8UiTdZOJpOiKNI91PsXnNt1L9aiAPDxoYAF8Kxt37/PvaPTr4FAwO/3y7JMo4hN06TwrHq9XnE0m81tPYtS4S3L2mw2HMdJkkSVrN3d3Xw+n0gkYrEYhQ5QMYtl2W2LFgAAPMorNuegxgQcIQSAF9pmszEMY+VYLBb9fl9RlE6nU6/XDw8PT09P2+32fD7fbDYU6srzPN1nzWaz169f393dLRQKNGiIYRj6PZTQev8XQt0KAJ46FLAAnue+6P53erfb7XVsH3S5XLZtp1KpmzdvLhYLys+ijm4Kz1IUpdVqDQaDdrutadoHH3xA4wvpFhkNN6RGrXw+H41GGYbx3oMweACAh79Q03gNalVAiDsAvEC2BwMpYd2yrOl02u12aQ3ZbrdPTk4o04rqWcvlcrPZhMNhmhuYyWQo0yqbzVIwazAYpECr+5utzp0wwMISAD4JKGABXKJt0v3Lju1kQ1o3bP/TwjGZTKbT6Xg8VlVV1/V2u12tVk9PT6vVar1eNwyDYRhBEAIOURQpm4AiNovFIt1D43ne4/GcOxeDZQcAwKO8UAMAXB73H3NerVa6rlcqlWMH1a2okX86nY5GI9u2eZ7P5XK3b9/e29srFAoZRywWCzoCgQDLsh/56oeXRwB4BlDAAnjBdkpU1dr2WFGuMN03Gw6HXUez2aS4TSpsDQaD8Xjc7XZ/97vfCYIQj8dlWU6n08lkklLhM5lMNBoN38PzPLWAIWsTAGDbFWtZFnLcAeBSOZvCTi9Ts9lsNBoNHNSkv23bb7fbg8HAsiye58PhcKlUoqFA2Ww2k8nk8/lcLpdIJGh04LlTgecarAAAngsUsABeMPdHaHk8HqpnRaPRcrlMRw4Xi4WmaXTGkGLgm81mp9OZTqez2ezg4OA3v/nNarXieV6W5Z2dnUwmk0gkqD8rkUiIoigIQigUouHHPp8PqxYAeDmxLEsTtZCBBQCXymazobuY0+mUuvJ7vV673aa1X6vVqtVq9Xp9NptxHEe9VHt7e4lEIpfL5fP5bRZ7KpXy+/3uey6sVWEdCACXAQpYAC+qB4XBb48c+v3+dDp969Yt0zRns9lwONR1vdfrNZvNhqPdbo/HY9M0Dw4Ofv/736/Xa5/PF4lEaLghBR9ks1lZlkVRDAQCPM9TEjzFdmIpAwAvA6/X6/P5KPkYBSwAeI4ozco0zaVjPp9PJpNut1tz0AKv1+tNp1Pbtql7NJFI7O/v5/P5Uqm0XddJkiSKInsPTaw+t8JEyxUAXEIoYAG88B4UnkWjYfx+v8vlikaj6XTatm2al0yLHsp9r9frH374Yb1e73a7vV6vVqtVq1X6XIZh/H5/PB4vlUq7u7ulUimdTqdSKUmSIpEIx3E0d8bj8WCJAwBX/mUW2zkAePaozYoYhtHv9zVNa7Va1Wr16Ojo8PCw0+mMx2PDMGjQhM/ni0aj+Xy+UCjs7u5eu3atXC4nEolAIMBxHEWwn50XdHbp+PDlJQDAc4cCFsCV3Wude4RqUhzHbR80TXN/f388Hn/pS18ajUb9fr/b7XY6HV3XVVVttVqNe05OTn79619HIpFgMBiPx1OpVCaT2aZopdNpGm6IShYAXD1Uqd+O8cIFAYBngDKt1uv1dDpVFIVOBXY6nUaj0Ww2e73eeDweDof9ft8wDFEUs9lsOp0uFAoUByHLcswRjUZDoRCdgz7359/fyw8AcMmhgAXwUrhwXUJ949FotFgs0iPj8bjf7/d6PUVR6vV6pVJpt9u9Xq/vODk5mUwmm80mEAhQEjwtknK5XCaTicfj1JEeCoUCgcD2mCFmKgPAC41hGI7j3G73arVCAQsAnqKzQwPpY9u2V6vVbDajFHbKtKITghT+oGnaZDJhWVYUReqR
j8fjhUKhVCrROcFkMkkzpu//Wg9ahmF5BgAvChSwAOC/VzaBQEAQhEwmc/PmTcuyaAml63qn02k2m3TMsNvtDgYDerxer//0pz+lT5RluVwu53K5VCqVSCTi8TgNsqHQUEEQHmUGMwDAZeP1ehmGofFeKGABwFNffVGUFQWx67re7XbpPiIVrbrdrmmaLMv6/X5BEG7cuBGPx2loYMGRzWaj0SjHcdQLT+3wOBIIAFcSClgA4NrelzsXiBAMBiVJyuVyr732GoWGzufzjqPdblMfe6VSURRlNpvR/zRNk3q7ts1Z+Xw+k8nkcrl4PB4MBnme9/v9lAdPayysqADgMtvuBtfrNa4GAHzM5RYFklIa6WKxmEwmzWbz9PS00WhQslWz2VwsFrQqYxgmHA7HYrFisUgp7Pl8vlgsyrIsCALFlZ6LYD+3tMM1B4CrBAUsAHA9JAmeNm902NDlctEqan9/f7VaLZdLGthMeaKNRuP09JROHQ6Hw1qt1mq13n33XUqCj8VikiRls9mSI5/Py7JMkw3phiGlzNDfBOstALg8aAohOrAA4AmcjWCn44HUZlWpVE4czWZT1/XBYLBcLk2Hy+WKRCKFQqHs2N3dlWWZutq3Qey0Krtw2faQpR0AwIsOBSwAuNi5dc92YUTFrGAwuB3nvFqtFosFxcCPRiNN05rNJrVoqaqqKIqmaZVKxefzBYPBWCwWiUREUUwmk+l0mrq06MhhNBoNh8N0VAcA4JKgDCyXy7VcLi3LwgUBgEdnmuZwONR1XdO0drtNd/s6nc42YHQ6nbrdbloXybKcd1CPVSwWE0UxFov5/X7qW79/VfagZRsAwJWEAhYAPBJaGJ1dHm02m21zVjAYTCaT9PhqtaKeLEVRdF1vtVptx8gxGAxOT09Ho5HH44nFYqlUSpblZDKZSCRkB1WyotEo5cEzDHPhUEV8RwDgGS2VnEM61IGFU4QAcNbZFHayXq8Xi8VgMNA0bTAYqKrabrdbrZaqqjTrudVqrVYruqW3s7MTjUYphT2bzW6nPCcSCZodcbZWhSOBAAAoYAHAE3rQ9GWO49IO+p+GYVDdSlXVRqNRq9Uqlcrp6Wm3253P581m8+joyDAMt9stCEI0Gs3lcqVSidrmc7mcJEmCIPA8TzHwdJbn7E1ILOYA4JPeoN6/RwWAl/k14ezHdO7PMAzKtBqNRt1u9/T09MMPPzw5OaHS1Ww283g8FAOazWYlSSoWi/v7+9evXy8Wi8lkMh6Ph0KhC5c020R2LHgAAFDAAoCP61xb1rkFFsMwdDYwm82+/vrrhmM6nbZarWaz2Wg0ms2mqqqdTqff769Wq6Ojo8PDQ8reCofD6XSa5uzQoOh4PB6LxSgDgvh8PizpAOCTQyF91FiBDiyAlxxVtFf3LBaLbreraVqn06FbdJVKRVXV1WpFCXputzuRSLzyyisUm1AsFvf29nK5XDQa5XmeRgf6fL5zU3TOLaiwzgEAIChgAcDTdGFyls9BITJbhUJh5phOp+PxmCYbUqc9URSFpkffvXvX7/cHAoFgMCjLMvXY0wf5fJ7a7CkDnopZW/h2AMBTWCrdO0JomiZC3AFeNlSxWq/XdIjYsqzRaNTpdOr1erPZ7HQ6p6eniqKMRqP5fD6bzRaLxWaziUQiOzs7pVJpG/cpy3IkEgmFQuFwWBCE+ytWD19QAQAAClgA8Ml60PJrs9nwDkmSqBXfMIzFYrFcLmez2WQyoTB4midddzSbzclkYts2deAHAoFQKESJp9lstlAoFIvFcrkcj8eDwSDP87j4APBUUAcW7WBxlhDgZWPb9ng87vV6J/fQ3bVerzeZTGaz2Xw+93g8kUgkm83evn372rVrpVKJYj1FUQwEAoKD6uAPil8AAIBHgQIWADwHZxdt9DHVs87+nuVyORgMKAO+2+02Gg1FUVqtVr/fH4/Ho9FIVdXf//73Pp8vHo9nMhlqyKJR05QKT/c5I5FIMBh8UK4EvhcA
8IivV6heAVw99/9c27ZNN9JGo5Gu66qqtlotRVFqjnq93u/31+t1KBSKRqM0iIbarDKZTD6fp66rUCjk8/nu/1pYeAAAfBwoYAHAJV1QchxH68Jbt25tNhvDMHq9HiVndbtd+qDVag0Gg9ls1mg0jo6OVqsVy7LRaDSdThcKhXQ6Tc1ZuVwuGAxS35YgCEjOAoBHR7MjaFtrWRZ2oQBXbL2xXq8pgp1iDYbDIXV/N5tNqli12+3FYkHzZMLhcCqVSiQSxWKxVCrl8/lyuZzNZhOJBM/zbreb5sxsk9fPfi28bgAAfEwoYAHAZUSLvLNLPYZh/H5/Mpm8efOmYRjL5ZJa+jVNoyVmo9GgetZ8PqcVp9vt9nq9NKmaThrSMcNUKhUOhykJnmVZn89H8w2RnAUAFyyVzmRgrVYr9GEBvNDW67Vt2zQ3kFLYqae7Wq2enp5SuUpRlMViQXMb3G53JBKhclW5XN7d3c1ms7IsR6PRSCTCsizP8xTEfv/XQrEbAOApr8pwCQDg8qMlII2g3p40pCxVKmbR7OrJZNJut6vVKvX5K4pCvVonJycMw9DnCoIQjUapmFUsFjOZTDKZzGazNA9oW8ZCMQsACL34UJsGxWDhxQHghVtFENu2V6sVHQys1+u1Wo1irTRNG4/HlMVpmibLsolEgqIJdnZ29vb28vk8hWwGAgGe51mW9TjuX6uce/XAxQcAeIpQwAKAF2MDef+60OPxsI5gMEiPrNfra9eujcfjfr8/GAyGw6HiUB1Uz+p0OtVq9eDgIBKJiKIYCoUoeDWVSlFyVjqdlmU5Ho8LgnCukoWVKMDL/Cr0oD0qAFwq1Ca5/ZXmBlKMZrfbpUnHnU5H1/XhcDhwuFyuaDRaKBQymUwqlaJMK1mWY/eEw2GWZR/+RfHKAADwSUMBCwBesN3jQxaOHo8n7MjlclTPopmG/X5fUZRms9loNChCq+/odrtHR0eGYXAcFwqF4vF4Op3O5/O5XC6fzyeTyWg0KopiJBLxO+iO67kDRFiwAlxtHo+HUvPo2NF6vT7XdgEAz9H9p3oty6JAK6pP9Xo9araitIFOpzMcDlerFc/z4XA4Fotdv35dluV8Pl8sFvP5PJ0QFEWRMq3uX2w8wSoFAACeChSwAOAquH/hSCvagCOZTF6/ft12zOdzWsseHx/XarVOp6Oqarvdns1mmqa12+1f/OIXtm0LgpBKpfb29mg5K99D92B5nuc4jmEY7GMBrjyGYTiO83q9pmnS8SI6a4wrA3BJrNfrlYPCBHq9HvVYnZ6enpycVKvVZrNp2zb9LHMcR/eo8vl8oVAolUrXr1/P5/OiKNJtKrofRu/v5ypW+MEHAHi+UMACgCuIVpzbhSYtQymGORAIRKPRXC53+/bt5XK5WCwGg0G9Xtc0rdPpNBydTmcymaiq2uv13n77ba/Xy/O8JEm5XC6VSsmyTKcM0ul0MBhk72EYBknwAFcP7WbpIPOF7R4A8Izf4mkkKKWwr1aryWRC3VWVSqXdbtNRwfl8bpqmbdubzSYUClHFqlwuFwqFVCqVzWaTySQFWtF4YtyRAgC4/FDAAoCrueF80KrX7Xb7fL6gY/vg66+/Tknw4/F4OBzqjna7fXJyUq/Xu90uPXh0dEQ3b2mQdjwez2azuVwum81SGHwkEgmFQlTJ8ng89CtKWgAv+uvJNsedNsO4JgDPDP3EUQ/1er2mRsjBYKDreqvVorEtzWZTVdWpY7FYWJbl8/kikUjpnp2dnWQyKUmSKIrhcJhiAe6vWCHHCgDgkkMBCwBero3ohStjWsvSdEJaH8/n88lkomlav98fDoc0zZDOG1Kv1vHxsWVZoVCIFsSiKEqSlMlk8vl8Op2O30PDDXHlAV5cNC/C6/ValrVcLi3LwjUBeJZM06QsSzryT51Wqqr2+31d13u93mQyYVlWkiTqq8pkMsViMZvNJhIJ6Z7toOGzCwAMDQQAeLGggAUAL68L77V6PB5qsxJF
MZ/P0+B8utlL1Suavd3pdGjRPBqNOp3O4eGhZVnhcDiZTKZSKUmS6ANZltPpNCXBU8A87YTRlgXwwiyVfD6GYXw+n2EYlmXZto1rAvDJvS+v12sKrByNRrquDwYDqls1m01FUTqdTrfbVVWV7iGJonjjxg1RFOPxeC6XKxaLqXtisRhlWm0rUxfexMJ7MQDAi7QqwyUAgJfWhctWSrrZFpho7UsJ7jdv3qSDDLPZbDwe67peq9UODw8PDg5OT08VRRmNRo1G4+TkxDCMzWbDcVwkEikWi+VyeX9/v1wul0olSZLo/AJtiX0+H500/Mi/GAA8L9sf0vV6jSOEAB/f2Z+jbaaVaZqr1Wo+nw+Hw06nc3R09P7779+9e7fZbPZ6veVySe2QPM9Ho9FEIrG3t/fKK6/cuHGDTghGo1G/33/hG+iDClV4twUAeLGggAUA8BEr2nMLX4/HEwgE/H5/LBYrlUpvvvkm5WdpmtZoNOgu8fYW8XK5rNfrzWbzZz/7mcfj4XleluXtoO6kI5FIhMNhGmtIVS2v14vvAsDleUGgV4DNGbgsAE+M2qwMwzBN0zCM6XRKuZPtdrvmqFQquq5blkW/0+VypdPpRCKRdRSLxf39/UwmEw6HeQfHcXRD6EHv3ShUAQBcDShgAQB89Pb13ILY62AYRhCE7X/a39+fOKbT6XA41DSt2+1uF+VU2+p2u81m8+7du8FgMBAIBINBSZLS6XQqlco40o5gMEhfgiZ5u+/B9wLgOSyVfD6qL9u2TacIcU0AHt227EtH8qnZajgcUrmq0WjQO2On0xmPx/Q2OpvNPB5PIpHY3d3N5/OZe9LpdDgcjkQiD8mX3Nat8KYJAHAFV2W4BAAAj+4h8w29Xi+ludM5o9VqNZvN5vN536FpGoV3UAytoih0AtGyLJZlw+GwJEnxeJwyaGVZpggPmmxI1S6WZXH9AZ49r9fLsqzP56MCFvWDAMCjWywW0+l0MBh0HC0HtSprmjYajebzucvlCoVCyWTylVdeyWazdDsnl8slEolYLEZvhRzHPbxFGnUrAICrDQUsAICP6/4FtNfrpcmGm80mn89vbz5PJhMKo6VFPDVnURj8cDisVquj0cjlcomimMlkcrkc1bASiUQ6nZZlmQpkkUgkFAoxDIMrD/BsnG2BRAEL4CNZljVwDIfDfr/fbDbpja95z2g0Yhgm6tjZ2YnH44VCgepWOYcsy36/fxvBfmFTFSLYAQBeNihgAQA8hc3tgx48d/wwGo1GIpHd3d31em2aJg1XUhSl2+02Go2qQ1VVwzD6/X632zUMw+VyhcPhVCpF2R/5fJ5W+ZIk+e+hyYYYbgjwCf2A00leKkMjAAvgHNu2KdOKEiFns1mv16tUKrVarVqt1uv1VqulaZphGAzD0JBfqlXt7OyUSqVcLkfJVrFYjGEYzz0f+Y6G9zsAgJcNClgAAM/CdrLhduAgz/PBYDCVSpmO1WpFfViUBlKr1U5OTur1uqIos9ns+Pj49PSUjjLRACZZlguFQrlcvnbtWiqVEkUxFArxPE8Z8KhnATy1pZKTgcWyLB0NNk0TNSx4yd/OqGhlWZZt26ZpLhaL4XCoKEqlUrl79+7x8XGr1dJ1fblcWg56y5Nlmd6zrl+/XigUaG4gHZBnWfb+44HosQIAgPOrMlwCAIBn4EHrclq40yOyLFuWtVqtptPpZDIZj8eDwUDTtFarRYcvKEiLHB8fv/fee4FAgJJBaDzT9tQhHTwUBMHn8527lY3NAMDjovGgdDDKtm1cEHipUMXqbAr7arVSVbXrqNfrDUe/3x+NRmOHYRiCICSTyXQ6nclkstlsoVDIZDIU7CiKIgVa3T9y91zFCm9YAABwFgpYAADPx3Zdfna9ToPAA4GALMv0n+jO9mAw6Pf7dOSw2WzqDk3TdF0/OTmZTqcMw0QikVgsRpEiyWQym83GHclkUpblWCwWCASoP+tsJQvbA4BH+WmlnxT0XsHLYFux
BEGYmJg4ffp0JBLZu3fv6tWr7xgNAAB4shBgAQAAADxlyLd6fsajHId8M2cYpqGhYf369efPn//22283b95stVrtdvsdcUP+jDqdrrCwkHRKuvuq5nlqSZL8fv/w8LAkSSaTqby8fP5LCMmJpqamvv7665s3b/I839LS0tjYqFarn7pHmclkzpw589577/n9/qKiogMHDuzcuZPESYtb+JPv2T/HSs87zkhaSk1OTiqVyrKysoKCAvLE4/F4X19fMpmsrKy8u1KMHESpVK5fv76jo+O99977/PPPV65c6XA4VCpV/vj5G1TNMBgMoijSM+6eovMhiuL4+PjExEQikXC73Q0NDaSR/PzHx+v1njlz5vr16zzPb9y4saGhYdGL4AAA4FHgH2UAAACAp8ziloRQFOV0Ords2bJp06ZkMvmHP/zh/PnzsixLknT3GUnowLJsvkDm4a5KFMXJycmBgQGVSlVTU+N2u+ffvkqW5Ugkcu3atb/+9a9TU1Pbt28/fPgwad++oOWTT1wulzt9+vQHH3wwNPT/2bu33qiuQ4Hj9oxn7PG1rmPAcU3ccmljQgQkLSkUkgbSRrhFpRVNEylC6kNf8lCpn6Bv+QRp1apK6SVSlFapRIN6UVISSGsQROAEl/jYjQmBDFjYQI1tZuzZ+0jeOhaHNMRQgxf27/cQWXZm75k162Hmz9prD9TX13/ve9/7yle+Mh2YZv1dvtH/p1QqDQwMfPDBB9XV1a2trdN9cHR09NSpU4VCob29fcmSJVEUXXMF4nQV3bJly/3333/mzJmXXnrp4MGDk5OTH50wyW+SSTW9CddNvMBSqdTf33/27NlcLnfPPfcsW7Zshivykqk+Ojr6+uuv//KXvywUCo8//vj69euTKxDvrBkFML9ZgQUAsHAlQSqXy33xi1/8wQ9+sHv37v379z///POlUmnz5s1NTU3XXBh4TeOY+Zqpq39Iti0/duzYkSNHGhsbd+7cee+9907njOscM3l4clXaCy+80NPTs3nz5l27dm3atClpH3fQ1V7FYrGnp+fFF1/829/+Vl1dvXXr1u9+97srVqy4eif1uX2GURT19vYODAzU1tYuXbq0qqoq6UQjIyP5fD6TybS3txeLxa6urkwm09HRMX3JXhzH6XS6urr60UcfjeP4Jz/5yWuvvTYxMZFOp9etW5ekxquXYl1z3ht64dP968qVK6+++uqJEyeWLVvW2dl59913z2T7qqReffjhh6+++uru3bt7enq+8Y1vfP/731+1atUN7R8PwG0gYAEALFzTX+/r6+s3bdpUKBTS6XR/f/9zzz03MDDw4IMP3nfffXfdddfNHTyeklSP5IeJiYmhoaHe3t633377r3/968TExMaNG7ds2dLc3HzN8/noocrLy8fGxv7+978fPnx4//7977333te//vUnn3xy/fr1SVu5gzbbjqLozJkze/bs2b9/fxRF69at+853vrNs2bLKysob3U3s1ikWiydPnszn8+3t7W1tbckFgMlO8Inu7u6LFy+eO3du7dq1K1asuGZSxXG8ePHixx57bHh4eM+ePcePH//pT3/61a9+9YEHHlixYsWNblA1PW7Jf6eXB5ZKpcHBwf+ZcuDAgXQ6vXnz5i1btiTLrz5xGIeGhg5Pee21186fP799+/YnnnhizZo1yXpA27cDBEXAAgBY6JIv6s3Nzd/85jcXL1780ksvvf76688//3xPT88zzzxz0wErm83W1dU1NDTU1NQklxyOjY0dO3bs17/+9dGjR0dHRx999NGnnnpq6dKlM7nfXBzHg4ODP//5z7u7u+vq6h577LFdu3Yl98W741rDyMhIV1fXH/7wh3/961+rV6/etm3bI4880tDQEM4isjiOr1y5Mj4+ns1m29rapldgpVKpurq6tra2EydO7N27d9GiRatWrWppaUn++lHNzc1PP/10S0vLyy+/fOjQoXfffbezs/Ppp5++uYBVWVlZW1vb0NBQ
XV2dSqWSptnd3f2LX/zixIkThULha1/72vbt21esWDHD9VMnT5782c9+9u677+Zyuc7OzieeeOILX/jCnTijABYCAQsAYKGb3lG7pqZm/fr199xzz7Zt27q6uqIouman9plLp9Of+9znHnnkkZaWlo0bNyYVbHJycnx8PI7jtWvXrlmzZtOmTffdd19yx71P7AWTU9ra2lauXLlhw4b777+/ubn5jlt7lVS8d95557e//e17773X2NjY2dn5rW99a3rHpXACViqVeuCBB3K53OrVq1tbW1OpVPLL9vb2Xbt27d+/v7y8fOPGjRs2bFi+fPlHA9b0pKqvr9+6deuqVavefPPN3t7eXC53cxtLTc+olStXPvTQQ8meXMVicXx8PJPJrFmzZuPGjQ899NDKlStn0kMTqVSqtbV17dq1GzZs6OjoaGpqmuFsBGAOPq7YmBAAgKu/tEdRNDY2du7cuWKxePfddycrg25UcpChoaHR0dGmpqZPfepTlZWVhUJhaGjozJkz2Wy2qanprrvumvkd95IDnjp1qqqqqrm5eXqz8zurNURRdOTIkV/96le/+93vLl26tGPHjh/96Edr167NZDJBvZAoiiYnJwcHB0dHR+vq6pqbm5M1TcklhOenxHG8aNGipqamTCbzcVuYXT2phoeHL1y4UFZWtmTJkuvcDPE683NsbOz8+fOFQqFxSkVFRTKjPvzww0wms3jx4sbGxmw2O8MZldwN4PTp08kLnL6NgHoFECYBCwCAa7/Y/7/Pizf1Zf6jHzKTrPCfP5J+0k7b//Gxd1xlKJVKZ8+efeGFF5577rmhoaGOjo5nnnlm586duVwutNfycaN9nXfwOt1nVt67mc+oG723wJ07owAWFJcQAgAw+1/j/+NBbu7IyaPmQVwYHR1944039u7de+rUqWXLln37299++OGHZ2W/8GRh1Llz5/L5fC6XW7p06c2tm/vEd+o6T/Lm/jQnM2oWnxUAt03KEAAAwK02Njb21ltvvfjii0ePHl28eHFnZ+f27dtbW1uTv/6XMSVZ2/Xyyy//8Ic/fPbZZ48fP27AAZhnrMACAIBbKI7jKIp6e3t37979j3/8I5VKrV+//qmnnlq+fPlsbX0VRdHp06e7u7t7enqKxeLIyIhhB2CesQILAABurXw+v3fv3gMHDgwPD69evfrJJ5/8/Oc/n81mZ2u/8DiOk+sH4ziuqqq66XtHAkCwrMACAIBbJYqiS5cuHTx48JVXXsnn8+3t7Vu3bn344YdrampmcRumycnJDz744P3334/juLa2NpvNGnkA5hkBCwAAbpVCodDT07Nnz57jx4/X1tZu27bt8ccfb2lpmcVTxHF84cKFvr6+06dPx3FcXV2dyWSMPADzjIAFAAC3ysmTJ1955ZW9e/eWl5d/6Utf2rlz57p16+Ips3L8KIouXLhw6NCh7u7uy5cvV08RsACYfwQsAACYfVEUjY+P75kyPDz8mc985stf/nJDQ8Pg4GChUPjvA1b5lNHR0SNHjvz+978/evRoHMepVKqmpsYlhADMPwIWAADMvgsXLuzbt+8vf/lLX19fWVnZpUuX/vSnPx09erSioqJUKs3WWSYmJvL5fH9//+XLl8vKylKpVF1dnYAFwPwjYAEAwGyK43hiYqK/v/83v/nNO++8Mzk5WVZWNjIy0tXVNVu7tl9zuun1XOXl5VZgATAvCVgAADCb4jg+efLkH//4x3379o2MjFz9+9na+urjpNPp2trayspK7wIA84yABQAAs6lUKvX19e3bt69YLFZUVNyKVVcfd97Kysr6+noBC4D5R8ACAIBZE8dxFEVVVVXt7e2FQiHZl+q2nXfRokVtbW1VVVXeCADmmfJbvYwZAAAWjiQknT9/fmBg4OLFi8m+VLft1LlcbuXKlc3NzRUVFXEc37ZTA8CtJmABAMAsi6Io/j+387ypVKq8vPz2rPkCgNvJJYQAADCb4jj+7xPS1TcWvInHWnsF
wDxjBRYAAIQliqLLly+PjY1VVVU1NDSoUQBgBRYAAIRiYmJibGxseHj40KFDPT09a9as2bFjh4AFAAIWAADMsWS3rFKpdObMma6urjfeeOOtt966ePFiOp3esWOH8QEAAQsAAOZYFEUXL148ePBgV1fX8ePH33777Xw+X1tbG0WRwQEAAQsAAObe5OTk2bNnDxw48M9//vPTn/50R0fHlStXSqWS+wkCQELAAgCAORZFUaFQ+OxnP/vggw9u2LDh8OHDzz77bH9/vxsuAUBCwAIAgDmWzWaXL1++ZMmSysrKqqqqbDZr7RUAXE3AAgCAuRTHcTqdrp9SVlY2MjJSKpWsvQKAq/mHHQAAmEvl5eXTPyfd6urfAAACFgAAAAChE7AAAAAACJqABQAAAEDQBCwAAAAAgiZgAQBAQOI4jqIonpLcjjBhZABYyCoMAQAAzK0kUY1MGRoa6uvr+/e//z0+Pp7P53t7e+vr66urq2tqarLZrBsUArAwpX/84x8bBQAAmFvFYvHYsWNvTvnzn//8/vvvj4+PF4vFoaGhwcHBYrFYN0XAAmBhsgILAADmWBzHhUKhr6/v4MGDly5dymQyHR0dURRlMpmBgYHh4eEoitra2lpaWowVAAuTgAUAAHMvm83ee++9lZWVURRls9l0Ol1eXj45OTkxMVFWVtba2trY2GiUAFiwyu0HCQAAc2vmn8ldQgjAwmQFFgAAzKU4jmUpALi+lCEAAIA5pF4BwCcSsAAAAAAImoAFAAAAQNAELAAAAACCJmABAAAAEDQBCwAAAICgCVgAAAAABE3AAgAAACBoAhYAAAAAQROwAAAAAAiagAUAAABA0AQsAAAAAIImYAEAAAAQNAELAAAAgKAJWAAAAAAETcACAAAAIGgCFgAAAABBE7AAAAAACJqABQAAAEDQBCwAAAAAgiZgAQAAABA0AQsAAACAoAlYAAAAAARNwAIAAAAgaAIWAAAAAEETsAAAAAAImoAFAAAAQNAELAAAAACCJmABAAAAEDQBCwAAAICgCVgAAAAABE3AAgAAACBoAhYAAAAAQROwAAAAAAiagAUAAABA0AQsAAAAAIImYAEAAAAQNAELAAAAgKAJWAAAAAAETcACAAAAIGgCFgAAAABBE7AAAAAACJqABQAAAEDQBCwAAAAAgiZgAQAAABA0AQsAAACAoAlYAAAAAARNwAIAAAAgaAIWAAAAAEETsAAAAAAImoAFAAAAQNAELAAAAACCJmABAAAAEDQBCwAAAICgCVgAAAAABE3AAgAAACBoAhYAAAAAQROwAAAAAAiagAUAAABA0AQsAAAAAIImYAEAAAAQNAELAAAAgKAJWAAAAAAETcACAAAAIGgCFgAAAABBE7AAAAAACJqABQAAAEDQBCwAAAAAgiZgAQAAABA0AQsAAACAoAlYAAAAAARNwAIAAAAgaAIWAAAAAEETsAAAAAAImoAFAAAAQNAELAAAAACCJmABAAAAEDQBCwAAAICgCVgAAAAABE3AAgAAACBoAhYAAAAAQROwAAAAAAiagAUAAABA0AQsAAAAAIImYAEAAAAQNAELAAAAgKAJWAAAAAAETcACAAAAIGgCFgAAAABBE7AAAAAACJqABQAAAEDQBCwAAAAAgiZgAQAAABA0AQsAAACAoAlYAAAAAARNwAIAAAAgaAIWAAAAAEETsAAAAAAImoAFAAAAQNAELAAAAACCJmABAAAAEDQBCwAAAICgCVgAAAAABE3AAgAAACBoAhYAAAAAQROwAAAAAAiagAUAAABA0AQsAAAAAIImYAEAAAAQNAELAAAAgKAJWAAAAAAETcACAAAAIGgCFgAAAABBE7AAAAAACJqABQAAAEDQBCwAAAAAgiZgAQAAABA0AQsAAACAoAlYAAAAAARNwAIAAAAgaAIWAAAAAEETsAAAAAAImoAFAAAAQNAELAAAAACCJmABAAAAEDQBCwAAAICgCVgAAAAABE3AAgAAACBo
AhYAAAAAQROwAAAAAAiagAUAAAAAETcACAAAAIGgCFgAAAABB+98AAAD//3DcV35yu2O/AAAAAElFTkSuQmCC" + } + }, + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## RANDOM FOREST LEARNER\n", + "\n", + "### Overview\n", + "\n", + "![random_forest.png](images/random_forest.png) \n", + "Image via [src](https://cdn-images-1.medium.com/max/800/0*tG-IWcxL1jg7RkT0.png)\n", + "\n", + "#### Random Forest\n", + "\n", + "As the name of the algorithm and the image above suggest, this algorithm builds a forest out of a number of decision trees. In general, the more trees in the forest, the more robust it is and the higher the accuracy of its predictions. The main difference between a Random Forest and a single decision tree is that the choice of the root node and the splitting of the feature nodes are random. \n", + "\n", + "Let's see how the Random Forest Algorithm works: \n", + "It works in two steps, first the creation of the random forest and then the prediction. Let's first see the creation: \n", + "\n", + "The first step is to randomly select 'm' features out of the total 'n' features. From these 'm' features, find the best split point, split the node into child nodes using that split, and repeat until 'i' nodes have been created. Repeat this whole process to build each tree of the forest. \n", + "\n", + "Now, let's see how the prediction works: \n", + "Take the test features and predict the outcome with each randomly created decision tree. Count the votes for each prediction; the prediction with the most votes is the final prediction.\n", + "\n", + "\n", + "### Implementation\n", + "\n", + "Below is the implementation of the Random Forest Algorithm."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "psource(RandomForest)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This algorithm creates an ensemble of decision trees using bagging and feature bagging. It takes 'm' examples randomly from the total number of examples and then performs feature bagging with probability p to retain an attribute. Each predictor is built with the DecisionTreeLearner, and the final prediction is made by majority vote.\n", + "\n", + "\n", + "### Example\n", + "\n", + "We will now use the Random Forest to classify a sample with values: 5.1, 3.0, 1.1, 0.1." + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "['versicolor', 'setosa', 'setosa', 'setosa', 'setosa']\n", + "setosa\n" + ] + } + ], + "source": [ + "iris = DataSet(name=\"iris\")\n", + "\n", + "DTL = RandomForest(iris)\n", + "print(DTL([5.1, 3.0, 1.1, 0.1]))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As expected, the Random Forest classifies the sample as \"setosa\"." + ] + }, { "cell_type": "markdown", "metadata": {}, @@ -1122,7 +1234,7 @@ "\n", "*a)* The probability of **Class** in the dataset.\n", "\n", - "*b)* The conditional probability of each feature occuring in an item classified in **Class**.\n", + "*b)* The conditional probability of each feature occurring in an item classified in **Class**.\n", "\n", "*c)* The probabilities of each individual feature.\n", "\n", @@ -1306,7 +1418,7 @@ "source": [ "You can see the means of the features for the \"Setosa\" class and the deviations for \"Versicolor\".\n", "\n", - "The prediction function will work similarly to the Discrete algorithm.
It will multiply the probability of the class occuring with the conditional probabilities of the feature values for the class.\n", + "The prediction function will work similarly to the Discrete algorithm. It will multiply the probability of the class occurring with the conditional probabilities of the feature values for the class.\n", "\n", "Since we are using the Gaussian distribution, we will input the value for each feature into the Gaussian function, together with the mean and deviation of the feature. This will return the probability of the particular feature value for the given class. We will repeat for each class and pick the max value." ] }, @@ -1372,7 +1484,9 @@ { "cell_type": "code", "execution_count": null, - "metadata": {}, + "metadata": { + "collapsed": true + }, "outputs": [], "source": [ "psource(NaiveBayesSimple)" ] @@ -1603,6 +1717,109 @@ "The correct output is 0, which means the item belongs in the first class, \"setosa\". Note that the Perceptron algorithm is not perfect and may produce false classifications." ] }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## LINEAR LEARNER\n", + "\n", + "### Overview\n", + "\n", + "The Linear Learner is a model that assumes a linear relationship between the input variables x and the single output variable y; more specifically, that y can be calculated from a linear combination of the input variables x. The Linear Learner is quite a simple model, as its representation is just a linear equation. \n", + "\n", + "The linear equation assigns one scalar factor, called a coefficient or weight, to each input value or column. One additional coefficient is also added, giving an additional degree of freedom; it is often called the intercept or the bias coefficient. \n", + "For example: y = ax1 + bx2 + c. \n", + "\n", + "### Implementation\n", + "\n", + "Below is the implementation of the Linear Learner."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "psource(LinearLearner)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This algorithm first assigns some random weights to the input variables and then updates the weight of each variable based on the calculated error. Finally, the prediction is made with the updated weights. \n", + "\n", + "### Example\n", + "\n", + "We will now use the Linear Learner to predict the class of a sample with values: 5, 3, 1, 0.1." + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "0.2404650656510341\n" + ] + } + ], + "source": [ + "iris = DataSet(name=\"iris\")\n", + "iris.classes_to_numbers()\n", + "\n", + "linear_learner = LinearLearner(iris)\n", + "print(linear_learner([5, 3, 1, 0.1]))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## ENSEMBLE LEARNER\n", + "\n", + "### Overview\n", + "\n", + "Ensemble Learning improves the performance of our model by combining several learners. It improves the stability and predictive power of the model. Ensemble methods are meta-algorithms that combine several machine learning techniques into one predictive model in order to decrease variance, bias, or improve predictions. \n", + "\n", + "\n", + "\n", + "![ensemble_learner.jpg](images/ensemble_learner.jpg)\n", + "\n", + "\n", + "Some commonly used Ensemble Learning techniques are: \n", + "\n", + "1. Bagging : Bagging fits similar learners on small sample populations and then takes a mean of all the predictions. It helps us to reduce variance error.\n", + "\n", + "2. Boosting : Boosting is an iterative technique which adjusts the weight of an observation based on the last classification. If an observation was classified incorrectly, it increases the weight of this observation and vice versa.
It helps us to reduce bias error.\n", + "\n", + "3. Stacking : This is a very interesting way of combining models. Here we use a learner to combine the output from different learners. It can either decrease bias or variance error depending on the learners we use.\n", + "\n", + "### Implementation\n", + "\n", + "Below is the implementation of the Ensemble Learner." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "psource(EnsembleLearner)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This algorithm takes as input a list of learning algorithms, has them vote, and then returns the final prediction." + ] + }, { "cell_type": "markdown", "metadata": {}, @@ -1748,190 +1965,154 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## MNIST HANDWRITTEN DIGITS CLASSIFICATION\n", + "## AdaBoost\n", "\n", - "The MNIST database, available from [this page](http://yann.lecun.com/exdb/mnist/), is a large database of handwritten digits that is commonly used for training and testing/validating in Machine learning.\n", + "### Overview\n", "\n", - "The dataset has **60,000 training images** each of size 28x28 pixels with labels and **10,000 testing images** of size 28x28 pixels with labels.\n", + "**AdaBoost** is an algorithm which uses **ensemble learning**. In ensemble learning the hypotheses in the collection, or ensemble, vote for what the output should be and the output with the majority votes is selected as the final answer.\n", "\n", - "In this section, we will use this database to compare performances of different learning algorithms.\n", + "The AdaBoost algorithm, as mentioned in the book, works with a **weighted training set** and **weak learners** (classifiers that have about 50%+epsilon accuracy, i.e. slightly better than random guessing). It manipulates the weights attached to the examples that are shown to it.
Importance is given to the examples with higher weights.\n", "\n", - "It is estimated that humans have an error rate of about **0.2%** on this problem. Let's see how our algorithms perform!\n", + "All the examples start with equal weights and a hypothesis is generated using these examples. The weights of incorrectly classified examples are increased so that the next hypothesis can classify them correctly, while the weights of correctly classified examples are reduced. This process is repeated *K* times (here *K* is an input to the algorithm) and hence, *K* hypotheses are generated.\n", "\n", - "NOTE: We will be using external libraries to load and visualize the dataset smoothly ([numpy](http://www.numpy.org/) for loading and [matplotlib](http://matplotlib.org/) for visualization). You do not need previous experience of the libraries to follow along." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Loading MNIST digits data\n", + "These *K* hypotheses are also assigned weights according to their performance on the weighted training set. The final ensemble hypothesis is the weighted-majority combination of these *K* hypotheses.\n", "\n", - "Let's start by loading MNIST data into numpy arrays." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The function `load_MNIST()` loads MNIST data from files saved in `aima-data/MNIST`. It returns four numpy arrays that we are going to use to train and classify hand-written digits in various learning approaches." - ] - }, - { - "cell_type": "code", - "execution_count": 46, - "metadata": { - "collapsed": true - }, - "outputs": [], - "source": [ - "train_img, train_lbl, test_img, test_lbl = load_MNIST()" + "The speciality of AdaBoost is that by using weak learners and a sufficiently large *K*, a highly accurate classifier can be learned irrespective of the complexity of the function being learned or the inexpressiveness of the hypothesis space."
] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Check the shape of these NumPy arrays to make sure we have loaded the database correctly.\n", + "### Implementation\n", "\n", - "Each 28x28 pixel image is flattened to a 784x1 array and we should have 60,000 of them in training data. Similarly, we should have 10,000 of those 784x1 arrays in testing data." - ] - }, - { - "cell_type": "code", - "execution_count": 47, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Training images size: (60000, 784)\n", - "Training labels size: (60000,)\n", - "Testing images size: (10000, 784)\n", - "Training labels size: (10000,)\n" - ] - } - ], - "source": [ - "print(\"Training images size:\", train_img.shape)\n", - "print(\"Training labels size:\", train_lbl.shape)\n", - "print(\"Testing images size:\", test_img.shape)\n", - "print(\"Training labels size:\", test_lbl.shape)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Visualizing MNIST digits data\n", + "As seen in the previous section, the `PerceptronLearner` does not perform that well on the iris dataset. We'll use perceptron as the learner for the AdaBoost algorithm and try to increase the accuracy. \n", "\n", - "To get a better understanding of the dataset, let's visualize some random images for each class from training and testing datasets." 
- ] - }, - { - "cell_type": "code", - "execution_count": 48, - "metadata": {}, - "outputs": [ - { - "data": { - "image/png": "iVBORw0KGgoAAAANSUhEUgAAAzkAAAKoCAYAAABUXzFLAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAAPYQAAD2EBqD+naQAAIABJREFUeJzs3Xm8TdX/x/GXMULmIWODJFKUIgmlzIpKpZQiRSiFQmRI\npUElRQMqRIqiRJo0Skqp5Nc8q4TMNMj5/dH3s/c695x73XvuvWfY5/18PHrY7XXuOcuyzzl3r89n\nfVaBUCgUQkREREREJCAKJroDIiIiIiIieUk3OSIiIiIiEii6yRERERERkUDRTY6IiIiIiASKbnJE\nRERERCRQdJMjIiIiIiKBopscEREREREJFN3kiIiIiIhIoOgmR0REREREAkU3OY6dO3cyaNAgqlat\nSrFixWjYsCFPPvlkoruV9Hbs2MH1119PmzZtqFixIgUKFGDMmDGJ7lZKeO211+jVqxd169alRIkS\nVKtWjbPOOovVq1cnumtJbc2aNXTs2JGaNWtSvHhxypUrx0knncTs2bMT3bWUNG3aNAoUKEDJkiUT\n3ZWk9vrrr1OgQIGo/61cuTLR3UsJb7/9Nh06dKBs2bIUL16cI444gptvvjnR3Upql156aabXna69\nrH300Ud06dKFqlWrcuCBB1K3bl3GjRvH7t27E921pLdq1Sratm1LqVKlKFmyJKeeeirvvPNOoruV\nI4UT3YFkcvbZZ/P+++8zYcIE6tSpw5w5c+jevTv79u3jwgsvTHT3ktbmzZt5+OGHOfbYY+nSpQvT\npk1LdJdSxtSpU9m8eTPXXHMN9erVY+PGjUycOJGmTZuybNkyTjvttER3MSlt3bqVGjVq0L17d6pV\nq8auXbt44oknuPjii/n+++8ZOXJkoruYMtavX8+QIUOoWrUq27ZtS3R3UsKtt97KqaeeGnbu6KOP\nTlBvUsecOXO4+OKLOe+885g5cyYlS5bkm2++4Zdffkl015LaqFGj6Nu3b8T5zp07c8ABB3DCCSck\noFfJb926dTRr1owjjzySe++9lwoVKvDmm28ybtw4Vq9ezaJFixLdxaT1/vvv06JFC0488URmzZpF\nKBTijjvuoHXr1ixfvpyTTjop0V3MnpCEQqFQ6IUXXggBoTlz5oSdP+OMM0JVq1YN7d27N0E9S377\n9u0L7du3LxQKhUIbN24MAaHRo0cntlMpYsOGDRHnduzYEapcuXKodevWCehRamvSpEmoRo0aie5G\nSunUqVOoc+fOoZ49e4ZKlCiR6O4kteXLl4eA0NNPP53orqScn3/+OVSiRIlQv379Et2VQHj99ddD\nQGjkyJGJ7krSuvHGG0NA6Ouvvw47f8UVV4SA0B9//JGgniW/tm3bhipXrhzatWuXd2779u2hChUq\nhJo1a5bAnuWM0tX+59lnn6VkyZJ069Yt7Pxll13GL7/8wnvvvZegniU/C5lLzlWqVCniXMmSJalX\nrx4//fRTAnqU2ipUqEDhwgpQZ9fs2bN54403mDJlSqK7IgE3bdo0du3axQ033JDorgTC9OnTKVCg\nAL169Up0V5JWkSJFAChdunTY+TJlylCwYEGKFi2aiG6lhHfeeYdWrVpx4IEHeudKlSpFixYtWLFi\nBb/++msCe5d9usn5n7Vr13LUUUdF/IJ0zDHHeO0i8bBt2zY+/PBD6tevn+iuJL19+/axd+9eNm7c\nyJQpU1i2bJl+icqm33//nUGDBjFhwgSqV6+e6O6klP79+1O4cGEOOugg2rZty9tvv53oLiW9N998\nk3LlyvH555/TsGFDChcuTKVKlejbty/bt29PdPdSyrZt25g/fz6tW7fm0EMPTXR3klbPnj0pU6YM\n/f
r149tvv2XHjh0sXryYhx56iP79+1OiRIlEdzFp/f333xxwwAER5+3cp59+Gu8uxUQ3Of+zefNm\nypUrF3Hezm3evDneXZI01b9/f3bt2sWNN96Y6K4kvauuuooiRYpQqVIlrr32Wu677z6uvPLKRHcr\nJVx11VUceeSR9OvXL9FdSRmlS5fmmmuu4aGHHmL58uVMmjSJn376iVatWrFs2bJEdy+prV+/nt27\nd9OtWzfOP/98XnnlFYYOHcrMmTPp0KEDoVAo0V1MGXPnzmXPnj307t070V1Jaocccgjvvvsua9eu\n5fDDD+eggw6ic+fO9OzZk0mTJiW6e0mtXr16rFy5kn379nnn9u7d62U1pcrvxMrrcGSVcqV0LImH\nUaNG8cQTTzB58mSOP/74RHcn6Y0YMYLLL7+c33//neeff54BAwawa9cuhgwZkuiuJbUFCxbw/PPP\n89FHH+mzLQcaNWpEo0aNvP8/5ZRT6Nq1Kw0aNOD666+nbdu2Cexdctu3bx9//vkno0ePZtiwYQC0\natWKokWLMmjQIF599VVOP/30BPcyNUyfPp3y5cvTtWvXRHclqX3//fd07tyZypUrM3/+fCpWrMh7\n773H+PHj2blzJ9OnT090F5PWwIED6d27NwMGDODGG29k3759jB07lh9++AGAggVTI0aSGr2Mg/Ll\ny0e9M/3jjz8AokZ5RPLS2LFjGT9+PLfccgsDBgxIdHdSQs2aNWncuDEdOnRg6tSpXHHFFQwfPpyN\nGzcmumtJa+fOnfTv35+BAwdStWpVtm7dytatW/n777+B/yrX7dq1K8G9TB1lypShU6dOfPLJJ+zZ\nsyfR3Ula5cuXB4i4EWzfvj0AH374Ydz7lIo++eQTPvjgA3r06BE1nUh8w4YNY/v27SxbtoxzzjmH\nFi1aMHToUO69915mzJjBG2+8keguJq1evXoxYcIEZs2aRfXq1alZsybr1q3zJhCrVauW4B5mj25y\n/qdBgwb83//9H3v37g07b3mHKg8q+Wns2LGMGTOGMWPGMGLEiER3J2WdeOKJ7N27l2+//TbRXUla\nmzZtYsOGDUycOJGyZct6/82dO5ddu3ZRtmxZLrrookR3M6VYqpWiYpmz9a0Z2dilysxwoln04fLL\nL09wT5LfmjVrqFevXsTaGyu5rbXWWbvhhhvYtGkTn376Kd9//z0rVqxgy5YtlChRImUyTfSp8j9d\nu3Zl586dLFiwIOz8448/TtWqVWnSpEmCeiZBd/PNNzNmzBhGjhzJ6NGjE92dlLZ8+XIKFizIYYcd\nluiuJK0qVaqwfPnyiP/atm1LsWLFWL58OePHj090N1PGli1bWLx4MQ0bNqRYsWKJ7k7SOueccwBY\nunRp2PklS5YA0LRp07j3KdX89ddfzJ49mxNPPFETr9lQtWpVPvvsM3bu3Bl2/t133wVQwZVsOOCA\nAzj66KOpVasWP/74I/PmzaNPnz4UL1480V3LFq3J+Z/27dtzxhln0K9fP7Zv307t2rWZO3cuL774\nIrNnz6ZQoUKJ7mJSW7p0Kbt27WLHjh3Af5twzZ8/H4AOHTqElSEU38SJE7npppto164dHTt2jNi5\nWl/80V1xxRUcdNBBnHjiiVSuXJlNmzbx9NNPM2/ePIYOHUrFihUT3cWkVaxYMVq1ahVx/rHHHqNQ\noUJR2+Q/F154oZciWaFCBb766ismTpzIhg0beOyxxxLdvaTWpk0bOnfuzLhx49i3bx9Nmzblgw8+\nYOzYsXTq1InmzZsnuotJb+HChfzxxx+K4mTToEGD6NKlC2eccQbXXnstFSpUYOXKldx2223Uq1fP\nS5WUSGvXrmXBggU0btyYAw44gI8//pgJEyZwxBFHcPPNNye6e9mX4H16ksqOHTtCV199dahKlSqh\nokWLho455pjQ3LlzE92tlFCrVq0QEPW/7777LtHdS1otW7bMdNz0
9szcjBkzQqecckqoQoUKocKF\nC4fKlCkTatmyZWjWrFmJ7lrK0mag+3fbbbeFGjZsGCpdunSoUKFCoYoVK4a6du0aWrVqVaK7lhJ2\n794duuGGG0I1atQIFS5cOFSzZs3Q8OHDQ3/++Weiu5YSzjjjjFCJEiVC27dvT3RXUsZrr70WatOm\nTahKlSqh4sWLh+rUqRMaPHhwaNOmTYnuWlL74osvQi1atAiVK1cuVLRo0VDt2rVDI0eODO3cuTPR\nXcuRAqGQ6jaKiIiIiEhwaE2OiIiIiIgEim5yREREREQkUHSTIyIiIiIigaKbHBERERERCRTd5IiI\niIiISKDoJkdERERERAJFNzkiIiIiIhIohRPdgWgKFCiQ6C4khVi2MNLY/UdjFzuNXexyOnYat//o\nmoudxi52GrvYaexip7GLXU7HTpEcEREREREJFN3kiIiIiIhIoOgmR0REREREAkU3OSIiIiIiEii6\nyRERERERkUDRTY6IiIiIiARKUpaQFhERkfTWvn1773j69OkADB48GIC5c+cmpE8ikjoUyRERERER\nkUBRJEckCdStWxeAPn36AHDEEUd4bR07dgSgYMH/5iT27dvntX311VcA3HDDDQAsWrQo/zsrCWez\n2XfddZd37plnngGgd+/e3rmtW7fGt2NponHjxgAsW7YMgFdffdVr69mzJwB79uyJf8dSXNGiRQG4\n+uqrARg9erTXVqJECQAKFSoU/46JSEpSJEdERERERAJFNzkiIiIiIhIoSlcTyUcHHHCAd9yoUSMA\nLrjgAgAOPvhgr+3cc88FIBQKRTyHnbM0NfcxtWvXBqB58+ZA+qSrVaxY0Tu+8MILAejSpQsALVu2\n9NoKFCgA+GP2ww8/eG1Lly4F4JdffgFg4cKFXtvatWvzo9t5xq4FN3XR/v6W6gNKV8tLhQv7X5f3\n3HMPAOXKlQOgW7duXpulkP7+++8Rz/H9998D8NJLLwFKaYPw9LNJkyYBcOWVVwLw77//em0TJ04E\n4Nlnn41j7yRojj32WADOPPNM79w555wDwDHHHBPx+PXr1wPQunVrAL788sv87qLkIUVyREREREQk\nUBTJIXyh7h133LHfxz/44IMAjB071jv3999/533HUlznzp294379+gF+SVA3GmHj/+ijj8axd3mv\nRo0a3rH9fdu0aeOda9iwYZ6/5s8//wzAY489lufPnWjVqlXzjjt06ADApZdeCkCVKlW8tkMOOSTs\n59xrK2NkrGbNmt6xzRYbd5HzkCFDAH9mOV1FW+Ttzq6nAysycP/993vnmjRpkunjR4wYsd/n3Llz\nJwB9+/b1zj355JNA+o3vsGHDvOOM78lHHnnEOx46dGjc+iTB4EZf7TPdPuetyIUrWiaFfQ/Zd3m6\nRHIOO+ww73jChAmAH7F2x2nbtm2A/7vdypUr49XFbFEkR0REREREAqVAKNqta4JZHn28WI4m+LNp\nderU2e/PPfzww97xggULAHjllVfyrF+x/NPEe+xcNrtps3E9evTw2rIq+/nNN98A2Rvz7ErE2N16\n663e8fXXXx/xnFn1KePakaweY2tIANq1awfAZ599FkOPo0v0dWd/J1v3AFlfG7b2YdWqVQDccsst\nXltWa2uOO+44wF/T484i21oXW0e1v+cyOR27WMft2muvBcJLSBs3WmX55LGy2fOzzz7bO2dRyjVr\n1uTquV2Jvuay8sADDwBw1VVXRbT9+eefALzwwgveOVuTE41FHU8//fSINvt3y+k6nWQeu6xYhMze\nt67FixcD/uw75M8MeqqO3UEHHQSEz7afcMIJgP/5565btPG0tvfee89ri/VXwFQYO1vHBTBo0KCw\nPmS3//Z4Wwtm63dyI5nHrmvXrgDMnj3bO1esWDEAdu/eDcCBBx4Y8XMWwTnllFO8c+6a0byS07FT\nJEdERERERAJFNzkiIiIiIhIo
KjyAv3gbYNOmTUD2UqeuuOIK79hKAJ900kkAfP3113nZxaRVvHhx\n7/jll18G/J2psysvU60S6fjjj/eOLbRcsKA/j2ChW0uvcotVfPTRR4CfHlWpUiWvzRZI2nNWrVrV\na+vUqRMQnDEEaNasGRD9PWipLVb+GWDatGlAeBpfdrz99tuAH4p309Xs383990tH3bt3B8JTemfO\nnAn4KXOvvvpq/DsWB/Y+u+yyyyLaNm/eDPhpfG+++Wb8OpbiSpUqBYQXcjBbtmwBYODAgUB4yfd0\nc/LJJwNQtmxZ75wVXLHxOfroozP9eTdVyAq32J/nn3++1zZ//vw86nHy6NixIwCXX355po+x31cA\n7rzzTgBuv/12IDxN2Sxfvjwvu5h0LE3NPt/texH8wgNWCMQt/DNjxgwAmjZtCoSX5na3ZUiU9P4G\nFxERERGRwEnrSI7NKNmdKPizyMbdeGzdunWAvzDanbm3TeEGDBgA+Av9IG+LESSbXr16ecc5jeCY\nkSNH5lV3EspmOwDmzp0LQJ8+fbxzdk1YwQqbDY7GLS1ri79t0bK78M4WgdvC52TfxDI7Jk+eDIRH\numwR5K+//grAX3/9levXsWIY7kJJ888//wCwd+/eXL9OvLmRh/Hjx8f0HFYO3V28bOrXrw/4C3CD\nGsmZOnUqEB6tNtOnTwdg9erVce1TEFhU8MQTT4xos5n0IEZwLKpQq1Yt79xFF12U6eNtttzdUNo+\n++1zafv27V6bRbTnzZsH+IUdwC+eYdztDoLo7rvvBqBkyZIRbf379wf88QL/8/7xxx8Hokdyguis\ns87yju071iI411xzjdc2ZcoUwI8O2qbGbpttt2ARSFAkR0REREREJM/pJkdERERERAIlLdPVLE3t\n0UcfBfzF2y6rB26LsACee+45wE9VsP8HPwRtCwLPO+88r+3iiy8GgpHWYSkGTzzxBBC+AC0n3Pr1\nWe0rkUqiLUx87LHHYnquBx980DtetGgREF4gw1SvXh3w9z9ww8+pauPGjUDsqVZZqVy5sndsaQs3\n3ngjEL7TvKUBWopqKom2h0FO2eeZW+Qi3Vj6Rt26dYHwQhi2D5alOto1BMFfoBwL+84Ff8d546Zc\nxfp5mQpGjBgBhKerZccnn3ziHY8bNw4IT6PPTLT9XKywQ1Cv0RYtWgD+7yXu3jL2eW9pqNHY492f\ns2NLLYxWMCNVDRs2zDu2NDVL2bM0NMh6vxtLw7/66qsBvwgX+GmDlmaeCIrkiIiIiIhIoKRNJKdH\njx7ecbdu3YDoERxb0Gd3pW60xtiiSLes6s033wz4szXujLHNuriz7Kkwk2JljN1S2VZmN9YZXivz\n6+4Q7i4wl0g2C/Lhhx8C0RdFWslMCWcLTwcPHgyE71pfoUIFwN9JfcyYMV6bLd6V9PX0008D8Pzz\nzwPhM5Q2o25bBrjlaK3M6g033ACERyrSVZMmTbzj1q1bh7VZKXLwy+sHkWV0uH/faD799FPA/1xa\nsGCB15ad70orUONer8YKO6xZs2b/HU5BFnGwz/2dO3d6bUuWLNnvz1thB7e4j7HCSm5p5T///DP2\nziZQzZo1Ab/sM/iZDK+//jqQdfTGZUUI7Hc6t2DBrbfeCkQvwx8viuSIiIiIiEigBDKSE63kpztL\ne+ihh4a1uWViLUfR1utkl+Ux2iagbllqmwFwy0WmQiTnqKOOAmDs2LGZPsad8di6dSsQvnlZRuvX\nrwfgjTfeyIsuphWbiTruuOMS3JPEc8ujZixd3rx5c+/Y1shFK4VsZVVtI7iffvopz/uZX7Ka0XVL\n81r52bwouZ2ubLbW1uiAXxrVSsW7WQFW/v2II44AwiP4u3btyt/OJhmLKrjlejPK6XdtqnrnnXfC\n/swvbdu2BeCwww6LaMtONCOVWSlo89RTT3nHWZUlL1++POBn+URjvye6azdTlf0O4f7+9tprrw
Hh\n69BzIloULFpELN4UyRERERERkUDRTY6IiIiIiARKINPV3FK7ZcqU2e/jbQd6gHvuuSem17Q0tR07\ndgDw5JNPem220NINH1erVg3w07eSUVYpBmvXrgXCy2KvWrUK8MtLS96yAgTJEAKOB/f9MmfOHMAv\nReu2FS1aNNvP6ZbFtGIEGVMcUoEtbr/vvvsi2k499VTv2FIQbBH822+/na3nt2tt06ZNgF+kQf5j\nC5q7du0KwKhRo7w2G2v73J8wYYLXZumTQWcLm60gj/2/65tvvgHC02+tuIrknC22HzRoUESbXa+p\nulA+uzKmlNnvWS77vnBLbFtBhmiPN1aEKhW/LzJq1apVxDn7nS5oFMkREREREZFACUQkxxYh28xs\ndjfD++OPP4C83YBsw4YNQHiRAVtk37JlS++czUy755KNzfrYxqjgbyZ2/vnnA/D55597bRdccMF+\nnzOrIgZBZ+UabYa3Q4cOXpttOGblat3o4ubNm4Hw2eKMrNxokNhGYgAnnHBCpo+zEpYWdbBrFPxr\n2MbaLSFtP2dFQtyfS3ZW3tPdSNcWurvsmrMy9u6mla+88goA3377bcTPHXzwwUDWERwrsZzObObY\n3eCycOH/vlZtOwErRAB+eWD3/R1EVpDBSmxHc/jhhwPhC/FtRt3+3LNnT351MXAsamab1rrf21a+\n2qJnQbV69WoAvvvuOwCaNWvmtdm2F/feey8QfbPUaD744AMgWAUyov3d3TL4sWjQoEHEOftOTiRF\nckREREREJFACEcmx8p3nnXdepo9xN4WymaMhQ4YAsG7dunzsXXRumdtkFe3OPLeCugkZ+DNo4G/O\n6W78ZqW1o5U4t0iOlf+1zWjBLxdcpUoVIPqaHHdz1aCYNGmSd7xx48awNnezTrumspo1WrFiBRC+\nKeEdd9wB+GsC3OhrsrMIgs14g18Su0iRIt65ggX/m8cqV64cAFOnTvXabLbTohDueh137DMT5I0b\nc8Oi1WeccQYQHoVs3749EPxIjl1L0SI5Fpm29Tc2TgA33XQT4F+btjWDRGfva4jcZPW3337zjqNt\nah5k9n1qazjBX6ttbVmtbXWj1Ja1EiQZt12A8MhfTpQuXRqI/vuife8mkiI5IiIiIiISKLrJERER\nERGRQCkQSsJ6tBZOzC5bhJvVX2XAgAHesZuyEQ9WeCBailqhQoUy/blY/mlyOna55ZbvtXLS7mK/\njKpXrw74JWrzSzzHznaVnzx5sneud+/eEc+ZVZ+yE0LPyWPcx7mh9+wUh0iF6y6n7LpzU1MtZD9/\n/nwgb9IScjp2uR03SxUAOProowG46667vHO243ylSpVy9TrR2M7qVsAgN+JxzdnjLT3RFinnF3vf\nnXvuud65hQsXAn7p6byQjO/Xjz76CIBjjz0WCC+MYul7tt2CfX+7/u///g+A+vXr52s/k3HscuLd\nd9/1ji3V2dh3EORtcSWTzGP3xRdfAFC7du1M++D2374XrrvuOgBef/11ry0/SkYneuwsXbZPnz7e\nOdtewVLlrQy3y7ZkadOmjXduzJgxANStWzfi8ZYunZdyOnaK5IiIiIiISKAEovBAVjPcVh7WZpbA\nLzlts0W7du3K0evZ7GhW5VXPOuss7zjjDAtAr169cvSaycqNRGUVwQmykSNHAuH/ptGuRTtnm25Z\nKUvwZ1SsDLC7oDSz59kfm81yF+mnK1t06i6ut/LK/fv3T0if8sK2bdu8YyuocuaZZ3rnrMiFXV+2\n8B2gUaNGuXrts88+G8ibSE482GeVldG+5ZZbvLb8juqkA4skQmQExi11bhGcrD7jVNQia7YZsjvm\nxrYhmDVrVlz7lCjuInor+lGrVq39/pw7PhbBsW1Fgs6yF7p37+6ds1L3Fml1i3UZ20TU/R3ECiNZ\nxMtK6INfJj6RpcsVyRERERERkUAJRCQnq5ltm12PFjmxzZ
ERERGRoFRNulolKnco3dLVevbs6c7ts88+ALzwwgtFe526UO6xq2RpGbtPPvnEHTdv\n3hyIdgFv166da/vqq6+K/tpJhZauZmlZHTt2dG2vv/560V8vLddcJdLYJZeWscuVrvbjjz+640sv\nvRSIChxVyz45oSnH2L3xxhvuuFWrVhnPaX2yNLUhQ4a4NttjKA2UriYiIiIiIlUtyBLSUhyTJ08G\noFmzZu5c2iM4Eo4nn3zSHdsu52maUQqZLaS3XdebNGlSzu6IBM3fbd4iOS+++CIAffv2dW1Tpkwp\nbcckGHvuuac7tlLQTZs2defmzJkDREVmpk6dWsLe1R1FckREREREJChak5NiynlNTmOXnMYuuVDW\n5JSarrnkNHbJaeyS09glp7FLTmtyRERERESkqukmR0REREREgqKbHBERERERCYpuckREREREJCip\nLDwgIiIiIiKSlCI5IiIiIiISFN3kiIiIiIhIUHSTIyIiIiIiQdFNjoiIiIiIBEU3OSIiIiIiEhTd\n5IiIiIiISFB0kyMiIiIiIkHRTY6IiIiIiARFNzkiIiIiIhIU3eSIiIiIiEhQdJMjIiIiIiJB0U2O\niIiIiIgEZfFydyCbevXqlbsLqfD3338X/DMau39o7JLT2CVX6Nhp3P6hay45jV1yGrvkNHbJaeyS\nK3TsFMkREREREZGg6CZHRERERESCopscEREREREJim5yREREREQkKKksPCAiIiKVrXXr1gA8/fTT\n7tyuu+4KwNtvv12OLolIFVEkR0REREREgqJIjkjKNWjQAIDmzZsDcNRRR2U8ZurUqQDccccd7txf\nf/1Vgt6JiMSdcMIJAOy4444ArLDCCuXsjohUKUVyREREREQkKIrkLMS//vUvdzxkyBAAbrnlFgAu\nvPBC1zZ9+vTSdizFmjZtCsDXX3/tzrVr1w6Axx57rCx9qjQjR450xzvttBMAG2200UJ/7uWXX3bH\nH3/8cfE7JlVhhx12AKJ1Ez/99FM5u5M6G2+8MQAdO3YEYJVVVnFtp5xyCpB707qePXsCMHbs2Lrq\nYskdd9xx7ti+Gxs1agTAGWec4dree++90nZMRKqWIjkiIiIiIhIU3eSIiIiIiEhQlK5WwzrrrAPA\no48+Gvs7RAu5Dz30UAB2331317bXXnsB8O6775akn2lkaWoPP/wwEE/X2H///QGlq/mWX355d/zv\nf/8bgBVXXBGAzp07u7Z69erl/ZznnHOOO+7Ro8ci9rA81ltvPQDef/99d+7//u+f+ZhsxRTuvvtu\nAFZaaaXY37Pxx/Kmm24ClIplxowZ446POOIIACZNmgTAfvvt59p++eWX0nasTJZcckkAunfvDkQp\nfABdunQBYNlll834uVwFP+yaXnrppYvWz3Jr06YNAJdeeqk7Z8VSxo8fD8Dw4cNd24IFC0rYO6lG\nDRs2BKBbt27u3NChQ2NtufjfEx988AEAHTp0AODDDz8sWj+l7imSIyIiIiIiQan3d67VkWVSyMx1\nMay77rru+JFHHsk4l4+vvvoKgLZt2wLwzjvvLHK/kvzXlHrsfFtssQUAr7zySkZfttxySwCmTJlS\nkr6keexsnKyQBcA+++xT9Nex6Eehyj12FmGxGXT/+XP1rZDHANx8881A9pLcSRU6duV8v1oUwqJ/\ntnEjRJ9jpmvXru7YZueLqdzXnLEIBEQzt7fddlui57KIjkW2AU466SQAPv3006RdzFDusbNCC8OG\nDXPnbPbbPtdmzpxZtNcrpnKPXSUr99hZNLRFixbu3BVXXAFA48aNgfhnWj4+//xzAFZfffWMNisu\n1bJly8I7W0O5x66SFTp2iuSIiIiIiEhQqnpNjuX++zNtNSM4FpWAqFzouHHjANh7771dW7NmzYBo\nE7Q+ffrUQY8rg91p+2W1VWI7uraefvppIL/cYIDffvsNgNGjRwNw4403ujYr
Z26zSyGUZ23SpElJ\nXsdmmS3K+Nprr5XkdcvJXwfWr18/AE477bSF/ly2mc2QWFTLZoIh95q2N954A4B58+ZltFnkxx4z\nefLkYnUzlWxNju+AAw4A0hvBKafzzjvPHa+99toZ7bYO2DZSzTcybY+75JJLgPgaqe+++y55h1Nk\n6623dsd9+/YFojXS+bruuuuA7Nk2zzzzDBDPsrD1xOuvvz4QrceDuolql5tFyDbddFN37sADDwRg\nww03BGCrrbZybbYW9osvvgCgV69ers0yo8pJkRwREREREQmKbnJERERERCQoVV14wFLRbCG478EH\nHwTiYUtLZ7GQ3UMPPeTa1lprLSBabGo7WkO0wLlQlbY4zcbl1VdfBeC+++5zbYcffnhJ+5LGsbvq\nqquAKKUxFz/Fxcpgzp49G4A111wz43GrrbYaEIWVAe69995E/Sz32FkhgOuvvz7j+YtZeMAed8MN\nNwDxMHtSaS08sNxyywHRNQjxwg4QLz9uj+/fvz8QT8taZZVVAPjhhx+K1r9yX3M2LrnSjP33pBUl\n+PXXX4vWh6TKMXbHHnusOx45cmTGc9rn0ZdffrlIr1PXyjF29v0IsPnmmy/0dQr9PDN+kR8/vahY\nSjl2lqZm6WSQXxn2wYMHu+P//Oc/AMyfPx/IXep9scUWc8e2LYgtbxgxYoRrO/nkkxfah2zK/XmX\njX3mX3nllUDm90Ntfan5b3nhhRfc8U477VTMLmZ9vYVRJEdERERERIJSNYUH7C4cohnbbAsmbbb8\n1FNPBbJv/GSL6P073eeffx6IyvaeddZZri1pJKfS2LjYnxbZkX/kKjQwbdo0AM4++2wgvmlqtsXN\ntbFF9JA8klNuY8eOjf0JUYQq10af2Rx33HFANPPsL6Y0dTHblBbLLLMMEH2e+Z9Ztinj7bffDsAF\nF1zg2vwINkQbY0KYpUxbtWq10MfYQnCIFtumIZJTDtttt507tuvhzjvvdOe+/fbbkvepUpx44onu\n2Ao0/Pnnn+5cIZ9xu+22mzv2S3hD9LtMJbPf22xM8t1E14ov+EV65s6dm/fr+hvWfvzxx7Gft0hH\nCCziCvDss88CUeGL33//3bVZVo4Va7jrrrtc2worrABE3x9+8S77/vnjjz+K3fW8KZIjIiIiIiJB\n0U2OiIiIiIgEpWrS1XbZZRd3bKkb2RxzzDFA9jS1mn788cda21ZeeWV3bCHXfJ6zklkBB1tMefzx\nx5ezO6lj+93YdTNmzBjX9uSTTwLxNLXa+HudLL74P29hCwf7eyOEpNA0NfPoo48CcOGFFxazOxXj\n4osvBuCkk07KaLMx8fftqCZ+Oq0VjsmXpW/YXmlpX2BfbNkWJb/88svu2BZ3SyZ/nPzjJHL9LmP7\nNFWyn376CYBPPvkEgFVXXTWvn7M05aeeesqdmzVrVq2Pb9GiBRD9/vfcc8+5tnbt2hXQ48oydOhQ\nd2xpZpaS5n9n+AUfatO+fXsA9thjD3fOxtWuU/+75qOPPkrY68IokiMiIiIiIkEJPpJjM3RWCtVn\nd/n+TucTJ07M+7ltdgGiBYA33XQTAGussYZrs1LTdlcbuhRWJU8FK0bx73//G4DPPvusoJ+3CNmE\nCRPcOYsY2nPmii5WC7/4gkW2GjduXOvjbQfsUPhFT/xFzhAVtoAoylOtPvjgA3dss7zNmzfP62db\ntmwJwMyZMwH4/PPPXZvNaL7//vvF6GbFqJYCO2niRzasAIQV/gnhc23OnDlAFDFdYoklXNvpp58O\nwPrrr+/O7b///gA0atQIgAceeMC12ZYhhx56KBAvzGCR2Q022ACIf25aUZIQM3H8AiLff/89APvs\nsw8AX3zxRUHP9b///Q+IFxmwiJgVXfr6669d22mnnZagx4VTJEdERERERIISfCTn/vvvB+L513YH\nf/TRRwPxWbhC+CUJLQJk0SHb0BCiknz+
hpjjxo1L9JqVwGaU/NK8o0aNKld3UsOiLEmjLTbzYZsx\n+vz1PdXK1t2dcsop7lyu8tBW8rJSS23XZBvT+XnPVtLeSn760Ztcm+FVA//fb6V8/bLlFpHJxdbE\n+Wt6bDsBm2n2nzNEtl7u559/LtpzWqTMX39oEQqVp4Y999wTiK9/sAwKi/SHtE7MogN+lMCPShvL\nqLHNPPfbbz/X1rt3byAaH/868jNvIF7S2yIUobPf2wrdIsDKetvWLJdffnmtz+2vVS8VRXJERERE\nRCQouskREREREZGgBJmudvDBB7tjP03NWGpP0jS1QlnKSLNmzUryeuVmYfNsYy/5sTQYgOHDhwNR\nioJv9OjRQOGLBCudn4ZgqQa2E3WuNKy33nrLHY8YMQKo7PSXfv36ueMBAwYAUaoGwPjx4wE44YQT\nAKWo1caugb59+7pzU6ZMAaJyyWuvvXZez2UpVueccw4AkyZNcm0hLl6279FCy0ZbMRBb6AxRmrdt\nR7Diiiu6tjfffBOIFi/76XEXXXQRAFOnTi2oD5Uq16JtP9Wq2tQsdWxbM0D0nWppVdnY2B100EHu\nnBUXCZ19blmBBv+9VPN9tcwyy7jjjh07AtGyjGyFpyxF/9xzzy1ij/OjSI6IiIiIiAQlqEjO6quv\nDsQXpNnd+8MPP+zOWbndupCt8MCCBQuA0kWOyq3QhWsS6dq1KxCVzATo0aNH7DFfffWVO7YZzGqZ\nvWvatCkAgwYNcuds1teiFLlKmI8cOdIdV3IEZ9NNNwVg4MCB7lyTJk2AaCYOog15rTxoMVj0KFe5\n5datW7vjzTbbDKiMzwV/1tZmHb/55hsANtlkE9eWazbYWITRNqSFKGoRUkTHrjs/+pxrsbZFcG64\n4QYgKvoAud+7bdq0if3dv54ssmuFhiAqA2z/f9Ui6cbJoXvkkUeA/CI5/iaiIbNiNRB9p9oWDP7v\nIP4xxN97ud6z9jlgvw9//PHHi9jjwimSIyIiIiIiQQkqkmObPG288cYZbbYpINTtrLdf0tH8/vvv\nANx222119rppos1A87Puuuu6Y5tFOfLII4FoHZfPNpX181qrJV/YZmptnCx6U6hKn+W0CI5Fpm0W\nHaIITocOHdw5ey/aJnfbbruta/M30YN4xNA2b8vG32y0Nv76DD+KXomuvvpqAHr27OnO2RivtNJK\nQHyTwpr8tTz2HrafDyGiY2uWrCQ75N4I9dprrwWgc+fOAHz66aeu7ZprrgHiG23XxjZ+BGjXrh0Q\nX0vx6quvAvHv/kpn63rbtm2b0fbYY48BMG3atJL2Kc38bTtsXaJE/MinfX9YJMeyACC+6SzE1xn2\n6dOn1ue3rQv81yk1RXJERERERCQouskREREREZGgBJWuls13330HwHvvvVenr2OLIq3MrxUbALji\niivq9LXTxhal+SFNiVLQ6tevD8DNN9/s2rbbbrtaf84WyJ955pkAvPPOO3XVxdSyRdxJ09TM448/\n7o732msvoLIKEFiqT7Zy9FZit0uXLu7cscceC0SL/5PyP88mT54MRJ+pfrrRc889B8C8efPcOSvF\nXOls6wH/2FKiNt9887yeY7311gOigiH+dgeV5IcffnDHlrL3wAMPuHOWFuk/zlgBB0ursvchwJw5\nc/Luw7PPPuuOreiIpawD/PTTT3k/V6Ww97GloVZCMY9yWG655YB4AaitttqqXN2pCF9++SUAEyZM\niP3pW3nllYHcKaCWogbQu3fvYnYxEUVyREREREQkKEFFcqz8rs8WMhYyQ5Qvv6TlfffdB0RlVadP\nn+7ayrEBUjnZLNNGG21U5p6Uj82wtWjRwp2zzQEPOeQQIHeBBj+6sPvuuwPVGcEx+Wz0aZGyXI+x\nhfsQlbKtpEjONttsU2ubRW1y8WfWn3/++VjbuHHj3LF9btpmjFZ+FWDffffNr7NVwN7LfpnofDcN\nrWRD
hgxxx/be9AtZWHQnWyTHzJ49Gyj8u9lKVdvrQvTd72+KPGrUqIKetxKcccYZtbZdddVVJexJ\nOlmE27Ikdtlll4zHWLTZL7Ry/fXXA9F37XXXXefa+vfvD8Bvv/1W/A5XCNuexTIh/N/t7PeYO++8\nE4Bu3bqVuHe5KZIjIiIiIiJBCSqS07Jly4xzxZzdWGqppQA4/fTTgXhJUVszYLPCfjnNamNRDL+8\nbYgaNWoEwDrrrJPRduqppwJReVVfPiW27echvvlntXr77beB7NHBd999F4hK0u68886uzWbojB/l\nufzyywE4+uijgbqJ9hablfXMlev82muvueNLLrkEiDYD9Tdp/Pnnn2t9jssuu2yR+lktrAS0/z5/\n4YUXFvpzVhLdNqwEGDFiRJF7V3esrDbA9ttvD0RRLYB7770XiKIt/vrDxRZbDIi+r/1S+h999FHs\ndfwNRpdcckkg+gzwZ4x//fVXIColHSpba2L8yFU+113obrnlFgB22223jDYra2+fibNmzXJtRxxx\nBAD33HMPAMccc4xrs6i2rXmsFhaNhWhduWWm+L/D3HrrrUB8zNJEkRwREREREQmKbnJERERERCQo\nQaWrFZOlpvmlV62Eb6dOnYD4okoLy1sYP4SdrJPKJx2r0ljZZ0uJAujXrx+Qf/nYQtx0003u2Er1\nfvbZZ7U+3hZKbrjhhu7cxRdfXPR+lUvHjh0BOOCAA4D4NWapMVbK2E/DsgXz2dJY7Jz9/9mO4Wlm\n/8/+wti6YCkd/uJuqd3cuXMLerylYR144IHuXCWlq/msnKztmA4wePBgIEr1sXQgyEw5tXLaEJWx\nNVb4AqK0NitP7aee3n777UCU1hqShg0bumNLkbaU8NVWW821Lb300kCYpbNz2XXXXd1xrq0Y7Pva\nLxJivv76awA+//zzjDZLa86WAhci+2yyNDSICjIYvxBN3759gcI/A0tFkRwREREREQlK8JEcm4G3\nTUGzsagNRLMmVva5T58+GY+fMWMGAB06dHDnai6YrGY2y7Tmmmu6c7aJ4+uvv16WPiW19dZbA9EC\n9latWpW8Dzbzmaskd9u2bTPOhRTJsSjN8OHDF/pYv4CAzd6FviC52PxNU6uJX7zGSnLnKtttGjRo\nUNDrWIGaEN6jU6dOjf0J0WawVpTA36DW2pZYYomMtlwsemuz7ueff75rs01yQ+QXtllrrbWAaCzs\ndxGIii9UG//3MItmZXP33XfX2rb88ssD0LRp0+J1rEJZcZua0RuINvq0xwD8+OOPpelYQorkiIiI\niIhIUIKP5EyYMAGAYcOGuXM2M2KbAfozJZb7bxYsWOCObdbE1uR8/PHHddDjyjVo0CAgmmXy86mt\nvGClRXJsRttyodPGctFtY8ds+cYSRRdtw1CIcvqtTSK27qFaWGlUP4JlGwsWk63jPOywwwB44okn\niv4aaWCRFfvz8MMPz3iMbcGwyiqrZLTtscceADz11FPunGVjhBy1KZRfyruaN6usjUUeIHfJfLum\nsm0eWi323ntvIFpj57O1dZYZ4W9FkHaK5IiIiIiISFB0kyMiIiIiIkEJKl3NSsD6KWe2UHzs2LF5\nPYelsHzwwQcAXHDBBa7ttttuK0o/Q5ctNahSjRw5EihuWWxbAOmXu1x11VVjjxk6dKg7tmvSUohe\neukl12a7OFfCotMtt9zSHffu3RuIdph+7rnnXNvvv/++SK/jpxycccYZQPT/55edtXMhljxfVNOm\nTQOisuV+eeAQLbvsskC0ALkY7D35xx9/uHPdunUD4Nlnny3a61SqMWPG1Nrmf+9K7dK+6LsULFUb\n4MQTTwRgySWXBGCZZZZxbVZIJJs999yz1rZnnnlmUbuYWv6WE7ZthY2dFd8CGDJkSGk7VkSV/1uo\niIiIiIiIp97fKZzGTLoQ2MpVDhw40J1r3779Qn9u+vTp7thmkNIQtU
nyX1PORdRWMvrll18G4uPa\nv39/AKZMmVKSvlTa2KVJscfOIjgPPvigO9ekSZPYYyZOnOiObfM1/5zNWFqp3myLlW3jyh133NGd\nsxLy2fppM+1WgnTy5Mm1/hvyVejY6Zr7R1rer/vtt587Xn/99WNtO+ywgzved999geharbmJJcDT\nTz8N1P1nXlrGrhJVwti1adPGHb/xxhuxtiOPPNIdjxs3rmR9gnSOnY2HbTVgxaXyZRFsv1CGFZwq\n5maXaRm7t956yx1b+fw777wTgO7du7s2vwBXuRU6dorkiIiIiIhIUHSTIyIiIiIiQQkqXS00aQlp\nViKNXXLFHrvrr78egKOOOqqg5/TT1SysvsYaawDRXlXZ+pCr/34/be+mfIuS5EPpasno/Zqcxi65\nShg7P12tZupjjx493LHS1SLt2rUD4gV9LrvsMiBaYN+8eXPX9uKLLwIwfvx4AGbOnFmn/Sv32Nl1\nY9/NAHPmzAFg4403BtJb1ELpaiIiIiIiUtUUyUmxct/tVzKNXXIau+QUyUlG11xyGrvkKmHs/GiE\nRXKaNm0KKJJTqco9dpYlsfbaa7tzXbp0AeJbVKSRIjkiIiIiIlLVgtoMVERERCQUs2fPdse2hsIi\nOB9//HE5uiQVaIkllnDHiy/+z6/+fln8WbNmlbxPpaBIjoiIiIiIBEU3OSIiIiIiEhQVHkixci9O\nq2Qau+Q0dsmp8EAyuuaS09glp7FLTmOXnMYuORUeEBERERGRqpbKSI6IiIiIiEhSiuSIiIiIiEhQ\ndJMjIiIiIiJB0U2OiIiIiIgERTc5IiIiIiISFN3kiIiIiIhIUHSTIyIiIiIiQdFNjoiIiIiIBEU3\nOSIiIiIiEhTd5IiIiIiISFB0kyMiIiIiIkHRTY6IiIiIiARFNzkiIiIiIhKUxcvdgWzq1atX7i6k\nwt9//13wz2js/qGxS05jl1yhY6dx+4euueQ0dslp7JLT2CWnsUuu0LFTJEdERERERIKimxwRERER\nEQmKbnJERERERCQouskREREREZGg6CZHRERERESCopscEREREREJSipLSIuIiEgYllhiCXe89dZb\nA9CnTx8AJk6c6NpefPFFAKZNm1bC3olIqBTJERERERGRoNT7O8muRHVMmx79o9I2jGrdujUAb7zx\nBgD/93/RPfT5558PwLhx4wD48MMP67QvaRy78847D4Bzzz03o23w4MEA7LLLLgt9nueeey7jOYsp\njWNXKbQZaDK65pKrhLE7+eST3fGwYcNqfdwLL7wAwE477VTnfYLKGLu0qoSxO+2009xxp06dALj6\n6qsBuPPOO0vaF18ljF1aaTNQERERERGparrJERERERGRoFR1ulq2VJ9sqUTGUoqeffbZ2J91pdJC\nmp07dwZg/PjxGX2xf8thhx0GwO23316nfUnj2NXFW2233XYDinstlmPsGjZs6I7tfdmhQwd3bv31\n14893k+F/Ouvv2p93kmTJgHw+uuvAzBjxgzX9tBDDwHw2WefJex1pmpKV1trrbUAGDt2rDtn12Oh\n0vh+rRRpHruTTjoJgCFDhrhzyy67bK2PV7pa5Ujj2G2//fZAdN0dfPDBGY+ZP38+AG3btnXn/BTw\nUkjj2FUKpauJiIiIiEhVq5oS0s8884w73nXXXRM9h0V57E+L7EDdLACvNM2aNVvoY5ZZZpkS9KT8\n7BrLFRksBruuK3WWZ9tttwXgmmuucefatGkDxGdsas7e+NGbXDM7NiO84447ZrSNGDECgLvvvhuA\n4cOHu7aXXnopv39AFbNrPOnnqcStuuqqAMyePbvMPUlmvfXWc8dWCrpBgwYALLXUUhmP/+abbwDo\n2rWrOzd16tS67GLq2PhYFgREn1UbbbQREI9q2Wedfd77n3127r333gPg4osvdm12ziLaIbExBDj7\n7LMB2HvvvWt9vJUz98uaS6bmzZ
u744MOOgiALl26ALDddttlPN4Kipx66qkl6F3+FMkREREREZGg\nBBnJ8aMqizqT7kdraj6X/3cr/Zs0Jz0E3bp1W+hjhg4dCsTz+EPkRw4LYddbXUeA0mKPPfYAovLj\n5WCzVE2bNnXnbMbqu+++K0ufKsF+++1X7i6UxKWXXgrAgw8+6M7lyuG3aPUGG2wAwO677+7aLFqz\n+eabA/GoTceOHQH4888/3Tmbqf/1118B6N+/v2vz+5MG+++/vzteYYUVan3cp59+CsDxxx8PxDcD\nDZl9vgwcONCds4hDixYt3LmaUZpcEe1sUWx7rptuuinjcfa5du+99yb8V6SPnxNt/JsAACAASURB\nVEGSK4Ij+bHsCr/Eth/VgShSC9Ea7GzRnZo/X8z1r/lSJEdERERERIKimxwREREREQlKUOlqdbHY\n2099szK92VKR7LWtrZrT1qpVoSlquQpX+H/PtbC+rsuY1xVL2+nZs2dGmy3CnTJlStFez8pRt2/f\nPuOc2Xnnnd2xlaxWulqm5ZZbDojSk2bOnFnG3tSdAQMGANFCWj8d19IsLf3iggsucG12XdUsew7Z\nF4zX1LhxY3dsj1txxRWBdKer2Xhl88svv7jjY489FoCnnnqqzvuUJvXr1wege/fu7txKK60EwLvv\nvuvOXXHFFUD0OThq1Khan9NPO7My3YMGDQKyF6OxIgYhpKstvfTSABx33HG1Pubmm292x506dQJg\n+eWXr9uOVajLLrsMiFIa/eI7a6yxRq0/Z6lofgpbzbZZs2ZlPE+pUtcUyRERERERkaAEFcnJdybd\nZtBttjzfzYVs1tyiNLkiOv5MfMjlpf3yjdnKhFaDQkvp5rOBZ77XjB8NqiS24HratGkAvPnmm67t\ngQceKPrrPf/88wCcc8457pzNdNqfftRm3rx5Re9DGvjXVdJiKTUj5TbzHAKbxYTMUqgWfYRo08F+\n/foB0KpVK9eWdNNfm7G/5557Ev18uTRq1AjIXcbeykVD9UVwzJw5cwBo165dRtv06dPd8e+//w7k\njuBkM3r0aCCKlPmFVOya9F+n0m266aZA9pLFc+fOBaBXr17u3EcffQQokuPziwvYZ59Fi/0tFXKx\niEy2yIwV9TGrrbZaxs/VNUVyREREROT/27v3eKvm/I/jL1Nuo35UaKjcxiUal5BSiVESya3GLaM8\nXMatiJhucqk0ZCiXcp1C7mlyTTJIGXId10lIojEhuiCM+P3h8fmu795nnd3e6+xz9trf/X7+Y1lr\nn72/fc/ae5+1Pp/v5yMSFF3kiIiIiIhIUIJIV8snTShXKN1P+bEUjlxpQPks9o4rfhBi2tqWW27p\ntv3weCUpZpqasfNwTcq18IDxe2vUhq5duwJw8803A9C0aVN3LDut6IwzznDbxSx6kAbnnHMOkPm5\nVMi54/+e7LmWLVsGwKRJk2o+wBJr27YtADfccIPbl53Wcs8997jtPn36AJlpaibXd439nPGLB3z1\n1VcFjLi0tt12W7c9efJkIHcakPUAKgb7zrF0JYjSS6dPn1601yk2S0Orrc8W6zhv38P+eTh79myg\n8BS4NMvVE2fUqFFAuGnHNWVpatYTB+Doo4/OOJaU31PH0nktNc0vZlBXFMkREREREZGgBB/JyWdh\ndtIIi3+nxIoQ5HtXPxStWrVy2/6iskoSd/7E3SXP5855oUUMpCq/uIBFLnItBreiBIWWAC8HFoGJ\nW0Saz2fjVlttVe3P2106i+iUG4veQBRRsfLYEJ0zFsHp37+/OzZlypSMx3z99dfumC1wfuCBB4Bo\nQTjAkiVLivcPKKEjjzzSbbdp06ZOXtM+Z/faay8g806+RUnOOusst++2226rk3GVUsuWLd22vdft\nnPziiy/cMYtkh8TKuPsskjd69Oi6Hk7qWYloiIoM3H///W5fTSM4ca9jUR2LMpaCIjkiIiIiIhKU
\nICI5udYv1NU6mFmzZgHxd+DtbnKIa3Lq168fu12JivH7zaeRbbmWja4tlu8/bNgwAFq3bp3Xz9ka\nnLvvvhvIbFhYzmzNDFSNwFj0BfKLLFrTQIvoQLQGp9zX4lx33XVuu3HjxlWOn3766UB0t9NfM2N3\nPS3X3H/vL1iwoOhjFTjzzDOB+N+VNdo87LDD3L5KiOT07NnTbWevBfPX/tx55511Nqba5GeLxK0B\ntvYDq1evXuNz+aXzrXGvNVKNc8UVVwCZZbhnzpy5xtcpNYum7L333m6fNe6MK7+d9PnvvffeKq9j\nxo4dW+PXSUqRHBERERERCYouckREREREJChB5BeVyyJtP6UhlNS1Y445ptRDCEI+BQcsvSiUcwei\nMrB+WpWln/rFAmzxt3Wd91NUcxUVsBSOpUuXApnpCP6C8BBYmlpckYATTzwRyD/FzNLU7Pfzr3/9\nyx0LJV1yjz32cNt2DllKCkSFA+JKO0+YMCHjv5XGT43KVTI7qebNmwNRKilAkyZNqh2D/f46derk\n9h177LFVniMUVnBg0KBBbp/Ngf3X3sMh2X333d32DjvsUOW4zYu1DujQoYM75rcPABg8eHBBrz1u\n3DgA3nzzTbevS5cuAHz++ecFPVddsvQxe09BNC+Wbluoo446ym1feeWVQJS25hczMElfpxgUyRER\nERERkaAEEclJA7vLns/C8ZD4d9Sz7+j96lfRNfRPP/0U+xj5RaUWHLAFn/vss4/bl31HEqJGZdmP\nyd6ujhUGefbZZ5MPNuXiIjgWgbHCAX5Tz2nTpmUcO/vss90xe5yVh/bPvYULFxZtzGlhdxr9SE45\nNeesa4W+//Jld5stQta+fftqXyfudf33d4gRHGNRWyu4AFW/W0P8rDv44INzHrfPMP+zrDpW6h2i\nojM//vgjkBnxt0jI0KFDgcwGwEcccQRQfk1W84ms+I1CreS0zYVfXMAiNwMHDgSgV69e7pgVOCgl\nRXJERERERCQoZRvJSdu6hHzKsYbIIjRQ9c5a3LERI0bUzcDKgH8O57MWJ8RzzO4orVq1yu3z704W\ni91x8yOPts+agZY7W3fjR3RsTY3912cRGWuA6TfCNLaGx6I+ofryyy8BRW9KzaIvfgRHqrLPrrho\n1qhRo4DMUsehsHVyEH3erbPOOgU9h0W8rEQ8wJNPPlnt4+1Y3759Adh6663dMYsYlUMkx9bMACxa\ntCjn8WzZkR8/s8LK6Vvkx48AFaNEdU0pkiMiIiIiIkHRRY6IiIiIiASlbNPV0iZt6XNp9emnn5Z6\nCCVn50q+RSr8zsyheeGFFwDYfvvt3b569eqt8ecaNGjgtk866aSMY6eccorbbtiwYcYxv1v6o48+\nCsDDDz8MZKZSzp8/f41jSBtLLfPTGq2AgHWCt0ICAK+//joQfx5aetqAAQNqY6ip4KdIWgGMtm3b\nun1z586t8zFVkk033RSIOqVDVNq20GIGr732GgB9+vQp0ujS6dRTTwVgk002ATLnydKwQk4t9dPK\njj/+eCBqK7AmVjxlww03LP7AUszSyaxYBWQWDqiOXwr6qquuAqLv6zhWnMB/TClLRxtFckRERERE\nJCjBR3JsQXcaFm0r2lPZConghBy9iVOTCN/5559f7f9bJMIW6vrN4SzK07t3bwC+/fZbd6x///4A\nfP/994nHVSp+ieexY8dm/Nfn39mDzIaftqg3ZHHltJ9++mm3z96vfllp+YXfENEWMW+xxRbVPr57\n9+5u295TVibab+BZiNmzZ7ttWwhtpYBDYlEbiKLUcWX2rflniAUH4kyZMqWgx9v3QqVFckx2GwaA\nZs2aue1cUZpcrNDAueeem/HftFAkR0REREREgrLWz8Xs5FUkhTaMzPVPsDzM2oii+GV//TuAxRpD\nkl9NXTXbbNmyJQAvvvii25dd+tcfy9KlS4GoUVRtNyorxdz5
v18rVZyrNHScYp6vdk76Y7AIUa7I\nZprPu6SaNGkCwODBg90+i2TY2P1/d8eOHYHC724VOnd1PW/W+BOi88NKR1u0C+o+8l3qc87WOowZ\nM8bts0ifRRn9Y7feeiuQjshBqefOomA9evSo9jHfffed27aS7Z07d67yOGsg7bcfyGYRnJNPPtnt\ne//99wsYcaTUc5ePG264wW1bJCfuMyuftYzFVA5z57P1IRa96Nq1qzuWq4S0seahfgnpnj17AlEU\nLV/lNne5WLaErdvx1/skjQ7lUujcKZIjIiIiIiJB0UWOiIiIiIgEJYjCA5aCE5cyZou8ayNdLd8S\nwGkoelBsJ5xwApB/d/p33nkHqP00tVIotCR0LpbmVgxxqXL2Hklr6Lu2WLpkpXa0tzQ1P63C9lmK\nZIifU/myjuX+++Lyyy8HYLPNNgPgr3/9qzvWrVs3ICqbmoa0tVI588wzAdhnn33cPkuBNOutt57b\njktTM7lSUaxMtC2gXrJkSeGDLUN+Gmn2/IwaNaquh1NxrDBNixYtgMwU/RkzZpRkTGli6WrPP/88\nUDspajWhSI6IiIiIiAQliEiO3YG0/+a6gw01L89rz59rUbn/GpV8h7QSFCOCY+yc8u/Y2fkza9as\nNf58vpGgtJRWtyagfoTBooSvvPJK0V7HFuwOHTq02sf4ZaxDa1o7ceJEAHbbbTe3z5qHqrR95MYb\nb3TbM2fOBKJmlbvvvrs7dsABBwBRA9nsctyVZPHixUDm55M1n03qvffeA6IMAIgafVZK1MwWcvsl\npO17wSLTt9xyS90PLGBW+ML/TLSmo/Xr//Lnsn9O+m0HKslRRx3lti3CNXDgwFINJydFckRERERE\nJChBRHJM3NqcuKiL3Q2xu9h+1CXXXc18ygJnR5VEaiqfyGGhSn1+2noQe6/+5je/cccaNGiQ6Dlt\nfdgFF1zg9g0fPhzIneu/YsUKAE477TS376OPPko0hrS5+uqrgejcsXK/UBkNP2tiwYIFAHz44YdA\nZiTHWAnZSo7kGL+k8w8//ABEa5byZe87K+kd4hrONbH2DLYWx//ssm1bB/HFF1/U8ejCtPbaawPR\nd8ewYcOqPMayDdLW7LIU/M87W4tz3333lWo4OSmSIyIiIiIiQdFFjoiIiIiIBGWtn5O0Xq1lxew8\nX8xF4fkoZmneNHfF3XLLLQG455573L6ddtoJgA022KDKWKxLdTFTrnKpy7mzlKu4f5ufFpZd8KKu\nzlN/DPmUC66LuWvXrh0Ac+bMqXLszTffBGD+/PkFPWfz5s0BaNu2bZVxxf2brLiAlbQt9PXiFDp3\ntfF+Pfzww922pVgsXLgQgNatW7tjy5YtK/prJ1Xqz7qGDRsCme/RCRMmALD55psD8WO0Rfe2+LYU\nSj13cZo2bQpE3eH974nsufLTRC315a233qrV8Zk0zp0VcOjYsSMQLYaHaNF7q1atanUM+Ujj3OVi\nhRws1eqJJ55wx5YvXw7Ep1c+8sgjQFRKuhiFL8pt7oylSdpcQpS+Z6nRta3QuVMkR0REREREghJk\nJCdObdw1t7vi2c9fLOV2tW93kM8++2wAOnXq5I6FHMkx/r8t6cJ+iwrFlYu289Z/7uzHxb1uoWOp\ni7mz0tEWyWncuHGV58o1Dv/18nmcLdD1SwTfeuutQHGLDJQykmMNGK1por/PIjgW0UmbUrxfL730\nUrd9+umnA5nnYfbr+GN8/fXXgagk+fTp02s0lpoot++JNEnL3Pml7fv37w9AkyZNgMxo92WXXQZk\nRiFKJS1zly/LNLHS8Nbk17d69WoAhgwZ4vaNGzcOiIppFEO5zZ1ZtGgRkNnw0y8nXRcUyRERERER\nkYqmixwREREREQlKxaSrxYnrP2K9cOLSheq6B065hjTTQHOXXF3OXYcOHYDMxfK2kLEY6WojR44E\nYPz48QB89tlnicaZr1Km
q02cOBGAvn37un3Wa8Pvj5NGpXi/WrEBiNIvGjVqVOVxzz33HABjxoxx\n+6w4xqpVq2o0hmLQZ11ypZ67PffcE4C5c+e6fVZo4KeffgIyiwzMmzevaK9dU6Weu3JWbnOXXXDA\nLzxw3nnn1elYlK4mIiIiIiIVraIjOWlXblf7aaK5S05zl1waSkiXI51zyWnukiv13O2xxx5AZiTH\nnn/q1KlAfFnjNCj13JWzcpu7++67D4jaNLRv375kY1EkR0REREREKpoiOSlWblf7aaK5S05zl5wi\nOcnonEtOc5ec5i45zV1ymrvkFMkREREREZGKposcEREREREJii5yREREREQkKLrIERERERGRoKSy\n8ICIiIiIiEhSiuSIiIiIiEhQdJEjIiIiIiJB0UWOiIiIiIgERRc5IiIiIiISFF3kiIiIiIhIUHSR\nIyIiIiIiQdFFjoiIiIiIBEUXOSIiIiIiEhRd5IiIiIiISFB0kSMiIiIiIkHRRY6IiIiIiARFFzki\nIiIiIhKU+qUeQJy11lqr1ENIhZ9//rngn9Hc/UJzl5zmLrlC507z9gudc8lp7pLT3CWnuUtOc5dc\noXOnSI6IiIiIiARFFzkiIiIiIhIUXeSIiIiIiEhQUrkmR0RERMIwbtw4t92/f38AhgwZAsDo0aNL\nMiYRCZ8iOSIiIiIiEpS1fk5S5qGWqYrEL1SBI7m0zF3btm3d9tChQwE45JBDqn38vHnzAOjcubPb\n9+mnnxZ9XLmkZe7KkaqrJaNzLrk0z12jRo0AWLx4sdu33nrrAdHn2nbbbeeOffvtt3UyLpPmuUs7\nzV1ymrvkVF1NREREREQqmiI5KRbi1f7YsWMB6NevX5VjPXv2BGDatGk1fp1Sz93f//53ALp37+72\n1atXL++f//HHH932hRdeCMAVV1xRpNHlVuq5i2Pnxvrrr1/l2BlnnAHA+PHjM/7f37d8+XIAHn74\n4VodpyI5yaTxnOvYsSMAd911FwDrrruuO/buu+8C8Nvf/haAmTNnumP2b6lf/5clr717967y3B98\n8AEACxcudPvss+KHH34oaJxpnDvTpEkTAD7//PMqx2zc+++/v9s3a9asOhlX9hgKoffsL9I4dwMG\nDADguOOOA2CPPfao8to27hEjRrhjF110Ua2OK1sa565cKJIjIiIiIiIVTRc5IiIiIiISFKWrZena\ntSsQhTkvu+wyd+ynn37KeOw777zjtu1xd999d9HGEmJI8+233wZghx12qHLMUpIefPDBGr9OKebO\nUtQADj300IJ+9vbbbweihbo9evRwx1avXg1Aly5dAHj22WdrNM41KfV5Z2k+p512mtt35ZVXArDO\nOuskes5vvvkGgNmzZ7t9Z511FgALFixI9Jxx0paudtBBBwHwyCOPAPDee++5Yy1btqzV1y5Eqc85\ns9NOO7ntV199FYC11167oLEk/Upt0KABAKtWrSro59Iyd3Fypat9+eWXAOy8885un4qslI9Sz902\n22wDwOTJk92+Nm3aAPD+++8DcM0117hjU6dOBaL3+J133umObb755kUbVz5KPXflTOlqIiIiIiJS\n0Sq6GajdNbcrfIB27doB0d07P3qTfQW54447uu1JkyYBUYOzww47zB0r5p3icvWHP/wBiBbqhsj/\nncfdbZg4cSIAI0eOBGDlypXu2LJlywD43e9+B0TnIcAmm2wCRCWoazuSU2oDBw4EMqOoNbXBBhsA\n0K1bN7fvjjvuAKBDhw5Fe520ss+xFAbuU8UvbJEdwfG/C1566SUAXnvtNSDzLqsVubCIdPPmzd2x\n3//+90BUqMCilgDff/99zf8BKeMXAclm0dW6jt5I+bLPcYBHH30UgK222srtO+WUUwC49957gfio\n6IoVKwBYunSp22fvy6effrra17a/YT7++GO3zz4HLNsiBPYZ2KtXL7fPsmwsA+ekk05yx/
75z3/W\n4egKp0iOiIiIiIgEpSIjOSeeeCIAo0aNAmDTTTd1x/73v/8B0RoJ/w7dCy+8AMAnn3yS8RiADTfc\nEIiiOyeccII7dvHFFxd1/OXCL536l7/8Bci8c2m++uoroPzv6DVr1sxtW0nK0aNHu312Byh7bZfv\n9ddfB+Dwww93+2ytT6dOnQDYb7/93LFnnnmmZoNOIVtHkos/B9kldy3iBbD33nsD0Lhx4+IMrgz4\n5+FNN91UwpGUnxtvvLHaYw888IDbPuaYYxI9/5QpUxL9XLmxO76DBg2q9jF1VRK/3Oy6665AFPm3\nvzeg8LWeofHXb22//fZA5mecZdTkYtGdG264we3bd999gfhIjkVwrr/+egA23nhjd6x9+/ZA9Ldh\nOdt2220BuPzyywE48sgjq32sv9bJvq/j1t2lgSI5IiIiIiISFF3kiIiIiIhIUIJPV7PSgNYBF2D4\n8OFAtFhs2rRp7pgtFrVFybn43XQt9e3oo48GoG/fvu7Y3/72NwAWLVpU8PjLmT8/m222WbWPe/zx\nxwF48cUXa31MtclPt/PLHyfhh7+tHKaFxvfaay93LMR0tWOPPRbITA96/vnnAfj666+BzMWOP/74\nY8bP++W3rWv9/fffD0DTpk1rYcTp4qdh2OefpUg+9NBDJRlTufDLyrZu3TrjWK5FyZLp4IMPBjIL\nOWSbMWNGXQ0n9WzhO8D48eOBKOXv6quvdsc6d+4MROXG89WiRQsgWpCf1tSiNfELF9lSgqQL3y39\nzNenTx8gKn4D0KpVKwAWL14MREVDoGqqdDk777zzgPg0Nfs7Y8899wQy/7YbN24ckPk3dpookiMi\nIiIiIkEJvhnoEUccAUR3cn3WIM9f5J2U3WGJuztlhQ7yiQ75yrVhlN09njdvntv361//OuMx1mgP\n4MADDwSiAgTFUK5zF8caWFok5+WXX3bH2rZtW/TXS8vc+QVBrHFgdtQmjpXchqjctpUZXXfddd0x\niw4Vs4R0KZuBWglzP5Jjc2iRHL8B6AcffLDG57QiBocccojbl2txflJpOeemT5/utq0xtLHFyQBz\n5swp+msnlZa5s/L3EL23/JK/Jvt7N1chltpW6rk74IADABgzZozbt8suu2Q8xv/Mq1evXo3GYN+1\nfjQiqVLMnX8+2b9ho402cvvsb6033ngDyN1Y1/9+sSIPtojeojYQRTEsW8f/uyapUp93ZsCAAW7b\n/n1W6MgyniBqg2LfB37hAWtSbt+7tV0KX81ARURERESkoukiR0REREREghJ84QELX/r+/e9/A1F3\n3GKwztf2X3/Rqt/xuhLYArbsFDWfn5pWzDQ1Ccdnn32W6Of8oh+2eLcSbLnllkBmGsavflWz+1i2\n2N5f8PvRRx8BUcGQkNjCWp+lplkKlsSzBdoQn6ZmrGdaKdPUSsG6xPupaX6qlZk/fz4QFQl59913\nC3odSx+yIkgQpXaVe/GMb775xm13794dgHPOOcfts3+npeJeeuml7pilYVl/Hb+wzU477ZSx79xz\nz3XHrL9dSKwAj9/P0f52te9P6xnps55NI0eOdPssZd56OMUtDSklRXJERERERCQoQUZyNtxwQ7e9\n4447Vjm+cuVKoLhlFG1hdFxUwsoJjx49umivl2Z2ZyXXArEVK1bU1XCC4y8sl2gx7kUXXQTAn//8\n51IOp2Ts/RZ3h7zQu+a9evUCoHHjxlV+PoW1amrM7mg2atSoyjGL/PvfJfXr//LV+eGHHwKwfPny\n2h5iatlc+He/sy1cuNBt+4VTCrHffvsB0LBhQyD6vUBUZj+Ndt11VwCuvfZaANZbb70qj7FjEH2O\nLVu2LNHr+VFXY1HIfAq3lAv7W8vmC+Cxxx4DovmcMmWKO/bkk08C0KlTJyBzLizj5/bbb6/FEafH\n6aefDsBuu+3m9lkEJi6Ck80veGRRISsrrUiOiIiIiI
hILQoykmOlVAG23nrrKsf9XEypGb+sYc+e\nPdf4eGv4eeaZZ9bamEJld4vffPPNEo+k9Py1E2eddRaQmV+cD7vjaY34yj1fPR+TJk1y29nrFdu0\naeO27U6oHxUPmUUJ4sq02trN3r17u30WvbB1Y5MnT3bHrrvuOiCzOXDI9t9/fyDz/Mn28MMPu+18\nGij269cPyGwwaHeP7XfkRxRtnZi/Bi8tjRrtc3vu3LlAZvuEq666Csg8V5KuVbJ/u72vrXEywC23\n3JLoOcuNzbE1zJ4wYYI79qc//SnjsW+99ZbbfvTRR+tgdOnhR3CMXz5/Tdq1a+e2LZMiu3lyWiiS\nIyIiIiIiQdFFjoiIiIiIBCXIdDW/q7nxF5m99NJLdTkcVwoyRH6K2j333LPGx1soPWl54Eph4XaI\nOolbmlrShbsh8VPLcpWrzcVKLT/xxBMAbLzxxu5YqAvJ/TQDf+E2ZJabrrTyvlOnTgWgT58+1T4m\nriS+tQcYPHiw29e/f38gWsTslwz2F+CHIi71xdji8EGDBlX7mBYtWrjtO+64A4AOHToAUSqML67w\nhZVP94v7WCuDUrPfuaXF1hb7brWF4Lfeeqs7ZqV/K4UVpzjwwAOrHLvkkksAOPnkk90+K9N9zDHH\nAFGRgkqyePHiNT7GUkXLKY1ZkRwREREREQlKkJGcCy+8sMq+WbNmue1nn3226K+57bbbAvHlG22x\nfYj23Xdft21X+XZHuNLuBhfTzjvv7Lb/7//+r4QjSae111672mN+6VVrwGd3i8eOHVvl8XF3i6V6\ndid0xowZJR5J8XzxxRdAZsQ/+xybPXu2237mmWeAqNS2NROEKLJoZVp79Ojhjtki/TSXPM6HH/U7\n8sgjq32cLf5ftWpVlWOXXXYZEM0T1PwOsV90aMSIEUDyUszlwC9Hbc0YTdpK+dYli8hahA+i4gKj\nRo0CouagAHfffTcQFaUaMmSIO3bTTTcB+ZVWLhfvvfdeop+zz0T/fWYs+8QiiQBLlixJ9DrFpEiO\niIiIiIgEJchIjl8G1LbjSoMW01ZbbZXx39A1aNAAiNaLQNVmhH7u9Nlnnw0kv4NQKaxMqp/Hb+66\n6666Hk5qTZw40W1369YNgO+//x6ARx55xB2zu3X+nfZQWdnnTTbZxO2zUturV69e48/7Ea2WLVsC\nmXdCTW1/lpbCCy+8AMDuu+/u9tmd8aeeegqAt99+2x2z8rwWcfUbhWavGWnWrJnbfvzxxwHo2LEj\nAP/973+L8w+oY5a5AJnrB7NdccUVVfbZXXZrI2DrJ9bE7grb+glr6ujbZptt3LatsQs5knPqqae6\nbXvP2nqS2shYSTtr5jtw4MAqx/yIKsAbb7zhtrt06QJE0dprrrnGHbP1TA8++GBxB1tC9h7yI9fX\nX389EJ0/9hiAefPmAZlrD7NttNFGABx11FFun9/ktlQUyRERERERkaDoIkdERERERIISZLqany5g\n23FlJ2vKD41bWM5e57bbbnPH8imtXG6sHOY+++xT7WP8ErV33nknUDnFze537AAACjdJREFUCKz7\ndPv27d0+S6vKZfPNNwfiF+B+++23QGZKZDmVpPUXK//xj38E4kv29u3bF4BFixZV+1z+YmVLmbTQ\nu4XWK9Xw4cNr/BxWoMFSior9/Gnlp6T529WxLur++9BSc+NsvfXWQLQ4t1zT1dbECitYiq0t6AY4\n/PDDgfi0RyvSc/PNNwOZi8Nt4bcV93n11Ver/Pw333zjtq3oQYisTYZf+Me89dZbAHz33Xd1OqY0\n2GKLLTL++9xzz+X1c59++ikQffdYsQGAYcOGATBnzhwAli5dWpSxltItt9wCwHHHHef22d90fipq\nEn5Km9LVREREREREiizISE5d8ReCb7fddkBUitQWcUFYpQcteuVHqqrjz0HIiz+NNbKDqKmYH72o\nKVts/+WXX7p9tp
DZzrtx48blfA77PdgiQf/3Utu/I/+ukV84AKKF31D4HUi7c5lL9+7dC3pOkXx8\n/vnnAKxYsSKvx1vE56OPPqqtIaWC3Q22u99xrRWMH5E5+OCDgegzzi+RbNHxXN89fpQxVyS43Fkk\n+4gjjnD7bIG83wTU2F36xo0bA5kN0UOaJ2tObu/HU045paCft/PVL15jTWU7d+4MwH333VfjcaaF\nlXGH4jWr9du1pIEiOSIiIiIiEpSKieTENQFMynIOd9llF7fP7ujZ3eq4fOEQWCneXA0q7W5lPtGe\nkBx//PFuO27tkeWLr7/++kDyKI/djYPM6AhA//79c/6snZdWKtc/T9u0aZNoPPnKHitEUU5/3dpn\nn31Wo9fxmzhahPXEE0+s9vG2LqJS1otJza2zzjpAFDm1XP41Wb58OVD+ke3//Oc/btvWwFkJY1+u\nCI7x75pbBMjW7fjPafuMlYwHOOecc4BorUHo4r5/raFl3Fqyp59+utbHlAbWnNeyAT788MNEz2ON\nQyGK5FiUMaRIjh91sX+fNUL11xm2a9cOiMpo+2sJrWXBBRdcAKSvTYgiOSIiIiIiEhRd5IiIiIiI\nSFAqJl3N0skK5ZfrtXSkoUOHAplpMVbqMsSwsJ8yYKU9c+natSsAq1atqrUxlSPrBHzssccCmelt\nxsqe+p3CrbxlHCtFG1fqt0GDBkBUbhQyO7rH/X9ds7LY77zzTtGe0zqdQ35FCUaMGAHAypUrizYG\nCZst2LWO83455FztCurVqwdA/fq/fPX6HcfLyddff+22x48fD2R2iS+EXyzAvlttnuJY2m/v3r3d\nvoceeijRa5cb+2yzwjb++fPYY4+VZExp8sorrwBRarSfRnrjjTfm/Txx34vl1K4hX35RrOnTp2f8\nN1+77bYbEKWrpY0iOSIiIiIiEpSKieQUqkWLFgDMnDnT7bNGbsYW+gEMGjSobgZWAn7jRVtkFsei\nZR988EGtjymN1tRw9rDDDgMyy34auzs5YMAAIL4MaC6XXHJJlX1dunQBomIREEV19txzTyD/Zmm1\nxZqe3n///W6flf30Fzdn22uvvdx2dvPFuLvAdqfd/x1Z0QVrVCsRm6+44hhWoGLGjBl1OqZis2gK\nQL9+/YDMMqr2ORa3eD5pw7zmzZsD0KxZMyCMUtKTJ08GMiOo9jnWsGHDgp7LfidxTbytzLxFXq18\nfiWx6L9lV/h33dNWurcUbEG8fX5ZSWnIL5LTpEkTIDNKaNkVlRItLFTaI1yK5IiIiIiISFAqJpKT\nKwLhr62xO1BWctaaX0LUoOzcc88FYNq0ae5YiPn81ojNci7jLFmyxG0feOCBtT6mNFu9erXbjrsD\nbvn7cSyyMn/+/KKN58knn8z4b6lNmjTJbXfr1i3jmEV0oHZKdNq6JrsLDNFdv3wbOVYSu4MeV1b7\noosuAmDp0qUATJgwoe4GVkR+lKBHjx4A7LvvvkV/na+++sptW9QxhAiOsXLYF198sdtnUZe4dSIW\n+Tn00EOB+HLIFtn1I7xTpkwpzoDLjP/3ydFHH51x7MUXX6zr4aSanSP2N5o18AS49tprM475rFn0\nsGHDAGjdurU7dt111wHhtgUJnSI5IiIiIiISFF3kiIiIiIhIUComXe2mm25y27aobO+99wYyywX6\nC9UAFi9e7LatLOGcOXNqbZxpYotwO3ToUO1jRo4c6bbzKdcbMn/RspVCtXLaPgt7++mOlvoTMj/1\nxFKFbDHoRhttVOPn//jjjwF44403qhwbPHgwoHO0mKxDdrmmq/ml1S1NzT8/LFWvUaNGQFQ0ADJL\nRkN07gEsX74cgDFjxgDwj3/8wx3LVUwjJFYUIC5tVwpjBWsA2rZtm3HM/w6R6LPf/i45//zz3TFr\ns7DffvtV+TkrzmPva78VSCGlpyX6uxpggw02AKLCSqWgTyAREREREQlKkJGcqVOn
uu0zzjgDgM02\n28zte+qppzIe7y+utTLIl156KVC+dymTatWqldv2o1+yZn6E76CDDirhSNLJX+htUR27622lOwGu\nvPJKIPM9+/bbbwNRE8Y4CxYsAGDu3LlFGnHlmj17NhCVrPULQ4TCb6R48sknA9GieMhslAeZd3Rt\nQb0VrQixCbSkQ69evarss89Sv9iNRA3IrTjKgw8+6I69/PLLQGZLhWzWhNv/uy+kIiG1wUpsWzTb\nWldAFM22v8NLQZEcEREREREJii5yREREREQkKGv9vKY27SWQvaizJqx7ui0Eh2gBqaWm+Yvnr7/+\n+qK9dk0l+dUUc+7KmeYuOc1dcoXOXdrn7d133wUy+4UNHz4cgCeeeAKAV155pcavo3MuOc1dcmme\nu6ZNmwKZhVQ23XRTAIYMGQLA6NGj62QscdI8d2kX4tzZd4T/fWDnp6UBFkOhc6dIjoiIiIiIBCX4\nSE45C/Fqv65o7pLT3CUXWiSnruicS05zl1ya587ufg8cONDts4IghxxyCAArV66sk7HESfPcpV3I\nc3fttde6bSsGdPXVVxft+RXJERERERGRihZkCWkRERGRcvXJJ58AsGzZMrfvmmuuAUobwRHJpV+/\nfqUeQgZFckREREREJCi6yBERERERkaCo8ECKhbw4rbZp7pLT3CWnwgPJ6JxLTnOXnOYuOc1dcpq7\n5FR4QEREREREKloqIzkiIiIiIiJJKZIjIiIiIiJB0UWOiIiIiIgERRc5IiIiIiISFF3kiIiIiIhI\nUHSRIyIiIiIiQdFFjoiIiIiIBEUXOSIiIiIiEhRd5IiIiIiISFB0kSMiIiIiIkHRRY6IiIiIiARF\nFzkiIiIiIhIUXeSIiIiIiEhQdJEjIiIiIiJB0UWOiIiIiIgERRc5IiIiIiISFF3kiIiIiIhIUHSR\nIyIiIiIiQdFFjoiIiIiIBEUXOSIiIiIiEhRd5IiIiIiISFB0kSMiIiIiIkHRRY6IiIiIiARFFzki\nIiIiIhIUXeSIiIiIiEhQdJEjIiIiIiJB0UWOiIiIiIgERRc5IiIiIiISFF3kiIiIiIhIUHSRIyIi\nIiIiQdFFjoiIiIiIBEUXOSIiIiIiEhRd5IiIiIiISFB0kSMiIiIiIkH5f1A6nO45ed8sAAAAAElF\nTkSuQmCC\n", - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], - "source": [ - "# takes 5-10 seconds to execute this\n", - "show_MNIST(train_lbl, train_img)" - ] - }, - { - "cell_type": "code", - "execution_count": 49, - "metadata": {}, - "outputs": [ - { - "data": { - "image/png": 
"…(base64 PNG data omitted)…
ITpO0Sc77778/kPmEPVtJvFDT/2wifVCK\nmlsK2ooEWKnsBQsWeG3uCstFWSj9hBNOAOCWW27x2iyUbmlJbgESm8QfNS1atCi2zS1XXlK77LIL\n4Kfouv766y/An+QbxNIvwU/lLbSiGOnG2gqDrFixAkgu9uCmS5YHlqZmn4vuPksps8/HbI8VVLjA\njmnFDaD8FSww6YoNBKUbxZ1NQfjtt9+A5MIDRf3888956VOhsgJJ4H+nsyUL0hUAclN433rrrTLq\nXeYUyRERERERkViJRSTH7hranYtFixZ5bfPnzy+z32vlosG/S3nQQQcByZOx7G5U3CM5QXd9jZV5\nLNQJ8pmwO9bg39l1yxk3a9Zsg8fI5O7bG2+84W0/+OCD2XQxcqzggEW3wH/tWIEGgN9//z2n41uE\nyybjW4l48Cel9u/fH0iOBNsimIU0ubk0IzlW3twKQ7gmTZoE+GW/3QnPNoZ77rmnt89KMBf6uRrE\n7nY2bdrU23fmmWcCJS/fXShswcWgqMvNN98MpI+0uKWkrdCAW8SgKGtzo0NuVKc8sNdcULEBW0w0\n7mrUqAEkl8e2CfH2vn/nnXd6bYcffjjgLx7qLvNhC5fbUgOlkflTaGzs3Iis6dSp0wafv2zZMsBf\n1iIqFMkREREREZFYqZCIwmo9RWSbS/r+++8D/pwHd2G2zp07A2VfHtbKV99zzz1Acslquxtsfcm0\nP7n80+Q7D9e90zt16lTAv7Pyyy+/eG377LMPkL874/kYO7v7c+KJJwLQt29fr22nnXZKOWYmfbLH\nBz3WIpTuuZVuXkSuCuG8Kw12R2/hwoUpbTbPz/4dwS8dn062Y5ftuFmZ6KCy97feeisAL730UlbH\nDGIlVYPmOZqgc9Ui50899ZS3b+jQoRv8fVE851q3bg34r2/LRwf/vAjKTbd5YxbhKGthjJ1b2tkt\nD21swc+gctEWibH5hFY2P4jdWQc/U8Ae77blGsmJ4nmXCZuLExTJsc/fsi4bHcbYuRky9l2rbdu2\nKY+zLJvevXt7+2zJhttvvx2Arl27em0WlbbvLO7yIPfdd1+J+hwkiufdxhtvDMABBxwAJEfjzc47\n7wzAKaecktJmEdk+ffqUVReB7MdOkRwREREREYkVXeSIiIiIiEisxKLwgE10mjFjBuCnEYE/2bus\n09WWL18O+JMh27dv77XtsMMOAFxzzTXePje1qZBdeOGF3rY7ARDg3Xff9bYLaQJ3Ol26dPG2LTxr\nk7TLmp3fZX0ui18mOYLZvEBwv6wEtlsKO136Y1FBqZXpnmeTc1988UVv36OPPgokv/YLlf0NQX+L\nlZe29MGWLVt6bZZOaj/dAjVxYeloLvc93i3VDsmFBN58882kfe7zrFCBFbdw29wCBeVVujQ1KzhQ\n1mlqYdhvv/0AmDJlirevVq1aKY+z9+2rrroKSF5awbatMMh2223ntQ0ZMgTwpxT85z//8dos5c3K\nxq9cubIkf0pk/f3334B/jgWVJ7dUP9dnn30GwNlnn12GvcudIjkiIiIiIhIrsYjkWPlZu6P+0Ucf\neW12t3GzzTbz9pXllbhd7dsCoC5bLNPtT6HfFXD/pqLcuyFxYYvLQv4iOMaif7aAJvglkW0xUSt9\nDDB+/HgAHnroIW+f+9qQ4lkxE7u7FRVz5swB/PcZm0xb1tasWeNtDxgwAPAXYS3097BcfPrppwDs\nsccegP9aA/81eMghhwDxiuTYBP+gYgFuIQCLwFi0xqI37j57fFDRAHuMG72xEtV27KCiBnFXNILj\nRm2GDx+e177kk5U3Dore/Pnnn962ZfXMnTt3g8f85ptvvG0rBT969GgABg8e7LVZ9sa1114LwLnn\nnptV3+PAivTYotouKzJjC7BGjSI5IiIiIiISK7EoIW1++OEHwC8H6HIXATz44INz61gGLI8xqKyh\nu6Ch5X7aHeMgUSw
zaCyK4V6926Ks69evB5LHIN1icGWhrMZu7Nix3nZQGcV0x8ykT1a61xY6g/S5\nrpnMuXAXPbM7VulE+bwrTelKSE+YMAEILtWcTlmXkDb2/mERHYBDDz0USC5rbAtSWklsd4FKy1+3\nRdxOPvlkr83mERr3LvHIkSNz6nM6YZxzm2++ubdtkSobk2y5c3I++OADwL+b3KpVq1y7mJF8jp1F\nVoIW/mzSpEnKvqLzb8CPxOy///4pj7ey23YnPd3zSmOeZyG817mvPbe0MSTPg833XJx8jJ2VM371\n1VeB5Mi1RXDcebKzZ8/Ouk8ue99zy8XfcMMNAFSrVg1IPm8ziRgFKYTzzvXII48AfiTH5qCDn1ny\n008/5aUvKiEtIiIiIiLlmi5yREREREQkVmJReMBYOC8orOdO2LOw7rPPPgvAY4895rUVDblZUQNI\nDmECnHTSSd72brvtBqSmbLnOOussbztdmlohuOKKKwD/73VZSdl8p6jlw7Rp07xtW9l3k002Kfbx\nQefi0qVLgeTCDEUnjdr5BH543E2JSXf8ooJSJ8ur+vXre9uDBg0q9nGW+hpVL730UtJPl6XhASxa\ntCjjY/bo0cPbLpquNn369Gy7GHnPPfect92gQQMArr/+em/fxx9/DMCSJUsA+Pbbb1OOYQVkbCVw\n16677grATjvt5O378ssvS9rtyLECAm76WFC6WdHHjxkzBghOfTNuqWorSx2X5Qg2xD4Tiqaogf8d\nJo7lojNlRaVKmqLmstTdbbbZxttXs2ZNwJ+K8MUXX5Ta74uyjh07ettHHHEE4KeK3XrrrV5bvtLU\ncqVIjoiIiIiIxEqsCg/Y3bjWrVt7+6y8XdDx7U9fvHix17Z27dqkx7qlp93J4MVJNxH82GOP9bZt\nsbN0ojw5zcrGbrrppt4+m7RrpVNnzZqVl74EycfY2R3aoIVd7a64Wz72k08+Afy7b0ET3oO0adMG\ngOuuuw5IjiimO9/sLr4t0AgwdOjQDf6+fJ53NkneLRVrERa7qwawYsWKnI5vrPSoLfoGcMkllyQ9\nxi2Be9FFFwHZR3TyVXigNFmhFneRWfs7Hn74YSCzIhslkc9zzj4f3njjDW+fW6yhKHuv+/XXX719\nVsjGCkBstdVWxT7f3g+hbCJi+Ry7t956CwguIZ2rTBcDLQtR/oxNt/CnFRwIM5KTz7Gz88CNUv/8\n888A7L777t4+i7pmomrVqt62vY7t/Nt66629Nvsctc/00liGIcrnnXEXebbxsWUD3FLSf/zxR177\npcIDIiIHthicAAAgAElEQVQiIiJSrsUqkmNatGjhbVv53D333DPl+Jn86dmWAA469tSpUwE47bTT\nvH02LyOdKF7t2x3Lr776Cki+G/L5558Dfi56mKI4diVlZbsPOuggb5/NMbF97qJ7difws88+y+r3\n5HPsbBFTO5/cY/33v//19lmOvy3CGMTmSLl3y21uic1/ct8HinIX8M3mjqCrECM5Nu/BnaNkf4eV\npS7rBS3zec6deeaZANx99905PT9TtnCr3QWFkkckg+Rz7Cyq/Pbbb+f0fPDvyttdc/sZhih+TqSb\nizNixIikx4QpjPPOzcyxzz63zL9FANetW5dyDPusadasGeCXRQY/GmTPe+GFF7w2++xxy++XVBTP\nO2Pz0N35x3Xr1gX8z88PP/wwL30JokiOiIiIiIiUa7rIERERERGRWIlluprLJsZ369bN22elK61Q\ngbuKbrq+ZJOutnr1am+fTRTPNsQXxZDmgAEDALjttttS2i699FIguQxrWKI4doUijLFzJ3raKtJ1\n6tTx9lkagaWwzZs3z2uzVdKtlLe7an2lSv9WyQ96jVv6gU0Md9Pjcn1bLKR0NSvGYOlDtqK3u2+P\nPfYAkotAlIV8nnNWGtUtdmF/e7oCBJmyAgXnnXcekLxEQVkI4/Xqloa2159bjMDSm
u2nm5IWZnpa\nUVH8nEjXpygUHDBhjJ2Vcwa/xLtb9MNSRIPSQi29zVLT3O9oNsl+9OjRSccpK1E876zIln1Pdcto\nz5gxA4BevXoB+S824FK6moiIiIiIlGuxj+Sk069fPwCaN2/u7Rs4cGCxfbGhsjvAd9xxR7HHdidS\nuxO4shHFq32brGeT/dasWeO1derUCYjGIqBRHLtCEfbY7bPPPgBcddVV3j538vaGWJlL8IuQ/Pnn\nn0DyxNWxY8cCyaWTS6qQIjkWKbPIg9sXe31PnDgxL30J+5yzzwD3/d8WqaxduzaQXE7cFsD77rvv\ngOQovRU0KIsiA0HCHrtCFpWxc8tEW8GYfP3uXIU9dlaM4Nxzz/X22bIOVtjJXYjXSsbb0g3ucgr/\n/PNPqfUrE2GPXRArsGDRLJcVk7LiUmFSJEdERERERMo1XeSIiIiIiEislOt0taiLYkjTQsQ28dGt\n1z9q1Kgy/d3ZiOLYFQqNXe4KKV3NJtnb+l22lhD46yKUdcEBo3Mudxq73EVl7NzP0aLr47hFBqzw\nQBREZewKURTH7ttvvwWgadOmKW2HHXYY4K/5GCalq4mIiIiISLmmSE6ERfFqv1Bo7HKnsctdIUVy\nokTnXO40drmLytili+RE9d8qKmNXiKI4dv379weCC2pZmW4ruBImRXJERERERKRcqxR2B0RERETE\nF6X5NxJ/d911V9LPuFAkR0REREREYkUXOSIiIiIiEisqPBBhUZycVig0drnT2OVOhQdyo3Mudxq7\n3Gnscqexy53GLncqPCAiIiIiIuVaJCM5IiIiIiIiuVIkR0REREREYkUXOSIiIiIiEiu6yBERERER\nkVjRRY6IiIiIiMSKLnJERERERCRWdJEjIiIiIiKxooscERERERGJFV3kiIiIiIhIrOgiR0RERERE\nYkUXOSIiIiIiEiu6yBERERERkVjRRY6IiIiIiMRKpbA7EKRChQphdyESEolE1s/R2P1LY5c7jV3u\nsh07jdu/dM7lTmOXO41d7jR2udPY5S7bsVMkR0REREREYkUXOSIiIiIiEiu6yBERERERkViJ5Jwc\nERERKV8WLlwIwLJlywCYMGGC1zZx4kQAvv/++7z3S0QKkyI5IiIiIiISKxUSuZR5KGOqIvEvVeDI\nXRTHrkaNGgC0atUKgD59+nht8+fPB2DQoEEANGjQwGu78sorAbjjjjsA/y5nWYni2BUKVVfLjc65\n3BXq2O29994AnHfeed4+e08M6l/Hjh0BmDlzZqn1oVDHLgo0drnT2OVO1dVERERERKRci/2cHLtb\nNHfu3JB7IuXdqaeeCsCYMWOKfczvv/8OwB9//OHts0jOTjvtBMAJJ5zgta1fv77U+ylijjnmGAAe\ne+wxb9/OO+8MwJdffhlKn6Jo+PDh3ra9Xs1rr73mbb/++uspjy+vLr74YgCOOOKIkHsicXP66acD\nfvYD+J+pu+66KwA//vhj/jsmeadIjoiIiIiIxIouckREREREJFZila7WvHlzAM444wxvX8+ePQGY\nMmVKyuNtkrfkn6URArz33nuAn3p13HHHeW2PP/54fjtWhtwJtgCLFi3yts8//3wAXnnlFQCaNGni\ntQ0cOBCAvn37Aslj8uyzz5ZNZyNmk002AaBu3boADBgwwGuzdJftttsOSJ6gaZMUe/XqBcBLL73k\ntf35559l2ON4+PTTT4HktEgb72uuuSaUPkWJpZ0VTVFzdejQIXDbfX7UtWnTBoDPP/8cgBUrVmT1\n/OrVq3vb7777LuCn36abSLx48eLAbRFTsWJFb/uuu+4CoF+/fkDy52Pbtm0BqFmzJqB0tfJCkRwR\nEREREYmVWJWQPu200wC4++67M3r8d999ByTfSbJj/PzzzwD89NNPXtvq1atz6leu4lhmsH379gA8\n8MAD3r6tt94a8O8Wf/31116bTXLOVhTHziINF
tEZNmyY17Zy5cpin3fmmWcCcOeddwLJE7532WWX\nUu9nVMaud+/e3vall14KwJ577gnA2rVrvTa7Izd+/HgAatWq5bVZxMfu9p1zzjlem931K01xKyHd\nsGFDIPmu57fffgtAs2bNSu33ROWcS8eNwmRTxnjEiBHetr3/mQMPPLDE/Yry2FWtWhWAG2+80dt3\n9tlnJ/UhqP8WtTn++OO9fRblLk1RHruoi8rY9ejRw9t+7rnnAJg2bRqQXNTCXrOW3bNkyZKMjl+7\ndm2gdJduiMrYFSKVkBYRERERkXItVpGcf/75B8i8rO5GG220wce7USFbsNHKgH700Uc59TNThX61\nv8UWW3jbNl/KStHa3ArwSzsuXLgQgJNPPtlrmzNnTk6/u9DHzmVj9csvvwB+lBH8yIa1lYaojN0P\nP/zgbTdq1AiA66+/Hkguyztjxoxij2GR2Hr16gGK5GTLIg3uXfSlS5cC/rw6998pV1E554JkMu/G\nZZGbdPNtLCrknse5ivLY2ZhdccUVxfYhqP+33347UPbzZqM8dunYHEM3UrHPPvsAcMkllwDw9NNP\nl2kfwh67OnXqAPD99997+6xPFn1Zt26d13bAAQcAMHv27GL7ZfNeL7jgAq/NMieOOuqo0up66GNn\nx3r00Ue9fTYX2r5fuJlLloViY2ffhcH/3vbZZ58ByXOGy+LyQpEcEREREREp13SRIyIiIiIisRKr\nEtIWGs80rSATZ511lrdt6W0WlnvnnXe8NrdstfyrXbt23vY999wD+OUbXZbucsIJJwBlnwZYaCzM\nbiHmLbfc0muzMHJppqtFxTHHHONtV6lSBfBTRbNlqazloWyoFQsAfwL333//ndOxvvrqKyA5RcDS\nRLbZZhugdNLVosgmKhct+xzELSCQSQpaaaSpRZm9Xlu2bJnV85544gkALr/88lLvU6FySyR37NgR\ngKeeeqrYx9tnrZsSft9995VR78Jjyy5YcQuATp06AclpaqZompp9nwM/xc/K4o8dO9ZrC0q1LCT2\nvcFNtzvooIMA6NOnj7fP3uPr16+fcgxr23///ZN+BnHPOyuWlOkUkrKgSI6IiIiIiMRKrCI58+bN\nA5Kv0IPMnTsXgOnTpwPBV+qHHXYY4N8VAf8K18r2uuV7rfS0RZGsL+AvoBbXO57FscneABtvvHGx\nj7OiBBaxUCQn2Q477AD4d1Ns4jf4k/7iyI2UZmPbbbf1tu0u3/vvvw/A1KlTS96xiKpU6d+38/vv\nv9/bZ4sw2kJ4kp5bGjqTCI4VGYh7ZCYTlStX9rbtM/XQQw/d4PPcCc5WaCBdSf3yxh3DyZMnb/Dx\ndifdLdtt33UWLFhQyr0Lj0VP3Sj1G2+8scHn2fdDN1PAIjgPPvggAP379/faLAugUB177LGAv8SC\ny42w/P7770lt7lja54ctH2CRWvC/v1nE8bbbbvPabMFtK3DgLv2Qr+iOIjkiIiIiIhIrusgRERER\nEZFYiVW6mtlQGMxC6RbCDfLss8+mHOuQQw4B0hcZsHQ1N2XOUhnc8Gimq+0WIps0mm6dCJeFiv/z\nn/+UVZcK2sEHH5z0/zaZHMpfCmQmBg4c6G3XqFEDiHeamrG1hDp37uztW7NmDeCn1lrRFAmWSYoa\n+O/pmb7HlQdusYAhQ4Zs8PEvvfQSANdee623L44FVLJlaX82hkOHDk15jKUEue//V111FQDVqlUD\nYJNNNvHa3O24OPfccwF47733vH1vvvkm4KdouWvoGEuLd9O3HnroIcCfdhAH9h3ULYpiLAVv9OjR\n3r6g8ywT9v0t6PmDBw8G4MQTTwTg3nvv9dqC0ufKgiI5IiIiIiISK7GM5ARxJ1W5Ex03ZMqUKd62\nrfp90003AcllGa2sr5VvtQm/4JdSnjNnjrfPShxaVMldJbZQ2WQ9u7sZFFGzEsDuasyK4KRyyz3u\nueeeIfYk+
uyOVY8ePQA455xzvDY7B+01G2d2Z9NlhSmsFLSk5xYQSBfVsYIDAttvvz0Axx9/fFbP\ns2Igs2bNKvU+FQorjGKFjgAuvvhiAPbYY49in2cFGmzswb+jbpYtWxa4HRd2/nTv3t3bZ1HBr7/+\nGoC77rrLa7Oy0Ba1cQu0xHEJEIvo9evXL6Xt7rvvBnKP3rjse/BJJ50E+BkFADvvvHPSY2vXru1t\nT5o0CYC//vqrxH1IR5EcERERERGJlXITyRk3bpy3XXRRqExZfvs333wD+Asquc4880zAn78D/h3m\nJk2apDzeSusVaiSnVq1a3rYbfSjO1ltvDeReHjju7FxxyzC6dz8ABgwYkNc+RZG7qOyFF14IwKWX\nXgr4r1OAo48+OmVfnDRu3NjbPvXUUwF/8TeAL7/8EgheHK+k3N8TF+5is+kiOVZqOiiiY9Gg8lJW\nesKECYD/3r4hNsb2eg3Su3dvAG644QZvny30GDTPolBZJGbixInFPua3337ztovOddp99929bVti\nwMr8nn766V6bO48zLuzvnTZtmrfPsm0uu+wyIDlSYeOxatUqIHnpEHex47hwo4OQnM106623ltrv\nsdejzXu178BB3GVXbFkRRXJERERERESyoIscERERERGJlXKTrpYv99xzDwBPPfVUyr6ePXuG0qey\nYJPMLL0K0q+qbqHSiy66CIB58+aVYe8Kj6Ud2CS+Bg0aeG0WSl+6dCkAH3zwQZ57Fx2WpjZ37lxv\n37bbbpv0mBkzZgRux0mlSv++ddsK3eCPjZvecvXVV5fo9xRNlXTFMcUjW7ZkQNA+S2WLe5lpOw/S\nnQ8rV670trt06ZLxsd3X9ieffAL4KZitWrXKqp9RtNNOOxXbZqvFuxPri6Z577rrrt72Tz/9BPiv\n/3fffbfU+lko1q5dC/ipaO73EyvkULFiRSA5XdK23fO00LnnBiSnLNt3iVxtttlm3ra9nvv06bPB\n5z388MPetp3fZU2RHBERERERiZVYRnLchThNvifJuhP9bBKce+fJSk0XKpssZhNEN8QWP3VLR5dX\ndn62bNnS22eLz9arVw9Ivitqd+ZsEuXy5cvz0s8osvPOjdC0aNECgN122w1InnBpd9Xt54YWCo46\nKzRgk73322+/lMd899133ra959hP9w6evRZtYbigsTn88MNT9tn5t2jRouz/gIgLiroERWsyYc9r\n3769ty9oYb5C1LVrV2/bnfxeHCv3C/7d9mxtuummST/jIN3Yffzxx0D6Ij1u8SOLetmSDIX+XlcS\ntvipGwm0MtFbbbUV4JfqBr/UsX12ZLPMSFT9/PPPSf9ft25db3vy5MkAnHDCCd6+P/74A0jOBCjK\nzle3NHebNm0y7tMFF1zgbZdFMZwgiuSIiIiIiEisxDKSE3QHI9/5482bN/e2rUy0O8+iUO+y2N81\nevRoIDhqZvs+//xzb1/nzp3z0Ltoq1OnDuDfBXXzU4u69957vW1buKvonZnyyPJ4zz777JQ2iyra\nYm/g51pbWVV3DkshGjVqFAD7779/StsPP/wAJJfLtrLSe+21V7HHtJL6AwcO9PZ9+OGHxT7+008/\nBeC///1vpt0uSBbVsVLQVjY6W24patsu9PLStogl+KVg07FINfivU4twBX02uDn/xbW5SzIsWLBg\ng30oNO7i4cXZe++9U/bZe5zdmS+PnnjiCSA54v/II48AfjaAOyfE5s9ZaWX3vbBQlx+wxU/HjBmT\n0hb0PmSltW1uVxCLBhXSfDhFckREREREJFZ0kSMiIiIiIrESy3S1MDVq1AiA6dOne/vcNDVjJZUL\nYfX6dKl3QWl3lqZ2/PHHe/vShUDjrGPHjt72NddcAwSnGBhLAXr++ee9fTae2U7YtQmW1atXT2mz\nMqxxYhPp3XPSJuhbSkyhp6tZCfaPPvoISE4DsqICVuQDoEqVKoC/Gr07+dQ
m2Vr65Msvv+y1WZny\n7bbbLqUP1157bcn+iAJjKR1BxWuyLVQQl3S1bJ1//vnetr0v2Xhmm0pun7GPP/64t2/fffctaRdD\n8eOPP6bss0nv7uTuTPz6669AcuGR8sbe5+w9zV22w9LUzHXXXedt2znZv39/AN58802vrVA/Myxd\n8ZZbbgFg0KBBKY/ZfvvtU/a5BZGKsqIE999/v7fPXoe2/EXTpk1TnjdlyhQgnBLdiuSIiIiIiEis\nVEhEcEW3XMs929X7c889l9J2++23e9vuXaXSYqX1LNLhRj/szrJFb8AvZ5iupHIu/zRlUSrbLRca\nNLbGSspaScEwozdhj93NN98MwMknn+zts8UaM+mb2xcrv2p3Zt577z2vzRbIswnmLltoLmhBR1sQ\nLUjYY1ea7By0srP2b1BWsh27KIybRRbdu57dunUD/IIF7t9Vv359IDliVFJxOueyfX3n4/eV9u93\niynYZ0K1atVy6kOuX0HcO/O2EKEb2cxE2OedfQ+w4jIAy5YtA2DHHXcEghdu7NWrF5C86Lh997CF\nusta2GMX5OijjwbgscceA5Ij1zauQaxYkhVh+eKLL7y20047rdT7mc+xs8IgblbJ4MGDczqWFWZw\nM03atWuXtC/ofcAyEIKKIGQr27FTJEdERERERGJFc3JyYHmbdrcT/FxQu3vusrs0bg6x3TEoBO4d\n3qJsbgD4C0uVl/k3NWrUAPzcXjsvipPNnRj3sXvuuWdSW1D5YHu8W2Z67ty5QHL0Lds870LklrSN\n6t3+KLHzxH6CPyfH7hRrHNMLmpsTZ+6cotdffx1I/jzMREkjOTbfDPw5FYXmlVdeAfzyveCPS6VK\nxX89Cyrha6/Z8qx169aAX+Y+00U9LdvmpptuAvxIEMAzzzwD+PNKCs3ff/8NwLRp07x97nZJ2dIY\n6SK548ePL7Xfly1FckREREREJFZ0kSMiIiIiIrESy3Q1m0Tm2mabbbztLbbYAoDFixdv8FhuuV+b\ngJ8uNcFSPtwS0ldcccUGf0+UnXnmmd520ZLRs2bN8rbjWJY4nY8//hiAxo0bA8FpF+6qypZ+YPvc\nNpsYbz8txAzJBSsgedKfuf766wFYvny5t89SIIJSKAuVFUxwJz5byoc566yzvG17rVtRDMmMTeQ2\nCxcu9LazLWVeSNzzykpBB5V9PvDAA5Pa0pWNdtnz4sQmz0+aNMnbZ8VnJL1vvvkGgJ133tnbZ+m2\nv/zyS7HPO+qoo1L2PfTQQ6XbuQJkJe/t/apo2egN+eeff4DkwjxBZfSlcCiSIyIiIiIisRLLSE7Q\nApU9evTwtu+55x7AX4gr3cRHd9K9FRcIOr5NvrRSvgsWLMiy19EwcOBAb9vKIAdFxmyRxbIox10o\nmjRpAqQ/f2ycwC+5a5P/3XPE7ny2aNECSI6KlbRIhTuhvFDZOThkyBAgeBE9Kw/tLnpm0Vq3DLoE\ncydvH3rooQCsW7cOgNGjR3ttK1asyG/H8siNyLhRnaL/n81k+REjRnjbcVwE1ArNuJ8dFl2tWrUq\nkFwMpKTs89ctcGOfv4XKjZRmwhZxdMcgzhHWTNnn7QMPPAAkLxngZjkUJ2iJgUyeJ9GlSI6IiIiI\niMRKrCI5djfHFsUCv6yxy6Izdnc4KDITxMoR2hwbt6SgXe2X5gJ5YXDvUAaNi+0rbyVTg9jcD1sg\nq3Llyl7b/PnzAb+8NMC3335b7LHeeeedpJ+SbOjQoQBcdtllABx22GFeW61atQB4+OGHAT/iCnDL\nLbcAfklRKd6xxx7rbdu5bCXJ7RyPK3s/Kxq9KQmL2pSX98p58+Z52/aatAwKe/1CdvN17rzzTm/b\noh22OOZ9992Xe2cllp588kkALrjgAsAvCQ0wcuRIAL7//vuU59WrVw/wF8l0FwO1Y0r2bOzC/F6s\nSI6IiIiIiMSKLnJERERERCRWYpWutmb
NGgCmTp3q7bP0qxNPPDGjY1i53nHjxqW0ffXVV4BfuCBO\nGjVqBPjlQF3uxPX+/fsDhVtYoTTde++9gF9CukGDBl7b5MmTgfQpapKeW7pz2LBhgP8ar1Gjhtf2\nyCOPANC9e3cAXnrpJa9NKS0bZitVu699W3XdLR8fZ5ZS1r59e29fSVPX4lguOlv2Wex+JkvpsXRS\ngGbNmgHw4YcfhtWdyOjWrRuQPKXAlny45pprAGjVqpXXZq9VK5ThFqqxpRgke1akxgrYhEGRHBER\nERERiZVYRXKM3UUHePHFFwF4/PHHUx5ndyvdyfZWXKCkZXsLjU3qtOgE+JP26tat6+0LWqyyvLv8\n8svD7kIsWYlu8BdnswnNQa9nu2tn0UZILrEqwbbccksg+XVuBTBmzJgRSp/C4kZfii70mS6y45aJ\nLi+FBiR8e+yxh7e9+eabh9iTaFm2bBkA+++/v7fPiqfYMgT2WQJ+yXPL+NHnRumwhcjte6NlYuST\nIjkiIiIiIhIrusgREREREZFYqZDIZunmPLE0svIul3+aXMdu7NixAOyzzz7ePls92J3I7a6FEGX5\nHLu4ieLYWVEBW/9gzpw5XpulWt59991AuJMcsx07nXP/iuI5Vyg0drkrtLGrUqUKAH/++SeQ3P86\ndeoA8Ntvv+WlL4U2dlESp7Hr3bs3kDxNpKiGDRsCyYUycpXt2CmSIyIiIiIisRLLwgOSvX79+oXd\nBZFiPf/880k/RUTKG7trbnezLYoNsHbt2lD6JLIhbdu2BeDJJ5/M++9WJEdERERERGJFkRwRERGR\niLOlLeynu3yBLX8hkk9fffUVAE8//TTgRxsBPvjgA8BfyiUMiuSIiIiIiEis6CJHRERERERiRSWk\nIyxOZQbzTWOXO41d7lRCOjc653Knscudxi53GrvcaexypxLSIiIiIiJSrkUykiMiIiIiIpIrRXJE\nRERERCRWdJEjIiIiIiKxooscERERERGJFV3kiIiIiIhIrOgiR0REREREYkUXOSIiIiIiEiu6yBER\nERERkVjRRY6IiIiIiMSKLnJERERERCRWdJEjIiIiIiKxooscERERERGJFV3kiIiIiIhIrOgiR0RE\nREREYqVS2B0IUqFChbC7EAmJRCLr52js/qWxy53GLnfZjp3G7V8653Knscudxi53Grvcaexyl+3Y\nRfIiR0REROLn9ttvB+DII48EoGnTpl7b2rVrQ+mTiMST0tVERERERCRWdJEjIiIiIiKxonQ1ERER\nKTOVKvlfNdq2bQtA7dq1Ac01EJGyo0iOiIiIiIjEiiI5IiIiUmb69Onjbe+2224ADB06FIC//vor\nlD6JSPwpkiMiIiIiIrGiSI5IBFStWhWA4cOHA355VYDly5cDcN555wEwe/bs/HZORKQELHrj+vXX\nX0PoiYiUJ4rkiIiIiIhIrOgiR0REREREYqVCIpFIhN2JolRS8l+5/NOEOXYvvPACAF27dgXglVde\n8do6d+4MwPr16/PSl0IbuyuvvDLp5/z58722rbfeGoD3338fgH333bdM+5KPsdtiiy0A2GuvvQDo\n1auX19auXbuUfrzxxhtZ98n19NNPA/4YAixevLhExwyS7dhF9b2uS5cuALz44osAjBo1ymuzCeOl\nKcqv1w4dOgBQpUoVb9+ee+5Z7OP32GMPIDnl1Fx99dUAXHHFFaXWvyiPnaXhvvfee96+atWqAbDf\nfvsB8PPPP+elL0GiPHZRF/bY2XnUu3dvb59t2+eJ+/usv8888wwA559/vtf2ww8/lFq/MhH22BWy\nbMdOkRwREREREYkVFR6QEmnUqJG3vcMOOwD+lfZBBx3ktbVq1QqAOXPm5LF30XbMMcd42xdffDEA\nN998MwAXXHCB13b33XcDcPLJJwOw3XbbeW3ffPNNWXezTNgdt7vuugtIvjtjd6zcfTvttFPSvqA7\ndEH
Ps339+vUD4Mcff/TaLOL45ZdflvjviZt//vkn6ae9tuPKolNDhgxJaatcuTKQfM5tvPHGSfuC\n7i4G7evUqRNQupGcKLOo8y677OLtu/TSS4FwIziFYLPNNgPgsMMO8/ZZlLBZs2YAdO/e3WtbvXo1\nAPfddx8AEydO9No+/PBDANatW1eGPS579jkAMHnyZAB23HFHb18mr0eL8uy///5eW//+/QE/4i/x\noUiOiIiIiIjESrmO5NidEncxskqV/h2SNWvWhNKnQlO/fn1ve9tttw2xJ4WjYcOGADz88MPevpde\negmA66+/PuXxH330EeDfUXbnAxRqJGfWrFkALF26FEieH2O51pnOmbG7e3Ys9y5e06ZNkx7r/r/N\njwiaO1He2dylJUuWhNyT/LC7u9WrVy/T31O3bl0AmjRpAsCCBQvK9PeFzY3mm5UrV4bQk8LRpk0b\nAO68804Adt99d6+taITC/X9737SlBuwn+PNlzzrrLAAWLVpU2t0uU3vvvTcAzz//vLfP5nUGRe6L\n+8GiNPsAACAASURBVH93nz0f4J577gFg3rx5QPxfl8beh8DPEDn++OMBP2robgdFyiw6OH78eMAf\nQ4CHHnoIgFWrVpV21zOmSI6IiIiIiMSKLnJERERERCRWYpWuttFG/16zbbPNNt4+t0xgUTYJ9Ouv\nv/b21a5dG/BDbt99953XZuG4008/HUhO1Spq7ty53rZNkFuxYkUGf4XElaVCWgj377//9trOOecc\nIHgVcLcUd1zYZH9LQ3DTojbddNOUfelYupo9/vDDD/faggobGLdsddzZOO+6667evgcffLDYxx9w\nwAFAckqH5Oa1117ztj/55BMAfvvtt5B6k18tW7ZM2Rf0Hlfe1alTx9u29Nnddtst5XGWkjtz5syU\nNnvf7NatW0qb7bNiN1bgplBYmpo7Tvae/vnnn3v7rDx0ugICVvTGLYVvx7XvdpdffnlpdDtS3FRc\nK2YxYsQIb5/7vbmodGmSFStWBOCkk05K+gn+d/Lbbrst126XmCI5IiIiIiISK7GI5NhkqJ133hmA\njz/+OKvnuyV5jZU8dgVNCs/E4MGDAT9yBPDLL7/kdCwpXPXq1QOgY8eOQPIiZukmOlrBgTgK+rsz\nKfphk2zBj1JYadqgkqJB/+/ecYoru3tnUZs///zTa0sXybHJp5KZ//3vf962le6dNGkSAF988YXX\nZmV+4659+/YAHHLIIUBy5Orll18OpU9Rduyxx3rbgwYNSmobOXKkt23FCIKiYZYpcNlllyX9LGTj\nxo0DgosM2HIABx54oLcvk+i/FVWxcxP8RantMyROkRw7L6677jpv39lnn53VMay4gH1+WvRmQ264\n4QbA/0y///77vbZcFkTNhSI5IiIiIiISK7rIERERERGRWIlFutpRRx0FJK/wm46ttJyuXn+jRo0A\nfzJfSdhqzzNmzPD22Vonhb4C8Q8//OBt23oubl1/8dmkT2Nr42T6PDtXLExfnrRr1w7wiwW4693Y\nJMpMVru+9tprvX3lYXVrK8rQvHlzwE/VkLLz2GOPAf4q8+VR27ZtAT+txZ0cXl6KLmTihBNOANJP\nzF6+fLm3na5og30+WEpq0BoxYa5Xkgt7/7L3b/e93dZay3YtLzum/Szu+HFh0ySyTVFzz0krumW2\n3nprb7tKlSoAPPLIIynH2HjjjQF/HSK3CItb8KssKZIjIiIiIiKxEotIjjvxrDju1f7+++8PwPff\nf1/s4++9914A+vTp4+2zqI6VgnYn9lnpQbdEa1EW0QG/TKTd9StUVtIS/FWUy1skxyIJ2267rbfP\n7mq4kcBLLrkEgLvvvhtIngSeib/++guAd955J/fOFgCbBGqrdEPqxFP3LmXQPlN0n3t3KpMCB4XO\nLZMqxbM7lVbEIltbbbWVt21lbG3StBs9/OOPP3LtYkEpOjHZxiRTV
mylQ4cO3j57D5g+fTqQ/R38\nKLLSzkERhPvuuw/wyydnylanT3fMQmHnjf0cMmSI1xb0fp+ORW6eeuopIPmz2Y5l3/vipEaNGsW2\nuZlE9tl44403Asnf7f7555+k57lLpFjWUyYOPvhgb1uRHBERERERkRzEIpJz1llnAbB+/fpiH+OW\n7kwXwTFnnHEGkJxneNFFFwHwwAMPAPDss896bVOmTAHgvffeA6Bu3bppj2/lrgudW37b5hmVN1ay\n2F1Yy0qCXn311d4+K3t86623Apnn/9o8FFtIMO4snz9o4bd0822K+39337vvvuvts4WC4zY3xy0V\nWnTB00zngZU3Tz75JJA8b65NmzY5Hatx48aAX472rbfe8tpeffVVwI/KxtWhhx6a8WPdcvBWutzK\nJgctRmt3gLt27ert++abb3LqZxiuuOIKb9vOt6D3rGuuuQZILk+eji3gm0lmS6GwMTDu+5ltu9Fq\nN/oA/meJ+3iL4ASNuS3r4L5Pxu3zweUuUD9s2DAA1q5dW+zj7TPZooWQ3dIqn332WbZdLDFFckRE\nREREJFZ0kSMiIiIiIrESi3S1TNJ+cp3gP3v27MDtomzl9vIwmdlVs2ZNb7t+/foh9iQ8tqqvO5HR\nUhLclEYLpX/11VcbPOaWW26Zsv3GG2+UvLMF4JZbbgGS01iKpl25k7ltRXkbX0s5AH/1cEspdEtP\n24r0Nqn1xBNP9NoK8XVsaQP9+vVLaZs/f37ST4C+ffsWeyy3RGh58NNPPwFw+OGHe/ssTcXKbz/3\n3HNe25w5c5Ke76YgVa9eHYCqVasCMHXqVK+tR48eAEybNq3U+h4VtrI6+KVjTdB71ymnnAIkv5aL\nfoa4aeY2+Xn77bcH/LK04K9eXwhLMpx22mlp2//73/8C6Ze4CGLve7Vr1y72MVbUZd68eVkdOyrc\nAhaW0uimhBddRiDbAjWWHjl58mSvzQoV9O/fH4DFixeX8K/Ir80337zYNvdcsZLc6cqU2/tXpqn2\nlpZrn8Nvv/12Rs8rTYrkiIiIiIhIrFRIRHD1o2xLA1rBgXR/il31AwwfPjynfmXiu+++A6BJkyZp\nH2cFDexuVpBc/mmyHbuScosNuGUFi9p3332B1DugZSWMsbM7t+AvkupOnLVJd+nKjBubgAvw6KOP\nAnDXXXcB2S/qla1COO8yZdE1Kx9qhUEg9c6ee0f58ssvz+n3ZTt2uY6bRVpuv/12b5+Voy0L7t3n\nhx56qNSPH6dz7qSTTgLgwQcfTGmzwhf2flgaojJ2bvTv22+/TWpzy+tbJMai3JtssonXZpGbJ554\nAkj+3LZxDfr8tuNnUlTIFcbYWTQFYMyYMYD/vQH8ghUWXUxns80287ZnzpwJJE8KN2PHjgXgzDPP\nzKHHwcI+7+zccAsPZBPJcfufyfNsEW4rWAO5FyXI59jZOfL777/n9PygPmTaf/seU5pLpWQ7dork\niIiIiIhIrMRiTk4m7C4QlG0kJ1NuaUOJB3dxT5tX4pbAtEXsMtGpUydv2+5cfPjhhyXtYrljc2ve\nf/99AC688EKvbfTo0YB/d8ot624RuKjmX9uCu270xuYjuHe6LHJqOebNmjUr9phW9hxS8/rPO+88\nb9vushfivKV8sCi9RWvcu+f77LMP4M8xy3aRzELlvrZsLo1FcNwytt27dwfgzTffBJLn+RSdl/fl\nl19625lEPaLCnQ9jryu39HG6v8WiQDvssEPS8yH9IqBB0Z1CN2rUKCB5Hl3RpTnsfd9lr7mi5anB\nn9dk0TTwswBsPqebZVEI5aUtOmrzUsGP+jVs2LDUf99xxx3nbUfh/U2RHBERERERiRVd5IiIiIiI\nSKzEIl3NJniefPLJxT7Gndy41VZbAZmvJFwWxo8fH9rvLk1uqoGl9gStUm1pMvkqPBC2Tz75pETP\nr1ixordtY2yrpUvu3FSOomkdb
jqMrXh977335qdjWbLUR7dwyZNPPgnAH3/8kdMx3UnbVhbd7Lbb\nbt72wIEDAT9dRIKde+65AOy9997ePivUcuWVVwLRSOfIB3fisU2EXr58OZCczmdpapYidN9993lt\nLVu2TDqme/5ZqdpC89FHH2X1eEt1s59HH310sY91Py+s9G+c2DnipmHZe7qlorml3TNh6WduGpql\nn1qamvs5MWzYsKTfF0VWmOvrr7/29h1xxBGAX9oZoG7duoD//fjzzz/32izNtlGjRhv8fTNmzPC2\no/C6VCRHRERERERiJRaRnIULF27wMfXq1fO27e6ZXZHnK6LjLsTn3qEqZO7dASsh3bVr15TH2V04\nK4csySzSePDBBwPJkxvN4MGDAXj44Ye9fVaSVjKzZMkSb9sKDkS1FHE6VuTC7jKWhiOPPDJl32+/\n/QbAAw884O0rzd8Zptdff93bXrRoEQB9+vQpteNbIQi3NLktQGsLjLpjbm2Fyv0ctnL5u+yyC5Bc\n6thYKfxXXnnF23fRRRcB/h14d0Hgosd+/vnnS6PbBckKD7Rr167Yx7z88sve9tq1a8u8T/l2+umn\nA8nv37ZdmhFSK2YQ9Dlh3yGjHMkJYt8b3CIBmcikfPMBBxzgbbsLKIdFkRwREREREYmVWERybJFE\ny9G0fMPiWDnFCy64AIBp06Z5bZaLnwl3wU+7q7DlllsW+/hDDjnE2w5zPlAYbL5AeWHzaOxuLqQu\nxuXepXz77bcBaNGiRbHH7N+/P5C8MOOCBQsAv4zmp59+WpJux54bBSt6V8rNQX7qqafy1qeoqFmz\nZsq+O+64A4hG2f3SYndfW7du7e2z16n72rr//vtL5fc1btw4ZZ+VRnbniha6v//+29u2TAUrpR/k\nhBNOAOC2227z9tWpU6fYx8+aNSvpeW7Z5fKiSpUqAEyYMAGAWrVqeW0bbfTvPWubr1d0Xl3cWAnx\nfK1nb78nX78viixzx13ctyh3nnYUKJIjIiIiIiKxooscERERERGJlVikq9kKwSeddBIA//zzj9eW\nrsSilUJ1S1j++OOPGf9edzJl/fr1k9q++uorb/v2228H/NSi8mjAgAFA+SkhbakVVjIW4Jtvvkl6\njJ2v4K+mbOeImwppE3QtPcNNx+zZsycAL7zwAgCdO3f22twVwaPGyoyPHDkSSC7ZaatUW0nykrCU\nQFvJOmjyqO2bPXu2t88tUBB3lra13XbbhdyT/LB/Z3dleXv93XzzzSmPzzVtrU2bNoCfZlqe/Oc/\n/wH80rNBBR2CCtTYa9Emyl988cUpx3RTgMubbt26AX4hHzd16uOPPwagX79++e9YCGwZjvPOO8/b\nZ69je7+3z5Js2ecS+J/hQZ8db7zxRk7HLzT2WrXPiKCUvR9++AGA9957L38dy4AiOSIiIiIiEisV\nEhGcRVXSkq7uQopWEMAttXjMMceU6PhBbGL91VdfDfglSQGWLVuW0zFz+acJsxyulfQMukNnbRZ5\nKGuFMHarV6/2ti3K06BBAwDeeustr83uggYt8ti9e3cApkyZAvgFDMA/593IZibyMXZW7MMiT+7v\ntGiquxDnddddl/GxrQAJ+KU9DzvssJR+2u+0O/vnn3++15brHcBsxy4K5aut5KdbUtnYOWSLNJaV\nMF6vbpEZK93uWrNmTdLjrr/+eq/NCse47/PGFmi1CfWbbrppsX1wF7AeN25cpl1PEuX3uu233x5I\nLvds42Gf0+65ZaVtn3jiCcC/O1xWojx2xi3JawVRateunfI4K6gxderUvPQrKmM3dOhQb9u+f1nf\n3EI+mWQ2WATILedux7K+u3+3fV5nG/mPythlyqLZ9t7m9t++X/Tt2xeAiRMnlmlfsh07RXJERERE\nRCRWYhnJCeJGd3bccUfAn0ez6667em3NmjXb4LHsqvbnn3/29tmVfGnmCxfa1X7Hjh2B4DLciuS
k\nciM5RUvJ2h0TSC57XJSVDbW7fa+99prXZtEPu7sFwdGgovIxdjfddBPgL3C6fv16r83+JnefzUey\nO5ljx4712ixKY3OV7PXt9qvo3TiAL774Iul5pTGHqRAjORZ5DboDbHMaJ0+eXKZ9COP12rBhQ2/b\noqVuqex0ERgrN24lVd3+d+nSBYCqVasW+3ybc+JmFeS6cF4hvNdFVZTHzpbGcBdsLLo4qs3XhOTI\ndz5EZezcyL19PthngPv7rM0WCnU/J4p+dgRF/O1zyf08vfzyy3Pqc1TGLh03CmYLvVeuXBlI7r9F\nvIMW/C0LiuSIiIiIiEi5poscERERERGJlXKTrlaICiGk6bKSqe6keWOrYbdq1Qrwy12WlUIYO7dE\nsq30bePiFsrIZAVhS8d0Q+lWftUNy1t6TTr5GLu6desC8Msvv6T8zqAJnunSztJNDC26zy1gYNsW\nbi8NhZiu9uCDDwLJJc2NlVJ107jKQlRer+5yAnfeeecGHx+UWpkJm2C/7777ZvW8IFEZu0IUxbGz\nNMn58+cDyWmVdp5ZsQYrPAOwww47ADBv3rwy7Z+J4thZCWlbqsKWKoDsPieCPl8szc19n8z1syOK\nY1fUsGHDvO2rrroqqQ9u/x9//HEgOa2yLCldTUREREREyrVYLAYq0bfxxhsD/p1PgQsvvNDbvvba\nawF/In4m0RuXlXF0y2m++OKLQHCZ27BZoQ4rwWmLmQLsvffeQPLd8aJ3sdIt6umW87QiGCeeeGJp\ndLvcWLVqFVB+FrszdrcW/JKozZs3T3lcNpNs3RLuVlY6kyiRlB8WhQE/smqFkdz3wZUrVwJ+4RX7\nf8hfBCfKrOR4hw4dgOSy7FagoGjxBkj/+WILVR955JGl2teosqySXXbZJaPHW0GHqNI3ThERERER\niRVd5IiIiIiISKyo8ECEFcLkNFfjxo0Bf62WbbbZxmu79NJLAX/V8LI+7Qpt7KIkjLGzQgQAJ5xw\nAuCv4A3Qtm3bpL4FrX9g69zcd999XtuCBQtK1K9sFWLhAUtxdItWnHPOOYC/VkdZK4TXa5UqVbzt\nadOmAXDggQcCwYUHbC2da665xtv32GOPlXq/CmHsoioqY2dFewBmz56d1OaubXbzzTcD/B97dx4o\n1fz/cfwZKbJnL1mS7FvZl0IUqUhCluxbJFshe2QptNkptChb9l1U+FLIlpCvJXwr2UlC6feH3/tz\nPnPn3Lkz585y5szr8c89nc/cmc/9dObMnPN+f94fLr300rz3IVdxGbtsde7cGYC2bdum/BuClDTj\npzzHoUANFG/shg0bBqSuwVS1DzNnznT7LJ031+IrUanwgIiIiIiIVDRFcmIszlf7caexi05jF105\nRnLiQMdcdBq76OIydn6U0IpTnHDCCQAceOCBru2pp57K+2tHFZexK0dxHrvrr78eCIoghfVh2rRp\nbp8VCioWRXJERERERKSiKZITY3G+2o87jV10GrvoFMmJRsdcdBq76DR20Wnsoovz2K288spA6jyl\n1q1bp/ShW7durs0WAy0WRXJERERERKSi6SJHREREREQSRelqMRbnkGbcaeyi09hFp3S1aHTMRaex\ni05jF53GLjqNXXRKVxMRERERkYoWy0iOiIiIiIhIVIrkiIiIiIhIougiR0REREREEkUXOSIiIiIi\nkii6yBERERERkUTRRY6IiIiIiCSKLnJERERERCRRdJEjIiIiIiKJooscERERERFJFF3kiIiIiIhI\nougiR0REREREEkUXOSIiIiIikii6yBERERERkUSpW+oOhKlTp06puxALS5Ysyfl3NHb/0thFp7GL\nLtex07j9S8dcdBq76DR20WnsotPYRZfr2CmSIyIiIiIiiaKLHBERERERSRRd5IiIiIiISKLoIkdE\nRERERBIlloUHREREJHnOPvtsAK6//noA1l57bdf23XfflaR
PIpJMiuSIiIiIiEiiKJIjIiIiBdO5\nc2e3fcEFFwBBKVi/7Y477ihux0Qk0RTJERERERGRRFEkRwpm1113ddsTJkwAoF69egBsvvnmru2T\nTz4pbsfK3DrrrOO2GzZsCMCiRYsAjaWUlr2/V1hhBbdv4cKFACxYsKAkfaqt6667DoA+ffq4ff36\n9QPgsssuq/H3jzzySLc9evTolDb/OQcOHFirfsZRy5YtAbjtttvcvjXWWAMIIjmTJ08ufsdEpCIo\nkiMiIiIiIomiixwREREREUmUikxXa9SoEQDHHHMMAF26dHFt2223Xcpjl1oquA78559/AHjnnXcA\nuOaaa1zbww8/XJjOlrHWrVu77WWWWQYIUhQkd82aNQPg5Zdfdvssde3vv/8G4NZbb3Vt55xzThF7\nVxxNmjQBUsdgo402qtVzvv322wDssccebt8ff/xRq+csJ8sttxwAjRs3Tmv78ssvgSAdMkyDBg3c\n9nHHHQfA0KFD3b4BAwYAcOGFF9a6r8XUsWNHAM477zwg9dzVvXt3AH7//fe031t99dUB6NWrFwBL\nL720a7Pn+OuvvwCYOnVqvrsdK08//TQAq622mttnY3DUUUcB8PHHHxe/Y5JI9n3Nvm/Yexfgqquu\nqvb3LM17r732AmDOnDmF6qIUmSI5IiIiIiKSKImM5HTr1s1t+3e2jV3t+3cgTdVIg0Vv/LZtt90W\ngDFjxrg2u0t50EEHAfDNN99E6rtUjlatWrntBx98EAiOsbvvvjvtcVtuuSWQOqnbHm93rk477TTX\ntvXWWwOwzz775L3vxWZ3w2+55RYAmjZt6tpqGx1s0aIFEEQ0oLIiOR06dABg3LhxaW1W3vfxxx+v\n9vf9EsB+BKcc+ZP/Tz/9dADq1KmT9rj1118fgGuvvTan57ciDJ06dQJg0qRJkfoZR8svv7zbtghV\n1SIDAEOGDAFg7NixRexd+bBj68QTTwSCMYQg6vXII48AQdQQYNSoUUBQhnvw4MGF72wMLLvssm7b\n3rNhRTwyfU40b94cCAok+Z/N33//fV76WS7sszXse8NWW22Vtu+DDz4A4L777gNg/vz5Bexd7hTJ\nERERERGRRElkJGeHHXZw2/5d73yrWzcYPovuHHbYYQDccMMNBXvdcrHZZpuVuguxtMoqqwCp0Rq7\nI2d3m3r37p32e7NnzwbghBNOSGuzUrb+mFvefxLY33XAAQeUuCfJYefGqHO3VlppJQB23nnnjI/b\naaedIj1/Mdmd3KOPPtrt8+8QQ3D3HILojkXun3nmGdc2a9asan/vrbfeAuCnn37KR7djxY/obbLJ\nJkBwPpsxY4Zru/rqq4vbsTJjUTCbx+RHEm08LXPEn+tkj7OxTzqbQ/jCCy+4fZtuumm1j//tt9+A\nYP6qZT8ArLjiiim/b5/RkOxIjn+Ou+iiiwA4/PDDgdznutocbIumAfz888+17WKtKZIjIiIiIiKJ\nooscERERERFJlESmqx1yyCFZPc7Cl346ga1unYlNVD7jjDPcPkvJuPzyywHYc889XZuVIq00/krf\nFma30ozluvp5bey4445AUMrSJpiG8VPZPv/885R9c+fOTXv8lVdembbvs88+i97ZmLFUvYkTJwKw\nyy67uDZL0/jhhx8AGDFiRNrvnXXWWQBsvPHGBe9rudhvv/2A4Lj02XnQT8OqylIHe/TokdbmH3vH\nH398rfpZKJaiBvDss88CsOaaa6Y9zsqV+wVtjKUL+elnVlyg0vTt29dt23vSftpyDZDs9J98sEID\nt99+OxAsWeGz4gI33nij22fnuDvvvLPQXSwZv8y9pamFpajZ+/Gee+5x+wYNGgQERaH8z99XXnkF\ngHXXXRdIXU7gv//9bz66HksXXHCB27Z0tTBPPvkkAM8//zwAJ598smuzgkh2fvQL+Bx88MH562xE\niuSIiIiIiEiiJCqS07V
rVyC15GKYX3/9FYBTTz0VgAceeCCn17GFo+zuHwR3jy1q40dyevbsCcCw\nYcNyep0ksmjE119/XeKeFF/79u0BaNOmTVrba6+9BgR3Q/73v//l9NwNGzYEUiep/vjjj5H6GUf2\nt+y9994AtG3b1rXZBNKnnnqq2t+3O3tW5tJnEV2/XHzSWAlu/1znjyEEi3ZCUMjCJun6bJLu2Wef\nXe3r+XeYbUHRuPEj8RtssEFau93xtb/l0EMPdW32HrY7y1OmTHFt9vnw6quvAvH9+/PFCg74E94t\ncm9ZElrwM3tVSx1b1KYm3333HZDMSJmdv/yS7WERHPvc3H333YH0IiA+v83GzCI5FtlJKvsO7Jd9\nNzYufiERKxO9ePFiIIgyQrAky6effgoEC6r6+vTpA0C/fv3cvpdeeglIzfgpREEWRXJERERERCRR\ndJEjIiIiIiKJkqh0NSsI4Nc/D/Piiy8CuaepVeWH1mztkueeew6A7bbbzrVZiPWrr75y+x577LFa\nvXacHXXUUdW23XbbbUXsSbx8+OGHADz44IMATJ8+3bVZMYJc2arYtmaJn+pw//33R3rOcmATIGti\nKVlhk+MtZG9h+TjU9M83S/OwAgz+Cul2rFgK36233uraqqap+ZNJn376aQBatmyZ9no2kTXbFJtS\n6N+/PxCkK1fH1sqwc5Y/6bmqsNXBrUCIPxZWmCZJbHKxnypr6T/ZFgGqjp+SZBPrbc2syZMnu7Zr\nrrkGSEZBG0tzrCntHlInyFvhlSSmq9l70U9tCmP//1VT/mpiY2cGDhzoti2d2da4uvnmm12bpW+V\ng7XWWstt169fH0h9z1q6o30evvvuu9U+V1gas73X/bRVe49uv/32ANSrV8+12fpqfrEXpauJiIiI\niIjUIFGRnNNOO63atvfff99t+yuy5otNjLar/Lvuusu12VVzo0aN8v66cWR32iSVRXDsZz4cccQR\nQBC9nDBhgmubOXNm3l6nHKywwgpAarlaK4v5xRdfAMFqzhCU5rY7dOXOzjOrrrqq22fRGYvg2F1J\nCCaPnn/++dU+p01M9SNndgfOvPfee2575MiRQLyLOJxyyikA1K2b+eNv/vz5AIwbNw6Ae++9t9rH\n+tFru9tskR+/tLIVzmjVqlWu3Y4Vv9zsQQcdBKTePb/66qtr9fyjRo1KeW6ABg0apLyOTS6H4DOn\ntpGjOLCxs0IqfjQrUwEHK2du73U/c6TcWcTEL6ZjxXZ8tkSAld22MtM+ex9PmjTJ7bvllluAoKDI\ngQcemPZ79h4fPXq021dOxX0uvvhit23fFyx6A8GSAJkiOJn88ssvaa+TackGG0cr1FIoiuSIiIiI\niEiiJCqSk4mfb+5fveabLZo0bdo0t8/mCiWd5Xf6eZfGcjPtal+is4VnATbffPOUNn8huEWLFhWt\nT6W0xRZbAMHcCT9P2qI0tjhlbefhxYW9x5o1a+b2nXPOOQAcd9xxaY+fMWMGAIcddljavjCWA29z\nTcIWDLW7pZ06dXL7vv322+z+gBK49NJLAVh55ZXT2qz07NixY90+m0uTzWKA/qJ6VjbZFlT1ozYW\nBbvkkkuA8EV8y0GTJk3ctkVY/GUBxowZk/Vz+WVsbRwtGuZHh/z5A1X/7Ze7LXc2p8b+Pv/4yRTJ\nsTk8SYzk2HxJP7Jn3+ns/O+zaLZf9t3YPn9O17LLLltjH6yMcrl9rtpinf4CnsafA1PbjAabA/RI\nTwAAIABJREFUZx4WvbFj2uasQ+YMgnxSJEdERERERBJFFzkiIiIiIpIoFZOu9tFHHxXldSws54fu\nLV0tm5BoOevSpQsQvhK6hXp///33ovYpSSzsbBNSIUgrstB7tqWVk8TSM3bbbbdqH7PVVlsVqzsF\nZRNGLT3Hyj/7/BKyln513nnnAZlTLWziMgST7W2ivM/SAm2CtJWnjjs7P1lZ7Tlz5rg2G
898FKGY\nMmUKAPvttx+QWmjEJvfaCuB+MYNySi/yi8tYSpmf/pNLGWM/1e/CCy9MeU4/Xc1WobfPcr/EsqUx\n+WlrljZYrjKVQa6amlbT45PCymtDkEI7ePBgt89Subfeeusanyvb4h/2mWrFCP7888/sOhsTdr6r\naWmVXPhln7t27QrA+uuvX+3j7b179NFH560P2VIkR0REREREEqViIjk20ROgXbt2Je/DoEGDStKH\nQmrdujUQTJhcaqngGrrqpNGks8mQ/kRJm6BtC2P5bKys9O6bb77p2my7W7duQOodd5uQaYsM2gKX\nlcRfEK86Fl3s2bOn2/fHH38UrE+FYmXyM50//IICL730EgDt27cHUhegtQVk//rrLyD1jmhYBMfY\nc9hk/XLRoUMHAIYPHw6klnYuRBnxhQsXAkEBAggiOTbZvnfv3q7NPzbjzn/PWQQh17LRNhHaL0dt\nz2WfF/5z+p+fkBrJsYVpbWFSKN9Izttvvw0Ed8jDslDWW2+9lJ+Q7MVAw9j520rCQ7CMgBX48JcM\nsGPDsh+yZQvNllsEx9iixBZhhiDi5UcCrQiKLZZs5y+fRVr94jZ+8Zuqv2fP5Rf+KjZFckRERERE\nJFHqLIlhImfUu/5Wntiu5qtz1VVXAXDZZZdFep1sjB8/3m1bLucTTzzh9vl3+KsT5b+m2BETf6zt\n77O7fH5fLIfT7jYVWjHHzhag69Gjh9tnUa1s+2Gvnc3j/X7aAmV+6dvaKofjzmf51yNGjAAyl2xv\n3ry5286mNHCuch27XMfN7tL6C37mYt68eW7bFg+1eTp+hLAqfy7PjTfeCASLzd59992R+uIrt2Mu\nF/5cTJufYxEdf9FUy1fP9b1czLGzOS8PPfRQ2uvXtLhqVRbJ8e/y2nPZgr5+NGbBggUpv+9HcqZO\nnQoEuf8A3bt3r7EP5XrcWeTKvztv/dphhx2A1GUsCqEcxs4vL23HRq6RHPuemM9y76UYuxVXXNFt\n2+dI2Dwdawvro0V+wvpiS7P4C5LbYuX5lOvYKZIjIiIiIiKJooscERERERFJlEQVHvjPf/4DwL77\n7pvxcZbiY+U7P//887z3xQ+p2ba/2mtS+KuHV50A7k9yTmLpaEvdGDlyJBCsQg9B6NY/Diytxybm\nWZleCFYe7tevHwAnnXRSVn0ol/K9hfT+++8DcOKJJwKp47r22msDwSR7W/Ueggmr5VSAwFLK/DSn\nXPilPzOx4gU28dtPQfBT3qRm/kRcK0Jg6Wp+cZamTZsWt2MRbL755kDtUmes1Kyl//hpaJZilk3R\nAEvZgmACfrkWG4jK/3+IQxpd3PTq1cttZ0pTs9TlqpPoIViawNJ7y7UAwW+//ea2BwwYAMCpp57q\n9tlni1+MIBcnnHACAE8++WTULhaEIjkiIiIiIpIoiYrk2BVkTZEcu5NkCwTmM5JjpfXatm2bt+cs\nVwMHDnTbYeUIy5EVGYD0CI4/ATvbSIyxCIO/mF02jjzySABef/11ICgHXIneffddAPbcc0+37+mn\nnwZg2223BYIoLsDFF18MlNcijFaG/KabbgLgzjvvdG0bbLABECwaC3DFFVcA8MYbbwCZy8vagr0Q\nHI+PP/54HnpdOptssonbtnLr3377bam6w8cff1xtm5W7tbKrcWQRxLBMhWeeecbtsyIKYcebRVzt\nzvE777zj2rKJxNg50halhSDyWGmRnBjWjYoFK+x06KGHVvuYu+66y21bcQErTuBHVe273C677ALA\nxIkT89rXUrBy7EOGDHH7zjzzzJTH2HcLgA033LDa57IFWv2FWuNEkRwREREREUmURJWQ3n///QF4\n4IEH3L7llluu2sfbVftee+0V6fXCPP/880DqYnqW82+LewE8++yzNT5XnEs02l3jhx9+2O3bZptt\nUh6Ta0nRfCrU2NniigCtWrUCggjOGWec4doy5e02b
hTd5IiIiIiISFB0kyMiIiIiIkHRTY6IiIiIiARFNzkiIiIiIhIU3eSIiIiIiEhQdJMjIiIi\nIiJB0U2OiIiIiIgERTc5IiIiIiISFN3kiIiIiIhIUHSTIyIiIiIiQdFNjoiIiIiIBOX/AazvmSKI\nVIPXAAAAAElFTkSuQmCC\n", - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], - "source": [ - "# takes 5-10 seconds to execute this\n", - "show_MNIST(test_lbl, test_img)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Let's have a look at the average of all the images of training and testing data." + "Let's first see what AdaBoost is exactly:" ] }, { "cell_type": "code", - "execution_count": 50, + "execution_count": 3, "metadata": {}, "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Average of all images in training dataset.\n", - "Digit 0 : 5923 images.\n", - "Digit 1 : 6742 images.\n", - "Digit 2 : 5958 images.\n", - "Digit 3 : 6131 images.\n", - "Digit 4 : 5842 images.\n", - "Digit 5 : 5421 images.\n", - "Digit 6 : 5918 images.\n", - "Digit 7 : 6265 images.\n", - "Digit 8 : 5851 images.\n", - "Digit 9 : 5949 images.\n" - ] - }, - { - "data": { - "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAzk...(base64 PNG data truncated)...AAAABJRU5ErkJggg==\n",
-      "text/plain": [
-       ""
-      ]
-     },
-     "metadata": {},
-     "output_type": "display_data"
-    },
-    {
-     "name": "stdout",
-     "output_type": "stream",
-     "text": [
-      "Average of all images in testing dataset.\n",
-      "Digit 0 : 980 images.\n",
-      "Digit 1 : 1135 images.\n",
-      "Digit 2 : 1032 images.\n",
-      "Digit 3 : 1010 images.\n",
-      "Digit 4 : 982 images.\n",
-      "Digit 5 : 892 images.\n",
-      "Digit 6 : 958 images.\n",
-      "Digit 7 : 1028 images.\n",
-      "Digit 8 : 974 images.\n",
-      "Digit 9 : 1009 images.\n"
-     ]
-    },
    {
     "data": {
-     "image/png": "iVBORw0KGgoAAAANSUhEUgAAAzk...
8xMRBo6CJuTtd6\nWLhC+wuvgeEHWr6Yn9NQDIbEUE6xXcPZZ1k2Wl/Tza56x+8MkyOBbMvPhsQSZHlMZUTd0WukflC+\n2tcoe4YsaN/n+1jgQHUolJd+Z17KRceKprC/MtRHyxPzPmuxgTCsSvsOw1uoV5oATrmyT+pYz9+J\nhSnlSedC2E72P92SgWFjmgBOuXDe1TAjyjVWLIXjAscKDadhQv3rr78OAFi5cmXpHOdfHTfzMq+E\nIU276yPUG8qMoXtAqrN8j4ZjhQUHNGwoLDiQl8Igu4Oy43OZzrGcT1W32N+pnwxRA+qH7+l3A+nY\nEIYK6muOq7FtD/II72csVCzsz0AqOz7j8LkGSOXP5z59tuNzH+cKHUPLjWnhdi27et//SmXcLWOM\nMcYYY4xpJBXryVHrNhOfaNVQjwlX+1rel1Y4emlo0QXS1Swt8GrtC70XakV5+eWXAQBLliwBUD9R\nl+3TNufNahJLNqOFTa0nlAHlpAmo3DiQK3m1FtE6ECuTGkveq0Q04TssMQukVkpawFUfKDPqilow\naW2JlVumjHkf1DITeiqA1PLfkp7EmBeMesR+pmWPKUfVEVqOYomz1E9a6NWTM3z4cADp/aCOAml5\n0VBv9ffyYg0mYRJ8zMujcqMVkmOeWij5WeqhjpuUL71DsUTlvGxU2VgoM45x9PwBDTeoBBr2Kb1O\nWn7pwVFPDmXN74r1ZcpV556YZywvhLqipXnpQVDLL/sboyY0OZzjQWyDR/ZvWt1feeWV0rmFCxcC\nSOdYRkgAqYzVG5sXvQzbof2T167zLsdGjme60SzPxTxqfOZgJEU53cpjojyJyYdzh45fsaI17OOx\neYIy5nON6mvYZ8uhY265ojV5k+vuPIjhhr9arjuMQolFmtBzqNEF5eTZUnKyJ8cYY4wxxhhTKCrO\nkxOL12csIDepUwsdV6CaW0OrEnNsynlY1OJN69v7778PAFi8eHHpHK1LLB2tli6uevNooQutabGN\n67RsMs9TLrEYaFpA9XppHQiteJVCWI4RSC3ftKapJZyWRZUdrWmUj24OS32hd1Ctv9T1WMljfidz\nJjQOmxY9PUbrHi1cLWFt4m+oxTYsya2l2ll2VnPBaEGiNS62QS37unqF6PWiLNT6y7w55hTo/aOl\nNC85JiSM649tihrb8DS2MSWhTmifDL2NsZLdu/p/HqB8Ynlv4SafQOqJUZ2jjlHGam3na3pnVR/5\nm/QQal9mP2ff17ZkuWHe7mA7OBbpmLJixQoA9cdGXhfn5kGDBpXOUdb8Tv0ueiOYd8McV6DhVgzq\nqYh5NvNCmHsQ65+aZ0n58DlGdZLfwfFe80xYRptz8u7KROeVcmWGY9uD6HsoT50DCOcVjoU6NxPO\n5Sq70Cu0u41U8yLj2MbrJPTaAKmHi7qoW6xQByl/nSupb3w+1nEgD/3RnhxjjDHGGGNMofAixxhj\njDHGGFMoKi5cLdx5GUhDURh+oiEZfL+6GFmEgAnLWsIyLLv7wQcflM7Rlc4dhZlwCaQhbHSla7J3\nnkuDhi50dV/ytSa681r0+ghlx3ujMt/V71YKvG7dYZ6hJ5SdXi9DVDScgPKke1fDNBjaQr3T32GY\nTcw1zhCXsMQy0LC0OpC63lsyTI1yKVcUQWXHhNKBAweWjjH5ln1WS05TrnSN6/UyPO21114DkBYI\nAYBly5YBSENq1AUfCx3KO7GyyZSlFsUg1AXeFw2j4f1gX44VqihXSrW5y4I2Fg2ZZZ/i/eaYDaTh\nLVrynaV7mUSv8wRlTLmo7nDOYN/XeYLlfRki2dhyq1nDNsXGf/YVlQGv85lnngEQT/Lmd6kM+Jrj\noJ7j/JLnsL4Y4RyrYY+UhYYGMUyNY50+z7CvcozT4keUFeVUKWWiyxEWWtHUgrDEO5DKlikJQ4cO\nLZ1jn6VcNNyRYwK/U0vmh7qoxS3yFgYYK+
gU+z+f6VS3OL5xvNPS3Hwf5wOVOcfVWIh3GFKYxXOf\nPTnGGGOMMcaYQlFxnhyiq0WuJFl4QK1M9KzwHJAWIWCJRl3N8rNc2TPZUb+DCX6aOE4rFle6sdVs\nHglX2NpWWnPV4kEPBVfyaiGmHGOlU/kdtMhUmpUpLKqgx0ILMZAmzJaz/qpcw3K16vXg71Cn1IIe\nlvNVi3u5RMmWSAjk/QwLLihsT6wstlrM6HVgn9Xylrw+WjVpRQZS7y6t6Wq9Z/+NbQaaxyIhSrm+\nop4cypXXoxa4UAf0O0OLnepVY2STl76s18g+xX4aK12s3ojBgwcDSK3CatkkMY8/5wkWodG5J9zm\nIG9Jursj9Ojoa9UtXh89VirrcsnkoWU85q3Ji241lnKbIjP5XXWLid/q8SGcM/hX59hwM/Ryc2yl\nyDDctFKvl/04tsUF5akeROog5aTbCXDO4HyhW4DweY/zfMyTUwnENonWZ1/OqbHtBkIPbqwwA+US\nK1KTZVlte3KMMcYYY4wxhcKLHGOMMcYYY0yhqNhwNYWuM7oyNfGYYQQLFiwoHWPScrkdc+l607Ch\ncon1eUtAayxsd2y3bbohNfkzTDbWMCO612PFIcJa85USzkdiBSzCsDMNX1y9ejWAeCJgzHVbLkwj\nbEO5fQTyGN4R22MoDHFRHWOxgEWLFpWOMYSDfVdDOcIEeg1pYNhMmBwJNNT5Sgo9iMH7rH2YMqcc\nNKSI4x73htD9OMLxTwthMGQhtqt4rDhGXggT3TWsgvJhvwUa7lOiOhfuG6M6x3GAhUViu4NXWvJ8\nY9D2Uw/yHvbZXOi4zDkzFiLE8UzDmvk+jkfaZ8NnnVhhhnIh4ZVW8CeUAccqIF7wgs97LA6l++Vw\nvOP4pUUbGJIW21uOY2Ce92IisTEklvRPHYsVmiI6rnMMowx07OR9KPdcHNO7lhrv7MkxxhhjjDHG\nFIpWSQ7NR81hbWjqd2Ypnqb8dqVZapqLSpBdXsrshrSE7ML3x5IiY8nK5SxCYREGoKElubkt53v6\nnXtT5/hd6l0Ny9fGPIuhpRlomGCucgw9X3tDjln0V/08vTTqrWHircqFUC60Wqp8QotvzGO7N6mE\nsS6vNJfsYp4cenDolQbS0tG9e/ducIxJ81pkhbpIzyo9D0DqmaAnUb08YVL43hgHs+6zRPsnX8fm\nkDAKRfss+3FLeWtaUnaxeYFjmxaOol6yaIN69vk+jo+xqIzY1hiMcgk92EDTIyj2VHb25BhjjDHG\nGGMKxTfGk1OJ2ELXdCy7pmPZNZ0sPTmVTJ51LvY7jcmFaynyLLu805Jea+Y8qLeQuTjqreEx/tWN\nj8OtAjTHkN4d5pxoLk9oNd8bngrrXdPJ2gtGr5Z6t0L9VM9PuOG2esH4vWF+t74vpndN1UF7cowx\nxhhjjDHfaLzIMcYYY4wxxhQKh6vlGLuDm45l13Qsu6bjcLWmYZ1rOpZd08laduW+K1ZsJSwWopQr\nwBK+Z2+QtewqGcuu6ThczRhjjDHGGPONJpeeHGOMMcYYY4xpKvbkGGOMMcYYYwqFFznGGGOMMcaY\nQuFFjjHGGGOMMaZQeJFjjDHGGGOMKRRe5BhjjDHGGGMKhRc5xhhjjDHGmELhRY4xxhhjjDGmUHiR\nY4wxxhhjjCkUXuQYY4wxxhhjCoUXOcYYY4wxxphC4UWOMcYYY4wxplB4kWOMMcYYY4wpFF7kGGOM\nMcYYYwqFFznGGGOMMcaYQuFFjjHGGGOMMaZQeJFjjDHGGGOMKRRe5BhjjDHGGGMKhRc5xhhjjDHG\nmELhRY4xxhhjjDGmUHiRY4wxxhhjjCkUXuQYY4wxxhhjCoUXOcYYY4wxxphC4UWOMcYYY4wxplB4\nkWOMMcYYY4wpFF7kGGOMMcYYYwqFFznGGGOMMcaYQuFFjjHGGGOMMaZQeJFjjDHGGGOMKRRe5Bhj\njDHGGG
MKhRc5xhhjjDHGmELhRY4xxhhjjDGmUHiRY4wxxhhjjCkUXuQYY4wxxhhjCsX/A72wAtv5\nJA/kAAAAAElFTkSuQmCC\n", + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def AdaBoost(L, K):\n",
    +       "    """[Figure 18.34]"""\n",
    +       "    def train(dataset):\n",
    +       "        examples, target = dataset.examples, dataset.target\n",
    +       "        N = len(examples)\n",
    +       "        epsilon = 1. / (2 * N)\n",
    +       "        w = [1. / N] * N\n",
    +       "        h, z = [], []\n",
    +       "        for k in range(K):\n",
    +       "            h_k = L(dataset, w)\n",
    +       "            h.append(h_k)\n",
    +       "            error = sum(weight for example, weight in zip(examples, w)\n",
    +       "                        if example[target] != h_k(example))\n",
    +       "            # Avoid divide-by-0 from either 0% or 100% error rates:\n",
    +       "            error = clip(error, epsilon, 1 - epsilon)\n",
    +       "            for j, example in enumerate(examples):\n",
    +       "                if example[target] == h_k(example):\n",
    +       "                    w[j] *= error / (1. - error)\n",
    +       "            w = normalize(w)\n",
    +       "            z.append(math.log((1. - error) / error))\n",
    +       "        return WeightedMajority(h, z)\n",
    +       "    return train\n",
    +       "
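The boosting loop in the `AdaBoost` listing above can be exercised end to end on a toy one-dimensional problem. The sketch below only mirrors the listing and is self-contained: the decision-stump weak learner, the dataset, and every name in it (`adaboost`, `stump_learner`, `weighted_majority`) are invented for illustration and are not this repository's code.

```python
import math

def adaboost(examples, labels, weak_learner, K):
    """Mirror of the boosting loop: reweight examples, collect K hypotheses."""
    N = len(examples)
    eps = 1.0 / (2 * N)
    w = [1.0 / N] * N
    h, z = [], []
    for _ in range(K):
        h_k = weak_learner(examples, labels, w)
        error = sum(wi for xi, yi, wi in zip(examples, labels, w) if h_k(xi) != yi)
        error = min(max(error, eps), 1 - eps)      # clip away 0% / 100% error
        w = [wi * error / (1 - error) if h_k(xi) == yi else wi
             for xi, yi, wi in zip(examples, labels, w)]
        total = sum(w)
        w = [wi / total for wi in w]               # normalize
        h.append(h_k)
        z.append(math.log((1 - error) / error))

    def weighted_majority(x):                      # vote the K hypotheses by weight z
        score = sum(zk if hk(x) == 1 else -zk for hk, zk in zip(h, z))
        return 1 if score >= 0 else 0
    return weighted_majority

def stump_learner(examples, labels, w):
    """Weak learner: the threshold/polarity stump with lowest weighted error."""
    best_err, best = float('inf'), None
    for t in examples:
        for polarity in (0, 1):
            def pred(x, t=t, polarity=polarity):
                return polarity if x >= t else 1 - polarity
            err = sum(wi for xi, yi, wi in zip(examples, labels, w) if pred(xi) != yi)
            if err < best_err:
                best_err, best = err, pred
    return best

xs = [1, 2, 3, 4, 5, 6]
ys = [0, 0, 0, 1, 1, 1]
clf = adaboost(xs, ys, stump_learner, K=3)
print([clf(x) for x in xs])  # [0, 0, 0, 1, 1, 1]
```

On this separable toy data the first stump is already perfect, so the clip to `eps` is what keeps the log-odds weight `z` finite — the same reason the real listing clips `error`.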
    \n", + "\n", + "\n" + ], "text/plain": [ - "" + "" ] }, "metadata": {}, @@ -1939,292 +2120,114 @@ } ], "source": [ - "print(\"Average of all images in training dataset.\")\n", - "show_ave_MNIST(train_lbl, train_img)\n", - "\n", - "print(\"Average of all images in testing dataset.\")\n", - "show_ave_MNIST(test_lbl, test_img)" + "psource(AdaBoost)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Testing\n", + "AdaBoost takes as inputs: **L** and *K* where **L** is the learner and *K* is the number of hypotheses to be generated. The learner **L** takes in as inputs: a dataset and the weights associated with the examples in the dataset. But the `PerceptronLearner` doesnot handle weights and only takes a dataset as its input. \n", + "To remedy that we will give as input to the PerceptronLearner a modified dataset in which the examples will be repeated according to the weights associated to them. Intuitively, what this will do is force the learner to repeatedly learn the same example again and again until it can classify it correctly. \n", "\n", - "Now, let us convert this raw data into `DataSet.examples` to run our algorithms defined in `learning.py`. Every image is represented by 784 numbers (28x28 pixels) and we append them with its label or class to make them work with our implementations in learning module." - ] - }, - { - "cell_type": "code", - "execution_count": 51, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "(60000, 784) (60000,)\n", - "(60000, 785)\n" - ] - } - ], - "source": [ - "print(train_img.shape, train_lbl.shape)\n", - "temp_train_lbl = train_lbl.reshape((60000,1))\n", - "training_examples = np.hstack((train_img, temp_train_lbl))\n", - "print(training_examples.shape)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Now, we will initialize a DataSet with our training examples, so we can use it in our algorithms." 
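The weight-to-repetition trick described above can be sketched in a few lines: turn per-example weights into a dataset in which each example appears roughly in proportion to its weight, so a weight-unaware learner such as `PerceptronLearner` can still be boosted. The name `weighted_replicate` and the exact rounding scheme here are assumptions for illustration, not necessarily the repository's implementation.

```python
def weighted_replicate(seq, weights, n):
    """Return n elements of seq, each repeated roughly in proportion to its weight."""
    assert len(seq) == len(weights)
    total = sum(weights)
    fractions = [w / total * n for w in weights]
    wholes = [int(f) for f in fractions]                 # guaranteed repetitions
    leftovers = [f - k for f, k in zip(fractions, wholes)]
    result = [x for x, count in zip(seq, wholes) for _ in range(count)]
    # Fill the remaining slots with the examples carrying the largest
    # fractional parts, so the output has exactly n elements.
    remainder = sorted(range(len(seq)), key=lambda i: -leftovers[i])[:n - len(result)]
    return result + [seq[i] for i in remainder]

examples = ['a', 'b', 'c']
print(weighted_replicate(examples, [0.5, 0.25, 0.25], 8))
# ['a', 'a', 'a', 'a', 'b', 'b', 'c', 'c']
```

Training on the replicated list is what forces the learner to see a misclassified (heavily weighted) example again and again until it gets it right.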
+ "To convert `PerceptronLearner` so that it can take weights as input too, we will have to pass it through the **`WeightedLearner`** function." ] }, { "cell_type": "code", - "execution_count": 52, + "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ - "# takes ~10 seconds to execute this\n", - "MNIST_DataSet = DataSet(examples=training_examples, distance=manhattan_distance)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Moving forward we can use `MNIST_DataSet` to test our algorithms." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Plurality Learner\n", - "\n", - "The Plurality Learner always returns the class with the most training samples. In this case, `1`." - ] - }, - { - "cell_type": "code", - "execution_count": 53, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "1\n" - ] - } - ], - "source": [ - "pL = PluralityLearner(MNIST_DataSet)\n", - "print(pL(177))" - ] - }, - { - "cell_type": "code", - "execution_count": 54, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Actual class of test image: 8\n" - ] - }, - { - "data": { - "text/plain": [ - "" - ] - }, - "execution_count": 54, - "metadata": {}, - "output_type": "execute_result" - }, - { - "data": { - "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAP8AAAD8CAYAAAC4nHJkAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAADcpJREFUeJzt3V+oXfWZxvHnMW0vTHuhSUyCjZNOkSSDF3Y8yoA6OhTz\nZyjEhlQaZJIypSlaYSpzMTEKFYZjwmAy06vCKYYm0NoWco6GprYNMhgHiiYGqTYnbaVk2kxC/mCh\nlghF887FWSnHePZvney99l47eb8fkP3n3Wuvlx2fs9bev7XWzxEhAPlc03YDANpB+IGkCD+QFOEH\nkiL8QFKEH0iK8ANJEX4gKcIPJPWRQa7MNocTAn0WEZ7N63ra8ttebftXtt+yvaWX9wIwWO722H7b\ncyT9WtJ9kk5IOiRpQ0QcLSzDlh/os0Fs+e+Q9FZE/DYi/izp+5LW9vB+AAaol/DfKOn30x6fqJ77\nANubbR+2fbiHdQFoWC8/+M20a/Gh3fqIGJM0JrHbDwyTXrb8JyQtmfb4k5JO9tYOgEHpJfyHJN1s\n+1O2Pybpi5L2NdMWgH7rerc/It6z/Yikn0qaI2lXRPyysc4A9FXXQ31drYzv/EDfDeQgHwBXLsIP\nJEX4gaQIP5AU4QeSIvxAUoQfSIrwA0kRfiApwg8kRfiBpAg/kBThB5Ii/EBShB9IivADSRF+ICnC\nDyRF+IGkCD+QFOEHkiL8QFKEH0iK8ANJEX4gKcIPJEX4gaQIP5AU4QeS6nqKbkmyfVzSO5Lel/Re\nRIw00RSas2DBgmL9pZdeKtaXLVtWrNvlCWEnJyc71sbHx4vLbtu2rVg/f/58sY6ynsJf+YeIONfA\n+wAYIHb7gaR6DX9I+pnt12xvbqIhAIPR627/nRFx0vYNkg7YPhYRB6e/oPqjwB8GYMj0tOWPiJPV\n7RlJE5LumOE1YxExwo+BwHDpOvy259r+xMX7klZKerOpxgD0Vy+7/QslTVRDPR+R9L2I+EkjXQHo\nO0fE4FZmD25liZTG8nfs2FFc9sEHHyzW6/7/qBvnLy1ft+zExESxvn79+mI9q4gof7AVhvqApAg/\nkBThB5Ii/EBShB9IivADSTHUdxVYvXp1x9r+/fuLy9YNt42OjhbrBw4cKNaXL1/esVY3zHjXXXcV\n64sWLSrWz549W6xfrRjqA1BE+IGkCD+QFOEHkiL8QFKEH0iK8ANJMc5/FTh9+nTH2rx584rLPvfc\nc8X6xo0bi/VeLp+9atWqYr3uGIWHH364WB8bG7vsnq4GjPMDKCL8QFKEH0iK8ANJEX4gKcIPJEX4\ngaSamKUXfbZ5c3m2s9Klu+uO42jz8tfnzpUnd6671gB6w5YfSIrwA0kRfiApwg8kRfiBpAg/kBTh\nB5KqHee3vUvS5ySdiYhbqueul/QDSUslHZf0QET8oX9t5la69r1UHssfHx9vup3GrFixolgf5LUm\nMprNlv87ki6dFWKLpBcj4mZJL1aPAVxBasMfEQclvX3J02sl7a7u75Z0f8N9Aeizbr/zL4yIU5JU\n3d7QXEsABqHvx/bb3iypfHA6gIHrdst/2vZiSapuz3R6YUSMRcRIRIx0uS4AfdBt+PdJ2lTd3yTp\n+WbaATAoteG3/aykn0taZvuE7S9L2i7pPtu/kXRf9RjAFaT2O39EbOhQ+mzDvaCDu+++u1gvnfde\nd13+fisdo7B169bisnXn8x88eLCrnjCFI/yApAg/kBThB5Ii/EBShB9IivADSXHp7iFQd8puXf3s\n2bMday+//HJXPc1WXW+HDh3qWLv22muLyx49erRYP3bsWLGOMrb8QFKEH0iK8ANJEX4gKcIPJEX4\ngaQIP5AU4/xDYM2aNcV63Xj4u+++22Q7l2V0dLRYL/Ved8ru9u1cJqKf2PIDSRF+ICnCDyRF+IGk\nCD+QFOEHkiL8QFKM8w+BuvPW66aqnjdvXsfazp07i8s+9NBDx
057PL96bhgNa2/D2pdEb91qsre/mu0LB3o+/4dWbh+OiJHWGigY1t6G\ntS+J3rrVVm/s9gNJEX4gqbbDP9by+kuGtbdh7Uuit2610lur3/kBtKftLT+AlrQSfturbf/K9lu2\nt7TRQye2j9t+w/brbU8xVk2Ddsb2m9Oeu972Adu/qW5nnCatpd6etP1/1Wf3uu1/bKm3Jbb/2/ak\n7V/a/pfq+VY/u0JfrXxuA9/ttz1H0q8l3SfphKRDkjZExNGBNtKB7eOSRiKi9TFh238v6U+S9lyc\nDcn2f0h6OyK2V384r4uIfxuS3p7UZc7c3KfeOs0s/SW1+Nk1OeN1E9rY8t8h6a2I+G1E/FnS9yWt\nbaGPoRcRByW9fcnTayXtru7v1tT/PAPXobehEBGnIuJIdf8dSRdnlm71syv01Yo2wn+jpN9Pe3xC\nwzXld0j6me3XbG9uu5kZLLw4M1J1e0PL/VyqdubmQbpkZumh+ey6mfG6aW2Ef6bZRIZpyOHOiPhb\nSWskfa3avcXsfEvSpzU1jdspSTvabKaaWXqvpK9HxB/b7GW6Gfpq5XNrI/wnJC2Z9viTkk620MeM\nIuJkdXtG0oSmvqYMk9MXJ0mtbs+03M9fRMTpiHg/Ii5I+rZa/OyqmaX3SvpuRIxXT7f+2c3UV1uf\nWxvhPyTpZtufsv0xSV+UtK+FPj7E9tzqhxjZnitppYZv9uF9kjZV9zdJer7FXj5gWGZu7jSztFr+\n7IZtxutWDvKphjL+S9IcSbsiYnTgTczA9l9ramsvTZ3x+L02e7P9rKR7NXXW12lJ35D0nKQfSrpJ\n0u8kfSEiBv7DW4fe7tVlztzcp946zSz9ilr87Jqc8bqRfjjCD8iJI/yApAg/kBThB5Ii/EBShB9I\nivADSRF+ICnCDyT1/zuzOYWa4hAXAAAAAElFTkSuQmCC\n", - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], - "source": [ - "%matplotlib inline\n", - "\n", - "print(\"Actual class of test image:\", test_lbl[177])\n", - "plt.imshow(test_img[177].reshape((28,28)))" + "psource(WeightedLearner)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "It is obvious that this Learner is not very efficient. In fact, it will guess correctly in only 1135/10000 of the samples, roughly 10%. It is very fast though, so it might have its use as a quick first guess." + "The `WeightedLearner` function will then call the `PerceptronLearner`, during each iteration, with the modified dataset which contains the examples according to the weights associated with them." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Naive-Bayes\n", + "### Example\n", "\n", - "The Naive-Bayes classifier is an improvement over the Plurality Learner. It is much more accurate, but a lot slower." 
+ "We will pass the `PerceptronLearner` through `WeightedLearner` function. Then we will create an `AdaboostLearner` classifier with number of hypotheses or *K* equal to 5." ] }, { "cell_type": "code", - "execution_count": 55, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "7\n" - ] - } - ], + "execution_count": 4, + "metadata": { + "collapsed": true + }, + "outputs": [], "source": [ - "# takes ~45 Secs. to execute this\n", - "\n", - "nBD = NaiveBayesLearner(MNIST_DataSet, continuous=False)\n", - "print(nBD(test_img[0]))" + "WeightedPerceptron = WeightedLearner(PerceptronLearner)\n", + "AdaboostLearner = AdaBoost(WeightedPerceptron, 5)" ] }, { "cell_type": "code", - "execution_count": 56, + "execution_count": 5, "metadata": {}, "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Actual class of test image: 7\n" - ] - }, { "data": { "text/plain": [ - "" + "0" ] }, - "execution_count": 56, + "execution_count": 5, "metadata": {}, "output_type": "execute_result" - }, - { - "data": { - "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAP8AAAD8CAYAAAC4nHJkAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAADQNJREFUeJzt3W+MVfWdx/HPZylNjPQBWLHEgnQb3bgaAzoaE3AzamxY\nbYKN1NQHGzbZMH2AZps0ZA1PypMmjemfrU9IpikpJtSWhFbRGBeDGylRGwejBYpQICzMgkAzJgUT\n0yDfPphDO8W5v3u5/84dv+9XQube8z1/vrnhM+ecOefcnyNCAPL5h7obAFAPwg8kRfiBpAg/kBTh\nB5Ii/EBShB9IivADSRF+IKnP9HNjtrmdEOixiHAr83W057e9wvZB24dtP9nJugD0l9u9t9/2LEmH\nJD0gaVzSW5Iei4jfF5Zhzw/0WD/2/HdJOhwRRyPiz5J+IWllB+sD0EedhP96SSemvB+vpv0d2yO2\nx2yPdbAtAF3WyR/8pju0+MRhfUSMShqVOOwHBkkne/5xSQunvP+ipJOdtQOgXzoJ/1uSbrT9Jduf\nlfQNSdu70xaAXmv7sD8iLth+XNL/SJolaVNE7O9aZwB6qu1LfW1tjHN+oOf6cpMPgJmL8ANJEX4g\nKcIPJEX4gaQIP5AU4QeSIvxAUoQfSIrwA0kRfiApwg8kRfiBpAg/kBThB5Ii/EBShB9IivADSRF+\nICnCDyRF+IGkCD+QFOEHkiL8QFKEH0iK8ANJEX4gKcIPJEX4gaTaHqJbkmwfk3RO0seSLkTEUDea\nAtB7HYW/cm9E/LEL6wHQRxz2A0l1Gv6QtMP2Htsj3WgIQH90eti/LCJO2p4v6RXb70XErqkzVL8U\n+MUADBhHRHdWZG+QdD4ivl+YpzsbA9BQRLiV+do+7Ld9te3PXXot6SuS9rW7PgD91clh/3WSfm37\n0np+HhEvd6UrAD3XtcP+ljbGYT/Qcz0/7AcwsxF+ICnCDyRF+IGkCD+QFOEHkurGU30prFq1qmFt\nzZo1xWVPnjxZrH/00UfF+pYtW4r1999/v2Ht8OHDxWWRF3t+ICnCDyRF+IGkCD+QFOEHkiL8QFKE\nH0iKR3pbdPTo0Ya1xYsX96+RaZw7d65hbf/+/X3sZLCMj483rD311FPFZcfGxrrdTt/wSC+AIsIP\nJEX4gaQIP5AU4QeSIvxAUoQfSIrn+VtUemb/tttuKy574MCBYv3mm28u1m+//fZifXh4uGHt7rvv\nLi574sSJYn3hwoXFeicuXLhQrJ89e7ZYX7BgQdvbPn78eLE+k6/zt4o9P5AU4QeSIvxAUoQfSIrw\nA0kRfiApwg8k1fR5ftubJH1V0pmIuLWaNk/SLyUtlnRM0qMR8UHTjc3g5/kH2dy5cxvWlixZUlx2\nz549xfqdd97ZVk+taDZewaFDh4r1ZvdPzJs3r2Ft7dq1xWU3btxYrA+ybj7P/zNJKy6b9qSknRFx\no6Sd1XsAM0jT8EfELkkTl01eKWlz9XqzpIe73BeAHmv3nP+6iDglSdXP+d1rCUA/9PzeftsjkkZ6\nvR0AV6bdPf9p2wskqfp5ptGMETEaEUMRMdTmtgD0QLvh3y5pdfV6taTnu9MOgH5pGn7bz0p6Q9I/\n2R63/R+SvifpAdt/kPRA9R7ADML39mNgPfLII8X61q1bi/V9+/Y1rN17773FZScmLr/ANXPwvf0A\nigg/kBThB5Ii/EBShB9IivADSXGpD7WZP7/8SMjevXs7Wn7VqlUNa9u2bSsuO5NxqQ9AEeEHkiL8\nQFKEH0iK8ANJEX4gKcIPJMUQ3ahNs6/Pvvbaa4v1Dz4of1v8wYMHr7inTNjzA0kRfiApwg8kRfiB\npAg/kBThB5Ii/EBSPM+Pnlq2bFnD2quvvlpcdvbs2cX68PBwsb5r165i/dOK5/kBFBF+ICnCDyRF\n+IGkCD+QFOEHkiL8QFJNn+e3vUnSVyWdiYhbq2kbJK2RdLaab
X1EvNSrJjFzPfjggw1rza7j79y5\ns1h/44032uoJk1rZ8/9M0opppv8oIpZU/wg+MMM0DX9E7JI00YdeAPRRJ+f8j9v+ne1Ntud2rSMA\nfdFu+DdK+rKkJZJOSfpBoxltj9gesz3W5rYA9EBb4Y+I0xHxcURclPQTSXcV5h2NiKGIGGq3SQDd\n11b4bS+Y8vZrkvZ1px0A/dLKpb5nJQ1L+rztcUnfkTRse4mkkHRM0jd72COAHuB5fnTkqquuKtZ3\n797dsHbLLbcUl73vvvuK9ddff71Yz4rn+QEUEX4gKcIPJEX4gaQIP5AU4QeSYohudGTdunXF+tKl\nSxvWXn755eKyXMrrLfb8QFKEH0iK8ANJEX4gKcIPJEX4gaQIP5AUj/Si6KGHHirWn3vuuWL9ww8/\nbFhbsWK6L4X+mzfffLNYx/R4pBdAEeEHkiL8QFKEH0iK8ANJEX4gKcIPJMXz/Mldc801xfrTTz9d\nrM+aNatYf+mlxgM4cx2/Xuz5gaQIP5AU4QeSIvxAUoQfSIrwA0kRfiCpps/z214o6RlJX5B0UdJo\nRPzY9jxJv5S0WNIxSY9GxAdN1sXz/H3W7Dp8s2vtd9xxR7F+5MiRYr30zH6zZdGebj7Pf0HStyPi\nZkl3S1pr+58lPSlpZ0TcKGln9R7ADNE0/BFxKiLerl6fk3RA0vWSVkraXM22WdLDvWoSQPdd0Tm/\n7cWSlkr6raTrIuKUNPkLQtL8bjcHoHdavrff9hxJ2yR9KyL+ZLd0WiHbI5JG2msPQK+0tOe3PVuT\nwd8SEb+qJp+2vaCqL5B0ZrplI2I0IoYiYqgbDQPojqbh9+Qu/qeSDkTED6eUtktaXb1eLen57rcH\noFdaudS3XNJvJO3V5KU+SVqvyfP+rZIWSTou6esRMdFkXVzq67ObbrqpWH/vvfc6Wv/KlSuL9Rde\neKGj9ePKtXqpr+k5f0TsltRoZfdfSVMABgd3+AFJEX4gKcIPJEX4gaQIP5AU4QeS4qu7PwVuuOGG\nhrUdO3Z0tO5169YV6y+++GJH60d92PMDSRF+ICnCDyRF+IGkCD+QFOEHkiL8QFJc5/8UGBlp/C1p\nixYt6mjdr732WrHe7PsgMLjY8wNJEX4gKcIPJEX4gaQIP5AU4QeSIvxAUlznnwGWL19erD/xxBN9\n6gSfJuz5gaQIP5AU4QeSIvxAUoQfSIrwA0kRfiCpptf5bS+U9IykL0i6KGk0In5se4OkNZLOVrOu\nj4iXetVoZvfcc0+xPmfOnLbXfeTIkWL9/Pnzba8bg62Vm3wuSPp2RLxt+3OS9th+par9KCK+37v2\nAPRK0/BHxClJp6rX52wfkHR9rxsD0FtXdM5ve7GkpZJ+W0163PbvbG+yPbfBMiO2x2yPddQpgK5q\nOfy250jaJulbEfEnSRslfVnSEk0eGfxguuUiYjQihiJiqAv9AuiSlsJve7Ymg78lIn4lSRFxOiI+\njoiLkn4i6a7etQmg25qG37Yl/VTSgYj44ZTpC6bM9jVJ+7rfHoBeaeWv/csk/Zukvbbfqaatl/SY\n7SWSQtIxSd/sSYfoyLvvvlus33///cX6xMREN9vBAGnlr/27JXmaEtf0gRmMO/yApAg/kBThB5Ii\n/EBShB9IivADSbmfQyzbZjxnoMciYrpL85/Anh9IivADSRF+ICnCDyRF+IGkCD+QFOEHkur3EN1/\nlPR/U95/vpo2iAa1t0HtS6K3dnWztxtanbGvN/l8YuP22KB+t9+g9jaofUn01q66euOwH0iK8ANJ\n1R3+0Zq3XzKovQ1qXxK9tauW3mo95wdQn7r3/ABqUkv4ba+wfdD2YdtP1tFDI7aP2d5r+526hxir\nhkE7Y3vflGnzbL9i+w/Vz2mHSauptw22/7/67N6x/WBNvS20/b+2D9jeb/s/q+m1fnaFvmr53Pp+\n2G97lqRDkh6QNC7pLUmPR
cTv+9pIA7aPSRqKiNqvCdv+F0nnJT0TEbdW056SNBER36t+cc6NiP8a\nkN42SDpf98jN1YAyC6aOLC3pYUn/rho/u0Jfj6qGz62OPf9dkg5HxNGI+LOkX0haWUMfAy8idkm6\nfNSMlZI2V683a/I/T9816G0gRMSpiHi7en1O0qWRpWv97Ap91aKO8F8v6cSU9+MarCG/Q9IO23ts\nj9TdzDSuq4ZNvzR8+vya+7lc05Gb++mykaUH5rNrZ8Trbqsj/NN9xdAgXXJYFhG3S/pXSWurw1u0\npqWRm/tlmpGlB0K7I153Wx3hH5e0cMr7L0o6WUMf04qIk9XPM5J+rcEbffj0pUFSq59nau7nrwZp\n5ObpRpbWAHx2gzTidR3hf0vSjba/ZPuzkr4haXsNfXyC7aurP8TI9tWSvqLBG314u6TV1evVkp6v\nsZe/MygjNzcaWVo1f3aDNuJ1LTf5VJcy/lvSLEmbIuK7fW9iGrb/UZN7e2nyicef19mb7WclDWvy\nqa/Tkr4j6TlJWyUtknRc0tcjou9/eGvQ27AmD13/OnLzpXPsPve2XNJvJO2VdLGavF6T59e1fXaF\nvh5TDZ8bd/gBSXGHH5AU4QeSIvxAUoQfSIrwA0kRfiApwg8kRfiBpP4CIJjqosJxHysAAAAASUVO\nRK5CYII=\n", - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" } ], "source": [ - "%matplotlib inline\n", + "iris2 = DataSet(name=\"iris\")\n", + "iris2.classes_to_numbers()\n", "\n", - "print(\"Actual class of test image:\", test_lbl[0])\n", - "plt.imshow(test_img[0].reshape((28,28)))" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### k-Nearest Neighbors\n", + "adaboost = AdaboostLearner(iris2)\n", "\n", - "We will now try to classify a random image from the dataset using the kNN classifier." - ] - }, - { - "cell_type": "code", - "execution_count": 57, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "5\n" - ] - } - ], - "source": [ - "# takes ~20 Secs. to execute this\n", - "kNN = NearestNeighborLearner(MNIST_DataSet, k=3)\n", - "print(kNN(test_img[211]))" + "adaboost([5, 3, 1, 0.1])" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "To make sure that the output we got is correct, let's plot that image along with its label." + "That is the correct answer. Let's check the error rate of adaboost with perceptron." 
] }, { "cell_type": "code", - "execution_count": 58, + "execution_count": 6, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ - "Actual class of test image: 5\n" + "Error ratio for adaboost: 0.046666666666666634\n" ] - }, - { - "data": { - "text/plain": [ - "" - ] - }, - "execution_count": 58, - "metadata": {}, - "output_type": "execute_result" - }, - { - "data": { - "image/png": "iVBORw0KGgoAAAANSUhEUgAAAP8AAAD8CAYAAAC4nHJkAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAADdVJREFUeJzt3X+oVHUax/HPk7kFKWVUauqurcnSIlnLLQq3UCqtJdAt\nNixY3BDv/mFgEGFoP/wjQZZ+QyzdTUkhMyF/QZu7Kku1sElXkczMNsLUumhmpVcKU5/94x6Xm93z\nnWnmzJy5Pu8XyJ05zzlzHgY/95y533Pma+4uAPGcVXYDAMpB+IGgCD8QFOEHgiL8QFCEHwiK8ANB\nEX4gKMIPBHV2M3dmZlxOCDSYu1s169V15DezW81sl5l9bGYP1fNaAJrLar2238wGSPpI0i2S9kl6\nV9Ld7v5BYhuO/ECDNePIf62kj939E3c/JmmFpKl1vB6AJqon/CMk7e31fF+27AfMrN3MOs2ss459\nAShYPX/w6+vU4ken9e7eIalD4rQfaCX1HPn3SRrV6/lISZ/X1w6AZqkn/O9KGmtml5nZzyRNl7Su\nmLYANFrNp/3uftzM7pP0D0kDJC1x9x2FdQagoWoe6qtpZ3zmBxquKRf5AOi/CD8QFOEHgiL8QFCE\nHwiK8ANBEX4gKMIPBEX4gaAIPxAU4QeCIvxAUIQfCIrwA0ERfiAowg8ERfiBoAg/EBThB4Ii/EBQ\nhB8IivADQRF+ICjCDwRF+IGgCD8QFOEHgiL8QFCEHwiq5im6JcnMdks6IumEpOPu3lZEUwAar67w\nZya5+8ECXgdAE3HaDwRVb/hd0j/NbIuZtRfREIDmqPe0f4K7f25ml0jaYGYfuvtbvVfIfinwiwFo\nMebuxbyQ2QJJ3e7+RGKdYnYGIJe7WzXr1Xzab2bnmdngU48lTZb0fq2vB6C56jntHypptZmdep3l\n7r6+kK4ANFxhp/1V7YzT/nDOP//83Np1112X3Pb111+va9/d3d25tVRfkrRr165kfcKECcn6l19+\nmaw3UsNP+wH0b4QfCIrwA0ERfiAowg8ERfiBoIq4qw9nsLa29F3a7e3pK7fvvPPO3Fp2jUiunTt3\nJusLFy5M1kePHl3ztnv27EnWv//++2S9P+DIDwRF+IGgCD8QFOEHgiL8QFCEHwiK8ANBcUvvGW7g\nwIHJ+vz585P1WbNmJeuHDh1K1p977rnc2ubNm5Pb7tixI1mfNGlSsr548eLc2tdff53cduLEicn6\nV199layXiVt6ASQRfiAowg8ERfiBoAg/EBThB4Ii/EBQjPOfAaZMmZJbe/jhh5Pbjh8/PllfsWJF\nsv7ggw8m64MGDcqt3Xvvvcltb7755mT9hhtuSNY3btyYW5s7d25y223btiXrrYxxfgBJhB8IivAD\nQRF+ICjCDwRF+IGgCD8QVMVxfjNbIul2SQfcfVy27EJJr0oaLWm3pLvcveINzozz12bBggXJeuqe\n/Erj1YsWLUrWDx48mKzfeOONyfrMmTNza6NGjUpuu3379mT9mWeeSdbXrFmTW6t0P39/VuQ4/0uS\nbj1t2UOSNrn7WEmbsucA+
pGK4Xf3tySd/nUtUyUtzR4vlTSt4L4ANFitn/mHunuXJGU/LymuJQDN\n0PC5+sysXVJ6QjcATVfrkX+/mQ2XpOzngbwV3b3D3dvcPT3jI4CmqjX86yTNyB7PkLS2mHYANEvF\n8JvZK5L+I+lXZrbPzGZKWiTpFjP7r6RbsucA+hHu528Blcbx582bl6x3dnbm1lL3+kvSkSNHkvVK\nvT3yyCPJ+vLly3NrqfvtJWn16tXJ+uHDh5P1qLifH0AS4QeCIvxAUIQfCIrwA0ERfiAohvqaYMyY\nMcn622+/nayvXZu+hmrOnDm5tWPHjiW3rWTAgAHJ+rnnnpusf/vtt7m1kydP1tQT0hjqA5BE+IGg\nCD8QFOEHgiL8QFCEHwiK8ANBNfxrvCCNHTs2WR86dGiyfvz48WS93rH8lBMnTiTrR48ebdi+0Vgc\n+YGgCD8QFOEHgiL8QFCEHwiK8ANBEX4gKMb5m6DSVNN79+5N1i+44IJk/ayz8n+Hc8888nDkB4Ii\n/EBQhB8IivADQRF+ICjCDwRF+IGgKo7zm9kSSbdLOuDu47JlCyTNkvRFtto8d/97o5rs7z777LNk\nvdJ1APfcc0+yPnjw4NzatGnTktsirmqO/C9JurWP5U+7+1XZP4IP9DMVw+/ub0k61IReADRRPZ/5\n7zOz98xsiZkNKawjAE1Ra/j/KmmMpKskdUl6Mm9FM2s3s04z66xxXwAaoKbwu/t+dz/h7icl/U3S\ntYl1O9y9zd3bam0SQPFqCr+ZDe/19PeS3i+mHQDNUs1Q3yuSJkq6yMz2SXpM0kQzu0qSS9ot6c8N\n7BFAA5i7N29nZs3bWT9y8cUXJ+urVq1K1q+//vrc2sKFC5Pbvvjii8l6pe8aQOtxd6tmPa7wA4Ii\n/EBQhB8IivADQRF+ICjCDwTFUF8/MGRI+taJN954I7d2zTXXJLetNNT3+OOPJ+sMBbYehvoAJBF+\nICjCDwRF+IGgCD8QFOEHgiL8QFCM858BBg0alFubPn16ctsXXnghWf/mm2+S9cmTJyfrnZ18e1uz\nMc4PIInwA0ERfiAowg8ERfiBoAg/EBThB4JinP8MZ5Ye8h02bFiyvn79+mT9iiuuSNavvPLK3NqH\nH36Y3Ba1YZwfQBLhB4Ii/EBQhB8IivADQRF+ICjCDwR1dqUVzGyUpGWShkk6KanD3Z81swslvSpp\ntKTdku5y968a1ypqUek6jq6urmR99uzZyfqbb76ZrKfu92ecv1zVHPmPS3rA3a+QdJ2k2Wb2a0kP\nSdrk7mMlbcqeA+gnKobf3bvcfWv2+IiknZJGSJoqaWm22lJJ0xrVJIDi/aTP/GY2WtLVkjZLGuru\nXVLPLwhJlxTdHIDGqfiZ/xQzGyTpNUn3u/vhSteM99quXVJ7be0BaJSqjvxmNlA9wX/Z3Vdli/eb\n2fCsPlzSgb62dfcOd29z97YiGgZQjIrht55D/GJJO939qV6ldZJmZI9nSFpbfHsAGqWa0/4Jkv4o\nabuZbcuWzZO0SNJKM5spaY+kPzSmRTTSyJEjk/VHH320rtdnCu/WVTH87v5vSXkf8G8qth0AzcIV\nfkBQhB8IivADQRF+ICjCDwRF+IGgqr68N7pLL700tzZ37tzktnPmzCm6naqdc845yfr8+fOT9Ztu\nSo/mrly5MlnfsGFDso7ycOQHgiL8QFCEHwiK8ANBEX4gKMIPBEX4gaCYortKl19+eW5t69atyW0n\nTZqUrG/ZsqWmnk4ZN25cbm3ZsmXJbcePH5+sVxrHnzVrVrLe3d2drKN4TNENIInwA0ERfiAowg8E\nRfiBoAg/EBThB4Lifv4qffrpp7m1559/PrntmjVrkvXvvvsuWX/nnXeS9dtuuy23Vul+/jvuuCNZ\n37hxY7J+9OjRZB2tiyM/EBThB4Ii/EBQhB8IivADQRF+ICjCDwRV8X5+MxslaZmkYZJOSup
w92fN\nbIGkWZK+yFad5+5/r/Ba/fZ+/pSzz05fLlHpnvcpU6Yk6yNGjEjWU2PxmzZtqnlb9E/V3s9fzUU+\nxyU94O5bzWywpC1mdmomhqfd/YlamwRQnorhd/cuSV3Z4yNmtlNS+lAEoOX9pM/8ZjZa0tWSNmeL\n7jOz98xsiZkNydmm3cw6zayzrk4BFKrq8JvZIEmvSbrf3Q9L+qukMZKuUs+ZwZN9befuHe7e5u5t\nBfQLoCBVhd/MBqon+C+7+ypJcvf97n7C3U9K+pukaxvXJoCiVQy/mZmkxZJ2uvtTvZYP77Xa7yW9\nX3x7ABqlmqG+30p6W9J29Qz1SdI8SXer55TfJe2W9Ofsj4Op1zojh/qAVlLtUB/f2w+cYfjefgBJ\nhB8IivADQRF+ICjCDwRF+IGgCD8QFOEHgiL8QFCEHwiK8ANBEX4gKMIPBEX4gaCaPUX3QUm957q+\nKFvWilq1t1btS6K3WhXZ2y+qXbGp9/P/aOdmna363X6t2lur9iXRW63K6o3TfiAowg8EVXb4O0re\nf0qr9taqfUn0VqtSeiv1Mz+A8pR95AdQklLCb2a3mtkuM/vYzB4qo4c8ZrbbzLab2baypxjLpkE7\nYGbv91p2oZltMLP/Zj/7nCatpN4WmNln2Xu3zcx+V1Jvo8zsX2a208x2mNmcbHmp712ir1Let6af\n9pvZAEkfSbpF0j5J70q6290/aGojOcxst6Q2dy99TNjMbpTULWmZu4/Llv1F0iF3X5T94hzi7nNb\npLcFkrrLnrk5m1BmeO+ZpSVNk/QnlfjeJfq6SyW8b2Uc+a+V9LG7f+LuxyStkDS1hD5anru/JenQ\naYunSlqaPV6qnv88TZfTW0tw9y5335o9PiLp1MzSpb53ib5KUUb4R0ja2+v5PrXWlN8u6Z9mtsXM\n2stupg9DT82MlP28pOR+Tldx5uZmOm1m6ZZ572qZ8bpoZYS/r9lEWmnIYYK7/0bSbZJmZ6e3qE5V\nMzc3Sx8zS7eEWme8LloZ4d8naVSv5yMlfV5CH31y98+znwckrVbrzT68/9QkqdnPAyX383+tNHNz\nXzNLqwXeu1aa8bqM8L8raayZXWZmP5M0XdK6Evr4ETM7L/tDjMzsPEmT1XqzD6+TNCN7PEPS2hJ7\n+YFWmbk5b2ZplfzetdqM16Vc5JMNZTwjaYCkJe6+sOlN9MHMfqmeo73Uc8fj8jJ7M7NXJE1Uz11f\n+yU9JmmNpJWSfi5pj6Q/uHvT//CW09tE/cSZmxvUW97M0ptV4ntX5IzXhfTDFX5ATFzhBwRF+IGg\nCD8QFOEHgiL8QFCEHwiK8ANBEX4gqP8B1flLsMvfVy4AAAAASUVORK5CYII=\n", - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" } ], "source": [ - "%matplotlib inline\n", - "\n", - "print(\"Actual class of test image:\", test_lbl[211])\n", - "plt.imshow(test_img[211].reshape((28,28)))" + "print(\"Error ratio for adaboost: \", err_ratio(adaboost, iris2))" ] }, { "cell_type": "markdown", - "metadata": {}, + "metadata": { + "collapsed": true + }, "source": [ - "Hurray! We've got it correct. Don't worry if our algorithm predicted a wrong class. With this techinique we have only ~97% accuracy on this dataset." 
+ "It reduced the error rate considerably. Unlike the `PerceptronLearner`, `AdaBoost` was able to learn the complexity in the iris dataset." ] } ], @@ -2244,9 +2247,18 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.5.3" + "version": "3.5.2" + }, + "pycharm": { + "stem_cell": { + "cell_type": "raw", + "source": [], + "metadata": { + "collapsed": false + } + } } }, "nbformat": 4, "nbformat_minor": 2 -} +} \ No newline at end of file diff --git a/learning.py b/learning.py index 5f1ba596e..71b6b15e7 100644 --- a/learning.py +++ b/learning.py @@ -1,84 +1,48 @@ -"""Learn to estimate functions from examples. (Chapters 18, 20)""" - -from utils import ( - removeall, unique, product, mode, argmax, argmax_random_tie, isclose, gaussian, - dotproduct, vector_add, scalar_vector_product, weighted_sample_with_replacement, - weighted_sampler, num_or_str, normalize, clip, sigmoid, print_table, - open_data, sigmoid_derivative, probability, norm, matrix_multiplication -) +"""Learning from examples (Chapters 18)""" import copy -import heapq -import math -import random - -from statistics import mean, stdev from collections import defaultdict +from statistics import stdev -# ______________________________________________________________________________ - - -def euclidean_distance(X, Y): - return math.sqrt(sum([(x - y)**2 for x, y in zip(X, Y)])) - - -def rms_error(X, Y): - return math.sqrt(ms_error(X, Y)) - - -def ms_error(X, Y): - return mean([(x - y)**2 for x, y in zip(X, Y)]) - - -def mean_error(X, Y): - return mean([abs(x - y) for x, y in zip(X, Y)]) - - -def manhattan_distance(X, Y): - return sum([abs(x - y) for x, y in zip(X, Y)]) - +from qpsolvers import solve_qp -def mean_boolean_error(X, Y): - return mean(int(x != y) for x, y in zip(X, Y)) - - -def hamming_distance(X, Y): - return sum(x != y for x, y in zip(X, Y)) - -# ______________________________________________________________________________ +from probabilistic_learning 
import NaiveBayesLearner +from utils import * class DataSet: - """A data set for a machine learning problem. It has the following fields: + """ + A data set for a machine learning problem. It has the following fields: d.examples A list of examples. Each one is a list of attribute values. d.attrs A list of integers to index into an example, so example[attr] gives a value. Normally the same as range(len(d.examples[0])). - d.attrnames Optional list of mnemonic names for corresponding attrs. + d.attr_names Optional list of mnemonic names for corresponding attrs. d.target The attribute that a learning algorithm will try to predict. By default the final attribute. d.inputs The list of attrs without the target. d.values A list of lists: each sublist is the set of possible values for the corresponding attribute. If initially None, - it is computed from the known examples by self.setproblem. + it is computed from the known examples by self.set_problem. If not None, an erroneous value raises ValueError. - d.distance A function from a pair of examples to a nonnegative number. + d.distance A function from a pair of examples to a non-negative number. Should be symmetric, etc. Defaults to mean_boolean_error since that can handle any field types. d.name Name of the data set (for output display only). d.source URL or other source where the data came from. d.exclude A list of attribute indexes to exclude from d.inputs. Elements - of this list can either be integers (attrs) or attrnames. + of this list can either be integers (attrs) or attr_names. Normally, you call the constructor and you're done; then you just - access fields like d.examples and d.target and d.inputs.""" + access fields like d.examples and d.target and d.inputs. + """ - def __init__(self, examples=None, attrs=None, attrnames=None, target=-1, - inputs=None, values=None, distance=mean_boolean_error, - name='', source='', exclude=()): - """Accepts any of DataSet's fields. 
Examples can also be a + def __init__(self, examples=None, attrs=None, attr_names=None, target=-1, inputs=None, + values=None, distance=mean_boolean_error, name='', source='', exclude=()): + """ + Accepts any of DataSet's fields. Examples can also be a string or file from which to parse examples using parse_csv. - Optional parameter: exclude, as documented in .setproblem(). + Optional parameter: exclude, as documented in .set_problem(). >>> DataSet(examples='1, 2, 3') """ @@ -86,49 +50,50 @@ def __init__(self, examples=None, attrs=None, attrnames=None, target=-1, self.source = source self.values = values self.distance = distance - if values is None: - self.got_values_flag = False - else: - self.got_values_flag = True + self.got_values_flag = bool(values) - # Initialize .examples from string or list or data directory + # initialize .examples from string or list or data directory if isinstance(examples, str): self.examples = parse_csv(examples) elif examples is None: self.examples = parse_csv(open_data(name + '.csv').read()) else: self.examples = examples - # Attrs are the indices of examples, unless otherwise stated. - if attrs is None and self.examples is not None: + + # attrs are the indices of examples, unless otherwise stated. + if self.examples is not None and attrs is None: attrs = list(range(len(self.examples[0]))) + self.attrs = attrs - # Initialize .attrnames from string, list, or by default - if isinstance(attrnames, str): - self.attrnames = attrnames.split() + + # initialize .attr_names from string, list, or by default + if isinstance(attr_names, str): + self.attr_names = attr_names.split() else: - self.attrnames = attrnames or attrs - self.setproblem(target, inputs=inputs, exclude=exclude) + self.attr_names = attr_names or attrs + self.set_problem(target, inputs=inputs, exclude=exclude) - def setproblem(self, target, inputs=None, exclude=()): - """Set (or change) the target and/or inputs. 
+ def set_problem(self, target, inputs=None, exclude=()): + """ + Set (or change) the target and/or inputs. This way, one DataSet can be used multiple ways. inputs, if specified, is a list of attributes, or specify exclude as a list of attributes - to not use in inputs. Attributes can be -n .. n, or an attrname. - Also computes the list of possible values, if that wasn't done yet.""" - self.target = self.attrnum(target) - exclude = list(map(self.attrnum, exclude)) + to not use in inputs. Attributes can be -n .. n, or an attr_name. + Also computes the list of possible values, if that wasn't done yet. + """ + self.target = self.attr_num(target) + exclude = list(map(self.attr_num, exclude)) if inputs: - self.inputs = removeall(self.target, inputs) + self.inputs = remove_all(self.target, inputs) else: - self.inputs = [a for a in self.attrs - if a != self.target and a not in exclude] + self.inputs = [a for a in self.attrs if a != self.target and a not in exclude] if not self.values: self.update_values() self.check_me() def check_me(self): """Check that my fields make sense.""" - assert len(self.attrnames) == len(self.attrs) + assert len(self.attr_names) == len(self.attrs) assert self.target in self.attrs assert self.target not in self.inputs assert set(self.inputs).issubset(set(self.attrs)) @@ -147,12 +112,12 @@ def check_example(self, example): for a in self.attrs: if example[a] not in self.values[a]: raise ValueError('Bad value {} for attribute {} in {}' - .format(example[a], self.attrnames[a], example)) + .format(example[a], self.attr_names[a], example)) - def attrnum(self, attr): + def attr_num(self, attr): """Returns the number used for attr, which can be a name, or -n .. 
n-1.""" if isinstance(attr, str): - return self.attrnames.index(attr) + return self.attr_names.index(attr) elif attr < 0: return len(self.attrs) + attr else: @@ -163,18 +128,17 @@ def update_values(self): def sanitize(self, example): """Return a copy of example, with non-input attributes replaced by None.""" - return [attr_i if i in self.inputs else None - for i, attr_i in enumerate(example)] + return [attr_i if i in self.inputs else None for i, attr_i in enumerate(example)] def classes_to_numbers(self, classes=None): """Converts class names to numbers.""" if not classes: - # If classes were not given, extract them from values + # if classes were not given, extract them from values classes = sorted(self.values[self.target]) for item in self.examples: item[self.target] = classes.index(item[self.target]) - def remove_examples(self, value=""): + def remove_examples(self, value=''): """Remove examples that contain given value.""" self.examples = [x for x in self.examples if value not in x] self.update_values() @@ -185,32 +149,35 @@ def split_values_by_classes(self): target_names = self.values[self.target] for v in self.examples: - item = [a for a in v if a not in target_names] # Remove target from item - buckets[v[self.target]].append(item) # Add item to bucket of its class + item = [a for a in v if a not in target_names] # remove target from item + buckets[v[self.target]].append(item) # add item to bucket of its class return buckets def find_means_and_deviations(self): - """Finds the means and standard deviations of self.dataset. - means : A dictionary for each class/target. Holds a list of the means + """ + Finds the means and standard deviations of self.dataset. + means : a dictionary for each class/target. Holds a list of the means of the features for the class. - deviations: A dictionary for each class/target. Holds a list of the sample - standard deviations of the features for the class.""" + deviations: a dictionary for each class/target. 
Holds a list of the sample + standard deviations of the features for the class. + """ target_names = self.values[self.target] feature_numbers = len(self.inputs) item_buckets = self.split_values_by_classes() - means = defaultdict(lambda: [0 for i in range(feature_numbers)]) - deviations = defaultdict(lambda: [0 for i in range(feature_numbers)]) + means = defaultdict(lambda: [0] * feature_numbers) + deviations = defaultdict(lambda: [0] * feature_numbers) for t in target_names: - # Find all the item feature values for item in class t - features = [[] for i in range(feature_numbers)] + # find all the item feature values for item in class t + features = [[] for _ in range(feature_numbers)] for item in item_buckets[t]: - features = [features[i] + [item[i]] for i in range(feature_numbers)] + for i in range(feature_numbers): + features[i].append(item[i]) - # Calculate means and deviations fo the class + # calculate means and deviations of the class for i in range(feature_numbers): means[t][i] = mean(features[i]) deviations[t][i] = stdev(features[i]) @@ -218,281 +185,194 @@ def find_means_and_deviations(self): return means, deviations def __repr__(self): - return '<DataSet({}): {:d} examples, {:d} attributes>'.format( - self.name, len(self.examples), len(self.attrs)) - -# ______________________________________________________________________________ + return '<DataSet({}): {:d} examples, {:d} attributes>'.format(self.name, len(self.examples), len(self.attrs)) def parse_csv(input, delim=','): - r"""Input is a string consisting of lines, each line has comma-delimited - fields. Convert this into a list of lists. Blank lines are skipped. + r""" + Input is a string consisting of lines, each line has comma-delimited + fields. Convert this into a list of lists. Blank lines are skipped. Fields that look like numbers are converted to numbers. The delim defaults to ',' but '\t' and None are also reasonable values.
>>> parse_csv('1, 2, 3 \n 0, 2, na') - [[1, 2, 3], [0, 2, 'na']]""" + [[1, 2, 3], [0, 2, 'na']] + """ lines = [line for line in input.splitlines() if line.strip()] return [list(map(num_or_str, line.split(delim))) for line in lines] -# ______________________________________________________________________________ - - -class CountingProbDist: - """A probability distribution formed by observing and counting examples. - If p is an instance of this class and o is an observed value, then - there are 3 main operations: - p.add(o) increments the count for observation o by 1. - p.sample() returns a random element from the distribution. - p[o] returns the probability for o (as in a regular ProbDist).""" - - def __init__(self, observations=[], default=0): - """Create a distribution, and optionally add in some observations. - By default this is an unsmoothed distribution, but saying default=1, - for example, gives you add-one smoothing.""" - self.dictionary = {} - self.n_obs = 0.0 - self.default = default - self.sampler = None - - for o in observations: - self.add(o) - - def add(self, o): - """Add an observation o to the distribution.""" - self.smooth_for(o) - self.dictionary[o] += 1 - self.n_obs += 1 - self.sampler = None - - def smooth_for(self, o): - """Include o among the possible observations, whether or not - it's been observed yet.""" - if o not in self.dictionary: - self.dictionary[o] = self.default - self.n_obs += self.default - self.sampler = None - - def __getitem__(self, item): - """Return an estimate of the probability of item.""" - self.smooth_for(item) - return self.dictionary[item] / self.n_obs - - # (top() and sample() are not used in this module, but elsewhere.) 
- - def top(self, n): - """Return (count, obs) tuples for the n most frequent observations.""" - return heapq.nlargest(n, [(v, k) for (k, v) in self.dictionary.items()]) - - def sample(self): - """Return a random sample from the distribution.""" - if self.sampler is None: - self.sampler = weighted_sampler(list(self.dictionary.keys()), - list(self.dictionary.values())) - return self.sampler() - -# ______________________________________________________________________________ +def err_ratio(predict, dataset, examples=None): + """ + Return the proportion of the examples that are NOT correctly predicted. + """ + examples = examples or dataset.examples + if len(examples) == 0: + return 0.0 + right = 0 + for example in examples: + desired = example[dataset.target] + output = predict(dataset.sanitize(example)) + if output == desired: + right += 1 + return 1 - (right / len(examples)) -def PluralityLearner(dataset): - """A very dumb algorithm: always pick the result that was most popular - in the training data. Makes a baseline for comparison.""" - most_popular = mode([e[dataset.target] for e in dataset.examples]) - - def predict(example): - """Always return same result: the most popular from the training set.""" - return most_popular - return predict -# ______________________________________________________________________________ +def grade_learner(predict, tests): + """ + Grades the given learner based on how many tests it passes. + tests is a list with each element in the form: (values, output).
+ """ + return mean(int(predict(X) == y) for X, y in tests) -def NaiveBayesLearner(dataset, continuous=True, simple=False): - if simple: - return NaiveBayesSimple(dataset) - if(continuous): - return NaiveBayesContinuous(dataset) +def train_test_split(dataset, start=None, end=None, test_split=None): + """ + If you are giving 'start' and 'end' as parameters, + then it will return the testing set from index 'start' to 'end' + and the rest for training. + If you give 'test_split' as a parameter then it will return + test_split * 100% as the testing set and the rest as + training set. + """ + examples = dataset.examples + if test_split is None: + train = examples[:start] + examples[end:] + val = examples[start:end] else: - return NaiveBayesDiscrete(dataset) + total_size = len(examples) + val_size = int(total_size * test_split) + train_size = total_size - val_size + train = examples[:train_size] + val = examples[train_size:total_size] + return train, val -def NaiveBayesSimple(distribution): - """A simple naive bayes classifier that takes as input a dictionary of - CountingProbDist objects and classifies items according to these distributions. - The input dictionary is in the following form: - (ClassName, ClassProb): CountingProbDist""" - target_dist = {c_name: prob for c_name, prob in distribution.keys()} - attr_dists = {c_name: count_prob for (c_name, _), count_prob in distribution.items()} - def predict(example): - """Predict the target value for example. Calculate probabilities for each - class and pick the max.""" - def class_probability(targetval): - attr_dist = attr_dists[targetval] - return target_dist[targetval] * product(attr_dist[a] for a in example) +def cross_validation_wrapper(learner, dataset, k=10, trials=1): + """ + [Figure 18.8] + Return the optimal value of size having minimum error on validation set. 
+ errs_train: a training error array, indexed by size + errs_val: a validation error array, indexed by size + """ + errs_train, errs_val = [], [] + size = 1 + while True: + errT, errV = cross_validation(learner, dataset, size, k, trials) + # check for convergence once the training error stops changing + if errs_train and np.isclose(errs_train[-1], errT, rtol=1e-6): + best_size = 1 + min_val = np.inf + for i in range(len(errs_val)): + if errs_val[i] < min_val: + min_val = errs_val[i] + best_size = i + 1 # sizes start at 1 + return learner(dataset, best_size) + errs_train.append(errT) + errs_val.append(errV) + size += 1 +def cross_validation(learner, dataset, size=None, k=10, trials=1): + """ + Do k-fold cross-validation and return the mean of the errors. + That is, keep out 1/k of the examples for testing on each of k runs. + Shuffle the examples first; if trials > 1, average over several shuffles. + Returns Training error, Validation error + """ + k = k or len(dataset.examples) + if trials > 1: + trial_errT = 0 + trial_errV = 0 + for t in range(trials): + errT, errV = cross_validation(learner, dataset, size, k) # a single shuffled trial + trial_errT += errT + trial_errV += errV + return trial_errT / trials, trial_errV / trials + else: + fold_errT = 0 + fold_errV = 0 + n = len(dataset.examples) + examples = dataset.examples + random.shuffle(dataset.examples) + for fold in range(k): + train_data, val_data = train_test_split(dataset, fold * (n // k), (fold + 1) * (n // k)) + dataset.examples = train_data + h = learner(dataset, size) + fold_errT += err_ratio(h, dataset, train_data) + fold_errV += err_ratio(h, dataset, val_data) + # reverting back to original once test is completed + dataset.examples = examples + return fold_errT / k, fold_errV / k -def NaiveBayesDiscrete(dataset): - """Just count how many times each value of each input attribute - occurs, conditional on the target value.
Count the different - target values too.""" +def leave_one_out(learner, dataset, size=None): + """Leave one out cross-validation over the dataset.""" + return cross_validation(learner, dataset, size, len(dataset.examples)) - target_vals = dataset.values[dataset.target] - target_dist = CountingProbDist(target_vals) - attr_dists = {(gv, attr): CountingProbDist(dataset.values[attr]) - for gv in target_vals - for attr in dataset.inputs} - for example in dataset.examples: - targetval = example[dataset.target] - target_dist.add(targetval) - for attr in dataset.inputs: - attr_dists[targetval, attr].add(example[attr]) - def predict(example): - """Predict the target value for example. Consider each possible value, - and pick the most likely by looking at each attribute independently.""" - def class_probability(targetval): - return (target_dist[targetval] * - product(attr_dists[targetval, attr][example[attr]] - for attr in dataset.inputs)) - return argmax(target_vals, key=class_probability) +def learning_curve(learner, dataset, trials=10, sizes=None): + if sizes is None: + sizes = list(range(2, len(dataset.examples) - trials, 2)) - return predict + def score(learner, size): + random.shuffle(dataset.examples) + return cross_validation(learner, dataset, size, trials)[1] # validation error + return [(size, mean([score(learner, size) for _ in range(trials)])) for size in sizes] -def NaiveBayesContinuous(dataset): - """Count how many times each target value occurs. - Also, find the means and deviations of input attribute values for each target value.""" - means, deviations = dataset.find_means_and_deviations() - target_vals = dataset.values[dataset.target] - target_dist = CountingProbDist(target_vals) +def PluralityLearner(dataset): + """ + A very dumb algorithm: always pick the result that was most popular + in the training data. Makes a baseline for comparison. + """ + most_popular = mode([e[dataset.target] for e in dataset.examples]) def predict(example):
Consider each possible value, - and pick the most likely by looking at each attribute independently.""" - def class_probability(targetval): - prob = target_dist[targetval] - for attr in dataset.inputs: - prob *= gaussian(means[targetval][attr], deviations[targetval][attr], example[attr]) - return prob - - return argmax(target_vals, key=class_probability) - - return predict - -# ______________________________________________________________________________ - + """Always return same result: the most popular from the training set.""" + return most_popular -def NearestNeighborLearner(dataset, k=1): - """k-NearestNeighbor: the k nearest neighbors vote.""" - def predict(example): - """Find the k closest items, and have them vote for the best.""" - best = heapq.nsmallest(k, ((dataset.distance(e, example), e) - for e in dataset.examples)) - return mode(e[dataset.target] for (d, e) in best) return predict -# ______________________________________________________________________________ - - -def truncated_svd(X, num_val=2, max_iter=1000): - """Computes the first component of SVD""" - - def normalize_vec(X, n = 2): - """Normalizes two parts (:m and m:) of the vector""" - X_m = X[:m] - X_n = X[m:] - norm_X_m = norm(X_m, n) - Y_m = [x/norm_X_m for x in X_m] - norm_X_n = norm(X_n, n) - Y_n = [x/norm_X_n for x in X_n] - return Y_m + Y_n - - def remove_component(X): - """Removes components of already obtained eigen vectors from X""" - X_m = X[:m] - X_n = X[m:] - for eivec in eivec_m: - coeff = dotproduct(X_m, eivec) - X_m = [x1 - coeff*x2 for x1, x2 in zip(X_m, eivec)] - for eivec in eivec_n: - coeff = dotproduct(X_n, eivec) - X_n = [x1 - coeff*x2 for x1, x2 in zip(X_n, eivec)] - return X_m + X_n - - m, n = len(X), len(X[0]) - A = [[0 for _ in range(n + m)] for _ in range(n + m)] - for i in range(m): - for j in range(n): - A[i][m + j] = A[m + j][i] = X[i][j] - - eivec_m = [] - eivec_n = [] - eivals = [] - - for _ in range(num_val): - X = [random.random() for _ in range(m + n)] - X 
= remove_component(X) - X = normalize_vec(X) - - for _ in range(max_iter): - old_X = X - X = matrix_multiplication(A, [[x] for x in X]) - X = [x[0] for x in X] - X = remove_component(X) - X = normalize_vec(X) - # check for convergence - if norm([x1 - x2 for x1, x2 in zip(old_X, X)]) <= 1e-10: - break - - projected_X = matrix_multiplication(A, [[x] for x in X]) - projected_X = [x[0] for x in projected_X] - eivals.append(norm(projected_X, 1)/norm(X, 1)) - eivec_m.append(X[:m]) - eivec_n.append(X[m:]) - return (eivec_m, eivec_n, eivals) - -# ______________________________________________________________________________ - class DecisionFork: - """A fork of a decision tree holds an attribute to test, and a dict - of branches, one for each of the attribute's values.""" + """ + A fork of a decision tree holds an attribute to test, and a dict + of branches, one for each of the attribute's values. + """ - def __init__(self, attr, attrname=None, default_child=None, branches=None): + def __init__(self, attr, attr_name=None, default_child=None, branches=None): """Initialize by saying what attribute this node tests.""" self.attr = attr - self.attrname = attrname or attr + self.attr_name = attr_name or attr self.default_child = default_child self.branches = branches or {} def __call__(self, example): """Given an example, classify it using the attribute and the branches.""" - attrvalue = example[self.attr] - if attrvalue in self.branches: - return self.branches[attrvalue](example) + attr_val = example[self.attr] + if attr_val in self.branches: + return self.branches[attr_val](example) else: # return default class when attribute is unknown return self.default_child(example) def add(self, val, subtree): - """Add a branch. If self.attr = val, go to the given subtree.""" + """Add a branch. 
If self.attr = val, go to the given subtree.""" self.branches[val] = subtree def display(self, indent=0): - name = self.attrname + name = self.attr_name print('Test', name) for (val, subtree) in self.branches.items(): print(' ' * 4 * indent, name, '=', val, '==>', end=' ') subtree.display(indent + 1) def __repr__(self): - return ('DecisionFork({0!r}, {1!r}, {2!r})' - .format(self.attr, self.attrname, self.branches)) + return 'DecisionFork({0!r}, {1!r}, {2!r})'.format(self.attr, self.attr_name, self.branches) class DecisionLeaf: @@ -504,14 +384,12 @@ def __init__(self, result): def __call__(self, example): return self.result def display(self, indent=0): print('RESULT =', self.result) def __repr__(self): return repr(self.result) -# ______________________________________________________________________________ - def DecisionTreeLearner(dataset): """[Figure 18.5]""" @@ -521,28 +399,27 @@ def DecisionTreeLearner(dataset): def decision_tree_learning(examples, attrs, parent_examples=()): if len(examples) == 0: return plurality_value(parent_examples) - elif all_same_class(examples): + if all_same_class(examples): return DecisionLeaf(examples[0][target]) - elif len(attrs) == 0: + if len(attrs) == 0: return plurality_value(examples) - else: - A = choose_attribute(attrs, examples) - tree = DecisionFork(A, dataset.attrnames[A], plurality_value(examples)) - for (v_k, exs) in split_by(A, examples): - subtree = decision_tree_learning( - exs, removeall(A, attrs), examples) - tree.add(v_k, subtree) - return tree + A = choose_attribute(attrs, examples) + tree = DecisionFork(A, dataset.attr_names[A], plurality_value(examples)) + for (v_k, exs) in split_by(A, examples): + subtree = decision_tree_learning(exs, remove_all(A, attrs), examples) + tree.add(v_k, subtree) + return tree def plurality_value(examples): - """Return the most popular target value for this set of examples.
- (If target is binary, this is the majority; otherwise plurality.)""" - popular = argmax_random_tie(values[target], - key=lambda v: count(target, v, examples)) + """ + Return the most popular target value for this set of examples. + (If target is binary, this is the majority; otherwise plurality). + """ + popular = argmax_random_tie(values[target], key=lambda v: count(target, v, examples)) return DecisionLeaf(popular) def count(attr, val, examples): - """Count the number of examples that have attr = val.""" + """Count the number of examples that have example[attr] = val.""" return sum(e[attr] == val for e in examples) def all_same_class(examples): @@ -552,67 +429,36 @@ def all_same_class(examples): def choose_attribute(attrs, examples): """Choose the attribute with the highest information gain.""" - return argmax_random_tie(attrs, - key=lambda a: information_gain(a, examples)) + return argmax_random_tie(attrs, key=lambda a: information_gain(a, examples)) def information_gain(attr, examples): """Return the expected reduction in entropy from splitting by attr.""" + def I(examples): - return information_content([count(target, v, examples) - for v in values[target]]) - N = float(len(examples)) - remainder = sum((len(examples_i) / N) * I(examples_i) - for (v, examples_i) in split_by(attr, examples)) + return information_content([count(target, v, examples) for v in values[target]]) + + n = len(examples) + remainder = sum((len(examples_i) / n) * I(examples_i) for (v, examples_i) in split_by(attr, examples)) return I(examples) - remainder def split_by(attr, examples): """Return a list of (val, examples) pairs for each val of attr.""" - return [(v, [e for e in examples if e[attr] == v]) - for v in values[attr]] + return [(v, [e for e in examples if e[attr] == v]) for v in values[attr]] return decision_tree_learning(dataset.examples, dataset.inputs) def information_content(values): """Number of bits to represent the probability distribution in values.""" - probabilities = 
normalize(removeall(0, values)) - return sum(-p * math.log2(p) for p in probabilities) - -# ______________________________________________________________________________ - - -def RandomForest(dataset, n=5): - """An ensemble of Decision Trees trained using bagging and feature bagging.""" - - def data_bagging(dataset, m=0): - """Sample m examples with replacement""" - n = len(dataset.examples) - return weighted_sample_with_replacement(m or n, dataset.examples, [1]*n) - - def feature_bagging(dataset, p=0.7): - """Feature bagging with probability p to retain an attribute""" - inputs = [i for i in dataset.inputs if probability(p)] - return inputs or dataset.inputs - - def predict(example): - print([predictor(example) for predictor in predictors]) - return mode(predictor(example) for predictor in predictors) - - predictors = [DecisionTreeLearner(DataSet(examples=data_bagging(dataset), - attrs=dataset.attrs, - attrnames=dataset.attrnames, - target=dataset.target, - inputs=feature_bagging(dataset))) for _ in range(n)] - - return predict - -# ______________________________________________________________________________ - -# A decision list is implemented as a list of (test, value) pairs. + probabilities = normalize(remove_all(0, values)) + return sum(-p * np.log2(p) for p in probabilities) def DecisionListLearner(dataset): - """[Figure 18.11]""" + """ + [Figure 18.11] + A decision list implemented as a list of (test, value) pairs. + """ def decision_list_learning(examples): if not examples: @@ -623,8 +469,10 @@ def decision_list_learning(examples): return [(t, o)] + decision_list_learning(examples - examples_t) def find_examples(examples): - """Find a set of examples that all have the same outcome under - some test. Return a tuple of the test, outcome, and examples.""" + """ + Find a set of examples that all have the same outcome under + some test. Return a tuple of the test, outcome, and examples. 
+ """ raise NotImplementedError def passes(example, test): @@ -636,46 +484,141 @@ def predict(example): for test, outcome in predict.decision_list: if passes(example, test): return outcome + predict.decision_list = decision_list_learning(set(dataset.examples)) return predict -# ______________________________________________________________________________ + +def NearestNeighborLearner(dataset, k=1): + """k-NearestNeighbor: the k nearest neighbors vote.""" + + def predict(example): + """Find the k closest items, and have them vote for the best.""" + best = heapq.nsmallest(k, ((dataset.distance(e, example), e) for e in dataset.examples)) + return mode(e[dataset.target] for (d, e) in best) + + return predict + + +def LinearLearner(dataset, learning_rate=0.01, epochs=100): + """ + [Section 18.6.3] + Linear classifier with hard threshold. + """ + idx_i = dataset.inputs + idx_t = dataset.target + examples = dataset.examples + num_examples = len(examples) + + # X transpose + X_col = [dataset.values[i] for i in idx_i] # vertical columns of X + + # add dummy + ones = [1 for _ in range(len(examples))] + X_col = [ones] + X_col + + # initialize random weights + num_weights = len(idx_i) + 1 + w = random_weights(min_value=-0.5, max_value=0.5, num_weights=num_weights) + + for epoch in range(epochs): + err = [] + # pass over all examples + for example in examples: + x = [1] + example + y = np.dot(w, x) + t = example[idx_t] + err.append(t - y) + + # update weights + for i in range(len(w)): + w[i] = w[i] + learning_rate * (np.dot(err, X_col[i]) / num_examples) + + def predict(example): + x = [1] + example + return np.dot(w, x) + + return predict + + +def LogisticLinearLeaner(dataset, learning_rate=0.01, epochs=100): + """ + [Section 18.6.4] + Linear classifier with logistic regression. 
+ """ + idx_i = dataset.inputs + idx_t = dataset.target + examples = dataset.examples + num_examples = len(examples) + + # X transpose + X_col = [dataset.values[i] for i in idx_i] # vertical columns of X + + # add dummy + ones = [1 for _ in range(len(examples))] + X_col = [ones] + X_col + + # initialize random weights + num_weights = len(idx_i) + 1 + w = random_weights(min_value=-0.5, max_value=0.5, num_weights=num_weights) + + for epoch in range(epochs): + err = [] + h = [] + # pass over all examples + for example in examples: + x = [1] + example + y = sigmoid(np.dot(w, x)) + h.append(sigmoid_derivative(y)) + t = example[idx_t] + err.append(t - y) + + # update weights + for i in range(len(w)): + buffer = [x * y for x, y in zip(err, h)] + w[i] = w[i] + learning_rate * (np.dot(buffer, X_col[i]) / num_examples) + + def predict(example): + x = [1] + example + return sigmoid(np.dot(w, x)) + + return predict -def NeuralNetLearner(dataset, hidden_layer_sizes=[3], - learning_rate=0.01, epochs=100): - """Layered feed-forward network. +def NeuralNetLearner(dataset, hidden_layer_sizes=None, learning_rate=0.01, epochs=100, activation=sigmoid): + """ + Layered feed-forward network. 
hidden_layer_sizes: List of number of hidden units per hidden layer learning_rate: Learning rate of gradient descent epochs: Number of passes over the dataset """ + if hidden_layer_sizes is None: + hidden_layer_sizes = [3] i_units = len(dataset.inputs) o_units = len(dataset.values[dataset.target]) # construct a network - raw_net = network(i_units, hidden_layer_sizes, o_units) - learned_net = BackPropagationLearner(dataset, raw_net, - learning_rate, epochs) + raw_net = network(i_units, hidden_layer_sizes, o_units, activation) + learned_net = BackPropagationLearner(dataset, raw_net, learning_rate, epochs, activation) def predict(example): - - # Input nodes + # input nodes i_nodes = learned_net[0] - # Activate input layer + # activate input layer for v, n in zip(example, i_nodes): n.value = v - # Forward pass + # forward pass for layer in learned_net[1:]: for node in layer: inc = [n.value for n in node.inputs] - in_val = dotproduct(inc, node.weights) + in_val = dot_product(inc, node.weights) node.value = node.activation(in_val) - # Hypothesis + # hypothesis o_nodes = learned_net[-1] prediction = find_max_node(o_nodes) return prediction @@ -683,24 +626,20 @@ def predict(example): return predict -def random_weights(min_value, max_value, num_weights): - return [random.uniform(min_value, max_value) for i in range(num_weights)] - - -def BackPropagationLearner(dataset, net, learning_rate, epochs): - """[Figure 18.23] The back-propagation algorithm for multilayer network""" - # Initialise weights +def BackPropagationLearner(dataset, net, learning_rate, epochs, activation=sigmoid): + """ + [Figure 18.23] + The back-propagation algorithm for multilayer networks. 
+ """ + # initialise weights for layer in net: for node in layer: - node.weights = random_weights(min_value=-0.5, max_value=0.5, - num_weights=len(node.weights)) + node.weights = random_weights(min_value=-0.5, max_value=0.5, num_weights=len(node.weights)) examples = dataset.examples - ''' - As of now dataset.target gives an int instead of list, - Changing dataset class will have effect on all the learners. - Will be taken care of later - ''' + # As of now dataset.target gives an int instead of list, + # Changing dataset class will have effect on all the learners. + # Will be taken care of later. o_nodes = net[-1] i_nodes = net[0] o_units = len(o_nodes) @@ -711,53 +650,80 @@ def BackPropagationLearner(dataset, net, learning_rate, epochs): inputs, targets = init_examples(examples, idx_i, idx_t, o_units) for epoch in range(epochs): - # Iterate over each example + # iterate over each example for e in range(len(examples)): i_val = inputs[e] t_val = targets[e] - # Activate input layer + # activate input layer for v, n in zip(i_val, i_nodes): n.value = v - # Forward pass + # forward pass for layer in net[1:]: for node in layer: inc = [n.value for n in node.inputs] - in_val = dotproduct(inc, node.weights) + in_val = dot_product(inc, node.weights) node.value = node.activation(in_val) - # Initialize delta - delta = [[] for i in range(n_layers)] + # initialize delta + delta = [[] for _ in range(n_layers)] - # Compute outer layer delta + # compute outer layer delta - # Error for the MSE cost function + # error for the MSE cost function err = [t_val[i] - o_nodes[i].value for i in range(o_units)] - # The activation function used is the sigmoid function - delta[-1] = [sigmoid_derivative(o_nodes[i].value) * err[i] for i in range(o_units)] - # Backward pass + # calculate delta at output + if node.activation == sigmoid: + delta[-1] = [sigmoid_derivative(o_nodes[i].value) * err[i] for i in range(o_units)] + elif node.activation == relu: + delta[-1] = 
[relu_derivative(o_nodes[i].value) * err[i] for i in range(o_units)] + elif node.activation == tanh: + delta[-1] = [tanh_derivative(o_nodes[i].value) * err[i] for i in range(o_units)] + elif node.activation == elu: + delta[-1] = [elu_derivative(o_nodes[i].value) * err[i] for i in range(o_units)] + elif node.activation == leaky_relu: + delta[-1] = [leaky_relu_derivative(o_nodes[i].value) * err[i] for i in range(o_units)] + else: + raise ValueError("Activation function unknown.") + + # backward pass h_layers = n_layers - 2 for i in range(h_layers, 0, -1): layer = net[i] h_units = len(layer) - nx_layer = net[i+1] + nx_layer = net[i + 1] + # weights from each ith layer node to each i + 1th layer node w = [[node.weights[k] for node in nx_layer] for k in range(h_units)] - delta[i] = [sigmoid_derivative(layer[j].value) * dotproduct(w[j], delta[i+1]) - for j in range(h_units)] - - # Update weights + if activation == sigmoid: + delta[i] = [sigmoid_derivative(layer[j].value) * dot_product(w[j], delta[i + 1]) + for j in range(h_units)] + elif activation == relu: + delta[i] = [relu_derivative(layer[j].value) * dot_product(w[j], delta[i + 1]) + for j in range(h_units)] + elif activation == tanh: + delta[i] = [tanh_derivative(layer[j].value) * dot_product(w[j], delta[i + 1]) + for j in range(h_units)] + elif activation == elu: + delta[i] = [elu_derivative(layer[j].value) * dot_product(w[j], delta[i + 1]) + for j in range(h_units)] + elif activation == leaky_relu: + delta[i] = [leaky_relu_derivative(layer[j].value) * dot_product(w[j], delta[i + 1]) + for j in range(h_units)] + else: + raise ValueError("Activation function unknown.") + + # update weights for i in range(1, n_layers): layer = net[i] - inc = [node.value for node in net[i-1]] + inc = [node.value for node in net[i - 1]] units = len(layer) for j in range(units): layer[j].weights = vector_add(layer[j].weights, - scalar_vector_product( - learning_rate * delta[i][j], inc)) + scalar_vector_product(learning_rate * 
delta[i][j], inc)) return net @@ -773,184 +739,360 @@ def PerceptronLearner(dataset, learning_rate=0.01, epochs=100): def predict(example): o_nodes = learned_net[1] - # Forward pass + # forward pass for node in o_nodes: - in_val = dotproduct(example, node.weights) + in_val = dot_product(example, node.weights) node.value = node.activation(in_val) - # Hypothesis + # hypothesis return find_max_node(o_nodes) return predict class NNUnit: - """Single Unit of Multiple Layer Neural Network + """ + Single Unit of Multiple Layer Neural Network inputs: Incoming connections weights: Weights to incoming connections """ - def __init__(self, weights=None, inputs=None): - self.weights = [] - self.inputs = [] + def __init__(self, activation=sigmoid, weights=None, inputs=None): + self.weights = weights or [] + self.inputs = inputs or [] self.value = None - self.activation = sigmoid + self.activation = activation -def network(input_units, hidden_layer_sizes, output_units): - """Create Directed Acyclic Network of given number layers. +def network(input_units, hidden_layer_sizes, output_units, activation=sigmoid): + """ + Create Directed Acyclic Network of given number layers. 
hidden_layers_sizes : List number of neuron units in each hidden layer excluding input and output layers """ - # Check for PerceptronLearner - if hidden_layer_sizes: - layers_sizes = [input_units] + hidden_layer_sizes + [output_units] - else: - layers_sizes = [input_units] + [output_units] + layers_sizes = [input_units] + hidden_layer_sizes + [output_units] - net = [[NNUnit() for n in range(size)] - for size in layers_sizes] + net = [[NNUnit(activation) for _ in range(size)] for size in layers_sizes] n_layers = len(net) - # Make Connection + # make connection for i in range(1, n_layers): for n in net[i]: - for k in net[i-1]: + for k in net[i - 1]: n.inputs.append(k) n.weights.append(0) return net def init_examples(examples, idx_i, idx_t, o_units): - inputs = {} - targets = {} + inputs, targets = {}, {} - for i in range(len(examples)): - e = examples[i] - # Input values of e + for i, e in enumerate(examples): + # input values of e inputs[i] = [e[i] for i in idx_i] if o_units > 1: - # One-Hot representation of e's target + # one-hot representation of e's target t = [0 for i in range(o_units)] t[e[idx_t]] = 1 targets[i] = t else: - # Target value of e + # target value of e targets[i] = [e[idx_t]] return inputs, targets def find_max_node(nodes): - return nodes.index(argmax(nodes, key=lambda node: node.value)) + return nodes.index(max(nodes, key=lambda node: node.value)) -# ______________________________________________________________________________ +class SVC: -def LinearLearner(dataset, learning_rate=0.01, epochs=100): - """Define with learner = LinearLearner(data); infer with learner(x).""" - idx_i = dataset.inputs - idx_t = dataset.target # As of now, dataset.target gives only one index. 
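The per-activation if/elif chains in the `BackPropagationLearner` hunk above repeat the same delta formula once per activation function. A minimal sketch of an alternative using a derivative lookup table; the tiny `sigmoid`/`relu` helpers below are simplified stand-ins for the real ones imported from utils, not the repository's code:

```python
import math

# simplified stand-ins for the activation helpers assumed by the patch
def sigmoid(x): return 1 / (1 + math.exp(-x))
def sigmoid_derivative(value): return value * (1 - value)
def relu(x): return max(0.0, x)
def relu_derivative(value): return 1.0 if value > 0 else 0.0

# one table replaces the repeated if/elif chains; an unknown activation
# fails fast with a KeyError instead of needing an explicit else branch
DERIVATIVES = {sigmoid: sigmoid_derivative, relu: relu_derivative}

def output_delta(activation, values, errs):
    # delta_j = g'(value_j) * err_j, as computed for the output layer above
    derivative = DERIVATIVES[activation]
    return [derivative(v) * e for v, e in zip(values, errs)]
```

The hidden-layer pass could reuse the same table, so supporting a new activation would mean registering one function pair rather than editing two dispatch chains.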
- examples = dataset.examples - num_examples = len(examples) + def __init__(self, kernel=linear_kernel, C=1.0, verbose=False): + self.kernel = kernel + self.C = C # hyper-parameter + self.sv_idx, self.sv, self.sv_y = np.zeros(0), np.zeros(0), np.zeros(0) + self.alphas = np.zeros(0) + self.w = None + self.b = 0.0 # intercept + self.verbose = verbose - # X transpose - X_col = [dataset.values[i] for i in idx_i] # vertical columns of X + def fit(self, X, y): + """ + Trains the model by solving a quadratic programming problem. + :param X: array of size [n_samples, n_features] holding the training samples + :param y: array of size [n_samples] holding the class labels + """ + # In QP formulation (dual): m variables, 2m+1 constraints (1 equation, 2m inequations) + self.solve_qp(X, y) + sv = self.alphas > 1e-5 + self.sv_idx = np.arange(len(self.alphas))[sv] + self.sv, self.sv_y, self.alphas = X[sv], y[sv], self.alphas[sv] + + if self.kernel == linear_kernel: + self.w = np.dot(self.alphas * self.sv_y, self.sv) + + for n in range(len(self.alphas)): + self.b += self.sv_y[n] + self.b -= np.sum(self.alphas * self.sv_y * self.K[self.sv_idx[n], sv]) + self.b /= len(self.alphas) + return self + + def solve_qp(self, X, y): + """ + Solves a quadratic programming problem. In QP formulation (dual): + m variables, 2m+1 constraints (1 equation, 2m inequations). + :param X: array of size [n_samples, n_features] holding the training samples + :param y: array of size [n_samples] holding the class labels + """ + m = len(y) # m = n_samples + self.K = self.kernel(X) # gram matrix + P = self.K * np.outer(y, y) + q = -np.ones(m) + lb = np.zeros(m) # lower bounds + ub = np.ones(m) * self.C # upper bounds + A = y.astype(np.float64) # equality matrix + b = np.zeros(1) # equality vector + self.alphas = solve_qp(P, q, A=A, b=b, lb=lb, ub=ub, solver='cvxopt', + sym_proj=True, verbose=self.verbose) + + def predict_score(self, X): + """ + Predicts the score for a given example. 
+ """ + if self.w is None: + return np.dot(self.alphas * self.sv_y, self.kernel(self.sv, X)) + self.b + return np.dot(X, self.w) + self.b - # Add dummy - ones = [1 for _ in range(len(examples))] - X_col = [ones] + X_col + def predict(self, X): + """ + Predicts the class of a given example. + """ + return np.sign(self.predict_score(X)) - # Initialize random weigts - num_weights = len(idx_i) + 1 - w = random_weights(min_value=-0.5, max_value=0.5, num_weights=num_weights) - for epoch in range(epochs): - err = [] - # Pass over all examples - for example in examples: - x = [1] + example - y = dotproduct(w, x) - t = example[idx_t] - err.append(t - y) +class SVR: - # update weights - for i in range(len(w)): - w[i] = w[i] + learning_rate * (dotproduct(err, X_col[i]) / num_examples) + def __init__(self, kernel=linear_kernel, C=1.0, epsilon=0.1, verbose=False): + self.kernel = kernel + self.C = C # hyper-parameter + self.epsilon = epsilon # epsilon insensitive loss value + self.sv_idx, self.sv = np.zeros(0), np.zeros(0) + self.alphas_p, self.alphas_n = np.zeros(0), np.zeros(0) + self.w = None + self.b = 0.0 # intercept + self.verbose = verbose - def predict(example): - x = [1] + example - return dotproduct(w, x) - return predict + def fit(self, X, y): + """ + Trains the model by solving a quadratic programming problem. 
+ :param X: array of size [n_samples, n_features] holding the training samples + :param y: array of size [n_samples] holding the class labels + """ + # In QP formulation (dual): m variables, 2m+1 constraints (1 equation, 2m inequations) + self.solve_qp(X, y) + + sv = np.logical_or(self.alphas_p > 1e-5, self.alphas_n > 1e-5) + self.sv_idx = np.arange(len(self.alphas_p))[sv] + self.sv, sv_y = X[sv], y[sv] + self.alphas_p, self.alphas_n = self.alphas_p[sv], self.alphas_n[sv] -# ______________________________________________________________________________ + if self.kernel == linear_kernel: + self.w = np.dot(self.alphas_p - self.alphas_n, self.sv) + + for n in range(len(self.alphas_p)): + self.b += sv_y[n] + self.b -= np.sum((self.alphas_p - self.alphas_n) * self.K[self.sv_idx[n], sv]) + self.b -= self.epsilon + self.b /= len(self.alphas_p) + + return self + + def solve_qp(self, X, y): + """ + Solves a quadratic programming problem. In QP formulation (dual): + m variables, 2m+1 constraints (1 equation, 2m inequations). 
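Once `SVC.fit` has recovered the support vectors, `predict_score` above reduces to the standard kernel expansion f(x) = sum_i alpha_i * y_i * k(sv_i, x) + b. A self-contained numpy sketch of that expansion; the `linear_kernel` helper and all numeric values below are hand-picked for illustration, not learned by the solver:

```python
import numpy as np

def linear_kernel(X, Y):
    # Gram matrix between rows of X and rows of Y (illustrative stand-in
    # for the kernel helper the patch imports from utils)
    return np.dot(np.asarray(X), np.asarray(Y).T)

def decision_function(alphas, sv_y, sv, b, X, kernel=linear_kernel):
    # mirrors SVC.predict_score: f(x) = sum_i alpha_i * y_i * k(sv_i, x) + b
    return np.dot(alphas * sv_y, kernel(sv, X)) + b

# toy support vectors chosen by hand purely for illustration
sv = np.array([[1.0, 0.0], [-1.0, 0.0]])
alphas = np.array([0.5, 0.5])
sv_y = np.array([1.0, -1.0])
X = np.array([[2.0, 0.0], [-2.0, 0.0]])
scores = decision_function(alphas, sv_y, sv, 0.0, X)  # signed margins
labels = np.sign(scores)                              # SVC.predict
```

With a linear kernel the same scores come from the precomputed weight vector `w`, which is why `fit` caches `w` only in the linear case.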
+ :param X: array of size [n_samples, n_features] holding the training samples + :param y: array of size [n_samples] holding the class labels + """ + # + m = len(y) # m = n_samples + self.K = self.kernel(X) # gram matrix + P = np.vstack((np.hstack((self.K, -self.K)), # alphas_p, alphas_n + np.hstack((-self.K, self.K)))) # alphas_n, alphas_p + q = np.hstack((-y, y)) + self.epsilon + lb = np.zeros(2 * m) # lower bounds + ub = np.ones(2 * m) * self.C # upper bounds + A = np.hstack((np.ones(m), -np.ones(m))) # equality matrix + b = np.zeros(1) # equality vector + alphas = solve_qp(P, q, A=A, b=b, lb=lb, ub=ub, solver='cvxopt', + sym_proj=True, verbose=self.verbose) + self.alphas_p = alphas[:m] + self.alphas_n = alphas[m:] + + def predict(self, X): + if self.kernel != linear_kernel: + return np.dot(self.alphas_p - self.alphas_n, self.kernel(self.sv, X)) + self.b + return np.dot(X, self.w) + self.b + + +class MultiClassLearner: + + def __init__(self, clf, decision_function='ovr'): + self.clf = clf + self.decision_function = decision_function + self.n_class, self.classifiers = 0, [] + + def fit(self, X, y): + """ + Trains n_class or n_class * (n_class - 1) / 2 classifiers + according to the training method, ovr or ovo respectively. 
+ :param X: array of size [n_samples, n_features] holding the training samples + :param y: array of size [n_samples] holding the class labels + :return: array of classifiers + """ + labels = np.unique(y) + self.n_class = len(labels) + if self.decision_function == 'ovr': # one-vs-rest method + for label in labels: + y1 = np.array(y) + y1[y1 != label] = -1.0 + y1[y1 == label] = 1.0 + self.clf.fit(X, y1) + self.classifiers.append(copy.deepcopy(self.clf)) + elif self.decision_function == 'ovo': # use one-vs-one method + n_labels = len(labels) + for i in range(n_labels): + for j in range(i + 1, n_labels): + neg_id, pos_id = y == labels[i], y == labels[j] + X1, y1 = np.r_[X[neg_id], X[pos_id]], np.r_[y[neg_id], y[pos_id]] + y1[y1 == labels[i]] = -1.0 + y1[y1 == labels[j]] = 1.0 + self.clf.fit(X1, y1) + self.classifiers.append(copy.deepcopy(self.clf)) + else: + raise ValueError("Decision function must be either 'ovr' or 'ovo'.") + return self + + def predict(self, X): + """ + Predicts the class of a given example according to the training method. 
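The two decision functions in `MultiClassLearner.fit` above trade classifier count for vote quality: 'ovr' trains one classifier per class, while 'ovo' trains one per unordered pair of classes, which is what the `n_class * (n_class - 1) / 2` assertion in `predict` checks. A quick sketch of that bookkeeping, using hypothetical class labels:

```python
from itertools import combinations

labels = [0, 1, 2, 3]                     # hypothetical class labels
ovr_jobs = list(labels)                   # 'ovr': one classifier per class
ovo_jobs = list(combinations(labels, 2))  # 'ovo': one classifier per pair

n = len(labels)
# matches the ovo assertion in MultiClassLearner.predict
assert len(ovo_jobs) == n * (n - 1) // 2
```

For four classes that is 4 versus 6 classifiers; the gap widens quadratically, which is the usual reason to prefer 'ovr' when training is expensive.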
+ """ + n_samples = len(X) + if self.decision_function == 'ovr': # one-vs-rest method + assert len(self.classifiers) == self.n_class + score = np.zeros((n_samples, self.n_class)) + for i in range(self.n_class): + clf = self.classifiers[i] + score[:, i] = clf.predict_score(X) + return np.argmax(score, axis=1) + elif self.decision_function == 'ovo': # use one-vs-one method + assert len(self.classifiers) == self.n_class * (self.n_class - 1) / 2 + vote = np.zeros((n_samples, self.n_class)) + clf_id = 0 + for i in range(self.n_class): + for j in range(i + 1, self.n_class): + res = self.classifiers[clf_id].predict(X) + vote[res < 0, i] += 1.0 # negative sample: class i + vote[res > 0, j] += 1.0 # positive sample: class j + clf_id += 1 + return np.argmax(vote, axis=1) + else: + raise ValueError("Decision function must be either 'ovr' or 'ovo'.") def EnsembleLearner(learners): """Given a list of learning algorithms, have them vote.""" + def train(dataset): predictors = [learner(dataset) for learner in learners] def predict(example): return mode(predictor(example) for predictor in predictors) + return predict - return train -# ______________________________________________________________________________ + return train -def AdaBoost(L, K): +def ada_boost(dataset, L, K): """[Figure 18.34]""" - def train(dataset): - examples, target = dataset.examples, dataset.target - N = len(examples) - epsilon = 1. / (2 * N) - w = [1. / N] * N - h, z = [], [] - for k in range(K): - h_k = L(dataset, w) - h.append(h_k) - error = sum(weight for example, weight in zip(examples, w) - if example[target] != h_k(example)) - # Avoid divide-by-0 from either 0% or 100% error rates: - error = clip(error, epsilon, 1 - epsilon) - for j, example in enumerate(examples): - if example[target] == h_k(example): - w[j] *= error / (1. - error) - w = normalize(w) - z.append(math.log((1. 
- error) / error)) - return WeightedMajority(h, z) - return train - -def WeightedMajority(predictors, weights): + examples, target = dataset.examples, dataset.target + n = len(examples) + eps = 1 / (2 * n) + w = [1 / n] * n + h, z = [], [] + for k in range(K): + h_k = L(dataset, w) + h.append(h_k) + error = sum(weight for example, weight in zip(examples, w) if example[target] != h_k(example)) + # avoid divide-by-0 from either 0% or 100% error rates + error = np.clip(error, eps, 1 - eps) + for j, example in enumerate(examples): + if example[target] == h_k(example): + w[j] *= error / (1 - error) + w = normalize(w) + z.append(np.log((1 - error) / error)) + return weighted_majority(h, z) + + +def weighted_majority(predictors, weights): """Return a predictor that takes a weighted vote.""" + def predict(example): - return weighted_mode((predictor(example) for predictor in predictors), - weights) + return weighted_mode((predictor(example) for predictor in predictors), weights) + return predict def weighted_mode(values, weights): - """Return the value with the greatest total weight. - >>> weighted_mode('abbaa', [1,2,3,1,2]) + """ + Return the value with the greatest total weight. 
+ >>> weighted_mode('abbaa', [1, 2, 3, 1, 2]) 'b' """ totals = defaultdict(int) for v, w in zip(values, weights): totals[v] += w - return max(list(totals.keys()), key=totals.get) + return max(totals, key=totals.__getitem__) -# _____________________________________________________________________________ -# Adapting an unweighted learner for AdaBoost + +def RandomForest(dataset, n=5): + """An ensemble of Decision Trees trained using bagging and feature bagging.""" + + def data_bagging(dataset, m=0): + """Sample m examples with replacement""" + n = len(dataset.examples) + return weighted_sample_with_replacement(m or n, dataset.examples, [1] * n) + + def feature_bagging(dataset, p=0.7): + """Feature bagging with probability p to retain an attribute""" + inputs = [i for i in dataset.inputs if probability(p)] + return inputs or dataset.inputs + + def predict(example): + return mode(predictor(example) for predictor in predictors) + + predictors = [DecisionTreeLearner(DataSet(examples=data_bagging(dataset), attrs=dataset.attrs, + attr_names=dataset.attr_names, target=dataset.target, + inputs=feature_bagging(dataset))) for _ in range(n)] + + return predict def WeightedLearner(unweighted_learner): - """Given a learner that takes just an unweighted dataset, return - one that takes also a weight for each example. [p. 749 footnote 14]""" + """ + [Page 749 footnote 14] + Given a learner that takes just an unweighted dataset, return + one that takes also a weight for each example. + """ + def train(dataset, weights): return unweighted_learner(replicated_dataset(dataset, weights)) + return train @@ -963,10 +1105,11 @@ def replicated_dataset(dataset, weights, n=None): def weighted_replicate(seq, weights, n): - """Return n selections from seq, with the count of each element of + """ + Return n selections from seq, with the count of each element of seq proportional to the corresponding weight (filling in fractions randomly). 
- >>> weighted_replicate('ABC', [1,2,1], 4) + >>> weighted_replicate('ABC', [1, 2, 1], 4) ['A', 'B', 'B', 'C'] """ assert len(seq) == len(weights) @@ -977,167 +1120,50 @@ def weighted_replicate(seq, weights, n): weighted_sample_with_replacement(n - sum(wholes), seq, fractions)) -def flatten(seqs): return sum(seqs, []) - -# _____________________________________________________________________________ -# Functions for testing learners on examples - - -def err_ratio(predict, dataset, examples=None, verbose=0): - """Return the proportion of the examples that are NOT correctly predicted.""" - """verbose - 0: No output; 1: Output wrong; 2 (or greater): Output correct""" - if examples is None: - examples = dataset.examples - if len(examples) == 0: - return 0.0 - right = 0.0 - for example in examples: - desired = example[dataset.target] - output = predict(dataset.sanitize(example)) - if output == desired: - right += 1 - if verbose >= 2: - print(' OK: got {} for {}'.format(desired, example)) - elif verbose: - print('WRONG: got {}, expected {} for {}'.format( - output, desired, example)) - return 1 - (right / len(examples)) - - -def grade_learner(predict, tests): - """Grades the given learner based on how many tests it passes. - tests is a list with each element in the form: (values, output).""" - return mean(int(predict(X) == y) for X, y in tests) - - -def train_and_test(dataset, start, end): - """Reserve dataset.examples[start:end] for test; train on the remainder.""" - start = int(start) - end = int(end) - examples = dataset.examples - train = examples[:start] + examples[end:] - val = examples[start:end] - return train, val - - -def cross_validation(learner, size, dataset, k=10, trials=1): - """Do k-fold cross_validate and return their mean. - That is, keep out 1/k of the examples for testing on each of k runs. - Shuffle the examples first; if trials>1, average over several shuffles. 
- Returns Training error, Validataion error""" - if k is None: - k = len(dataset.examples) - if trials > 1: - trial_errT = 0 - trial_errV = 0 - for t in range(trials): - errT, errV = cross_validation(learner, size, dataset, - k=10, trials=1) - trial_errT += errT - trial_errV += errV - return trial_errT / trials, trial_errV / trials - else: - fold_errT = 0 - fold_errV = 0 - n = len(dataset.examples) - examples = dataset.examples - for fold in range(k): - random.shuffle(dataset.examples) - train_data, val_data = train_and_test(dataset, fold * (n / k), - (fold + 1) * (n / k)) - dataset.examples = train_data - h = learner(dataset, size) - fold_errT += err_ratio(h, dataset, train_data) - fold_errV += err_ratio(h, dataset, val_data) - # Reverting back to original once test is completed - dataset.examples = examples - return fold_errT / k, fold_errV / k - - -def cross_validation_wrapper(learner, dataset, k=10, trials=1): - """[Fig 18.8] - Return the optimal value of size having minimum error - on validation set. 
- err_train: A training error array, indexed by size - err_val: A validation error array, indexed by size - """ - err_val = [] - err_train = [] - size = 1 +# metrics - while True: - errT, errV = cross_validation(learner, size, dataset, k) - # Check for convergence provided err_val is not empty - if (err_train and isclose(err_train[-1], errT, rel_tol=1e-6)): - best_size = 0 - min_val = math.inf +def accuracy_score(y_pred, y_true): + assert y_pred.shape == y_true.shape + return np.mean(np.equal(y_pred, y_true)) - i = 0 - while i k / 2)) examples.append(bits) - return DataSet(name="majority", examples=examples) + return DataSet(name='majority', examples=examples) -def Parity(k, n, name="parity"): - """Return a DataSet with n k-bit examples of the parity problem: - k random bits followed by a 1 if an odd number of bits are 1, else 0.""" +def Parity(k, n, name='parity'): + """ + Return a DataSet with n k-bit examples of the parity problem: + k random bits followed by a 1 if an odd number of bits are 1, else 0. + """ examples = [] for i in range(n): - bits = [random.choice([0, 1]) for i in range(k)] + bits = [random.choice([0, 1]) for _ in range(k)] bits.append(sum(bits) % 2) examples.append(bits) return DataSet(name=name, examples=examples) @@ -1197,28 +1225,29 @@ def Parity(k, n, name="parity"): def Xor(n): """Return a DataSet with n examples of 2-input xor.""" - return Parity(2, n, name="xor") + return Parity(2, n, name='xor') def ContinuousXor(n): - "2 inputs are chosen uniformly from (0.0 .. 2.0]; output is xor of ints." + """2 inputs are chosen uniformly from (0.0 .. 
2.0]; output is xor of ints.""" examples = [] for i in range(n): - x, y = [random.uniform(0.0, 2.0) for i in '12'] - examples.append([x, y, int(x) != int(y)]) - return DataSet(name="continuous xor", examples=examples) -# ______________________________________________________________________________ - -def compare(algorithms=[PluralityLearner, NaiveBayesLearner, - NearestNeighborLearner, DecisionTreeLearner], - datasets=[iris, orings, zoo, restaurant, SyntheticRestaurant(20), - Majority(7, 100), Parity(7, 100), Xor(100)], - k=10, trials=1): - """Compare various learners on various datasets using cross-validation. - Print results as a table.""" - print_table([[a.__name__.replace('Learner', '')] + - [cross_validation(a, d, k, trials) for d in datasets] - for a in algorithms], - header=[''] + [d.name[0:7] for d in datasets], numfmt='%.2f') + x, y = [random.uniform(0.0, 2.0) for _ in '12'] + examples.append([x, y, int(x) != int(y)]) + return DataSet(name='continuous xor', examples=examples) + + +def compare(algorithms=None, datasets=None, k=10, trials=1): + """ + Compare various learners on various datasets using cross-validation. + Print results as a table. 
+ """ + # default list of algorithms + algorithms = algorithms or [PluralityLearner, NaiveBayesLearner, NearestNeighborLearner, DecisionTreeLearner] + + # default list of datasets + datasets = datasets or [iris, orings, zoo, restaurant, SyntheticRestaurant(20), + Majority(7, 100), Parity(7, 100), Xor(100)] + + print_table([[a.__name__.replace('Learner', '')] + [cross_validation(a, d, k=k, trials=trials) for d in datasets] + for a in algorithms], header=[''] + [d.name[0:7] for d in datasets], numfmt='%.2f') diff --git a/learning4e.py b/learning4e.py new file mode 100644 index 000000000..12c0defa5 --- /dev/null +++ b/learning4e.py @@ -0,0 +1,1039 @@ +"""Learning from examples (Chapters 18)""" + +import copy +from collections import defaultdict +from statistics import stdev + +from qpsolvers import solve_qp + +from deep_learning4e import Sigmoid +from probabilistic_learning import NaiveBayesLearner +from utils4e import * + + +class DataSet: + """ + A data set for a machine learning problem. It has the following fields: + + d.examples A list of examples. Each one is a list of attribute values. + d.attrs A list of integers to index into an example, so example[attr] + gives a value. Normally the same as range(len(d.examples[0])). + d.attr_names Optional list of mnemonic names for corresponding attrs. + d.target The attribute that a learning algorithm will try to predict. + By default the final attribute. + d.inputs The list of attrs without the target. + d.values A list of lists: each sublist is the set of possible + values for the corresponding attribute. If initially None, + it is computed from the known examples by self.set_problem. + If not None, an erroneous value raises ValueError. + d.distance A function from a pair of examples to a non-negative number. + Should be symmetric, etc. Defaults to mean_boolean_error + since that can handle any field types. + d.name Name of the data set (for output display only). + d.source URL or other source where the data came from. 
+ d.exclude A list of attribute indexes to exclude from d.inputs. Elements + of this list can either be integers (attrs) or attr_names. + + Normally, you call the constructor and you're done; then you just + access fields like d.examples and d.target and d.inputs. + """ + + def __init__(self, examples=None, attrs=None, attr_names=None, target=-1, inputs=None, + values=None, distance=mean_boolean_error, name='', source='', exclude=()): + """ + Accepts any of DataSet's fields. Examples can also be a + string or file from which to parse examples using parse_csv. + Optional parameter: exclude, as documented in .set_problem(). + >>> DataSet(examples='1, 2, 3') + <DataSet(): 1 examples, 3 attributes> + """ + self.name = name + self.source = source + self.values = values + self.distance = distance + self.got_values_flag = bool(values) + + # initialize .examples from string or list or data directory + if isinstance(examples, str): + self.examples = parse_csv(examples) + elif examples is None: + self.examples = parse_csv(open_data(name + '.csv').read()) + else: + self.examples = examples + + # attrs are the indices of examples, unless otherwise stated. + if self.examples is not None and attrs is None: + attrs = list(range(len(self.examples[0]))) + + self.attrs = attrs + + # initialize .attr_names from string, list, or by default + if isinstance(attr_names, str): + self.attr_names = attr_names.split() + else: + self.attr_names = attr_names or attrs + self.set_problem(target, inputs=inputs, exclude=exclude) + + def set_problem(self, target, inputs=None, exclude=()): + """ + Set (or change) the target and/or inputs. + This way, one DataSet can be used multiple ways. inputs, if specified, + is a list of attributes, or specify exclude as a list of attributes + to not use in inputs. Attributes can be -n .. n, or an attr_name. + Also computes the list of possible values, if that wasn't done yet. 
+ """ + self.target = self.attr_num(target) + exclude = list(map(self.attr_num, exclude)) + if inputs: + self.inputs = remove_all(self.target, inputs) + else: + self.inputs = [a for a in self.attrs if a != self.target and a not in exclude] + if not self.values: + self.update_values() + self.check_me() + + def check_me(self): + """Check that my fields make sense.""" + assert len(self.attr_names) == len(self.attrs) + assert self.target in self.attrs + assert self.target not in self.inputs + assert set(self.inputs).issubset(set(self.attrs)) + if self.got_values_flag: + # only check if values are provided while initializing DataSet + list(map(self.check_example, self.examples)) + + def add_example(self, example): + """Add an example to the list of examples, checking it first.""" + self.check_example(example) + self.examples.append(example) + + def check_example(self, example): + """Raise ValueError if example has any invalid values.""" + if self.values: + for a in self.attrs: + if example[a] not in self.values[a]: + raise ValueError('Bad value {} for attribute {} in {}' + .format(example[a], self.attr_names[a], example)) + + def attr_num(self, attr): + """Returns the number used for attr, which can be a name, or -n .. 
n-1.""" + if isinstance(attr, str): + return self.attr_names.index(attr) + elif attr < 0: + return len(self.attrs) + attr + else: + return attr + + def update_values(self): + self.values = list(map(unique, zip(*self.examples))) + + def sanitize(self, example): + """Return a copy of example, with non-input attributes replaced by None.""" + return [attr_i if i in self.inputs else None for i, attr_i in enumerate(example)][:-1] + + def classes_to_numbers(self, classes=None): + """Converts class names to numbers.""" + if not classes: + # if classes were not given, extract them from values + classes = sorted(self.values[self.target]) + for item in self.examples: + item[self.target] = classes.index(item[self.target]) + + def remove_examples(self, value=''): + """Remove examples that contain given value.""" + self.examples = [x for x in self.examples if value not in x] + self.update_values() + + def split_values_by_classes(self): + """Split values into buckets according to their class.""" + buckets = defaultdict(lambda: []) + target_names = self.values[self.target] + + for v in self.examples: + item = [a for a in v if a not in target_names] # remove target from item + buckets[v[self.target]].append(item) # add item to bucket of its class + + return buckets + + def find_means_and_deviations(self): + """ + Finds the means and standard deviations of self.dataset. + means : a dictionary for each class/target. Holds a list of the means + of the features for the class. + deviations: a dictionary for each class/target. Holds a list of the sample + standard deviations of the features for the class. 
+ """ + target_names = self.values[self.target] + feature_numbers = len(self.inputs) + + item_buckets = self.split_values_by_classes() + + means = defaultdict(lambda: [0] * feature_numbers) + deviations = defaultdict(lambda: [0] * feature_numbers) + + for t in target_names: + # find all the item feature values for item in class t + features = [[] for _ in range(feature_numbers)] + for item in item_buckets[t]: + for i in range(feature_numbers): + features[i].append(item[i]) + + # calculate means and deviations of the class + for i in range(feature_numbers): + means[t][i] = mean(features[i]) + deviations[t][i] = stdev(features[i]) + + return means, deviations + + def __repr__(self): + return '<DataSet({}): {} examples, {} attributes>'.format(self.name, len(self.examples), len(self.attrs)) + + +def parse_csv(input, delim=','): + r""" + Input is a string consisting of lines, each line has comma-delimited + fields. Convert this into a list of lists. Blank lines are skipped. + Fields that look like numbers are converted to numbers. + The delim defaults to ',' but '\t' and None are also reasonable values. + >>> parse_csv('1, 2, 3 \n 0, 2, na') + [[1, 2, 3], [0, 2, 'na']] + """ + lines = [line for line in input.splitlines() if line.strip()] + return [list(map(num_or_str, line.split(delim))) for line in lines] + + +def err_ratio(learner, dataset, examples=None): + """ + Return the proportion of the examples that are NOT correctly predicted. + """ + examples = examples or dataset.examples + if len(examples) == 0: + return 0.0 + right = 0 + for example in examples: + desired = example[dataset.target] + output = learner.predict(dataset.sanitize(example)) + if np.allclose(output, desired): + right += 1 + return 1 - (right / len(examples)) + + +def grade_learner(learner, tests): + """ + Grades the given learner based on how many tests it passes. + tests is a list with each element in the form: (values, output). 
+ """ + return mean(int(learner.predict(X) == y) for X, y in tests) + + +def train_test_split(dataset, start=None, end=None, test_split=None): + """ + If you are giving 'start' and 'end' as parameters, + then it will return the testing set from index 'start' to 'end' + and the rest for training. + If you give 'test_split' as a parameter then it will return + test_split * 100% as the testing set and the rest as + training set. + """ + examples = dataset.examples + if test_split is None: + train = examples[:start] + examples[end:] + val = examples[start:end] + else: + total_size = len(examples) + val_size = int(total_size * test_split) + train_size = total_size - val_size + train = examples[:train_size] + val = examples[train_size:total_size] + + return train, val + + +def model_selection(learner, dataset, k=10, trials=1): + """ + [Figure 18.8] + Return the optimal value of size having minimum error on validation set. + errs: a validation error array, indexed by size + """ + errs = [] + size = 1 + while True: + err = cross_validation(learner, dataset, size, k, trials) + # check for convergence provided errs is not empty + if errs and np.isclose(errs[-1], err, rtol=1e-6): + best_size = 0 + min_val = np.inf + i = 0 + while i < size: + if errs[i] < min_val: + min_val = errs[i] + best_size = i + i += 1 + return learner(dataset, best_size) + errs.append(err) + size += 1 + + +def cross_validation(learner, dataset, size=None, k=10, trials=1): + """ + Do k-fold cross_validate and return their mean. + That is, keep out 1/k of the examples for testing on each of k runs. + Shuffle the examples first; if trials > 1, average over several shuffles. 
+    Returns the mean error over the held-out folds.
+    """
+    k = k or len(dataset.examples)
+    if trials > 1:
+        trial_errs = 0
+        for t in range(trials):
+            trial_errs += cross_validation(learner, dataset, size, k, trials=1)
+        return trial_errs / trials
+    else:
+        fold_errs = 0
+        n = len(dataset.examples)
+        examples = dataset.examples
+        random.shuffle(dataset.examples)
+        for fold in range(k):
+            train_data, val_data = train_test_split(dataset, fold * (n // k), (fold + 1) * (n // k))
+            dataset.examples = train_data
+            h = learner(dataset, size)
+            fold_errs += err_ratio(h, dataset, val_data)
+            # revert to the original examples once the fold is completed
+            dataset.examples = examples
+        return fold_errs / k
+
+
+def leave_one_out(learner, dataset, size=None):
+    """Leave one out cross-validation over the dataset."""
+    return cross_validation(learner, dataset, size, len(dataset.examples))
+
+
+def learning_curve(learner, dataset, trials=10, sizes=None):
+    if sizes is None:
+        sizes = list(range(2, len(dataset.examples) - trials, 2))
+
+    def score(learner, size):
+        random.shuffle(dataset.examples)
+        return cross_validation(learner, dataset, size, trials)
+
+    return [(size, mean([score(learner, size) for _ in range(trials)])) for size in sizes]
+
+
+class PluralityLearner:
+    """
+    A very dumb algorithm: always pick the result that was most popular
+    in the training data. Makes a baseline for comparison.
+    """
+
+    def __init__(self, dataset):
+        self.most_popular = mode([e[dataset.target] for e in dataset.examples])
+
+    def predict(self, example):
+        """Always return same result: the most popular from the training set."""
+        return self.most_popular
+
+
+class DecisionFork:
+    """
+    A fork of a decision tree holds an attribute to test, and a dict
+    of branches, one for each of the attribute's values.
+    """
+
+    def __init__(self, attr, attr_name=None, default_child=None, branches=None):
+        """Initialize by saying what attribute this node tests."""
+        self.attr = attr
+        self.attr_name = attr_name or attr
+        self.default_child = default_child
+        self.branches = branches or {}
+
+    def __call__(self, example):
+        """Given an example, classify it using the attribute and the branches."""
+        attr_val = example[self.attr]
+        if attr_val in self.branches:
+            return self.branches[attr_val](example)
+        else:
+            # return default class when attribute is unknown
+            return self.default_child(example)
+
+    def add(self, val, subtree):
+        """Add a branch. If self.attr = val, go to the given subtree."""
+        self.branches[val] = subtree
+
+    def display(self, indent=0):
+        name = self.attr_name
+        print('Test', name)
+        for (val, subtree) in self.branches.items():
+            print(' ' * 4 * indent, name, '=', val, '==>', end=' ')
+            subtree.display(indent + 1)
+
+    def __repr__(self):
+        return 'DecisionFork({0!r}, {1!r}, {2!r})'.format(self.attr, self.attr_name, self.branches)
+
+
+class DecisionLeaf:
+    """A leaf of a decision tree holds just a result."""
+
+    def __init__(self, result):
+        self.result = result
+
+    def __call__(self, example):
+        return self.result
+
+    def display(self, indent=0):
+        print('RESULT =', self.result)
+
+    def __repr__(self):
+        return repr(self.result)
+
+
+class DecisionTreeLearner:
+    """[Figure 18.5]"""
+
+    def __init__(self, dataset):
+        self.dataset = dataset
+        self.tree = self.decision_tree_learning(dataset.examples, dataset.inputs)
+
+    def decision_tree_learning(self, examples, attrs, parent_examples=()):
+        if len(examples) == 0:
+            return self.plurality_value(parent_examples)
+        if self.all_same_class(examples):
+            return DecisionLeaf(examples[0][self.dataset.target])
+        if len(attrs) == 0:
+            return self.plurality_value(examples)
+        A = self.choose_attribute(attrs, examples)
+        tree = DecisionFork(A, self.dataset.attr_names[A], self.plurality_value(examples))
+        for (v_k, exs) in self.split_by(A,
examples): + subtree = self.decision_tree_learning(exs, remove_all(A, attrs), examples) + tree.add(v_k, subtree) + return tree + + def plurality_value(self, examples): + """ + Return the most popular target value for this set of examples. + (If target is binary, this is the majority; otherwise plurality). + """ + popular = argmax_random_tie(self.dataset.values[self.dataset.target], + key=lambda v: self.count(self.dataset.target, v, examples)) + return DecisionLeaf(popular) + + def count(self, attr, val, examples): + """Count the number of examples that have example[attr] = val.""" + return sum(e[attr] == val for e in examples) + + def all_same_class(self, examples): + """Are all these examples in the same target class?""" + class0 = examples[0][self.dataset.target] + return all(e[self.dataset.target] == class0 for e in examples) + + def choose_attribute(self, attrs, examples): + """Choose the attribute with the highest information gain.""" + return argmax_random_tie(attrs, key=lambda a: self.information_gain(a, examples)) + + def information_gain(self, attr, examples): + """Return the expected reduction in entropy from splitting by attr.""" + + def I(examples): + return information_content([self.count(self.dataset.target, v, examples) + for v in self.dataset.values[self.dataset.target]]) + + n = len(examples) + remainder = sum((len(examples_i) / n) * I(examples_i) + for (v, examples_i) in self.split_by(attr, examples)) + return I(examples) - remainder + + def split_by(self, attr, examples): + """Return a list of (val, examples) pairs for each val of attr.""" + return [(v, [e for e in examples if e[attr] == v]) for v in self.dataset.values[attr]] + + def predict(self, x): + return self.tree(x) + + +def information_content(values): + """Number of bits to represent the probability distribution in values.""" + probabilities = normalize(remove_all(0, values)) + return sum(-p * np.log2(p) for p in probabilities) + + +class DecisionListLearner: + """ + [Figure 18.11] + A 
decision list implemented as a list of (test, value) pairs.
+    """
+
+    def __init__(self, dataset):
+        self.decision_list = self.decision_list_learning(set(dataset.examples))
+
+    def decision_list_learning(self, examples):
+        if not examples:
+            return [(True, False)]
+        t, o, examples_t = self.find_examples(examples)
+        if not t:
+            raise Exception('no consistent test found for the remaining examples')
+        return [(t, o)] + self.decision_list_learning(examples - examples_t)
+
+    def find_examples(self, examples):
+        """
+        Find a set of examples that all have the same outcome under
+        some test. Return a tuple of the test, outcome, and examples.
+        """
+        raise NotImplementedError
+
+    def passes(self, example, test):
+        """Does the example pass the test?"""
+        raise NotImplementedError
+
+    def predict(self, example):
+        """Predict the outcome for the first passing test."""
+        for test, outcome in self.decision_list:
+            if self.passes(example, test):
+                return outcome
+
+
+class NearestNeighborLearner:
+    """k-NearestNeighbor: the k nearest neighbors vote."""
+
+    def __init__(self, dataset, k=1):
+        self.dataset = dataset
+        self.k = k
+
+    def predict(self, example):
+        """Find the k closest items, and have them vote for the best."""
+        best = heapq.nsmallest(self.k, ((self.dataset.distance(e, example), e) for e in self.dataset.examples))
+        return mode(e[self.dataset.target] for (d, e) in best)
+
+
+class SVC:
+
+    def __init__(self, kernel=linear_kernel, C=1.0, verbose=False):
+        self.kernel = kernel
+        self.C = C  # hyper-parameter
+        self.sv_idx, self.sv, self.sv_y = np.zeros(0), np.zeros(0), np.zeros(0)
+        self.alphas = np.zeros(0)
+        self.w = None
+        self.b = 0.0  # intercept
+        self.verbose = verbose
+
+    def fit(self, X, y):
+        """
+        Trains the model by solving a quadratic programming problem.
+ :param X: array of size [n_samples, n_features] holding the training samples + :param y: array of size [n_samples] holding the class labels + """ + # In QP formulation (dual): m variables, 2m+1 constraints (1 equation, 2m inequations) + self.solve_qp(X, y) + sv = self.alphas > 1e-5 + self.sv_idx = np.arange(len(self.alphas))[sv] + self.sv, self.sv_y, self.alphas = X[sv], y[sv], self.alphas[sv] + + if self.kernel == linear_kernel: + self.w = np.dot(self.alphas * self.sv_y, self.sv) + + for n in range(len(self.alphas)): + self.b += self.sv_y[n] + self.b -= np.sum(self.alphas * self.sv_y * self.K[self.sv_idx[n], sv]) + self.b /= len(self.alphas) + return self + + def solve_qp(self, X, y): + """ + Solves a quadratic programming problem. In QP formulation (dual): + m variables, 2m+1 constraints (1 equation, 2m inequations). + :param X: array of size [n_samples, n_features] holding the training samples + :param y: array of size [n_samples] holding the class labels + """ + m = len(y) # m = n_samples + self.K = self.kernel(X) # gram matrix + P = self.K * np.outer(y, y) + q = -np.ones(m) + lb = np.zeros(m) # lower bounds + ub = np.ones(m) * self.C # upper bounds + A = y.astype(np.float64) # equality matrix + b = np.zeros(1) # equality vector + self.alphas = solve_qp(P, q, A=A, b=b, lb=lb, ub=ub, solver='cvxopt', + sym_proj=True, verbose=self.verbose) + + def predict_score(self, X): + """ + Predicts the score for a given example. + """ + if self.w is None: + return np.dot(self.alphas * self.sv_y, self.kernel(self.sv, X)) + self.b + return np.dot(X, self.w) + self.b + + def predict(self, X): + """ + Predicts the class of a given example. 
+ """ + return np.sign(self.predict_score(X)) + + +class SVR: + + def __init__(self, kernel=linear_kernel, C=1.0, epsilon=0.1, verbose=False): + self.kernel = kernel + self.C = C # hyper-parameter + self.epsilon = epsilon # epsilon insensitive loss value + self.sv_idx, self.sv = np.zeros(0), np.zeros(0) + self.alphas_p, self.alphas_n = np.zeros(0), np.zeros(0) + self.w = None + self.b = 0.0 # intercept + self.verbose = verbose + + def fit(self, X, y): + """ + Trains the model by solving a quadratic programming problem. + :param X: array of size [n_samples, n_features] holding the training samples + :param y: array of size [n_samples] holding the class labels + """ + # In QP formulation (dual): m variables, 2m+1 constraints (1 equation, 2m inequations) + self.solve_qp(X, y) + + sv = np.logical_or(self.alphas_p > 1e-5, self.alphas_n > 1e-5) + self.sv_idx = np.arange(len(self.alphas_p))[sv] + self.sv, sv_y = X[sv], y[sv] + self.alphas_p, self.alphas_n = self.alphas_p[sv], self.alphas_n[sv] + + if self.kernel == linear_kernel: + self.w = np.dot(self.alphas_p - self.alphas_n, self.sv) + + for n in range(len(self.alphas_p)): + self.b += sv_y[n] + self.b -= np.sum((self.alphas_p - self.alphas_n) * self.K[self.sv_idx[n], sv]) + self.b -= self.epsilon + self.b /= len(self.alphas_p) + + return self + + def solve_qp(self, X, y): + """ + Solves a quadratic programming problem. In QP formulation (dual): + m variables, 2m+1 constraints (1 equation, 2m inequations). 
+ :param X: array of size [n_samples, n_features] holding the training samples + :param y: array of size [n_samples] holding the class labels + """ + m = len(y) # m = n_samples + self.K = self.kernel(X) # gram matrix + P = np.vstack((np.hstack((self.K, -self.K)), # alphas_p, alphas_n + np.hstack((-self.K, self.K)))) # alphas_n, alphas_p + q = np.hstack((-y, y)) + self.epsilon + lb = np.zeros(2 * m) # lower bounds + ub = np.ones(2 * m) * self.C # upper bounds + A = np.hstack((np.ones(m), -np.ones(m))) # equality matrix + b = np.zeros(1) # equality vector + alphas = solve_qp(P, q, A=A, b=b, lb=lb, ub=ub, solver='cvxopt', + sym_proj=True, verbose=self.verbose) + self.alphas_p = alphas[:m] + self.alphas_n = alphas[m:] + + def predict(self, X): + if self.kernel != linear_kernel: + return np.dot(self.alphas_p - self.alphas_n, self.kernel(self.sv, X)) + self.b + return np.dot(X, self.w) + self.b + + +class MultiClassLearner: + + def __init__(self, clf, decision_function='ovr'): + self.clf = clf + self.decision_function = decision_function + self.n_class, self.classifiers = 0, [] + + def fit(self, X, y): + """ + Trains n_class or n_class * (n_class - 1) / 2 classifiers + according to the training method, ovr or ovo respectively. 
+        :param X: array of size [n_samples, n_features] holding the training samples
+        :param y: array of size [n_samples] holding the class labels
+        :return: array of classifiers
+        """
+        labels = np.unique(y)
+        self.n_class = len(labels)
+        if self.decision_function == 'ovr':  # one-vs-rest method
+            for label in labels:
+                y1 = np.array(y)
+                y1[y1 != label] = -1.0
+                y1[y1 == label] = 1.0
+                self.clf.fit(X, y1)
+                self.classifiers.append(copy.deepcopy(self.clf))
+        elif self.decision_function == 'ovo':  # one-vs-one method
+            n_labels = len(labels)
+            for i in range(n_labels):
+                for j in range(i + 1, n_labels):
+                    neg_id, pos_id = y == labels[i], y == labels[j]
+                    X1, y1 = np.r_[X[neg_id], X[pos_id]], np.r_[y[neg_id], y[pos_id]]
+                    y1[y1 == labels[i]] = -1.0
+                    y1[y1 == labels[j]] = 1.0
+                    self.clf.fit(X1, y1)
+                    self.classifiers.append(copy.deepcopy(self.clf))
+        else:
+            raise ValueError("Decision function must be either 'ovr' or 'ovo'.")
+        return self
+
+    def predict(self, X):
+        """
+        Predicts the class of a given example according to the training method.
+        """
+        n_samples = len(X)
+        if self.decision_function == 'ovr':  # one-vs-rest method
+            assert len(self.classifiers) == self.n_class
+            score = np.zeros((n_samples, self.n_class))
+            for i in range(self.n_class):
+                clf = self.classifiers[i]
+                score[:, i] = clf.predict_score(X)
+            return np.argmax(score, axis=1)
+        elif self.decision_function == 'ovo':  # one-vs-one method
+            assert len(self.classifiers) == self.n_class * (self.n_class - 1) / 2
+            vote = np.zeros((n_samples, self.n_class))
+            clf_id = 0
+            for i in range(self.n_class):
+                for j in range(i + 1, self.n_class):
+                    res = self.classifiers[clf_id].predict(X)
+                    vote[res < 0, i] += 1.0  # negative sample: class i
+                    vote[res > 0, j] += 1.0  # positive sample: class j
+                    clf_id += 1
+            return np.argmax(vote, axis=1)
+        else:
+            raise ValueError("Decision function must be either 'ovr' or 'ovo'.")
+
+
+def LinearLearner(dataset, learning_rate=0.01, epochs=100):
+    """
+    [Section 18.6.3]
+    Linear classifier with hard threshold.
+    """
+    idx_i = dataset.inputs
+    idx_t = dataset.target
+    examples = dataset.examples
+    num_examples = len(examples)
+
+    # X transpose: the vertical columns of X, one per input attribute
+    X_col = [[example[i] for example in examples] for i in idx_i]
+
+    # add dummy
+    ones = [1 for _ in range(len(examples))]
+    X_col = [ones] + X_col
+
+    # initialize random weights
+    num_weights = len(idx_i) + 1
+    w = random_weights(min_value=-0.5, max_value=0.5, num_weights=num_weights)
+
+    for epoch in range(epochs):
+        err = []
+        # pass over all examples
+        for example in examples:
+            x = [1] + [example[i] for i in idx_i]
+            y = np.dot(w, x)
+            t = example[idx_t]
+            err.append(t - y)
+
+        # update weights
+        for i in range(len(w)):
+            w[i] = w[i] + learning_rate * (np.dot(err, X_col[i]) / num_examples)
+
+    def predict(example):
+        x = [1] + [example[i] for i in idx_i]
+        return np.dot(w, x)
+
+    return predict
+
+
+def LogisticLinearLeaner(dataset, learning_rate=0.01, epochs=100):
+    """
+    [Section 18.6.4]
+    Linear classifier with logistic regression.
+    """
+    idx_i = dataset.inputs
+    idx_t = dataset.target
+    examples = dataset.examples
+    num_examples = len(examples)
+
+    # X transpose: the vertical columns of X, one per input attribute
+    X_col = [[example[i] for example in examples] for i in idx_i]
+
+    # add dummy
+    ones = [1 for _ in range(len(examples))]
+    X_col = [ones] + X_col
+
+    # initialize random weights
+    num_weights = len(idx_i) + 1
+    w = random_weights(min_value=-0.5, max_value=0.5, num_weights=num_weights)
+
+    for epoch in range(epochs):
+        err = []
+        h = []
+        # pass over all examples
+        for example in examples:
+            x = [1] + [example[i] for i in idx_i]
+            y = Sigmoid()(np.dot(w, x))
+            h.append(Sigmoid().derivative(y))
+            t = example[idx_t]
+            err.append(t - y)
+
+        # update weights
+        buffer = [x * y for x, y in zip(err, h)]
+        for i in range(len(w)):
+            w[i] = w[i] + learning_rate * (np.dot(buffer, X_col[i]) / num_examples)
+
+    def predict(example):
+        x = [1] + [example[i] for i in idx_i]
+        return Sigmoid()(np.dot(w, x))
+
+    return predict
+
+
+class EnsembleLearner:
+    """Given a list of learning algorithms, have them vote."""
+
+    def __init__(self, learners):
+        self.learners = learners
+
+    def train(self, dataset):
+        self.predictors = [learner(dataset) for learner in self.learners]
+
+    def predict(self, example):
+        return mode(predictor.predict(example) for predictor in self.predictors)
+
+
+def ada_boost(dataset, L, K):
+    """[Figure 18.34]"""
+
+    examples, target = dataset.examples, dataset.target
+    n = len(examples)
+    eps = 1 / (2 * n)
+    w = [1 / n] * n
+    h, z = [], []
+    for k in range(K):
+        h_k = L(dataset, w)
+        h.append(h_k)
+        error = sum(weight for example, weight in zip(examples, w) if example[target] != h_k.predict(example[:-1]))
+        # avoid divide-by-0 from either 0% or 100% error rates
+        error = np.clip(error, eps, 1 - eps)
+        for j, example in enumerate(examples):
+            if example[target] == h_k.predict(example[:-1]):
+                w[j] *= error / (1 - error)
+        w = normalize(w)
+        z.append(np.log((1 - error) / error))
+    return weighted_majority(h, z)
+
+
+class weighted_majority:
+    """Return a predictor
that takes a weighted vote.""" + + def __init__(self, predictors, weights): + self.predictors = predictors + self.weights = weights + + def predict(self, example): + return weighted_mode((predictor.predict(example) for predictor in self.predictors), self.weights) + + +def weighted_mode(values, weights): + """ + Return the value with the greatest total weight. + >>> weighted_mode('abbaa', [1, 2, 3, 1, 2]) + 'b' + """ + totals = defaultdict(int) + for v, w in zip(values, weights): + totals[v] += w + return max(totals, key=totals.__getitem__) + + +class RandomForest: + """An ensemble of Decision Trees trained using bagging and feature bagging.""" + + def __init__(self, dataset, n=5): + self.dataset = dataset + self.n = n + self.predictors = [DecisionTreeLearner(DataSet(examples=self.data_bagging(), attrs=self.dataset.attrs, + attr_names=self.dataset.attr_names, target=self.dataset.target, + inputs=self.feature_bagging())) for _ in range(self.n)] + + def data_bagging(self, m=0): + """Sample m examples with replacement""" + n = len(self.dataset.examples) + return weighted_sample_with_replacement(m or n, self.dataset.examples, [1] * n) + + def feature_bagging(self, p=0.7): + """Feature bagging with probability p to retain an attribute""" + inputs = [i for i in self.dataset.inputs if probability(p)] + return inputs or self.dataset.inputs + + def predict(self, example): + return mode(predictor.predict(example) for predictor in self.predictors) + + +def WeightedLearner(unweighted_learner): + """ + [Page 749 footnote 14] + Given a learner that takes just an unweighted dataset, return + one that takes also a weight for each example. 
+ """ + + def train(dataset, weights): + dataset = replicated_dataset(dataset, weights) + n_samples, n_features = len(dataset.examples), dataset.target + X, y = (np.array([x[:n_features] for x in dataset.examples]), + np.array([x[n_features] for x in dataset.examples])) + return unweighted_learner.fit(X, y) + + return train + + +def replicated_dataset(dataset, weights, n=None): + """Copy dataset, replicating each example in proportion to its weight.""" + n = n or len(dataset.examples) + result = copy.copy(dataset) + result.examples = weighted_replicate(dataset.examples, weights, n) + return result + + +def weighted_replicate(seq, weights, n): + """ + Return n selections from seq, with the count of each element of + seq proportional to the corresponding weight (filling in fractions + randomly). + >>> weighted_replicate('ABC', [1, 2, 1], 4) + ['A', 'B', 'B', 'C'] + """ + assert len(seq) == len(weights) + weights = normalize(weights) + wholes = [int(w * n) for w in weights] + fractions = [(w * n) % 1 for w in weights] + return (flatten([x] * nx for x, nx in zip(seq, wholes)) + + weighted_sample_with_replacement(n - sum(wholes), seq, fractions)) + + +# metrics + +def accuracy_score(y_pred, y_true): + assert y_pred.shape == y_true.shape + return np.mean(np.equal(y_pred, y_true)) + + +def r2_score(y_pred, y_true): + assert y_pred.shape == y_true.shape + return 1. 
- (np.sum(np.square(y_pred - y_true)) /  # sum of squares of residuals
+                 np.sum(np.square(y_true - np.mean(y_true))))  # total sum of squares
+
+
+# datasets
+
+orings = DataSet(name='orings', target='Distressed', attr_names='Rings Distressed Temp Pressure Flightnum')
+
+zoo = DataSet(name='zoo', target='type', exclude=['name'],
+              attr_names='name hair feathers eggs milk airborne aquatic predator toothed backbone '
+                         'breathes venomous fins legs tail domestic catsize type')
+
+iris = DataSet(name='iris', target='class', attr_names='sepal-len sepal-width petal-len petal-width class')
+
+
+def RestaurantDataSet(examples=None):
+    """
+    [Figure 18.3]
+    Build a DataSet of Restaurant waiting examples.
+    """
+    return DataSet(name='restaurant', target='Wait', examples=examples,
+                   attr_names='Alternate Bar Fri/Sat Hungry Patrons Price Raining Reservation Type WaitEstimate Wait')
+
+
+restaurant = RestaurantDataSet()
+
+
+def T(attr_name, branches):
+    branches = {value: (child if isinstance(child, DecisionFork) else DecisionLeaf(child))
+                for value, child in branches.items()}
+    return DecisionFork(restaurant.attr_num(attr_name), attr_name, print, branches)
+
+
+"""
+[Figure 18.2]
+A decision tree for deciding whether to wait for a table at a restaurant.
+""" + +waiting_decision_tree = T('Patrons', + {'None': 'No', 'Some': 'Yes', + 'Full': T('WaitEstimate', + {'>60': 'No', '0-10': 'Yes', + '30-60': T('Alternate', + {'No': T('Reservation', + {'Yes': 'Yes', + 'No': T('Bar', {'No': 'No', + 'Yes': 'Yes'})}), + 'Yes': T('Fri/Sat', {'No': 'No', 'Yes': 'Yes'})}), + '10-30': T('Hungry', + {'No': 'Yes', + 'Yes': T('Alternate', + {'No': 'Yes', + 'Yes': T('Raining', + {'No': 'No', + 'Yes': 'Yes'})})})})}) + + +def SyntheticRestaurant(n=20): + """Generate a DataSet with n examples.""" + + def gen(): + example = list(map(random.choice, restaurant.values)) + example[restaurant.target] = waiting_decision_tree(example) + return example + + return RestaurantDataSet([gen() for _ in range(n)]) + + +def Majority(k, n): + """ + Return a DataSet with n k-bit examples of the majority problem: + k random bits followed by a 1 if more than half the bits are 1, else 0. + """ + examples = [] + for i in range(n): + bits = [random.choice([0, 1]) for _ in range(k)] + bits.append(int(sum(bits) > k / 2)) + examples.append(bits) + return DataSet(name='majority', examples=examples) + + +def Parity(k, n, name='parity'): + """ + Return a DataSet with n k-bit examples of the parity problem: + k random bits followed by a 1 if an odd number of bits are 1, else 0. + """ + examples = [] + for i in range(n): + bits = [random.choice([0, 1]) for _ in range(k)] + bits.append(sum(bits) % 2) + examples.append(bits) + return DataSet(name=name, examples=examples) + + +def Xor(n): + """Return a DataSet with n examples of 2-input xor.""" + return Parity(2, n, name='xor') + + +def ContinuousXor(n): + """2 inputs are chosen uniformly from (0.0 .. 
2.0]; output is xor of ints."""
+    examples = []
+    for i in range(n):
+        x, y = [random.uniform(0.0, 2.0) for _ in '12']
+        examples.append([x, y, int(x) != int(y)])
+    return DataSet(name='continuous xor', examples=examples)
+
+
+def compare(algorithms=None, datasets=None, k=10, trials=1):
+    """
+    Compare various learners on various datasets using cross-validation.
+    Print results as a table.
+    """
+    # default list of algorithms
+    algorithms = algorithms or [PluralityLearner, NaiveBayesLearner, NearestNeighborLearner, DecisionTreeLearner]
+
+    # default list of datasets
+    datasets = datasets or [iris, orings, zoo, restaurant, SyntheticRestaurant(20),
+                            Majority(7, 100), Parity(7, 100), Xor(100)]
+
+    print_table([[a.__name__.replace('Learner', '')] + [cross_validation(a, d, k=k, trials=trials) for d in datasets]
+                 for a in algorithms], header=[''] + [d.name[0:7] for d in datasets], numfmt='%.2f')
diff --git a/learning_apps.ipynb b/learning_apps.ipynb
new file mode 100644
index 000000000..dd45b11b5
--- /dev/null
+++ b/learning_apps.ipynb
@@ -0,0 +1,988 @@
+{
+ "cells": [
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "# LEARNING APPLICATIONS\n",
+    "\n",
+    "In this notebook we will take a look at some indicative applications of machine learning techniques. We will cover content from [`learning.py`](https://github.com/aimacode/aima-python/blob/master/learning.py), for chapter 18 of Stuart Russell's and Peter Norvig's book [*Artificial Intelligence: A Modern Approach*](http://aima.cs.berkeley.edu/).
Execute the cell below to get started:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 94,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "from learning import *\n",
+    "from probabilistic_learning import *\n",
+    "from notebook import *"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## CONTENTS\n",
+    "\n",
+    "* MNIST Handwritten Digits\n",
+    "    * Loading and Visualizing\n",
+    "    * Testing\n",
+    "* MNIST Fashion"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## MNIST HANDWRITTEN DIGITS CLASSIFICATION\n",
+    "\n",
+    "The MNIST Digits database, available from [this page](http://yann.lecun.com/exdb/mnist/), is a large database of handwritten digits that is commonly used for training and validating machine learning models.\n",
+    "\n",
+    "The dataset has **60,000 training images** of size 28x28 pixels with labels and **10,000 testing images** of size 28x28 pixels with labels.\n",
+    "\n",
+    "In this section, we will use this database to compare the performance of different learning algorithms.\n",
+    "\n",
+    "It is estimated that humans have an error rate of about **0.2%** on this problem. Let's see how our algorithms perform!\n",
+    "\n",
+    "NOTE: We will be using external libraries to load and visualize the dataset smoothly ([numpy](http://www.numpy.org/) for loading and [matplotlib](http://matplotlib.org/) for visualization). You do not need previous experience with these libraries to follow along."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Loading MNIST Digits Data\n",
+    "\n",
+    "Let's start by loading MNIST data into numpy arrays.\n",
+    "\n",
+    "The function `load_MNIST()` loads MNIST data from files saved in `aima-data/MNIST`. It returns four numpy arrays that we are going to use to train and classify hand-written digits in various learning approaches."
+ ] + }, + { + "cell_type": "code", + "execution_count": 95, + "metadata": {}, + "outputs": [], + "source": [ + "train_img, train_lbl, test_img, test_lbl = load_MNIST()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Check the shape of these NumPy arrays to make sure we have loaded the database correctly.\n", + "\n", + "Each 28x28 pixel image is flattened to a 784x1 array and we should have 60,000 of them in training data. Similarly, we should have 10,000 of those 784x1 arrays in testing data." + ] + }, + { + "cell_type": "code", + "execution_count": 96, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Training images size: (60000, 784)\n", + "Training labels size: (60000,)\n", + "Testing images size: (10000, 784)\n", + "Testing labels size: (10000,)\n" + ] + } + ], + "source": [ + "print(\"Training images size:\", train_img.shape)\n", + "print(\"Training labels size:\", train_lbl.shape)\n", + "print(\"Testing images size:\", test_img.shape)\n", + "print(\"Testing labels size:\", test_lbl.shape)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Visualizing Data\n", + "\n", + "To get a better understanding of the dataset, let's visualize some random images for each class from training and testing datasets." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 97, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAzkAAAKoCAYAAABUXzFLAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAAPYQAAD2EBqD+naQAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4wLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvqOYd8AAAIABJREFUeJzs3Xm8VdP/x/FXpEHznJCxpEimkCTSIBrM85RExkJFQoPoR5lJKFNKlHnKlCFjMiTzlClSac6Uzu8P38/a69x77nXvueeevc8+7+fj4dG29hlWqzPt9fmsz6qQSCQSiIiIiIiIxMQGYXdAREREREQkk3SRIyIiIiIisaKLHBERERERiRVd5IiIiIiISKzoIkdERERERGJFFzkiIiIiIhIrusgREREREZFY0UWOiIiIiIjEii5yREREREQkVnSR41m9ejUDBgygSZMmVKlShTZt2vDAAw+E3a3IW7VqFYMHD6ZLly40aNCAChUqMHz48LC7lRNeeukl+vTpQ4sWLahWrRqbbropvXr1Yu7cuWF3LdI++OADDjroIJo2bUrVqlWpW7cue+21F5MnTw67aznpzjvvpEKFClSvXj3srkTayy+/TIUKFVL+99Zbb4XdvZwwe/ZsunfvTp06dahatSrNmjVj1KhRYXcr0k4++eQiX3d67RXv/fffp3fv3jRp0oSNN96YFi1aMHLkSNauXRt21yLvnXfeoWvXrtSoUYPq1auz33778frrr4fdrVKpGHYHouTQQw9lzpw5jBkzhubNmzNlyhSOOeYY1q9fz7HHHht29yJr6dKl3H777ey000707t2bO++8M+wu5Yzx48ezdOlSzjvvPFq2bMnixYsZN24ce+65JzNnzmT//fcPu4uRtHz5cjbffHOOOeYYNt10U9asWcP999/PCSecwIIFCxg2bFjYXcwZP/30ExdeeCFNmjRhxYoVYXcnJ1x55ZXst99+SW077LBDSL3JHVOmTOGEE07gyCOP5N5776V69ep8/fXXLFy4MOyuRdqll17KGWecUai9R48eVK5cmd133z2EXkXfJ598Qrt27dhuu+24/vrrqV+/Pq+++iojR45k7ty5PPbYY2F3MbLmzJlDhw4daNu2Lffddx+JRIKrr76aTp06MWvWLPbaa6+wu1gyCUkkEonEU089lQASU6ZMSWrv3LlzokmTJol169aF1LPoW79+fWL9+vWJRCKRWLx4cQJIXH755eF2KkcsWrSoUNuqVasSjRo1SnTq1CmEHuW2PfbYI7H55puH3Y2ccvDBByd69OiROOmkkxLVqlULuzuRNmvWrASQeOihh8LuSs758ccfE9WqVUv0798/7K7Ewssvv5wAEsOGDQu7K5F1ySWXJIDEV199ldTer1+/BJD47bffQupZ9HXt2jXRqFGjxJo1a1zbypUrE/Xr10+0a9cuxJ6VjtLV/ueRRx6hevXqHHHEEUntp5xyCgsXLuTtt98OqWfRZyFzKb2GDRsWaqtevTotW7bkhx9+CKFHua1+/fpUrKgAdUlNnjyZV155hVtvvTXsrkjM3XnnnaxZs4YhQ4aE3ZVYmDhxIhUqVKBPnz5hdyWyNtpoIwBq1aqV1F67dm022GADKlWqFEa3csLrr79Ox44d2XjjjV1bjRo16NChA2+88QY///xziL0rOV3k/M/8+fPZfvvtC/1Aat26tTsvkg0rVqzgvffeo1WrVmF3JfLWr1/PunXrWLx4MbfeeiszZ87Uj6gS+vXXXxkwYABjxoxhs802C7s7OeWss86iYsWK1KxZk65duzJ79uywuxR5r776KnXr1uWzzz6jTZs2VKxYkYYNG3LGGWewcuXKsLuXU1
asWMH06dPp1KkTW221VdjdiayTTjqJ2rVr079/f7755htWrVrFk08+yYQJEzjrrLOoVq1a2F2MrL/++ovKlSsXare2jz76KNtdSosucv5n6dKl1K1bt1C7tS1dujTbXZI8ddZZZ7FmzRouueSSsLsSeWeeeSYbbbQRDRs2ZODAgdx4442cfvrpYXcrJ5x55plst9129O/fP+yu5IxatWpx3nnnMWHCBGbNmsUNN9zADz/8QMeOHZk5c2bY3Yu0n376ibVr13LEEUdw1FFH8cILLzBo0CDuvfdeunfvTiKRCLuLOWPq1Kn8/vvvnHrqqWF3JdK23HJL3nzzTebPn88222xDzZo16dGjByeddBI33HBD2N2LtJYtW/LWW2+xfv1617Zu3TqX1ZQrv4mV1+EpLuVK6ViSDZdeein3338/N910E7vuumvY3Ym8oUOH0rdvX3799VeeeOIJzj77bNasWcOFF14YdtcibcaMGTzxxBO8//77+mwrhZ133pmdd97Z/f8+++zDIYccwo477sjgwYPp2rVriL2LtvXr1/PHH39w+eWXc9FFFwHQsWNHKlWqxIABA3jxxRc54IADQu5lbpg4cSL16tXjkEMOCbsrkbZgwQJ69OhBo0aNmD59Og0aNODtt9/miiuuYPXq1UycODHsLkbWOeecw6mnnsrZZ5/NJZdcwvr16xkxYgTfffcdABtskBsxktzoZRbUq1cv5ZXpb7/9BpAyyiOSSSNGjOCKK65g9OjRnH322WF3Jyc0bdqU3Xbbje7duzN+/Hj69evHxRdfzOLFi8PuWmStXr2as846i3POOYcmTZqwfPlyli9fzl9//QX8W7luzZo1Ifcyd9SuXZuDDz6YefPm8fvvv4fdnciqV68eQKELwQMPPBCA9957L+t9ykXz5s3j3Xff5fjjj0+ZTiSBiy66iJUrVzJz5kwOO+wwOnTowKBBg7j++uuZNGkSr7zySthdjKw+ffowZswY7rvvPjbbbDOaNm3KJ5984iYQN91005B7WDK6yPmfHXfckU8//ZR169YltVveocqDSnkaMWIEw4cPZ/jw4QwdOjTs7uSstm3bsm7dOr755puwuxJZS5YsYdGiRYwbN446deq4/6ZOncqaNWuoU6cOxx13XNjdzCmWaqWoWNFsfWtBNna5MjMcNos+9O3bN+SeRN8HH3xAy5YtC629sZLbWmtdvCFDhrBkyRI++ugjFixYwBtvvMGyZcuoVq1azmSa6FPlfw455BBWr17NjBkzktrvuecemjRpwh577BFSzyTuRo0axfDhwxk2bBiXX3552N3JabNmzWKDDTZg6623DrsrkdW4cWNmzZpV6L+uXbtSpUoVZs2axRVXXBF2N3PGsmXLePLJJ2nTpg1VqlQJuzuRddhhhwHwzDPPJLU//fTTAOy5555Z71Ou+fPPP5k8eTJt27bVxGsJNGnShI8//pjVq1cntb/55psAKrhSApUrV2aHHXZgiy224Pvvv2fatGmcdtppVK1aNeyulYjW5PzPgQceSOfOnenfvz8rV65k2223ZerUqTz77LNMnjyZDTfcMOwuRtozzzzDmjVrWLVqFfDvJlzTp08HoHv37kllCCUwbtw4LrvsMrp168ZBBx1UaOdqffGn1q9fP2rWrEnbtm1p1KgRS5Ys4aGHHmLatGkMGjSIBg0ahN3FyKpSpQodO3Ys1H733Xez4YYbpjwn/zr22GNdimT9+vX58ssvGTduHIsWLeLuu+8Ou3uR1qVLF3r06MHIkSNZv349e+65J++++y4jRozg4IMPpn379mF3MfIeffRRfvvtN0VxSmjAgAH07t2bzp07M3DgQOrXr89bb73FVVddRcuWLV2qpBQ2f/58ZsyYwW677UblypX58MMPGTNmDM2aNWPUqFFhd6/kQt6nJ1JWrVqVOPfccxONGzdOVKpUKdG6devE1KlTw+5WTthiiy0SQMr/vv3227C7F1n77rtvkeOmt2fRJk
V7f0pq/KiNZcj4pcinTJkCQM+ePYFwpKssf9sFW1t85513Zq6zeXbllVcCwTpcfzP7atWqAeGov60Bs7/b/O+R7777DoCLLroICD4bIYhQDhw4EAj/PZ2rstLJKJIjIiIiIiKxopscERERERGJlVilq1nKTrJiAxAsqEplYV+qZs+eDcBzzz0HwIknnpix5y5kfrqahd6t9KCVzgb4v//7v9x2rERYyL1Xr17l2qzgwBVXXJHTPmWDFQw46aSTgPDOy7ZY1BbvWqoCwO233w4EO65bSB2C9MpiYqXC7ffo2rWra7PUICsyAEGa2m+//ZarLhY1P9Wxbt26oTY/fcMKWpRCOeRMO/jgg0P/fvDBB65t2rRpQJCeGgezZs0K/SvR+elnAwYMAGD48OHunKUGfv/990B4S5Crrroq9BxWNATg66+/Dv0bB/a3maXg7bnnnq5t4403BpIX8Pnkk0/csaUEWsn8vfbay7X17t0bCP7u8ws62HYXK1asiPhbVJ4iOSIiIiIiEiuxiuSkyhY+2kKrTLDFXclK7cWd3fnbHb1tJAhw3XXXAbB8+fLcdyxmrGQjBCUzy846A9xzzz1AeDO5YjdnzpzQv4ncdddd7tgWSNrM1X777efannrqqWx0MWM23HBDAPr37+/OlS2csMYawTyVbRToP94iOLZBr//4ZJv2tmrVCoB99tkHCJfLN1Z+1I/mFruFCxe640WLFgGw1VZbAeHxuvfee4Hg86zQr6Vca9euXYVt9l2QqCiBZUTYtg7+5sYiPvvue/XVV9052z7ENgq1jaIh2LDdSrz7UQy/XHLc2OL/1q1bu3P2t0GiDZ3t+9M+4xKxwisQRG7s/WwRWgjK8PsbkueaIjkiIiIiIhIrsYrkbLbZZhW2+bmct9xyS8Zf22Y+/bvYUmM5x7Zeok2bNq7Nog+K5ETXsGFDIFye1C9VDuHrfMyYMbnpWBGJWi4+l2zj0vvvvx8If6aUnf32Z+K23HJLAC6//PJyz2kb81rEGVLb6NMiOIlm3S3CEVe2jqtjx45AeGsCK7P6xBNPAEH5boBrr70WCDZ6LEX2XZwoAmjr5F5//XUgvGHo+eefDwRrW/21d+ls9Fgq/LURZf/+sXUpkN81Ednmf+eNHDky9K+/PuSUU04BoGXLlkD4M82itPXr1wfCm2rHha2nAdhjjz0y9rxWqtrWSPnsPWvfTfmgSI6IiIiIiMSKbnJERERERCRWYpWu5u/+XdaTTz7pjv3FpZVhYU8IFv0mYikltqtuXFl6lC0sjTtLK2rcuHGkn99uu+3csZV7tkWRfqpL06ZNARg3bhxQPkUN4OeffwbghhtucOeS7fZcaixN0t+1vlBZKlTz5s3T+jlLD/XTRMvy04d+//13AN54440KH2/XdqLUtLFjx6bVv2JjhVTs33feece12cJmG2s/VcNK53fr1g2IZ+rL6ixZsgRInOZoaWqWjul/d/7yyy9AUPbeTxG0crSZLBhU7C688EJ37BekgXB581ItIX/HHXe44w8//BAISuz716aVSLZiBFZwRVbPUnVbtGgBhNOrrSS/pQpaOetcUiRHRERERERipcqqRFMteZZosWIqbFGxzbD77r77bndsd5XpWnfddYGgvKUtTIXEs+vmyy+/BIKF46mK8r8m6thlwt577w3A9OnTy7VZycHbbrstJ33J1tj5M9q2wPv4449P+7Uq8sMPPwDhmbdk182oUaNC/2aiXHQ+rjv/PWuvn4nFshbBsUhXtku8pzt2ycbNZsbOOeeccm1WSOCjjz5K6bXtfedvgGrHn3/+eYU/Z1FZWxDus/KgmSghXWyfdWbevHkAbL311u6c/S7du3cHYMKECVntQyGOnX1XWknoRo0auTb73rRITiIWyfYLZVhk87333stYPwtx7FJhBQesLDKUj+TYhrWQnc24i23s3nzzTSDIwBk6dGi5x/Tr1w+Axx9/3J2zzaUzuXl0sY1dKqzwxYwZM9
w5+3vpmWeeAYLoNkQfz3THTpEcERERERGJlVitycmG3XbbzR1fdtllAHTo0GG1P/fHH3+440MPPTTzHStAyWbYLLc6V5GcbPHL8yaL4NgGZRtssIE755dKrcimm24a+nd1bAZ5zz33BIp348/x48e7Y8u9j5q/65dVrVq1KhCsAygmNgvuR4yNlTwt5TLF6brxxhuBYDw/+OAD12bXTLol7m3jYyt5DME6nfvuuw8IbxSaydngQmbrZizS4Edyttlmm9X+vF37/voy+6zLZCSnWNnakbLRGwjWHyfbMLlU1K5d2x3XqlULCNZG+98vFiWxLB9bowPQpEkTILgmJbHvvvsOCK+DsvLStk6nV69ers0+j7NNkRwREREREYkV3eSIiIiIiEisxCpdzXZXPeqoo8q1+WHdjTfeGAgWI/upQbag99hjjwXgoIMOcm1169ZdbR/uueceAF588UV37uOPP07tFyhyljrzyiuvAOG0IQuv2067/uK0YnL00Uen9Di7jtZcM9pbzC91bOO60UYbAeHrtVOnTgAcdthhQLh8ty2i9Bc+F2q6jJ/OYikGb7/9tjuXTqqAhcghSEOw4h9xkas0teHDhwNwwQUXlGvr27cvkNnCG9l05plnAsHC1V122cW1WYpFonRa2wIg0eJtS8u666673DlLsbL3vv8d4pf1LQWLFi0CwoumbRd6+1xK9P3Ytm1bILzIOG7v4cqwFKpEXnvtNSD91Ms4sr/jIFgEb0WQvv7663KPt9LRVm4aggIZO+64Y7a6GSsvvfSSO7bPR/v7+4wzznBtVjrets3IFkVyREREREQkVmIVybHN2hJFcvwZeCshOG3aNAC6du3q2mrWrJny661cudId2yI2W0xVKtEbn80cJZohqVatGgAHHnggULyRHH+BcceOHSt8nEWukvFLJM+aNQsIZnr98qo202GRRCsyANC5c2cguIb9Esk2u+xvVmqbGBaaa6+91h1buXd/rC2SYOPjF/YwNlPvL1b++++/ARg8eHBmO1xiEpXtLMDdB5L69NNPgfAmvMbKxNq/PovkvPXWW+7cF198AcDkyZMBuPrqqyt8Xdssz3+OUtkg1DIbjjzySHeudevWQFBg5OGHH3ZtVpTAri0/YmlbRJQyK9N78sknV/iYiRMn5qo7Bc//7kvl88r+bvMfG3Wz71I1c+ZMd2yZJfb963/22t/pt956a1b7o0iOiIiIiIjESqwiObYO5vfff3fnatSoUe5xdjeZaEYvGds07+abbwaCNT2QfGOzUmPj4+fDGst5LVa2BgaCkuJdunQBgnU4AP/88w8QlJIGeOONN4CgBOojjzyS1mtbfrs/U2fHlm9ss6Q+60shGzdunDu22Vt/hsdmhG1WfMqUKa6tXr16QFDa3cpGQxDd9TdXldK03377AXDaaacBQflnCNbNJNpIeo01/psL3HXXXd05O7YMAX/NSdkZY3+z1e+//z76L1CELAp9yCGHuHMWobXNbhOt97IIrB+xsEhcKbP1bw0aNCjXZmsRFy5cmMsuFQ17j/plpSti73kIIrmlwqKFVhK6Muya9LfeMLZ5vCI5IiIiIiIiadBNjoiIiIiIxEqs0tWs5OzAgQPduWHDhgHpl/K1FB+/NOj1118PFO+u8rlSbAuSo7IQrF1jVpocghB3JkK+qbCFvbbLOgSlpseMGZOTPmSKlYKfPn26O3fTTTcBcMwxxwDJF976JSz9oiJS2qwsu//9YFq1agUEKY8QlDreYostVvvcidLVLJ3ZT9Uo1cXzfmq3FZ+55ZZbgCDdF4KCDpdeeimQfkpv3NlWAYlYym8xpCfnin2XQPCdYcV3/DQ0KxzSv3//cm1Dhw7Nej8LgaW6P//88wA888wzKf3ciBEjKmyzIl/2mZiPvw0VyRERERERkVipsqoAp939WbHKsjtJ2xgR4PDDDwdg7ty5QLiEpRk7diwA33zzTcb6kq4o/2syOXZR2aJIf0GpzYqeff
bZQLD5XrYU69gVgkIcO3t+e+/6RS1soaTNZNoMMeR+5ijd1yv0a84Kt4waNcodNjTTAAAgAElEQVSdswWjNus5fvz4Sr9OIV5zVo7dCoq0b9/etZXdGNAfH2MLwLO9nUAhjl2xKIaxs/cbwNSpU4GgQMYTTzzh2qwkr781QTYVw9j5rFiNRRL9aI0VGrBz/qbZtmlwJt/HhTh2FsmxwlHNmjVzbcn6a/1K5TGLFy925yx6NnLkyLT6me7YKZIjIiIiIiKxopscERERERGJldinqxWzQgxpFguNXXQau+jilq6WK7rmotPYRVcMYzdgwAB3bEUs5s2bB8ABBxzg2nJdEKkYxs5Xq1YtIEh1ttRngO233x6ASZMmAcFeThAuXpAphTx2lqLsF7moW7duqA8HH3ywa7N0ymS/k6Wk+WM5Y8aMSP1TupqIiIiIiJQ0RXIKWCHf7Rc6jV10GrvoFMmJRtdcdBq76Ap57Nq2bQuES+LbAnmbEffLb+daIY9dodPYRadIjoiIiIiIlLRYbQYqIiIiUuyqVq0KBNEbn5XfFZHkFMkREREREZFY0U2OiIiIiIjEigoPFDAtTotOYxedxi46FR6IRtdcdBq76DR20WnsotPYRafCAyIiIiIiUtIKMpIjIiIiIiISlSI5IiIiIiISK7rJERERERGRWNFNjoiIiIiIxIpuckREREREJFZ0kyMiIiIiIrGimxwREREREYkV3eSIiIiIiEis6CZHRERERERiRTc5IiIiIiISK7rJERERERGRWNFNjoiIiIiIxIpuckREREREJFbWzHcHEqlSpUq+u1AQVq1alfbPaOz+o7GLTmMXXbpjp3H7j6656DR20WnsotPYRaexiy7dsVMkR0REREREYkU3OSIiIiIiEiu6yRERERERkVjRTY6IiIiIiMSKbnJERERERCRWdJMjIiIiIiKxUpAlpEVEMunMM88E4MYbb3Tn7r77bgB69+4NwMqVK3PfMRGpULNmzdzxUUcdBcDFF18MwBprBHO0V111FQATJ0505+bMmZOLLopIAVMkR0REREREYkWRnEqoW7euOz7wwANDbc8++6w7XrRoUc76VOhmzpwJwODBg905f6xKQePGjQGoXbu2O3fiiSeG/vXZJmDjx48HYOjQoa5t7ty5Wetnsdpqq63c8UknnQRAnz59gPBGYjbWjzzyCFB616FEt9566wHQvn17AA4//HDXduSRRwLw2WefAdCqVasc9654rb/++gBcf/31ABxwwAGuzd7X//77b7mfu+iiiwD4+++/3TlFckREkRwREREREYkV3eSIiIiIiEisKF0tgo022giA++67z53bd999gWAxpB8qt8fdfPPN7tyKFSuy3s9CsuuuuwLQvHnzPPck/yyNxb9+LKXlm2++AWD58uWubeuttwaChbedO3d2bWPHjgXglFNOyWKPC1vVqlUBOP300wEYMmSIa6tRo8Zqf37QoEEAPPfcc+5copSYYuGnzk6ZMgWA4447DoD7778/L30qZi+88AIALVq0cOfsmrP0x1dffdW1nXfeeQC89NJLuepibNhnYseOHfPcE4mDJk2auOOzzz4bgB133BGAOnXqlHvchAkTgHDa+LJly7LeT8keRXJERERERCRWFMlJg0VwHn30UQD22muvCh/btGlTdzxs2DAgvCDaZlRLZXGkLQy1CMXnn3+ez+7kjD+TZDPsVnjAnxk2Dz30EAB33nmnO2eRhh49egDBLDIEUR0rjfzee+9lrO/Fwmbak70fP/zwQwCmT5/uzp1xxhkA7LbbbgBss802ri0u16dFpCySqkhO+mwm1z7/Abp37w4E197ixYtz37EiZ+M5evRod65Tp05A4kjqL7/8AsA555wDwOuvv17hY4rF2muvDUDfvn3dOfu8X3PN8n+eWRGacePGAdC/f3/X9vXXX2etn8WiYcOGAAwcOBCAI444wrVZsRAbQz9bwgpWWLaEv53A8ccfD5Re9k
1cKJIjIiIiIiKxokjOavhloi1fONmMcTI2AwXw119/AaUTydl5550BePzxx4H4zJRXxGZ/rrjiCnduiy22qPDxX3zxBQDffvstEJ5lsjLIb775JgAjR450bRtuuCEATz75JAANGjSobNeLjv3uFj397rvvXJv9f7BIjv//wCI5Nq5ffvll9jubA/Xq1St3zt53qdp4440B2G677QB45513XNs///xTid4Vn/fffx+AQw45xJ2zcu4SnUVwjjnmmAof88QTT7jjO+64AwivnSt29tl++eWXl2vzy92XPWeRxF122cW1devWDQii+Yl+Po6qVavmju+55x4A2rRpU+5xb7/9NgAjRowAgs9935gxYwA4+uij3bnHHnsMCLYaKBXVq1d3x/Y9ahvx+n8X2xYpw4cPB+Dee+91bb/99lu2u7laiuSIiIiIiEis6CZHRERERERiRelqFTj55JOBoPQqwO67756x57eF+P7CwbjxF91vttlmQFC0IY4spAtB2NtKiq+OLXq3UPonn3zi2mxBadu2bSv8+c033xwIrlu/D3FnYzZ16lQA5s+f79qWLl0aeqy973yWfuUvNi1ma621VrlzLVu2BODFF1+s8OcsRQ3ggQceAKBDhw4AjBo1yrVdeOGFQHGX2U6Hpfpdcskl7pylhX711Vd56FFxa9asGRAUYvE/Iy1t11KEEr1fi91BBx3kjocOHVqu3dJtLXXZZ4vnd9hhByBIJwWYPXs2AA8//DAA1157rWuLc1r8AQcc4I732GMPIPgs91PMbIuBZMUp7P285557unNWjCDu6Wq23YKlPfbr18+11a9fP/RY/3vVvm/se9gvzV0If98qkiMiIiIiIrGiSE4ZAwYMAIKFgP4sk81c2qzIu+++W+7nbSGhvwB83rx55R732muvZabDBezKK68sd+6jjz7KQ09yw1+olyyCYxt++jNKNnNpi/a+//77cj9nC0q7dOlSrs1ez+9DqbHiAonYZr1du3Yt1xaXTRs32WQTILxo1my55Zar/fnrr7/eHVsE5+mnnwZgwYIFrq1UFjSbP//8s9w5m0FXJCc1NssLcNhhhwFB5NCPCB566KFAOJIdNxYJhaBwjM+KEtn72L/G7PEWXTjrrLNc248//ggEn3FWjhuCGXUrIR+nkud+IR77bLLv0VQ3ybbvT8uI8P3f//0fAOussw4A7du3d21WBMMvFFSobBNUgA8++ACADTbYwJ0bPHgwEGxmbH+nQLClxYwZMwCYNm2aa/vjjz+AINrvX5OK5IiIiIiIiGSYbnJERERERCRWSjpdzXYU9sNrVgc80aJaWxA4duxYILxTc1m1atVyx4me66677orQ4+JgO1n7Yd27774biHcawkMPPeSOLRXNTxmwELGNhe2yDKmlD1g6kb83QseOHSvR4/hr1KgRANdccw0QDs8/++yzQOLFv8Vo2223BWDvvfcu13brrbdW+HM1a9YEwteVsYXfH3/8cSa6WJS23377cudKLWUvKisyYClqAFtttVWFj4/z94Px9xHZZ599yrVbEZrDDz8cCKdj/frrr0CQPuQX8vn999+BYL8wKx4CMGzYMCB4j9s+O3Hg/5623MAKEKSqXbt2oX/9dHHbX8fG2gpmQLDXTuvWrdPtds41bNjQHVsxI38fKtv/0fYwtLGAcOpaReya7NmzZ+U7m0GK5IiIiIiISKyUdCTHIjj+gtuy/NKLRx55JBAuUVuR6667rpK9K15W0tGfNb/xxhvz1Z2c8QsJWFTHj+5Uli1utBk7SczKlQNMnDgRgJ122gkIojcQLN71I2rFLNEu3+PHjweSR2Js92obI5HKsu0DbGG2X57c2OflOeeck7uOFQD7TIIgUmoloSHI/EhU2KisJUuWlDtnkYfnn3/enbNCGZtuummEHhc2K9Tgs0wa+2wDWLRoUegxfmTDykNbRsXMmTNd2wsvvABAlSpVgHAUd8KECZXqey75f7fadih+1N8iYl
EjMUOGDAGCcSoUiuSIiIiIiEislGQk58svvwTC62bKsplP21gLUovg2MxBotKQtpYHYNKkSal1tgjddNNNAKxYscKd848l82xDy1Su0bipV68eEJTWtvLvELwPbUbZNjqD8huFFiP/M8w2u/NZRDnZxp377bdf5jsWI4kip8lmK239ydlnnw0Ea54ALr30UiC4HuPE3/zZcv2trHmi689m4G19AMA999wDBGvo4rhGx488nHDCCUBQvhfg9ttvB6KXtq9WrRqQeI2dX8o7LhJt7mnXXZ8+fdw529zSNgr1txOw97itsUm01tW2cDj22GPduWTbFhQaPyvJrjF/rc1pp50W6Xlto9BWrVoBhbdeUZEcERERERGJFd3kiIiIiIhIrJRMutrJJ5/sjq2EZaIQuqWpWfnGefPmpfU6lt5mZRx9V155pTu2XWLjxHaVtxQaW/QM2hk8EywsbIUdfL/99hsQLgkZR7Zrs79ruKV8JAuTW+qGn4aQyaIQ+dKjRw937C+kTUfLli0r1QfbCRyCkri2WzsEi6t/+OGHSr1OvsyaNavcObvW1lprLSBcotsKWlhKiH/N9erVC4hXupqVifZ/J0sXsp3kE7ECIa+88oo7Z4+3crZWFj2urEhAJrcC6Ny5M5A4Xc3SmuNkzJgx7rhFixYAnHTSSQD07dvXtS1btgwIUgOvvvpq11a7dm0AzjvvPCD8XWJbh1ha708//ZTR/ueDFbrwC1dYYaN02Xdx9erVgcJL4VMkR0REREREYiX2kRwrMuBvQJZsdskKAqQbwbEFk3vuuScQjhLZgq84Rm9skSMEC59tUa5tQCaVY7OZ48aNAxIXzIh7wQGLNlg5T788uc26/fjjjwC88cYbru3bb78F4JRTTgHCEd04RHJs491ssXH3rzmLXtisp19kxSLY/iaFxV7gwcrv+izScO6555Z7jEVarQztn3/+6dr8xflxYRse+2Wiy2ZJJCt8kYiVOp42bZo7Z9+j559/fqR+lgp/s1FjZfKL/b2YiB+d6t27NxAUOjr11FNd28CBAwFYe+21ARg+fLhrs+1B7LvEX5AfpwiOmTp1KhAu0mOblFvkfeHChRX+fNu2bd2xFfwxVo67UCiSIyIiIiIisRKrSM6aa/7369gmnxDMQCaaSbIcxJtvvtmd8/M0K2Kzp61bt3bn2rdvH3odP7/RSkfHaSbA1KhRwx1bLvDPP/8MwOeff56XPsVNo0aNANh1113Ltdk1ZTNRcWXXlL2v/Nlx28z3rrvuAoLojc/WSeyxxx7unB3PmDEjCz3ODX/TYdvYzd/gzfKl/Q0Ija2hO/DAAyt8/vvuu2+1fZg9e7Y7Pu644wCYPHnyan+uWHz00UdAsO4NglnhO++8EwjWAEAQUTT+9RiXSI4fQbSoVjJ+mV/bGNo2AU203YKt87LMCEg/GlRqNt98cyBxeXNba1zMn3WpsGukf//+QLCOFaBDhw5AEKnwN1u1NWS2/ubggw92bXH8O2b69OlAuJy5re+00tr+d4ZFVJs3bw4E5fGh/FrYQrvGFMkREREREZFY0U2OiIiIiIjESqzS1SxNzdJXVsfSCPxytGVZeUwIFlhaeD5ZmN5PgRs9enRK/SlGfglbc+ihhwLh9A5JzzbbbOOO/TKYEE6FtMWTCxYsyE3H8uSLL74AYLfddgOilyS3RfOQ/UX7uWBlUSH43PPT1bp37x76NxPuv/9+IFi8+vTTT7u2OL7nbTf0Y445plzblClTVvvzfirpr7/+mrmO5VG66Wr+AvDHHnsMCNJLLc0c4JlnngGCFF1J3SWXXAJA1apVy7XFMeUqGSuN7KedWcqkFa154IEHXNuiRYuAoEDNBx98kJN+5strr70GhNNBR40aBQRp3P53RtnvD/t5CFLJDznkEADeeeedLPQ4OkVyREREREQkVmIRyalbty4APXv2TOnxtuhqyJAh5dpshsoiOLa5J4RLZJZlC/usBHUqBQyK2ZZbbg
mENzi1MpWvv/56XvpUiPxFtfXq1Vvt463UuV8GtE6dOqHH+KUvr7nmmkr2sLikG8GxBaj2vvZLqMZtts7Ka/vFFWwRaTI1a9YEgqIBPosKffbZZ+6cbeqZbPPVOEolapOIXzBk6NChmepOUUlUdCdRgRCLRlpBB1/jxo2BeBQMyRTbtByC8snG/6y74YYbctanQmWfV4k+t+zvtmeffTanfco3v2jMQQcdBEC3bt0AOP7448s93sq49+vXz52L+rmYK4rkiIiIiIhIrMQikmMlUG0jutWxzZ2MPxtiOZm2oZu/cWjZEpYWvfGfI91NRIvVaaedBsD666/vzvkbLZYif4NKu35ss0CANm3aZOR1km3SVSxsdvvSSy8FwiWHLVc/XRZp7dOnjzt35plnAsH72DZUhejregqVbYpnm1CWPa6IlTX219dZXv/ixYsB+P777zPWz1Jh2wpYhBsSRyiKXbLNta3NyvdCsN7QNg30N8m2TbUTlYvebLPNAGjYsCGgSA6E/5YpuxbH1swBzJo1K2d9KiR+JLt69ep57Enhs/ehff8m+x62ktsQfJfbuq9C23BWkRwREREREYkV3eSIiIiIiEisxCJdzRYiJgub+2zhWSo7KCd6Tlt8ZSlJkHhhZRzZAm5bpOyHJv3weCnydwi2VBXfn3/+CQS7DNuC73T5i5ctPbLQF/9BsCM3BKViLcWsMiVObed0K/Zhu6X7XnrpJSAosyoBS7udMGGCO2clQxPtRi+psaIXfnn3uJTYtpK7EBRJSVS4wtgu8z5LDffT+ew7OdF3s31uxmUMK8NS822ReCLPPfdcrrpTcCylcezYse6cv31AWfZ5d/bZZ2e3YzFx1FFHuWMbV0s5t/dpoVAkR0REREREYiUWkZzDDjsMSC0y40vl8X5JwWuvvRaA+fPnA6UTvfHZ4mRbBOpHb7777ru89Cnf9t13XyC8sVYitui9fv36QHgxbln+bKVdpxZF84s9jBw5EiiOSM56663njsuWY582bZo7btCgARCOyNjmgJ07dwbCGyza8yaK0F533XUAXHXVVUDhLYosJLfffrs7tplNKwzhL+C1AgcS8KOyVuLdFuTuuOOOeelTNvmztffddx8QLs2bqPxsWX5p7VRY2VrbTLSU2Wbj/gJwY9/DcSxykaqBAwcCsPbaa7tztqnvk08+CYQ3uPS/m2T1/Cwme9/7mSyFRJEcERERERGJlVhEci6++GIgszMXTzzxBAD333+/O/fKK69k7PmLiT/LYbnVFsU6//zz89KnQvK///0PCM8aJXLqqadW2LZy5UoAbrrpJgBGjRrl2iz6YNfkRx995Nr8jUEL3ZIlS9zxp59+CgQRGr+E9M8//wxAtWrV3LlkM232vLY2wErTQmollKViHTt2BMKRCpWTDljU0Y8u2EbJnTp1AuJXqrysl19+GQiXKbaNPrt06QIE7/PVsbU+r776KhB8t/vPWcpsQ9Rk43nIIYfkqjsFxyIMlnFSpUoV12bvUVsb6rf5x7J6fnT6xx9/BILv9EKjSI6IiIiIiMSKbnJERERERCRWYpGuZukpVpIy1QWNtnjSD7NbKpAtuPV3Yy5Ve+21lzveYostAJg0aRIQlJ8tZZZaYQsbAWrUqFHh421h/IoVK9y5QYMGAXD99ddX+HNWKr1YLV682B1biW17z1pBAYAddtih3M9aSNwWN9r1B8E1uHDhwgz3WIylyUDppKu1atUKgHr16gHhQhgtWrQAoHfv3kCQYglw+OGHA/Daa6/lpJ+Fwi9GcNlllwFBunebNm1cm6XxWcEgn5WmnTFjRtb6WWz8VN0BAwYAUKtWrQof76czlxpLcU6UfmbXYO3atYFwoYynn346B70rfs2aNQOCEt0A06dPB8Lp6IVEkRwREREREYmVKqv829kCEXURmJXkrVOnTkrPb2V6C7UkZZT/NdlYQGelswFOOOEEINiMrFBndfMxdi1btnTHVu7UZnUhKCbwxh
tvAIVbcrFQrrtilO7YFdK4rbvuuu7YiqxYxMJKlQNccMEFGX/tQrzmbGbygw8+AGDnnXcu99qjR48O/QtB1DFXCnHsikUxjJ0f2X7//fcrfNyDDz4IBJuyZvtPu0IcOysEcuuttwLJt2lYtmyZO7aS537RmmwqxLFLxZgxYwDo1auXO2fFRR599NGc9CHdsVMkR0REREREYkU3OSIiIiIiEiuxSleLm2INaRYCjV10GrvoijldzWcLTJ977jkgvJi5Xbt2QLC3UybomotOYxddMYxdqulqllKajXTSRAp57Jo0aQJAnz593DlLSZs6dSoA48aNc225Th0v5LFLxooM7LLLLu6c7ZmTq/3AlK4mIiIiIiIlTZGcAlasd/uFQGMXncYuurhEcnJN11x0GrvoimHs/EJKtsVAjx49AJg7d65r22+//YBwqf5sKoaxK1TFNnYbbLABAPPnzwfCBacsapYriuSIiIiIiEhJUySngBXb3X4h0dhFp7GLTpGcaHTNRaexi05jF53GLrpiGzuL1rzzzjsAXHTRRa7txhtvzGlfFMkREREREZGSppscERERERGJFaWrFbBiC2kWEo1ddBq76JSuFo2uueg0dtFp7KLT2EWnsYtO6WoiIiIiIlLSCjKSIyIiIiIiEpUiOSIiIiIiEiu6yRERERERkVjRTY6IiIiIiMSKbnJERERERCRWdJMjIiIiIiKxopscERERERGJFd3kiIiIiIhIrOgmR0REREREYkU3OSIiIiIiEiu6yRERERERkVjRTY6IiIiIiMSKbnJERERERCRW1sx3BxKpUqVKvrtQEFatWpX2z2js/qOxi05jF126Y6dx+4+uueg0dtFp7KLT2EWnsYsu3bFTJEdERERERGJFNzkiIiIiIhIruskREREREZFY0U2OiIiIiIjEim5yREREREQkVnSTIyIiIiIisaKbHBERERERiRXd5IiIiIiISKwU5Gag2bbmmv/92s2aNQOgU6dOrm2XXXYBYP/99wdgjTWC+8CpU6cC8MILLwBwyy23uLY///wziz0uLccee6w7vvfee8u1t2rVCoA5c+bkqksiIpJB9evXB2DIkCHunH32v//++wDsscceru3333/PYe9EJA4UyRERERERkVjRTY6IiIiIiMRKlVWrVq3KdyfKqlKlSsafs1q1au74hhtuAOD0009Pqy9lh+rHH390x/vuuy8AH3/8caX66YvyvyYbY5dte+21FwBNmjQBwmmA//77b7nH77rrrkDydLVSGbtsKIax23zzzd1x06ZNAejSpUvoX4AaNWqEfu7WW291x2eeeWbG+5Xu2Oma+08xXHOFqtjGztLFJ0yYAMDhhx9e4WM322wzd/z9999nvC/5HjtLz2vevLk799BDDwFw6KGHAuU/wwC22247AA455BB3bunSpQB899135R7/9NNPA/DHH38AcO2117q2qKn2+R67Yqaxiy7dsVMkR0REREREYqVkIjlXX321O+7bt2+o7bfffnPHNuPx4osvVvhcVqigY8eO7pwVI7jssssAmDVrViV7HO+7fYveANx8880ANG7cGAgXe7BIzgMPPODO2f+/xYsXV/j8+Ri7HXfc0R0fcMABAGy66abu3HPPPQcEEcC5c+dW6vWypRCvu549ewLQo0cPANq0aePa1l9/fSC1fq9YscId2/+vzz//PGP9zFUkZ5tttgFgrbXWcucWLFgA5K4ISv/+/YHgWgd4+OGHgeA9napCvOZSYVGJddddt9LPtXz5cgD+/vvvtH6uGMbO/0wfNGgQAAMHDqzw8Q8++CAAp5xyijv3119/Zbxf+R67lStXrrYfybJJEj0ulcfcd9997tyJJ56YWmfLyPfYVZb/nrXjJUuW5OS1i33sGjVq5I4tmnjccccBQWYFQLt27QB4+eWXM/baiuSIiIiIiEhJi30kp0GDBgDMmDHDnatTp07oMX4JaYvkJGMzx35+v93NWunL448/3rV99d
VX6XX6/yv2u/1kTj31VHd80003hdoSRXJslhqCsU62/ikXY3fEEUcAcOmllwLBmiJIHl2w3Gk/x3zy5MkAPProo0B4vZHlUedKvq+7unXrAuG1NcOGDQPCkYuyrz1v3jwgPHYWpbnooovK9fPOO+8EoHfv3hnre7YjOccccwwAd911FwDrrLOOa7M1Dv7MbDZmv4191vkmTZoEBBHtVOX7mjNVq1Z1xxtttFGFj7O1FBaRts+CdPn/f6655hoArrjiirSeo1DGLhn/s/GDDz4Itf3yyy/u2CL29plqn5XZku+xy1ckxy/H3bZtWyD9zIJ8j126bF12165dAejTp49r23bbbYGgZPmHH34Y+XXsc8O2KJk+fXq5xxTD2FWvXt0d299ctp7d/96pWbNmhc9h71/L0lm0aFGl+6VIjoiIiIiIlDTd5IiIiIiISKzEMl3NLy9r5Rj9hco///wzABdccAEAjzzyiGuLumjXSjLaQkk/zL7bbrsBiUs7JlMMIc10WfpZotLQxk8ZsZSu0047zZ17/PHHV/s6uRg7ew37Xfw0qWeeeQaAH374wZ3bf//9geB6qFWrVoXPbT8PQanzb775Jq3+RZWP6+7cc891x7agfZNNNnHnLK3j119/LfezZ511FhCk+vnFBSxF4dVXXwVgl112cW0zZ84Ewp8NlZWNdDU/HeD1118HoGHDhgC8++67rs0WfPoLui0FKpO22morICiuct1117m2ESNGRHrOfFxzG264oTuuV68eEB47Sw2sLH+rARsfu47fe+891+anVKejkL8nbIHy888/785tscUWocf4v7dfkCYX8j12VkhlwIAB7txbb70FBKk99tkFMHv27JSf27+We/XqBQR99/8WsXS1+fPnp9X3fI9dKuxzEoKCF927dy/3OFt6MGrUKCD1YjT2/P369XPn9t57bwDq168PBMVJfIU4dvZ9a3+n2N/HEJQ4TyUl0mePv/766wG45JJLKt1Ppen/ZV4AACAASURBVKuJiIiIiEhJK3+LGQM77bSTO040S2szuH4Zxcq6+OKLAahduzYQlNPz+5NuJCdObGFuKpEcvzS0LQ5MJXqTazfeeCMQLPj2Z9mWLVtW4eNtZt7foNbKkdts08EHH+zabBbUFkwWaunpyvDLEFuEy5+xsXFJNzJhG+m1atWqsl3MG/tMgWBm3MqRH3jgga7Niqz4kYNsOPnkk4HyBVyKxdFHHw0E1xSEF8Znml8+1SK0tm2BX1AljmxWuGz0BoKMikzM7hYrK7Tgb5FQWWuvvTYQnvm3Yyvq8/XXX7u2dCM4xcAisy+99JI7Z0VrRo4cCcBVV13l2pKVjl5vvfWA4G88PzrUrVs3IMg0ABg9ejQQFMIpZP5Gs/b3cIcOHTL+OpksIZ0uRXJERERERCRWdJMjIiIiIiKxEst0taOOOqrcOT8FyhZBZcPEiROBcLqa7dHxyiuvuHPZ3L+iUNheEpDemKdbZCBfzjnnnEg/lyg0bnu2WOqbLRSFYI+YN954A4AddtjBtcUl3cUKhECQquenpt18882RnvfCCy8EgnQNP4XDChUUo0Qpi1H340qFvzu4n1pYTKwgjaWp+SlqVuDE3xvCT+WoDH+/J/8Ygu8LCFJz45DWbMVjzj///Aof07dvXyAopiGZYYvETzrpJHeubJGcZOnicWC/u6WtQbD/VLJ9qLbccksA9t13X3fuvPPOA4LCLv53iO2B88QTT7hzlq5WDJ588kl3vOeee2b8+S1NzS+ekWuK5IiIiIiISKzEKpJjC3SPPPLIcm3+7rZ+RCXT7Lk/+ugjd26fffYBwuUxbeFwHFmRAT9645cDrohFvwo5epNttiDZFkdCMOs2fPhwAO69917XZgvP//nnnxz1MDv8GW1bnO3vhJ7KzKMtvPdn0mwRpc1kWsQMopc7LgS2iDZX/PfvrrvuGmpLp6xtrtlCawhmcC2C8+mnn7o2ix4OHjzYnfMj0RX58ssvge
RbD/hloq38tpWRt2IiEOxCb9sQQLRSs4XgtttuA4L3pG/M/2vvvuOlKM82jv/U2AnGhoqYKEZRbEQsoPiCggpiIRoLUUSNJSqiRFRsiQoWLNhRLFjAhsQexSgqIoq9IPYSeyEqGBvY3j/8XDPP7Jmz7O7ZPbs75/r+4zizZ3d4zuzumee+n/u+4goArrrqquY8pcxTFFLR6zS6TvVdklUqeBFS+wAVZgg/G5Q1oL9B0qK4KpQRRoJ0ndfb9+8555wDxH+bQmHfsYrQhp9bN954Y6OP32qrrUo8w/JxJMfMzMzMzDIlU81AVRJQa2BC22+/fbQ9adKk0k4sR1ja9ZZbbgGgffv2QHLmU82gVEIYkpGlxtRiwyhRo6uwoeXaa68NJCMNuVRaOYx06fdWzghOLY9dsVq3bg3EM8LKGwZYbbXVgGRJ0Kaqt7EbOHAgEM/GtWnTptHHhjPuWrt3zz33lO1cKtEMtEOHDtG23jeaOdSsZKWFJYBzr7XwHEqd0azUNReeW25Z93DNl9ZIhmvhwvcZJEvCjh49GoBx48YByTWfhejTpw8Qr8EDaNWqFQADBgyI9hVSWrhW3q9h6watVVh66aWB5Dqj3HGtploZu3JQFDJsLJrrzjvvBOJsi6ao5bHr2bMnABMmTIj2qfmv1raG6wy1jkn/prCs9mWXXQbEDUOVbdEU1Rg7feZAvAY2jFjlnlP4vah10h988AEAw4cPj44de+yxjb6mWjeEjdKbys1AzczMzMysRfNNjpmZmZmZZUqmCg+oM21IKT73339/2V8vXPyr7t/12gW8WN26dQPi8sYQp+/lW8Cm1LTTTz+9gmeXLS2l7GcxDj/88GhbBQQKCWOHnxEqS92lSxcAPv3003KeYtloQTrEpY6VkrfTTjtFx8IypvaL8D2jVAuVkh42bFjen1V627333gskS9s39VpRKogKFwCst956QHKxbiHpatWmlJcTTzwx2qc0NQnTxQux4YYbAnGZaUiW986l7/nwHLJMf2ccfPDB0T6VRs/3OaiiLlk3efJkAJ5//vlonwo/de3atdGf02eoykZDdto07LPPPtG2UmPTqBWBiipAnGKqAkGdOnUq6DVVvKWc6WrFciTHzMzMzMwyJVORHAkXaGlW44cffij764SLKddcc00gLiG9+eab5z2velfqv8URnLh0pcoxQjzTueKKKwLQr1+/6JiKPOi/obvuuivx35deeik6ptKOlbj2q0URnHyLHcPZ7+uvvx6IS2WGRUk0nlqAHjbPqyXh58whhxwCxDNqYalxzc6FjVVbevQvLISgcuv5GsuG0TAtUJ4+fXqFzi69UMO+++4bbYeFEGqVCs7kNjqFuCBPWvNaCRfBq3moZorzRW9CamQYvvdfeeWVgn62nujvjEceeQSAZZZZJjqmv3XSIjkqnR+W0M+KMDqv79Hjjz8egHXWWafRnwtbiRx22GFA/uu0pVDZ97TMAP3dV4P1yhrlSI6ZmZmZmWVKJiM54V1mmM/enK+ddqdbT3e/jclt9FlIk0/7hWaVLrroIiDZHFYzJF999RUAzz77bHRM5W3TKEpzzDHHJJ4H4pKiJ598crRPJW/rzaBBg4B4hi4sXa5/s/L31egspFldjVMoLepaq2699VYAnnrqKSDZmFO/23BmXBG+sGR7LpU/Dksk50prrldvZs6cCSQb4FWbSsDXs3yz5aeeeioAP/74Y4NjWp956KGHRvvyrRXIR+WBS/35eqFIbhjBacwDDzwQbecrK12vdN2FDXz12VdIxCGM2rSECM4dd9wRbe+yyy7N8poqWV5NjuSYmZmZmVmm+CbHzMzMzMwyJZPpaiGlBuWz4447RtsqyapFp1dccUVBr6PuuYsttlixp1jzwsXL+RYya0G9hCVXCx3HrGnfvn20rfKd6h4/YsSI6JgWyM+dOxdIdlwePHgwAKNGjQLgwgsvjI5pAb5S35TOBXEaljo2Q/z7q4fStCEtDFV6ZL
iouJAylSqLGaYv5EstrXVKN1AXbogX3e68887RvnC7MZ999hkAb731VqOPydelPuyMrTKl77///nxftzl07Ngx2tZYDRw4EIiviVrzxhtvVPsUihKWwc6laysssnLbbbcBcansfN+Z4WfX/vvvDzT8ngGYNm0akCwZnBVh+e1CClE888wzDX5u3rx55T+xKthoo42ibX2fpqXuvfrqq0CcLglxifahQ4cCsPHGG1fsPGtR+J2vtHi9F6GwIjUqdLPGGmtE+/r27dvo46dMmVL0eZabIzlmZmZmZpYpmY/kKDIzceLERh+jEpgAe+65JxDP9i266KLRsQkTJgAwa9YsIFne8vzzzwegc+fODZ7/888/n+851CIt4is0kiMqY5zFEp7FCq8tRXB0HYSRnNxSsip3C3EEZ86cOUCyUZmoYaFmNMPnVxQE4MADDwTqL5KjxaV77LEHAEceeWR0rJCZ77BpZq4ZM2Y07eSqQI0twyIDasqo3zEkx2l+VDo0Tb5Svpohhfj3EzYJrqaw6IbKDOta0Gd2NahxY+vWrRsc+8c//tHcp9MkKu4TFhdQ5EbfmV9//XV0bMstt0z8vK5lgMcffxyAL774AoDdd989OqYIjiKvYal0ff6lleSuV23btgWSC+vzvQ+VDTBgwICKnlc1qEx0GNlTBGf27NnRPn2vhc2iJfd9FUaFWhqVh9bfJIXSe3X48OHRvu222658J1YBjuSYmZmZmVmmZCqSk5YHrrv1MG9TkRUZOXJktK1IhdaTXHDBBdEx7Xv33XeBeOYUYLPNNks8p8qyQtwkLZxlrzVa09G/f/9oX9iQsjEqAwrx+CuCo4ZlLVla01TlTKfNOvbp0wdINnnU4wqZlVeuLcDYsWOBOAIE8WxfvdGsbTh7WwiVllWTwZDKb2u2uR6Fs+ea0TzzzDOjfeF2U6hhKsQNVe+//34gGXWsteajYYNKRQDylTyuhDBn/bjjjgPi9QDhWhW9X9XEt148/fTTAOy6667RPq0bDEucN2bllVeOtvOtIdPvT9/Jae/pLNC6Q41rmzZtomO56wc/+uijaDucXc+a8ePHA7D++us3OBauw1IER2XJw/XWG2ywQeLnTjzxxLKfZ70Jo6iFUHRR6+PS1NpaR0dyzMzMzMwsU3yTY2ZmZmZmmbLAzzVYPzUtxacQSkkLu8UrFB6WU5w0adJ8n0slU8PQWyFDpaIE4YLJhx9+eL4/l6aUX02pY6fFymklt8OSnbnpKGE6Qr4Svs2tOccunzBVRR2HP/74YwCGDBkSHdtmm20A6NmzJ5As2bvffvsBcO2115b9/NI059hpsXs5Q9xhSoPSZrp16wYkz3P69OlAw1TTpih27CpxzVVCWrqaFpCXo0xopa65gw46KNpWCWl9hr3wwgvRMaWHhim2+UpqF0KpMmPGjIn2rbDCConHhClXSskJU50LUSufdSF9B6elFxVCKc8qqQ/xNVjO9NJaGbuwgIhSwFWmN3y93PPt0qVLtP3UU0+V/bzyac6x03u20NfU3yzh3ytK7dX7TN+r1VAr112x9L4Oi82IxveQQw6J9l155ZVlP4dix86RHDMzMzMzy5RMFR5QQYEwGnH66acDcNRRR0X7NOORb8bsvffeA2DTTTeN9l188cVAeploPZciOKVGb5qbGimOHj260cd8++230fbLL7+cOPbdd99V5sQy4r777ou2tQhaEZlwEb1maRQJPOCAA6JjzRXBaU5qYqpmbWFDwdVXX72k59x7772B5Ox47kxyGMUNI2mWX1qTxWIjDtUQziR27doViK+TTp06RceuvvpqIG5eCcmyx5AsCJBbtjycFdbi3JVWWgmAhRdeODqmJoWKHIVNksMiEvVEZbDDwgNhE2RIRsUUCVS58ZkzZ0bHFF199NFHgWQhlSzSZ1BY9jlstDg/ygrIOpUI33rrrQt6vIqw6Ocgjqg++OCDZT67lkPfp2nRlEsvvRSoTPSmKRzJMTMzMz
OzTMlUJEfCmW/N3oWlBFWaUTnan3zySaPPdfzxx0fbuc3ywlLUmqWvt7LJaoyXr/SrogsQz5LX27+zWubNmxdt33rrrUC8PmTppZdu8PjXX38dKL60Y73ZYYcdgHhGKHxvaa1EuI5OZdvTaDY0rcGiym+rOeUpp5wSHQvz/S2/sPxqLeSGF0plwiGOtjzwwANAMkrfo0cPINm8V6V8RRGIYoWNd/V9ku96rjdrrbUWkGzUKIrUhuuS9Nmmz8OWQo08b7rppmif1mzmW2cQvt9efPFFIL6O0tpmZJG+L9K+M9Pofa+mstY03bt3B9LXOsnUqVOb9ZwK5UiOmZmZmZllim9yzMzMzMwsUzJVQjqNFn2GaSphEYJizkVDdfPNNwMwaNCg6Fi4YLVcmqPM4MSJE4FkOp+olGUYEldpz1pPV6vXEo21oDnGTukHWsy92GKLNXiuTz/9NNqn1FItRFbJc4gX6uq8w3LUKgTSXOVVs1pCOizgoFLxKuoSpvSWqtrv1yWXXBKARRddNNq38847A9CxY8einkvFBfQ9MWfOnOhYJYoLVHvsLr/8ciBZfGHEiBEADB8+HEimDdaS5hy7wYMHAzBq1KgGz5XvPJRyC7DddtsBtbF4vtrXXT2rh7FTeiXE39P50itVSER/U1aKS0ibmZmZmVmLlvlIjiy00ELRtsrz9urVC4B+/fo1eLxKWGomKtynxeThDEslNMfdfv/+/QG45pprGhxTo89aavJZqHqYKalVzTl2mv1R006IF3wXeh4q9Xv77bcDMGzYsOhYcxdwyGokp127dtG2Fs1rQb1KMkNp106pP1cvY1dp1R47RawU+QLYcMMNgfTS47WkOcZu3XXXBeLsh1atWjV4rrTzeO2114Bk5CcsOV5t1b7u6lk9jF3YkFwtHtKuVxXDUCPzSmQ1hRzJMTMzMzOzFs03OWZmZmZmliktJl2tHtVDSLNWeexKV+2x23fffQEYPXp0tE89be644w4gufBWXa0//PDDsp1DqbKarrb44otH22PHjgXiog5hh/Y333yzpOev9jVXzzx2pWuOsevSpQuQXqwnLf1HaWrbbLMNULu9cHzdla4exq5t27bR9syZM4G4B51SxCEuqHH11Vc3y3k5Xc3MzMzMzFo0R3JqWD3c7dcqj13pPHaly2okJ6TSokcffTQAs2fPjo6dd955JT2nr7nSeexK57ErnceudPU2dmeffTYAQ4YMAWDChAnRMRWvai6O5JiZmZmZWYvmSE4Nq7e7/VrisSudx650LSGSUwm+5krnsSudx650HrvSeexK50iOmZmZmZm1aL7JMTMzMzOzTPFNjpmZmZmZZYpvcszMzMzMLFNqsvCAmZmZmZlZqRzJMTMzMzOzTPFNjpmZmZmZZYpvcszMzMzMLFN8k2NmZmZmZpnimxwzMzMzM8sU3+SYmZmZmVmm+CbHzMzMzMwyxTc5ZmZmZmaWKb7JMTMzMzOzTPFNjpmZmZmZZYpvcszMzMzMLFN8k2NmZmZmZpnyq2qfQJoFFlig2qdQE37++eeif8Zj9wuPXek8dqUrduw8br/wNVc6j13pPHal89iVzmNXumLHzpEcMzMzMzPLFN/kmJmZmZlZpvgmx8zMzMzMMsU3OWZmZmZmlim+yTEzMzMzs0zxTY6ZmZmZmWWKb3LMzMzMzCxTarJPTq1adNFFAdhvv/0aHPvqq68AGDduXLOek5mZ1ac//elP0fbNN98MwN133x3t69u3b7OfU63r168fAMccc0y0r127dgD06dMn2vfiiy8274lZXendu3e0fc8998z38RdccAEA999/f7TvzjvvLP+JWVk5kmNmZmZmZpnimxwzMzMzM8sUp6vNx1JLLRVtn3/++QAMGDCgweM++eQTAN5+++0Gx1544QUAvvzyy0qcYl349a9/DcSpGN26dYuOPf/88wC88847AJx22mnRsccff7y5TrGqVl11VQB+/vlnIB4LM6svK620Eg
BLLrkkAOuss050rEOHDkD8eThs2LDo2E8//QTAVltt1SznWW/0nXHDDTcAsMgii0TH9L2S9v1rBrD44osD0L59ewDGjBkTHdN7T7755ptoe8EFf4kFDBo0CIAffvghOuZ0tdrnSI6ZmZmZmWWKIzmNUJEBRW+gYQQnvKNfYoklAJgyZUqD59JCtbPOOqvBvpaiV69eAGy++eZAHLEAeOyxxwA44ogjgIazKrWqe/fuQPy7T9OxY8do+8ADD0wcW2CBBaJtzTJJOJOkxw0ePBiASZMmlXjG9WHllVcGktG+rl27Nvp4/R422GCDRh+jMQtn3m666SYAPv/889JPtsbp2lHEIBzTUhe1L7zwwkAccXjppZeiY++9915Jz1mvVlllFSB5Xa2++upAPAP8q1/FX7PhNsB3330XbWtm+d///ndlTrYOtW3bNtoeOXIkEEdwwjEfOHAgAF9//XUznp3VKr3P1lprrWjf7bffDsRZE99//310TBkj+k4YP358dGz55ZcHYPLkyUB8HVp9cCTHzMzMzMwyZYGfwyn1GhHOcFfLbbfdBsAOO+zQ6GMOP/zwaPuOO+4A8ucEH3roodH2pZdeOt9zKOVXUwtjl+b6668HYI899mhwbI011gDgzTffLNvrVWrswvKumhXPF8kp9PXyna8e9+mnnwKw7777RscqEdVpzutOs95dunSJ9qlU529+85uiXrvY81aJ2XwRoGIVew6Vfr927twZiCPHyisHuO6660p6zosvvhiAgw46CEjmtoefccWot8+6nj17AvDPf/4TiNfYzM///vc/AC655BIAzjzzzOjYF198UdK51NvYFULfrYcccki07/e//z0Qr23dfvvto2PTpk0r6XXKNXZ6nrQsBLWXOO+884BkBGrGjBmNvo4iDbWa2VCL150iOFtssQWQnjGjyP1JJ50U7dNnWj4bbbQRAE899VRTT7Mmxy6XIlgAEydOBOLx3WyzzRo8/v333weSZfG//fZbIF6XXg7Fjp0jOWZmZmZmlim+yTEzMzMzs0xxuhrQunXraFthyz333BNIhsZUaODee+8F4C9/+Ut0TKkG5557LpAMs8vUqVOj7R49esz3vOohpFmofOlqKtOtVI5yqNTY/fjjjyW/xqxZswB4+eWXG7yeUqcOPvjgRs9Lr/f6669Hx9Zee+2izqEQlRo7FRQAGDp0KAB//vOfAVhuueUKeh2lC4Zpg3pfaRG4FpYC/OEPfwDi0PuOO+4YHdO/c9dddwXg1ltvLegc8qm1dDUV9Xj00UcBOPLII5v8nEr3UKGQa6+9tsnPWQ+fdb/73e+ibaUnr7vuug0ep++Hp59+GoDRo0dHx/T5odTTcqiHscsnLC7Qrl07IE7pCj8XVOBixIgRQLxIvCnKNXY63z59+jT5nGTChAkAvPvuuw2Ovfrqq0CcRhRS8YVKp7nV4nWnvyVUvEMpZqFtt90WSE9lU+p5mzZton0ffPABkCxU0FS1MnZhmq2uN6VXLr300tGx3LL24bWlVLROnToBybRbpZaqaNcjjzzS5HN2upqZmZmZmbVoLbqEtJq1aUEgxDPLabRI9MQTT2z0MfmiETfeeGOxp1jXVIYboFWrVkB8F3755ZdHx7Q4rR5ceOGF0XbujEIYCXj44YdLev7DDjsMgF122SXapxkWCRfka9G8GqrWooUWWghIFtvYbrvtgHhGKJzZ1gytZtAgjrDOnTsXSEbURNGwNFtuuSWQjOQoMvvWW28V+k+pC/3794+2FenTv79UYSl9/Q7KEcGpJ+G/NzeCc+WVV0bbWjRfT59rlaay4xAXG+nduzcABxxwQHRMJX/nzJkDxMUtIC7yUGqBhkrq168fkCzsIYocbLLJJkDhBVV22223+T4m/B6Vq666CoCHHnoo2qdCSlkvsa33XFqkVOXaw/YMor8F99lnHyAufgOw//77A/G4ZoEipOH1qus0n3HjxiX+C3GWgIporLbaatExRYNUpK
kckZxiOZJjZmZmZmaZ4pscMzMzMzPLlBadrjZ27FggWdc7V5jKdsMNN1T8nLIkXBSvngZKdfnrX/9alXNqKi22rjSlZqQJ+/L89re/BWo7XW2FFVYA0mvrK8VMBQLKTeMzbNiwBse0oLeWx64U4WfWZZddBsSpGsXadNNNgTiNA+Dss88u/eTqiAoNqC+V0o1CSosJ0/mcphZT+tkpp5wS7QtTcSGZQqXPPY1nqf1vmpvSZ8PrQLRPi7xV1KgxKtASvucas/jii0fbSoPT9Rr2U9Pi8HPOOQeICwFBaYvga5VS5JWKHNJ1tvXWWwNxmhXEv6NwzLIm7EWnAln6fA8pfTtMk9f7UMUa9HccxNddmKZWSxzJMTMzMzOzTGmRkRyVutPi5zRaSHr00UdH+9IWO4vK1w4ZMqTBsdmzZwPJ0r8tgRaWWuXUUinYxnz44YdActGhInu5s7rlEJZxVfleRXTeeeed6NgJJ5xQ9teupr333htIdqoOP7+KseCCv8x/qaS5PsMgOQucZeqanq/QjMbik08+aZZzqmXqhg5xAZVDDz0USM7yKnKgUrXh98T06dMbfX5FhfScaVTaG+C+++4r+Nybg4oShQVY8sl33Un79u2j7e7duwNxVDws2rD++usDcM011wDJxfflKJ1fKzTGKlQTFppZdtllgbh9R5iZs/HGGyeeR+XKIb3UdD3R37uK3kB6BEeFsQYOHAjkL5m9zDLLRNu9evUqy3lWiiM5ZmZmZmaWKS2mGeiGG24YbaucYticUFS2ViVX33zzzYKeX/mOaTnEkyZNAqBv375FnHHtNIwqlu7s1RQP4vOaPHkyEOfFVkq9jl0aRRD1b1JTUYjLPirnuhwqNXZq1AbxTJJmW5vyMaRmgpr5DJv0qny1cojDtTlhmdByqWYz0CeeeAKAzz77LNpXanPCnj17AnFTvXANXVrZ2qaqxfermsymrSXL9dprr0Xbih6qFHo5G3+mqZWxU4sFyN90VrPthZRSXmeddaLt4447DkhvKC3z5s2LtjVzrZ9LUytjVwnvvfdetL3SSisljqmhKsBJJ51U0vPX8tgpenHsscdG+7T2Ws2Mw3MJG19CHPkCmDlzZtnPrznGbpFFFgHipsThe0lrNIcPHx7tGzVqFJBcb5NL68oGDx4c7QufozFa3xhek6VyM1AzMzMzM2vRfJNjZmZmZmaZkvnCA+pMHZbkzU1TCzurK72j0DQ1Of744xP//+STT0bbYfpMS6DUwDC8qvKNLaX8bFOFpUi1CPynn34C4Msvv4yOlTNNrdLUxRziNKhirbrqqkC8sBniYh9pYewrrrgCiFNpin1f1xN1lS9HCeP11lsPgI8//hiAm2++ucnPWQ/CNIzOnTsX/HNrrrlmtK0S3krxCxeaaxw1rvVK36sAxxxzDJBeGlnfu6eeemq077nnnks8JkwH1Gecxv7CCy9s9BzCtMyPPvoIiK9biFOV8qWrWTbpGhs5cmS0T+lq4aL5XCokEhaoqVc77LADkExTEy0bOP300wt6Lr0f9ZlW7N+01Syr70iOmZmZmZllSiYjOWGRAZX/S1vkqJLOYSnpYmZ6VbIV4hlmRS+uuuqq6Fi9z9oVSosbDzzwwAbHtCg6LEZgDSmSuNdee0X7NLupBczhbHOWhTNQimypPGr4ftZ7Li2SozHTc2nRKcAXX3xR5jOurt133x2IF5pCvABcTQDzCUsAq7T3uHHjgGQJ6Sw76qijom01FpQwAqHtxRZbDEgWeFCBDZU8DpuztmrVCih8BrXWqCzxySefHO1TIYGwjLYWGPfv3x9INmfU4uX3338fSI6z3sMdOnQA4KmnnoqOKVKk9g5hJGfixIlA/FkJyWaPLclyyy0HxAvP07SUkudhQRCVJw+bYuZSBo7Km9ez3IylsAWKIjlp9H4M/87Qey8ss9VGMQAADDdJREFUyCB6r19yySUAnHbaaSWecWU4kmNmZmZmZpmSqUiOygbedd
dd0T7NqqXN8u60005A4dEbzXRqJiDM92zTpg0Ab7/9NgDjx48v6tyzQPmsYYMy0doIS7fkkksCcfPGsNyy3HPPPUDLiYaFjdxU0j1NvpKSufn4akwK8NBDDwEwYMCAEs+wtmjWUiXyIV4rOGPGDCD/Wqgjjjgi2tY6ibAscEsQRl122203AF588UUgOT6KXshGG20UbXft2rXBc8n+++8P1F8kR2twVJY5jL5cdNFFAIwZM6bBz+lzTd+14baiWuHaTc0ea53su+++Gx17/PHHAdhmm22AZPRMzRwffvjhaF/4+dGSqFmqml+G9PlXiTLwtUgtBCAZqc6liP+//vWvip9Tc8ktOR2ui9H3gd6DEEfvte4mrWGovmvPOuusaJ/WHGpdYvi6eryaf1999dUl/EuaxpEcMzMzMzPLFN/kmJmZmZlZpmQiXU0LPG+55RYAVlhhhQaPUQljiBcpFltOVmVr08og//e//wXisn3h62VZGO7U70HCMowtdRFooZTassUWWzT6mLAMa0sQlsr+z3/+A8TvcS1ahoaFB8KCBUptUcflcDGmFlP27t0bgM033zw6Fi5YrTdKeYT43ztp0qT5/lyYZnDjjTcCcOedd5b57GpbWKChkGINEi6QzyJ9rylN7ZlnnomOacFxWAhABQfUDX211VZr8JwvvfQSkExlHj16NJAsICBKCVcKZVguWh3cw/TK3JTCrFNZ36FDhzb6GKXRh4Ugsqh169ZAnB4KyZTSXCogojL8WfDII48k/j/8W01py+H7LEzty6XiAkpNS0u3VbpaWvq4ij5UgyM5ZmZmZmaWKZmI5GhGVrO1aVReFeIF3IUIZwJyFyi/8cYb0bbKWmp2qqVQSVGII2iaHRg7dmx0LAvNtcotLNG43377Nfo4XbtZbmSZ5uKLL07dLoZmnhXR0WJygF133RWIm8NpwSXA3/72t5JerxaoMSLE/+4wSpWrW7duQDzrnvscZquvvnri/8M2DWrYOWXKlGifivKoXHT4vaiorFoNhNdajx49gHiRuK5fiMvH63te0UaIIzjPP/98Uf+uLFH588UXX7zRx1x77bXNdTpVoetN11ZYHCoflYxWRDALnn32WQBGjRoFJBtoS1qRgGnTpgHJaI0i1bNmzarMyVaQIzlmZmZmZpYpmYjk5CsVqdKAhd6BqlSmojaHH354dCw3X1MzwQAvvPBCYSebESrpe8YZZzQ4pkZT4cywNRSuYcrNYw1nPrWGzIo3d+5cIF5bEq4xUUO8IUOGAPEsMsQzgvWe16/c+3CWPZcaGYfN4lraWpymUk4/5C9JfvPNNzfH6ZRdvjLtijSH2RK5ws8wRWkUVQhnk//v//4PiNcHhA17lYGx9dZbA9lfB1WIXr16Rdu5zR9DI0aMAOIG6FmlvznSohb5qLn0EkssUfZzqhZl1GiNVtparXCdTufOnYH83xX5qNR+2HBbWRL9+vUD0v9erDRHcszMzMzMLFN8k2NmZmZmZplSt+lqffv2jbZzCw6EpWf33ntvID20vcoqqwDJMO8NN9yQeM4wTK+ytVqgnPXQbz5bbbUVkCw7qEIMhxxySFXOqV4cfPDBQLJog64zdaRWydb5WX755QFYaqmlCnp8WCyjpQs7QEPcZR1aTgl4gEGDBgHJMu8PPvhgtU6n4tq2bQskf8dz5swp6blU2CIsXayO8xKWI6/XTvPqcK5WCWFhjnyd5OXvf/97o8cWXDCea1UZaqWShuXQs9SNvql0DSsNDRoWHHjrrbeibbW9CFNS651Sy/R9CvkL+OSj0sgt7ftRBReg9DQ10d8u4eeq0tWqyZEcMzMzMzPLlLqN5IQLy7RIWMJF24rS7LPPPtE+NS3SguNFFlmk0ddRuUuIZ9dbWpnoUJcuXYCGs5UAO++8M9DySh0XSlGXAw44oNHHXHPNNUB6ye2OHTsCcXlMiBfqbrDBBkD6AuFw0e8ee+xR7GlnSlhK+cgjj0wcCyM5ac0Is0
[... base64-encoded PNG data (MNIST digit-grid figure) elided ...]\n",
+       "text/plain": [
+        "    "
+       ]
+      },
+      "metadata": {},
+      "output_type": "display_data"
+     }
+    ],
+    "source": [
+     "# takes 5-10 seconds to execute this\n",
+     "show_MNIST(train_lbl, train_img)"
+    ]
+   },
+   {
+    "cell_type": "code",
+    "execution_count": 98,
+    "metadata": {},
+    "outputs": [
+     {
+      "data": {
+       "image/png": "
aHaZJNN3HF5pZwtPW233XZb7XPatQ3wzjvvAIX3d9z48eOB8Gda7dq1y3x8eW0l+dekvbejpvRngiI5IiIiIiKSKImK5NjCd39RcnkLpLLBZvT8yIVFcrbddlt3zi/9Wsj+97//ueOSC/RslgNg4sSJuepS7PXo0QMIR29mzJgBBNewz2YzbbM3+6+EWQlVKwwCsP/++wPB+9EWQkNQQr4Q+RvPluTPNPol8FfH39zYZuD9sTTvv/8+AB988EGFn7tQPPbYY+7YFs+ff/75QPjze9iwYUBwPd1yyy2uzaI7SSwuU5599tnHHdtYvfLKK6v9uSOPPNId20y6RWNnzZrl2mwDaYvk/PDDD5XrcAGzAjX+wu6SBX9sqwyAAQMG5KZjeWaRm+effx6AFi1aVOjnrOyx/ffhhx92bfbd7GfiFDorDuBnHFmxj8ryo+H5jOAYRXJERERERCRREhHJsZntUaNGAeHcYGM5iJCbvHw/t9MvOWhsY9BLLrkk633JhurVqwNw9dVXl/mYiuRjF6Mddtih1Dmb9U0VybE1OFZm2l9z0r9/f6B4Nncrj62T8MfHNi+zWWZ/vV4hR3LsMyXVZ4s/+33dddcBqUtI20yvldXu2LGja7MS+Lbhp89mSZPIriGAiy++GAjKtPvfGxbhstlzf6sBW/dga5amT5+exR7Hh79Wxj6PLOqXin1v22cYBKWm7b/+mgHb0NvWixVzJOewww4DgnVKqfhrN/3rOins+unXr58717NnTyC4VsrbBNr/m7B9+/ZZ6GH8+dGpE088EajY+uBU7PvE39YgDhTJERERERGRRNFNjoiIiIiIJEoi0tW22GILIEgD8kOUtjtyeeUDM6lVq1YAnH322e5cqpDpwoULc9KfbLE0gkMOOaRU288//wyEUzgEunTpAsC+++4LhMsglyza8MADD7jjF198EQgW89rPQ5ASc9FFF2W+w3l22mmnueOrrroKgGuvvRYIl98uyR/XcePGAUG6mhUBgaBQQSGyhe/+54yxVA0Ixsv+67OCFpZm4BcZsDS1VJ9d9tqFxlJs//zzzwo93tL4rDT+W2+95dpOP/10ADp37gwE30EAbdu2BYJtBfr27eva7r///ihdLwj77befO7b0qF9++aXMxx999NFAuIR09+7dQ4/xF47bcyax4EVFWYqW//1Q0ieffAIEn31J4qf32990u+yyS6nHffnll0DwtwiUTu3zizYUq8mTJ7tjK6PdsmVLIPxdaSnQNv6pCtLY98l3332Xnc5GpEiOiIiIiIgkSiIiOSeccEKZbbZYtKKzd1HZ7J2VzEw1Azp06FB3fOedd2a1P9lii5P9xaIl2QK2P/74Iyd9KhRWzrhq1aoAjBw50rU1btwYCAoP+JsLLlu2DIAvvvii1HOeccYZAAwcOBBIRgEC29TOL29patSokdZz2SZ4N954Y+U7FiM333wzEC5Ff+aZZ6b1HOmUIvdL0Gb7szRbLCLz8ccfA+Hrq7yIg/FLo1566aWh5zjvvPNcm0UgN910UyC8ANyiQZ9//nna/S8k2223Xei/NrMOQdTaFozfdNNNru2vv/4Cgs9DPwOjvO/5YmEFi1LNpNvfHFYYxL43kmDvvfcGgmIgEGxQ7L8v7XvQilD5xTD8TZIh/e+SpLO/OSwC6EcCLbpj3zuprL/++kA4K2Xu3LmZ7mbaFMkREREREZFESUQkp6RcbUJpm8VBUKq2vP74s33+Rl1xt8Yawb2wzbKvu+66pR43depUINgMT8Kb0R5xxBEArFy5EgiXM95+++2BYDYlVelVW5vz0UcfuXOWj3zooYcC5a9VKRSpyiJHZTnEtnleUq5Nu05so0oIogv+DKWV4t1pp52AYLatolasWAHA6NGj3blUZaULgUVCrez94Ycf7tpsLYg/K1wRVvrd/2y3DVitfPcee+zh2oYPHw4EazeTxB87WxfWpEkTIBzJseiOfYfYNQrBWif7N3rvvfdc26uvv
pqNbseev5bEj2SUZLPstn4xSSw65X+2ffvtt0B4XbC/2TME37mpJGlzz3RZxohtdLo6/gbTZbG/QSZNmhS9Y1mgSI6IiIiIiCSKbnJERERERCRREpGuZmlSxl/wvnz58oy/3oUXXgjA5Zdf7s6VTAPxS7s+9thjWetLLljpYgjvJg9B6hXAlClTgKD0qkDr1q3dsS0WnT17NhAuEmCLlG+55RYg9a7ytuB7xIgR7pylq1k51iSkq1VW7dq13XHv3r1DbX6qXxL47zVbaOwvOLaCKFa+PFXZ1JkzZwKwZMkSd27QoEFAkHqVhIXyVsrZUqms/DoEi5LHjBnjzlmBAlvEvHTpUte29tprA/Dbb78B4fTdadOmAUH52o033ti1+emrSePvPP+///0PCNK4rSAPwNZbbw0Eaan+DulWaGDRokVAUKIbghK1xcKKL/if93bdGT89K1WZ+KSwzx+/pLiVaPc/t0zdunUBaNOmjTtn15t9jzZv3ty1FVvqWkXS1PyiNv62FWWxMtNxo0iOiIiIiIgkSiIiOTbrbRt/+tEFK9drC2jTZTMCEMwc2GJKv5SqzfLZbMHmm2/u2vzNCQuRvzC0JD9qlm4J22Jgm1D6GjVqFPovBNHIkgsnU/GLXHTr1g2AvfbaCwg29ILwot1C8vrrrwPhDXPr1Kmz2p+zx/iz8RbVsfdqqghZsbIohBUTqegi1EJnG+d+9tln7tzJJ58MQI8ePUo93qJan376qTtXr149IHiPNWvWzLXVr18/wz0uPFY2+6677gLg+uuvd21WYtYyGxo2bOjaLEviwQcfBCpW2jtprNDPFVdcAYQ3mjX298zxxx/vziWpZHRJVh7b35rD/ubyN/zcZJNNgGDLAH9zWfvZu+++Gyi+6E26/L9PUl2DJXXt2hUIl4SPA0VyREREREQkUaqsSrVrZZ5FLSG74YYbAtC+fXt3rlOnTkCwiRbAV199VeZzWHlLy9v282F32GEHIChdaOUuIbx5I4RnCaOK8k9T2fK7NkMJwUZ3qcoHWr66bdIF8cqZzsfYpWKzcRDeUBHCEUebcfJL9VaE/XtZFM2fRT7llFPS6+z/F5exs9l1gHvvvRcI8n4PO+ww19agQQMgGN9U0TMr45uqNHcmpTt22Rg3n82a26ylv1GblaO1SEU+5fuas/xzf83MrrvuCgSRxaj88skW4Xj33Xcr9Zy+fI9dRdh3MwTrbazEdj5LHsdx7KwsvG1k7rPohX0nV/barIxcjp29ll++fvHixUB4k0+LONiWDP7jLQPCvhfz+fdKHK+7kqzcPYQjYiVZBoCt/7R/l2xJd+wUyRERERERkUTRTY6IiIiIiCRKIgoPGFuk+OOPP7pz7dq1A6Bp06bu3KOPPgoE6Qg+24naSoL6IUILk1mb/5y2y+tff/1Vyd8iv6zMMaROU7MQ75VXXhn6f0nNUqlS8VPZ0k1TM5Ymabtip0of9N8PVkJ47ty5kV4vl4YNG+aOLcXAfk+/qELNmjWBYDdsv5Sllae18ubFxlKCrAz8d99959rikKYWF1aK2y/zb4VsDj74YCD83rISyVbC3R4LQfEQSxH0CxYU6+elX0Dg/vvvB+CAAw4A4Pbbb0/5uGKy1VZbuePy0vceeughIL9pavlg5d7PPfdcd84Wxtt15LP3sRUZAOjfvz9QvO/BdKX6+zgV+3zLdppaVIrkiIiIiIhIoiSq8IDxIyw241GrVq1Sz1/er26FA/zNA63wgC3Y/eSTT1ybzQ74i8krK9+FB2yRtl9G24o6TJgwoVKvk21xWdhns8AAzzzzDBBs6GYbNEL0SI7p1asXEJ65st/HL59ekc0I4zJ2PovWvPrqqwC0bNmy1GNGjhwJBFFGCEcuciFuhQdsTGxTRSsXDeWXhs+1OF5zhUJjF12+x84+j++44w53rmfPnmU+3hbUl1c8KVfyMXYbbLCBO7btE/ztOoz93RfXMtH5vu4q4tJLL3XHlhGQihXnO
vHEE7PeJ1DhARERERERKXK6yRERERERkURJZLqazxbS+zW/bY+NBQsWAPDwww+7NkuDsVQiewwEKTNWqz7bCiGkGVcau+g0dtHFPV3thhtucG353J+kJF1z0Wnsosv32FlqvZ/6Xh7bm2/o0KEZ60NU+R67QlYIY+entvsFWQCWLVvmjvfaay8g2Dsx25SuJiIiIiIiRS3xkZxCVgh3+3GlsYtOYxdd3CI5hULXXHQau+jyPXZW6Oe+++5z5yzTZOHChQBce+21rm3w4MFAUPI8n/I9doWsEMbOfz0rUnPkkUcC4QI2Y8aMyWm/FMkREREREZGipkhOjBXC3X5caeyi09hFp0hONLrmotPYRaexi05jF53GLjpFckREREREpKjpJkdERERERBJFNzkiIiIiIpIouskREREREZFEiWXhARERERERkagUyRERERERkUTRTY6IiIiIiCSKbnJERERERCRRdJMjIiIiIiKJopscERERERFJFN3kiIiIiIhIougmR0REREREEkU3OSIiIiIikii6yRERERERkUTRTY6IiIiIiCSKbnJERERERCRRdJMjIiIiIiKJsla+O5BKlSpV8t2FWFi1alXaP6Ox+4/GLjqNXXTpjp3G7T+65qLT2EWnsYtOYxedxi66dMdOkRwREREREUkU3eSIiIiIiEii6CZHREREREQSJZZrckRERCR56tWrB8C8efOA8FqDAw44AIBXXnkl9x0TkcRRJEdERERERBJFkRwREZEybLHFFgCsueaapdp23nlnAPbZZ59SbYMHDwZg5cqVZT73999/746rV68OwO+//x69szFVt25dd/zcc88BQZUkv1pSkyZNAEVyRCQzFMkREREREZFEUSRHssbyqwGuvPJKAI477jgAfvzxx7z0Ke522203IJjVBXjzzTfz1Z2ss3z8atWquXOHH344AH379nXn/ve//4UeP3fuXNfWv39/AIYOHQrAv//+m8UeS7HYe++9AXjxxReB8DVakr+uxCITdh03atSozJ979dVX3XGdOnUA+O6779y5jz76CIB+/fql0/XYsc81gGbNmoXalixZ4o4nT56csz5JcWratCkA119/vTtnEcStttoKgDPPPNO1TZgwAYA5c+bkqIeSSYrkiIiIiIhIougmR0REREREEqXKKn/VX0z4of9caN++vTu2UKZ555133PHrr7+eqy4B4QWZFZXrsUvF0jqefPJJd65du3YA7L///gC89tprWe1DIYzdpptu6o5vuukmAI4//vhSffn777/LfA5LpXn66acBeOSRRyrdr1yO3bHHHgvAY4895s799ddfADzxxBPu3Keffhr6ua5du7pjS2UbMGAAkN/UnnTHLg7v1ziIy/t1k002cce2QL5FixZp9SUbX6mpih5U5vVydd3ttNNOQPA5BeExBjjqqKPc8dixY3PSLxPnsYu7Qhs7ex+/9NJLANSqVavMx/7xxx/u2NLVTjrpJACWL19e6b4U2tjFSbpjp0iOiIiIiIgkSlEWHrAZ9F69egFw2WWXuba11goPiV/+0xaSvvzyy9nuYkEbOHAgEERvJOzCCy8E4Pzzz3fnbIO8VNZZZ50y24488kgAOnToAECDBg1c25AhQwD49ddfo3c2g9Zbbz13bMUC9tprLwBGjBjh2m6++WYAZsyYUeZz+ZHASZMmAcFC8apVq7q2FStWVLbbiWfXYcOGDd25Pn36lPl4u+YsenjQQQe5tkL/bPTHwMpDm3/++ccd33///UC4AIaxstBPPfUUAK1bt3ZtLVu2DD12+PDh7tgyCvzF936WQSE6+uijgdLRGwh+z+effz6nfZLiUbNmTXd81113AVCjRg0giPxD8F41fiTHLwIEcMwxx7jjzz77DCj/u6pQNW7c2B3fc889QPAd+80337g2K9ZgY2DZOgALFizIej9XR5EcERERERFJlKJck9OzZ08A7r777rT6Ynf3tjbHXzNgJZEt3zMTCjVv8
88//wTCJVfHjRsHwAknnADAb7/9ltU+xGXsNtpoI3ds625OPPFEIJxnv3TpUiAoU/n444+X+Zz+bHB5M71WqnXatGlp9TlbYzdlyhR3vO222wLBWhyLqkZh7zmbQdpzzz1d23vvvRf5eaMopDU5hx56KBCsPfE1b94cgE8++QQIr1W0aM2GG24IQKtWrVzbxx9/HKkvcXm/brPNNu7Y1mPWrl0bCM9e2nsw259jFRGXsfN17NgRgIcffhgIZs8hiIjZWPsls3MtjmNXKAph7PzPprfffhsIMncsYyBdtkYHYOTIkUD6a2HjOHb294h9j44ePdq1bbDBBqHHzpo1yx1b9pOVyvcjsxb196PglaU1OSIiIiIiUtR0kyMiIiIiIolSNIUH/PQfS9NIly1iswX1/sL6ZcuWAUGKgx8KnThxYqTXSxJbgBaH9I5cqlu3rju2687Cu35Z5AsuuACo2MLttdde2x1ffPHFAFxzzTWlHnfaaacBcPbZZ6fb7azw0+ys37fcckuk5/LHoOTCUCmbn8pnqRaWBuGX77XF83Xq1AFg6NChrs0Kt9hC06gpanHk73RuaWrGTz21Ah/F9nlWnjXWCOZMTz31VCBIU/PTVaxQSD7T1IqBXyK5ZOEHS4uG8rcoKHRt2rRxx/Y5FzVNrXfv3kC40Mqtt95aid7ln32+Q1AMyJZz+NeFFVoZNmwYAO+++65rs8JIlpraqVMn19a9e3cgXGAl1xTJERERERGRRElkJMcvx2vl/g455BB3zl8EmSn2nLZoyy8bagvdrLRyUm2xxRZAMGPiLxCzmeFi45eWtJK0FhFctGiRa/NLVq6ObZYJsGTJkjIf58/ax8GXX37pjn/55ZdKPdcBBxzgju299uabbwLw9ddfV+q5k2jrrbcGwkUGbKZ33rx5AJx++umubfHixUBQAnjXXXd1bTbD17dv3yz2OLc23nhjIJjF9Fm0xi9KYOMjAf971/++hWDRN8B+++2Xsz7lmi1092fIK8IKsQAcfPDBGelL/fr13XGTJk1CbX42SyaLJcWNH1WYOnVqpOewCLdd0507d3ZthZqlY9H4Sy+91J2zz74xY8YAcPnll7u28r5TbXNUe67NNtvMtZ1zzjmAIjkiIiIiIiIZo5scERERERFJlESmqx144IHu+I477shLH/zF0DfeeCMQXnxZkT16Co2lr9iO87ZfDgSL2orZwoULQ//NhPJ2Wn7mmWcy9jqZsP3221f6OXbffXcg2IHZ99VXXwHhYg92bClHflqgpXNY6tEPP/xQ6f7FjaWk2YJRfzGyjcUpp5wCwM8//+zaLH0r1eenpR0+/fTTWehxftjv6xe0sGumQ4cOgFLUVufJJ58sdc6+A6699tpIz+nvDWIpXVbExn8vr1y5MtLzZ4qlNFkabdyLoYwaNcodlyywkST+v8Pvv/9e4Z+77bbb3LGlqVmhFts7rJBZSuS5557rztk1cdJJJwHpF6SwlDY/VdUvRpIv+e+BiIiIiIhIBiUyktOtW7fIP/vvv/8CMGjQICA8k7TeeusBqXcIt4W9/fr1A8LFDWx28Pbbb3fnfvzxRyD17JdIefxFrX5pX4CffvrJHdvsfaHyZ+GuuOIKIJhl8hc3GotI2H99trh+5syZ7tzee+8dajvxxBNdm5XIzOROzfnwf//3f0C4lKqxgihWttz/zLryyiuBINrll0q2UqpJYmVQfVbgw7YHkNQsErDddtuVarvzzjsBeOWVV9J6zsaNGwPhxc9du3YNPcaPvFoJfn+riFyyaJ/9/ZBJ/ntv7ty5Ff45v1CGH6EE6NGjR+U7lkBWfOX888935+69914g2PYgnYhQnGywwQbu2L4XvvjiC3fOronKlhT3x8e2y9htt92A8PdvrsZRkRwREREREUmUREVyrFzgHnvsUe7jvvnmGwAGDx4MhNeLWGnbPn36ROqD5W3fdddd7pzNSNtdLUDz5s2Bwo/k2IwbpJ4tlszZcMMNA
RgwYIA717BhQyCYQfTLZBb6DLT/PvZndCGcjz99+vRQm63NgSBKY/zZX1srZyUzrQQ1BJur+msJsjFLmw1++fqSUW2/lKfNUJpddtnFHZfcQNbfKNQfp6RIVfLXNlDs2LEjAB9++GFO+xRn/qzwU089VeqcRRxsg8CKsoiMre9cd911y3ysH821TQfzFcmxMbCy/X5pf3/TZwhvJmnlm2fNmlXqnJk/f747/vzzz1fbF1tf5n8X2Oa1tp7pvffeW+3zFDLbrsH+67Prxv8usJLattbTv2579eqVtX7mUpcuXdyxrc28+uqr3TkrBZ1J7dq1A4L1m/61vNVWWwHhLTGyQZEcERERERFJFN3kiIiIiIhIolRZ5W9LHxP+Yv902AK98kLcAG+99RYAbdu2BcK7p1tBgJIpMOk6+eST3fEDDzxQ5uPWXHPNMtui/NNEHbuo9t9/f3dsC5iNX0LaX9ScC4UwdumyHYWvv/76Um1Lly4FguIYlRGXsfN3/h4/fjwQpKmNGDHCtZ111lmVeh1Ls9xnn33cOSsgYgtRoWKFHNIdu0yOm6UL+aXD/d8J4I033nDHU6ZMCbX56YGW8mZlpXfYYQfXlo1SynG55myhPAT/9iUXbUOQnmRpWf64WtEKPzUjm/IxdraQGOCdd94p1f7oo48CQRpZqte2wh9WVATC3ycAK1ascMeW0mWpL/7PGb94iP8ZUZZMj91GG20EhEta+4UDINhtHsKFYjLFUq38Qg32e3bu3BnITJp8XN6zqVjK3quvvurONWnSBAiK11j6N8Att9wCBOWh/c/NbKRx5WPsevbs6Y5tqYafppupz3X/PWzfRTVr1gRgwoQJrs2KdaRb/j3dsVMkR0REREREEiVRhQcqyhYVm4kTJ2b8NfxFmMXCZhpiGByMBVv8abPk/gLxkj777DN3bBtgnnDCCaUeZxsz+jPQSeEvcLeZNYvkfPzxxxl7nUmTJoX+C3DJJZcAsOWWW2bsdbLNrpOS0Ruf31be44zNiJ533nnunC0KTyJ/czzbyM42t7OFshAUIzDnnHOOO7bolxVveP/9913bnDlzMtvhHLP3w1VXXVXu46yYRyoWFfRn2Uv67rvvgKA4CASFMuyaTBXJyXc0ftGiRat9TDaiNxBEJvbaa69Sbd9++y1Q+IWOKsreg1YqGYJIxplnngkEm8tCUOzB2rIRvYkjf1P6Dz74AIDRo0eX+XgrmlGvXj13zgp+WbaOvxm3RXDsu/yYY45xbbnawFeRHBERERERSZRERHKsbKPlTqeazfE3I3vttdey1hcrT3jaaae5czYj6Hv99dez1od8KfYIzk477eSOrSTlYYcd5s5ZuW2biYzKL7lo63T8dRhJ4c+m+VGWbPFLu5a3Vi6urHT2/fff787Zurh9990XgMmTJ7s2i0akuh4tN/2FF14Awpu4FQuLxNhnuh8tPeqoo8r8Octzf+yxx4Bw9MY2GXz22Wcz2tdcsXU0/uea8dfA2HVjGQ1+hMyipKnYWh5bE+dvCHzDDTcA0LJlyyhdTzybNd98881LtT3xxBO57k4s+OtubNsO+/vQzwZo3759bjuWB/aehODvYVuj5R/ffPPNGXtNW+djG6na2uFcUiRHREREREQSRTc5IiIiIiKSKIlIV7MFtFWrVgVSp01Z+DvbLFXEUpMg2Cn977//dudSlQEuRP4u4FYe0BbqJpFfntzKo1oahV9qNlXZ2UwZNWqUO7aS55I+u06tAISfRvPNN98A4dSvuLM+n3HGGWU+xt8l3tLzLF1t2rRprs3KSftl4IuV7Yzup3ZYOVpbgL/ffvu5ttq1a4d+3i9eYQt9rUiEfXZAUFQjzsorxHHBBRe4Yyv9bIUD/FLsJflFLawU92WXXQaES1CvtVbZf6588cUXQPnFDJKuV69eZbaNGzcuhz2JD3+BvKWp2fYg3bp1y0uf8sWKTwAceOCBQPizKp2UvbffftsdW5q0XX9+MQPbpsAvv
pJriuSIiIiIiEiiJGIz0IpsAmozZwBTp06N1rES/AV+PXr0AIIFVtWqVSv1eL/kpV8asyxx3mwrlaFDhwLBRqhJ2gx05513BoJNwyC8iWxJNnPhb87pb6iYKbaA0GZmMqHQrruoZs+eDaReqLv11lsD6Zf8zedmoBXhz/bajJuVW7VF5RDMjOdKLq45i0LY98Uvv/yS9muWxf/uOfzww0Nt/iayJb8X/JnmBQtDX2hWAAAgAElEQVQWRHrtXL5fbQa3VatWpdosugVBmXErGZuqkIddd37ZZXsvlvd9sWzZMiDYFBSC79bvv/++Ar9FIEmfdb///jsQjJ0f1bKNlS2rJBPiPHYWAfQ3YrdIoG3Wa5vBQ3jT2VyI89hFZSXeu3Tp4s5ZgZJMFg7SZqAiIiIiIlLUErEmx2bLcxWUsvLA1113nTtnM/2pWETDyrJK4bFZ7vKiNz7baKy8PHKfXSOW42/lZ322PsSuPwg2frv66quB0hvdyn8sx98iNBBenwLhNQW2GWFS2Mz7HXfc4c7ZGkErQ57r6E2u2doPm9X+9ddfSz3GX+NmG+1WduPZjz76yB37GQUQXuczaNCgSr1OvvkbGFeErQUrr6S+nw0wcuRIINjgMd3XKzazZs1yx5mM4MSZrYW1yJ7/3rPIYbt27QBo06aNa8vmtiJJZ2tbjzvuOAAmTpzo2nKx9cPqKJIjIiIiIiKJopscERERERFJlESkq3399dcAbLXVVmU+xl/8aWkZ99xzz2qf28pTQ1By1VLTrGR1KrY4EoKds/0dZ6Ww2ALaivJ3Wi7JFu9a2gUEC9w/+OCDMn/O0iP3339/d84WMvsFDgqBvXcsxP3II49U+jnr168PhNP5zJlnngmEU9Tsc8B2svcXqSYlvcPGxEoV+wvf33nnHSD82ZhkzZo1A2C77bbL+HP7i4LTSZv2ixQUQrraypUrc/I6VubX0nABnnnmmZy8diFp0aKFO05V7KjYWLq2pYvfeuutrs2uKUtXa968uWtTulp6mjZt6o7HjBkDBKm+55xzTl76VBZFckREREREJFESEcmxGW5bmL3GGqXv3fzylnbcsWPH1T53ujN0S5cuBWD8+PHu3PDhw1f7cxJvVgjAX6xYkl+G0hZ1+4uW7Tq10p7+5rAVMXnyZABuv/12d+6oo44Cgk30CoWVlLXCCTfddJNre/TRRwG48MILK/RcFp2x8u29e/cu87H+4npbMJm0IgO+Dh06AEFE2i9TbGXvi4VFsxo2bAiUv7FluqJGcvyIfyE45ZRTgPB3p//eLck2i67oprIW5bYCGT/99FOkfhaLnXbayR1XtMhNklnGhb0fbSNfgB133DHUJtFZdhJAzZo1AXj55ZcBmDt3bl76VBZFckREREREJFEScetvkZKzzjoLyOzsRnkzdLb5FsC8efOAIAe0WPLci8Xo0aOBIFKXih+1mTZtWtb6cvnll6c8LiQ2s3v++ecDsOuuu7q2c889F4CTTjqp1M/Nnz8fCP87NG7cGCh/XZJtHOg/5x9//BGh5/Fna08gXDIagmg3wOeff56zPsXBiy++CATXml/239Zs+ZtcliwxXp50ty+wjZMrsil0nNj6V3/9kL3/bBNoCDbltHLdixcvzlUXpcj4618bNWoEBO/H66+/3rVVr1491CbR7bLLLu7YNlW+77778tWdcimSIyIiIiIiiaKbHBERERERSZQqq2IYu6vswjBb4A3QvXt3ADbZZBN3rrzSzyX5RQxsQZUtpvRTQV5//fVIfS1PlH+afC6qq1GjBhCkJvglLceOHQvAqaeeCsCiRYuy2pdCG7s4ycfYHXnkke7YSrWfccYZkZ5rxIgR7thKg44aNQqo+ALoqNIdu2xccwMGDHDHtvO3pU/66VjZHot0xOX96n9PWEqGFagoz5IlS9yxfbZZUQH/s852YLe0y7/++quSPY7P2BWiQh87//1sxWcsRd8vUHPRRRdl/LXjMnZ+upq918rrm
6WV+985ll6ZK3EZu3RZsRp/u4XBgwcDwXKRbEt37BTJERERERGRRElkJCcVfzNHm6GzxaY+m1nr379/qTZbtJtu6d+oCvVu34ov9OnTp1SbbWSZ7c23CnXs4iDfY2fPdfTRR7tz++67LwBHHHEEEC4aYBFD2wju4Ycfdm253tQzn5Ec29zONvmEoFR3p06dgCCiGjf5vuYKmcYuukIfO7/Ygz+7DuECTDNmzMj4a8dx7Cz6v99++wFBsQEIoqj9+vXLah8qIo5jV561114bgEmTJgHhTWhz9TedUSRHRERERESKmm5yREREREQkUYomXa0QFVpIM040dtFp7KLLZ7qaFWyYMGGCO2dFGE444YSMvU426JqLTmMXXaGPnb/o3gojWYpWMaarFYpCG7tXXnkFCNLG/T1xevfuDcDKlStz0helq4mIiIiISFFbK98dEBGRzFmwYIE7vvHGG/PYExHJJtttHnJfZEWKR4MGDQD46quvgKDID+QughOVIjkiIiIiIpIoiuSIiCTASy+9BEDdunXz3BMRyTXb0HL58uVAuMy+SGVsv/32+e5CZIrkiIiIiIhIougmR0REREREEkUlpGOs0MoMxonGLjqNXXT5LCFdyHTNRaexiy5JY1enTh0A/vzzTyD76WpJGrtc09hFpxLSIiIiIiJS1GIZyREREREREYlKkRwREREREUkU3eSIiIiIiEii6CZHREREREQSRTc5IiIiIiKSKLrJERERERGRRNFNjoiIiIiIJIpuckREREREJFF0kyMiIiIiIomimxwREREREUkU3eSIiIiIiEii6CZHREREREQSRTc5IiIiIiKSKGvluwOpVKlSJd9diIVVq1al/TMau/9o7KLT2EWX7thp3P6jay46jV10GrvoNHbRaeyiS3fsFMkREREREZFE0U2OiIiIiIgkim5yREREREQkUXSTIyIiIiIiiaKbHBERERERSRTd5IiIiIiISKLEsoS0iIiIJMNxxx3njrt06QLA4YcfDsAaawRzrX379gVgwIABOeydiCSVIjkiIiIiIpIoVVZF2ZUoy/K56dF9990HQN26dQHo37+/a/vpp58A+OeffwD44YcfstoXbRgVXaGN3amnngrAnnvuCcDJJ5/s2uw6u+eee0r93K+//grAoEGDMtaXfIxd1apV3fEhhxwCwJZbbunO7bDDDqHHb7LJJu64U6dOQNDvZ5991rU988wzALz++usAzJkzp1L9XJ1C3Ay0Zs2aAMyePdudO+eccwAYNWpUTvpQaO/XOInz2G2zzTYAvPzyy+5cw4YNQ4/p2LGjO37//feB4Ls22+I8dnFX6GNn37kAJ554IgCtW7cGoF27dq7thRdeyPhrF/rY5ZM2AxURERERkaKmmxwREREREUkUpasBQ4YMccenn346UH5IbPny5QAMHTrUnbvqqqsA+P333zPWrySGNA8++GAAjjnmGHfOQsMWFj7llFMq/TpxHLuePXsCQUpa586dXdtaa61VqT48+OCDQDgEH1U+xq5+/fru+LvvvqvUc6Xy0UcfAbD33nu7c8uWLcv46xRiutoFF1wAwC233OLOHXjggQC88sorOelDHN+v9p601GX7bqiMs846C4DatWsDMHXqVNdm16Z9v1RUHMfOzJgxA4Dtt9++zLadd945J31JJc5jF3eFNna77747AI899hgQ/s6xv9vuuusuIEhvLnmcKYU2dnGidDURERERESlqRV1C2mbWbdFZRdWoUQOA3r17u3P77rsvEMyALliwIBNdTIx69eoB8PjjjwOw7rrrujZbPP/oo4/mvmNZYpGVo48+2p2zBd7lzcjMnTu31LkGDRqs9vVs8f3AgQPduWnTplWsszGwcuVKd7x06VIgGK9M2GWXXYBwIZELL7wwY89fyDbffHMgfF1+++23+epOXvnv16uvvhqAJk2aZPx1/v33XwCaN2/uzq2zzjpA+pGcOGrVqhVQusiA79prr81Vd0Rcg
YstttgCCH/XXnLJJQA8//zzQDLeg/IfRXJERERERCRRijKSs9lmmwFw8cUXA1CtWrVKP+eOO+4IBDNYfhlbgeHDhwPhCI5p3749AJMnT85ll7KqQ4cOQPj3HTlyJBD8nhbV8q1YsaLUOSuvbGWU33zzzVKPsfUD66+/fmW6nTd+ydiuXbsC4cje9OnTgSDCcP/991foedu2bQvA5ZdfDkCPHj1c2x133AHAvHnzIvY6WWK4PDNnbENKe99C6QiObR0AQbQx1fvN3sN//fVXqZ/zjwH+7//+zx1nY41Yvtx+++1AkPWQypQpUwA44YQT3DmL/Fx33XVZ7F1+2NouP4L15JNPpvUc9nn2999/A+HIY6p1T8aitDfccAMQfB4m3QEHHOCO7e+9e++9N/T/kNm11IWqcePGQLAhr23aC7Bo0SIgeF+OGDHCtf3888+56mIkiuSIiIiIiEii6CZHREREREQSpSjT1WyhrS1A81mKTPfu3Uu1tWjRAgjKRe+///6uzcLyTz/9NAD77befa3vjjTcy0e2CYyUbISjMYGbNmuWOk7jIuWnTpkB4MbeFdf1F9ulYsmRJmW1WvGHSpEmRnjtOxo0bB4Tfn5YeZClAFWWleo2fXmSFQy677LJI/ZTk2HTTTYEgVdL3ww8/ANCnTx937q233gKCxcy+OXPmAPDFF18AwXsTYPHixZnpcIxUr14dgF69erlzlr6dim29MH/+/FJtH374YYZ7Fx+WunzyySe7c3bsf09UJG3066+/BmDhwoXu3LvvvgsE112jRo1cm5Unt3S3pLNCPGPGjHHnXn75ZSB8nZZlvfXWc8e33norEIznSSed5Nrss6HQbLPNNkCQmgbBZ5kV/PGvQ/seve222wA4/vjjXdthhx0GxDdtTZEcERERERFJlKKJ5Ky99tru2BY6ppoxsRKCqXzwwQcAHHnkkaHngaBksLENLqF4IzlHHHGEO15zzTVDbTaGkMyF3z/++GPGnmujjTYCgoXyvj/++AMIlzNPiqiz3lZYBMIzVSVZZFakPL/88gsQnhU2/kbSxWqDDTYAwpvJlvTpp5+647PPPhsIFs8//PDDab2eldr2izjYc8WZbXLsf+6MGjUKCMYEYPbs2QBsueWWQBAZ9FmhlvIWzNtz+5JU3CcV++y3TXf9v/EqsrGxRXmOO+44d65Nmzahx5x22mnu+Jprrone2RzzMyOGDRsGQOvWrSM9l38NW/GUVFHwOFAkR0REREREEkU3OSIiIiIikihFk6625557uuPTTz891Pbll1+641Qh3rLYQj8IUmtsgVbnzp1dm1+PvRjYIl5/QbeFja3eeqpFp5KahYNLFm8AeP311wEYO3ZsLrsUS5be4Y9FebvVp1vEIOn8xc8SsCIDkppfZKckW5jtp/+kk1pm72kI9naxfdX8Rd/nnnsuEO/CK/Z7f/zxx+6c7U2SyldffRXpdXbbbTcADjroIHfOUt7KS99Ngu222w6AnXbaCYAZM2a4tptvvjn0WFtgDzBw4EAgKAThp7nNnTsXCP5ufP/99zPd7Zx45pln3PHOO+9cqt2uT0vL7devn2uz8bD9xJ566inXZn/rWkrqjTfemMFeV54iOSIiIiIikihFE8k5/PDDy2wrr9hAefwIkJUMtuIEfglCm13wH59EtkB+woQJQHg2xI4nTpwIwG+//Zbj3hWWZs2auWO/gAPA1KlT3bGVdEw6K9feo0ePMh9z7LHHAuVHJPyFumeeeWaGehdfVu5+5syZADzxxBNlPtZ/v+6zzz5A9NnkJClvJ3kpP1pqBWasrHFF2SLpF1980Z3beuutQ4/ZeOON3bH9G8U5kpMrtnWDFYQAGDlyJADLli3LS59yZZdddgFgww03BOC5554r9ZitttoKCP/dZ3+jWREkKxsNQVGp8oo8FAL/b1Lj/0166aWXAuGIT0kWybGCR/45K0+uSI6IiIiIiEgWJT6SYzOS/kZu//77LxCU3c1EG
dAFCxYAwWxorVq1XJvlhyY9klO1alUgWJPjmz59OgDnn39+TvtUaCwadv3117tztomczaoffPDBrs3K2yaRH5Gxsp3HHHNMpZ7Tz6f+/vvvK/VcceXnmlse+fLly4HyIzn+eBdr2ftUbBNBf/3E559/nq/uxI6/OWJJ6b7HOnToAASzwSWjN1K25s2bA0H01l+z5JeoTrLhw4cDcN555wHhbT5sDbWtIfMjtJaBY+tLvvnmm6z3NQ7uvfded1xeBMfY387+37KtWrXKfMcySJEcERERERFJFN3kiIiIiIhIoiQ+Xa1du3ZAEGaDYBHk448/nvHXs3Q1vzytlU1OOtst2NLVli5d6toshP7jjz/mvmMFpGPHjgAccsgh7pwteLQxTHKKmq9atWru2C8lG4WVbbWF0Enmv+9sobEt0rZrCODNN98M/ZxfeKA8lgJcLCltljpqC3MhWOBuu9g/8sgjpX7OSqr6/x5J9PDDDwPh8THpliVfc801gdTFHmw87XX8IgP+lg3FytIGLeXZig0UE/tutGtkxIgRrm3w4MFA8H607Rcg+L5Np7x5oXnhhRfc8VlnnQXAe++9l9ZzWIGGAw88sFRb3bp1AahRo4Y7F4dCF4rkiIiIiIhIoiQ+kpNqhscWoGUywtKwYUMA1llnHSAoRADw2muvZex14uyoo44K/b+/6dmzzz6b6+4UFFtQ75euNJdccgkAo0ePzmmf8s2PhtpC0F133TXSc9mMexxmlnLJZtKtpOo111xTqi1VBOftt98GggW5derUcW22kZzN6n377beZ7nbOdevWbbWPWX/99d3xYYcdFmrr2bNnqcdbhOOmm25y55JYsKBBgwZA6uuoItHBli1buuO77ror9HMrVqxwbVa0xr5PrSBLMevevbs7tiIj9hl3++2356VPcWCFn/zvkOrVqwPBxqh+AR//OkuqK6+80h1buWf/77Krr74aCG/0aawAyGOPPQZAvXr1Sj3GCmz53xX2vZtPiuSIiIiIiEiiJD6SY/zNi+68886MP/8DDzwAQO3atQEYN25cxl8jjmz2CGDHHXcMtVk5Rwmza8TWiwEMGjQICGaLrWwtlF/2t1hYNMsipn6JdpsdtzUmVoYWghmr9u3bA+Hr9b777steh2PCZsQtsnzttde6ts033xwIohB+6WmbjbNc9RkzZrg2W9eUhAiOsfebv2l0mzZtVvtzs2fPBqBRo0al2qx87aGHHurOWdRi7ty50TsbM7aO0Gf5/xVZP+ivFbCIo7F1nlA6I6Lk900x6tq1qzu2LRzGjBkDhDMpis3ChQuB8N99FsmxzWv9aOzYsWNz2Lv88Ddgv//++4Hwxp32N4gf8TGptgUpFIrkiIiIiIhIougmR0REREREEqXKqorWDc2hdMtOlnTssce6Y1soNW/ePHfO0jSiWmut/7L8/LCeX5oVwgtRLTSYrij/NJUdu4paY43/7o/9tDxLbXnyySeBYJd6CIdKcyHOY2e7T6dKm7Qyqfvvv787Z6H3XMnH2NmiRQjSUD777DN37pNPPon0vJYyaQt0/bHMRgg+3bHL9jVn5Z5//vlnIDymxnav3mabbdw5Kyrw1VdfZbV/Ji7v17XXXtsdN2vWbLWPt+8C/ztn3333BYJSvv4CedtV3L4fli9fXske52fs/GIBVsrZxgLg3HPPBeDuu+8u8zms7Pbxxx/vztnvMnXqVAD22msv12aLw0888UQA7rnnHtfWu3dvAIYOHZrW7xGX6y5d9evXB2DmzJnunG2TYSmXfontbIjj2NnnlqVA2nUEQWrpxRdfDITfe3369AHg3nvvzWr/TFzGzv87wwoOrLfeeqv9OT9V2UpH22fngAEDXJsVM8ikdMdOkRwREREREUmURBYe8O/07DiTAatTTz0VCEdy7Pnnz58PRI/eFIoLL7wQCG9aOXnyZCDYPC/X0Zu4swW61113Xam2iRMnhtrSjd7YolMr9Qjw008/AfHePLRp0
6ZAeGO2DTbYAAhvzGZFAqwE8uLFiyv0/BdddBEQLDbdYYcdXFurVq2AoKR8ElVkw06L7tg1BMkqKpAOv+RsOhvlWcltCDaytMIDPXr0cG1WqtrGt2/fvtE7m0dWCASCCI4f9Rs1alSZP2sFV/wCIea2224DgsXzvn79+gFw5plnAuENvotlmwZjkUA/SmifjdmO4MSZ/W1mm0f7GTVWmMaK1xx33HGuzSKBtkF8nL8zM+mVV15xxy1atACCjT7tu9lnkUPLkILgb0H7TLAsn7iIV29EREREREQqKZGRnGyxWbczzjijzMekKqeZJDVq1ADCpXiNbSxVzKUrS/KvB9sc0Er1fvjhh66tS5cuQPkb1FarVg2AzTbbzJ2zTUQtKmGzMAC77bYbEM9ZKZuBtKifRW989vtCMNNms8D+7Hh50Qpbi2IRMn+jsyFDhgBBqeClS5em+Vskg12PW2yxhTtnx7lak5MkX3zxBQBXXHEFEET3IViHZ3nshcrWPlSUH2G2tTh+yXJjUQj7zPLXLZb8znnwwQfdsW0WnHRW2t1mz3///XfXVqwRHFsDB0H01NZr+VF6i9LefPPNQDiSY5H+Ymaf9el+5vvbCwCccsop7rjkWvV8UCRHREREREQSRTc5IiIiIiKSKIlMV3v++efdsaUK1KtXz52zEJpf6q4kSwnyiwtYiowttPzhhx9cm+0k/tFHH1Wq73HXuXNnIPUO33456WJnIfQRI0a4c+ussw4QlLW87LLLXJulSllpUFscCUEp7tq1awNwxBFHlHq977//HginUn7++eeV/C2yx8p4WvpZRdl19+KLL7pzVvbTUt/89Dy/eEFJVrbaPhuKNTXLCg/45UQlcyylNEneeecdd2ypa34JcktdPvLII4GgnDbA+uuvX+bzPvTQQ0CweNlfWF/yMZYOWEzuuusuICjXaynQEF5EXkxOOukkd7zxxhsD8NJLLwHhdD6z3377lflcG264IRDPFO+4si1D7G/lipSgziVFckREREREJFESGclZtmyZO7Y7+QYNGrhztoCxVq1aZT6HzTBvu+22pdqsdKWVtYXcbSKVD/6mibbY00pm24w8xDtykAsWqYGgnKJ/zlgEcO+993bnzjvvPKBikY1//vnHHVup8oEDBwKF829gBSyMH3EZOXIkEGzoBnD00UcDwaaLfrljK0pg/7VSoRCU4vaLNYjkgpXX33333fPck8zzS/NaoYVbbrnFnbPf2S+6UBHlRXmmTJkCBBFwKyqSdBadADjooINCbR988EGuuxMbVjDGig1AUPRo7NixpR5vY2dlyn29evUCiqeARSbZpqrTpk0DoHXr1vnsTimK5IiIiIiISKIkMpLj69+/PxDMDkOQg3/uueeWenyVKlWA8jcRtecsb01Pkhx11FHu2MbD1kRYqUYJr4cpOePms03wUm2Gl4rNlMybNw8IX3dWjrXQWPnmr7/+GghHbebMmVPq8ePHjw/9vz921atXD7X5EbLyWETW31SwGNkGbz4rqVos65SsnLG/Fi5qVLRTp05AUM7djzqaJM3A33333UB4o12/jGwU3333HRD+frG1OAsWLKjUcxcafwys1L5tljps2LC89CkOrIy2/zfa5ZdfDsAmm2wCQNeuXV2brRmx9a+jR492bcU8jpli69H9vxcPP/xwAJ577rm89AkUyRERERERkYTRTY6IiIiIiCRKlVWp8rHyzFLGMslPV7MyyOW9dqphsd3rc1UqOco/TSbHzoo1+GWxrRSopSbYotO4ycfYnX322e7Y36m7IlasWAEE6Vt++NxSE1KlcWVDvq+7ivBL0vbp0weASy+9NK3nsJKr5aUWpivdscv1uJXHL7dqJfFvuummnLx2Pq656667zh3bwmM/tcx2SH/qqaeAoIQ7BJ+NftlkYyVUbasBny3It89Uv4hIVHF5v1pZY4C2bdsCcPDBBwPh8uQ77rgjAMOHDwfC152lqVkarhUOyZa4jF0qz
Zs3B2DChAnunF2Te+21F5C774RU8j12lvLsp4nbdWOloP1CPlbcxlKnrAAV5L5kdL7HLhvse9S/Xp955hkgnMJWWemOnSI5IiIiIiKSKIkvPGDOOussd2wlZm0WrkWLFqUebxuK+ovT0i2HWehsIzZ/BtMW7dmMkgQGDRrkjqtVqwZA37593Tmb4Z0+fToQvrasNHexXWNRLVq0yB3bpoA2nv5mqbYZYbNmzQD4888/XdvNN9+c9X4WEr/0dp06dfLYk9zwy5jbgu5U7bYJdLpmzZoFBKWkIYhUJLHYhf+dYIVp/E17pWLse9c+z1JFrfMZwYmzbt26hf7fLwltpcefeOKJnPapWNjWLX502qK2VtjF/n7MJUVyREREREQkUXSTIyIiIiIiiVI0hQcKUb4XpzVu3BiAGTNmuHOTJ08GgoWPcZXvsStkGrvoCrnwQD7l45o79NBD3XGrVq2A8OfaW2+9BQSLZv203QceeACAzTffHAjvum5FQ/r16wfA3LlzK9XP1dH7Nbo4jt35558PwK233lqqrXfv3kCw6D6f4jh2hSLJY+dfm1YUYo899gDgvffeq/Tzq/CAiIiIiIgUNUVyYizJd/vZprGLTmMXnSI50eiai05jF10cx+7nn38Ggsih/T8Epbk///zzrPahIuI4doUiyWN3wAEHuOOXXnoJCLZd8YsCRaVIjoiIiIiIFLWiKSEtIiIiEmevvfYaADNnzgTgjjvucG2//vprXvokUlFTp051xxMnTgSgSZMm+eqOIjkiIiIiIpIsuskREREREZFEUeGBGEvy4rRs09hFp7GLToUHotE1F53GLjqNXXQau+g0dtGp8ICIiIiIiBS1WEZyREREREREolIkR0REREREEkU3OSIiIiIikii6yRERERERkUTRTY6IiIiIiCSKbnJERERERCRRdJMjIiIiIiKJopscERERERFJFN3kiIiIiIhIougmR0REREREEkU3OSIiIiIikii6yRERERERkUTRTY6IiIiIiCTKWvnuQCpVqlTJdxdiYdWqVWn/jMbuPxq76DR20aU7dhq3/+iai05jF53GLjqNXXQau+jSHTtFckREREREJFF0kyMiIiIiIomimxwREREREUkU3eSIiIiIiEii6CZHREREREQSRTc5IiIiIiKSKInSH0kAACAASURBVLrJERERERGRRNFNjoiIiIiIJEosNwPNtfbt27vjFi1aANC3b18A3nzzTdf24osvAnDXXXcB8Pvvv+eqi0XFxt7Xv3//PPQkuoYNGwKw5557unNt2rQB4MgjjwSgdu3arm3evHkATJkyBYCzzz7btS1evDi7nRVJw/vvvw8En5VHHXWUaxs7dmxe+pRN9erVA2CdddZx5xYtWgTAb7/9VurxW265JQCHHnooANtvv71rO+2000LP9dRTT7m2kSNHAvD0009nqusFZ7311gPgtttuA6BHjx5lPrZWrVruWN/FUlH2PdyxY0d3brfddgOCa2rOnDmu7eOPPwZgwIABAPz999+56KZkiCI5IiIiIiKSKLrJERERERGRRKmyatWqVfnuRElVqlTJyetYutD48ePduV133bXMvthQLV26FIDevXu7tocffjjj/YvyT5OrscskS1HYaKONAHjnnXdcm4WGt9hii7SeM5djd+qppwJw/PHHu3NNmjQBoE6dOqWevyJ9u/POO93xBRdcEKlfUcXxurP0nsMPPxwIpzTuuOOOZfbFfpdJkyYB8MILL7i2G264IeP9THfsCuX9aqlXAM899xwQ/K59+vRxbQMHDoz0/HG85uwzx1JI7fMJ4LPPPgPgm2++KfVzNlZrrrlmWq83e/ZsALbZZpu0fi6OY5cOP23oqquuAmDnnXcGyv/dLrroInd8xx13RHrtQh+7fCqEsatRo4Y7ts9+S02zFHGA+fPnA7D11lsDULdu3VLPNX36dACuu+46d2706NGR+lUIY+dr164dELxX/RTcUaNGAXD33
XfnpC/pjp0iOSIiIiIikihFWXjgoIMOAuDSSy8FSkdvVqdmzZoADBkyxJ2z2bsuXbpkoouJ58+wDBs2DIDdd98dgI033ti1vfbaa7ntWBq6d+8OwKBBgwCoWrWqa1uyZAkAP//8sztnMzGzZs0C4OWXX3Ztm222GQAnn3wyAF27dnVt9vz+YsikqV+/vju2qE3Pnj3duX322QeA5s2bl/rZkjM7qWZ6WrduDUCzZs3cuQ8++ACAl156KWq3i0bJaJlv5syZOexJ7pSMMPtsPCxim65ffvkFgOXLl7tzjRo1ivRchcY++wcPHgwEhVgA1l133dBj/ai+zR5bBoZ9jibBhRde6I5vueUWAP79918Avv32W9dmi9/9iLSxIhj+NVXsjjjiCHdsBUEOPPBAIIju++z6s+8LgG7dugHQqVMnAB555BHXFjWSE0drrfXf7YAV4vKLPTVt2hSAlStXAuHiC61atQJgp512AuCMM87IfmfToEiOiIiIiIgkStGsybF1ExDk7/rRhJIs59rPM7R1EnbH61u2bBkA5513HgBDhw6tZI8LL28zHVZiGYJc9Oeffx4I/w4W0Ui3RGguxs7Kv+6xxx4APPTQQ67NyozPnTs3ref89NNPgfAM8Y033gjAFVdckdZzRZWLsatevToQ5OD7709/HVM22UywRRBTlQNOV1LX5EybNs0dWxTDfld/vY4fnUxHHD/rLNKQambSXttmNC0y45swYQIQrGGCYAbefl//915jjf/mHO27pKLiOHblGTduHBC+bkqydbKd/197dx4313j+cfzjpyjVCpXYYktR0SIVVC2J2NqipEGiqARtiRC1xa6homoJLVpbaymimqjU0tpaldqK2Im1iCJqDdrY6vdHX9/73OeZ88wzM88sZ8583/84zpnMnNw5s5z7uu7rGjUq7Lv55puB5P2a9T1crbyM3S233BK2N910U6Cyc4vP5fbbbweS7If4N8gLL7xQj9NMycvYZVHUZubMmWGf1tKoPHm19J0c/yYcNmxYTc+Vl7EbMGBA2NYa1ZEjR5Y8btq0aUDyW+Tee+8Nx5TRpOf6+te/XvfzjHlNjpmZmZmZdTTf5JiZmZmZWaEUvvCAQr+TJ08O+8qlqWnRn1LTVFoQkk7fShtSAYP4OZUK9/TTT4djf/3rX2s+/6LQ+GjRn9KUIAnDbrbZZgDsvPPO4VieO1lrwbpCuLo+6k1lLYukf//+QFL8ozeUkqAiD+poDUmRkCyrrroqkKS/uABBKaWmrbDCCiXHNN5KNy0CpblA8lmV5eSTTwbghhtuAPJdIKWV9LkfL9BWGXgtrI/T85SSFpeVlttuuw3Ivhbb3cCBA0v26ZrStQZJypSKsqy22mrh2IYbbpj675577hmOaTH5Aw88UM/Tzq1+/foBsNBCC4V9SoevRPwdopRJXbfTp0+vxym2lAoJxCX/lXr30ksvAbDTTjuFYyoAkpUq9swzzwAwevRoIJ2GmlUgo9kcyTEzMzMzs0IpfCRHjRSzZnTnzZsHpBsLasYpjuCIZoxVSlDFCSApb6nXWXPNNcOxTo3kqAQrJGWiVS40XkSnaI3KJ1cz49JK55xzTsOeW4uQIXtRc7t79913gaQwgyI7lXr99dfD9gUXXAAks3eK3vbkySefBNJlatuFSskeeeSRJccUmYL0AtFaXHbZZUD6vaxrU9HqOGrd7uIWAPGi3K7qEYEsMl0v+tyPZ3cVwen6uQ/w+9//vtvn1ILvuLRtEc2YMQNIFoDHn/8q7HH55ZcD6Si/ShurFPkyyywTjqkIxuabbw7Ao48+2pBzzwuNQRwlnDVrVsV/Pr5ex40bByTNQydOnFiHM2y+BRdcMGzrOzP+rrjjjjuAJCJT6ee6ilqo9YMKMUESyVHW06uvvhqONSuq6EiOmZmZmZkVim9yzMzMzMysUAqZrhbXLh8yZEi3j7vnnnuA2uuma8EVwIorrggk/T8OOOCAcExh5Hr04WgHW
mw6fvz4sC/uag3psOUWW2wBFD+EXs6gQYOA5Dp66623wrEzzzyzJefUSK+88gqQFGvISleL+zzsvvvuACywwAJAugu9+plUQmlyAPvuuy+Q7+IWXSklQEU6shaC9jZFDZKFqfp3iV9HvWFUcKMI1HMlTt8oZ86cOUD2+KvjvNKqpkyZEo4pRbrolE7Z9XM/puIz5VLUYtX2HGsn8XtWKZPvvfdet49X0Q/9F2DdddcFkvGMf/uo95jSsIr+XasCDYsuumjYp9+F5YqE7LXXXgAcdNBBYd/VV18NwIknngikv5vbyaGHHhq29TkXv6fUD6za9OOpU6cCcMUVVwBJijMk16TGMH6vqwdiozmSY2ZmZmZmhVLISE58Fx7fyYtmItXhtVYqywhw9913A8mda7xodbfddgPSnXKLaPDgwUCyMC+ro7WKNxx33HFhX9FnlboTFxc47LDDgCQKpv+HdIGLotFiR5UqhmQGfLvttgv7FMGp1ccffwzAH/7wh7CvHcv+qmRsVsRBM2r1sP/++wPQp0+fkmMq+tAuBUIqMXbsWCD9mV5O3759gfLdt7W4Ny4OobLUzz33XC2nmWtxcYpyGRSK8mh8LP1ZVytlinz44YdAuriP3Hrrrb1+nXagz6a4VYWiCfptFhclUIsRlS6/5JJLwjFFONpdXFJczjvvvLD9yCOP1PS8KiSi74w4W2LatGlAkoHQipLSjuSYmZmZmVmhFDKSo/UN3VHURY3c6uH999/v9phm5YseydGsyTbbbAOkZznVdErluttpHUSjrLXWWmE7brwF6fVeRabmufpvLC7t3lunnnoqkF1yuZ0oB//5558HkjVckEQXarX44ouH7Q022KDbx+m1lfdehEjjLrvsUtXjNev57LPPlhzbeuutAZh//vmBdFR/0qRJQPPy0Zsp/jutvfbaqWNx1K9dS/DmnTIp9L6Mv3+vv/56oD7r9dqVIo1XXXUVkI50qfmnmsAfeOCBTT67xlFbk7iEtNRzfdFrr70GwNJLLx32qXS0ov+taKfiSI6ZmZmZmRWKb3LMzMzMzKxQCpmu1pN40Xu9aLF0PVPg2oEW6kHSTVmpe9dee204dsIJJwBOU4vFpWVFKX9aJNnJtEAZkjLsWYtpK3HwwQcDsNBCC5XsaydKSYvT1OplxIgRYTsuBNHV4YcfDhQjTU2Uxpi1APwf//gHkC5aoS70WW0BVl99dSApl6piEVB5YYN2FKdjK1VKaSrf+973WnJORRd/nqmQ0jLLLFPyOH3/dgql1MZFjfSZtskmmwDJgnlIilX98pe/bNYpNs36668PpNs0qHS0vlfrafvttw/bKs2v15k9e3bdX68njuSYmZmZmVmhFDKSE5fmzZr5jY/Xm14vft1aZ5/zTM0Y1SALkjKBaq4al0G2pFGsZjzjGV6ZPn16U88pz+JIlxZIajG3FndD0rhSC+e18DamGaUf/OAHYd/nPvc5ACZMmAAks/PtIOszZdNNNwWShaZQvqGgaEHuD3/4w7LPX8mxdqVy4vUoKz5r1iwAHnjgAQC++MUvhmNqaaDIRhHKKGd9jonGM27+bPVz7rnnhm01CJa4KIauyU6hz6hyv/V+97vfhe0zzjij4efUKllRVH021bPwgL53TznllJJj8e/EZnMkx8zMzMzMCqWQkZw41zKrWVt8vN70evHrlmsY16523nlnAFZdddWSYypXaWlaAzJ69GggfV2oaZZyiS2taxOxeL2XLLHEEkC6HPe4ceOAJB9bzVYhaY6mPGGVOYf6znDV0wsvvAAkM+SK3sTihn+Kqj7++OMlj1M5ajUrHjhwYDhW7jNLDY87pbFgrbSGZ+TIkSXHivSdoJYB5Y7Fn2uaZdd7OP6+yHpfW/e++93vhm1dUxrruBn33Llzm3tiLXbUUUcBsMoqq4R9c+bMAWCppZYCYNtttw3HtI7p5ZdfbtYpNo3WBsZl8ldaaSUg/X0YN0ethjJ4VH47bgosrfy8c
yTHzMzMzMwKxTc5ZmZmZmZWKIVMV2uFr3zlK60+haZYfvnlgXRqT1et6GqbV0OGDAnbKlMrKuMIsPvuuwNJ+W2r3htvvAGkF+NqcalKqO69994lf+5HP/oRkC5veeGFFzbsPHtDxRFU1OOWW24Jx7SofZ111gn7Lr300m6fS2ktSl0ol1IQpxtNnTq1yrPuTCo93SmyClIoXXTLLbcM+1RiVu/F+D05c+ZMIFlE75YD2fSZFS+sVxq+ilk888wzzT+xFlhggQXC9qmnngokqchxOp9K3t9zzz1AukDLBhtsACSpXUWiz+u4EIWK9cQl86+44oqKn3OxxRYL20q1VwuRvHEkx8zMzMzMCqUjIzm6i+1tudBll102bJdrdnb66af36nXyRAtJs2Z9r7vuumafTm716dMHSEcEVEJaY6fiDeAIjqIJkDSY/epXvxr2aUbuww8/rOp5Fd157bXXenxsvFA3r5Ecue+++wAYM2ZM2KdI4eDBgyt6jmoaisaNWZ977rmK/1wn+9rXvtbqU2iqrGI7Tz75JACXXHJJOLb22msDcNZZZwHpcVJGhGbUt9hiiwaecfvZeOONATjkkEOAdBElfSZcdNFFTT+vVop/e+2///5AUnjgyiuvLHm8ooXxdfeNb3wDSArczJs3rzEn20K6PgAGDBgAwEknnRT2Pfzww0AS8Yrfz2rdsN566wHp5rL6nlZEZ4cddqj7ufeGIzlmZmZmZlYohYzk/OIXvwjb8R2nTJo0CYCHHnoIqH0NycUXXxy241KFkJ7t/M1vflPT8+dF3OztmGOO6fZxmrWzpCxv1my5cqXXXHPNsO+uu+5qzonljCI4N998c9i38sorA+mozYILLliyrxpqRhavDVh//fVreq48ufrqq8P2TTfdBKRL+iqqo+twxx13rOr5NZN+22239eo8m01R00GDBoV9+ryvtVRqtTTrGdPMu8p3F52+O+LPtwcffBCAvfbaC0g3ZVSpd0V7+vfvH47Faxg7iRpvA0ycOBFIyvZ+8MEH4dgRRxwBFLMMchY1f47Xur7yyisATJ48uds/p3ViiugAfP/73089VxEjOfH6JK1jGj58eNj3yCOPAElD8vi7VmOtdTd6LCTfN7omHckxMzMzMzNrIN/kmJmZmZlZoRQyXU1pCZCEHZW+AEnpQHWgrzRdTZ1ytTBaZQdj7733HpBOk6tk0XOexekvSy+9NJBdeOBXv/pV084p71TcIi6rqnKfSm0855xzwrH99tsPSMo4qlt67IknngDgo48+qupcVPYb4NVXXwXyU+jg2GOPBZIUtZj+vpB0Ztb7q1q6XuMOz0WjsYkX22YtvJVrrrkGSDp/x4uYlQaj6yU+1g7uvPNOANZaa62wTwtrlbpYbmyyxOVWdW3G16io0EycGin/+c9/gOz3dxFtvfXWQLrwgCi9+d133y05tsQSSwDp79hOLV0epxltuummqWPnn39+2I7LyXcCvc/0mwSS3yrVfr9llT8vmnhMdtllFyCdvvzTn/4USD7n4jFRetrxxx8PwMknnxyO6Xvn61//eslr6vOxWSnCWRzJMTMzMzOzQilkJCcuZXzvvfcCSenF2NChQ4F0ycUDDzwQSJruLbnkkuGYZt/WXXfdkufSneoBBxwA5L8Eba0UjdDMriIQALNmzWrJOeXR4YcfDqRnwLWIW+UbY1pwq1K9cSRQUQjNTsdFLXRNatYljrBpIWA866I/mxWFbIVDDz0USBdo0GylomGQzFKqCe1VV10VjqlMdDkjRowoec6u5syZU+FZF4OKEugaja8dLQZXOdF2o/dTTIU+Lr/8ciCJMkB2REbvVz1XvABcGQJZkcW+ffum/v/jjz8O27fffntlf4E2ohn10047reSYCj9kFRDQezEuH6/PMV2Tinx1on79+gHZDYxl/PjxzTqd3Nl1112BdKZMXMCmGjNmzABg7ty5vT+xNqD3VVwUS9uKyHzqU8ntgUprl4voZxVZe
[... base64-encoded PNG data omitted ...]",
+ "text/plain": [
+ "
    " + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "# takes 5-10 seconds to execute this\n", + "show_MNIST(test_lbl, test_img)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's have a look at the average of all the images of training and testing data." + ] + }, + { + "cell_type": "code", + "execution_count": 99, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Average of all images in training dataset.\n", + "Digit 0 : 5923 images.\n", + "Digit 1 : 6742 images.\n", + "Digit 2 : 5958 images.\n", + "Digit 3 : 6131 images.\n", + "Digit 4 : 5842 images.\n", + "Digit 5 : 5421 images.\n", + "Digit 6 : 5918 images.\n", + "Digit 7 : 6265 images.\n", + "Digit 8 : 5851 images.\n", + "Digit 9 : 5949 images.\n" + ] + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAzkAAACBCAYAAADjY3ScAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAAPYQAAD2EBqD+naQAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4wLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvqOYd8AAAIABJREFUeJztnXmsVdXZxh8EGQSZp4LgBIoXUYyKQypiLVJRUqmxjbbGVq0V0Yqp81BB/SppQqJtorFRYxWlRk3baGtJW1HjgIgjiDOiOBQBleHiUHrP94c+Zz933dfj5Ubu3mf7/BJyD3ufYe13v2utvd5pdahUKhUYY4wxxhhjTEnYJu8GGGOMMcYYY8zXiRc5xhhjjDHGmFLhRY4xxhhjjDGmVHiRY4wxxhhjjCkVXuQYY4wxxhhjSoUXOcYYY4wxxphS4UWOMcYYY4wxplR4kWOMMcYYY4wpFV7kGGOMMcYYY0qFFznCxo0bMWPGDAwZMgRdu3bF2LFj8ac//SnvZhWeDRs24Pzzz8cRRxyBAQMGoEOHDpg5c2bezaoLHnjgAZx88skYNWoUunfvjqFDh+L73/8+nnrqqbybVnieffZZHHXUURg+fDi6deuGvn374qCDDsLcuXPzblrdceONN6JDhw7o0aNH3k0pNA8++CA6dOgQ/lu4cGHezasLHnnkEUyePBl9+vRBt27dMHLkSFx55ZV5N6vQ/PSnP/1SvbPu1eaZZ57BMcccgyFDhmC77bbDqFGjcMUVV2DTpk15N63wLFq0CJMmTcL222+PHj164LDDDsOjjz6ad7O2iE55N6BI/OAHP8CTTz6J2bNnY7fddsMdd9yB448/Hk1NTTjhhBPybl5hWbt2Lf7whz9g7733xjHHHIMbb7wx7ybVDddffz3Wrl2Ls88+Gw0NDVi9ejXmzJmDAw88EPPnz8d3vvOdvJtYWD766CMMGzYMxx9/PIYOHYrGxkbcfvvtOPHEE7FixQpceumleTexLnjnnXdw7rnnYsiQIVi3bl3ezakLfvOb3+Cwww5rdmzPPffMqTX1wx133IETTzwRP/zhD3HrrbeiR48eeP311/Huu+/m3bRCc9lll+H0009vc
XzKlCno0qUL9t9//xxaVXyWLVuGgw8+GLvvvjuuueYa9O/fHw8//DCuuOIKPPXUU/jrX/+adxMLy5NPPonx48dj3LhxuO2221CpVPDb3/4Whx9+OBYsWICDDjoo7ya2joqpVCqVyt/+9rcKgModd9zR7PjEiRMrQ4YMqWzevDmnlhWfpqamSlNTU6VSqVRWr15dAVC5/PLL821UnbBq1aoWxzZs2FAZNGhQ5fDDD8+hRfXPAQccUBk2bFjezagbjj766MqUKVMqJ510UqV79+55N6fQLFiwoAKgctddd+XdlLrj7bffrnTv3r0ybdq0vJtSCh588MEKgMqll16ad1MKyyWXXFIBUHnttdeaHT/ttNMqACoffPBBTi0rPpMmTaoMGjSo0tjYWD22fv36Sv/+/SsHH3xwji3bMhyu9gV//vOf0aNHDxx33HHNjv/sZz/Du+++iyeeeCKnlhUfuszNljNw4MAWx3r06IGGhgasXLkyhxbVP/3790enTnZSt4a5c+fioYcewnXXXZd3U0zJufHGG9HY2IgLLrgg76aUgptuugkdOnTAySefnHdTCsu2224LAOjVq1ez471798Y222yDzp0759GsuuDRRx/FhAkTsN1221WPbb/99hg/fjwee+wxvPfeezm2rvV4kfMFS5cuxR577NHi4WivvfaqnjemPVi3bh2efvppjB49Ou+m1AVNTU3YvHkzVq9ejeuuuw7z58/3g1QreP/99zFjxgzMnj0bO+ywQ97NqSumT5+OTp06oWfPnpg0aRIeeeSRvJtUeB5++GH07dsXL730EsaOHYtOnTph4MCBOP3007F+/fq8m1dXrFu3DnfffTcOP/xw7Lzzznk3p7CcdNJJ6N27N6ZNm4bly5djw4YNuO+++3DDDTdg+vTp6N69e95NLCyfffYZunTp0uI4jy1ZsqS9m9QmvMj5grVr16Jv374tjvPY2rVr27tJ5hvK9OnT0djYiEsuuSTvptQFZ5xxBrbddlsMHDgQ55xzDn73u9/hF7/4Rd7NKjxnnHEGdt99d0ybNi3vptQNvXr1wtlnn40bbrgBCxYswLXXXouVK1diwoQJmD9/ft7NKzTvvPMONm3ahOOOOw4/+tGP8K9//QvnnXcebr31VkyePBmVSiXvJtYN8+bNw8cff4xTTjkl76YUmp122gmPP/44li5dil133RU9e/bElClTcNJJJ+Haa6/Nu3mFpqGhAQsXLkRTU1P12ObNm6tRTfXyTOyYDqFWyJXDsUx7cNlll+H222/H73//e+y77755N6cuuPjii3Hqqafi/fffx7333oszzzwTjY2NOPfcc/NuWmG55557cO+99+KZZ57x2LYF7LPPPthnn32q/z/kkEMwdepUjBkzBueffz4mTZqUY+uKTVNTEz755BNcfvnluPDCCwEAEyZMQOfOnTFjxgz8+9//xne/+92cW1kf3HTTTejXrx+mTp2ad1MKzYoVKzBlyhQMGjQId999NwYMGIAnnngCV111FTZu3Iibbrop7yYWlrPOOgunnHIKzjzzTFxyySVoamrCrFmz8OabbwIAttmmPnwk9dHKdqBfv37hyvSDDz4AgNDLY8zXyaxZs3DVVVfh//7v/3DmmWfm3Zy6Yfjw4dhvv/0wefJkXH/99TjttNNw0UUXYfXq1Xk3rZBs3LgR06dPx1lnnYUhQ4bgo48+wkcffYTPPvsMwOdV6xobG3NuZf3Qu3dvHH300Xj++efx8ccf592cwtKvXz8AaLEQPPLIIwEATz/9dLu3qR55/vnnsXjxYvzkJz8Jw4lMxoUXXoj169dj/vz5OPbYYzF+/Hicd955uOaaa3DzzTfjoYceyruJheXkk0/G7Nmzcdttt2GHHXbA8OHDsWzZsqrxcOjQoTm3sHV4kfMFY8aMwYsvvojNmzc3O864Q5cHNVuTWbNmYebMmZg5cyYuvvjivJtT14wbNw6bN2/G8uXL825KIVmzZg1WrVqFO
XPmoE+fPtV/8+bNQ2NjI/r06YMf//jHeTezrmColb1iXw7zW1Mou3qxDOcNvQ+nnnpqzi0pPs8++ywaGhpa5N6w5LZzrWtzwQUXYM2aNViyZAlWrFiBxx57DB9++CG6d+9eN5EmHlW+YOrUqdi4cSPuueeeZsf/+Mc/YsiQITjggANyapkpO1deeSVmzpyJSy+9FJdffnnezal7FixYgG222Qa77LJL3k0pJIMHD8aCBQta/Js0aRK6du2KBQsW4Kqrrsq7mXXDhx9+iPvuuw9jx45F165d825OYTn22GMBAPfff3+z43//+98BAAceeGC7t6ne+PTTTzF37lyMGzfOhtdWMGTIELzwwgvYuHFjs+OPP/44ALjgSivo0qUL9txzT+y444546623cOedd+LnP/85unXrlnfTWoVzcr7gyCOPxMSJEzFt2jSsX78eI0aMwLx58/CPf/wDc+fORceOHfNuYqG5//770djYiA0bNgD4fBOuu+++GwAwefLkZmUITcacOXPw61//Gt/73vdw1FFHtdi52hP/l3PaaaehZ8+eGDduHAYNGoQ1a9bgrrvuwp133onzzjsPAwYMyLuJhaRr166YMGFCi+O33HILOnbsGJ4zn3PCCSdUwyP79++PV199FXPmzMGqVatwyy235N28QnPEEUdgypQpuOKKK9DU1IQDDzwQixcvxqxZs3D00Ufj29/+dt5NLDx/+ctf8MEHH9iL00pmzJiBY445BhMnTsQ555yD/v37Y+HChbj66qvR0NBQDZU0LVm6dCnuuece7LfffujSpQuee+45zJ49GyNHjsSVV16Zd/NaT8779BSKDRs2VH75y19WBg8eXOncuXNlr732qsybNy/vZtUFO+64YwVA+O+NN97Iu3mF5dBDD/1Subl71ubmm2+uHHLIIZX+/ftXOnXqVOndu3fl0EMPrdx22215N60u8WagX83VV19dGTt2bKVXr16Vjh07VgYMGFCZOnVqZdGiRXk3rS7YtGlT5YILLqgMGzas0qlTp8rw4cMrF110UeWTTz7Ju2l1wcSJEyvdu3evrF+/Pu+m1A0PPPBA5YgjjqgMHjy40q1bt8puu+1W+dWvflVZs2ZN3k0rNC+//HJl/Pjxlb59+1Y6d+5cGTFiROXSSy+tbNy4Me+mbREdKhXXbTTGGGOMMcaUB+fkGGOMMcYYY0qFFznGGGOMMcaYUuFFjjHGGGOMMaZUeJFjjDHGGGOMKRVe5BhjjDHGGGNKhRc5xhhjjDHGmFLhRY4xxhhjjDGmVHTKuwERHTp0yLsJhaAtWxhZdp9j2bUdy67tbKnsLLfPsc61Hcuu7Vh2bceyazuWXdvZUtkVcpFjjDHGmPqk1gMZz23pw4r3LTfGbCkOVzPGGGOMMcaUCi9yjDHGGGOMMaXC4WrGbEU0bIOv07/p6xSGafBvU1NTi3PGGNNeRGPXNttkNtNOnTo1+9u5c+fquS5dujT7q+f4HRzjPvvss+q5jz/+uNnfTz/9tHpu8+bNzT4HeGw0xtiTY4wxxhhjjCkZ32hPTi2LenSOpJb1rzr3TbEopbLS/9eSQb3LJ7JkbrvttgCAbt26VY/x9XbbbQcA6N69e4tztHyqTGix3LhxIwBg/fr11XObNm0CkFk3adEEgP/9739tv6gCoPpD2fJvrT4b9b1aXrBa/fmbSirLLa3sU0ZZRjKoNU+0xjsbHauHuUOvrWPHjgCyMQ/IvDQc63r06FE917t3bwBA//79m/1fv4Nj14YNG6rn3n//fQDAqlWrAAAffvhh9RzHwf/+97/VY+zrRZWhKQ5pX7XOlAd7cowxxhhjjDGlovSeHK7QaW0CMmtR165dAQC9evWqnuvbt2+zY3qOVnZa1tWivnbtWgCZdUnPffLJJwCaW9nr1VJAeVIW6qno2bMngMxqp7HWtKpFsdaUZ+qV0PepV6IosqNXIZIF9WbgwIHVY0OGDAEADB8+HAAwbNiw6jm+j7LT6
6UuvfXWWwCA119/vXqOx9555x0Aza2bjY2NAJrrXVFI+6VagSlH9XRRnrT6qmWY/ZjfqddL71fUL2klppxUJyn/ouhaa9hSz3R6bbU8Z+qlTI+prvJ15FFUL1oR+Kq8Eupk+ldfc4zTsY7jAdHrplz4V2VH/dNcE77mubz1MZIT+7DKgB6c7bffHgDQr1+/6rlvfetbAIDBgwcDiD057JM6b69btw5AJl89F3l4zTeDaGyijtCjCLSMpFB9JeyXfGbT12lf1Pc7F6y42JNjjDHGGGOMKRVe5BhjjDHGGGNKRSnD1aJSllEoEd3mO++8c/Xcbrvt1uzYoEGDqucYFsMQmLfffrt67tVXXwUAvPjiiwCAFStWVM8xUVKTKIsYQvRlRGEsDCtgOAKQhWMxDIHhawpdvxo29MEHHwDIQorU3RuFuuTpDlZZULfo/taQjB122AEAsMsuu1SPjRw5stnfHXfcsXqOMmMYlv5OGq62dOnS6rnnnnsOQPNQGkLZMQwQyDdkSPslwwgYkqYhK5TFTjvtVD3G/shj1DUgCzFl+IqGOzJZmSF+r7zySvUcjzHUb/Xq1dVz7ONFCzFNw4U0ZCcKG0pDe1Sv0r6l59LEcY59+l38vIZvsH8z3EhfM9wjbzlGsqPMNESS4xf7NfUMyPSVc4nOL5Q5f0d1iDKgTHRO4PjH0Gcg00ke01C2POQYhT1GZaIpD8pQw3Y57/KYyo7XRz366KOPquc4DnI8q9fw0tZuHbClhStqnasHubSG6NlOxyY+j/C5TedYzh0ME9f5mt/FcZ9zAgC88cYbAIA333wTAPCf//yneo7PLjrHpiH29SL7NAxZw245Vqr8SZqKEIUvR4V/2gt7cowxxhhjjDGlolSenGgFSisRy1UCWeL3qFGjAAB77rln9RyP8T262qe1nJY5WokBYOjQoQAyy54mvHH1quUtacmrl1U+SS3J6kFIPRoqO6KWS0JZ1CoBXBQiCya9LwMGDKieo6dBiwtQR6iLasGkxYMWzKgcK79fPRxMxuVf9ZDRKqVJlHlYUiizKDGZOqJyGjFiBABg9913rx7bddddAWQeMrWq03oXJZJSBrwfqpO02qdla4Gsrxah4EUtT2pkPVfvKuXM9+s1UC/UCkkoG8pLPW206vHz2qdpeY9KoOedPJ8Wu4gswNqHqWvsb5wTgMwb0adPHwCt39AyLYSxZs2a6jnOJ2pFZlspQ/UKtWc0QOpV+CqLOsdE6g+9s0DmwaHMdV6kZXzlypXN/gKZfDjGqVcrspoXxePPexgVWaHeRMfSz6Xfm8LrjTyslBX/qszrIXk+8lKnOgZkfXWPPfYAAIwePbp6jnMI9U+9trx29s933323eo5eIY6BHFOB7L5pP+Z4SFkXZSuHaB6J5mSOaXxeATKPGOdRnWOoS4xYYsQJkHm/2Hd1rqB8tvYziT05xhhjjDHGmFJRCk9O6l1QLwpXpWqF4+p+r732AtDcYszVK2OJdfXLFSctVxpnnFpR1HpOTwWtBHq+nnJzgNobJ9IqwFW+WttT+WjcOS0BlEmRN3SLPDm0YKpnhjqo7aYeRJvZpXkCqsOpx0GtL9RvWrM0D4rtUe9OHkQeVsqM16ZWNV6fWsAoK+qG5sOlG6+qDChH/rZ6JNh/eT/4F8gsykUoSRtZhaNcQ+qCeq3pWaaM1PpNq1qUO8I+TBmph4O/HXltIq9Q0fpu5HmgXmiuF3PnOD/QswNkMtZ+Sqi31FWVCWXM31arMF+rjlLGPJd3X462ZGB/1WvhNdAKrvpDneR3qRWcnhtag997773qOeprtK1A3vNE+gwSWch53TovRvlefM1xLJpXojLabAP1TvOZOLbRsq7WduZ96fNJ3l5XknqudesAjk0a2ZB6cPS5j3MMox40B5PXG+WVpHllXxUtkXrL8vaQpeMekMlC9Y7emr333hsAsO+++1bPMdqJY6COneyXzHF95plnqucWL14MAFi2bBmA5vM29
VM9jlvDq2NPjjHGGGOMMaZUeJFjjDHGGGOMKRWlClerVdaYoQcA0NDQ0OxYFHZG966WQqX7kb8ThcXQPaquYrpF1fXO8Jt6C1cjUdgaXeeUi4Yo0J3L61b3bq3SoHmHIaTU2hVew+zoxtbiFHRjM7FYXbN0j/O7VLeY5Exd1hAFhjLwr4ZJRGWD8yQK/eS91v5CGWhYgLrH9fNAdp0MB9GESRY0oHte9ahWeeWiyAyoXXhAQ1kYIqRJ3ml57SicjLqqsqG8+XnqoL6PYxdD+/TYVyWF50GtwgOUXVTqmPLUMDJeSzp26WuWMtdzaehyVEJaj3FMqRUm3J7UCj3VUCKGYVGemhzO91NvNISF4VScK3UMiMJ/igLlEsmC8yDHb92yglsMaMgV38fPaShvrXC1dB7SuYdbWyxatKjZe/X9Ou+yH+eRNB+F56ZbDgCZTmmIKcOpOH/qOMRnOuqbhouzX/HZUUN++dtRiCnvs4at8t7kPYekz8UqO4aRcssUADjggAMAAPvvvz+ArFADkF0752ndpiHVEd12hXMxn4G1P3NcjLZp+DrHOXtyjDHGGGOMMaWiFJ6c1IqiHgQmU0XlaGkJ0JUkrR9c7WsyMi1sXLWrxZjfyZW9Whfo3Xnttdeqx5hgqSvieiJacVMuUdlQJpfSsqIresqVstD7UUSrHWHbeE2auMkNw9QqS/2kzKLN7GgFUtnRmsXEcvVU0koTbdJVNKJNEZkEql4wWti0rGpqFVMLJvscrXiUE9Byo0u1OtGSxPumfZFtzdtynpImkap1jtetukOLJK9DrbupHmpfS8dStc5RTrTOqbWUMlVPbeqNyIvUk6PWV3oBde6gF0L1iVBHOT9o8jx1ml6byMvDcyqnaCPV1FOU93hYK7E+KurBv3qO+sMxkt4bPRZtsUCdjzYWzLsMcloeOtp8nLLQAhb06uim0exr0cbQ7Kv8q+MgxwJ+Tj2VlDkLDkRziP5OUcpvp57DqOy76hZ1kWOOepn5/LV8+XIAzZ9B+F30/EdROlFJ70gXiyI7tpO6qN5U6p0WF+Br6p96WF9++WUA2caoOl9zzKReq3yo+5RnFGmytSn+k5ExxhhjjDHGbAFe5BhjjDHGGGNKRSnC1ehupWtMQ8UYRqbuYA1JAJoXBKBbjq5NrSdPFzrdf/qdhGFxUeJ4lMxWr0QuWcqF16vJynQbM5RD98lJ9z/IOySjFlGIBEMBVCYMgdL7TFcy36cuX0K90XAC/mZa6EC/i22JdrLOO0wobSPQMjRRQ3qisAC6uSPXO8NARowYAaB5MiXHAuqWhhQydCtKiizKPhFAHIKQ7kkFZGEGGkbLUA5em94DJpGy/2mIDb+XoW86ZjJEi2FqGpKZ7vYN5Nufo2ISDH2J9nXRMYtyZPEFvc40aV53SOfYxrFOdS4N59NwNd4bvUfUQ/aXvGSZhvpp2AnDpLRPpnqj76esGKamIeHUG46beo/YhrRoib5WvdvaxRpam1jOENlorGMfVP1J9w+hHgFZP+Z1qnzS0Hx9BuGcEY3Fkd4VYdwDWu4/pAUvouILJCpow/GefTcKz2Vf1xA4/nY03kWhznkWC1GdZJ9j/9SiKtQRTePg8/OKFSsAAAsXLqyee+qppwBkMlT5cL9Jjp2qd3wf55ZoX6etjT05xhhjjDHGmFJRt54cXQWmCe+6yy29LWqho+eHFqQXXniheu75558HkO3eqom6tA5ECYH8flr/dDXLFbImsKYlceuNyDrB62RSm+6mS0sHZa6eHFozi+zBIXrdqWVOLYvpzvRf9h2E+kC90YTytEylfj7dVV2thLQ85W2V4+9r0n96TD1QUclLehZYanXUqFHVc9yNmceicsdMmNT+TOspk8bV4l6EZPnI0pUmk+qYwrFHPTmUJRPlNUGer9k32X+BzCvEZFL1GLEEOq3KamlmX47udd6k1mC1gnOs0qRwWsYpl6icLvVEPQgcB+j51+RnWoHZT1Xv+f06D
vJ13qX0Uy+YemY4ZmlxCnpyKDu1qLO4AD2oKlfO4dTvKOKB+qr9lbqofSYqZPN1oveC9ycqoc57zutV2fFz6s2iTvD9OmalHln1sI4bNw5ApssqO8qKfVXvR61SvnmTekMiL2fklaIeRGXiOUbp/MKIH0YD6OcY6UO9Va8b75HqItuVR/ltjfKgnvH5QfsnCyzoMxrHqcWLFwMAHnvsseo5enc4/6hXiDrI79S5gnrG+6H3KpLP1tA7e3KMMcYYY4wxpaJuPTm6YuVKlRZc3WyLljldtXPVzbwbem+AzKvD1bpaxlNrglowaSml9U43A6M1VS2Has2pJ1LLit4H5j/QAqpWSnpuUiuwvq8o1qPWQktEZFGixVOPUVZRCVtaQ2kBjTZ0pM6oXClHWgu/yhuR5gW1B9FvpV47jdWlXNTyNHr0aABZmct99tmneo5xxXy/6iStoJSFluelpZdW1yJ6H4A4J4fji+oJPdiak8gxinqiHlTqLa3mKm9aNDl+au5IaknXc5E3Ng+dS39bX7Nvag5SVH6blkm+T63ztF5SBqq/tAJrCXTC8SDyeEWbXRZlE9DW5DOp/qS5ODpXUgf5XboRJj/H71RPOOVP6zAt6/q+KGeyPTaj5e/yHmqOBvsedUTHJ16THuP7eX3qfaXe1Mrppb5qZAFlzucaeiCA7BknD89DROQho1x1rOGcp14pju/szzoWUmf5fKhecPZ19lnNxWbJ6VdeeQVAtv0HkHk/8t5+IM1dAlr2Vc2jUa89oceKUQ/aZ/n8zD7O+RgAxo4dCyCbM/QeUQep03nIyZ4cY4wxxhhjTKnwIscYY4wxxhhTKuo2XE3DA+iGY8KtFh5gIqO6sVn2c9myZQCAF198sXqOrki6QqNwI4YjqHs3LRMZhRxErsQ8Qzm+DjQMcOTIkQAymatbl+7faJf0erp2bWva7uj+algBwwgYaqQuY4ZaslBGVPKc36nuebrLGY6g4VjUyWgH6TwSmWv9lvZnJi5qAQGWhaabXMtEUz68NnWJ013OMARNtGS4AnVY3fNFKmeuekV9Yts1UZ7jnpbypSyoA5oUyhANXj9LgQLAmDFjAGThWxq+kSZyR2Vd9VhUeCIPKIMotCMKIU2Lhmh4G3WO16Tn0v6toV38zkgmUeGBooyNtcLVKAtNRua1pwnvQCZ3hrfofE19ozxV1/gdTNLX+5MWYAGysJn2KMCS3k8NFWMfjMLVouIIDO1hmJqO6fwdfpeGIDFMnGOchoQzTJzhalG5/KIQhaulxTyAbB7UwgzcpoPPghzjgOZzBtB8/qX8uYXISy+9VD3HFAaGcWnIL++fhpDnWSQkmuujYiHUH30O47Wwb7MYA5DNEQwtZYiavmZoIMP6gJbbNGj/dLiaMcYYY4wxxrSBuvXkqBWHlkuu3jV5lKtSTbTjivzVV18F0LwkYLoxZZS0HVn9aHliu9RCx9VyvXovInidKmt6H3hOS6cyeY/yLYKFvC1ESeDUAy02QSuRWtqop7R8aqIuLfLUYT1HHaZVNNqUkBYu1bvU+pqeB/KzGqdWdfXkRJYnto0WSJa0BJqXXwXiUqj8fi2vTH1NZQjEnpy8dDby5FCXolKeaqGkdY5Wc9VRyolFL/bYY4/qOZblpu5pkjfbQOuefmeUxJx6KPLyHqaFY/R+0zqrHiteO69XrZDpOKYlZyl/3jcdM9IEap0TakUB5D1fpGOJ3nN6DrScMftuVGCBHlp6cNSTw+/l57SQSuqF1vtHr4fehzRaYmvC+5OWkgZabugaeXLUE8Dr0gRuknqktZQ+X7NfcrNVIIteqRVJERXpyFvvUnmqTOjZU08On0f4fp1/qWfsz+rBoueGXghG+QDZmMDf0zbUKhaSN6lORt5F1VPKqqGhodnngCwCgM8lWtyL/Zl6rfMx52mOrxpl0V7zgT05xhhjjDHGmFJRt54c9aIwDjPdgAzIVouRJycqZ5xu5KRWF1pRaPnnf089AAASB
0lEQVTk7wKZZYWWpGgzsGjDqHqD8qAMNHeEMqB1QK3tLE9Yr+WiiXoc6L3jdatXix4DLWHJ19GmjbSGUHf1d9K4dvXk0OpHPdeytdRF1eFU/u3pnYgshfyr7aClTC1CjJWmDDQXLM2xUGtuOjZoTgote+nmhEAm8yjWur1IZQRk4x7/qpcuah9zd+iZoUdHoUxUH+khiizx/J1auS3tYT3fUtJytCz7D2SbP2v/oSWcXonIqxd5S1MPr5aqZX/lb2uuStHi+xVeJ+WjOUgcs7RvUQZsv/ZXWoOZL6H9lV5VyiUqy8/f1u+krHXcTNveHkTjKu9n1LbUOwVkfS3yaNPTxbL5++23X/UcowH4bKFeSXpyOK7lnR+3pVA+Ucl11RG+L900GWi5mbbm1vBZhTLTst2p16PIUShRCXV6oHW8o/dLPbKUlT6zEMoz/Qtk8uT3M2oHyPQuynFvrzHNnhxjjDHGGGNMqfAixxhjjDHGGFMq6jZcTV24DAdg4pS6vxk2pqEoDE9hmFoUHkA3sobF8fvpzmPJRiALN4p2eKZrUBPxo6TCoqLufso2DTkAMvc63cAs7ABkbvK8wy7aSqQPDFNjEp6G7rHUouoIQ6b4V0skpzt8a4JeGjKkLvs0EVhLJDO0RL+LruuohO3WujeUXRTWxGP622yjJrvT3U2d0u/ia/Y9DQ/iPeHvqSueIZeUmYa5crzQ5Ob23sk6Sjrn/WK7tGgKdUDHGepA1GbKizqtoVocNzl2MbRXf5O/own5/FwUlpB3SdU0lE77BcNUNBSU8uTnolBQ6ozqHPsyw6l0PkqLNeg53g8NBUnLdedFGjoZFd2JStTymCaA8/2Uv+oww5qpRxoSzsIGkR5FCf95hBVF4WppWWkdc6Oy0ul4puW6GZLGsr1aLIT3hGOklvKlXDkPF6V/tpaoQE3aB4FMRzim632gnrH0tIZDc67hXBsVwuHfqH/mHZ4b6R31jWOa9jOOQ/rsy2I2HOdUR9JCDlrwhs/RHEOZDgJkz4Lsz9HzhgsPGGOMMcYYY8wWULeeHLUk0dIRJVpzBakWWVqQos0SuYqNLFC0ytN6optL0eIUFTrgClotB2xDPVhP1IJJqwktSmppS62/TDoDWibqqsyLLANC640m6tEjQ28Bk0GBLKldy6rW8jimibpqYUnLmWsyJeVP2as1lRYr9SpS7mkp261Ban1TPaJVLPLy8DrVqq5FO/S79fspT70mypr9X+WaJvbq/WBbo2Th9iLytvE62Le073CcUR3ltfFa9R6knkiVDa+bCblLliypnmOSPq3DmtDK9mlhlTwTdfX+cZ6gBy+aJ9R7SNJN9fQ76NHW76KVNNId3g/eh2jzUf1c3hbi1hBZ2SlrjnmqD+yfHJ8ij21UlprfFW0syM99VXnf9iLyvqbl7IH4nqfjkvZn9lUWElHPPa3mLH+skRSUMcdRvR9p+4pEWthDi01w7tNyxpyLKbPIW0O9U08uvRHsxzrHcl5Iy6IDLedtbXMe8owKXrBvcLwGMs+MFljgmJZuggxkMuAzjvY9ypXzgnqMUs9hrc3Utxb25BhjjDHGGGNKRd16ciLLb7QyTGODgWyVz5Wrfi4tkaw5J9xsa8yYMQCyvAsgW+lytawbca1cuRJA81WzlpguEpGFXGOCaT1hTKZaIhmjz+vV0txpudlaFsoiWpSiTfAoA+Z5aI4NLWzUI32tukhoWePfKBeEn1PrHa01UftojVIrIfVOvTtbm7T0K5D1F/UskGjzMsog3VgSaNnH9Tspj0gW/I7Uy6jvK4IuqieHVjnmyNAqCWQyrVXOWOPXOX7xulXevO5042Qgs9TRg6NWPepjVHI6D1QW1AX2W+2HvIYoFy4q95/qjH5XavmtlScSlVUvIqmHWeUUbUScWoX1nG7eCDSPlmCEQBQpwDGLHgu10vOYjpt5blegv5mWAY/OqZ6mnjHmSgBZCXjONep9fe211wBknhx9BmFfpU7r5
yJPThHGPaCll17HL+qI5r1ybqQHQfOS6JXmtatXKP09navSeUXvFb+rKJEp+tvpBqrRNg1aRptzRBTZwOdgfr9GVnAsoHw1J7RWHmt7RfXYk2OMMcYYY4wpFV7kGGOMMcYYY0pF3YarqestDTXQcwxR0LAzJk/RTaauN7rqmFCqIWkjR44EkO0MriEKdJ0z+Yp/gSx8S0OE1F1cBKLdyyMXMcMHeEzDEJiAlpYBBVomWEalUYviIo9I3eZA5u6mLDSsgPqj4RbUxShcje5jhiNpqF+6W7C2QUvXftl3akhTWmpya8o8TdjWEIC0zK6GAFCnNMGTryPXO+VJWWspb77m/dBQNn4Xxw3V1yjsKi/91GtNS3FqcYaoiAOvl7vRa2hAtBM2YYgW+7KGIPBzUUnc9tCrLSEqCMBQHw375LXoGE3Z8lxUeIChb9rPqdvUd71HadhlFMpWxLAhto19RnWGc5/qyK677gogmys1NIhypx5FhW3Yp7WAD0vUMnSS4TFAFgqeZ8n3LyP9/Sh8U/sQdZZ9Vp9dGKLF0CJNJn/xxRcBZGFrKjvKuta4lrecSK2iMjrfcUzXfsx7zlA9fQ7j8wllp7/DvpoWfdD3R+W+ixxiyvtJmURFCTTUmP2QstCiH2l5e32mYJ+jLhatD9qTY4wxxhhjjCkVdevJUasrLRZMaFTPDJPCR48eXT1GKxw9OpEnh9YBWgv0c1yVaqm8l19+GUDt8o26ws0zGTeC1gm1qrHggCaG8jXlpNdE6xCtfGotSpP21IKZJj4WJYkvQtuWWl7V0ks5qYWXMuP1qrcm1WG1lKbWe20DLTKUp/YLWmn0HvF8e3oSo41UabFNPYP6Pm0jryEqeU0PGa2cTM4FMk8OraJqgWLSJf9GpZBVh/MiSiaNdCEqdZx6dyIvIq818hjRs6FyL3LJWUK56DXxNa3C1Bcg0z/VOc4L6ZYDQKZz9N7q5ng8x/erd4hypMw1gZ8eo2ijxryh7NheLR7A5G6W1AeyebOhoQFAc28EvTu8Ti3sQJnTS/Pss89Wzy1atAgA8MILLwDIIiSArF8XpXR5RK17qfMuxyp6HLkdAdAysV49OSwSwrlEn2solyJ7CyNST44W8uHziXpWqJ9Rmei0kJIWVKL8o+ICabGkqHCE6lpR5FnLgxh573idlBP1EGj+DAg0L1jAeZOe3KIV1bInxxhjjDHGGFMq6taTo7GELKdK70lUypfWIyCzKnGVr6t9Wt+iErJcqTLekxYlAFi6dCkA4KWXXgLQfCNMWvKKaKFLV+9q6aWlQ8sS06JCq4B6I2g54l+1HNAywr/1VkKaeqHWMeoDY9K1pCljhyOLEGP09f3phrFq/aXeRDkXPMf3R5ufaenyWptzfd2k+T9qCef9j0q10yKslrbUMh/lR9ArpLlRfB9loNZfxmuzP6t1iuNLETw5SmqBU10gWiab/ZmyVG9aWg5UreC0xtWSQ/TbRdnst9ZGqux/2kZ6HiLd4XdEnum0XDSQyZEWdfUeUg+pazqe0MtTlPLbCttBvdC+Qk+O3mfqDccg5ugAmdeMctIxi3M451P+BVp6KqJy0UWRV0SU90o90jmWuRAcEzU3gjKmDJinBGRy5JysFvV68L5GpJuB6tgWPaNRnpSZ5oHyfeyzOhbyGZB6q9+ZRktEUShF1ruIVK5AJit6bXQs5PxBndK+x/EtzfsCam8C317Yk2OMMcYYY4wpFV7kGGOMMcYYY0pF3YaraSIsQ32WLFkCIE6uVXciE07pltOk53QnWA1vYXEBhqmxZCOQJUrSPa9hCEV2pacudHVf0p2rx3gNdFdGpaDp+o3ORcl79eBC5z3UMDLqBmWnYY9MCNWSl2m4moaRMZE3SpikzNMESG1XWoJav0vb3J4J9byvaRu1ndpPCPvjzjvvXD3G8AOWotUy2tRPykllkIaWPvfcc9VzaQKzFh5gKE096Cap1Yc5JqruMPSAxzRcja95z2r15SKXU
VU9p17wfmvfZMijlqOlzjEBV5PDCXVbdY5zB3Vv+fLl1XM8xnAj7ZtpgZEiwrZpuArDxbX/sIzxP//5TwDNQ64Y+hKVQU/D+VQ+HBOj+bTI/TSdY1WPOJ6p3jHcnsUsdKxLx00N9UvDhqLw+CLLKYL3mNeic0hUEIR9mqHLKjvCfqYlzxn2lxYuADKZU+frNQxQx2nOESofjoEMU9NiAwwNjAqmpOGRUbhtNFe0V2izPTnGGGOMMcaYUlG3nhy1UnBFzvLNmgxPK5MWCWBZWVqXNJmNliN+Tq1wfE1LoJbRrLWaLfIqP7XwRGVk1dpOCxutRVHyMa9dLSVpcvNXlTMsGpHVkRZFWjB14zFaRdSrSFlF5Z5poeKx1sonTerXfkFZq4W+Pb2K6WZkqkfsc1FCaVSWk99BC53KlTJjf2SCMpB5X5kcTUs60LpE3aLzVZaxtGBDVIKb1633gHoSWS/TohWRV7YofVl1iNZHelmj61XPAZPlaVnX5HB+L9+v2wlwHKDO6aaV1NGoNHe0aV9R0TamYxeQWcJ57TpPpHNGrZK8kSyKoluthX2Q+qYJ79Qp9eTwNT1ekYc1LdsLtIyuqBdPV0qU9M+5QwtesM9pWWl6H+jRiQoPsO/p8xuf7eiB1HmCXtd0c2CgPjw5UcELzp8qH+pi5LnmNacFGvRY9OybRp9Ec9PWxp4cY4wxxhhjTKnwIscYY4wxxhhTKuo2XE3dg3STMWlRQw4YOvDwww9Xj9ENzKQrdePR1dYat1xUD7zIbssItpvXoq5GhiGoS5zhRVESfOp+VLcuvysKV6sH0tArIHOhM0xACwnUSspujZu2lh7VKtoQfS7vIg/UKQ3NSY9pGAJDBrjDOZCFITBRV/ss+yPDzvS7qLscEzRhmp9rjz2Dvm5q7b6tOpom0uo9YF9mWILqZZroq4mmHBOjAiNF69cqH449lIWO39QTDU9OE3A1STfdJ0zHSI4D/B0N00znjnoNKYqIxpmi6UN7Ec2LDFdTPeJ4xr9AFs4WhflShznGaWh+rcIM9QplwOvU5xOOQwwnA7L+y1SEKMSU/ZJFooAs9C16huQYGD0jFbnPps8g0R5DGpKmYZRAHL7NYzqPpIWFWjsftJfs7MkxxhhjjDHGlIoOlQIuRYtckrQ9acutsew+x7JrO+0pu8gjSCtTlKxc63eiZOVaicxbY+jb0u/8OnWOMlK5UZZR8mn0fpIWtIjKgn6dSbd59Fe9bnq1tPw2ZacWUML2RvJJd0Hf2kVWPNa1nfaQXVpwgIndQFZIheWiAWDw4MEAslL6qpP05NDToNED9GjQq6hFclIr+9cxDuahd/r51ox3kUct8jLW6rOk3vpsre1B6E1UXUy3VNGiBOk2A+rloax4TKMl6IWk1y0qYLOlurilsrMnxxhjjDHGGFMq7MkpMLbQtR3Lru1Ydm0nT09OPVNknav1O0WYPossu6LTnhb1aAPGKCeHeSS0pKunIi0FrznDtKBHVnNa4Ovd+1oW2lN20eciL3Wao9kaD7aSevj1deTxbi8Poj05xhhjjDHGmFLhRY4xxhhjjDGmVDhcrcDYHdx2LLu2Y9m1HYertQ3rXNux7NpO3snz0bE0YTw6V6uEfJRYvzXKSVvv2o5l13YcrmaMMcYYY4z5RlNIT44xxhhjjDHGtBV7cowxxhhjjDGlwoscY4wxxhhjTKnwIscYY4wxxhhTKrzIMcYYY4wxxpQKL3KMMcYYY4wxpcKLHGOMMcYYY0yp8CLHGGOMMcYYUyq8yDHGGGOMMcaUCi9yjDHGGGOMMaXCixxjjDHGGGNMqfAixxhjjDHGGFMqvMgxxhhjjDHGlAovcowxxhhjjDGlwoscY4wxxhhjTKnwIscYY4wxxhhTKrzIMcYYY4wxxpQKL3KMMcYYY4wxpcKLHGOMMcYYY0yp8CLHGGOMMcYYUyq8yDHGGGOMMcaUCi9yjDHGGGOMMaXCixxjjDHGGGNMqfAixxhjj
DHGGFMqvMgxxhhjjDHGlAovcowxxhhjjDGlwoscY4wxxhhjTKnwIscYY4wxxhhTKrzIMcYYY4wxxpQKL3KMMcYYY4wxpcKLHGOMMcYYY0yp8CLHGGOMMcYYUyq8yDHGGGOMMcaUCi9yjDHGGGOMMaXi/wF5m/0aE3+CBgAAAABJRU5ErkJggg==\n", + "text/plain": [ + "
    " + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Average of all images in testing dataset.\n", + "Digit 0 : 980 images.\n", + "Digit 1 : 1135 images.\n", + "Digit 2 : 1032 images.\n", + "Digit 3 : 1010 images.\n", + "Digit 4 : 982 images.\n", + "Digit 5 : 892 images.\n", + "Digit 6 : 958 images.\n", + "Digit 7 : 1028 images.\n", + "Digit 8 : 974 images.\n", + "Digit 9 : 1009 images.\n" + ] + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAzkAAACBCAYAAADjY3ScAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAAPYQAAD2EBqD+naQAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4wLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvqOYd8AAAIABJREFUeJztnXmMVtX9xh8WWRxk3zoIuLCJiLghmp+ItUhFSaXGNtoaW7VWRCum7mIFtZU0IdE20diosYpSo6ZttLUkrahRBBVBAcENUVzKKssMLqVzf3/o895nzhxehikz977X55OQebn3Xc793u85557vdlolSZLAGGOMMcYYYwpC66wbYIwxxhhjjDF7Ey9yjDHGGGOMMYXCixxjjDHGGGNMofAixxhjjDHGGFMovMgxxhhjjDHGFAovcowxxhhjjDGFwoscY4wxxhhjTKHwIscYY4wxxhhTKLzIMcYYY4wxxhQKL3KEmpoaTJs2DdXV1ejQoQNGjRqFP/3pT1k3K/ds374dV199NU455RT06tULrVq1wowZM7JuVkXw9NNP4/zzz8ewYcNQVVWFfv364Xvf+x4WL16cddNyz9KlS3HaaadhwIAB6NixI7p3747jjjsOc+bMybppFcc999yDVq1aoVOnTlk3Jdc888wzaNWqVfTfwoULs25eRfD8889j4sSJ6NatGzp27IjBgwfjlltuybpZueYnP/nJLvXOuleeJUuW4IwzzkB1dTX23XdfDBs2DDfffDN27NiRddNyz0svvYQJEyZgv/32Q6dOnXDSSSfhhRdeyLpZe0TbrBuQJ77//e/j5ZdfxqxZszBkyBA8/PDDOPvss1FXV4dzzjkn6+bllk2bNuEPf/gDDj/8cJxxxhm45557sm5SxXDXXXdh06ZNuPzyyzF8+HBs2LABs2fPxpgxYzBv3jx8+9vfzrqJuWXLli3o378/zj77bPTr1w+1tbV46KGHcO6552LNmjWYPn161k2sCD766CNceeWVqK6uxtatW7NuTkXwm9/8BieddFK9YyNGjMioNZXDww8/jHPPPRc/+MEP8MADD6BTp05499138fHHH2fdtFxz44034uKLL25wfNKkSWjfvj2OOeaYDFqVf9544w0cf/zxGDp0KG6//Xb07NkTzz33HG6++WYsXrwYf/3rX7NuYm55+eWXMXbsWIwePRoPPvggkiTBb3/7W5x88smYP38+jjvuuKyb2DgSkyRJkvztb39LACQPP/xwvePjx49Pqqurk507d2bUsvxTV1eX1NXVJUmSJBs2bEgAJDfddFO2jaoQ1q1b1+DY9u3bkz59+iQnn3xyBi2qfI499tikf//+WTejYjj99NOTSZMmJeedd15SVVWVdXNyzfz58xMAyaOPPpp1UyqODz/8MKmqqkqmTJmSdVMKwTPPPJMASKZPn551U3LLDTfckABI3nnnnXrHL7roogR
Asnnz5oxaln8mTJiQ9OnTJ6mtrS0d27ZtW9KzZ8/k+OOPz7Ble4bD1b7mz3/+Mzp16oSzzjqr3vGf/vSn+Pjjj7Fo0aKMWpZ/6DI3e07v3r0bHOvUqROGDx+OtWvXZtCiyqdnz55o29ZO6sYwZ84cPPvss7jzzjuzboopOPfccw9qa2txzTXXZN2UQnDvvfeiVatWOP/887NuSm7ZZ599AABdunSpd7xr165o3bo12rVrl0WzKoIXXngB48aNw7777ls6tt9++2Hs2LFYsGABPvnkkwxb13i8yPma5cuX45BDDmnwcDRy5MjSeWNagq1bt+LVV1/FoYcemnVTKoK6ujrs3LkTGzZswJ133ol58+b5QaoRrF+/HtOmTcOsWbOw//77Z92cimLq1Klo27YtOnfujAkTJuD555/Pukm557nnnkP37t2xatUqjBo1Cm3btkXv3r1x8cUXY9u2bVk3r6LYunUrHnvsMZx88sk48MADs25ObjnvvPPQtWtXTJkyBatXr8b27dvx5JNP4u6778bUqVNRVVWVdRNzy5dffon27ds3OM5jy5Yta+kmNQkvcr5m06ZN6N69e4PjPLZp06aWbpL5hjJ16lTU1tbihhtuyLopFcEll1yCffbZB71798YVV1yB3/3ud/j5z3+edbNyzyWXXIKhQ4diypQpWTelYujSpQsuv/xy3H333Zg/fz7uuOMOrF27FuPGjcO8efOybl6u+eijj7Bjxw6cddZZ+OEPf4h//vOfuOqqq/DAAw9g4sSJSJIk6yZWDHPnzsVnn32GCy64IOum5JoDDjgAL774IpYvX46DDz4YnTt3xqRJk3DeeefhjjvuyLp5uWb48OFYuHAh6urqSsd27txZimqqlGdix3QI5UKuHI5lWoIbb7wRDz30EH7/+9/jqKOOyro5FcH111+PCy+8EOvXr8cTTzyBSy+9FLW1tbjyyiuzblpuefzxx/HEE09gyZIlHtv2gCOOOAJHHHFE6f8nnHACJk+ejMMOOwxXX301JkyYkGHr8k1dXR0+//xz3HTTTbj22msBAOPGjUO7du0wbdo0/Otf/8J3vvOdjFtZGdx7773o0aMHJk+enHVTcs2aNWswadIk9OnTB4899hh69eqFRYsW4dZbb0VNTQ3uvfferJuYWy677DJccMEFuPTSS3HDDTegrq4OM2fOxPvvvw8AaN26MnwkldHKFqBHjx7RlenmzZsBIOrlMWZvMnPmTNx666349a9/jUsvvTTr5lQMAwYMwNFHH42JEyfirrvuwkUXXYTrrrsOGzZsyLppuaSmpgZTp07FZZddhurqamzZsgVbtmzBl19+CeCrqnW1tbUZt7Jy6Nq1K04//XS8/vrr+Oyzz7JuTm7p0aMHADRYCJ566qkAgFdffbXF21SJvP7663jllVfw4x//OBpOZFKuvfZabNu2DfPmzcOZZ56JsWPH4qqrrsLtt9+O++67D88++2zWTcwt559/PmbNmoUHH3wQ+++/PwYMGIA33nijZDzs169fxi1sHF7kfM1hhx2GlStXYufOnfWOM+7Q5UFNczJz5kzMmDEDM2bMwPXXX591cyqa0aNHY+fOnVi9enXWTcklGzduxLp16zB79mx069at9G/u3Lmora1Ft27d8KMf/SjrZlYUDLWyV2zXML81hLKrFMtw1tD7cOGFF2bckvyzdOlSDB8+vEHuDUtuO9e6PNdccw02btyIZcuWYc2aNViwYAE+/fRTVFVVVUykiUeVr5k8eTJqamrw+OOP1zv+xz/+EdXV1Tj22GMzapkpOrfccgtmzJiB6dOn46abbsq6ORXP/Pnz0bp1axx00EFZNyWX9O3bF/Pnz2/wb8KECejQoQPmz5+PW2+9NetmVgyffvopnnzySYwaNQodOnTIujm55cwzzwQAPPXUU/WO//3vfwcAjBkzpsXbVGl88cUXmDNnDkaPHm3DayOorq7GihUrUFNTU+/4iy++CAAuuNII2rd
vjxEjRmDgwIH44IMP8Mgjj+BnP/sZOnbsmHXTGoVzcr7m1FNPxfjx4zFlyhRs27YNgwYNwty5c/GPf/wDc+bMQZs2bbJuYq556qmnUFtbi+3btwP4ahOuxx57DAAwceLEemUITcrs2bPxq1/9Ct/97ndx2mmnNdi52hP/rrnooovQuXNnjB49Gn369MHGjRvx6KOP4pFHHsFVV12FXr16Zd3EXNKhQweMGzeuwfH7778fbdq0iZ4zX3HOOeeUwiN79uyJt99+G7Nnz8a6detw//33Z928XHPKKadg0qRJuPnmm1FXV4cxY8bglVdewcyZM3H66afj//7v/7JuYu75y1/+gs2bN9uL00imTZuGM844A+PHj8cVV1yBnj17YuHChbjtttswfPjwUqikacjy5cvx+OOP4+ijj0b79u3x2muvYdasWRg8eDBuueWWrJvXeDLepydXbN++PfnFL36R9O3bN2nXrl0ycuTIZO7cuVk3qyIYOHBgAiD677333su6ebnlxBNP3KXc3D3Lc9999yUnnHBC0rNnz6Rt27ZJ165dkxNPPDF58MEHs25aReLNQHfPbbfdlowaNSrp0qVL0qZNm6RXr17J5MmTk5deeinrplUEO3bsSK655pqkf//+Sdu2bZMBAwYk1113XfL5559n3bSKYPz48UlVVVWybdu2rJtSMTz99NPJKaeckvTt2zfp2LFjMmTIkOSXv/xlsnHjxqyblmvefPPNZOzYsUn37t2Tdu3aJYMGDUqmT5+e1NTUZN20PaJVkrhuozHGGGOMMaY4OCfHGGOMMcYYUyi8yDHGGGOMMcYUCi9yjDHGGGOMMYXCixxjjDHGGGNMofAixxhjjDHGGFMovMgxxhhjjDHGFAovcowxxhhjjDGFom3WDYjRqlWrrJuQC5qyhZFl9xWWXdOx7JrOnsrOcvsK61zTseyajmXXdCy7pmPZNZ09lV0uFznGGGOMqUz4QBb+jaEPLXV1dbt8v/ctN8bsKQ5XM8YYY4wxxhQKL3KMMcYYY4wxhcLhasb8jzC0IhZOoWEXrVu3rvd3T0My/vvf/zZ4j0M4jDFZwvGsbdv0cWKfffYBAOy7774AgPbt25fO8bUeIzt37gQAfPHFF/X+AsCXX35Z7xj/D6RjI8PdjDEGsCfHGGOMMcYYUzDsyUF5azv/6vtoPVerEV/zr1rYy1n6i0hjPBtFlIVaMvm6Q4cOpWP77bcfAKCqqgoA0KlTp9I5fR+QWjQBoKampt7fHTt2lM7xNa2b+rlKlXEsWTn0fmm/5HWGf3d1rDHnvuk0ppIP5abvLaIsY7IIj+n/90R2jdXVvKH9j2OdemY4tnXp0gUA0LVr19K5bt261XtPx44dS+f+85//AAC2bNkCoP5Y9+mnn9Y7t23bttK5mHcnNhcbU64/V1IfNI3DnhxjjDHGGGNMofjGeHLatGlTek2LEy3qtCwBQK9evQAA3bt3BwB07ty5wXfRgrR58+bSsU2bNgFIrU20ugOpdYlxw0Dlxg7T4sGYa/VA0DJHeapngx4GyuDzzz8vnePr8G/sc0D+rCyUBfUJSC2Yffv2LR3r378/AGD//fev938A6NOnD4DU4qn6Qd169913AQArV64snVu9ejUA4N///jeA1MoJpHLMo66FHpl27dqVzjGOnzIE0v7Ys2dPAKlXTD/L6/zss89K57Zu3Qog7auUJQBs374dQCon9YJVihV4d16D0CtWzruq1nmOl7HvD71qsXFNj8VyyfJAuWsDUhmwf6unguMez/EvUH/cA+JefeoaPRdA6o2ora0tHQt1M2sZxryslJPKh2Mh+y3nVT3GsY79HUhlwM+vX7++dI4yoHdH5cw5tpyH1xSTmMef/VG9hNQp/lX9oY5Qj3QOCfPDtM/ytc6xRdG3xpZ9zzv25BhjjDHGGGMKhRc5xhhjjDHGmEJRyHC1mCtdXeJ0lx9wwAEAgCFDhpTOHXLIIfXOaSgbYUjQBx98UDr
GEKK3334bAPDee++VzjFUhuExQGW50su5gzVsKAzD0nCjMBRDw6oYUqShRIRu4zy6gykLusSpV0Aqi0GDBpWOUbd4jDoGpOEclKdeI8MzPvzww3rfDQCLFy8GACxfvhxAGr4GpGFCmoybpexiIaPsl5qYzBC/wYMHl44NGzYMAHDwwQcDSMP7gDSklHpKfQKATz75BADwzjvvAABWrVpVOvfWW28BAD766CMAaagpEA9hy4pYUnusMArDL/QYQ/koew214LWFeqyv+Z0aihmGvmloR1gkA0hlyd/Ouv/GwvPCEGYgHb/Yr7V/hwn1Or/wuzhG6tgVhsNo8jznCeosAHz88ccA0rFR71+WcozpXbnQZYaZ6mstvEI4ZvGv6hbnDsqw0kpIl0t4j/XxxhTpaWpBlaz74J5SLqRWxy3qG+eQgQMHls4ddNBBAIBvfetbAOrPOfwuzh1r1qwpneOz3Nq1awGkoeH6ftVTjquxkOc8y52ypSxihZR0Die8JvZBHaPCfplF/7QnxxhjjDHGGFMoCuXJ4WpfV6C0sGniI63Bo0aNAgCMGDGidI5eHVrL1doUJjarJT5MiNZEVFqMdWMzWjfzvLIvR1iAAEhlRVn36NGjdI7XTmu5JiZTnuWsBHmRU8xKTh2LFbDo3bt36RgtwdQRtd5RPjEdpozpvaBHCEi9g7FiGJSreiNU7i1FrLgAdYVeGPVO0Wtz2GGHlY4deOCBAFIZqG7R+k5LsuoKvYphYQcg1TfKJyanWJn4libmmY5Zz+lB0GN8zffrNWqBD6D+/aGO8v6oV5aWU1rSVefohdDfCeWbdV+mPDVRPqaHHN+pQwMGDCid43jPz6kHiP2Vv6MeB/bTmPeaFuLYJpnsy+ohy8LLGLOox4rQhB4cnX/ZB/ldqoecHyiLDRs2lM7xNWWgn8ubJycW/RArtR0WsIgdCwtZAPF5kddOvVD5hN4vfRbh6zx6HMoVqOEcol59ev9HjhwJADj00ENL59ifOQ+rzCkX9k8tGERd5jwf83BoPw4911nMubsjjMgB0uvj3KqFkfjM3K9fPwD1xzv2R3q6NMKJURKUj0YzUebN3WftyTHGGGOMMcYUikJ4crjap5UzZlGiJRhILcT05DBWE0itAvwOXWVyZc7fU2syrc8xixKtcLqKVUtKUaCVhdZNtZozr4R/1dJFy1NsQ8u8lZ+NeXJomVMLHS0k2m7GlDPOXmN7eZ3UYc11ojU0VvqS5yhrLXlO60ljNidsTmJev1B2aqHj+7W/MB+Jll59f5hPobILz1VXV5fO8XUsJ4fWKbXCZ0XMKhx6EYGGmywCqSyoMxo7Tvmyv+nv8Ds4xqklnr9JfVaLcSx2O2v9I+E8obH87Ec6TzB3jtZhje8Pc3Fi5cdjcuW8wvsQ89iq/vI174eWl25JwjwRbTflGCv5Tk+25uRQZmF5dyDtixwj161bVzrHPsl5YncW8pacM0Ld0vGJ8mH/VO8++5XKh+fZ97SP83vLXRt1S71gHD+ZX6I5Jxz3dGzIS/5c6HFQDwJzazSyIfTgqGeWcwHnRc3dDCNr9PmN94h9LxaRU27bi7x4cnRMjuUR01tz1FFHAQDGjBlTOjd06FAAqW5q/6ccucUF84QB4NVXXwWQRjPpMw+9Ziq75vDq2JNjjDHGGGOMKRRe5BhjjDHGGGMKRaHC1ehC05AdJkppKV+6MhmmpqEYhG5zLXVMN2UYlgWk4QR0+WmoDd2jMVddXlyZjSV056o7my5lhnJoOF/ozleXb5hIqiUI85ZIGiv1yTbG3NiakEg39vvvv7/L99M9rzrJEBrqaywUgm58PRdLWM2CWLhSGKKo/YwJjBqaExYViOkdQ001rIj9n/0zVjY4TBTX11nqX0xuYRKzhlyxv2kiLkOIKG8dl8KS2/p7YYgNQ0OAVF4M5dNQLfbdcmVEs6JcSBHDhTSckYnKPKbhWJQBQ3xUf6m3DM1VmVPWYQEC/Q49Rjnyfmu
fVhm3FJSdhp6GugKkYS2xQiFh+WxNVOZrzpU6BoSFFrQvx3SrMSWY9xZhQRANMaMsOC7ps0gYEqnvYyhRLAya6LXxeqkX+ryxYsUKAMCCBQvqvReIh1VlGWpVLjxXQ3HZL7UAFF/z2UxDkBmyx1A9DZPkfeM4oIUHwucabQPnJR1L+P4s+meMWHgur08LM4wdOxYAMHr0aADp3KnfsXHjRgD1n12og5SFjqGcyxl2quNkSz3v2ZNjjDHGGGOMKRT5MPU2gdhqP1YummU/WRgASL0tfJ+uJJnwyNU+V/9AamGjRU8tMrSysw1qTWYCoG4Qyt8Jy7hWCrHNnWhxilmUafWlrNW6SU9OrORx1omPIdqe0Buh5V1pudBkzvXr1wNIZabvp3xoUdJNAmmVYklHtciEG3hpcqFaOvOA9jPKhfLUe04LuFoiQ9S6SX3TgiOExygXLSQQehCzLrXdGMIiDmpVpOVXLXDUHcpUrz9MkI+VBaZlU/sy9Z2f1zGMr9XSlxeLZliONla0IebJ4Tnt+xzT6anVeYLnOMapN4L3gfLh/4F4EjPHAb4vK70s5wWjDmoSM+XIOVY9UBwbWW5cPTkcI3XcJKFnWuceyqUl546Y5zeWIB9GNmiRAcpJy/Xyfbxe7T8cs8ptl0F91XmC+sNnECaJa5t3dW0tTbny27E+G9sYmn2IngQAWLJkCYB0w2zte9RTfr/2M+r6nsok62cXyo5zpXpa6TkcN25c6dixxx4LINVXfV5988036x1TnaQO0zuk59gPws2lgXh0THOQr6cgY4wxxhhjjPkf8SLHGGOMMcYYUygqNlxNCQsOaJJsLLGP7jW6JJkIDgCrVq0CAKxcuRJAfZcdQwfoltcwBLp8GbYWS5DTMDq68TQsKa/E3K6xwgN09VL+uh8AwzliCbcM62ipHXCbQqx+PkMjYqEVdNnqdTLUIwxzA1JdZCik6gp/m3quMufnYkngWeyIHoPt1RCAcP8ZbSvlqWEUvHbKR0M+GFY0YsQIAGk4qr6P90H3jmDIKJMptT+zfVmGHPC3yyXiavET9jdNmmWfZIiQhkIx9IVhGxoKwvAC9mWVN8MI+TnV8dheEnnrz5Sd7kfDECENM2YYDPutJs1SZ9555x0A9fcd4TmOaxqWSpnxPmg/YP/QcA/Kke/POoySstBwUYa3aNgQ9YZzJWUCpLrIeVeLs/D6YnuCheOf6hjHjHL7lbQEsbGObaI+qB6xL+l1MnyPfUmT5zl/sk9pn+Wzx5FHHgmg/thAqFsqJx7TfpqXPhuGmMb0QXWR8Pp0nyWO95Shfo7jG+ddLTJC3QrD6vV3tB9n+RyjcwXlw5Bt7Z+HH344gPqFB9jnGJr2/PPPl84tW7YMQCpPHTu57yTHUJUd5yQ+D2s4PWnuOdaeHGOMMcYYY0yhKIQnh1bHsFQjAAwZMgRA/cQ+riZpMaHXBgCWLl1a7xiTI4HUMkTLk1qaabniX02C5qqXFi9tc0uWudybxKzMvD7uMqyWJFrdaNHT8o2hlTLPsijnRdFztI5pgi5lFfOwUF+oU2oNoYWE59SCSesgrelqZYr9Thb6Vs6iFfNA0cKmljb2bXpkaT3S1/TgaAI0Lee0GmuCOMvEx2SXF0smELfO0YIbS/ZWix3vM5Nt1etCCzH7n3qfadHkuKn6SBnSIq2eirx4HGKEydpqBaclV6MA+Jq6oMVSQi+uWnIpA8pFLfe0IvNcrNiF9s2wwEtWY2M52TGhWT2IYcEBHe85p/KYWucpc84dOo/y2ilz9bzyu7Sv8H3NpYux+8T7qUntvOf0IscKpLDgApC2m1ZzPUcd5O+pvnJeoGdbn09CL5KOA2yr6mKW4185/Y95yGKRFGHSPZDOo5yTVV8POeQQAOkYqkVY+MzC+6feoZh3h3LM2pPD51x6aPQZmBFO+kzK+fDFF18EACxatKh0TudNoL5nn6+pd+rlob7FtmmIbQfRHNiTY4wxxhhjjCkUFev
J0ZU2LZBcmcc2h1IrJa0YjKem9wYAXnvtNQCpl0dX6GG5W40lDq2iavkMN2wEGpYlzLP3QglX3xpjydK1tBio7GjNYgxybMO7SpEBKWdlisVmU2fDcqNAarmkDNUbqRvpAfUtyrQy0WIS2zwvVuo0bGdzErPYhFZrlQV1Sr0UtLQdf/zxANK4cyD11lKGmtMQlvFV+dDyyb+x+5clsRKbHDc4vqg1kh5U1RdagXnd6nWhfvA7NYeOlj6W4FfLdPhdGt+fB7kpMd2P5ZVQd2IbWsZKZfPa+VfnI473tNjrOXp8YiWhYxbgvMgzzIlQa20sF4z6yXFJc3I4V1L+6o0IdVg94ZxPYptr856q7MKNaZvTsh4+G8R0hfqgngD1yhPKjB4vfc7gtYSleYFUnjynusXv4HgQ2+Q8j95X3jNet47f1CP1dNETQ/3h3ACkXgveq5jnMZanzdfMu9PoHrZBn3VCfWvJPqxjDfsq9UE9/BzntN185mV5cX3O4HdQTpyPAeCII44AkMo6lo8Z5iICLZczbE+OMcYYY4wxplB4kWOMMcYYY4wpFBUbrqbJinS9MbRCw9WYFKWuWJYSZFm8t956q3SOrkm6mGMhLHShx8p/xly+YZgEkL/d6PcUtl/DAFnCkm5hLY9NmdO1nOcSs3tKLBwrLPULpCFZDFvQpD/q7siRIwEABx54YOkc9Zu6paEGTLilrNUdXK6QQxZhMHqfKZ/YvQ8LiQCpPFh4QHempzxjIQ2UAcNr9DsZUsPf01CulghxaSwachXuXq1y4LWpXjFMhTLScCyOoXw/S3ADaTEHfr+WSKZM2S7Vcb7WkMGWSjBtLLHw0jCMDEj7EsdtnXMYSsnv0GRyhr5Rr1Q+JAylCtuTN9hfOYZpuBpDWDS8lDJjX9TCA9Rhlq9laCSQhuvy+7X/sX/yuzT8m8TCKnVMbC7CcUJDcTjXsW06L/L+63zI8Z3XqeNSOO9qCNKwYcMApKFaGhbHUCQml2sb2Na89M9yxX10bOezxNq1a0vH+LzHcU51iyFW1E3VYcqD5ZPffvvt0jk+H/J3NHyQ90afBbMuEkJ4nRybdIxiG1UPGLpGuehzNPWN8zBLUAPAUUcdBSCdKxjuBqRhqpRZFgUaKvtJ2xhjjDHGGGMCCuHJoeWCVg1NKOPqVRPduTLnilOTzLgyjyXDh2U0NXGVViUeU+trrExunq125QiT5lXWXPnTgqAypyWYloNKvX4lTGDWJFCWWNUCFLQu0drEJFsgtThRhpo8TpkzyVGtorSUUNYxC7p6DcPS31lZm8J2xBLEtd20dNISqfoTS7QlvEe8N2rZo3Up5gVjn81DMQKVA3WN3gLdNJZ6pd5VJnXTM6OWX45H9OTohsn06vAcPbFA2vep41pOOFZSObRsZm3hZNt0fKJe6XYCbDf1S6+J8wTfE9scM3adYWK6WvDLWTazlhmJbaTKcU3HOqLJy4TWYCYqMwIASGUX29wzRGXHsVE3+4150JqbmJcwVlaaxDbP5fs4lmsECPWMfV03cxw6dCiA9LpVFnzG4dxRKZEUYUEH9eRw/FZPDr2JLNwT25Q95m1mEQsWo9LoHnrBONeqZy3PG5iT2Kbl1DH17jASgNEkek3s25TCre00AAASoklEQVShbrjNOYbzlHq6KDvqXWMjTfYm9uQYY4wxxhhjCkXFenK05CytGrRqqpWJqCWSXgWuMjXHoZyHgRaAWDwsV7M8p99DS7FaDmOlIyuBMJ5Vc0e42qflQDeQYtnFSi0XHcuroneA163el1hZVcab0wqiMa88x+9UKxMtI7SCaBwtLTKUp3o4qa+xEra8R3nxqKnViNeppUFpYac1TTea5TXzOtWzQPnTsqebmNGSzNh1HQdoMcyD91XvH1+H5ciBVAfUo8hy7uyvaoHjd9BTobrKsY3XX04OqnPsH2Gp8jzBa9GyxitWrABQf1ymNZf6pN4aXifnIZU5j8U8OtRtzgWxTRlVvnn
pn6EHX/sY5zw9Fm6QrB5Hjo3MQ9TcGnpkaFlXvaOMeR/092L5T1nqoN5z9QCG/9e+E36Wsla941zD/JtjjjmmdI5Wdo5dGqFCefKcjrflvOl5madjWzLE8ujC+VBlR1nzczqPco7l39hcENv0Oy/yIdqe0IOjOVp8NtO5hf2Qc6U+67BfsZ+p/vCecOzkczWQesjDCKnwO5oTe3KMMcYYY4wxhcKLHGOMMcYYY0yhqNhwNXVLM3SFIRnq/qZLTJOh6LqlazyWrB3blZ7hB0wYp8tYj/G3dTdmJgCquzB0q+YZdWlS7gxn0eR5ujfp6mUSH5C6KyvhemNQH2KhAwy7YMiZvtakWoan0R2ssqNu0eWroTSEbmG9HwwVYcKl6jLd87GEQ4bNtEQ4TKyQQOwYYXu1D1F/qH+xsJQwKRdIQ9I4RmjJ5TDMVROnKX+VXVahQ9pnqB9M5NSQFI49GobB8SvWdsqLyaca/sNQB4YMaugpxzOGWmlyOeWVRVjC7uD4RJ2LJR5rUjHD1diPNAyar8OEXCAN+wv7JpDKjOc08Zf3Ko/laCkzykLnRb7WY4Q6pX0yLImv5cnDAjXlSr5rYn1YahjItjRyLGyIx1TvwrDH2DF9nuFcw13mNfyUn2OIEMshA+mzB8f9rPWpsYShdDpfxIr7sO/xmI7f1C32ce3rfD6MPfdxnOSx2FYgeQlh07GWesfxWQs0cPxRGTDcntcbK+VNmWtoI/son6e1P7OPct7KoqiKPTnGGGOMMcaYQlEITw6tYbTwxErPakJpmPSvyX/hBnea4ExL/JgxYwAARx55ZOkcLQhc8TLRHkhX0Gp5qgRPTlgyG2hoEdeNB2lFo9VXE9B4LpbkmGcZEOqUJhjTgkGPjHptWJCBid9AqiO0ZOp3UT7UTbW8h5ZS1UkWvKCFRL+T1n5aWIBU7mEybHMQFmtQC1h4TM/xc9pPw1Kd5TZe1c/Rik6LXswrGUss57k8JC9r8jWTYGmtVTnQG6EFMKgzvG6VM2XDcU1/hzJZvXo1AGDp0qWlc/TQ0jqsnqOYhzDL0tF6v9k3aBnXcY16pYUA9LV+HkjHPY4Beo5WUv6OzlWUK9+vFmO2J49jY2M8SnotvGb1YoXfReu5FgXiWMUxUsdPzjnUN7VC83OqixwHsvbklPNk8l7re9hHYzJkmXdubKkeMspg1apVAOpvaEnPdFieGmi5Ur7/C5SJerXogdcCPoySYP9S3eLzF/VOZc75hTqsMg83ZdUyyBxf8lIgRO9huDm2PpNyrNfoIo5bsdLrHJvoSdR+SfnweU89RuFmqVl4vOzJMcYYY4wxxhSKivXkqLUrLIerK/RYKUFa4WJWJq7M+R5dsXKDvNGjRwOoHw/L76flU+NhuemorporoYQ0V+9qpaTliBZMjSlnLg4tympVCy3JajUOyaNFibLQzcWoP/TQ0MoBpB4W9XSFZU7VIsRrpl7E5MM8AM39oWWY7dI8Fr03hB6NMD8BaL7cCcpO20MZ8FislGrMk8M+rvIJ46jVW0G5UPb6O7z22HXnqQRyzJMT62O0oKmcw42LVR9VX8PfoQyZl6Kb49FTS8uxxr3Tepj1Jqqxku/0gLL/6JzAa1DvAPsn9UPlE/bPWJw+0esP74daTWPx/XkhHJ9UTrENLTkm0sut8mGOAGXOsRJI51R+XktPU7c4x6rFONRJbU/e5Bnz8sQ899QRlQEjBCgznUOYC0Gvq+ZGcG7m+2M5c3nJK1FCL72OX8wL1vmQuTj0WKn3gnpD3dUcO3qIOG5otARfU7diuaV5lB3vK/tBrG/oMco4lidGHaQ3Vcc/esiob5pPHM7bKpuWmmPtyTHGGGOMMcYUCi9yjDHGGGOMMYWiYsPV1L1L9yPd4Oo2p0tSQzO09DNQf3dbhnrQFaohaUz2i4U70C26fPlyAMAbb7xROvfee+81+J1y4VpZEtvlVl23dFvymLofGY7HUBq9D2G4mrp885K0V47QbQ6
koVBMktVSlmF5YiANb6PeaMgA3esMP4qV5SXqZg9dyxp6xc8xMRBo6CJuTtd6WLhC+wuvgeEHWr6Yn9NQDIbEUE6xXcPZZ1k2Wl/Tza56x+8MkyOBbMvPhsQSZHlMZUTd0WukflC+2tcoe4YsaN/n+1jgQHUolJd+Z17KRceKprC/MtRHyxPzPmuxgTCsSvsOw1uoV5oATrmyT+pYz9+JhSnlSedC2E72P92SgWFjmgBOuXDe1TAjyjVWLIXjAscKDadhQv3rr78OAFi5cmXpHOdfHTfzMq+EIU276yPUG8qMoXtAqrN8j4ZjhQUHNGwoLDiQl8Igu4Oy43OZzrGcT1W32N+pnwxRA+qH7+l3A+nYEIYK6muOq7FtD/II72csVCzsz0AqOz7j8LkGSOXP5z59tuNzH+cKHUPLjWnhdi27et//SmXcLWOMMcYYY4xpJBXryVHrNhOfaNVQjwlX+1rel1Y4emlo0QXS1Swt8GrtC70XakV5+eWXAQBLliwBUD9Rl+3TNufNahJLNqOFTa0nlAHlpAmo3DiQK3m1FtE6ECuTGkveq0Q04TssMQukVkpawFUfKDPqilowaW2JlVumjHkf1DITeiqA1PLfkp7EmBeMesR+pmWPKUfVEVqOYomz1E9a6NWTM3z4cADp/aCOAml50VBv9ffyYg0mYRJ8zMujcqMVkmOeWij5WeqhjpuUL71DsUTlvGxU2VgoM45x9PwBDTeoBBr2Kb1OWn7pwVFPDmXN74r1ZcpV556YZywvhLqipXnpQVDLL/sboyY0OZzjQWyDR/ZvWt1feeWV0rmFCxcCSOdYRkgAqYzVG5sXvQzbof2T167zLsdGjme60SzPxTxqfOZgJEU53cpjojyJyYdzh45fsaI17OOxeYIy5nON6mvYZ8uhY265ojV5k+vuPIjhhr9arjuMQolFmtBzqNEF5eTZUnKyJ8cYY4wxxhhTKCrOkxOL12csIDepUwsdV6CaW0OrEnNsynlY1OJN69v7778PAFi8eHHpHK1LLB2tli6uevNooQutabGN67RsMs9TLrEYaFpA9XppHQiteJVCWI4RSC3ftKapJZyWRZUdrWmUj24OS32hd1Ctv9T1WMljfidzJjQOmxY9PUbrHi1cLWFt4m+oxTYsya2l2ll2VnPBaEGiNS62QS37unqF6PWiLNT6y7w55hTo/aOlNC85JiSM649tihrb8DS2MSWhTmifDL2NsZLdu/p/HqB8Ynlv4SafQOqJUZ2jjlHGam3na3pnVR/5m/QQal9mP2ff17ZkuWHe7mA7OBbpmLJixQoA9cdGXhfn5kGDBpXOUdb8Tv0ueiOYd8McV6DhVgzqqYh5NvNCmHsQ65+aZ0n58DlGdZLfwfFe80xYRptz8u7KROeVcmWGY9uD6HsoT50DCOcVjoU6NxPO5Sq70Cu0u41U8yLj2MbrJPTaAKmHi7qoW6xQByl/nSupb3w+1nEgD/3RnhxjjDHGGGNMofAixxhjjDHGGFMoKi5cLdx5GUhDURh+oiEZfL+6GFmEgAnLWsIyLLv7wQcflM7Rlc4dhZlwCaQhbHSla7J3nkuDhi50dV/ytSa681r0+ghlx3ujMt/V71YKvG7dYZ6hJ5SdXi9DVDScgPKke1fDNBjaQr3T32GYTcw1zhCXsMQy0LC0OpC63lsyTI1yKVcUQWXHhNKBAweWjjH5ln1WS05TrnSN6/UyPO21114DkBYIAYBly5YBSENq1AUfCx3KO7GyyZSlFsUg1AXeFw2j4f1gX44VqihXSrW5y4I2Fg2ZZZ/i/eaYDaThLVrynaV7mUSv8wRlTLmo7nDOYN/XeYLlfRki2dhyq1nDNsXGf/YVlQGv85lnngEQT/Lmd6kM+JrjoJ7j/JLnsL4Y4RyrYY+UhYYGMUyNY50+z7CvcozT4keUFeVUKWWiyxEWWtHUgrDEO5DKlikJQ4cOLZ1jn6VcNNyRYwK/U0vmh7qoxS3yFgYYK+gU+z+
f6VS3OL5xvNPS3Hwf5wOVOcfVWIh3GFKYxXOfPTnGGGOMMcaYQlFxnhyiq0WuJFl4QK1M9KzwHJAWIWCJRl3N8rNc2TPZUb+DCX6aOE4rFle6sdVsHglX2NpWWnPV4kEPBVfyaiGmHGOlU/kdtMhUmpUpLKqgx0ILMZAmzJaz/qpcw3K16vXg71Cn1IIelvNVi3u5RMmWSAjk/QwLLihsT6wstlrM6HVgn9Xylrw+WjVpRQZS7y6t6Wq9Z/+NbQaaxyIhSrm+op4cypXXoxa4UAf0O0OLnepVY2STl76s18g+xX4aK12s3ojBgwcDSK3CatkkMY8/5wkWodG5J9zmIG9Jursj9Ojoa9UtXh89VirrcsnkoWU85q3Ji241lnKbIjP5XXWLid/q8SGcM/hX59hwM/Ryc2ylyDDctFKvl/04tsUF5akeROog5aTbCXDO4HyhW4DweY/zfMyTUwnENonWZ1/OqbHtBkIPbqwwA+USK1KTZVlte3KMMcYYY4wxhcKLHGOMMcYYY0yhqNhwNYWuM7oyNfGYYQQLFiwoHWPScrkdc+l607Chcon1eUtAayxsd2y3bbohNfkzTDbWMCO612PFIcJa85USzkdiBSzCsDMNX1y9ejWAeCJgzHVbLkwjbEO5fQTyGN4R22MoDHFRHWOxgEWLFpWOMYSDfVdDOcIEeg1pYNhMmBwJNNT5Sgo9iMH7rH2YMqccNKSI4x73htD9OMLxTwthMGQhtqt4rDhGXggT3TWsgvJhvwUa7lOiOhfuG6M6x3GAhUViu4NXWvJ8Y9D2Uw/yHvbZXOi4zDkzFiLE8UzDmvk+jkfaZ8NnnVhhhnIh4ZVW8CeUAccqIF7wgs97LA6l++VwvOP4pUUbGJIW21uOY2Ce92IisTEklvRPHYsVmiI6rnMMowx07OR9KPdcHNO7lhrv7MkxxhhjjDHGFIpWSQ7NR81hbWjqd2Ypnqb8dqVZapqLSpBdXsrshrSE7ML3x5IiY8nK5SxCYREGoKElubkt53v6nXtT5/hd6l0Ny9fGPIuhpRlomGCucgw9X3tDjln0V/08vTTqrWHircqFUC60Wqp8QotvzGO7N6mEsS6vNJfsYp4cenDolQbS0tG9e/ducIxJ81pkhbpIzyo9D0DqmaAnUb08YVL43hgHs+6zRPsnX8fmkDAKRfss+3FLeWtaUnaxeYFjmxaOol6yaIN69vk+jo+xqIzY1hiMcgk92EDTIyj2VHb25BhjjDHGGGMKxTfGk1OJ2ELXdCy7pmPZNZ0sPTmVTJ51LvY7jcmFaynyLLu805Jea+Y8qLeQuTjqreEx/tWNj8OtAjTHkN4d5pxoLk9oNd8bngrrXdPJ2gtGr5Z6t0L9VM9PuOG2esH4vWF+t74vpndN1UF7cowxxhhjjDHfaLzIMcYYY4wxxhQKh6vlGLuDm45l13Qsu6bjcLWmYZ1rOpZd08laduW+K1ZsJSwWopQrwBK+Z2+QtewqGcuu6ThczRhjjDHGGPONJpeeHGOMMcYYY4xpKvbkGGOMMcYYYwqFFznGGGOMMcaYQuFFjjHGGGOMMaZQeJFjjDHGGGOMKRRe5BhjjDHGGGMKhRc5xhhjjDHGmELhRY4xxhhjjDGmUHiRY4wxxhhjjCkUXuQYY4wxxhhjCoUXOcYYY4wxxphC4UWOMcYYY4wxplB4kWOMMcYYY4wpFF7kGGOMMcYYYwqFFznGGGOMMcaYQuFFjjHGGGOMMaZQeJFjjDHGGGOMKRRe5BhjjDHGGGMKhRc5xhhjjDHGmELhRY4xxhhjjDGmUHiRY4wxxhhjjCkUXuQYY4wxxhhjCoUXOcYYY4wxxphC4UWOMcYYY4wxplB4kWOMMcYYY4wpFF7kGGOMMcYYYwqFFznGGGOMMcaYQuFFjjHGGGOMMaZQeJFjjDHGGGOMKRRe5BhjjDHGGGMKhRc5xhhjjDHGmELhRY4xxhhjjDGmUHiRY4wxxhhjjCkUXuQYY4wxxhh
jCsX/A72wAtv5JA/kAAAAAElFTkSuQmCC\n", + "text/plain": [ + "
    " + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "print(\"Average of all images in training dataset.\")\n", + "show_ave_MNIST(train_lbl, train_img)\n", + "\n", + "print(\"Average of all images in testing dataset.\")\n", + "show_ave_MNIST(test_lbl, test_img)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Testing\n", + "\n", + "Now, let us convert this raw data into `DataSet.examples` to run our algorithms defined in `learning.py`. Every image is represented by 784 numbers (28x28 pixels) and we append them with its label or class to make them work with our implementations in learning module." + ] + }, + { + "cell_type": "code", + "execution_count": 100, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "(60000, 784) (60000,)\n", + "(60000, 785)\n" + ] + } + ], + "source": [ + "print(train_img.shape, train_lbl.shape)\n", + "temp_train_lbl = train_lbl.reshape((60000,1))\n", + "training_examples = np.hstack((train_img, temp_train_lbl))\n", + "print(training_examples.shape)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now, we will initialize a DataSet with our training examples, so we can use it in our algorithms." + ] + }, + { + "cell_type": "code", + "execution_count": 101, + "metadata": {}, + "outputs": [], + "source": [ + "# takes ~10 seconds to execute this\n", + "MNIST_DataSet = DataSet(examples=training_examples, distance=manhattan_distance)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Moving forward we can use `MNIST_DataSet` to test our algorithms." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Plurality Learner\n", + "\n", + "The Plurality Learner always returns the class with the most training samples. In this case, `1`." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 102, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "1\n" + ] + } + ], + "source": [ + "pL = PluralityLearner(MNIST_DataSet)\n", + "print(pL(177))" + ] + }, + { + "cell_type": "code", + "execution_count": 103, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Actual class of test image: 8\n" + ] + }, + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 103, + "metadata": {}, + "output_type": "execute_result" + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAP8AAAD8CAYAAAC4nHJkAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4wLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvqOYd8AAADcpJREFUeJzt3V+oXfWZxvHnMW0vTHuhSUyCjZNOkSSDF3Y8yoA6OhTzZyjEhlQaZJIypSlaYSpzMTEKFYZjwmAy06vCKYYm0NoWco6GprYNMhgHiiYGqTYnbaVk2kxC/mChlghF887FWSnHePZvney99l47eb8fkP3n3Wuvlx2fs9bev7XWzxEhAPlc03YDANpB+IGkCD+QFOEHkiL8QFKEH0iK8ANJEX4gKcIPJPWRQa7MNocTAn0WEZ7N63ra8ttebftXtt+yvaWX9wIwWO722H7bcyT9WtJ9kk5IOiRpQ0QcLSzDlh/os0Fs+e+Q9FZE/DYi/izp+5LW9vB+AAaol/DfKOn30x6fqJ77ANubbR+2fbiHdQFoWC8/+M20a/Gh3fqIGJM0JrHbDwyTXrb8JyQtmfb4k5JO9tYOgEHpJfyHJN1s+1O2Pybpi5L2NdMWgH7rerc/It6z/Yikn0qaI2lXRPyysc4A9FXXQ31drYzv/EDfDeQgHwBXLsIPJEX4gaQIP5AU4QeSIvxAUoQfSIrwA0kRfiApwg8kRfiBpAg/kBThB5Ii/EBShB9IivADSRF+ICnCDyRF+IGkCD+QFOEHkiL8QFKEH0iK8ANJEX4gKcIPJEX4gaQIP5AU4QeS6nqKbkmyfVzSO5Lel/ReRIw00RSas2DBgmL9pZdeKtaXLVtWrNvlCWEnJyc71sbHx4vLbtu2rVg/f/58sY6ynsJf+YeIONfA+wAYIHb7gaR6DX9I+pnt12xvbqIhAIPR627/nRFx0vYNkg7YPhYRB6e/oPqjwB8GYMj0tOWPiJPV7RlJE5LumOE1YxExwo+BwHDpOvy259r+xMX7klZKerOpxgD0Vy+7/QslTVRDPR+R9L2I+EkjXQHoO0fE4FZmD25liZTG8nfs2FFc9sEHHyzW6/7/qBvnLy1ft+zExESxvn79+mI9q4gof7AVhvqApAg/kBThB5Ii/EBShB9IivADSTHUdxVYvXp1x9r+/fuLy9YNt42OjhbrBw4cKNaXL1/esVY3zHjXXXcV64sWLSrWz549W6xfrRjqA1BE+IGkCD+QFOEHkiL8QFKEH0iK8ANJMc5/FTh9+nTH2rx584rLPvfcc8X6xo0bi/VeLp+9atWqYr3uGIWHH364WB8bG7vsnq4GjPMDKCL8QFKEH0iK8ANJEX4gKc
IPJEX4gaSamKUXfbZ5c3m2s9Klu+uO42jz8tfnzpUnd6671gB6w5YfSIrwA0kRfiApwg8kRfiBpAg/kBThB5KqHee3vUvS5ySdiYhbqueul/QDSUslHZf0QET8oX9t5la69r1UHssfHx9vup3GrFixolgf5LUmMprNlv87ki6dFWKLpBcj4mZJL1aPAVxBasMfEQclvX3J02sl7a7u75Z0f8N9Aeizbr/zL4yIU5JU3d7QXEsABqHvx/bb3iypfHA6gIHrdst/2vZiSapuz3R6YUSMRcRIRIx0uS4AfdBt+PdJ2lTd3yTp+WbaATAoteG3/aykn0taZvuE7S9L2i7pPtu/kXRf9RjAFaT2O39EbOhQ+mzDvaCDu+++u1gvnfded13+fisdo7B169bisnXn8x88eLCrnjCFI/yApAg/kBThB5Ii/EBShB9IivADSXHp7iFQd8puXf3s2bMday+//HJXPc1WXW+HDh3qWLv22muLyx49erRYP3bsWLGOMrb8QFKEH0iK8ANJEX4gKcIPJEX4gaQIP5AU4/xDYM2aNcV63Xj4u+++22Q7l2V0dLRYL/Ved8ru9u1cJqKf2PIDSRF+ICnCDyRF+IGkCD+QFOEHkiL8QFKM8w+BuvPW66aqnjdvXsfazp07i8s+9NBDxfqePXuK9ZUrVxbrTLM9vNjyA0kRfiApwg8kRfiBpAg/kBThB5Ii/EBSrhuHtb1L0ucknYmIW6rnnpT0FUkXLxi/NSJ+XLsym0HfLrzwwgvF+qpVqzrWZvHvW6z3uvz4+HjH2rp163pa95w5c4r1rCKi/I9Smc2W/zuSVs/w/H9GxK3Vf7XBBzBcasMfEQclvT2AXgAMUC/f+R+x/Qvbu2xf11hHAAai2/B/S9KnJd0q6ZSkHZ1eaHuz7cO2D3e5LgB90FX4I+J0RLwfERckfVvSHYXXjkXESESMdNskgOZ1FX7bi6c9/LykN5tpB8Cg1J7Sa/tZSfdKmm/7hKRvSLrX9q2SQtJxSV/tY48A+qA2/BGxYYann+lDL+ig7tr4N910U8fasmXLelp33Vj7U089Vaxv27atY21ycrK47GOPPVasP/7448V63eeWHUf4AUkRfiApwg8kRfiBpAg/kBThB5KqPaW30ZVxSm9fPProox1rTz/9dHHZulNyR0bKB2YeOXKkWC+57bbbivVXX321p3Xffvvtl93T1aDJU3oBXIUIP5AU4QeSIvxAUoQfSIrwA0kRfiAppui+CmzZsqVjre44jomJiWL92LFjXfXUhLre58+f33X93LlzXfV0NWHLDyRF+IGkCD+QFOEHkiL8QFKEH0iK8ANJMc5/FViwYEHHWt1Y+fr165tupzF11xqoG6tnLL+MLT+QFOEHkiL8QFKEH0iK8ANJEX4gKcIPJFU7zm97iaQ9khZJuiBpLCK+aft6ST+QtFTScUkPRMQf+tdqXsuXLy/WS2P5g5yX4XKtWLGiWK/rvW6Kb5TNZsv/nqR/jYgVkv5O0tds/42kLZJejIibJb1YPQZwhagNf0Sciogj1f13JE1KulHSWkm7q5ftlnR/v5oE0LzL+s5ve6mkz0h6RdLCiDglTf2BkHRD080B6J9ZH9tv++OS9kr6ekT8se6462nLbZa0ubv2APTLrLb8tj+qqeB/NyLGq6dP215c1RdLOjPTshExFhEjEVGe8RHAQNWG31Ob+GckTUbEzmmlfZI2Vfc3SXq++fYA9MtsdvvvlPRPkt6w/Xr13FZJ2yX90PaXJf1O0hf60yLuueeeYv2aazr/Db9w4ULT7XzA3Llzi/U9e/Z0rK1bt6647JkzM+5M/sXGjRuLdZTVhj8i/kdSpy/4n222HQCDwhF+QFKEH0iK8ANJEX4gKcIPJEX4gaS4dPcVoO7U1tJYft2ydacL1xkdHS3W165d27F29OjR4rJr1qzpqifMDlt+ICnCDyRF+IGkCD+QFOEHkiL8QFKEH0jKg7y0s+3hvY70EKsbiz
948GDH2rx584rLlq4FINVfD6Bu+b1793asPfHEE8Vljx07VqxjZhExq2vsseUHkiL8QFKEH0iK8ANJEX4gKcIPJEX4gaQY578KrFq1qmNt//79xWXrpl2rO+d++/btxfrExETH2vnz54vLojuM8wMoIvxAUoQfSIrwA0kRfiApwg8kRfiBpGrH+W0vkbRH0iJJFySNRcQ3bT8p6SuSzlYv3RoRP655L8b5gT6b7Tj/bMK/WNLiiDhi+xOSXpN0v6QHJP0pIp6ebVOEH+i/2Ya/dsaeiDgl6VR1/x3bk5Ju7K09AG27rO/8tpdK+oykV6qnHrH9C9u7bF/XYZnNtg/bPtxTpwAaNetj+21/XNJLkkYjYtz2QknnJIWkf9fUV4N/rnkPdvuBPmvsO78k2f6opB9J+mlE7JyhvlTSjyLilpr3IfxAnzV2Yo+nTvt6RtLk9OBXPwRe9HlJb15ukwDaM5tf+++S9LKkNzQ11CdJWyVtkHSrpnb7j0v6avXjYOm92PIDfdbobn9TCD/Qf5zPD6CI8ANJEX4gKcIPJEX4gaQIP5AU4QeSIvxAUoQfSIrwA0kRfiApwg8kRfiBpAg/kFTtBTwbdk7S/057PL96bhgNa2/D2pdEb91qsre/mu0LB3o+/4dWbh+OiJHWGigY1t6GtS+J3rrVVm/s9gNJEX4gqbbDP9by+kuGtbdh7Uuit2610lur3/kBtKftLT+AlrQSfturbf/K9lu2t7TRQye2j9t+w/brbU8xVk2Ddsb2m9Oeu972Adu/qW5nnCatpd6etP1/1Wf3uu1/bKm3Jbb/2/ak7V/a/pfq+VY/u0JfrXxuA9/ttz1H0q8l3SfphKRDkjZExNGBNtKB7eOSRiKi9TFh238v6U+S9lycDcn2f0h6OyK2V384r4uIfxuS3p7UZc7c3KfeOs0s/SW1+Nk1OeN1E9rY8t8h6a2I+G1E/FnS9yWtbaGPoRcRByW9fcnTayXtru7v1tT/PAPXobehEBGnIuJIdf8dSRdnlm71syv01Yo2wn+jpN9Pe3xCwzXld0j6me3XbG9uu5kZLLw4M1J1e0PL/VyqdubmQbpkZumh+ey6mfG6aW2Ef6bZRIZpyOHOiPhbSWskfa3avcXsfEvSpzU1jdspSTvabKaaWXqvpK9HxB/b7GW6Gfpq5XNrI/wnJC2Z9viTkk620MeMIuJkdXtG0oSmvqYMk9MXJ0mtbs+03M9fRMTpiHg/Ii5I+rZa/OyqmaX3SvpuRIxXT7f+2c3UV1ufWxvhPyTpZtufsv0xSV+UtK+FPj7E9tzqhxjZnitppYZv9uF9kjZV9zdJer7FXj5gWGZu7jSztFr+7IZtxutWDvKphjL+S9IcSbsiYnTgTczA9l9ramsvTZ3x+L02e7P9rKR7NXXW12lJ35D0nKQfSrpJ0u8kfSEiBv7DW4fe7tVlztzcp946zSz9ilr87Jqc8bqRfjjCD8iJI/yApAg/kBThB5Ii/EBShB9IivADSRF+ICnCDyT1/zuzOYWa4hAXAAAAAElFTkSuQmCC\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + } + ], + "source": [ + "%matplotlib inline\n", + "\n", + "print(\"Actual class of test image:\", test_lbl[177])\n", + "plt.imshow(test_img[177].reshape((28,28)))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "It is obvious that this Learner is not very efficient. In fact, it will guess correctly in only 1135/10000 of the samples, roughly 10%. It is very fast though, so it might have its use as a quick first guess." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Naive-Bayes\n", + "\n", + "The Naive-Bayes classifier is an improvement over the Plurality Learner. It is much more accurate, but a lot slower." + ] + }, + { + "cell_type": "code", + "execution_count": 104, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "7\n" + ] + } + ], + "source": [ + "# takes ~45 Secs. to execute this\n", + "\n", + "nBD = NaiveBayesLearner(MNIST_DataSet, continuous = False)\n", + "print(nBD(test_img[0]))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To make sure that the output we got is correct, let's plot that image along with its label." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 105, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Actual class of test image: 7\n" + ] + }, + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 105, + "metadata": {}, + "output_type": "execute_result" + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAP8AAAD8CAYAAAC4nHJkAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4wLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvqOYd8AAADQNJREFUeJzt3W+MVfWdx/HPZylNjPQBWLHEgnQb3bgaAzoaE3AzamxYbYKN1NQHGzbZMH2AZps0ZA1PypMmjemfrU9IpikpJtSWhFbRGBeDGylRGwejBYpQICzMgkAzJgUT0yDfPphDO8W5v3u5/84dv+9XQube8z1/vrnhM+ecOefcnyNCAPL5h7obAFAPwg8kRfiBpAg/kBThB5Ii/EBShB9IivADSRF+IKnP9HNjtrmdEOixiHAr83W057e9wvZB24dtP9nJugD0l9u9t9/2LEmHJD0gaVzSW5Iei4jfF5Zhzw/0WD/2/HdJOhwRRyPiz5J+IWllB+sD0EedhP96SSemvB+vpv0d2yO2x2yPdbAtAF3WyR/8pju0+MRhfUSMShqVOOwHBkkne/5xSQunvP+ipJOdtQOgXzoJ/1uSbrT9JduflfQNSdu70xaAXmv7sD8iLth+XNL/SJolaVNE7O9aZwB6qu1LfW1tjHN+oOf6cpMPgJmL8ANJEX4gKcIPJEX4gaQIP5AU4QeSIvxAUoQfSIrwA0kRfiApwg8kRfiBpAg/kBThB5Ii/EBShB9IivADSRF+ICnCDyRF+IGkCD+QFOEHkiL8QFKEH0iK8ANJEX4gKcIPJEX4gaTaHqJbkmwfk3RO0seSLkTEUDeaAtB7HYW/cm9E/LEL6wHQRxz2A0l1Gv6QtMP2Htsj3WgIQH90eti/LCJO2p4v6RXb70XErqkzVL8U+MUADBhHRHdWZG+QdD4ivl+YpzsbA9BQRLiV+do+7Ld9te3PXXot6SuS9rW7PgD91clh/3WSfm370np+HhEvd6UrAD3XtcP+ljbGYT/Qcz0/7AcwsxF+ICnCDyRF+IGkCD+QFOEHkurGU30prFq1qmFtzZo1xWVPnjxZrH/00UfF+pYtW4r1999/v2Ht8OHDxWWRF3t+ICnCDyRF+IGkCD+QFOEHkiL8QFKEH0iKR3pbdPTo0Ya1xYsX96+RaZw7d65hbf/+/X3sZLCMj483rD311FPFZcfGxrrdTt/wSC+AIsIPJEX4gaQIP5AU4QeSIvxAUoQfSIrn+VtUemb/tttuKy574MCBYv3mm28u1m+//fZifXh4uGHt7rvvLi574sSJYn3hwoXFeicuXLhQrJ89e7ZYX7BgQdvbPn78eLE+k6/zt4o9P5AU4QeSIvxAUoQfSIrwA0kRfiApwg8k1fR5ftubJH1V0pmIuLWaNk/SLyUtlnRM0qMR8UHTjc3g5/kH2dy5cxvWlixZUlx2z549xfqdd97ZVk+taDZewaFDh4r1ZvdPzJs3r2Ft7dq1xWU3btxYrA+ybj7P/zNJKy6b9qSknRFxo6Sd1XsAM0jT8EfELkkTl01eKWlz9XqzpIe73BeAHmv3nP+6iDglSdXP+d1rCUA/9PzeftsjkkZ6vR0AV6bdPf9p2wskqfp5pt
GMETEaEUMRMdTmtgD0QLvh3y5pdfV6taTnu9MOgH5pGn7bz0p6Q9I/2R63/R+SvifpAdt/kPRA9R7ADML39mNgPfLII8X61q1bi/V9+/Y1rN17773FZScmLr/ANXPwvf0Aigg/kBThB5Ii/EBShB9IivADSXGpD7WZP7/8SMjevXs7Wn7VqlUNa9u2bSsuO5NxqQ9AEeEHkiL8QFKEH0iK8ANJEX4gKcIPJMUQ3ahNs6/Pvvbaa4v1Dz4of1v8wYMHr7inTNjzA0kRfiApwg8kRfiBpAg/kBThB5Ii/EBSPM+Pnlq2bFnD2quvvlpcdvbs2cX68PBwsb5r165i/dOK5/kBFBF+ICnCDyRF+IGkCD+QFOEHkiL8QFJNn+e3vUnSVyWdiYhbq2kbJK2RdLaabX1EvNSrJjFzPfjggw1rza7j79y5s1h/44032uoJk1rZ8/9M0opppv8oIpZU/wg+MMM0DX9E7JI00YdeAPRRJ+f8j9v+ne1Ntud2rSMAfdFu+DdK+rKkJZJOSfpBoxltj9gesz3W5rYA9EBb4Y+I0xHxcURclPQTSXcV5h2NiKGIGGq3SQDd11b4bS+Y8vZrkvZ1px0A/dLKpb5nJQ1L+rztcUnfkTRse4mkkHRM0jd72COAHuB5fnTkqquuKtZ3797dsHbLLbcUl73vvvuK9ddff71Yz4rn+QEUEX4gKcIPJEX4gaQIP5AU4QeSYohudGTdunXF+tKlSxvWXn755eKyXMrrLfb8QFKEH0iK8ANJEX4gKcIPJEX4gaQIP5AUj/Si6KGHHirWn3vuuWL9ww8/bFhbsWK6L4X+mzfffLNYx/R4pBdAEeEHkiL8QFKEH0iK8ANJEX4gKcIPJMXz/Mldc801xfrTTz9drM+aNatYf+mlxgM4cx2/Xuz5gaQIP5AU4QeSIvxAUoQfSIrwA0kRfiCpps/z214o6RlJX5B0UdJoRPzY9jxJv5S0WNIxSY9GxAdN1sXz/H3W7Dp8s2vtd9xxR7F+5MiRYr30zH6zZdGebj7Pf0HStyPiZkl3S1pr+58lPSlpZ0TcKGln9R7ADNE0/BFxKiLerl6fk3RA0vWSVkraXM22WdLDvWoSQPdd0Tm/7cWSlkr6raTrIuKUNPkLQtL8bjcHoHdavrff9hxJ2yR9KyL+ZLd0WiHbI5JG2msPQK+0tOe3PVuTwd8SEb+qJp+2vaCqL5B0ZrplI2I0IoYiYqgbDQPojqbh9+Qu/qeSDkTED6eUtktaXb1eLen57rcHoFdaudS3XNJvJO3V5KU+SVqvyfP+rZIWSTou6esRMdFkXVzq67ObbrqpWH/vvfc6Wv/KlSuL9RdeeKGj9ePKtXqpr+k5f0TsltRoZfdfSVMABgd3+AFJEX4gKcIPJEX4gaQIP5AU4QeS4qu7PwVuuOGGhrUdO3Z0tO5169YV6y+++GJH60d92PMDSRF+ICnCDyRF+IGkCD+QFOEHkiL8QFJc5/8UGBlp/C1pixYt6mjdr732WrHe7PsgMLjY8wNJEX4gKcIPJEX4gaQIP5AU4QeSIvxAUlznnwGWL19erD/xxBN96gSfJuz5gaQIP5AU4QeSIvxAUoQfSIrwA0kRfiCpptf5bS+U9IykL0i6KGk0In5se4OkNZLOVrOuj4iXetVoZvfcc0+xPmfOnLbXfeTIkWL9/Pnzba8bg62Vm3wuSPp2RLxt+3OS9th+par9KCK+37v2APRK0/BHxClJp6rX52wfkHR9rxsD0FtXdM5ve7GkpZJ+W0163PbvbG+yPbfBMiO2x2yPddQpgK5qOfy250jaJulbEfEnSRslfVnSEk0eGfxguuUiYjQihiJiqAv9AuiSlsJve7Ymg78lIn4lSRFxOiI+joiLkn4i6a7etQmg25qG37Yl/VTSgYj44ZTpC6bM9jVJ+7rfHoBeaeWv/csk/Zukvbbfqaatl/SY7SWSQtIxSd/sSYfoyLvvvlus33///cX6xMREN9vBAGnlr/
27JXmaEtf0gRmMO/yApAg/kBThB5Ii/EBShB9IivADSbmfQyzbZjxnoMciYrpL85/Anh9IivADSRF+ICnCDyRF+IGkCD+QFOEHkur3EN1/lPR/U95/vpo2iAa1t0HtS6K3dnWztxtanbGvN/l8YuP22KB+t9+g9jaofUn01q66euOwH0iK8ANJ1R3+0Zq3XzKovQ1qXxK9tauW3mo95wdQn7r3/ABqUkv4ba+wfdD2YdtP1tFDI7aP2d5r+526hxirhkE7Y3vflGnzbL9i+w/Vz2mHSauptw22/7/67N6x/WBNvS20/b+2D9jeb/s/q+m1fnaFvmr53Pp+2G97lqRDkh6QNC7pLUmPRcTv+9pIA7aPSRqKiNqvCdv+F0nnJT0TEbdW056SNBER36t+cc6NiP8akN42SDpf98jN1YAyC6aOLC3pYUn/rho/u0Jfj6qGz62OPf9dkg5HxNGI+LOkX0haWUMfAy8idkm6fNSMlZI2V683a/I/T9816G0gRMSpiHi7en1O0qWRpWv97Ap91aKO8F8v6cSU9+MarCG/Q9IO23tsj9TdzDSuq4ZNvzR8+vya+7lc05Gb++mykaUH5rNrZ8Trbqsj/NN9xdAgXXJYFhG3S/pXSWurw1u0pqWRm/tlmpGlB0K7I153Wx3hH5e0cMr7L0o6WUMf04qIk9XPM5J+rcEbffj0pUFSq59nau7nrwZp5ObpRpbWAHx2gzTidR3hf0vSjba/ZPuzkr4haXsNfXyC7aurP8TI9tWSvqLBG314u6TV1evVkp6vsZe/MygjNzcaWVo1f3aDNuJ1LTf5VJcy/lvSLEmbIuK7fW9iGrb/UZN7e2nyicef19mb7WclDWvyqa/Tkr4j6TlJWyUtknRc0tcjou9/eGvQ27AmD13/OnLzpXPsPve2XNJvJO2VdLGavF6T59e1fXaFvh5TDZ8bd/gBSXGHH5AU4QeSIvxAUoQfSIrwA0kRfiApwg8kRfiBpP4CIJjqosJxHysAAAAASUVORK5CYII=\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + } + ], + "source": [ + "%matplotlib inline\n", + "\n", + "print(\"Actual class of test image:\", test_lbl[0])\n", + "plt.imshow(test_img[0].reshape((28,28)))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### k-Nearest Neighbors\n", + "\n", + "We will now try to classify a random image from the dataset using the kNN classifier." + ] + }, + { + "cell_type": "code", + "execution_count": 106, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "5\n" + ] + } + ], + "source": [ + "# takes ~20 Secs. to execute this\n", + "kNN = NearestNeighborLearner(MNIST_DataSet, k=3)\n", + "print(kNN(test_img[211]))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To make sure that the output we got is correct, let's plot that image along with its label." + ] + }, + { + "cell_type": "code", + "execution_count": 107, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Actual class of test image: 5\n" + ] + }, + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 107, + "metadata": {}, + "output_type": "execute_result" + }, + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAP8AAAD8CAYAAAC4nHJkAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4wLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvqOYd8AAADdVJREFUeJzt3X+oVHUax/HPk7kFKWVUauqurcnSIlnLLQq3UCqtJdAtNixY3BDv/mFgEGFoP/wjQZZ+QyzdTUkhMyF/QZu7Kku1sElXkczMNsLUumhmpVcKU5/94x6Xm93znWnmzJy5Pu8XyJ05zzlzHgY/95y533Pma+4uAPGcVXYDAMpB+IGgCD8QFOEHgiL8QFCEHwiK8ANBEX4gKMIPBHV2M3dmZlxOCDSYu1s169V15DezW81sl5l9bGYP1fNaAJrLar2238wGSPpI0i2S9kl6V9Ld7v5BYhuO/ECDNePIf62kj939E3c/JmmFpKl1vB6AJqon/CMk7e31fF+27AfMrN3MOs2ss459AShYPX/w6+vU4ken9e7eIalD4rQfaCX1HPn3SRrV6/lISZ/X1w6AZqkn/O9KGmtml5nZzyRNl7SumLYANFrNp/3uftzM7pP0D0kDJC1x9x2FdQagoWoe6qtpZ3zmBxquKRf5AOi/CD8QFOEHgiL8QFCEHwiK8ANBEX4gKMIPBEX4gaAIPxAU4QeCIvxAUIQfCIrwA0ERfiAowg8ERfiBoAg/EBThB4Ii/EBQhB8IivADQRF+ICjCDwRF+IGgCD8QFOEHgiL8QFCEHwiq5im6JcnMdks6IumEpOPu3lZEUwAar67wZya5+8ECXgdAE3HaDwRVb/hd0j/NbIuZtRfREIDmqPe0f4K7f25ml0jaYGYfuvtbvVfIfinwiwFoMebuxbyQ2QJJ3e7+RGKdYnYGIJe7WzXr1Xzab2bnmdngU48lTZb0fq2vB6C56jntHypptZmdep3l7r6+kK4ANFxhp/1V7YzT/nDOP//83Np1112X3Pb111+va9/d3d25tVRfkrRr165kfcKECcn6l19+maw3UsNP+wH0b4QfCIrwA0ERfiAowg8ERfiBoIq4qw9nsLa29F3a7e3pK7fvvPPO3Fp2jUiunTt3JusLFy5M1kePHl3ztnv27EnWv//++2S9P+DIDwRF+IGgCD8QFOEHgiL8QFCEHwiK8ANBcUvvGW7gwIHJ+vz585P1WbNmJeuHDh1K1p977rnc2ubNm5Pb7tixI1mfNGlSsr548eLc2tdff53cduLEicn6V199layXiVt6ASQRfiAowg8ERfiBoAg/EBThB4Ii/EBQjPOfAaZMmZJbe/jhh5Pbjh8/PllfsWJFsv7ggw8m64MGDcqt3Xvvvcltb7755mT9hhtuSNY3btyYW5s7d25y223btiXrrYxxfgBJhB8IivADQRF+ICjCDwRF+IGgCD8QVMVxfjNbIul2SQfcfVy27EJJr0oaLWm3pLvcveINzozz12bBggXJeuqe/Erj1YsWLUrWDx48mKzfeOONyfrMmTNza6NGjUpuu3379mT9mWeeSdbXrFmTW6t0P39/VuQ4/0uSbj1t2UOSNrn7WEmbsucA+pGK4Xf3tySd/nUtUyUtzR4vlTSt4L4ANFitn/mHunuXJGU/LymuJQDN0PC5+sysXVJ6QjcATVfrkX+/mQ2XpOzngbwV3b3D3dvcPT3jI4CmqjX86yTNyB7PkLS2mHYANEvF8JvZK5L+I+lXZrbPzGZKWiTpFjP7r6RbsucA+hHu528Blcbx582bl6x3dnbm1lL3+kvSkSNHkvVKvT3yyCPJ+vLly3NrqfvtJWn16tXJ+uHDh5P1qLifH0AS4QeCIvxAUIQfCIrwA0ERfiAohvqaYMyYMcn622+/nayvXZu+hmrOnDm5tWPHjiW3rWTAgAHJ+rnnnpusf/vtt7m1kydP1tQT0hjqA5BE+IGgCD8QFOE
HgiL8QFCEHwiK8ANBNfxrvCCNHTs2WR86dGiyfvz48WS93rH8lBMnTiTrR48ebdi+0Vgc+YGgCD8QFOEHgiL8QFCEHwiK8ANBEX4gKMb5m6DSVNN79+5N1i+44IJk/ayz8n+Hc8888nDkB4Ii/EBQhB8IivADQRF+ICjCDwRF+IGgKo7zm9kSSbdLOuDu47JlCyTNkvRFtto8d/97o5rs7z777LNkvdJ1APfcc0+yPnjw4NzatGnTktsirmqO/C9JurWP5U+7+1XZP4IP9DMVw+/ub0k61IReADRRPZ/57zOz98xsiZkNKawjAE1Ra/j/KmmMpKskdUl6Mm9FM2s3s04z66xxXwAaoKbwu/t+dz/h7icl/U3StYl1O9y9zd3bam0SQPFqCr+ZDe/19PeS3i+mHQDNUs1Q3yuSJkq6yMz2SXpM0kQzu0qSS9ot6c8N7BFAA5i7N29nZs3bWT9y8cUXJ+urVq1K1q+//vrc2sKFC5Pbvvjii8l6pe8aQOtxd6tmPa7wA4Ii/EBQhB8IivADQRF+ICjCDwTFUF8/MGRI+taJN954I7d2zTXXJLetNNT3+OOPJ+sMBbYehvoAJBF+ICjCDwRF+IGgCD8QFOEHgiL8QFCM858BBg0alFubPn16ctsXXnghWf/mm2+S9cmTJyfrnZ18e1uzMc4PIInwA0ERfiAowg8ERfiBoAg/EBThB4JinP8MZ5Ye8h02bFiyvn79+mT9iiuuSNavvPLK3NqHH36Y3Ba1YZwfQBLhB4Ii/EBQhB8IivADQRF+ICjCDwR1dqUVzGyUpGWShkk6KanD3Z81swslvSpptKTdku5y968a1ypqUek6jq6urmR99uzZyfqbb76ZrKfu92ecv1zVHPmPS3rA3a+QdJ2k2Wb2a0kPSdrk7mMlbcqeA+gnKobf3bvcfWv2+IiknZJGSJoqaWm22lJJ0xrVJIDi/aTP/GY2WtLVkjZLGuruXVLPLwhJlxTdHIDGqfiZ/xQzGyTpNUn3u/vhSteM99quXVJ7be0BaJSqjvxmNlA9wX/Z3Vdli/eb2fCsPlzSgb62dfcOd29z97YiGgZQjIrht55D/GJJO939qV6ldZJmZI9nSFpbfHsAGqWa0/4Jkv4oabuZbcuWzZO0SNJKM5spaY+kPzSmRTTSyJEjk/VHH320rtdnCu/WVTH87v5vSXkf8G8qth0AzcIVfkBQhB8IivADQRF+ICjCDwRF+IGgqr68N7pLL700tzZ37tzktnPmzCm6naqdc845yfr8+fOT9ZtuSo/mrly5MlnfsGFDso7ycOQHgiL8QFCEHwiK8ANBEX4gKMIPBEX4gaCYortKl19+eW5t69atyW0nTZqUrG/ZsqWmnk4ZN25cbm3ZsmXJbcePH5+sVxrHnzVrVrLe3d2drKN4TNENIInwA0ERfiAowg8ERfiBoAg/EBThB4Lifv4qffrpp7m1559/PrntmjVrkvXvvvsuWX/nnXeS9dtuuy23Vul+/jvuuCNZ37hxY7J+9OjRZB2tiyM/EBThB4Ii/EBQhB8IivADQRF+ICjCDwRV8X5+MxslaZmkYZJOSupw92fNbIGkWZK+yFad5+5/r/Ba/fZ+/pSzz05fLlHpnvcpU6Yk6yNGjEjWU2PxmzZtqnlb9E/V3s9fzUU+xyU94O5bzWywpC1mdmomhqfd/YlamwRQnorhd/cuSV3Z4yNmtlNS+lAEoOX9pM/8ZjZa0tWSNmeL7jOz98xsiZkNydmm3cw6zayzrk4BFKrq8JvZIEmvSbrf3Q9L+qukMZKuUs+ZwZN9befuHe7e5u5tBfQLoCBVhd/MBqon+C+7+ypJcvf97n7C3U9K+pukaxvXJoCiVQy/mZmkxZJ2uvtTvZYP77Xa7yW9X3x7ABqlmqG+30p6W9J29Qz1SdI8SXer55TfJe2W9Ofsj4Op1zojh/qAVlLtUB/f2w+cYfjefgBJhB8IivADQRF+ICjCDwRF+IGgCD8QFOE
HgiL8QFCEHwiK8ANBEX4gKMIPBEX4gaCaPUX3QUm957q+KFvWilq1t1btS6K3WhXZ2y+qXbGp9/P/aOdmna363X6t2lur9iXRW63K6o3TfiAowg8EVXb4O0ref0qr9taqfUn0VqtSeiv1Mz+A8pR95AdQklLCb2a3mtkuM/vYzB4qo4c8ZrbbzLab2baypxjLpkE7YGbv91p2oZltMLP/Zj/7nCatpN4WmNln2Xu3zcx+V1Jvo8zsX2a208x2mNmcbHmp712ir1Let6af9pvZAEkfSbpF0j5J70q6290/aGojOcxst6Q2dy99TNjMbpTULWmZu4/Llv1F0iF3X5T94hzi7nNbpLcFkrrLnrk5m1BmeO+ZpSVNk/QnlfjeJfq6SyW8b2Uc+a+V9LG7f+LuxyStkDS1hD5anru/JenQaYunSlqaPV6qnv88TZfTW0tw9y5335o9PiLp1MzSpb53ib5KUUb4R0ja2+v5PrXWlN8u6Z9mtsXM2stupg9DT82MlP28pOR+Tldx5uZmOm1m6ZZ572qZ8bpoZYS/r9lEWmnIYYK7/0bSbZJmZ6e3qE5VMzc3Sx8zS7eEWme8LloZ4d8naVSv5yMlfV5CH31y98+znwckrVbrzT68/9QkqdnPAyX383+tNHNzXzNLqwXeu1aa8bqM8L8raayZXWZmP5M0XdK6Evr4ETM7L/tDjMzsPEmT1XqzD6+TNCN7PEPS2hJ7+YFWmbk5b2ZplfzetdqM16Vc5JMNZTwjaYCkJe6+sOlN9MHMfqmeo73Uc8fj8jJ7M7NXJE1Uz11f+yU9JmmNpJWSfi5pj6Q/uHvT//CW09tE/cSZmxvUW97M0ptV4ntX5IzXhfTDFX5ATFzhBwRF+IGgCD8QFOEHgiL8QFCEHwiK8ANBEX4gqP8B1flLsMvfVy4AAAAASUVORK5CYII=\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + } + ], + "source": [ + "%matplotlib inline\n", + "\n", + "print(\"Actual class of test image:\", test_lbl[211])\n", + "plt.imshow(test_img[211].reshape((28,28)))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Hurray! We've got it correct. Don't worry if our algorithm predicted a wrong class. With this techinique we have only ~97% accuracy on this dataset." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## MNIST FASHION\n", + "\n", + "Another dataset in the same format is [MNIST Fashion](https://github.com/zalandoresearch/fashion-mnist/blob/master/README.md). This dataset, instead of digits contains types of apparel (t-shirts, trousers and others). As with the Digits dataset, it is split into training and testing images, with labels from 0 to 9 for each of the ten types of apparel present in the dataset. The below table shows what each label means:\n", + "\n", + "| Label | Description |\n", + "| ----- | ----------- |\n", + "| 0 | T-shirt/top |\n", + "| 1 | Trouser |\n", + "| 2 | Pullover |\n", + "| 3 | Dress |\n", + "| 4 | Coat |\n", + "| 5 | Sandal |\n", + "| 6 | Shirt |\n", + "| 7 | Sneaker |\n", + "| 8 | Bag |\n", + "| 9 | Ankle boot |" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Since both the MNIST datasets follow the same format, the code we wrote for loading and visualizing the Digits dataset will work for Fashion too! The only difference is that we have to let the functions know which dataset we're using, with the `fashion` argument. 
Let's start by loading the training and testing images:" + ] + }, + { + "cell_type": "code", + "execution_count": 108, + "metadata": {}, + "outputs": [], + "source": [ + "train_img, train_lbl, test_img, test_lbl = load_MNIST(fashion=True)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Visualizing Data\n", + "\n", + "Let's visualize some random images for each class, both for the training and testing sections:" + ] + }, + { + "cell_type": "code", + "execution_count": 109, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "<base64-encoded PNG output omitted>
+/t35HRkYCAM6fP5+kXGxsrK3MiRMnrLxTp07ZjsuZM6eVl5iYCADo2bOn3+p8u9suW7ZsAID4+Hgr7ZdffgEArF+/HgCwZ88eK+/MmTPJniskJASAR/4otwDQrFkzAMCzzz4LwCO3N8PtbjuSKZNHR/P8888D8LTF7t27rbzg4OvD2MWLFwHY+2yhQoUAAGvWrAEATJs2ze/1lKS07W623XwdU7JkyQLA01aSS5cuAfD0W1n+2LFjN1U/XwkUmUuP3O62K126NAAgLi7OSlu8eLGtTObMma3fnK8pi3IeJQ8//DAAICwszEqbOnWqn2rs4Xa3XXpG2y71aNulntS0nTeCjL/P6Ad8edjeJv+8efNav2fOnAkAOHz4sJW2dOlSAEC5cuUAANmzZ7fyYmJiAAAdOnQAAMyfP9/K+/XXXwF4XkblSyxfwDhor1y50spbtWpVsvX31vyB0lFkPQ4cOADAM5HJiY9t/d577wEAXnvtNSuvZs2aAIBDhw4BACIiIqw8vnDxI8rfdfaVm227sWPHWr/50n6rWbhwofW7devWqTrHrWg7fuBQjqKjo628iRMnAgDy589vpTlf4osUKWLl8VjW4fjx41beH3/8AQAIDQ0FYP/Y3LRpEwCgf//+yd5PStviVn0EuX3MXLlyBQAQFRVlpfHDhi+qkj///BOA5yOoatWqVh7Hy61btwKwKz041vHFVt4D65BeFT7pkVvZdiVKlAAAvPPOO1ZaeHg4APuHDhWNmzdvTtH5qeB5+umnbecGPPP03r17AQAdO3ZM0bndULlLPdp2qUfbLvX4+5NF3eEURVEURVEURclQ6EeQoiiKoiiKoigZioBcE+QNpxsN4HGtGjduHACgbt26SfJy5cplpdFdZsGCBQCAKlWqWHnFihUD4PGF59oOeV6a/emiI89Bl6hevXpZee+++y4AoEKFClZaAHohJoGufbKt6RZD96LTp09beT169AAATJo0CQCwceNGK4/ucBcuXEhyHbo90XXH2/qY242bm88TTzwBwO4Cd+TIEQB2f3eni5U8F+Xam8n76tWrSdJ4Lq6VadWqlZXHtUe1atXyflO3Aec6PuleU7JkSQD29XV0weI6s48//tjKo9ywP/7zzz9W3j333APAXZZr164NABg4cCAAYNSoUam+n1sF5YNuZxK6C0n3Xrq10c1XrrNq3749AE87S5c3rrliu+fOndvKc5NtZ/3Sw/impBy6P8v5lC6n0nWNLtF0f963b5+Vt3PnTgCeOTNPnjxWXtmyZQEAc+bMAQDky5fPymNf5jkVRVFuFrUEKYqiKIqiKIqSoUh3liC3SHB9+/YF4NHs7tq1y8rjol8Z4eyRRx4BAPz1118A7AuwqflkYARq2GW5SpUqAQB+/vlnK4+aWS7YPnr0qJV38uRJAMDkyZOtNGkpClQYeUxqj6nppfZdRiNjdLdu3boBAMaPH2/lUXvM486dO5fkeo0bNwYAzJs3zx/VTxPcNNwffPABAE8EPMAjpzIAhC/acW+RDt0Ww/PZ8NxSS0rrm9SmygAhgQCjtxUoUMBK++GHHwDY2459j1YLGZCDssX/c+TIYeXROkTLpbSm0dpbo0aNJPUKVEsG68XgL9Ii9vjjjwOw9y2Of7RwS8v2XXfdBcAz5klLUKdOnQAAAwYMAGC34PK5DB06FIAnkAXgGRPlInk3C6aSPlm2bBkAoHv37lYan6+cJzjfMiCHjDzYrl07AEDWrFkB2OeQ1atXA/DImxyvaAWmTCvK/2PvrQNtqer+/7ePHRgg0t1cujukQzolLkrDF0UJAWlR6iKPIN0YdIiA0o2Xrktz6VRC7Pb3x/N7zXrPOusM+9x7Ys/dn9c/Z59Zs2fPrFkx84n3CoKJJTxBQRAEQRAEQRD0FK3zBIFbt4lX33PPPSXVLcjEtk8//fTVNiyXWLDcCo11+Nhjj5VUz2tBEptcC4+LZn0hJKTxSknJur/YYotV27BQd7NVC0
N4z9xryvxCmV4pqx+0uReTz9iHz09l0QSH4iacl49pbK5NJeoWFF7B6ytlLzseYfew47knAdv74pZbbpHUPO/amAeSo1F53BuNIvmjH/1IUlP9ps2fUeW5+/6cAwouifR+zt0C26LogS/4zfhjEc1a8rqrzHjNsT9XiRiveOQ90Z7+Ry1yxYPvYdFUV0h45ngCf6kqdYuf/OQnkqQrr7wybcMLTlGS2vzPOHNVhfnbPeR4w+kLt0lslnHmfc6Y5XvcfnkuoAbUytr7s6MsloBaI815Mfo2UFB9zuCZxPm60sQ10Se1QkNuixSCQRnxwgjlIrfeRn9yfv7uUha78uIGNUWXuYd/fT7GdkYC6o2X1UcNQ7X1JUiwqbIghJTnL1cGFpoPnwAAIABJREFUuQ+oWl6Qg/7k+K7mlu90bq/YFuPYz4G2WgEatvl4Hi1CCQqCIAiCIAiCYKCYEEpQiS8EVuZMSEPLITpt6lB5TP/134nncyLlBOEZcC8eHje8Mb6gFvu19X0vU7u/pa10en/bFMG2BXxrf8854GlxjyK456qfcoIc+h9PtC8SSiwznvepU6emNrzr/L3nsLWVs+8lPJa7XITZ+4E2V9C43yi3rgQyhimJijddymoP47Vmc7TVbG4s+pZFSQ866KC0Dc8h95uF/KTswUYdJIdCynPX+973vrQNW2O+975jwUsWRPT8kK222kpS9sh7vH5ZTtbh/vn3UIIX77gvMeD3vhugZmyxxRaSpKWWWiq1rbzyypLytXieGl5h9xzTt/SLKzrkjqIy+txaqjeea1Uur+Dnh8LmHmpKKJO/0K2cIMqi77HHHmkb95Wxuvjii6c2VEIWZ/Y8B1f2oMz78/vMOGafWiQGY9UVFeZBPPm+6Dc2jMImZbvrdvQKc4V/P+8JKEGuUmE3NWUNBcJVRhY0pV9c6cBmsU1XielPzs/VCZScWi4a98ZVN46FcsPzSZKOP/54SdLRRx895HrmBIvK+ljC7rAxVz3p4zIXSqqPPcZXqURK7WoY79s1RYd3l1oOWy0CgX7kvvm7/GgRSlAQBEEQBEEQBANF/AgKgiAIgiAIgmCgmJDhcF76tC1BnTaX4zpJaG9bFb0W5gATKRyO0A0v0VuG6ngIItJqrbxkP1C71+U2D/VrWxm6rWw2NuJ2WJbidvieMmHeP/t96adwRA+npLQoCeV33nlnatt1110lSZdccokk6fTTT09thKzUwmtKe+1VPBSBUDdswfuI6/HwGeY27nutfDYhCJ7AWxbc8NDfsvRqzdY9XGq0OOywwyTlkBAph3RttNFGkqTVV189tbH6+Ec+8hFJOYldyvOT911ZqODBBx9MbYSsEeLkoUcULKAPPeymLMDg/cp98BLKfE9tfx8D3YBwH4qO3H333amNZH5CCm+99dbUVivPS8gYdurPReY//s6XtHjuueck5fA7DzOcd955JUnXXHONpBz+KuXQH8r7Sjnkh7C4buNhpxTnYLx5AjwhbIwJwqR8fx8vnC/32vuXkKXac6Isqe1FTcpS+W7nhBL6faD/sT9/Zs1NQZmvfOUrkqRrr702bcM2KHDg8x3fW3v2Ma48rJXrIvyK8Dgph7DyPX5M7JpwwVoBBvDQLi86BMy1/F1ZEn6knHLKKZKkQw45JG0jlJlz8pBR7mutKE5tG8+P2vsJc3wtZBGbwn58jiL8nLLv/nf0ne9P+Dbn5c+3zTfffMh5dYNQgoIgCIIgCIIgGCgmhBJUqgt4jKT8i9d/gZbJ57VFodoWRGVbbXFMPDpeBpOEw35VQWrw698THH2BMam+yGMtea4fqCVMlgszuo3hMXFPUak8+rFKe3N1sizt2bYYmXtOah7asSrd3Q3cQ7TeeutJymPJxx52d/jhh0tqlkwlmRR1eK+99kpt06dPH4Wz7j6u3lG+FI+nl1MmURyvm5RtBQ8htiplW6C/fH7Cy4p3sBzbUj3ZtVyAdSzwkr189pK+E53vfve7XT
kOi8iiblEYQspl50lCX3fddVMbYxH1TcoeeErSuwLLcVEA/HnNNjzcnnCO2saxvcgL3vZaJAbH6ha1ObpUBDxZH5ijXUFifNVK3ZcFA6Q8jtuWmqidH33FuPbFPdnmx0TZYi7xOaUbuK0ce+yxkppFloBr4fnr6hZ94c8JijxQpMOVQZTN8phSVpPoa1cgH3jgAUm5D/25Sj/6OaCe+VzbDYh04F9JWn/99SVJ22+//ZDzZi5mfPmczLX7nI+91SJPsCUKTtDPknTfffdJyv1EUQ1JmjZtmqQ85t32Ob5/T1nIrG2R5W7RP29EQRAEQRAEQRAEXWBCKkG+cB0KRG2RwOFSLrJaKx/Jr20/B7zXEyknCM+Ve1PKUrke29vvOUF4VVy94V7jAfF4XOKPPVegtBvPSSkXrOu0NCmeEmzMS7PWShf3kw26F4jyzZQn9X4ljru22CTg6fP+cQ90L+PjiOvHG+kKDTbqaiCeU+zLbQL7RbGteXv5bs9rKRdQ9Zwg7pmXkQ36A1QUbMrnOvKhvve97zX2lXIejysW7E9+ipcJJueFOcvHK4tUss29yuClsQHb9cWQ8cS7MtwNRlo22tWCEp/PoLZQcVt+E/ehhueZjSflO5SUc7v2228/Sc37xfMK5bvWhz4/Mqdji7VctFokBs8FIng83wwFCIXHlSDukT/7ubYjjzxyyLl2GxRWV1pLyH300uwoM/6M5R2N9xJXN8llHG4e4ic/+UlJ0gknnCCpqdrT/64Oce9RUl1NIw+RY3WLUIKCIAiCIAiCIBgo4kdQEARBEARBEAQDRd+Gw3lYTylPT548OX1GPvV9hhMS1FYYoZbQhaTooQEj+d5eh5XYKUkrDV392mXj1VZbTVJOuOw3CCNyOyIMiFAGDwGqJa6W5T49Gba0jVo5S+Rjl6mR6ElU9ORHEk39WG3luXuNWjgIq757f22yySaSchKmy/5PPvmkpGynHh5RC0HpRXwuwQb4txaa4QVJyhW3a2Xc2cfDA7ET+sjLZ2OPhIl4OArn5WEWQX9AiVvmcZ+faKNownXXXZfasBuf/7AN5iAP4+JZwP6ezF0mtnuI1/PPPy9JuvfeeyU1w5kIPyc0WcqhNP0agj0Rqd0LbImwrQ033DC1MQ/xr4fsMt/585CwubXXXluStNVWW6U2/rYsPOHHuPHGGyU151DCLwkT83A9zsuLdFC6v225lLmhVhipLUWDdzX+lZrhfiPB5waeI5yLh0fTn/5O3muEEhQEQRAEQRAEwUDRt0qQqzBl2WVPpsKLOlwPOL+228pnO+U5eGEEGGlCZS9C//t1urdYygnr0tDyi/0GHnG3LdQhEonxVErSbrvtJqm5kBs2Qh/UbKumFpZJ7a4MYNckRrLIne/vikc/lSh3bxyeJBZN9YUc8UpRLtTVDryLJ554oqRmYqYrGL2M2wS2g135vIb66B4/PPFlOXentpAdXs+y8IaUvfVrrLGGpHpxFPcUBv0BqiLzGIvL+jbuvauGRFv4vIQdYG9ewINjUW7XbQWVB0XbvfWoUezvKujyyy8vqZlkXT6PfIHXYHxoe4c655xzJDXL26+wwgqSsi16gZbawt9su/TSSyU17YdiG8yZbisU8MDuXF0qow+ILpDyori+rbxWpxvvgH6M8Xqe1wpU9Ou7XShBQRAEQRAEQRAMFPEjKAiCIAiCIAiCgWJChcOxMrQnh9Pm29qSyNr2YVstSR5I1tt2223Tti9/+cuNc/Hz71cJkT7wkIPyWrx/CEti/Yh+oxbOh/SOzE44liTddtttkqS99torbWMtgfnmm09SM9yrrWgGNkWYk0vve+65p6SchPm///u/qc3lfkD27wc8vGb69OmScp/76tGEZBKa4GOPpOujjjpKkrTAAguktlo4RS/iYT8UfSABmGuW6mtFkCjO+PMkdBJY6WcPcXrrW98qSTrrrLMkNedbEoMJUfR5jdDLfgk1DDI8I2uh44RVUojFQ9gorO
FrcAH25s/fDTbYQJK09NJLS8q2JklLLrmkpDxXeqEDwucIi/OQHOZE/l7KdlpbaygYHzoJB/Ow8h/+8IeNf/uJiZT+MJEJJSgIgiAIgiAIgoGib5WgmuecpDn3UpEs7N7KTpLJ2hQaLwEIeErxTnli3XCP3w/Qr65mlCWH3YPNfl4ooJ/Apvy+ca9RdpyTTz658a+Dxx3vqpS9qXhAPZEYL6d7yGaHJ77jifLE9V5ZObwTFl544fSZEtckyLpKRBnO1VdfXVLzGjfffHNJuTCCl4y+5557RuGsu8/tt9+ePq+11lqSskfevdxf+9rXJGVFXMr9xrW6Okg/UQTB5zXG93333TfkfLBHPP877LDDkGN6GdmgP2C+rkVPoDQzv7hnftlll5XUVJCwEVZ+97FGcRPmQVcg+R6eJTVViqR3n4tPOeUUSdKMGTPSNuZQ1Kh+VBOCIBhdQgkKgiAIgiAIgmCgmFBKEJ51vOlS9iTVFi/tJig/KB7rrbde6/5tpSL7AcqNHnLIIWlbuTjYCy+8kD6zMN73v//9MTi77sP5e8lNFJbhLopGv3j/dAtfYJTyxv1qY77AIiovCoMvzMgYJzfIvc6XXXaZpOwNfv/735/a2HbCCSd0/dy7iedx3XTTTZJy+XlfPBC8NL1/lrpbJphzYfFLKStBjz76aNe+JxgbUBDLsupSVr0pRe3MmjWr8e9Y4XMASxK4akp0Rm3R5SAIAimUoCAIgiAIgiAIBoz4ERQEQRAEQRAEwUDRt+FwteIElIilJLWUJX2X9pHJOYaHCxFaU5PSy/094ZxwOPafUxJmv4YoAaEwDz30UNr2rW99q7HPzJkz02cS2n/84x+P/smNApRoroWb+XW2gU1hYx7SWRbrcNsiAbhmryV33nln+rzyyis3/l7qrLhCr0BJXEmaNm2apFyE4umnn05tlF0nQX+TTTZJbRRNINHfQ+xqq3z3Itddd136TGER7OWJJ55o/VtsDNvzUtflHOo2x9/V5lnsie/2Fd5JrvfSxkF/cNVVV0nKZdi9hDxjsVaGHdy2Oin84/sPB47thXeYgz00mXYv8BAEQeCEEhQEQRAEQRAEwUAxz3/6XZIIgiAIgiAIgiAYBqEEBUEQBEEQBEEwUMSPoCAIgiAIgiAIBor4ERQEQRAEQRAEwUARP4KCIAiCIAiCIBgo4kdQEARBEARBEAQDRfwICoIgCIIgCIJgoIgfQUEQBEEQBEEQDBTxIygIgiAIgiAIgoEifgQFQRAEQRAEQTBQxI+gIAiCIAiCIAgGivgRFARBEARBEATBQBE/goIgCIIgCIIgGCheNt4nUGOeeeYZ1j7/+c9/JEkvf/nLJUn/+Mc/Ovqej3zkI5KkhRZaKG2bMWOGJOlPf/qTJOllL8tdtPbaaze+76STTuroe+ClL31p+vyvf/1rjvvzPcOhk74bLs8991z6/LrXvU6S9Ic//EGS9Pe//33Id3Pe//znP4cca9FFF5UkfelLX0rbDj300C6fce/0XY2pU6dKki655JKO9l9wwQUlSZMmTZIkXXHFFR39XXk/OqUX+45xuMsuu0iSVl111dT22te+VlK2RWxUymPu5ptvliSdeeaZqe3Pf/5z189zuH033H57yUv+n9/q3//+d0f7v+Md75AkrbTSSpKkxRdfPLX97Gc/kyS95S1vkdQcy9/5zneGdV5zSy/aXL/Qrb7DtmrzBttqz63XvOY1kqSPf/zjadtOO+0kSfr2t78tKY8/SXrxxRclSb///e8lSW94wxtS2yqrrCIpP5v/8pe/pLYvfOELkqRbb711ttflz1hgrPg1sy3sbuRE342c6LuRM5K+a2Oe/3T7iF2gmy8Gr3rVqyRJe+21V9q2//77S8oPf59o3/nOd0qSfv7zn0tqTtB8fvzxxyU1b8a3vvUtSdIZZ5whSfrFL34xrGuoMd4DZf7555eU+0KSfvnLX0rK/Vr7ocP9+Nvf/pa2cS30+a9//esh39
NNxqLveODWXgz4wYKtSdKKK67YaONlQMov5H/84x8lSQsvvHBqox/5IeAv76eddpok6Ytf/KKk+hioOQzaGG+7g7PPPjt95kfPq1/9aknSG9/4xtTGfbjjjjskSRtttFFqo49f+cpXSmo6SK699lpJ0vbbb9+1cx7tH0GdcPHFF6fP/OAG/6HDvEffMKal3E8nnHCCJOmTn/zksM6hX22uH+lW37GN8eT7YA/MQW5jyy67rKTsOJSkV7ziFZKkt73tbZKyo8Jh3nRHI9ueeOIJSXk+lKQ3velNkqQnn3xSkrTPPvuktocfflhSfvaUn/3YUu6zsLuRE303cqLvRk63f7JEOFwQBEEQBEEQBANF/AgKgiAIgiAIgmCg6LtwONpq8b3I8h7PzuV5nkAZD+whIoR3ka+x1lprpTZC5fh7YqH9fPj7v/71r6ntrLPOktTMg+mE8ZZMjz76aEnSpz/96bSNPALCGzzEgM/kZvn50y+EKJCr0O1zhtHqu7a8Lg/fmjJlypB9CBehLzwX7frrr5eUw9tOPfXU1EY4HHlYhIRJ0utf/3pJOUSOsDhJ+spXvjLk/DvJExpvu1tzzTUlSZdeemnaRl4aY4/QGN9GKCr5BFIO9QK/NkJ1ttxyS0nSNddcM9fnPtbhcD4HYV/PP/982vb0009Lkq688kpJ0gsvvJDaCNmlbz0EkxDBddZZR1IznKns01roW4TDjR3dDoertTHGTjnlFEnS5ptvntqeffbZIedBqGWZr+ufsSkPoyPcGhvzuQ6webd9wjbPPffctK0WAlteT9jdyIm+GznRdyMnwuGCIAiCIAiCIAjmgr5Tgkii9IT8ZZZZRlJWgA455JDUhmfcPZl4gcp/paxU4EXy7kHdqXmY8PiTsL3vvvumtkUWWURSThaVpC9/+cuS2pPrx9tbgIL1wQ9+MG175plnJGUPnd+H0gtcqy5En7373e9ObVQEuuuuu7p27qPVd34PURD/53/+R5J05JFHprbHHntMUjPpl3vdVrwARceVy1JFc+hPktopPCFlL/59992XtnVSXWy87Y4qUB/60IfStl/96leS6uePooYqhvoh5f7n71yhRU3Ce+yJ1iNlrJQg7vfVV1+dtqGgzZo1K23jM5W4XEF761vf2jiW2wl2zj5bbLFFavva174mSTrqqKOGXEMoQWPPaFWHqz2TbrvtNkl1Rdz3L+d7H6/lNj8Wdse49WOyP/OuFy1CTfKiKOX1+DlEYYS5Z7z7rlYNmHctom6InpDycxSb8ucphTg++9nPzvF7hzu31RjvvutnQgkKgiAIgiAIgiCYC3pynaA2aiWZ8eRuuOGGkprll3/7299Kkuadd960DSWHXIvf/e53qQ2vAm3upcJzwD7uSWA/lCDK70rSZZddJkl66KGH0ja8qbXr6RWWWmopSc0S4lDzEpaeiprHpPZ3rL/UTSVotPD8McD76LkYeDRrHtOamkFfYT/u3cLjxb+uRuHp4th+TNbt+OhHP5q2dbq+zHiCMugen1IxdTsiT61Ucf1zzfsHlC7vVWrjiJwx1D4plxN29RFboc3npa233lpS9oKSNyRlW1t55ZUl5XxLSdp1110lZSWopvj2YIBBMAe4ZzUVZoEFFmi0+f2tLYnQpuiUinhtTHKsWslrju1zsUcWAMfHlvth7gvaqZVtdxthXiS/0ZcqwW5qOcurrbaapPwuePLJJw/57uGu0TaRYb1HSZpvvvkk1fPEeb8lgsiXW6m9S40HoQQFQRAEQRAEQTBQxI+gIAiCIAiCIAgGir4Lh4Ptttsufabsq4fBwU9+8hNJ0pJLLpm2fepTn5IkbbrpppJyYQUH2bUm1dcKKiD7eyJeyUUXXZQ+77HHHpJySeReBJnTwxzoDyTPWrggbbUwmRpePrWf4NqxLe8nqIW81ZLHy3A27zsSgG+++WZJzVLOFGV48cUXh3zfEkssMfyL6gEodFDrO8aZJ0Ujq3M/PPnfy+/6Pl
KW6hdbbLGunftoUAst23333SVJBx54YNrGdd96661p29vf/nZJ0t133y2p2aeUIyZM18fhggsuKEnab7/9JEk33XRTaqNQyqGHHiqpWZad/u3lMN+gndozjzFCQRwfV+W8L2U7q4Xp1ua/2Z1DLRSU8U5BDyk/dwnhlqSHH354jscK+ovaffN3KOa0qVOnSsrLmkh5TqLokIdNr7HGGpKkz3zmM419JOn4449vfJ+HG9fedQYBihBJuZgTc4SXrmc/wrY9tYJx7PMNx6IQkj+TZs6c2bXzd0IJCoIgCIIgCIJgoOhbJcgXMZ0xY0ajzT0+qET+y37dddeVJP3mN7+R1PQI8Eu1rZxxrbQnXgZ+3bonlPO58MIL07aDDjpIUm8rQSQeen/yqx3Ph/+KJ/GQ/b0cca1UKvh+/cQGG2wgKXtHvS9qib1tC/S1FZXAO7L88stLaio8JLxji/59JIfWzqGXwXuHuiUN7Z9aOV761RcExZNULvApZZv0hXv7DQq/SNJuu+0mKRdd8fall15aUlbGpTxXcf1+LOyKsu9PPfVUakMdKlU2qa4ARbGE/qAsa+2gduMFd294rfhB2eb3fjjec1fXOS+eMz6XMc6xcykrQRD21zu0ldWv3SdUP39XmDRpkiRp1VVXTdtQC5jbnnzyyY7O595775Uk7b///pJycR5paEEEP782m+r3ea9NOfUiUMstt5ykXJzE5waUIJ471113XWrjuePL1zC2+T6PrOJ9vduEEhQEQRAEQRAEwUDRt0qQ5/gQk84vdldoiBHlXyn/cuUXq3uc2/JaSo+Fez35Trz2/nfESN5zzz1pmy9q2avg6fX+5Lq49vnnnz+1XXzxxZKy132zzTZLbZRG9GOBly/vJ4hzrZXOrC2oC8Nd9KzMv3JVk+PjCXXPKfuxgJwk/fSnPx3Wd48VPh5Q1mqqVdsihzVPXZvCVqoWnmfE4qK9CvHXhx12WNqGt83LkHIdzHVe2pTyscyN7mUt+9I97HhJL730Ukm5RHnQ37SVAEZ9Zsz4GCMXzZVEbKlWBrccw23zoY9R5kGeF/59HMNzgqAW0dCv3vl+p1beHNruSW3/TTbZZMg2lpgAVyXKvFLPT2E+xZY9l4j3mCuuuGK251ezrYloYzwjPSeId44bb7xRUl5gW8rjlxwfX8y4jByS8nsMuaz8K0l33HFHl66iSShBQRAEQRAEQRAMFPEjKAiCIAiCIAiCgaLvwuGQy1yOu+WWWyTlECSX80no9VAPQtZqSbxlSF1biWM/5pvf/GZJORHPj00bkqCUpVhk2F4MUyI5um1Fb4fCDyTIeThcLYwCaqXN+wFCMmvhI7U+K4tKtIVneFtpd7VStNi+2yThcITtSb1pZ5K0wgorDNnWSdKp79cWItcJngzrCZzjBdfo18r17LjjjpKaY4d7631JyBt24aFyb3vb2yTlpGMvLc73MM7979hvvfXWkySdc845qa12DyZiWMhEpK2sOXMdoWV+T2slh2n/3e9+JymHvkhDx2RtriRkyc+JUDdCZ90mmRMpHuP0m/2V4YF+/oQKEwp2++23pzb69cQTT5Qkff7zn09tDz744LDOgWIDzCkeeli7X51CWJvbQ2l3tfLLtbDyyZMnS2qWUZ42bVpjn5pN147FdXJtvLNJ0rbbbisph8PNyZ7K53y/2R/UwsoJr2YJBSmnk7C0hReq4Bi8+/JOKeVCBxRwkvJ78Xe/+11J0sknn5za/DnTTUIJCoIgCIIgCIJgoOg7JejDH/6wpGahA37Z49F0zwC/QH0BJzwmeEd9/7ZSzqUH3/+ObZyXezNqpUNZ+BKPTi966Ck17IvS4XmuJbOed955krKX4LjjjkttZXlTxz16/QQqXq1ceM2LXyYe19Se2v/5jB15G4plmfTp3+dFRHqVOS3sWvZPbfHFtr/rJAnbk6rHUwkqF2quzR+rr766JOmGG25I25jPavaBB65WGAJlx8ch4xR78r/DG1
grvPHYY491dI1B79FW0pfxSbEc95Tfd999ktrnv7Z5sAZtnmSNGoHdeiETFMu2ua4XCyPU+rzt3FjYnT5AEZby4u944nfaaafU1rYg+Q477CBJOuqoo9I2lu5g/PsCzMMt7FPDFZpSMUE9nBPYhqtUd95551yd13PPPSepWWDBVY/Z0en96ye8iBXPIOaBbbbZJrWdccYZkvK77LLLLpvaZs2aJSnf06effjq18YxhIW8pRycwtr1E9hFHHCFJmjJlysgvqkIoQUEQBEEQBEEQDBTxIygIgiAIgiAIgoGi78Lh1l13XUnSAw88MKSNpEEPCSJ0w1ebJYETic/3h7bwGdpqK1azFpCvgsv3uMR6/fXXS5KmTp065Pi9wqOPPiqpGebAtXgSLNDXDz300JC2trVfWK2532DFY6T9WvKvhzKVSZoumw8n2bQmt5fJmM6cQs16AfpSqocL1rbNjrZ9aivXgydo9gK1sbLiiitKyuNvvvnmS22E/rqdYZOENjAOpaFJqw4JzIS6+XzLMTim21e/hsOx3pmvo0RIMyEyntTrBUik5vgl5KPfwmJ4DtbW1eFe15LKH3nkEUnZNqUcnlZLDi/t2v9PP5Ig7eu1UASAUBlCw6Qcpu3rjZWhZnOT0N9N/H2jrTgT573BBhukNq7vgAMOkNR8DvPOQWi9r+HXxs477yypOQcTGudhT9CNcLja3Ma88rGPfSxtI+med6f3vve9qY3+8XH5gx/8QFKet3ycMs/Vns033XSTpPye6PeFcGH64oILLujoGvudWjgc/brbbrulNp4xpJxcdNFFqY20DwqWeLEsjknxC//Mdz/77LOprfbO3w16Y1YIgiAIgiAIgiAYI/pOCTr22GMlNT1SpbfTE/lJ2vQV4FmFtpZwXCuNXYIHwj3/eG9Ytd0Tuljh2r0LlPXu5dXWa7+86Z+aEgS1fm3zwj3++OMjOLvxh/uK98L7BE96LVm4kyR9p/Rkum2WaqZ7b7A3Cjj0Mn6O9J17TPnMtQ/Xy87feYIwn/m+XlHM2q5t4403lpSTSd3zjZLlHmBK9qNqUDJbapbElpp2RdEDnzcB9Qn79aIzjAEvod9JIvxoUSt6wxihpL+UC034yvSU+sc7jOIh5YIUzN9eVGLSpEmS8nVffvnlqa1t7LeV1Qfvw273Zzlvcx1StoNacRbmv5pyW/t/mz2U29z+UIV4fvs7QG1eeM973iMpP1+6oWDMDo7t8y+Uak9N/fFCE8stt5yknCTu9nP00UdLyhExN954Y2pDPVt66aUl5eIGknTKKadIqkddLL744pKatk856FqhgbZmtyU0AAAgAElEQVRS6iNh+vTpkvK8RallKReaYs7xogm8a3nxDO4/47Et2sJVdIpDcHwfCxSIOvTQQyXleyBlpcLtFvvkHLbbbrvqdfcq2LLbA6y55pqSmgrbjBkzJGVF8eGHH05tFJWg790mgXEqDS3kw1gYTUIJCoIgCIIgCIJgoOg7JQjPZg1it2t43DsepU4WraxRLqgqZa8CXidKiZafSzzmsddoi8Hk2muLprbRidLWLxDDjLeyllvm8a4olDVvalt52rZ9SpXIz4E+9hLGvYp784jvd08vnifPaYEy76Dm8UW1xVvq++P9w/Pay+Dlff755yU15yBUD+Lopaz20H81pZBt3jfYKkqTK9vYb22JAe6PzwujPdZr5Zc5R7ch1C2eE5R0laTbbrutsY+UcyT23ntvSdJaa62V2vB6MvZ9jj/77LMbf/fpT386taHkeY4q1BT02rbRohw3fr2MH7zDPgfRT25bbapLJ/bAfXSPM32Bqun2yjF9G+WyeSZ3K0erbR7uRCXxBV3J9/E82rvuuktSLkvtCjWedObIL37xi6mNqJcnnnhCUlaLJGnPPfdstLmqic1/4hOfSNuuuuoqSfnZ0e1oDVdleB/j2eVqNeOKXGufo1Fc/L2PY2CnrrAxT3GPvA+wb+YLL/PM3Md5uTrJc8
ujQNiPubpWMrpXqL37Mo7dlsn3Y373kuko5tjIGmuskdoYv6hpXsK+Nndy/+jPsXhPDCUoCIIgCIIgCIKBIn4EBUEQBEEQBEEwUPRdOFyZIF1+nh2euIocXa7M7sdqC5WryerIr52WiC0TKbudbNgNKJHt0Ff82+nqzmXYFvLoRMWLZkBpW7Uk2raQrtq2tkIB7O8hPr2KJ/gT9uJhB4wvQmF8vJSr0vt4Lvvaw2s4JhK/h06MJ2XiuJfuZrwReuAhS4TDeMlYwuEIE/JwIfYj9M1DOvgewojd9lihnfvkCcYUC/FV3EebtrLnDmGC+++/v6QcAud431Fq+LDDDpvtMSnosdpqqw05BmV3PTH6xBNPbBzTk8DB54Wy4IknKxPa1C3K56gXK6Ffa0UNsE8PgWT/thC0trYyrFHKfVBbTb4WYsxzniT/boXDtR2H++VLX2y00UaSclK5hyeTRD558uS0jdCjgw8+WFIzBOz444+XJJ1++umSmqFg55xzjqQcKufPZkK/KDXtoXLnnnuupGyvUg5Ru+SSSyRJ06ZNG3IOcwMJ8w7X4iHPzC3M+z5H8dmfBcw7lAsnVEuS7r//fkk5LK629AfzoxeQeNe73tXYx8ONsUV/n2Fs0Ofed6usssqQ6x4PaiGd9Af96YVzKH5DqK+XyKaP6Xsv9sX3kBbgRXQYKz5meRYRluih1qNV2CSUoCAIgiAIgiAIBoq+U4LaFJO25HJfcIxfmbUFJktvWE0JqnmpSHTFM+ulK2ves+EkUo4X7mmD8tc4CdqdgveGxMt+w8uvl/j9rS0O24knss3b0VZmnL9zT1k/LdboKgylUvFoStlrRB90qpSVc4J7lvD61Uru9hIrrLBC+oxXmPnF5w88lq4QkFRN8mlNuaVv3QOLooN339VE7gtJwa5ccA6oKKMJ3mRXXVH88Tz6Yo8knNfKtIJfC/2Bzfh4Kj2cLOQp5T5n4UZPXqd/3v/+90vKHu5OwfMsdX8OLeeLlVZaKX3GzugLHyvYj9tiJ0pQJ8tQ1CI++D4/dq00uydoz+n7hgN9cMwxx6Rt66yzjqQ8Vn28UMwJVRXFVsrq0Oabb562UX4ddfG6665LbTx/eJ+hVL5vw+a9KE95Dg5FGVyNovz0N7/5TUlNW+gGFA2Rho4zv0+8g/hC84ANulJWzvOoY5L01FNPScpRB88880xqW2+99SRlBcz7DrAxV0iwN54h/rcUSPBzP+KII4Ycd7QpIyWk+rgsx4fPNZRR5+823XTT1IaiXhalkJrKupRLnkt5jPg+/C3zpNvraBUuCiUoCIIgCIIgCIKBou+UoDZqCg3wC1/Ksae1X8ht3nao5QZxDBZ+ciVoNBdpG2vK2PCaWlQDzwze94svvngUzm70oRysU1MU8bx3WjYW2nLRagt+4g0ry8dK9XtT8770Aq7UojD4mL3sssskZY+m9yv9UssJKhd+83j8zTbbTFLOfastDjcelF4694BxbSjPHovOtXneD7ZCfpX3G3HzbHPvPsdfdNFFJTVzD1CJsCH/PvcejjYohuRcSDkuvVZ+9fDDD5eUPeU+jvDy0idSVjbKPMg5gVKGPeHtl7LnmLwCzy/gPrTlNXlMPZ7t0YL7LGXbYGx6iW+8u75/Jyp0J4tG19qwO/dcM+/5/cZ73W14drHArpT7gJxgv3485IwNH4PYyiGHHJK2kcPENlceUDqZG5dZZpnUhi2hdLqyi32juPtSCqg8HkVAntfVV18tKecgdQvPSeJ+cr4+BsnhxO7mVHqfxTWxA9+HZwzbPD+0jDDwiA+2lUqklPNhPCKGuYfxzxwqZdv5/Oc/P+Tc54baOwi05bLXQNV0ZZf8tB//+MeSmguinnfeeZKyfbt6g70y1/rzl3cRn0uwAc7V77fPL90klKAgCIIgCIIgCAaK+BEUBEEQBEEQBMFAMaHC4dqore5cowyRq8nxbX9fK7PbTwnqc6KUjWsJhOChOkjIhM4MNyG4V/
AESEA69/CzBx54QFJztW8k+jLJXxoa0tVmd25PSMok077vfe9LbRzDk+GXWmopSdKdd94522scDzzcis+e4F8mQ9fKYNfC4YBwHg/duuWWWyR1HtI5VpTzBWFfUh5vhIx4GftJkyZJyqEjUg6RoU88YbtMIvZ+o0/4+1p4Dwm/HlpJSI2HjIxW8RfK2HqIKmFCtbA8+qxWqIbQFQ+JJCzNw4rAr6+Esr4c3/uH766Vx+f7vNBDOVd4aNRohYeAhyUR4sNc5yEszGtud9zzWgn/cm5zu+skBJEx7PePc/Wy+rVlCroBoU8eTsVzgXB4fy5yX9nfbYxrYexKubw2xQm8wENp394HlFsnlM3DW7lH3DcvYc95uZ1yXtyrWmnjkUA4lIcqPvjgg5LyPORzFN9LiJXfU8ZQrWgL/eP9Sr9zP3xeog+wt1tvvTW1lWWtF1hggfSZfqo9QzgH7zuudSTU5i3opOiHj0+uyc+HeXS//faT1Hy2zJgxQ5I0c+ZMSc3CN9gnY9znBu4ptuml0TmG9w/7YZ8eZujlzrtJKEFBEARBEARBEAwUA6MEOaVH3T1TZRJcW2K7J7Didal5DScSZXIgZWdrkOAp5QUD6U8UiX7Dk/XxOnHPXXHBM1RbgBLvSM22ysIT0lBvqntV+W68eF6Csqa29aoS5B4+PHZelrVctLPmIa55w+jjWhveMLxbfq88IXO8oE9cXSYZHrXh1FNPTW177LGHJGnjjTdO2yjHS7+595M+4bq9UACfy8RWqaluSnXPn6szXoq2m+C9veiii9I2vMKoJO55ZKygZtSKj/g2vLyoYe7Bx1PJeHOvO8egoIUvJoti4d7PNpgPuEcUBik/jwbu8S/7x+cW9nMloVS6fPy1JXGX3+dwPzi2zxm0eUI730lJZC+5Pzfsu+++kprPMBSO9ddfX5K09tprpzbGL3bgChuLHrv9fOELX5Ak7bnnnpKa6hb2wLX4NfF84dnjY718dvj8yZj1Mc444vuGuzj97KAvavaDjfvC9qguFBfwuYTr876jDxizNTWQv/P+KRfjXnrppYfsj+17iXPKi/t7Af3IubtiODfU+p37iN37M3OLLbaQlOc971dUIRaOlqR77rlHUo6Q8LmTggiMPf8eIjeYj3zhcxQ/+pcFa6W8rICX21522WUl5SgQjxBxFbybhBIUBEEQBEEQBMFAMaGUoNov5XIhLodftbV8oVpMchkL6/kFUFvQqV9zgmr5TWV8Px6CGq42bL/99pJyH/qv/37C7QGPB6qBe1pqMf9Qy/spFUiHbTWbxLbw3LkSxP1zdbJXc7FqZYHd81N6j9vK2rctCudt5QJ8baWJxwNi+30cEiPNfOYx015+FLhertW9yvQNbV5emG3kNngfU6582223ldRUffDcLbbYYmnbaClBNVAj+He01ZI2UMgeeeSRcTuHuaE2xvBqu7eXvmZxSCl74Dt59tXmvLbFHMuSylK2a39Gsx8e+W4pQYwJX6iUiAgWyG3D1SqeEz72aGeudhvGo15Tdtnfc2SAfqktKM/zy/OY2I+/8/enucm1Qhnwe857FHOOK67kpaBKzKlMfa1UOpTvb94HXBN/5+9xZdlmn+/IYfEoAqIWaiWgu4HniC6//PKS8oK6ft4o9uTe+N9dfvnlkpp5P4xtlMozzzwztfEMYukXf79hrt9nn30kNcuf807E2POcap63X//619M2xg/5VJ7H5Ip6NwklKAiCIAiCIAiCgSJ+BAVBEARBEARBMFBMqHA4JDqXiJFRfRXbUn6vhSUhu9bKQdbKF3PMd73rXUPa+jUcrlZmlmtBymwrL0yJT/87+snLTPYTtVAArs3DI5GPPcG+DMmshbW1ldAt95WGruTupU9rduehGL2O91db6fByH6dttey5KfU6FmBDnljL/SXUx21oxRVXlNQMS+IaCUfwa2YMe/gJlKFHq666amojhIIwHcL2JOmOO+6QNLR4QtA/ULTBbYuxhS
36WMNG3LbKZ6yPv1rZ7HK/slCMlEOOeJYvueSSqY2wG39mcf6jFQJcO3/GrM+z7Mdc7XM7bX6OZRn8Wnlx+sLvA+89PHM8RKsM3fd7xZwyZcqUIec1ffp0Sc37MDfzJtdUu69s85Auwn1ZbsNDIMtlN2rfU5vbau92fObeeJEPwtroX3/HI3zOz4Hr4XtqoXlzg5dMJzSPYg0XXHBBavPPJRSOueaaa9I2zpdjevGW9773vZJyWoM/Y+gP+vBrX/taaiMlgnvrfc7z3ct0c3/5d4UVVkhtHsrdTUIJCoIgCIIgCIJgoOhtV2gLcypvCiRTeVu5MGXNIw9tC1rWSojW1JO2c+5lKDNZoyyVXcPVN7woXHe3ykaONbX7W1tQDu+Re6LK0thtCqS3talCHB8PpCtVtYTjNvvsNWpjpJYY28lYalOQehWSbv0+4mWkAIGXZq2p3njPUGw90RTvJYm7XoCh9KAus8wy6fPDDz8sKXv5vCQ3du9lgJmD/byC3qVcRFfK80qt5DzKhc9TZTGX2mK7Iy21jC37YrHYZG2u7LQceTfoZhL8aHm+Z8c555zT0X5eknq4YA9uW7wbcL21Qj6dFs8oF+ltK6TgxywjXPz8UIVQ2HwR+Fq5fbbx3d1eLNq/n2tYZ511JDVtnc8UzHn22WdTG0UP/PlBkQXKg/N/PxbLSXiREYoZoMa6CkoBFd47XPGknPmWW26ZtmFbvB96EYrRKnITSlAQBEEQBEEQBANF3ypBNS9ArcRtWQZ3uMevHZNf3+4tYJuXIex3arG2QL+0qUXuMSo9M72wGOVIqJVaB4/Bbivxiie0Vpq9jdpCd/Qj5+Wla+d0jr1OzeNYU3RqZV87aeu2h67b4JHzOYj7TNlTFnyV8nitLdqLR8095fQvba7UsB9e2euvvz614c3D++7nx/6eKxhKUH/APceOfFFI8r7YB/uT8hirefdrURplnl6n8yBzF7blHmfGwdNPP522oWyyCGQw/tQWEeeZxTaPEmGuqSmQzDttC9T7M4TnLs/PmrrEd7uigl2jeLpK3rZwL3/nisqcns9t7L333pJyXo4k/ehHP5KUbdyXyKB/GKteuvq4446TVM9Fu+GGGyTlsu/S0GUOJk+enD5zPiygShl0h7HozwqiBmr3iPnDy72P1lILoQQFQRAEQRAEQTBQxI+gIAiCIAiCIAgGir4Nh6tRC5WplQQu5dBOixR0UjShTZrtt8IItVBArpk2Dz8oIVFOypIn1z03q073Gtx7Xw2Z0J9a8YO2ghxl8Q3/TJ97SBOfCYvzUChPgodeLQtdG7udFs/opGx2bby1hXv2Au9+97slNccYNkBiKmWxpVy21MMG6EPCgv2aCddgbHoxA5KB2ccTvt3OJemxxx5Ln9/0pjdJaiZ1s7+XQg16D2yL0DcPa8HeCGt5/PHHU9ukSZMkNcdrGQZXK5oAnZa955icwx577JHaTjrpJEn1kDwPEQrGF2zskUceSduYKwgV87m6DEHzMDqeZbUS1ISzua2VxV5qpcexYX82867CsdyWOR8Prea7Oa9bbrkltTE/jgTOd4MNNkjbNtxwQ0l5fv7Zz36W2ngekLLgy2cQIuehaFwzJa99WYRll11WUp4bPKyP+0D/+rOCdxC+x98/2Fa7R2wbi2d0KEFBEARBEARBEAwUvekWHiF4gVzBqCkObeWIS6+U/0ot29wjwHeOZTnO8aD0HtcWI6tRlotsKzDQy9RKbnLvvQjHU089JanpuSpVNLfTclvNA1rzRJXFFtxLXys+0evFABw/17Lf24og1Kglw/Z6cQ7uM95QKScW0zeLLbZYakP1dvUbZZp50BNhWYgQD6F7CrFlvsfLslLogLnOE2jXXnttSdLNN9+ctnkZ1qD3+cpXviKpOT9hB3jK3RuNV9m9vOzH86FW/IBj1iIqygVGpWzD2LeXdMfb7WOa+eDEE0+UJB100EFtlx2MAZQ1X2qppdK2MmrCn5nMMS
goXngKW/HSz9hbbUHUMhrFny/lgue+ADRzIep4TdH2Y3H+KFxeRGBu3g9PP/10SdKFF16YtmHTKDXerywmTHEBH4O19wz6k3Hm73ZcC8VwLrvsstTGoqwsmeB/R1/xnOKZIw1dVNbPsVbqvoxA6BahBAVBEARBEARBMFDEj6AgCIIgCIIgCAaKCRUOVwuRqdVlRxathWSVyesuy7GtFlKEDO8Jo220JWz3Gp7oVibBeahOiScssh993mnSe6+x4IILps9laJLLwNidr2oObatYt1GGxUlZ1mbV5qlTp6Y2ZHBfs6VXCyPUxoGHtpSJ1i7jl2GtPj7bwlt7tTAC6/YQ9uNzEOeMrXmoGW1+/dhfLayNsUhBBG8jNI6/89AjIGSEtSokaf3115fUtHHmAULzerXfBx1CahZddFFJTTvCBrmXPq7Y3yE8rRYOB23PPuzHbZ9QKMbwXnvtldoI1/H9mfdWW201SXmdFUk67bTTZvvdwehx6qmnSmquM0UC/nPPPScph1xJOUmf564Xb8EW/T2jLUy/DB13WyEEjL/3kDzmSWze33kYI/6sYn7jmGeddVZqmzVr1pDzGi6+3tqRRx452/2Y+5dbbjlJzfG2wAILSMpjXpLuv/9+SXnseXg0a4Y9+eSTwzrXKVOmSMph0tdcc01qo2jCG97whrSNvqXvvF85v1rBrrkhlKAgCIIgCIIgCAaK3nQLj5DaCsCUYvQCCSS68SvTPV6lwuFtpQfBf5GSLNymjHS6MnavcdNNN6XP6623nqRcdtRVhhJXe/Cw4N3uhkdkPPByxdgb/7qXam5Whh4JO++8s6R64qF7vDyJvZeoFRmpjT221Yog8HeuQpQFJ3weKBU5P4fxLCCBukOJU4fkXAoc+JzH/n6NfCYh1z2cKLz0g3vd8BTiZcVLK+WxzwrhnijM/l6cgb4nsXW43sRgbKCYC6qyl19n/JGg7snoF198sSRp4403HnIs5n0vcNDmyWUM4n13jz62fPvtt0uS1lxzzdS21VZbSWoqlhwD1WFuyhMH3eXqq6+ufg66B8r+ddddN27nQGRULULKnynjSShBQRAEQRAEQRAMFBNKCXJvE8yYMUOStNNOO6Vtxx57rKQcK+xeS46BR7OWN4S3yvMrrr/++iHfA3ha+0n9cU444YT0mRjPTsoLu5eaONZFFllEUv96f9zrjR2gStTybVwNw/NeK/9IX2ErbpN4QFkE0735eOyJq/UFavk7V01q6kIvUBsbLNAo5UVgiXP2vka1IQcKtUQa2tfeP+UiijUleTxgjJSx6A7x1EsssUTahvfd1Wj6Aq+b5/exsB7ljj0Wnzb61JVN7A/128FG/bzwxJPrFEpQb0JezXHHHTfbfXbYYQdJzVyCc889t/HvWOGq96abbipJ2n333dM25oVHH310TM8rCIL+IZSgIAiCIAiCIAgGivgRFARBEARBEATBQDGhwuHAw0cIs7ntttvSti222KKxPyUapZyUTJJ7LUSGpNAbb7yxo3PolTCbkeLXeeaZZ0rKqwN3yvTp0yXlsLjvfOc7XTq7scXD+AizIPzIr+kb3/jG2J7Y/4+Hg7CaNCFN0vDv21hRC2X1ZErC/gjHWX311VMbYVYk/3uo5mOPPSYpl+acOXPmsM5hPCDkknBdL7hBOCMhZbvuumtqW3nllYfsT5gk/3oIEeF2FDfxVdIpj0pYsJe1fuKJJyQ1Vy4H7GvHHXdM2x544AFJOcQu6E0IHa2FbxNaxr2vlUx3yhDO2jO5tkxEJ6XwORcvysOq9R6u3HZO/RqaHgRBdwklKAiCIAiCIAiCgWKe/4RLJAiCIAiCIAiCASKUoCAIgiAIgiAIBor4ERQEQRAEQRAEwUARP4KCIAiCIAiCIBgo4kdQEARBEARBEAQDRfwICoIgCIIgCIJgoIgfQUEQBEEQBEEQDBTxIygIgiAIgiAIgoEifgQFQRAEQRAEQTBQxI+gIAiCIAiCIAgGivgRFA
RBEARBEATBQBE/goIgCIIgCIIgGCjiR1AQBEEQBEEQBAPFy8b7BGrMM888o3r8DTfcUJK0/PLLS5L+/Oc/p7Y//OEPkqQ3vOENkqR//OMfqe23v/2tJOl1r3udJOnd7353anv44YclSRdccEHXzvM///nPsP9mtPvumGOOkSTdd999kqS//vWvqe0Xv/hFY9tCCy2U2t75zndKyv37pz/9KbXNmDFDUrOv55bx6Dv/+9r3L7XUUpKkD3zgA5KkFVZYIbX9/e9/lyS96lWvkiQ988wzqe3ee++VJM2cOVOS9NOf/nS23z2S6y4Zy77j7+bUd+W2ww8/PH3+4x//KCmP2dtvvz21XXPNNbP97pe85CWNY49H381tv83puzfYYANJ0tZbby1J2nTTTVPbfPPNJ0l6/etfP+Tvf/jDH0qSHn30UUnSCSeckNpefPHFYZ1fJ33Si3NdvzDec12Nboyl8nsYr//+979n+31hd2NH9N3Iib4bOd2aWyCUoCAIgiAIgiAIBop5/tPtn1VdYG5/8b7xjW9Mn4877jhJ0g477JC2velNb5IkvfKVr5SUPUxS/pWJ5/TNb35zattzzz0lSeuss44k6YUXXkht8847ryTpV7/6lSRp2rRpqe2II46QJP3rX/8a1nX0ireAa5OkW2+9VVK+TlfDUIDwIuOFlrJ6dvfdd0uSXvrSl6a2gw8+WFL2OneD8ei7l7/85ekzqtZOO+2Utn3961+XlFWwX//610OO8YpXvEKS9NrXvnbIebHtW9/6Vmqj7+jP4dpYjV6xOxQKSVpmmWUkSWuuuaYkaZtttklt66+/viTpd7/7nSTp//7v/1Lb5ZdfLikrl4899tiQ7+lUVemEsVKCoGZzX/3qV9O2vffeW1JWb5jzpDzvoaT5vIkiyfFRKqWsoKNIvuxlOaDgn//854iuo1dsrh8Zi74rleaaQtM29xB9IUlHHnmkJOnJJ5+UJK244oqp7S9/+YukPKbbzsWf223qUBthdyMn+m7kRN+NnFCCgiAIgiAIgiAI5oL4ERQEQRAEQRAEwUDRk4URRsr+++8vSdpvv/3SNsKLfv/736dthLHV5MW3vvWtknK4zTve8Y7Uhmz/0EMPSWqGflA0gfCRj370o6mNRPhDDjkkbfOQnV5n2WWXTZ8JuSE8xsOLCEkgDM5DdUhWp5+872qhDP1ILRRo0qRJ6TPhRoQSuk3SV4sssogk6fHHH09tv/nNbyTlcJNXv/rVQ76nG2FwvcLGG28sSdptt93SNuyN0K0f/OAHqe3ss8+WlMMFPYxuvfXWkyS9733vk9RM6v/yl7/c2FYLi+1V2gpheIERbA2b8TDU17zmNZLynPe3v/0ttdHP2BXhcZJ0wAEHNP71Y7J/r/dfMDJqdsc9f9vb3pa2veUtb5Ekfe5zn5Mk3XPPPantpptukiQtueSSQ9oeeeQRSdJZZ50lSbriiitS27nnntv47tqcV7PFIAjmTFl4hLQRSVp44YUl5Welv7M9++yzI/o+5gvmCkn64Ac/KCkXg/r+978/omMPh1CCgiAIgiAIgiAYKCaEEjR58mRJ0ic+8QlJWW2QslfUVR+87rWETn7VbrLJJpKaJaBnzZolKSf5u3cULz8loCkXLWUP9Yknnpi23XHHHZJycmgv8573vCd95tp/+ctfSmoWjkBhe/rppyU17wNeBUple1EJkmH7nZr3e4kllkifSSRH5fGEcvqKUsT0s5S9+JQ0XmCBBWZ7DnNKFu4HPvzhD0tqjiHsjbHrag/ji6R/V0I4BkoS3mcpFzqhz/tJveBc3fONEulFNVAbaXOboJ/YRvEIKRc8QR2i/yXpJz/5SeNc/Jj91IdzA2O3kzHmY5L7Vfa9g/1Luf+/973vjfxku0jN7rbddltJzegHbITnKQq3JN1///2S8hg+/fTTUxtRB/TLoosumtouvvhiSdL06dMlST/72c9S24033iip+Szv5rIBQTBo+Hj+0pe+JC
kvBePvJ0Sq8H68yy67DDkWc9oZZ5yRtjGHPP/882nb29/+dknS+eefL6mpBPk82k1CCQqCIAiCIAiCYKDoWyXIvZ0stkmujkNOkHuH+cwv0fnnnz+1lYtVukrRVoaYWEn291+tfJ+fM7+IyX/oZcgdkLJigYfZS+dyzXjg6HspewvwbHqeESV3Rxpb2sugfEnZ004MLLHxkrTuuutKysoRC39Kuf/pV1cnS/pV/fFS6yhe7ullzOGB9/wrtqES+eLHwBj0sscgxgAAACAASURBVOSLLbaYpP72GPv4A881oxw785MraNghKo8rt3j6nnjiCUlNjzx5G9CP/Ta3DKcUuI/Jtr+jjDRRCFJWMvGIjtf45h7vs88+kqTFF188tVGa/pZbbhmyDRUbD6+UF+y99tprJUl77bVXauMZcNttt0lqzgvkCSy33HKSpAUXXDC1vf/975ckHXrooUPOOQiCOVPmBLnaw3OAZ4tH8pB3uvPOO0tqLlZ+5ZVXSpK+/e1vS2o+f3lO+/OKd8ZahNRojedQgoIgCIIgCIIgGCjiR1AQBEEQBEEQBANF34bDeaLoo48+2mjzMCykPQ8boYgByfqXXHLJkL9973vfK6kpwZXhXk899VRqQ8Yn8drD4QgBc9mPELx11llHUk7s7EUooSvlktjPPfecpGbCKyFHhNnUwkAIi0NelZqlGPsZSgZL0pQpUyQ1y4QjBRPO5v1DgQlKE3ufEBJDH7pMfdddd0mSTjvtNEnSN77xjW5cypjjYS+MVR/HQFhcrbw94aqetE14KgVLvO8I0cG+XeLvVcrQPZ+fPvShD0mSlllmmbSNEFyKlfz85z9PbYReMlfRR1IOVSJs1dtIcuVYJMTWzm+i42WhARudd955JTWXWSCEi5KzXjiFUMUf//jHadvSSy8tSZo2bZqkXEJ2rOGZSWiu31+ev4Tz+f7MWR6GedFFF0nKdurLRRAaTZ/5cgCEbTJvelg68+yOO+6YtlFAIQiCOVOmefjcBMzv/hzlM8VQFlpoodR29913S8rPVl+igtQTf5bzLGLudCIcLgiCIAiCIAiCoAv0nRKEt4mFS6Xs5cRLRTKwlJOm3TuMB/iYY46RJJ133nmpDYWD79l7771TG0oOJYpdQSKZlV+rrvrUPNT8+j3llFMkSSussELrdY8n7o0jmY1f9ng0pexJ4Ne+/x0eQfqXeyY1SyH3I3gvp06dmrah9ngRA7zG9KGrhWVyuv+d243UTK5GjWQx0DvvvDO1+edexxOnGcc1NYyCET7GS4+Se5bwSKNS1sqLk/TfD0pQWdbfy5gedNBBkpr2gtd85ZVXltS0K9RrktjdS8f8ipLmRUtITD/11FMlNRe1pZjFRFq0skwYlvJ8v9lmm0mS3vWud6U2lH/+zheh9QI9knTZZZelzyQUH3jggWkbCcIkHY8X3HPutatVKDpeMIPiJr6EAqC8Mnf5UgHsT9lcxq+U+45njj83+D6WIQiCYM74s7JUWlZdddX0mTmsnNuk/IxA7a2pt3yPFwerRbaw35Zbbjmi6xkJoQQFQRAEQRAEQTBQ9J0SdN1110lqxhHjgedXpJe1xguJp0jKcYonn3yyJGn//fdPbRyDBS1rHji81quvvnraxq9mPNTu/WQRR4+L5tcyCzb2Mq5qkbOCYkGOj5SvmX+9fyhvigrinnw80f0KuRiu0OC1rJVYB/eY4FkhDwabkYaWh/a/w2PK31EqVupfJQhlx/OpgH5xRYNtZV6VlPuKY/qin+xP3gVleXuZUlUhd1HKc5aXtOf6GcP+99gmHkDP/UOBIHfR5y7y+bDZo446KrX913/9l6T+LdXe6WLDqIbEwXtZWGwNpcNVNO5Rp/1DWX3USo+AuOeeezo6RjdA6SJ31edvlJkHHnggbSOHjDHsHmdsEnXJF0ssnyGei8b8yXhnaQwpK+H9/iyZ6JAz7XkjLEeCXbhyynsGY8qfp7zTMaakPM+xvz8nmO+JJv
AlGLBPbMyf5XwP3+3PF8aFP5PZjzHiC3T3SrQB81xbno3n5JKnS7/6Eh6luu1jkL7g+e59V3uWc/88R3i0CSUoCIIgCIIgCIKBIn4EBUEQBEEQBEEwUPRdOByQBCxJZ511lqRcNtsTLZHvSOKVcvnXpZZaSpJ00kknpTZKbG633XaS8oq3Uk4ALctoS9JNN90kSVprrbUkNcOZCH14z3vek7YRtuSJsb2Kr95LyVySqL30OKELP/zhDyVJ2267bWojjBE51GXYfg9hWGONNSQ1QwOxO5f2yxK4tfA5ZGpvQ3rGjggR8O/BJgmb6Tc8gZpwLg8ZKsPAPLSQzxRS8DAH2uhfl/jpVw/F6xcIi/IwDGyBf6Vsc1yjjzvC2ghN8cIIDz30kCTpqquuktRMkiVsjjAmD7+jv/u9GIJTC13j+fDJT36y69/nyw5QAp85ePLkyaltLMPhmOcJV1lvvfVSG89TLw7BXMiz2MMMsRHmMZZUkLLdME49kZqwGYolePgq57XDDjukbZ///OeHdY3jgfcLYG+1ghy1bSWEb/3qV7/q6BzGsqw94bKeDE/IFLZCCXUphyrzruX9xXuDhzgTIomNvfOd70xtFDPBxljuQ8phnjyvCdX0c6CfvBgU4/LWW29N2zbeeGNJeW72ED4/1/Gkk3BctwfCWpkHSGuQ8rsKx2QZACk/f2oFFbjf/rwidJhwOH/H5r2y24QSFARBEARBEATBQNG3SpCDR4xf+v6LkV+geKuknMBJ2Vj3QrPf9ddfL0lae+21U9tKK60kKXsqvKACJa5RnBZffPHURtKsb+snPLEXrx2L2blnHQ8dqpgnU5cJdd7nLCLab2A/eCtdocEL5El/eE+wIy/XjHpBP3mfUK4dW/aFglnQDBXOvar9RG3xNVc58DLVys2XqoMrwXym79zrxPFdte0XttpqK0nNReVQctzmsEnsyfsNbz3ju7a4MaXsXZ3AM0o5YtRhSVp33XUlSTNnzhzZhY0zbR5S92LS19hqreDESPHlA1DZKMDgJaPHks0331xSHj++SCz24yXt6RfO1+dGngvYj49zvO38nT8n+B6eIf78BV+wtR+UoE7szfcp999pp53SZxYxpkiER554NIcfe3bnsM0220jK7zWf+tSnZnuew4FiISwrIWWVgeco84uUI0cYU64MYmM+B/JMxU7dfrhO/vXk/vI56n9XK0gEnI+PB9QS3pH8+YLC3g/UlnRB5fEIF+a+mvpWqpJeRIFjegEk+hNl96tf/Wpq80XAu0koQUEQBEEQBEEQDBTxIygIgiAIgiAIgoGib8PhanIu4UYuubHf9OnT07b1119fUpaIP/OZz6S2I488UlKWN72uOWv6sM6QJ4exVgwSrYfk+EraQBiZFxboVX7+85+nz5w3crMnwxLeQOiNS6ZI1iRjeyiYFwHoJ7jXSPV+L+kfTy4kkZxV5j0RnXARQkU8zPDyyy+XlG3Rw06Q2lmTwKV3Qkp6ufAE48yT6/ns4ZRQrrUk5b6if3zsMf7pA8IepCzjI8F3ukZML0ByrtsXduGhR/QlxQxq14XdevgMc9tnP/tZSc3QEaCggq8vRKhnv4bDteFrulEYgLXQ/FnAPEiI66OPPpraeDaxdsgFF1ww5Hv8+cUxFltsMUnNUMexhO9nTveQP+75ww8/nLZhS8ztfk1sKwu+SHlOZbz6eGd8Eipz4IEHpjZC5TzZvVfxNZNq2+iP2nORMDXeZ7wIzIMPPigph9/vs88+qe3www9vHKc2D3jxk+OPP15Svh/f//73h3zPSLjjjjskSVOmTEnbGB88C7zgB+FXtBFqL+XwKA9dw85qfYxNYT/+DoJt1Yrk8DxhPSJf44hCBx5uWK5/5YUCvHhSr8Kc5KHjXDtj9vTTT09tFGPadNNNJTVD27FhnrU+f3GvfIzzPKPAhRfX8jC7bhJKUBAEQRAEQRAEA0XfKUG1kpKAAuGeUH6VuqcZrxHeDhLYpJwYh8
fFPUuHHXZY41juDcMjWPNU+4rYUK6y28s899xz6TO/1Ll2T2r7xS9+0fg7T+7neuk7LxXpals/gQcKT7rfe0o8kggq5b665JJLhuxfltN0NQPoe/em4P13TzTMP//8knpbCcIePOEbb7N70Cn4gGfJ7Q6PFeO+VoYTT5R7/7BP+t49WHgLew08nJRl90Rb+si3cW1lgQgpq4cUPbj33nuHtOGV9aRglHSgjLMkTZo0afgX1Se4151CBcxdnhiNXZGgvvPOO6c2VHWUIC95zRzpnnbUUJ4hXrp3lVVWmavrmROuxKIEUTDIn5nMWb4/dlqbxxhn7OOREtgZc6vPXfQx+3tCPON1tLzFc6KmPJQwf9dKUde2YTdHH330kDaiUfw5yhxH4ac99tgjtaG8nH/++ZKaz2YKSvnYxRZ5pndabntOoCR4FATfhTpFGW0pLyfBEhvYoZSfg65So1iglHv/zJo1S1J+1vh7B8ou852rPZwXfV57r9loo43SNp79tYIh/twaT9qWMsAeUO2kPB4pyuRjfeutt5aUn7+8Q0u5P4nWcKWNdxa3fZ47ZSEPKb+Td5tQgoIgCIIgCIIgGCj6TgnqJFbff/3zq/yaa64Zsh+LpTr8KqUMtnub+KVblm2Ucrlk4qLdM18rAY23uh8WFXTVCk8yv+y9rVS83NsDePI9z2gsFmkbDfA64pHy8p3gHkLuOZ5Qv24+Yw81z2Kpavh+tLlK5Lbbq9AX7s3DS+WlgsscA++Dss9qOUF4rty7RSlWbLkflCByvph7KEEqZQ8nqrSU83XwsLtd4TF+4YUXJDVLkGLL2Ljnu2FX2JqrS3gR+xW3HWwNL6jnHpQ5FjVOO+20IdtYZBVVyfOwWEKBeHgpKy70uecUuL2OBuuss076jNrKs9U98j/4wQ8a5yrlMVXm5ElZwbrllluGfGe5+HOt3Dg5CH4OLG1Rm4O7Ta109UifYfTZxz/+8bSN8vfYmysWKCPMT7U8VMa8LwuCcoyq5Mekr5kHpJy7y5zA30vSpZdeOqxrdLBZV06ZY1Ba3FZQPlEQfF7GfjzKB8WK/cmPkrK6QL96/izPH/bxXGf6/AMf+ICkZr4uOUoe1UHfsmiqq7cnnniieoE2e6XPPJeJZyyLl/risFwvfejvdsyh3FNX2MDvKdEMrqyB35NuEkpQEARBEARBEAQDRfwICoIgCIIgCIJgoOi7cLhO8ER+pMuTTjppyH5IszfccEPaRvliJOkLL7wwtRFuctVVV0lqSu+EmxC24JIp0nK/4pI7oS+E1XgSOklt4PeBY/D3/RAGOCeQ4ZGWPdQISdllfy+S4H8n5X4kHKct7NOTK++77z5JOeHfQ5M8TKBXQUL35NFnnnlGUrNELKEetcIRhGXRh7XkU0IgbrvttrSN5HTGrofkuaTfS6y44oqSct946C/zk5eYLcep9w3hqoTUeXgSbSQY+71Ya621JOUQQ28rQ06k/igLW0vgh5122kmSdN55583195TH4HkjSd/4xjckNYtQEPLF/fAxPdpz6LLLLps+U/abogQeisfc7n1Yzo1uI1wfITIeOo5NkVztcyohR7VQbOZL77vRYrjl88uwqF133TW1UZba5yXm8Msuu0xSM9G8LCTjz5SyaIwnoRMGzBj3+8d7jN8j5gSeX17Sem7C4bhnbrvYDeGXXmodu2Nu9ucb85AX1sCmKKTgYX+EYdKH/n5CCCppDR6iRZgqc62HmRMu6HbHeXHf/P2vV5ZeqIV0ch8Ip6bIhJQLZBBqWQu5Jh3CQyfLAgy1OcttkedGrcDYaIX3hxIUBEEQBEEQBMFAMSGVIPeW4YnyBDnKIZJEd/bZZ6e2D3/4w5KkfffdV1Iz0ZxSgHij/RfsKaecIknafvvtJTUVkn5N/K+B2oNn3j0tZSGEWlEA7k2tBGK/4Wqf1PRo4i075JBD0jYS/fFyuOetLIRQ8xixj7fh8aJ8u3udxsIrOrfg+fWSm4wv9xqRxI
rXuab20D/uRWJ/yoXfeeedqQ0PKF5GXxyuV2G84al1myORFW+mlK+NhFO3HUpic8ypU6emtrLfvERuufCq2y7qgZd+nhvP8VjBtXj/LL/88pJyku66666b2lheASW2U0oP7Oc+97nU9qMf/UhSMwG4XADcx7QXDhkNXJ2l7Drf7+Xr8Qr73FPap9sI3l6KQpAQL+Uxz7OEIghSVj2wZVdIKEfsHmo826O1GPfBBx+cPqMyoAz4uwGqDcqDjyUUIC8zT79TNIHrlYYuWkl5eynPm8x/XpCJcYyS4tEdNe88fcfcsPbaaw/ZZyRwba5SYVv0j5fIphT9t7/9bUlZCZfy9bptkbiPbbhShlLB2PPIFRb6ZG7z5T5QgLiP/nxBKffzIoqAUvcbbLBBauuV5Spq93yvvfaSNHTBYilfZ+3ZjE1iWzWV2ItQADbsx2L/2rIpo9V3oQQFQRAEQRAEQTBQTEglyH914nXxhZaOOeYYSdIFF1wgSfrgBz+Y2vAI4J1w7x8eBBaMOv3001Pb/vvvLynH/9YWjpsIeEyu1IzL9hhbqekxwcvOr/nR9mKOBdzj2kKUxCm7Ksl+9Ettwdya2gOoH65O8j210rClUtWL4Bmv5dc59Fn5r0P/uvKKZwlvKh5jKZeExSPVDzlUlBglz8bzbfjsahfXVNqelL30eGc9l4jxSa6Ze+ZWWmklSTlfwD1/HKOcC3qd2nhbYYUVJOVIAfdQs+DxqaeeKikrQ8P9Hs9nwGvt94j7x1IP/ve+eOto4PMZNkWOkntlUXQ8lwN7wcvrygPedo7vig7jFPvzc6BfUE18nKMY+NzBfNltJQi1ye85OTdEOPgzn/Nkvq+pMCg1vh/zvL/PoPaiTlDmX8rPWFQMj0ZhbqDvfXHqWsloVBIUuTLfd6RwPz0ShDwlru3qq69ObeRM8R7m9k+ZZo8K4DNlt7/0pS+lNq6TOc2jg3bYYQdJ+X74vEq/cN9XW2211MZ85/kzKHDYnStyteVSxoqaTTpEPzGWfH+elWxzJQ/li2eT/x33i7HrNsYc4uOT/uHvfPxTXtyja7pBKEFBEARBEARBEAwU8SMoCIIgCIIgCIKBYkKGw7mUSUjQAQcckLZRam+LLbaQ1AzFQa5Doq+tII6Uu+eee6Y2ZPgzzjhDknTkkUemtlqSV7+GyCHVIzt7SFcZduCrfSMzI4t6SFe/wj3HZvw+c30kFEpZ9ufvauFw4CExSNf864nvhEXQ5uEgHvLQq5TJvFIOYa2Vx60l40OtVHn5994/hB+Vq9T3MmWBES/mQHiaJ5MTFkKiuq8KTxgDoS5LLbVUasN22OYlw8tV3z38jmTuTTbZJG1jxfXxoDZ/g4cLYVeExUg50Z8QrRtvvDG18QzYZZddJOUQG0n67//+7zmeF33oK6+T5O7PI76H0MX7779/jsfuFp6wTJgK4VTYk5Tvr583461cGsG3EUrkYahlaKo/J7hf2LA/XyhiUQsl9P26AQVFCIuXpC233FJSHo8+zsrQHg8JIqzIw83YrzbX8e7CPOY2TJgWNu/vQZwPx/bvI/zJl3PA5ikm0K3yxB46C5SUxsZ9HiIUddKkSZLyfCZlG7zpppvStptvvrlxLApVSbloAiF+FLOQhi5f4d9DQSzsm/6SpN/+9reScj/5cdnfbXK0inSMFIqESbmozcyZMyU17xXPVuzHw/oIY+S5TUlxPwb250VBymNL2dY5pn/PaIVYhxIUBEEQBEEQBMFAMSGVIE+e49e4e5T4dYl3w3/Fo3DUkir5TCKxe0c/9rGPScoJkl66ttd+/c8NJLPhNfe+xisCXhyC/fBclfv2I3gtyoU8pWxvngSLTeGpa1ug0Sn7zhOJ8TLVFq+tlaXsNWolsq+55hpJzRK9eKXxGvl1Qq0P6B/+zr8HjyCeK7zcvQw2Q3/4on4oke5ZRznAw+7ePa4f+/Jkdzx3lID2e8Hx+Xv3INPmi7iOJ7W5t7ZQIAoWhXEkae
+99278nXv3P/CBD0jK5b8poiDl5RJQhFx9ZKFM/t69pigM7mnm+YXdjuXSAu6FpTAD48iLMuCJdw8++/EcdFUaO2WxYn+OYtc8kz0JHcWJMeC2zP2bNWtW2kZ7uXTD3MKc4gWVtttuO0nSZz7zGUnNMufYG+qLRwwwPr3UNcUnmL9ZPFrKfYwduW3tvvvukoaqIVKOOqCIhY/Pk08+ecg1cs7YnUcf+HgfLly7K0uoC0RKsOyDlPuC/qHstJTHgtsdyhqFrXxOowQ9+PMRu0OV8sR/iunwbrfEEkukNkpj0yblku/Yn88NcwPPt5pywv1qK35QK4t9wgknpM+M6ZrKyDzKvfJCHjyDGLN+P7hH9IGfA8VP/H2GdlQ+3gVGk1CCgiAIgiAIgiAYKCakElRTb9zrjue+9IQ6/CL1v8Orzy9Y/6WNF6+2iGitLGLNk90PoAChanhOQhlX6zHiKCN4ivv1+p0yz8Q9dtiUq0OlKuG2VSodtYVmy32l7KnDm+I22Q8lnxlT7tXFm8vCg9LQRRf9Ost8IR/PfMYT6gsPkuMxZcqUxnf0MngvGXeu0NTGFPlBlIf1vKdy/qsdi9h495oy9ms5F3i7PReuH8AGUCeknBP08Y9/XJJ08cUXpza87eUC2lJWFI899lhJzXGIRxv1hDyF2XH++edLys8QL0M92rhCQ84L9uNLHBCrT96Gw1zl445tqEu+8GqZM+A2yRjG03zRRReltvXXX19SU3EarcWimSdcEbjwwgslSeedd56kpv3Tdx/60IckSddee21qO+6444Ycn4VJyWP2MfvAAw9Iyp54FluVsqJDH7i3njE7Y8YMSc0+51xd4SkXrfRnDiXyR0JNsWMeIocJFUfKSgvKgPcF6pQ/K1EjiDR57LHHUhtjj3LWrtAw1slFcdtHFeJcXFn86U9/2jh3Kee68fxyO5ybvNNaWWu21VSeNqZNmyapqfBhE8z9/jzEDhiPPi7p81r0FMe/4oorJDXnUHLqXRUsj0V+0mjS/2+iQRAEQRAEQRAEwyB+BAVBEARBEARBMFBMiHC4MkTGw4wIlfHkK0JvCBFxeRNZkXATlyAJCUEudGmzXMnZz6EWllSTNvsBwq84f08ALctquuSOfIpE7yVlxzLEo5uURTRcEqfcpBdGIKmQ/dtKQDvsVyuxSqIr4TLe1g9l2K+66ipJzXAZxt65556bthGe0FYQAXyfMszQQ1gOP/xwSdJpp5028gsYY5hXCBvyJHpszsMRSObHDn11dcI0SGwlAVjK4Q+sPu9stNFGkrKtergHx6+VQu0VOG9f4oCwmU996lNp2wYbbCAph12dc845qY2keMKg+FeS9tlnH0k5HMbDjAhZuv766zs6VxKLeVaNZZEdDyVifNbCdT0kGmivLQdAuBZzlttk+X1+vdw3Qt9YQV7Kz2IPS+xWWefZ4c83nvFLL720pOazgEIZ/OuUz1Mph1ry76qrrpraKKRBn3uJY8Ycz9+77rortTEPcF88/JiwJw9D4/iEdvpzxQtJDRfOw0PlmWOY4714DWFt/OvvXMyFhLdJeZ7j+ITfSXmeY3+3LZ4LhPN6KBh/R9igF01gfy/fvPrqq0tqFg+B8UgDoEz4N7/5zbQN2/U5jWUNeFf2svaMWfrV7x/vzwsttJCk5nxHSOdJJ5005LwY/9OnTx9yLLjyyivndHlzTShBQRAEQRAEQRAMFBNCCSrxX6ngngxUCbxT7rXBQ4LHwQsc4EXxxa+AxP9SHZDqv/77VQkCSu566fHyV7x7ErgneBS6XbZ0PCgVCL9evMC1cta1vy8XRHXKstD+PSQX4wX0xO5+sLFauWLGnid7MlbxUHpf0lYbl4xDV+T6GeYx7nttIUhfCJH5jPHqnlQ8uiQdu0ef/sLDjZdPygVQ8Hr7fUIJr3n3e4VPfOITkpo2R1nez33uc2kbCez77ruvJOmoo45Kbag9p556qqSczC5Je+yxh6SsrG277baprS
x379RUYO4fhXfGEvewYyPcXy89jErlRQnw8mIbbnfYRm2hZOyNY7mCxOeal5/vc+++J2+PBj5/oxxQxtvtn3vI3Oyl+EmidyUcxQGV1xUalBTGuCuuKCgoFf7sIWIAdcwLWxCd4feP8c8xvDS1K8bDBfvxSBD6g3cJL1jAeaMEeZ9jW67M0O/Ymyf+U26ZBZHdtnheYzOulGHr2Kkv70HhmMmTJ6dtjzzyiKR8v/3edmPZCsrrS1mR5jy8GAbjElvxUuiohAceeGDaRt/xfuHjkntUewehzyhnP3Xq1NTGEgI1vJAUlHOgq62jRShBQRAEQRAEQRAMFBNSCfJf+Hg0/Jcrv2bxfLpXAu8Ux3APVrlglHtOiBvlV7d7eyaKF1rK/YMXzpWgskxjzROHd6FWNrzfKBcqc0URb5bHFneySCrqjdswYEfuWcKzxne7B7IfFumtqVVcp3vcuC48Rd6H9H8nZULb9mlbaK5X4FopzeqlXLE5vw7GILkZ7onE646doBb59zDXuW3jYawtQEubbxstsAH3HpYKi48BvLvErPu8D672oArhmf/ud7+b2nbZZRdJ2TN/xBFHpDaeBZTPrp1zbWzW2uh39/COFT53Md+jDrnKwLbll18+beNaGMv+HMU+6X9XM8oIDP8ejlVbSJXIAp8XVlllFUnt3ui5oW2OqClYqEQ1fPHSthxZV3lL6CvvM/BFlUtquSujBTmgrkQxLnlnuv3221Mb9xNbrD07XQ3HDnhe+HyHgsg2/zv6p7Y0ShnxUVuk11V0xi+2fMkll6S2NhuYEzvuuKOk5jxEqX2eA55ryLhEbVxjjTVS2/HHHy+p+V7MPWEO9zb6E1v25yi5RDx/WGDb4b3P/w5b9gWg6TO+pxbp0G1CCQqCIAiCIAiCYKCIH0FBEARBEARBEAwUEzIcjhAFKct4HvqAXEnohIcfIGsiy3lIDvsjF7rUigSNRO4hcDUJtx/KF9cgKZJSnV4us0xCd1meUBva2mT9fqOW0EwIjRfWKO3NwymwEeRil43LsAsPJfRCCH6c2Z1XP0D/1JLrsR8Pe2IslaXypaFFE4a7snavQQguiep+rYS8edI6Y5CwOJ93CAEhLM7LYRNCwTGXW2651MYxuBc+13HMsSiRXQspa7N5xuL5/1975xkgS1Vu7eVVrznngJKDR5GckXBAsqiAIhIURVEEP1NnDAAAIABJREFUEQRFwSsmQFCCInBJJkSUoASRnFFyzgIGQMw5p+/H/Z7aq2r2aWbm9Mzpnl7Pn9Ondk911a69d1W9Yb2nnipJ2nrrrXvun9A41nsEFaQiib3eeutJaoeOrLXWWq39+FidaIgq0rteimC6WHjhhZvPt9xyi6T2egaMEQ9543ts8yRr+pO1ykVyGM+Mcx+vhFjyfZeOZsz72ujHHwYLF1oI42PWrFmSpFtvvbXZ1n1ePeWUU5q2pZZaSpL0zne+U1J7DvLc5s+3zL1aGRPSHniOc1ESBHJ6hXHWQke53/jayflMZ7pEPEEhhBBCCCGEkWJGeIK6idGeZIgF1D0PvSzrvP3WrNFdq7In/2INI5HLk0pr1r9h9QRhmcSK59aFrgCEWxnwjHCtXFBhWMFqwbX2ImFcXx8HWGto83FBv/Qqpkaf+5jEil8bT/OiMNtEqVmI6Bf32tB3tXnZtf7XvD2M05qM9jDBHGPN8rWLfsBLJBUPDmPVE7aRH2UfCy20UNOGpQ/Lnxd8JgmXa4FlXyoWfJfUniqYb/77RAH0kuAnQdoTnHuBOIEn8FKkEk/s7Nmzx/wdkQJu6ZwojHO/p00XyBNL5VrTZ75+M6d83OEJYrzWRITwBriIEL+D7LELf3CP5be9GCrH1cuDHsIwc/rpp0sqAglSmUvMIST7pSJQQEkDX4eYJx7NxLMv88yjfHiO5r6AV0oaX7H7WikM1g2/J3E8vtZONYP/lBRCCCGEEEIIfSQvQSGEEEIIIYSRYkaEw3VDgdwlXksWxu
WOO8413nGh868ne+LKw9XvGua48QhZ8roOtXo5w5q0juuSMI1eYX0eDkc/4hadCbWTuuOHUEGpjDv+lYrrulbfpBfd8DAXPyAEhQR4T0gfhnC4mpgB+Pwi3KVW74a/rdUU6Ia/jadG0zDA2PFQIsIfvCYIAgeEQvhaB/Tlvffe22yjmjfXx3+H6uTs0+c5Y9RDKaYK6mV5PR6O8yUveUnreKRSyZ2Eea+3dcwxx0hq9x1/S7idV4Vn/Vt88cUltUNbGWMTDYPj2rj4ALU+VlpppQntqx/4eka/cK19Xt12222S2oJE3VosvtbRd4S8eVhit2p9rb4fa95ZZ53VtL385S9v/b00s8R3QmAt9/CzjTbaSFJZ+100gXWI9cufuVjX/X7A93mW8BBWQmOpBTRRYYvaswjPkh5qzbPy5ZdfPqH9zw2D/5QUQgghhBBCCH1kRniCuhb1k046qfm8yy67SCqWQakkzfJm7NZh3k6x7HkCaM0S323DSkolX0n6/ve//4jHPCzQZ5xvL8lXT27DQkeS8dwkCw8KWEXpA0/wI8nQrZxYdWueID7XvBH0Nft36w3SuV/5ylcktRO0e1UJHwbcUs88ZPy5tbybYOljEqsW+/Jr1MW9moPqFcLCTn+4RwzLup8/6xGWOJ93jA/GnnuvsbYzVtm3VKz7/Lb3aS+PU7/BIkr18xo+/1784hdLKufplk48Ou7xYv3Cg3TggQc2bWyDqZKmZ37vsMMOkqQ99tijb/t+JBDOkIo1GYswnhdprJy6VK4/nmm3BLNf5qR7kIjYYP65NDhiGxdccIGk4oGSilCFj8VLLrlkfCcawhDAur7ppps22xZccEFJ0rbbbitJWnnllZs2hG5YA/1eUXuGZVtt7V566aUlFY9TrURFr2eY2prI9+6///5mG56g6bz/xhMUQgghhBBCGClmpCfIZWOXWWYZSdJFF13UbFtxxRXnuC8sSd0ig1LxZtTkoeHKK6+UJK2zzjoTOuZhgWKpyJP28gR5/2BZ7mWJHza6RQJ93K255pqS2sVMybOoWWHwcNT6k21dmVqpWGG32WYbSW0LzTB4grD41LwwhxxySLNtv/32k1TyOTzHD+9DV4Jcku655x5JxWt39NFHjzmGXnlJgway/Hg13CNGn3ixzk022URS8Ux6ngTjEEu8e3vID2I8ea4ZXhCsgu6xY074tnmJ9w+eo/FIuk6UiRZBrVErRHrYYYe1/p1OPIdg5513llSsxO7lX2211SS1JasZn3gNKa7tcE/oJVvvax1r6e677y6pXYybbT6H3VMUwkyE6JKPfvSjc/wO90WiAqTiBfeyHl2ZeY9g6kr0155fuVf0imZxuO880rPyVBNPUAghhBBCCGGkyEtQCCGEEEIIYaSYEeFwvcAd7xKjuP0222wzSSWpUiruwZqkH+EmSIfedNNNTduZZ54pqR0SNRPBRXrOOedIKkIHNU499dTmM+ERLuE47HRDYDzUiJChbgL1VEEoSi1hcdjgHK666qpm29prry2phHUttdRSTRtVrAkn8jlIuGBNsrkbBud9N6jhqh/5yEcklfBeP85aIvi3v/3tKTuWI488UlJJmpdK2MNnPvOZKfvdMP08/PDDc2xDznb11Vdvtm255ZaSypxk3vr3uJ/ecMMNTRvh0sxhl7y+5ZZbJJUQIGeFFVYY76mEMFIQ1uohrFNBr3vmIIeaxxMUQgghhBBCGCke9Z9BfkULIYQQQgghhD4TT1AIIYQQQghhpMhLUAghhBBCCGGkyEtQCCGEEEIIYaTIS1AIIYQQQghhpMhLUAghhBBCCGGkyEtQCCGEEEIIYaTIS1AIIYQQQghhpMhLUAghhBBCCGGkyEtQCCGEEEIIYaTIS1AIIYQQQghhpMhLUAghhBBCCGGkyEtQCCGEEEIIYaR4zLw+gBqPetSjJvSd//znP3P83mMf+1hJ0nbbbddsu//++yVJF1xwwYSOa+GFF5YkbbXVVpKk3/
3ud03b8ccfL0n6/e9/P6F99qLXec2J8fTd3HD++edLkv785z9Lkp7+9Kc3bf/6179a3/3nP//ZfH70ox8tSXr84x/f+leSlllmmb4f56D0XW2cLrjggpKkvfbaq2mjP/nO3//+96Ztzz33bO2TvpTG9nk/GJS+22GHHZrPK664oiTpV7/6lSTpjDPOaNp+8IMfSJL+9re/SZKe85znNG3/9V//Z+ehP5/61Kc2bb/5zW8kST/60Y/6dswT7bup6LeFFlqo+bzhhhtKkpZffnlJ0n//9383bcxBxtC///3vpu3ee++VJB155JGSpB/+8Id9P05nUMbcMDIv+s7/njlWW4s+9alPSZKe+MQnNtue9rSnSSrjj/9L0qGHHipJOuecc8bsi9/hfCdz3l0y7iZP+m7ypO8mTz/mvRNPUAghhBBCCGGkeNR/+v1a1Qcm+8b72te+dszfY51y6/msWbMkSX/4wx8kFQuyVKyhfN8tp095ylMkFQ/Qww8/3LQ961nPkiT97Gc/kyT94he/aNouv/zySZ3PoFgLsMBJ0u233y6pnLv/3l/+8hdJ0vOe9zxJ7eOnj//xj39Ialurl156aUnSz3/+874d86D0Xc0TdNNNN0lqe8NuueUWSdL8888vqd0/u+22m6TibcS7KZX+7Cfzuu8OO+wwSdJzn/vcZtuDDz4oqczBBRZYoGlj3OGt+PGPf9y04dH47W9/K0l69rOf3bQx7i699FJJ0ne/+925Pvbp8gQxBvz6b7DBBpKko446qtn217/+VZL0mMc8Zszx/elPf5JU1jgfj3jMnvCEJ0gq3m9JOu200+Z4DJNlXo+5YWYQ+2655ZaTJF1zzTWSpDvuuKNpwyv0uMc9TlI7YoDIAuZ5L8YbDdKLQey7YSF9N3nSd5MnnqAQQgghhBBCmAvyEhRCCCGEEEIYKQZSGGGi7L777pJKKNGXvvSlpo2EaE80JwyEEDZCRaQS/oHAgYsf4L4nkdpd9vfcc48k6QUveIGkEsIkSfPNN58k6cQTT5zE2c17Nt100+YzLlnEJTxckLAGQnDcbUl4IX3urLrqqpJKmM2wwLihTx7JTUv/nH766ZKkvffeu2n76U9/2tqXC05cf/31rf14eKKPXamMUamEIHq4yTDAfHHBAsYP89H7hP544IEHJLXDVJnr9AsCFFIJS3zRi17U3xOYBmohaO9///sltQUOGAOc95Of/OSmjRBBhCQ8sZ31j5C3jTbaqGljnk6FKEcYPpivH/rQh5pthKbfddddkkpYqlRCLZ/0pCdJaoejMyZ/+ctfSpIOP/zwpu3ggw9ufWcAI/lDCENGPEEhhBBCCCGEkWJoPUHrr79+8xlr5W233SZJeuYzn9m0YdHEwyMVa/L3vve9Md9/8YtfLKlYrjxBHevUC1/4Qkltaz1WLZK0r7766qZtpZVWktRODr3xxhvHd6IDgEuY/uQnP5FULOvuDcMCTV+7RRqrMdZ6T8JGenzYmKyHZZ999mn9K0lf/OIXJRWxjmWXXXaOf++W034d07wGKV2peH1czhqYX8xFqcx/Eq5dUAEPCHLYPiaB+e9S7V3v26CA14vzeNnLXta0LbLIIpLKufr3+fehhx5q2p7xjGdIKl5y93ozXxGkWHTRRcccS1dExv8uFMYrZFIToeD+416Wecniiy8uSfrCF77QbOMeQCSGVNZ5xsMrXvGKpu3Xv/61pDJ+XOQE7y8iJ9tuu23T9qY3vUlSET7xNh/XIdRgncJL6fcJxuKpp54qqS1s1YvxRoGEwSWeoBBCCCGEEMJIMbSeoBVWWKH5jOUTD41bhbAsYSWVSp7AGmusIaltPcfSvNhii0kqniHn+c9/vqR2PgZWVKRB3YrN72288cbNtmHyBHkeARYPrJbed1iba9ZOrM183y0nvv9hYrXVVpMkvec975HU9h5gWapZxjl3PJFSsW7Sr+xbKoVCa32HR46+Z/xJ0kEHHSRJuuqqqyZ4ZtMPcf
5SyRtzSVDmGp5I5pRUPGM1T9Af//hHSSU3gf9LxROCJ6ifhY77ifdD15PF2JPK+Xtf4jFjzHjOGP1MDpXPW6z1SIp73tTJJ58sSXrve98rqXiL5nTMo24lra2Hm2yyiSRp5513brZxrTynDS88eTeM/+mGSIejjz5aUjt6grno+XZ4h5ibvg4yPokA8EiD7n3F/47PfP+YY45p2igKHILjUSaMlwMOOEBS8fpI0pJLLilJOvbYYyW1n88+8pGPzHH/w762uRefewvn1EuSu3bePHu84x3vaLYR/VTL/yPqxcFrfvbZZ0sqhdCnkniCQgghhBBCCCNFXoJCCCGEEEIII8XQhsO5Cx0XG9tWXnnlMd/zsC3CYFwsoUtN5hmXKbLQLg9N+BuhS4STSCVUjpCGYcNDaOhHXKV+HQhXICzC3amE6hAa4n3nks/DBJLESIi7jDBuZh8/uJvHE/7Hvh1CNWvQ97V9b7HFFo/4e/OaT3/6083nb33rW5La8ri4yTk/d+N74rnUDhl7+ctfLqmMsbPOOmvM31155ZWSpsf1PhG6IghSmTdf+9rXJLXDgklGR/JaKmvdfffdJ6m9DhJiSJvPSRLZmcNIuEslRPjyyy+X1L5OhGD6nB51sQRCQaQiL07ItosC8D0XRlhrrbUklTIQu+6669Qe7BzgGjP/vOQEaxxrvFRC+371q1+1/k4q4eTcF32NZBvjx/uHfTJuXbRol112kSQddthhkzm9oaVX2Omee+7ZfPb1VWrPz5pYzDBR6wPW9q985StN2xvf+EZJJfTcOe+881r/+t997nOfk9QOXZ0p9FqbJxrq94Y3vEGStMMOOzTbeBZECMtDg3nG9vBfnqNf/epXSyphw1PJcD59hhBCCCGEEMIkGVpPUE0mGIuGe2F423RJZqxYWJbc04GFufaGjJWKpE+3RncTv9z6R+KoW8qGCe8LrHC80bsVBqsd27xAHn1A/7gXblAT0h8JvAxYdd3KXpPO5DPWUbemuhXe/97/rleiImPfkw2XXnrp8Z7KQHH33XdLaif4v/SlL5VU5rMnYWNlriVh8xmJ5+uuu65pYx94QgaNmoUWaWKSUF12n/O55ZZbmm0kBjNv3UtE3zD2GM9Smd94yWoeRjyTSBdL0nHHHSepvQb3WlNnIlih3/72t0uSVl999aaNtQIP+le/+tWm7YgjjhizL0R/sJBusMEGTRvJw9PB2muvLanMSb+XXXPNNZLa5RIQIOFfH8uUspg1a5aktjQ79wLWRi+uzXhDAMnFUWbPni1p9DxBNdZcc83Wv1Lx/Oy///6Sxu/9qXmjB42ax2LzzTeXJL3rXe9qtuEBGo+s9TbbbNN8/tjHPiapPCf2KlExzPRap4866ihJ5f7h9xHmI/eRO++8s2njPuJzHFhDvD9Z73zeTzXxBIUQQgghhBBGiqHzBCF/62+rWCuIf8daLEn33HPPmH1gFeXNvpYbhOXDZbDZhsXfvT1Y9mpWqptuuklSiYWWhss66v1Jf9QKJVI8lkKxbi1kH3zfY5LdIzLouOTm8573PEnFOurextp15Zy7ccu1NrdSeR/PCf7erSoveMELxhzzoOW99MLjh7sWyVqeDOfuf8f32Fabz70KWA4CXhCVgpRY091DwxrkY48+4VwpQimVtZQ1y/N+8Cji3XVvOR4n9u2y43vvvbckabfddmu2DcMa14telmMkxD2Ph/IKt956qyRpxx13bNrwdIy3yCLWfGTIp7OwtK9PHC8WXS8dwThy6203WsI93dwHyZGln6SypiLB7XmQ/Da/5wUtufePCr3Gzwc+8AFJ7WefVVddVVLJJbz00kubtiOPPHKOvzMdHqBuvkiv3/TzZXz6+r3UUktJKl5VL2jf3cd4ZfxZY8lVdW9sjWEqoOp90OuZFE8rz8oeicH147rVIqvoC48QYI3wexjrI/vwZyp/Du0n8QSFEEIIIYQQRoq8BIUQQgghhBBGiqELh3vJS14yZh
uuuUUWWURSCe+QpKuuukpSXagAV2kt0Y3wGw/bIgwOV723UVEdt7wnifLb7s4jzGQYwpM8JAE3JW5UD9WiP0j29crB9AcuUw+18HCaQQe3sDRWmtndyDVXeK+wq4lUa65Vea6FifG9ddddt9k2DOMNauOi5nInnJL+9+vC97siAFKZj4MuzLHddts1n0lIJcm3dj4eXtor5I8wBPZFUqpUwhAI9XTRD8YV3/EQw9VWW20CZza8EJK13377SSohwFIJBeyVQM0897lcC0Phnnb77bdLas9vQn+migUXXHDMNs7J7wkesgKcH/dmv1cy7hAmqoW5cD8lDFAq9236zNfI6UykHgToTx8zCJQwj718BfffxRdfXFIRVpHKPa1WSoF91BLb+8Vkw5xqaxoh4Oecc46k3qHgvcLVPHVh+eWXlzT+55RhCIODWgkP8HsLfc0c9DbuEWxDxEUq6wX3WASOHA/RZk6zRhDaKUn77rvvuM5posQTFEIIIYQQQhgphs4TxJvkz3/+82YbFive3g844ICmjSRVt2Tyfba5Vav7tu/WFBL9eVv1N168PYgguPUd661bw5CjHQbLvEsVcw5YBjxBDs/aFVdcIan9Fk9CnF8HcBGJQWellVZqPne9N26Vwxvh1uCud6eWmFnzBHUL1PrvYH2pSZaDW/2GiVqCf62tK0/vFkIsUHiQfQ5iUR50YQRPQmfNIYH8Zz/7WdPG+dQ8WySRYymVyvjFIu99jJRzzRO07LLLto6lJmziXoobb7yx1+kNPDXLLvcfinTW1rBa4nZXGKY29txaSr9jLXUBALdWTwV4oaRyrUmMdu8fghrunWUdwivh9xAk6ZGtdwEd+gUPkHt1ub9wT/akd0QW/BhmqpSxVPcaIsn+k5/8RFLxVkpl3PGs454dhFfOPffcZhvCKxTA9ELv/RBL8Ou63nrrSZLOPPNMSW3PYjfp3q9p7foybk4++WRJ7TIRE3nWYmxK0ne+8x1JZQ66PLSLc8yJ8QowzGu6x4aYhlTmHOucXyPuHzyL1O7bRE/5OsD18N9lTeP+44Iz8QSFEEIIIYQQQh/IS1AIIYQQQghhpBi6cDjclO5OJemPmg3XX39900ayYK0+CK5hD0nge7jo/e9wtSO84OIHhEfgMt16663H/J5rpBOmNwx4uGB3m7uGcX3SB36NCNEhdMkT64YpHM4TvwkDqolEjCccrkbtO4SI1MK32NYVAJDKNVpllVUe8XcHkVq9Ac7Pw1YJF/NQN6CvmM9+jXz/g4yLupAgT9iAz6NagjrhVITP+fiiLwlHqNW5Yt6SHCyVRGbmvie2sl5uttlmzbZhD4frBWuXr3X0QXd9kEq/8m9NGIH7mFRCRn71q19Jqgv1TBUeOtkNNfVxBy5OwHHSBzUhHM7Tw+HuvvtuSaV2kIec+71Gat832OesWbOabf4cMIyMp97MNtts03zmHkCYoF8/rhv3BO9zxp2LThEKeeedd7b23S8IgZNK3R1CaD3MsVuX0EN9CenzbYyp2jMFv8Pzm9eHpA94jvPxfeGFF0oqIV2bb7550+bhXcD9iH342nnZZZeN+f6g0A2x9GcdxmK3VqZUzpO1zMOjuafce++9ktrPLrU0AkJr+T0PZ9xkk00mflLjIJ6gEEIIIYQQwkgxdJ6gW265RZK01lprNduw/iD1eu211zZtyDSTaCWNtVK5NQ5pvlpl3K48qFtHsILxe/vss0/T9spXvrJ1fFI72XnQcUsLFj36x/ugm5Dt0pf8Xc1r4n086HjS8h/+8IdWG9Z2qYwxT0CteSomAtYXtwxi+cKa4onyfB9Z1GHDrU1di2DN61GzVmJ5qkneMyZrScaDAInNngCP5ZE549bMmjeQ+eYCL8B6Rt/6nFxooYUkFSl//3u+h2fbrXt8HkTvYy/L+kSrvPP9msABn7siCNLYsVa7Vssss0yzDREBLKRuGa3JzfYT90CScI7QgXu38NB45feu4IN7Xbn3MV
59/Wcssi+PtgDuKz/60Y+abVjp3Zsx7J6gXmPxoosuktTuc7wYCB34/ff++++XVK5VTRrZv8+6Mp7ohcmwzjrrNJ932mknSUVIyaX6uX+yVuMRlYoAhHtOSaRnHJx11llNG94Ixph7ixjrrLnu1dx1111bv4NQgiRdeeWVktreK9ZovLgu816LqpmX+NrUvX+uscYaY9oYI35vZh9sq3mJmcf+HM7zt/cJ96T555+/9X9J2nDDDcd1ThMlnqAQQgghhBDCSDF0niBkFPlXKpZut4oAFii3yPNWylumvw3zmbfT2ltt14IqjX2LPvXUU5vPeICQbZSkhx9+uHZ6AwkWF2lsTHgvGWPP2+A68Pf+d27RG3Q8th1rGpZNv6b33HOPpCInLBUpScZNzQrclcr238Ri6p6BSy+9VFLxhnqhYH5vWHFPW83LAXiFalZjrFNIC/c7tn0qIW7frehYaLFc+jljWfNzxNpJX9Y8R1j3fD2jn1lTvd/pS/btHlCOYTzSsYPERKVra/HsXSbqYaRf3ZuB9fOBBx6Q1M7l8IiHqcA9TRwbVnofY1i+fb1h3rHNowTIIVtiiSUkta3DWPrxFBD5IRVrO2Pec144PoqsDiI1b+NEPZBnnHGGpDL/3YtGX5NH5d5bttVynRmnnsvB9WLt8by+fljka7kxrE1+j+XZgOPwdYXx4J6EN7/5zZLKvdiLm/PMUpPKpo1x6yVYeNZkm3tq8ax5bhDlTzhmPz7f73TRa4zVCq/TFyuuuGLTxhzt5if79xlH/jzNPYLnb5/rPB+6F22FFVZo7QsPpjR10VPxBIUQQgghhBBGirwEhRBCCCGEEEaKoQuHq4GMYw3cop5IjdsPt12toi/uYncbAy5TDwnohre98Y1vHP8JDDie2EfIG/3k4YLd8I9a8iV9598lYXGQqVVn70rUenVjwuE8lLAWRjkR6EMPiUECnnA4r/pdq0bPeQxDOKaHV3ZFJWry1oR1+Xdxw3OtPBwMcO1PteTwROE610L4CF1hnEklJMPD07p94vOuu9Z5W1fYw8NQCMEhDNnHEsfgIV2DwkRD3iYSqlS7h9SSymkjlOs973lP04YErI9R+pNwJJ8T3/ve98ZxFpPHxxHHxH3U1xbGmB8bIU2MLQ/bZMzSP4SySUUAAoEXv/ewL8L0vJ+YD57QPigwl2rzmH7yUDTYfffdJUlbbrlls42wSMLEPOSN64CMvkszd9c9DznjM+ISUpnvHJc/P3HPmQyEPtVCRTkOX9v5Xnf8eZsLeBx//PGSShjvBRdc0LSdf/75korAgfcB16gmef3FL35RUglr+8Y3vtG0EdLlx8W+Vl11VUnSzTff3LQh/T6d1NYjtnlfM6cJKXQRCsYB3/fxxPky/vzadsWH/NkFGXaXvmf+kkLi13aqnlniCQohhBBCCCGMFEPnCapZ57qFFB3eYP373e/5GzJvui4XCbyVYtFxS4LLX8/pmJ2aZXYY6IpJeB90hRE8IRAre9dqIE3cQjsvQObc6VrvPOHSJdJhshLZjDdPRoQPf/jDkqQDDzxwTFttbOExGgZPkNO1orolmvNkLLrVuSsq4ZZTtjHnu+N3XoPF2z2HJCxjUXVLGVZJP3/mVi9pVry5te8w5rzfsObxfbeeMic82XVQmGgS+kT+rtZW27bRRhtJkvbYYw9J0nzzzde04TU+55xzmm3dsgO+hniS8VTgnlE8EPyLdLpUvH4+RrDEM8d8nK677rqt36l5INiX/w7zAG+RC6ewrSaONK/pyvr7va/mAXrd614nSVp77bUlSWeffXbTRgFLxopfIzwc9JlLQDPH6cNalIavscxxvudegbnxZnCMXmAZGFtLLrlksw1BB7ySfk6ciz97HXbYYa2/8++zLrLe17wgzC/3fB900EGSyhj2Y+e4vKgv0uw8D9Q8pPOC2nrk4lWwxRZbSGqvPfwt461bHsTxNarrcUI0Qiprgwuq0O+MRRc6qQla9IN4gk
IIIYQQQggjxdB5gmpvs71kb2v5F93vu4Wdt3asNp7zwm/z9245rcnz9jrmYfMAARKpWD68f7pWLe8f+hOr6rBJ6Lr0NHQ9fG5ZIm7arRy1ArNzouY9ZP9eEJXfOeKIIx5xn9Jg5mqMByxdefoOAAAgAElEQVRo/Ot9iNWYvva2rrfHY5LJd6l52AYB8rf8mJFYpc0t33fccYekdvFACpqyrrnFkrWRNreaMl+x/Pn6Rn9hGfU1AAuhS/d25ZVnEr28RBT8o9iiVIrQEvv+7W9/u2n72te+Jqlt/VxuueUklWvj16/mRegn7s3rSuP6fbVbOFYqaz/jwddGrM+MO8+3ZL94S3xsMSZr8vD8tucXzAtqecbc62vRJeSBeRFc5s51110nqX2eeBnoA5cNJreWsVWTrge/foxd91h0r5t/3z2VkwXPnYOHxctKMN5rOUHcC5hnUul/ztc9ZXis8XB4/hjrKL/nXhCuH2PaPZd8zyMTOFbaBrFURW3dwsuIPL1fZ9YC5n8tn6omcU7f0dee48c+XYqbewTeRr83e25VP4knKIQQQgghhDBS5CUohBBCCCGEMFIMXTjceKiFfLi7suvar8nGgrfhmufv3X3cq5rtZBNyBxHczPSn90E3rMhd8Ljv6c9aQt4gU5PI7rrcDzjggKat60KXxlZWdnrJ6hLehbvZkxIJV/jYxz7WOpY5/U7tPAaV2vEzxmrzmbC4WvIp4Q4elsO+JitYMdUQgukhQcw3QmYIl5JK2IWHayy22GKSShhdLQyVPvG/4zN96WF3JOSzziJ1KpXQPQ/9WWKJJSRJV155ZY+z7Q/d+eNr7kTX317fr81TeO5znyuphKr6/YhQN5KPvbzDDjvsIKktMMAaQzicj9+anG8/8XAqQixZX/z8SdJ32VzWPcakz8lrrrlGkrTgggtKap8TIaq1xHnOl/HmY5l9TKdEtt/v6I9eMvtIJu+8887NNkJFkWGWSsgXSeR+HZj/SKYjdCNJSy21lCTpvvvuk9QOE+uWS/Cxw/2ktqZyTf35ZrIlHqQyjmqJ9bXQSWA81ULJPVyYcCquja+dfGb/vZ4J/Ri4pvSZj+Xa8fC3jFPCGqV6GGA/qYkSjFe0Za+99pJUQtF8zSd0jT6srauMC+8T7uH09ezZs5s2JMvXW2+9ZhtrwjbbbCOpLchx9913jznmfjCYd/8QQgghhBBCmCJmpCfILZq8xbtXwi0HXbpSgE634JN/hzdkrIBYXmcavNFj+XJrfdc66t4erklXZnxYIKHZ6VpTNtxww+YzFlC3FnLONc9gLwt2t6/duoVHYP3115fUtgTVJJ/dOjjo+HliZcLKhsVYKudJArH3Af3DuHVrNfN4XsqW9oK1xI9vhRVWkFSKbXqhYcaMW8iRQq95vekn+gYRBanIztJfXtCSfbBv71O8GO4Jcs/GVEMfcL6+zkyFJ559upgB1wgLvt9vKGBZ83JiJfd7B5+RPe5KZk8l3nesZy6eAXgu/L7bLartVuWVVlpJUpnTjGVp7Hz1Y8BTgLXfxTfop15S8P2ml7iRF0vnmnPvc48ocwgLuFTKMXDuFEaVpJe97GWSpFVWWUVSO+kezwPesPvvv79pY/7WirNi5fd1hns3a4QLEsyNkAxruq81wPrt/cpxdKNw5nQceF1r6z1jkH362GLcsH+fs+yj1k+Mb5eA747hmldpMownoqjXc1WtWCprlVSEES699FJJ9WeFWsQKn7tCO348jOk999yzaauV9aC4LaUE3Avqhef7STxBIYQQQgghhJFiRniCum/GbpHi7bRX3H/NosDfeRufsaJ4bCxx8lhjZqonqOuxcMtGN5bX+6ebm+UWmmHAZYehW4jSrZA1K1CvArnd8dkrT83HJFYnPCNukapZjOa1hOxE8L7rxnPXLF5Y+rzv+NwtWCiVPh/UnCA8Om79v+eeeyQVq7hbkDk3l1DHuso+3ALM+MCy6XkC9DOWVZfp7Y539/pgta
bQq9Tb894PfD5wvWtzjGNijb7ooovG/N14oe9YF9Zaa62mDcs/12PLLbcc17Fj6fTij7SzXk5nQV9fz/jMOPLjxvPga3rXou73AvISiff3+cf3a2slY5F9+Vhmfk/nXPZxvfHGG0sqXi5fZ8jRIafB12jyeHweMx/x3rg1nPsvlnXPneLv2JcXxMaDxHc814I1pfasw7rr0sb9GIPdHKU5tdGP3OfcC1PzHE0kz9PvIZwT49yPobtP3zf7cA8p+2Cc+7yYm1zoiXqyu97/2t8fe+yxzWfWHeZZr3XV+4AxyXz0CA7GLpEqNXn1WpkVnmf8Gk1VPtVg3v1DCCGEEEIIYYrIS1AIIYQQQghhpJgR4XBdXvKSlzSfuwlv/hlXm7t3u23uquu63D0UDBf3dEp0zgsItZlvvvnGtHmiqtQOp+gm9w9buCDXtebG77qDpRIiVHOd1/7fTTisua4Zp/47fCZswF3Y3bHs5zEM1KpSkxTt/cr3aqFyHpYitUNYCDeZ6nCtycJ5eQgqaw7n6KG/hN34+fO3tWr1jDHWOO8r9kH4hofD8Xf0n49H5ndN5naqqIVGkmzvoUfIhbPNx1AtHG48iciEM62++urNNgQtXve61435PteUdcT79dWvfrUk6Ywzzmi2saYy910QZKpxaW/6jOP1PiG0ymWPCZsjjMrFMZAFZ+z6dSBhviuEIpVrRNgr/ezHMx3hgjxfvOENb2i2IYzB8bs4CfdBwiSXWWaZpo3+8XAqxgjX2uflDTfcIKn0i4e8sS/uAX4MXEtCijbZZJOm7d3vfrekdmidP/f4780tPD8wFyXprrvuklSXgec4fJ0Dxlhtvavd+9hvbf4D48fXy67Ms6937MPnCqkR3FcoESBJ3/ve98b85tzAPOkl5AUIO0jS2WefLakdHo/IDts8hJV9MbZ8/tMvXCt//kZs4fvf//4cz8H7ExiLK6+88hz/rl/EExRCCCGEEEIYKWakJ8iTcrFS+Vtt9222ZlXm7dbbsCRgJXCvAFZRChw6M6FIKnS9Z25d6ApM1CQZsRZg0R8Wat4/zpexVSuY6xalbvGyXnKW403w7Xox3BJak6zsVeRx0HBPEJZkzsX7jj6oFXIDLOo+Rnsl5w4CXCv3sHbHnFv76BuXI2a8YmXtJVbisrUk83INasXx+Nf7kd9zK7TLR08FteuNtXfHHXdstu2xxx6Syljy466VNui1biOIgEfcLeu9zrc75vx6YPn3OdotSTDVBVId9+xwX8NqW/Ni16zRrGN+nng2GGM1QQV+x+drVyik5kHqen6nAjwuBx100By/49eQOcS51O4J041HcnA8fk27BVH9Oiy77LKT/l281V4gE0EQrqt7ErinMg+uv/76po1134Ujusft9z7WQLbV5LYZP+7Zga4XXqoLBdx+++2SipfOz2du5MVrXp9ektvM2Q984AOSSgFSqXgEEcWQxgpNuIeGZ99uYVT/3qKLLipJev3rX9+09fIAQS8BhpqUer+JJyiEEEIIIYQwUgytJ6j29ogkoXtjiM/0N3XenrHWePwrVptuDKRU3n6xnLi1B4+Tx/uO55iHDSwyXYu0NNazUSuaRZ/NjUVkXjCe/IBLLrmk+Xz44YdLkmbNmtVs6+bjuBcH6z37d+s/Fixiy936c+ONN0oqFu8VV1yxaWPM+zgdpn6vxSTXck6Yj8Qyu6R0V1rVvSTzQlZ3IlAg04+ZtQpvg68jjC/+TirnhnfcCy8CfeN/h3W1lu/WLWDo3g08JF5cdckll+x9olPAD37wA0lFDluS1lxzTUlFGhvLpVSstW657MrZcn+RpJe//OWSpOOOO06StMYaazRtvfL7urK1tTHu6wTWZDxU0+HpgHPPPbf5zFrFfPJ7LF40z2/Cks4c8/MkH4R7pq+D7MMjDID7CePNxz5rwFQVVHRqeZVc6/F4lz1PjfnpY4R5xbysRarUogm6XhC/T9DHN998s6T2fYLrV8ulZB+9Cn
tPBPrnrLPOaratvfbarePn3ikVjwXnTQ6eJG211VaS2vcCvKk17zZjivuony+fOc9e8uv+vEj/eN+zdt52221jjsE95BOl5vXhPoD3hbxCaexcuvXWW5s2rqH3XVfi2ucl8wsvukusk3+51157SZJOO+20McfZq0B8LSJmOj2kg3n3DyGEEEIIIYQpIi9BIYQQQgghhJFiaMPhahCu4K5JXHq1xN6aNGw3RK7m+mRbLckY16AnHuKi75UIPyyQmItbtCZ+AC772U3SH/Sk9C61MLJuMqVXQ/7mN7/Z+neqmX/++SW1K9fXktoHNfSrhs9Lxg3hER5SwnnWEt5pI4zHXfx8zwUYBolauBDhC6xF3keETnWl6qV6uA4J6cxNl18mFKwbOiwVyVeuhYfrEVJRk0ueDrrhZt/61reaNkLXXvGKV0hqh3RxnosvvnizjXOhn/ycjj/+eEnS3nvvLUm69NJLx3V83XuAXyukgu+///4xx+BSyNOFJ/ATKonEsY8jxk9NNpd9+BxjXDOnPVyI+wtjykUTGIOEMxF66fg9Z6ogNNHnRDdUyp8NumvuI4X/dMUnat/vhm/58dREfLqhcl/96lebz1wrQuV8/+zDQ0OvvfbaMcczUfy4L7jgAknSrrvuKknabbfdmrabbrqpdWw+zz74wQ9KasvTE2rJmvPLX/6yaWNNYCz6mKR/mOO1575ezyy+BvKZOeN972vIROH59ogjjmi2LbTQQq1j82NkHNEH/gzDZ7+ubKuNN9rYl4cSn3/++ZKk/ffff8zf9UojYFstTWQ84lH9YnieiEIIIYQQQgihDwytJ6gmLEBya01S1hPAehVU60p61go58Z2atCeWMk+exBP0SMX5hoGudHhNKhVqcs01Oc5hgHPxsdVNEK0l5bpFqTaWJgK/5xYs9llLuORa+XWYjmKC/cKT6zkXrMieOM05Mcdrnh2sebWif17Ud5DAQu6eFI6Vda0m6eznzzqDJH3NUlizunUlUb3fGIe1BOluAq3UlmHtJ29605skteX2OQf64O67727aPvShD0kqkrEPPPBA04aV1fuAxOJawUaSgN2iPhm84CdywO4JYg4w3imWOR1goZdKQjrJ1VigpSL+4t4q5hvjszYmGT9uHef+iTXdPTvsk+u23377jTm+fhej7IXf7zinQV1Luus+wiGS9MlPfnK6D6caQYL0thdt5blho402ktT2EvG9k046qdmGhDZr5rbbbtu04WXkHuLHwO8wx3sJCHW9al26hZqvueaapq1WtHq8MOdcvptnDtYJ90hxLhyvS3V3xVv8c02woOtldA97rSh093dqESg10ZSuJ2hu+mu8xBMUQgghhBBCGCnyEhRCCCGEEEIYKYY2HK4GIRg1V6aHn9VCYqCbjFjT4Web77ObEFmr5zBsIWA1cLsSSuPhNZ6sLrVDILru1GEKy5JKGIjXKQCqYNdCY/opAIGLuLbPbqK2VBevoJr1MOAhpSRwEo6IEIRUhAMILSLB3GG8ei2TrmjCoOKhBMwxwgRcNIG5WQvT4fx9jrKttmZ164H5MSCgQNiHhwCxf5/7U1Xb5vLLL5fUrm/EuCCMykPYuiFafty1enJsY90+5ZRTmrbxiD2Mp9aFf2eTTTaRVGoQSeVeRT2QXXbZpWm78847H3H/c4Mnym+33XatNq+ZdO+990pqhxfSZ3zPxwChfbX7IWORce1rHeObcMEzzjijafPPYfCpzY0zzzxTUgl9k0p45B133CGpHVrL3Nhggw2abYhlnHzyyZLaIWCrrLKKpFILzOcwzyOsdz7uuiG2vubyfV/vOEbWGxdMmhuuuOIKSdLs2bPn+B1/9mV9Zg76vY++IwRXKms3c9BDUVnzEaqoPUeMRwRhvN/nfl+ra9dv4gkKIYQQQgghjBQzyhOEUIG/4fOW6W+1WPt4M/a3526FZE9m71bsrVmyeJv2BLWZRNfb5kltLgLg35HGVl0eJqlmSfrEJz4hqZ
0EyLX2BN0uU1H5uNc+P/3pT4/57J6Oj3/8430/nqnCrXGcQ9cqJ5UxiHXLxyQW6FqV+UEfi4hdeOI/x89a5AIE9JEnk/J9T6YF+gvLpvcN/cvv+HpGgu7VV1895hhqye5uOe0nJAW7IMl4parnFd25ixdFkpZccsnpPpxx05UedxEErO5bb711s+3BBx+U1BYPgu592ucf4xRvkc9lxus73vGOMfvsjldpeqvOh/HRy/qPwIELHSBjj0cXCWypjCP39j7nOc+RVOaVlyoZBXwNxzPrHtqppNd869VWE4w69NBD+3JM42Ew7/4hhBBCCCGEMEXMKE8QOQFYA6SxBQGlsZZftx7RVssroq0bL+9gefZjAD+GYSsWCpwzMbQuVdyV/XaJ4658sVtvhgHyA9yqjSeRYmFOTUJ8Kuha1i6++OKmDauqHwMFLocB72v3hkhtTxDWPrwV7oXoFpHr5q1526DBeua5Ucw7vK6eh0YRUM+XQkaWtcc9OuSyEX/tuWN4lbDSeX8fdthhkqQDDjhAUtvLftVVV7V+Tyox7PNCijf0h14lHcgXYo2UpNe+9rWS6hERrJvkb7glmHnO+P7xj3/ctB100EGSSi6YM6hzOLSZqHeOMcW/5OaG0C/iCQohhBBCCCGMFHkJCiGEEEIIIYwUMyocjhAMd6/Xkp8JzSJMqPb9Of1fKiF2LrVNmArfJ6HPmQkue2RTkWz1PvBwCEm67LLLms/Pe97zJJUwh2GSanZOPPHE5jPjwCtcz2tcHvrLX/6ypMGXgJ4TnvDOfGK8ebInIX7Iabp874ILLihpbHiXQzjYoLHvvvtKqlfUJuzs7LPPnv4D67D77rs3n5G3RdRBkj760Y9O9yGFecBee+015jOhmS7M0Q0rr8H8Rpr3keiVcB9CCHMinqAQQgghhBDCSPGo/8R0EkIIIYQQQhgh4gkKIYQQQgghjBR5CQohhBBCCCGMFHkJCiGEEEIIIYwUeQkKIYQQQgghjBR5CQohhBBCCCGMFHkJCiGEEEIIIYwUeQkKIYQQQgghjBR5CQohhBBCCCGMFHkJCiGEEEIIIYwUeQkKIYQQQgghjBR5CQohhBBCCCGMFHkJCiGEEEIIIYwUj5nXB1DjUY96VN/3+fKXv7z5/NnPflaSdPDBB0uSzj777HHtY5111pEk7b333pKknXbaqWm77bbb+nKczn/+858J/81U9N3Tn/705vNTnvIUSdLPf/5zSdK//vWvMd//97//Pcd9cXxPfOITm21PetKTJEkPP/zw3B/s/2de990TnvAESdJhhx3WbHvlK18pSXr3u98tSbr//vubtl/96letv/f+WXLJJSVJSy21lCTpNa95TdN2yCGHSJLOPffcOR7Lf/1XsXX0ujYwr/uuF+94xzskSbfeemuz7corr2x9x+f6fPPNJ2n8c3xumWjfTVe/DTqDPOYGnUHpu8c//vHN57///e+SxrfeTBWcY6/+GZS+G0aGoe8WXHDB5vP73vc+SeV55vvf/37Tdvjhh0/rcQ1D3w0qk+m7XjzqP/3eYx+YiofRxRZbrNn2tre9TZK0/vrrS5J+/OMfN21MEBb0P/3pT03b8ssvL0n6/Oc/L0k68sgjm7Yf/OAHkqS//e1vfTv2QZkonJskLbTQQpKkP/zhD5LKS9F44e/8OJ/85CdLKg/r/RiS/e47f5EAbvArrbSSJOl73/te08aYuv3225tt//3f/y1JuvfeeyVJL37xi5u2Cy+8sNX25je/uWljDP/617+W1F7YeRnlBf3iiy9u2l772tdKar+oPvrRjx6zrcugjLsa9JP33fXXXy+p9K8/jPFyuc0220zL8U3XS1CvB7xll122+Yyh54UvfOGY3+NF+/nPf74k6Y9//GPTxvhgfP35z39u2vbaay9J5cXysY99bNP2j3/8Y1LnM8hjbtCZqr7z73R/Y80112w+88Kz+eabN9ue+cxnSpJ++9vfSmo/cP7zn/9s7euXv/
xl83nWrFmSpPnnn1+StMoqqzRtz3ve8yRJ+++/vyTpuOOOa9pqL1vd+0mtnzLuJs+87juub+3a/+///q8kae211262XXvttZLKM5qPLe6L22+/vaT2fXQqmNd9N8z0+5Ul4XAhhBBCCCGEkSIvQSGEEEIIIYSRYsaHw+HmJJ9Ckh588EFJxa3u4SOE0jzmMf+XLvWb3/ymabvhhhskSR/60IckSS94wQuathtvvLFvxwyD4jK95ZZbms/kWBD2UgsT63UMnJOHY9HXhDh56M1k6Xff9QojO+mkkyRJV199dbONcyCEQ5Ke9axnSSqhJH/5y1+aNlz0Ndc+Y5Jz8vwWQkt+97vfSSp5R5J0wQUXSJK+/vWvj+s8YFDGncNxk/v0ox/9qGnbeOONJZVQQkLmJOmpT32qJGnnnXee0uOD6QqHIwTNw88+/vGPSyrrkyT99a9/bf2dz1f6lDHkYYSMD8JXyduTpLvvvluS9IpXvGLMcY1nfNUYxDE3LExn3x111FGSpJVXXrnZRsg467hU1ijCKVmfJGnxxReXJG211VaSpG9+85tN2y9+8QtJZdwSsi6VEDnCOH/60582bW9961slSXfdddeEzifjbvLMi75jfZHqawz3B+61/mzXiz322ENSWTt32WWXpu0rX/nK5A62B4M47roh1p5by32GbYsuumjTdt5550kqz8q+NnCf4tnoGc94RtO2wQYbSCrPlFJJMfnZz34mqdzT/fj6nWcYT1AIIYQQQghhpBhIdbipwJN3ET/grd8TOnmLxWLqHp5vfOMbkqRnP/vZkkpC/0zHLQJY9nq9ldeSttnGv54c+5znPEeStMYaa0iaPiWvicB4cOvTBz/4QUnFAvLqV7+6aePzl7/85WYbScIXXXSRpJLILxWPEZYu94Y99NBDkkqfuVofCchbbrmlpJIQKhVxhZ/85CfNtiuuuEJSsdp2k5QHldmzZ0sqHgn3WqAOxzh1Zb1um6vKDTOME/cEYVlzgQO83vQXnjFJetzjHiepzGES3H2/bHP1wpe97GWSpHe9612SpCOOOKJpi5V8eOkltrHqqqtKKpbcSy65pGlDdMjH1q677ipJuuOOOyRJV111VdN23XXXSZIOOOAASUX4RZKe+9zntv5l7ZOkk08+WVIRfHGBmKOPPlqS9KpXvWrMsfdKoA/DRW1s7rnnns1n1nmEYJxe97wDDzxQUrk/nHHGGU3bTTfdJEm6+eabJ3vYQ4mr1xL1hIeWZxnfxjOIi5DhAWY+e78iiOLP5jwH+TPLVBNPUAghhBBCCGGkmPGeoJp1C4sAlqtTTjmlaTv99NMlFeuRS2Tj+cFyNSxW9MlSk7/GmoYluubtqf2/a8GpxfMSKz6InqCa9C8eHepGudQrstnbbbdds43+pO9cTh1J7dqYwrOB58jzM7DQEzu/7bbbNm3EzHtOFwzb2MWSRF888MADTRvzePXVV5ckbbTRRk0b3z/nnHOm5Tini26uj1Qsd0sssUSzjXnG/CPHRyo5acxpn69Y/MnR8PHy+9//XlK9NtUAppiGcdK9V7qHBg8y3m/PO9ttt90kSW9/+9ubbe9973slFU/1Pvvs07ThyTnrrLMklbILUrEik6+7wgorNG3kDjDO3ePL2krekCT98Ic/bJ1XGF56efM+8IEPNJ8/8YlPtNo8T2089zyePfx+Qc6vr6vjOa5hozv//fmPeyu5PXfeeWfThgeI5zeXrsfLs++++475vcsuu0xS+1mQXCN/7u4eX7+JJyiEEEIIIYQwUuQlKIQQQgghhDBSzPhwOEKPPPkKdx/hRcgpSsWtievUXe7dxDp3tc5EXLoQOPdaOBxMNiRm4YUXntTfzSsQISCZ8r777ptjm1RC1whl8jGJW51t7l7n+7iI3X1MiCYhnd/+9rebtpe85CWS2hK066233oTOcVCg2jfhXLUEaOQ0fT6vttpqkqRf//rXU32I85wttthCUltUg/ACtnlIAesf/7
p8NmMV8YQXvehFTRvhh4wvlzGdqDR2GBxY01l7tt9++6YNOXTWIA/HRDLdBVsWWWQRSSV08uGHH27aENchvOj5z39+04aQCeE2XoaC0GLCgQ855JCmDYlsD4U9/PDDJY1OiCbzmNAlFzOhpICX/BgmaqFQjAcXyTn44IMntf+utD+iHVIRAdlmm20kTY1k9iDQnScucMB85/mPciZSCeXn+4w/hzXC5zPP1h52y32GcGy/b0/VPI4nKIQQQgghhDBSzGxXhopl3ZPieOuvFfpkW9cyL421cmIllYoVxmVmhx33ggFv491/pbEy2N5W+z5geazJWg4yyLIuvfTSkqTll1++abv00ksltS1vJPcjA+l9wbih73xsYuliDGN58b9DJhTPhyTdc889kkoC8jCD1CZeHpfORZACD4UXZrztttum6xDnCW9729uaz1/4whckFeu7VOSyWatclh3RA9Y4PDsOlju3KvMZ76OLhmAhrAmJzGR8vvK5lojN+P385z8vqV1+AKune/LwoBx77LGSpMsvv7yfh92iuzZ7CYi11lpLUvHEsuZJRZQFIQJJeuMb3yipWHRdkANvEmPFhXAQlDnhhBMktec5/cka6QIxWJM5Tql4gmZC0vqccK83ZRkQo/D7KX3AnPf5fM0110gq9wupJMXTx16Ydl5Qe24gGuD666+f49/5HOwlAd/dxv1bKvNy0003ldT2BM3ksUXBUqn04wILLCCpPdcZU695zWsklfEnlWdtxtOnPvWppo1SM/7MzHynbIpHGXih3H4ST1AIIYQQQghhpBgZT5Bbtbrysv6GibcHa14tFhVrqseizkQZTrdIwnhi/uk7t67wdzVPB98bBi+aFwTcbLPNJBXPgx8/lgyXeKXYGnlCbqViTNJPPia7uRtLLrlk00asPsfl0p4vfelLJbXzhNx7OUwQb4yHwQuCIs2JRcqlx91SOpN42tOeJqkdu05hVJcXxVLJWuXrIGsj/eVeS4rV8R3Gs1Ss7vS3x3l/8YtflFQK9Y4KbhHuWoc9moA5Sa4W11GSfvCDH0hqx+LjZWF9mE5P0A033NB8dnmojEIAACAASURBVO+qVK6zVIqIL7XUUs02cnu4h2BFl8pYZC57f1GsHE/STjvt1LTh2cA75sWpOXa87KOC5+3Sr9xD3BvL/ZYCs36tXvnKV0pql3ggCoT7CjLo84qaxwVPouebAfna/fDU4BHxedml9gw57LC+S8UDxP0UD69UvLdEInifc29ZccUVJUk77rhj03bhhRdKkpZZZplm23nnnSdpeouaxxMUQgghhBBCGCnyEhRCCCGEEEIYKWZ8OFxNIrvb1gsPESAcCXefuz1xRXsozrBTq0qPm5l+qYUB1lzQXREK7zuuw3gqOs9rPOmchFLGlrt16buPfvSjzbaNN95YkrTBBhtIakvKEkpHH3gCO/1DcrGHt1F1mbAaTwxGFIAq78MMbvS//OUvktrhOSTvEsLlspr03ete9zpJ0mmnnTb1BzsNHHPMMZLaIS/0jYfpMjZryfp8n/C5WtgW49DnK+OQfXp/uzjITKabZO1hpiQGk0xOKIlUwrxYRz0E8eKLL5bUDi9EWpZ9br755k3bySef3IczKXTDeDy0jGvNmnXjjTc2bWeeeaYk6V3velezjcRy1kEXc9l9990lSbvssktrn1IJ02LceRI6oZ8kVLswAmOQsJuZhN9jGW/02fve976mjT4nfMtDhglnpYyDh2mzXtxxxx3NtuWWW05SW9Cin9TOqZdwQQ3ut3vttdeYtok+S/T6TSTdvUxAl1oIHHNmkMUTeqVxeLgpwgbgAkxHHnmkpHK+yy677Jh9ffe735Ukvf/972+2IXHuIhTrrrtuax+f+9znmrapej6MJyiEEEIIIYQwUoyMJ2i8kq29vEPdN3rf51TJ981Lap4g6CUziZUKS7M0VvTAC2Rh1f/+978/+YOdJvDmSNIaa6whqST/uuUUayfeIqkkEB
933HGS2rKjXUEEtyxhNcY679Y/9kGBsre//e1NG5ZAjlMqCctXXXXVOM52cCCRHBlXtwLDb3/7W0nt4ovM0bXXXlvSzPEEIa3sllqSmF38gPHB3Kx5xBm3bpGn37Aq19Y3vB8+z5GBx/MmzZw+d7rea4p1SkW6F+vp/PPP37QhTUwSuku4c4082Z31Aw+Sryf98ATVLPLg3nu8U4i7rLLKKk0bMu2+Zh100EGt73kiPuMT7wTflYpH58QTT5Qk7bHHHmP+jjUAIQlJ+s53viOp7Xkn4f+mm26SNNjJ6zWLfK2sBH3AvHeBine+852SyrjzqBQ8dyS00zdSiSxg7kr1YvH9ZKJF1vH++TMXx+gFchFJYH2s9WutyD2/zZh3rxheWPrevWiMwVox7kH2AEGtz2teau4byGC/5z3vadre8Y53SCrRKB6lQZ+96U1vktSOYsGzts466zTbzj//fElFwAPRI6kUUO438QSFEEIIIYQQRoq8BIUQQgghhBBGihkfDodLshbaNZ5EvPHW/5mJ4XBeNwC6LnpPViMc5/7775fUrkCPG5swBHcV42520YFBxZNNSWRGO/+6665r2q699lpJ0oc+9KFmG3UN9t13X0nt8KNeYZv0GeFeJ510UtPm4TGS9MADDzSfCYW65ZZbmm1egXmYILmcRHEP6yKJGje8h8MREuJ9Pcy85S1vkVRC2Dykkm1eDR4Ia/F1kPAXwjm9Qjj9xjx1oQ4gdMTD7xi/22+/fbNtpoTD+b2AdY/k7CWWWKJpYw5ecsklktohb3vuuaekknhOiKtUQsb8GiH2ARdccEHzmTC0ucHvW5zTxz72MUntkEZqhuyzzz6SpM985jNNG6GqvvZ89rOflVTGIiFpkrTrrrtKKsnSLvbAmOQ8PRyLfuFfr9k2a9YsSe3+ooYavz2I4Um9nkGYc5yHVPqHfvFzQkSC6+FhYltvvbWkMq/pe6nMf+9P7nP0q6+f3Iemip133rn5zHwhNMtDRREFWnjhhZtthJwjZuDju/vs4n3XrV/of8c2avB5KDlj3sOSGbt+HoOKhwYy/wk195BzxuIJJ5wgqX2PRfSEFAd/xmD+E6Z6++23N22EcjJepSLsxO8gIiUlHC6EEEIIIYQQ+sKM9wTBeGUXYTweoJrM80yiK2Ygje2XmieI5PzDDz+8acNKgMXErZ1YWqbawtQPqOAuFTlHvBJ4fxy3cmJNpQq6V2TuJuq6zDF9jsfsE5/4RNPGNpKkfRySVOjVnRdaaCFJ0pZbbjnHcxxE8BKShO+SnVibulXtpeKZmKoE3+nm3e9+t6Qia+3SzFCTEqUffP4yv5l3PuaYr3zHxxX7wKrs14J9+TyZySyyyCKS2h5i5hueFJ9/WEmxeJLkL0kvfOELJbU9yggSIDDwox/9qGnz9WOy1AQCrrjiCkltQRXmD15sX6vxFrhsNonpzFf/ncUWW0xS8Sa5tfdlL3uZpLK2ulgO6ysJ2Ii8SMUj5zLGs2fPliR99atflTTxZ4DJ0EvgoEavNjwcLljgyepSW3CC5HyuzfHHH9+00S/MVf9dxpiPYQQGWFt9Pn/961+f4zGPFxfdwCOD4M///M//NG14DvBYcFySdPTRR0tq9wEe6Iceeqj1d1IZizXBia5gjIuZ/PCHP5QkHXLIIZKkVVdddcz5+LMS0s94yF00ZdCoeUfps5VXXrnZxlxlzcdDJ5VnkG9+85uSiiiUJF1//fWSpP32209SWyKb+w3XUSpjELn9XrLk/SKeoBBCCCGEEMJIMeM9QbVYdqya3QKejwRvzVhaJ/r3w4pLbWIloC9qliwsUm5l6H6vJtU7VTGf/cQt7xdddJGkYr30ol+w//77N58322wzSdIXvvAFSe242m7/uGWJcYYV0H/njDPOaP2dx3ojg3r55Zc321z6c9Dx+cUYJAa7VpyS7yO9KxULdE1Sexghl4S8O++jbiFjqfQTc9E9OniH+I7vi33wfbdwd/u0li
vjv1OTfR9UJiqjTH6e5+lRZBBrqedmYKEmd8XXSK6Hz1c8MHhB3EvUD29bbf1eb731JLWvKxZaihh6OQByHd3jhdQt+/dcHfLYll56aUkld8r3zzY8n1LJr1xzzTUltSWykcD3dcHztPpJrzye2jbmFfPT13aiJ7xALudHEUq3yAMeRV/XyNvhvuJeEK4f/9ak0V3qnn6kj2v367mh5oEgV8yvK3MCrxjjSpJ22GGH1nek4jnE0+rPFHhmmNfeB9xfuA4PPvhg00ZfU9rC77FIP3vfcfxI5Q8beAs9RxQZbCIyGJtSyenhevi6uckmm0gq/fmtb32raTvvvPMklfEulfw0rlUtuqPfjMZTfAghhBBCCCH8f/ISFEIIIYQQQhgpZnw4HEw2FMPdtt3wN2+bicII4NWQu5XSa+dNMqWLH+Bi7XUdunKwgwihV1JJZCZxtVY1eosttmg+01dI23r/dKWxfawRikBop8sPs//TTz9dUju5lQRkwlUkaZdddpFUkoUHGRKgpTLXPAkfuvKm7kLn83QkRU8ViFlI5VyZfx7uWws7IySGv/MQme7crYX30u+1sDt+x0Nl+J6HmiADfd99983xHAcF7zvOoftv93tdzj33XElFwtflpAnRYr77mkFJARcdQGCgJlLja1E/IDQLQQT/TT6z1rHeSGWOeQgRleEJAfZkesYIc9nHD2GFSBx/5CMfadpWWmml1vddGpk+8zWD0gVThY+HriBCLdyMPuS+IRX5aw/jI1xwxx13lNQO72UOcR1cJIe+I5Hfw63pF+azP7sQCubCKPQxa8Szn/1sTRWEQHIcHlpGiB/bttpqq6btve99r6RSPkEq/YEsfS0dghBTX9NYTwnzdHEG7qnMZ0SIpDImfY0mVJn59NrXvrZp83CwQaB2X0R0x+8VfO6KvkhlDCMNvuCCCzZtlEdApMnXjU033VRS+5oybxBZ8ZIAPg/6STxBIYQQQgghhJFiZDxBTtey18tKXLOOYh3pJbE4k8AqJ7W9CnOC4lleQMytX3OCpLhBBMlgT0RFqAArm0vEAnKzkvT6179eknTsscdKaicXYwVjTLmlmc/0j/frqaee2vo9T5wmwfnmm29utmGlHs/Yn9e4JRevFn3sktdY0PFA+njF4j7MIiYIbzhYLt0zxrm6BY/rzBiqFcfsJevLulYT6uhKZc+JQSxS2aXm7ekl/tJr/tCG9dqT+7EiY1H1Yqm14tRcS66D9yUW536x5JJLSioeAV+DmFskmvt6jqV8xRVXbLats846koqMtxfiROYbC7AXwERmF9ljtzhTKBne8573NJ+5Di45zvHjEXF55bmh13zptZ7SZwgASEUC2sUA8GqRhO4RAwceeKCkck5eCJvC2WeffbYk6Q1veEPThreFMVZbI3yOd+e0j9N+g6Q0a5l7glhrasWuEQvBQyMVMSDwa8X9k/713yERn/NGcl2SPvWpT0kqQiQuU8618XnJ/GHO4gUZRHy8dktxnHjiiU0b/c+c9f7hmZD7rt+TGKdI/LvH7K677pIkHXrooc02vMrcy7wQN/f5fjO8TwYhhBBCCCGEMAlmvCeoZpnpSlyPt+hp16LpRQndsjLT8Lj1rkWoZmHn++7ZGfb+wWpEvLZUYlSxHnkhQ0C6VSrS1gcddFBrn1KxeGBFqlmwGH8en410JWBd8e8tv/zyY/aF92qQC9QSmyyNlXat5WfUCoKyj37nT0wnLiHKGOAcPZeAuVhb82qe6m4uVa2AIXhsfTcf0Pu7th6QozCv6R5bLSfA+465SF+4VRKLPIX+vKhnL28AHk3WEc8FxErv6y1rCzLbnlPQL88GdO+L7r35+Mc/Lkn6xje+IantncBbc9RRRzXbkBVmjcQTIZW5iJfILesUQOXc8E5JZbzRZy5LjjXavUp47WnrV3/VPJtdK7rnL+DRZs31vAiODel/qazb5JKdddZZTRvWdtZDl4DGq8R185wgjoe/87WVfq15h7
g27hnqh0Xe1wy8BERb8K9U+hpPpOedMV49l4lzqJUQYMwy7vz5jevGHNx2222bNjyO9Jl7LtiH56FyDPQreXFSu2TGoEE/4vXxa07uMfPYZcJ5hthtt90ktZ9JKNiLN83vV0TG1KT1uQ7unax5YPtBPEEhhBBCCCGEkSIvQSGEEEIIIYSRYrhjlMYBYRy9QhTGGw7X6++GOfH6kXDZZehWlK/h4XDdROKpcm1OFVSS9sQ+kvM93AwIfXBXLxWkSQR11zCuZ9zqHprA9+gzd72ThMhxEYYiFalKD6FhnBIqMcjhcJ5gSVgEohvzzz9/00b4xCKLLCKpHR7xohe9SJI0a9asKT3WqcT7gZCOWihaLVSQ+cZ4qkn+18Lhun/vf0doHcdVCz90kDL25O9+4nOlKyzSK8SvFta07777Np8RNSEExMOprr/+eknS6quvLqkdDtfFw8oQQ6Fa+nLLLde0XXjhhZLa4iuE7hAS6XPZQ6H6AQIqX/rSlyS1+wKpbsKRPNkbuf2rrrqq2Ub4FaFgLvrAtUHEwMOSWCMZ56ecckrT9tBDD0kq4TZ+X6LP/JryecMNN5RUEun7hd8Lll56aUnlXPzYGJ+ck4dBc4yEG0rlfsL18OM+4ogjJJVz2nLLLZs2ZMlr9xBEFlgr/b5Nm8O84d7j4Y/Ic88NO+20U/OZ+YHgj98XuT/97Gc/ax2XVL/m3Ef5nofxctzcFz3cqyvywliTSumOrkiJVK639zVhs4xzn+NTTa/1rvY9/w5jg+vhfUBI5te+9jVJZZ5KZQ388Ic/LKndF0jjL7roopLa1/3WW2+VJL35zW9utq211lqSpKuvvlpSKSkg1YWn+sHMfXIPIYQQQgghhAoz3hNUk7NmWz+lW0fFE1Trzznh1qNhB6sTCZpSsVK5LDWsttpqkoolVyqW3poUON4LrEg1iyZJgp5MjYV2hRVWkNT2BOE1cc8Ilis/rkHFLej0C1Y4T/rtWl/dUofVlT50K6ZLaQ8ybs3kPLC29ZJolsZ6RGoCJTX57O7v1YQVxus1x7rXb09Qt3BzDZ9HiBJ87GMfkyQddthhTRteH5dex4ux3377SZJuuOGGpo1ClljmkXaWxgpBvOY1r2k+0wd4QVxWmvUSr7NULOA1+ex+M3v2bElFdtnHHR4HZJ49Wf+jH/2oJOmMM85otiFnfcIJJ0iSNt9886aNscj5+v1lm222kVSukY9vrgfiCS6hz7a3vvWtzTYEJlZeeeVepz1pXMDijjvukFTGpK8zfMbT4XLPeAa97AEeNe4vfh34TYpI+r66ZRzcg0wbfe7iA3hePLKl61Hxewge97nh85//fPOZgrj8Zk1ohXNyL1pN4p91inXO1wbGEp4vb/P+8N/1z/x9bZ9+DPRntwD6dODrXa0Pat8DBE4oAOtCTwgW4AHac889mzaeaxhbfm/Go4MIAkIJUhEA8WNBDOT973+/pCKVL9ULpfeDmfvkHkIIIYQQQggVZrwnaDxMJA9oFJls3ojH9sKw5gTVLCfk+9T6533ve1/rO1LJBcJa5dKwXSlSt8p1Y3Vr1lGkZL3AGZYWPwb2S66Me7YGDbc+Yglnm1tOsYbyHR93WP2Ikx/k4rBzghh2aawl1y2WeApdVpTvYQ2seZVqfdKNGfc1krwf+tktdLVYc4/r7ic+R8bD9ttvL6l4bpmjUvEEea4ZFk3mlHtwkTbmehx++OFNG3LW8Pa3v735jCV76623ltS20tKPnm/D2tK1VE8F9913nyTptttuk9T2lCKLjBeAPCCpXriX+XbaaadJaktdU9ASaWTPSeHaXHzxxZLafbHGGmtIKt7uV7ziFWOOjxwEb/ei1P2ANdct/fQdY9I9d+SS4c1wuX762CWy8bbxO29729uaNnJNGZO1gpbMcff2s26yzccyc9XHIsfP+PPv9+Oe4cV2yR/DS+rzGs9szfNd8wTVch+BtYm1zO8TrJnMMz+GblmCWt6NHxdrwrx+xu
lK3vsazthlDkrFU46Eu3sZmaMLLLCApOIZkooXlvIp3veMIyJjdtlll6aNdfHggw9utnE85BD5uuHlSfpJPEEhhBBCCCGEkSIvQSGEEEIIIYSRYsaHw+GSrLlHazLP4xFL6Mo9z3QIj5Am5uL1kKUunqBdk+AeNAh/8WRcQhMuueSSMd8nPMKTC3GTc+6Eq0nF9Vxz5xOSgFSnV65HJrc2bpGzdMldwlT8mg4qnqROOAZjyhNkCfGgDzz8j7+jD2ryw4OOhwUSxsB5MQaluhAJ85XQiF5rXa+1z/dNqAjH4qIJ7L8W0tJvNtlkE0nSuuuu22zrSnq7rCrzjdAOH0NvectbJLXHBAIFhGS5kALSzczbDTbYoGkjvINwOw+XpX8I3/J5yO9QXV0qYVKEh/QbRCukIrGM2IMLPHSTmL2NkBcPn+OabLXVVpLK2iVJl19+uSRpvvnmk1TCaKQiy4tMtIdhIQLDdXHp4XPPPVdSEQyQiuhEV9Jd6n1veiQIN/MQRcY7YVU+X7gHEH511113NW2EKnloHftireJ6SOX+g9iCh7d2hQVcqIIxyPjzY68lznMeNSEkD7edLC6/zHHzW37NuV9xjH6sNRlsjrP2bMc4qD23scbS97W1sLa2cd1qJQRqYjJTTa08Av/WhBqQm5fKnL700kslSYceemjTxjUiVM5DtLkHMa5Z26QSJlx7bmQ++/q47bbbSiohlz72WUMQcOgX8QSFEEIIIYQQRooZ7wkCtyB0Cw2Op6hU7Xv+Zj2TxRXcWtm1tPbqOwrrOTUrTC+J20EBKwfysVJJ7iSp2nnjG98oSbrpppuabVi/8Hy5NQwrKm1uPeczfedWTAQOSBr2ZG+EFFZdddVmG5Y1kiBvv/32OZ3yPMfnF5Yo+sIL3dEvWOXdktj1WgwjbkV3K5vU9qjWzhGrba0ganfu+tzserv9u13vkoP109dDl5buJ8xF5OGlYu3nuJF7lYoVneN2jyoFUb0/GTvINrsXEes+QgpexBR5V7wSPs/hmmuukdQusornwgv78nmqkoK9f5DjxtLq6wx9hhy2F8U99dRTJbX7k/UMK70fP2MY7zoeKKlYmJHIxiotleuBiIyvrUhycx2l4u3AUr3xxhs3bS4oMFHoJxdcwMvDb/n6zXyhP2tzsFekisN1wDPn32H+18RJuDYcp/9dTfyE9bXrTZD6X2CbY+I3fV3pihG4d4W5WvMy1MQAoFZoGmreItZQxl/tutQEG/weBRSO7hfda93r2NybjDf85JNPbrZxHRAuuf/++5s21k4ES3yeMQ/wrH/gAx9o2rqFYt3ry/OhlwRAah7hBV87XWa/n8QTFEIIIYQQQhgpZrwniLdbt4p041x75bn08nTUrKMzEZcw7fZZryKx47UYDYMniHh9PC5S8bQceeSRY75P7KwX1GMsYsVzOd6anDh082FqxUBr+S3kLXh8OlbMQfYAwUILLdR8xuuGRdNlZvEAYZ32cdeVHJ+orPIg4Na9XjHvjAU/R+YpY6bmLapZe3vBPrGQ+trXbZP6b/0EYtj32WefZhvzlHFCbokfE94Xj2u/+eabJbW9PYwrvD5+TvQ1hQLda0JhUbwCSCRLbc+L1M73Yl++jWvZa32YGygEKxV56i222EJS29qLFZl8Jy98iwy2FzfGukvBZi8eyjjDS+TFNz/84Q9LKrlEHv/PvYZ977XXXk0b+S8uNc3cJ7/A+3Vu4L7v3q1uzqHf0/hd+qdWiLOXh7ZGTR66e36+n+592v/fq6gma6q39SOHt+shkMp9zn+L8V8r8lzzYDGmasWhu14T7wP6sZbv3fUOPVLx0a5HzvE84IlSO7buWPE5yHMbObLrr7/+mGM87rjjmm0UgL711lslte+xeF/XXHNNSe3oknPOOUeS9KY3vUlSmbtS8dACcuhSWZv9eeaAAw6QVIoe+/pNUd1+E09QCCGEEEIIYaTIS1AIIYQQQghhpJjx4XC9ZApxtY5X1K
CbEFyrEjwT8XC4mht+TriIAHTDc3yfgwziEEixSiUhEGEET1pfZpllJEmXXXZZsw13P8l/tX7Fve39w2dCYtzNTlgOYQDuwiYUAClhqYTDUVndw1oGDU+0Zq7i7vfQD/qHfvHxRDgife3hAsMCCeRSSbgmPMHnIcmrHppFv3XDPfxzr2TaWvggoRRcn5osqx+DCxdMNUjGw7XXXjttvz0eXDJ/0GBt41+XlF5vvfUklRAzFyz40pe+JKlI8ksldJAwXZ+Tn/jEJySVRGcXGOA+ypp67733Nm177723pLLOXnXVVU0bYXAuib7HHntIGjsmpgLCxvjX5cJZm7uhWlKZQ7Vwr1ri/3jCebmH1ISbamHBXBv/fk2SGvohMrP66quP2cY67sI23WPzvuP73nfdPq6tTePBnwm74cLeJ7VQ4prwCrz5zW8e9zF0qaVvbLjhhpJKGCby81K55zG/PGQeSX8XrSGs7ZhjjpHUDmVDSGSXXXaRVIQLpJIiQBjdd7/73Tmeg4cb83zIM4kk7bbbbpLK9f7qV7/atLlQQz+JJyiEEEIIIYQwUsx4TxBv5b0S+McLb+JYHtxagIzqBz/4wbn+nUGDxFJpbEG2XtSEEWoW6fEkgs5rSGj04oJdrwLJ+1JJ2vaEYL7P+PFk565FyS11XQuQe0GQgaUPPdn7vPPOkyR95zvfabZxLV12eVDxoqck42LxdQlarM7Pf/7zJbX7BysY18aT9IehYKwknXDCCc1nCmniafF5hOwyEqdS6RvGkPcNVuua1bRmmQb6Ga+Ge0BrktrTYYkPc0/XM+jePKR0ESrwex/3PLd8k3DN/XeVVVZp2rgvINN/1FFHNW3uyZbaayprAMe55JJLNm0UXh1EmIO1IqMeDTBKbL311s1nrP54sv15o1Y8HLh/erFUqK1D3ftorSTAnP7f/X4X/z7PSBy7C0l4VMZk2XTTTZvP9Nnxxx8vqX2+zInVVlutdVyS9I1vfEOS9N73vrfZxj4oCu1z8VWvepWk4r3ZaKONmjbuxRQ67YU/n/BM5QVRKTKPJ9ifn/rRdzXiCQohhBBCCCGMFHkJCiGEEEIIIYwUMz4cDhemJ5p3kwv9/+MRSSB8xKtme1LoTMNd9rj0cUH3CovzBEeouaInkrA4r8DN7GIP3cRHrxtBLZKzzz672UY/4u73GkKT7QMqMKPf74nghIK5pv8vf/lLScVVPsjCCEsssUTzmeRJxp0nU9OvhGn5HCa5mzA6r800LHiSNYmlhBL4/Nthhx0kSWeddVazjVo1hJx4Ui+fWf88lIJtNWEZwhRJrvXfIzTCk6e7IU5hMOmuQX7tuZ6sJV7lnfA2D7El/I19eFI2dZ0I1/XwTULcCKfzkGPCeQjpdOEGqNV+gWGsETZTedGLXtR8Zm3n2cLXb56xWPcI/5bqYbxQW9O6NZl8vHfHRi9BhVrNLg81o51wOP/7uUnLWGqppSS111ZElrifex0eanTxTOBhdIQy8zwglfpmiJNwP5XGioh5bS+fv126Nag8bPWhhx6SVML1pCLAgihDv2p79SKeoBBCCCGEEMJIMeM9QVgQaol1WJVrbbzx1iwCvPX3kt+eSbiVpGsx6eXBcKtzL1nNXknYgwJWqgUWWGDMNsaKy99+/OMfn5bj4jeRqXXwxN18883NNiS1xysLPy/xpEjmaE12lHGGZckTZbvWQjxnw4SvT7/73e8klfNBKlsqSaWeTM76x1jwucb4ZU67lbLrCapZ2JFXdQ8A/evj61vf+tb4TjQMFDULO9d64403brbhrXEJWwRIsOT7vYCxgdfHrb3M+dmzZ7e+KxXvE2Nx3XXXbdoYYxOVkw7zhkMOOaT5vN9++0kqHgtf0/D24V3x8cC66IIT3ecRXztrokxzopfHpibq5M833GPxWrnniGiRyQh5ME88QoJIDhccADzw/Naiiy7atPEc43OWecw91gUqiDzg/rPccss1bfQB5+nXo/ts7dFBiD
K4rPdCCy0kSTr88MMlHKkTYwAABDBJREFUtT3OHlXTT+IJCiGEEEIIIYwUg2+Cn0sefPBBSe1cAOLWsSrUinPxtu9WZd5weWN2q8RUFXIaNLA4IPtYk/0EzzfBokzfe57AQQcd1Pfj7DdLL720pHZMMtafbiE6qVhFarlPtQJrNUnPLoxJt1J1PR1+Pdj29Kc/vdnGsfaj4N1UQ16AJL3vfe+TVPrf5x7nQjw0+VFS8YItvvjikkqu1jDhxeS6+RAuI06fINcvlThw8jZ8PeMz3iFfB9nGd9ySz2/S7+55os3Htseph+HBrzlryaGHHiqpnQeGVZhiiVK5/muvvfaYfeEdwpPjeT9IwHN/WWeddZq26667rvU75Dw4NdnjYcg5HTU+/elPN5/xBOGBcMl9cj+5t/o46ub4SOX+WbvHMh5YJ/0e0itfqLvPWg65PxfgQWGbr7lXX321pMkVkObe5fcwnqvoJzw2UlnDyT2++OKLmzaK1XofkDfL8XsuKp975RCPp3wKc1iSTjnlFEnt+84999wjqdznPO+e7+++++5z3P9kiCcohBBCCCGEMFLkJSiEEEIIIYQwUjzqPwPoKx5P4tpEcTlNquzihq8JHHAMNUlZXPUuL+hVgfvFZC7NVPSd85nPfEZSCXNw1zD9sf7664/5u69//euSynXwsK+tttpKUrtC+dwyVX3nbmxc54MQYsWx1MLpjjnmmOYzVdoJL3NJbRjEccf54cb3EAjCsQjF8sRa5uxFF10kqe3inwom2ncT7bf9999fUglLcKnwT37ykxPaV79AWlUqSbguwHDaaadJku6888457mMQx9ywMB19N1NDyzLuJk+/+u7AAw+UVERVPMSXcCjCtnxd4V5XE23pFXLOPaFWEqC2z+5v+zHwPQ9DJ+SfsDLCzKQSytUr7H1OZNz9H/1eg+IJCiGEEEIIIYwUA+kJCiGEEEIIIYSpIp6gEEIIIYQQwkiRl6AQQgghhBDCSJGXoBBCCCGEEMJIkZegEEIIIYQQwkiRl6AQQgghhBDCSJGXoBBCCCGEEMJIkZegEEIIIYQQwkiRl6AQQgghhBDCSJGXoBBCCCGEEMJIkZegEEIIIYQQwkiRl6AQQgghhBDCSJGXoBBCCCGEEMJIkZegEEIIIYQQwkiRl6AQQgghhBDCSJGXoBBCCCGEEMJIkZegEEIIIYQQwkiRl6AQQgghhBDCSJGXoBBCCCGEEMJIkZegEEIIIYQQwkiRl6AQQgghhBDCSJGXoBBCCCGEEMJIkZegEEIIIYQQwkiRl6AQQgghhBDCSJGXoBBCCCGEEMJIkZegEEIIIYQQwkiRl6AQQgghhBDCSJGXoBBCCCGEEMJIkZegEEIIIYQQwkiRl6AQQgghhBDCSJGXoBBCCCGEEMJIkZegEEIIIYQQwkiRl6AQQgghhBDCSJGXoBBCCCGEEMJI8f8ArzMrjBN6I6cAAAAASUVORK5CYII=\n", + "text/plain": [ + "
    " + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "# takes 5-10 seconds to execute this\n", + "show_MNIST(train_lbl, train_img, fashion=True)" + ] + }, + { + "cell_type": "code", + "execution_count": 110, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAA0EAAAKoCAYAAACxwfQnAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAAPYQAAD2EBqD+naQAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4wLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvqOYd8AAAIABJREFUeJzsnXd4FUX7sO9IOknovUa6FOldaYJ0VFARX5qoqK8/RVEQfSWgKIIgNgSVJjawgAoICFJERYogiFgoAiJFRFFAkDbfH3zP7uzJEk5CEnKS574urhx2ZndnZ5+d2X3ahBljDIqiKIqiKIqiKDmESy52AxRFURRFURRFUTIT/QhSFEVRFEVRFCVHoR9BiqIoiqIoiqLkKPQjSFEURVEURVGUHIV+BCmKoiiKoiiKkqPQjyBFURRFURRFUXIU+hGkKIqiKIqiKEqOQj+CFEVRFEVRFEXJUehHkKIoiqIoiqIoOYpM/QgKCwsL6t+yZcsu6DyTJk0iLCyMb7755rx1mzZtylVXXRXUcXfv3s2wYcPYuHHjOescOHCA8PBw5syZA8CIESP46KOPgmt4OpFZ/ZwdmTZtmqePwsPDKVmyJH379uXXX39N9fGaN29O8+bNPdvCwsIYNmxY+jQ4xAjs3+joaIoWLUqLFi0YOXIkv/3228VuYkiyceNG+vbtS2JiItHR0cTFxVG7dm1Gjx7NH3/8kSHn/PLLLxk2bBiHDh3KkONfCKtWreLaa6+ldOnSREVFUaRIERo1asTAgQMzvS07duwgLCyMadOmpXrfZcuWZbmxOpi+LVu2LB07djzvsVJ7fW+99RbPPvtsWpuebmQl+fIj2P4PVQLnkbCwMAoVKkTz5s2ZO3fuxW5emnj++ecJCwujWrVqF3ysPn36EBcXd956fu8nmXHejCCtY0N4BrTlnKxcudLz/8cff5ylS5eyZMkSz/bLLrss09r0yiuvEBYWFlTd3bt3M3z4cMqXL0+NGjV863zwwQfExsbSunVr4OxH0H/+8x86d+6cbm0+H1mxn0ONqVOnUrlyZY4dO8Znn33GyJEjWb58Od9++y25c+e+2M0LeaR/T548yW+//cbnn3/OqFGjGDNmDDNnzgxaMaHAq6++yl133UWlSpV48MEHueyyyzh58iRr165l4sSJrFy5ktmzZ6f7eb/88kuGDx9Onz59yJs3b7ofP63MmzePzp0707x5c0aPHk2xYsXYu3cva9euZcaMGYwdO/ZiNzFkSe++rV27NitXrgx6LnrrrbfYtGkTAwYMSEvz0wWVr6yDzCPGGPbt28eLL75Ip06d+Oijj+jUqdPFbl6qmDJlCgDfffcdq1atokGDBhe5RaFFWseGTP0Iatiwoef/hQoV4pJLLkm2PTMJZvA9ffo0p06dCup47733Hh06dCA6OvpCm5ZmLrSfT5w4Qa5cuciVK1dGNC9D+eeff4iNjb3g41SrVo26desC0KJFC06fPs3jjz/OBx98wM0333zBx8+qiKxHRUVl6Hns/gXo2rUr9913H02bNuW6665jy5YtFClSxHff9LrH2YGVK1dy55130rp1az744APPfWvdujUDBw5kwYIFF7GFmc/o0aNJTExk4cKFhIe7U1z37t0ZPXr0RWxZ6JPefZuQkBDUvJSVnnmVr7McO3aMmJ
iYi9qGwHmkbdu25MuXj7fffjukPoLWrl3Lhg0b6NChA/PmzWPy5Mn6EZRJhGRM0Pjx46levTpxcXHEx8dTuXJlHn300WT1/v77b/r370+BAgUoUKAA3bp1Y9++fZ46ge5wW7duJSwsjLFjx/LYY49RtmxZoqKiWLFiBY0aNQKgZ8+ejgl2xIgRzr5//vknS5cupWvXrpw6dYqwsDD+/fdfJk+e7NS3z/Xtt9/SuXNn8ubNS3R0NLVq1eL111/3tG/x4sWEhYXx9ttvM2DAAIoUKUJMTAwtWrRgw4YNF9yXCxYsICwsjJkzZ3LPPfdQrFgxoqOj+eWXXwDYsGEDHTt2JG/evMTExFC7dm3eeustzzEmTpxIWFhYsr6VY3/11VfOtjVr1tCuXTsKFSpEVFQUJUqUoFOnTp59z5w5w3PPPUeNGjWIjo4mf/783HjjjezcudNz/IYNG1K3bl0+/fRTGjZsSExMDHfdddcF94kfMlHv3LmTYcOG+VoPxUS/Y8eOVB9/06ZNdOnShXz58hEdHU3NmjV57bXXnPIDBw4QGRnpK+c//PADYWFhPP/88862ffv20b9/f0qWLElkZCSJiYkMHz7c8zEvbjqjR49mxIgRJCYmEhUVxdKlS1Pd/vSgdOnSjB07lsOHD/Pyyy8Drnn922+/pU2bNsTHx9OqVStnn8WLF9OqVSsSEhKIjY2lSZMmfPrpp57jHjhwgNtvv51SpUoRFRVFoUKFaNKkCYsXL3bqrF+/no4dO1K4cGGioqIoXrw4HTp0YPfu3Zlz8WnkySefJCwsjFdeecX3wzUyMtKxQp85c4bRo0dTuXJloqKiKFy4ML169Up2jYsWLaJLly6ULFmS6OhoypcvT//+/fn999+dOsOGDePBBx8EIDExMUu52B48eJCCBQt6XlCFSy5xp7yZM2fSpk0bihUrRkxMDFWqVOGhhx7i6NGjnn1EBrdu3Ur79u2Ji4ujVKlSDBw4kH///ddTd8+ePdxwww3Ex8eTJ08ebrzxxmTjIpx96enevTtly5YlJiaGsmXLctNNNyUb47IawfatsGDBAmrXrk1MTAyVK1d2tN2CnzvcuZ755s2bM2/ePHbu3Olxg8psgu0DcUk7Xx9AcOM1wPDhw2nQoAH58+cnISGB2rVrM3nyZIwx5233Sy+9RHh4OElJSc62EydOMGLECGdMKFSoEH379uXAgQOefeVaZs2aRa1atYiOjmb48OHnPWdmEx0dTWRkJBEREc62YPvs33//ZeDAgRQtWpTY2FiuvPJKvv76a8qWLUufPn0ytN2TJ08G4KmnnqJx48bMmDGDf/75x1NH5usxY8bwzDPPkJiYSFxcHI0aNfK8Y52LL774goIFC9KxY8dkY5xNsDKREt999x2tWrUid+7cFCpUiLvvvjvZ9Rw/fpwhQ4aQmJhIZGQkJUqU4L///W8y9+pg5q0LGRsy1RKUHrzxxhvcfffd3HvvvXTo0IGwsDC2bt3Kjz/+mKzuLbfcQqdOnXj77bfZuXMngwYNolevXnzyySfnPc+4ceOoXLkyzzzzDPHx8VSsWJFJkyZx6623MmzYMK6++moASpUq5ezz0UcfER4eTrt27QgPD2flypU0a9aMtm3bMmTIEADy5MkDwObNm2ncuDFFixblxRdfJF++fEyfPp1evXpx4MAB7r//fk97Bg8eTN26dZkyZQp//vknSUlJNGvWjA0bNlCmTJk096cwcOBArrzySiZNmsSZM2fIly8f3377LU2aNKFEiRKMHz+evHnzMm3aNG6++WZ+//137rnnnlSd49ChQ7Rp04bKlSszceJEChUqxN69e1myZInnoezTpw8zZ87kvvvuY8yYMRw4cIDhw4fTtGlTvvnmGwoUKODU3blzJ3379mXIkCFUqVLFd2JKD7Zu3QqctaqlJTYoJX788UcaN25M4cKFef755ylQoABvvPEGffr0Yf/+/QwaNIhChQrRsWNHXnvtNYYPH+6ZbK
dOnUpkZKRjodq3bx/169fnkksuYejQoZQrV46VK1cyYsQIduzYwdSpUz3nf/7556lYsSJjxowhISGBChUqpOv1pYb27duTK1cuPvvsM2fbiRMn6Ny5M/379+ehhx5yXgzeeOMNevXqRZcuXXjttdeIiIjg5Zdf5uqrr2bhwoXOx1LPnj1Zt24dTzzxBBUrVuTQoUOsW7eOgwcPAnD06FFat25NYmIi48ePp0iRIuzbt4+lS5dy+PDhzO+EIDl9+jRLliyhTp06nnHoXNx555288sor3H333XTs2JEdO3bw6KOPsmzZMtatW0fBggUB2LZtG40aNeLWW28lT5487Nixg2eeeYamTZvy7bffEhERwa233soff/zBCy+8wKxZsyhWrBiQNVxsGzVqxKRJk7jnnnu4+eabqV27tuelSNiyZQvt27dnwIAB5M6dmx9++IFRo0axevXqZK7DJ0+epHPnzvTr14+BAwfy2Wef8fjjj5MnTx6GDh0KnNWMX3XVVezZs4eRI0dSsWJF5s2bx4033pjs3Dt27KBSpUp0796d/Pnzs3fvXiZMmEC9evXYvHmzcy+yGsH2LZxVoA0cOJCHHnqIIkWKMGnSJPr160f58uW58sorUzyP3zNfsmRJbr/9drZt25Yh7p3Bkt59kJrxeseOHfTv35/SpUsD8NVXX/F///d//Prrr44cBmKM4cEHH+T5559n0qRJzgv9mTNn6NKlCytWrGDQoEE0btyYnTt3kpSURPPmzVm7dq3H0rNu3Tq+//57/ve//5GYmJgl3MLFc8EYw/79+3n66ac5evQoPXr0cOoE22d9+/Zl5syZDBo0iJYtW7J582auvfZa/v777wy9hmPHjvH2229Tr149qlWrxi233MKtt97Ku+++S+/evZPVHz9+PJUrV3biXx599FHat2/Pzz//7LxfBvLOO+/Qq1cvbrnlFl544YVzevmkVib8OHnyJO3bt3ee3S+//JIRI0awc+dOJ1beGMM111zDp59+ypAhQ7jiiivYuHEjSUlJrFy5kpUrVzpKvWDmrZdeeintY4O5iPTu3dvkzp07VfvccccdpmDBginWefXVVw1g7rnnHs/2J5980gDmt99+c7Y1adLEtGrVyvn/li1bDGAqVqxoTp486dl/5cqVBjCvv/6673k7duxorr32Ws+2qKgo069fv2R1u3XrZqKjo83u3bs929u0aWPi4uLM33//bYwxZtGiRQYw9evXN2fOnHHqbdu2zYSHh5s77rgjpa4wxqTcz/PnzzeAadOmTbKya665xsTGxpq9e/d6trds2dIkJCSYI0eOGGOMmTBhggGS1ZNjr1y50hhjzOeff24As2DBgnO2denSpQYw48eP92zfvn27iYyMNEOHDnW2NWjQwADmiy++SOHqU8fUqVMNYL766itz8uRJc/jwYTN37lxTqFAhEx8fb/bt22eSkpKM36Mj+/7888/OtmbNmplmzZp56gEmKSnJ+X/37t1NVFSU2bVrl6deu3btTGxsrDl06JAxxpiPPvrIAOaTTz5x6pw6dcoUL17cdO3a1dnWv39/ExcXZ3bu3Ok53pgxYwxgvvvuO2OMMT///LMBTLly5cyJEydS1U9pRfpozZo156xTpEgRU6VKFWPMWdkFzJQpUzx1jh49avLnz286derk2X769Glz+eWXm/r16zvb4uLizIABA855vrVr1xrAfPDBB2m5pIvGvn37DGC6d+9+3rrff/+9Acxdd93l2b5q1SoDmIcffth3vzNnzpiTJ0+anTt3GsB8+OGHTtnTTz+dTN6zAr///rtp2rSpAQxgIiIiTOPGjc3IkSPN4cOHffeR61y+fLkBzIYNG5wykcF33nnHs0/79u1NpUqVnP/LOGj3kTHG3HbbbQYwU6dOPWebT506ZY4cOWJy585tnnvuOWe7jIdLly5NRQ9kHMH2bZkyZUx0dLRnDDp27JjJnz+/6d+/v7PN7/rO9cwbY0yHDh1MmT
JlMuTagiW9+yDY8TqQ06dPm5MnT5rHHnvMFChQwPN+UKZMGdOhQwfzzz//mK5du5o8efKYxYsXe/Z/++23DWDef/99z/Y1a9YYwLz00kue4+XKlcv8+OOPqeipjEPmkcB/UVFRnnYHcq4+++677wxgBg8e7KkvfdS7d+8Mu5bp06cbwEycONEYY8zhw4dNXFycueKKKzz1ZL6uXr26OXXqlLN99erVBjBvv/22s81+53vqqadMrly5zKhRo5KdO/D9JDUy4Yc8u/YYZowxTzzxhAHM559/bowxZsGCBQYwo0eP9tSbOXOmAcwrr7xijEndvJXWsSHLusPJF778M//fdFm/fn1+//13br75Zj766CNHm+tHYDICSWawa9eu856/S5cuqbIqHD58mEWLFtG1a9eg6i9ZsoQ2bdpQokQJz/bevXtz5MgRVq1a5dneo0cPj3nv0ksvpUGDBunmuuTX7iVLltC2bVuKFi2arI1///03a9asSdU5KleuTEJCAgMHDuTVV1/lhx9+SFZn7ty55MqVix49enjuf6lSpbjsssuSudsUK1aMxo0bp6odwdCwYUMiIiKIj4+nY8eOFC1alPnz558zTuVCWLJkCa1atUqmze/Tpw///POPk+iiXbt2FC1a1KMZXLhwIXv27OGWW25xts2dO5cWLVpQvHhxTx+2a9cOgOXLl3vO07lz53NqMi8Gxse1I1A+v/zyS/744w969+7tucYzZ87Qtm1b1qxZ41gX69evz7Rp0xgxYgRfffUVJ0+e9ByrfPny5MuXj8GDBzNx4kQ2b96ccRd3kZBxItCto379+lSpUsXjQvjbb79xxx13UKpUKcLDw4mIiHCszd9//32mtTmtFChQgBUrVrBmzRqeeuopunTpwk8//cSQIUOoXr2649a3fft2evToQdGiRcmVKxcRERE0a9YMSH6dYWFhyWIMatSo4XFfW7p0KfHx8cnmHVsrLRw5coTBgwdTvnx5wsPDCQ8PJy4ujqNHj2bpPg62bwFq1qzpaN/hrKtSxYoVg3b5C3YuzWzSuw9SM14vWbKEq666ijx58jgyO3ToUA4ePJgss+bBgwdp2bIlq1ev5vPPP/e4Ect58+bNS6dOnTznrVmzJkWLFk0219aoUYOKFStecP+lJ9OnT2fNmjWsWbOG+fPn07t3b/773//y4osvOnWC6TPp4xtuuMFz/G7dumWYd4kwefJkYmJi6N69OwBxcXFcf/31rFixgi1btiSr36FDB48lR95rA58rYwz9+/cnKSmJt956i0GDBp23LamViXMRGDctY6DMQ2JpD5yPrr/+enLnzu3MR6mZt9JKlv0IKlOmDBEREc6/J554AjjbGZMmTWL79u1cd911FC5cmIYNG/p2hu02BTjmtWPHjp33/OLeESxz5szBGBN0Wso///zT9xzFixcHSPZxF/ghIttS+ghMDYFtOX36NH///Xeq2ng+ChQowPLly6lSpQoPPvggVapUoWTJkjz++OOcPn0agP3793P69Gny5cvnuf8RERF88803ngnGr93phQyu69evZ8+ePWzcuJEmTZpkyLkOHjwYVD+Hh4fTs2dPZs+e7fjNTps2jWLFijnumXC2D+fMmZOs/6pWrQqQaX2YFo4ePcrBgwedaweIjY0lISHBU2///v3A2Ukq8DpHjRqFMcZJDT1z5kx69+7NpEmTaNSoEfnz56dXr15OrEaePHlYvnw5NWvW5OGHH6Zq1aoUL16cpKSkZB9MWYmCBQsSGxvLzz//fN66IkPnkjMpP3PmDG3atGHWrFkMGjSITz/9lNWrVzs+58GMnVmFunXrMnjwYN5991327NnDfffdx44dOxg9ejRHjhzhiiuuYNWqVYwYMYJly5axZs0aZs2aBSS/ztjY2GTJbqKiojh+/Ljz/4MHD/oqSfzG7h49evDiiy9y6623snDhQlavXs2aNWsoVKhQSPRxSn0rBM6/cLbPgrk+v2c+q5FefRDseL
169WratGkDnM0I+cUXX7BmzRoeeeQRILnM/vTTT6xatYp27dr5pl3ev38/hw4dcmJo7H/79u3L0vOEUKVKFerWrUvdunVp27YtL7/8Mm3atGHQoEEcOnQo6D6T8S/w+Q0PD/e9h+nF1q1b+eyzz+jQoQPGGA4dOsShQ4fo1q0bgG/8WLDvtSdOnGDmzJlUrVrV+aA+H6mVCT/8+kzGQOnngwcPEh4eTqFChTz1wsLCPO+1wc5bF0KWjQn6+OOPOXHihPN/sZiEhYXRr18/+vXrx5EjR1i+fDlJSUl07NiRLVu2ULJkyXQ5f2oDLt9//31H2xAM+fLlY+/evcm279mzByCZT7hfcO2+ffvS7QENvN5cuXKRkJAQVBvl5SAwSNjvgalZsybvvvsuZ86cYcOGDUyePJmhQ4cSHx/PgAEDnIDTzz//3NdvNdAfNaMCY2Vw9cO+XjsYPZgBwo8CBQoELQt9+/bl6aefZsaMGdx444189NFHDBgwwNNXBQsWpEaNGo7iIBD7AwMyrg/Twrx58zh9+rRn7QK/9kmfvPDCC+fMLiUTWsGCBXn22Wd59tln2bVrFx999BEPPfQQv/32m5M5rXr16syYMQNjDBs3bmTatGk89thjxMTE8NBDD6XzVaYPuXLlolWrVsyfP5/du3enOPbJOLF3795k9fbs2eP056ZNm9iwYQPTpk3z+KNLTFyoEhERQVJSEuPGjWPTpk0sWbKEPXv2sGzZMsf6A1zQmkcFChRg9erVybYHjt1//fUXc+fOJSkpySNb//77b4at6ZSRBPZtepCVxqRguJA+CHa8njFjBhEREcydO9fzQf7BBx/47teoUSOuv/56+vXrB8CECRM8saQFCxakQIEC58weGR8f7/l/qNyTGjVqsHDhQn766aeg+0zGx/3793u8c06dOpVuimY/pkyZgjGG9957j/feey9Z+WuvvcaIESPSlKlXkhxdffXVXHXVVSxYsIB8+fKluE9qZcIP6TP73VTGQNlWoEABTp06xYEDBzwfQub/pzqvV6+ep/755q0LIctagmrUqOF84detW9f3SzAuLo4OHTowZMgQjh8/nuFuLOf64v7nn39YsGCBr/n+XJqvVq1asXjxYkejLUyfPp24uDjq16/v2R6YkW379u2sWrUqXRe68mvjwoULk2UFmT59OgkJCc5HQtmyZQGSLSKb0iKxl1xyCbVq1eLFF18kJiaGdevWAdCxY0dOnTrF/v37Pfdf/ol27GJyruuVoL/U0qpVK+elzGb69OnExsZ6XvKrVKlCgwYNmDp1Km+99Rb//vsvffv29ezXsWNHNm3aRLly5Xz7MPAjKKuwa9cuHnjgAfLkyUP//v1TrNukSRPy5s3L5s2bfa+xbt26REZGJtuvdOnS3H333bRu3dqROZuwsDAuv/xyxo0bR968eX3rZCWGDBmCMYbbbrvNozQSTp48yZw5c2jZsiVwNpmEzZo1a/j+++8dVxl50QnMNCfZ+mxSY1nPTPwUCuC6uBUvXjxV1xksLVq04PDhw8nGvcCxOywsDGNMsnNPmjTJsYhnVYLp24wkWEtSRpLefRDseC2Ld9svxMeOHUuWUdamd+/ezJgxg6lTp9KrVy+PfHXs2JGDBw9y+vRp3/NWqlQpVdeRVfjmm2+As0mMgu0zSVIxc+ZMz/b33nsv6OVRUsvp06d57bXXKFeuHEuXLk32b+DAgezdu5f58+en+Ry1atVi+fLl7N69m+bNm593MfL0kok333zT838ZA+V9VeabwPno/fff5+jRo055sPMWpH1syLKWoHPRt29fEhISaNKkCUWLFmXv3r08+eST5MuXjzp16mTouStUqEB0dDSvv/46FStWJHfu3JQoUYIvvviCEydO0KVLl2T7VK9enSVLljB37lyKFi1KQkICFStWZNiwYcyfP5/mzZvz6KOPkjdvXl5//XUWLlzI2LFjk31x7927l+uuu45+/fpx6NAhhg4dSmxsLI
MHD86w6x0+fDiffPIJzZs355FHHiFv3ry89tprfPrppzz33HNOdpgmTZqQmJjIvffey7Fjx4iPj+fdd99l7dq1nuO9//77TJs2jS5dupCYmMjp06d55513OHbsmLO4bKtWrejVqxc333wzd999N02bNiU2NpY9e/awYsUK6tWr52i2Lhbt27cnf/789OvXj8cee4zw8HCmTZvmpBVPLUlJSY5f+NChQ8mfPz9vvvkm8+bNY/To0cmsi7fccgv9+/dnz549NG7cONnA9Nhjj7Fo0SIaN27MPffcQ6VKlTh+/Dg7duzg448/ZuLEielmMU0rmzZtcvyNf/vtN1asWMHUqVPJlSsXs2fPTmYmDyQuLo4XXniB3r1788cff9CtWzcKFy7MgQMH2LBhAwcOHGDChAn89ddftGjRgh49elC5cmXi4+NZs2YNCxYs4LrrrgPO+kG/9NJLXHPNNVx66aUYY5g1axaHDh1y5DKr0qhRIyZMmMBdd91FnTp1uPPOO6latSonT55k/fr1vPLKK1SrVo3Zs2dz++2388ILL3DJJZfQrl07J8tOqVKluO+++4CzcXvlypXjoYcewhhD/vz5mTNnDosWLUp27urVqwPw3HPP0bt3byIiIqhUqVJQ2sKM5Oqrr6ZkyZJ06tSJypUrc+bMGb755hvGjh1LXFwc9957L8WLFydfvnzccccdJCUlERERwZtvvnlByw706tWLcePG0atXL5544gkqVKjAxx9/zMKFCz31EhISuPLKK3n66acpWLAgZcuWZfny5UyePDlLLTrrRzB9m5FUr16dWbNmMWHCBOrUqcMll1xyTot9RpHefRDseN2hQweeeeYZevTowe23387BgwcZM2bMedd069atG7GxsXTr1s3JRBYZGUn37t158803ad++Pffeey/169cnIiKC3bt3s3TpUrp06cK11157IV2V4cg8Amddp2bNmsWiRYu49tprSUxMDLrPqlatyk033cTYsWPJlSsXLVu25LvvvmPs2LHkyZPHN/37hTJ//nz27NnDqFGjfJXZ1apV48UXX2Ty5MlBh1n4UaVKFVasWMFVV13FlVdeyeLFi885/6eHTERGRjJ27FiOHDlCvXr1nOxw7dq1o2nTpsDZNeyuvvpqBg8ezN9//02TJk2c7HC1atWiZ8+eAFSqVCmoeQsuYGxIdSqFdCQt2eGmTJliWrRoYYoUKWIiIyNN8eLFTffu3c2mTZucOpIdbv369Z59JdPaihUrnG3nyg43btw43/O/8cYbplKlSiYiIsIA5vHHHzfdu3f3HMPm66+/No0aNTIxMTEG8NTbsGGD6dixo0lISDBRUVGmZs2aZvr06b5tfuutt8zdd99tChUqZKKiokyzZs3MunXrguqzYLLDzZkzx7d8/fr1pn379k4ba9WqZd54442B0dHgAAAgAElEQVRk9TZv3mxatWpl4uPjTeHChc39999vZs+e7ckOt2nTJnPjjTeaSy+91ERHR5u8efOahg0bJjvemTNnzMsvv2zq1atnYmNjTWxsrClfvrzp06eP5542aNDA1KlTJ6g+CJZgspcZczYjS+PGjU3u3LlNiRIlTFJSkpk0aVKassMZY8y3335rOnXqZPLkyWMiIyPN5Zdffs5sUn/99ZcjT6+++qpvnQMHDph77rnHJCYmmoiICJM/f35Tp04d88gjjzhZ/STbzNNPP53itaYngVl9IiMjTeHChU2zZs3Mk08+6cncaMz5x4jly5ebDh06mPz585uIiAhTokQJ06FDB/Puu+8aY4w5fvy4ueOOO0yNGjVMQkKCiYmJMZUqVTJJSUnm6NGjxhhjfvjhB3PTTTeZcuXKmZiYGJMnTx5Tv359M23atIzriHTmm2++Mb179zalS5c2kZGRJnfu3KZWrVpm6NChTp+ePn3ajBo1ylSsWNFERESYggULmv/85z/ml19+8Rxr8+bNpnXr1iY+Pt7ky5fPXH/99WbXrl2+cjtkyB
BTvHhxc8kll2SZLGYzZ840PXr0MBUqVDBxcXEmIiLClC5d2vTs2dNs3rzZqffll1+aRo0amdjYWFOoUCFz6623mnXr1iXL5HYuGfTLErl7927TtWtXExcXZ+Lj403Xrl3Nl19+meyYUi9fvnwmPj7etG3b1mzatMmUKVPGk4kqq2WHC7ZvJTtZIIHj4bmyw53rmf/jjz9Mt27dTN68eU1YWJhvls6MJr37wJjgxmtjzr7/VKpUyURFRZlLL73UjBw50kyePDnZvON37qVLl5q4uDjTtm1b888//xhjjDl58qQZM2aMufzyy010dLSJi4szlStXNv379zdbtmw577VcLPyyw+XJk8fUrFnTPPPMM+b48eNO3WD77Pjx4+b+++83hQsXNtHR0aZhw4Zm5cqVJk+ePOa+++5L92u45pprTGRkZLI5z6Z79+4mPDzc7Nu3L8X5OnBs9nuGdu/ebSpXrmzKli1rtm3bZozxl8VgZcIPOe/GjRtN8+bNTUxMjMmfP7+58847PXJszNlMiYMHDzZlypQxERERplixYubOO+80f/75p6desPNWWseGMGOCWGVLOSf//vsvhQoVYtSoUdx5553pfvzFixfTunVrZs+ezTXXXJPux1cURVEURVG8fPnllzRp0oQ333zTN8ujEvqEnDtcViMqKirDF9NSFEVRFEVRMoZFixaxcuVK6tSpQ0xMDBs2bOCpp56iQoUKjuu0kv3QjyBFURRFURQlx5KQkMAnn3zCs88+y+HDhylYsCDt2rVj5MiRydLjK9kHdYdTFEVRFEVRFCVHkWVTZCuKoiiKoiiKomQE+hGkKIqiKIqiKEqOQj+CFEVRFEVRFEXJUehHkKIoiqIoiqIoOYosmR0uLCws3Y95++23O79r1qwJwLJlywD44osvnLJff/3Vs1/+/Pmd3xUqVADcldLt1WgXLFgAwAcffJBubU5LzoqM6Dt7teTvvvsOgJ9++gmAEydOOGUNGzYE4OjRowCeVZl///13AIoWLQrAP//845RVqlQp3ducVfrOj4oVKwLQoUMHZ5v0y7///gvAtm3bnLJVq1YB8MMPP5z32PY1yO8zZ86kqn1Zue/efPNNAHbv3u1sGzx4sKfOlVde6fyeMmUKANOmTQNgxIgRGdq+1PZdRvdb6dKlAXj22WeBs6uHC5Lx6PTp0wAcPnzYKZP+nTVrFgBTp05Ndmxpe3rk1snKMpfVuRh9lytXLue3yI8fCQkJADzzzDPONplvjx8/DkC5cuWcsj///BOA8ePHJzuWzEOpHc9SQuUu7WjfpR3tu7ST3rnc1BKkKIqiKIqiKEqOIktagtKTrl27AvDoo4862w4dOgRAixYtAFczDzBnzhzAtVTcdNNNTplo50+dOgV4NWBiBUlPS1BWoUSJEs5v0RDfe++9gGsVA2jcuDHgWtPuv/9+p0ysb507dwbgyJEjGdjirMnrr78OQLNmzQDXYgYQGxsLuNpRe10C0Y7u2LEDgN69eztlf/31F+BqZlPSymYHxCpmWzR++eUXAA4cOAB4rbnff/89AKNHj86sJl50rr76auf3zJkzAXfsEgsueC284MoeuFrHpKQkAHr27OmUtWzZEkh/jZwSOpxvnOnVqxfgWmXLli3rlMnz+uOPPwJw7bXXOmWff/45AB07dgRg7ty5Tll6WoCUrIeMRylZPPzkzh7HAmWkXbt2zm95j8mXLx/g9VSRBe83b94MwOLFi1M8jyBttdss46L8VbnN2qglSFEURVEURVGUHIV+BCmKoiiKoiiKkqPI9u5wYsIUlyKAgwcPAq5bmwTtA5QqVQpw3YvWrFnjlIlZU1zlIiIiMqrZWQo7CD0yMhKAcePGAfDJJ584ZfPmzQNcF6S77rrLKevWrRvgujfkzp07A1ucuYgp3A4WFtkaO3ass+26664DXLct27x+7Ngxz7HsMulzcd+0+3XkyJGec2d3dzjpO9tF86233gLcAGupA67Lg53AI7
tSo0YNAGbMmOFsk0QbcXFxADRp0sQpE7cQkTXbRVXkUfqyadOmTtm3334LuIlh5Bz2sdQFJPTxc/ERqlWr5vy+4YYbANcdGtz5Vp4/GafAHRPF5Xf16tVOmbj3/u9//wO8rnIfffQRAB9++OE526oumqGBPb/J3HXy5Mk0HctvrLn++uuTla1btw6Affv2JTufJJCpX7++pw7Apk2bznkeJfRRS5CiKIqiKIqiKDmKbG8JkoBzP8LDwz1/wU0TGxMTA3i1YWL5kb+21km2ZUdNqFgiAPbs2ePZlidPHqfsqaeeAiA+Ph6Ajz/+2CkTLfXll18OuMkpQo2UAiDF+mNj950E7O/cuRPwplgXjaloxWzr5NatWz3HDEzjDjnD0gFuohMJYAU3Da9YJOz+EYuRpCMXa2WokpLFTxJv2FZvSWmfN29eAAoXLuyUyVjnZ30ULakkeilZsqRTJlbdb775BvAmqZBxLyUrghK6yNjeunVrZ5sscfDHH38420S2xNpoj2eSGKF27dqA93m95pprANcqac+jDzzwAAC33norAJ06dXLKVMZCC/u+Xui7koz/4CbAue+++wDXon0+xOL99ddfA175lrlVksr4eQDZViWZy20LuZJ1UUuQoiiKoiiKoig5imxvCapcuTLg1dKLFlW09HZqWLEciXbCT0sh8Sy//fabs000XkWKFAFg79696XMBWQDb11ssOg899BDgxqkAfPrpp4CrsVu7dq1T9thjjwHw888/A67PLrialbT6BGcVbP910VIWKlTI2SbaUZE30aQDNG/eHHDlTfyXwdWqi9wOHDjQKRONlSxqOXv2bKdMtKPZSSsvWjaJOwPXSiHaZnuh2YIFC2Zi6zKewPsncgNQvnx5AKZPn+5sE+2laM/tMUsQ7aeMleCOY2LlsWVVYgRD1ZqrBIdtGZSx55ZbbgG8cicxsvZYJxZYWWTXjo0U74Hnn38ecK0/4C5qLotH22OdxGnIOCpzEcDGjRtTeXXKxaRq1arOb1lKQ2I6be+SQOwxSjwkbE8eQVKti7URUp77AmPKtm/f7pTJu6Pf/vLO4ucFIvFt9hxkWy+VrIFaghRFURRFURRFyVHoR5CiKIqiKIqiKDmKbO8OV6FCBcAbpCYBnLKKtSRBANcFQILhbJcAcdsSs7ydUlYCji+77DIge7nD2YHWAwYMAFw3m+7duztl4kJTvXp1wJsGWwIPJbGCHSgr5uJQ6DM/k3ifPn0AuOmmm5xtYgq302ZLanZxuRQ3EnBTh4tZ3V7NWlLJSh/aSRDy588PwIMPPgi4K7SDGxwa6i5w4LquiiuW7cIqfSVB2+JWAa6Ljh08G8oEuufaz9/Ro0cBbxpscTsVmbGfZem3MmXKAN6AX5HbevXqAdCoUSOnTJ5dkeMCBQo4ZSLjSujjl3xDEtvImATw448/Au6yCQBJSUkAtGrVCvCOZ5KsY/ny5YA7foKbdEPq2/K6a9cuAK644grAdbsG6NWrV/AXplx07PFE7r+4gNvu2/Jb/m7ZsiXZsey5QMZHGb8kvTW486CU2XNz4Pls2bfbYx8H3DHTPpa0QcZFWXYFIDExMdk5sxv33nuv83vRokWAN5GRYL9bC9K3fgnGMuo9Ri1BiqIoiqIoiqLkKLK9JUisEXbgmmjpJSBYkhqAa5WQr3/7S3TDhg2A+9VvB7zJ8W2rUnbB1rQUL14cgPXr1wNeDYik05XkB3bwY48ePTz72dr69957LyOaneGItuLmm28GvIt0+mmUxBopWnlbwxSYWtjeTyxGYg2xNfZST85tB46KXNva1FBFkh9IoP7ff//tlEm/fvXVV4BXS/3ZZ58B3gV/swOyeKlt9RELtz0GSSIEGats7ZvImli07WdZrOQiO2LBBVcjJ8cUKyS4CVOU7IkEgtvpsCtWrAi44yC4i/NKchxbfuRZ/u9//+upA+5ilfv37wfcJAjgyqRYnuwyJbSwPXMC06HbGv9AK4z9PiZjmV
1H5kgps+dRKZNj2GVS3y+ZUKDlyE7g5FdfjiXXKLIM/stbXEzSmjTJniukH+WdrmfPnk6ZJOSRpFH2ch8ppUbPzEXf1RKkKIqiKIqiKEqOIttbgmSxLNFMgWvZkNgeW1MuWnpJu2h/kYrPvfgr2+lmZcE4v4W0Qh1bsy79KBY2e0HU9u3bA9CuXTvAjY8CN6Xxk08+CcDSpUudslDV6IlGU6wStqyIHNgaU5GbQD9nG7/FJkWzJNZG+zwip1Jm3yuJ5/jkk09SeWVZj3vuuQdw+1wsPODG+8i121YiiROyF63NDohm3X52/KzQ8pzK2GX3g4x1omG3ZUfGRBkrbav3jh07AChWrBjgn9o+O8ShKcmRdNb2uC/psNesWeNsk5T9Eg9hzwUzZ84E3Hi1F1980SmTOUTiSlesWOGUNWjQAHCtunZKbiW0sN/H/Cw6gUiZX+xNSvNoSvX9rCB+x5K5VcY5v1gW29MocPFpiUs/1/EvJmkdp/0sNZI+315OQeZfiV0VbyFwF00WzyyblBYFT2/UEqQoiqIoiqIoSo5CP4IURVEURVEURclRZHt3OFn5V9I3g+uWJEHkdvpOcS8S06mkIAbXhCuB6uJqZ+O3cnCoY6e9lT4TNwUJUgW47bbbADfl7htvvOGUPfvss4DrqvTSSy85ZRLAHwopsm0k8YPIjJ0SXNzgbPcjcS0KxhxvBw0GmtftFbLFRC/b7DJxU8kOiBucuMLYacLtIFvwuohJmlJZhdx267LTtIcaMq6JuwG4995OZCJyJXJiy5zUl8BkvzLZ307nXrRoUcCV0ebNm1/o5ShZnDp16gCwevVqwDvPzZ49G4BLL73U2dasWTMAVq5cCcCCBQucsjvuuANwXeXuv/9+p0zkVNxgSpcu7ZSJu6eMqfaSBA8//DDgumpmVzLTTQjcYPdt27al63Elvb6NX8rqwGQJfumz/er7JUYIHNP8EjDINr+kCYIt+36ucec6JnjH5lBCnjl73hXErbVt27aA951QXK27dOkCuMlQwJ13Dh06lOyYMofbSRYyKqmEWoIURVEURVEURclRZHtLkCyWdfXVVzvbJKBStMT217n8Fo2UnehA6vtp/uV3qAb5p4Sd+lmsZtIXsmgquH0mfWEHyInW7/bbbwfcBfMgdFM4S2pYkRk71bpck60FCtQa2WWBmr1gAxalzyWo3Q5SF419dkDkTqw3dvID27oDXuub9IsEX9oB2nZyhVCjZs2agDfAWK7blh2xkokW3dbkiTzKX1sTK/X8kh7I+CeWcD+tbnbE1jyLZlOeMVv7KSmf02MukLFU7oN9/zIzyFoWA5dkGLaMlS9fHvDOE9JesSDZC4vL2CjWcr/Fn2XR59dff90pEyuPzD2hurSCH36JcPysPbJN3mHq1q3rlM2fP99T155vUrJmCPY7kmjdGzduDHgtwenhseG3QL2MK7ZnjsiR33zoJ/+B2+z/B2M9CxwTIbl1ya++X+puP7KyJSjQcmVfR2DCMNtCI+92gV4X4M4Nfn0fuMg5uAl85Dz2PJ9RqCVIURRFURRFUZQchX4EKYqiKIqiKIqSo8j27nCyWm/evHmdbeJiICY6v2C7wKBhcM18trlPkPrZMTBT1mEBd70fMVPbazVMmTIFgBo1agDeNXIkacLnn38OQKNGjZyyrGwiTglxERFTse1OJH1mu/qJC4CYm20TeqAJ2i8xQmBdcN015a+drMN2zwt1JOh61apVgNdlQhDTuV0m90Fck6pUqZKh7cwsJODUToQhY5XtIidJTVJK2OLnfiMy5heILfIoxxQ3CHATp/zwww+puZyQQJK7gDvGiZvQVVdd5ZTJejm7du0C4NNPP3XK7LXlApH11Vq3bu1sExclcXV64IEHnLKU3G7Sm2rVqgGuG7OdMEjkTuZacO+/rCskfQGuG7HMD2PHjnXK3nnnHcANspb1hgD69+8PuPOvPQ62aNECgKlTp6bl8i4afsHzga5D9jg+at
QowF17xk5KcsUVVwDuHGvPIfbvQETe5Njgjhtyv1955ZVgLido7DFKkDnMdq8Vd16/8csviUFaE0cEzsl+roSBCYrOt80v0UNmJbRICyklZxL+7//+D4CGDRs620TePvzwQ8Cdm8B1lZNj2a7q4vpm94m8C4qLnP2unVGhJmoJUhRFURRFURQlR5HtLUGysrqNaIpFc2xrHqRMvobtr1TRjkh9+0tfvmCzoyXITg4hFiBJaygB2uAmoZA0iHbAeq1atTz7S8BsKCMpqAPTeIIrI37pH/1SdAaWpZQi204JLVZNkVv7fH7atlDCTvEt1yzB17a1R35LnZTKxJIZ6tSuXRvwyoloS+3rF3mSMr+0sPJ829pWsW76pUYV2ZYxzx4Hq1evDmQPS1Cglt5OAiMpnyWA3E5CIimf5dm0V4wXbaakkBZrMriWYtvyK/L6xRdfJGtfStr99Ea0u+JRIWM9uBYsO8B53rx5gNs/YqmxqVSpEuCdf6V/pL6dIlvmE7E82R4E9hIYmUVKKeXtMr/kIoLfHCDW1EGDBgFerbvIiKTIF+07wGuvvQbA0qVLAe8SFZKqXNrSsmVLp+yZZ54BvMmKZK6RpBeSYAbSx3PDvq8ix7ZVW5CxRf7a/ZVSwgK/eTTw+H6WMr/9AtuSkpXIrid9bde3vWqyKiklZZK5xbYE33XXXQDs27cP8I4NIlPFixcHvO/HIlN2gjHpd/HYCFz+IiNQS5CiKIqiKIqiKDmKbG8J2rBhA+D1JwxMxWhrauS3X7rZwDp+lqDsiK1BCdQai4UH3LSJoiUQ7RPAunXrAOjcuTPgLrYKmb8AXHoh1ynWCdvyIrL1yy+/ONv8LEaB+GmgAjVetlZetK9yX+w+DHVLkCyQCq7mU55j27c4ELtMNG9yP+xjhrK1QmJS/FKz2vIRaHX0iz2QZ9m2+Abub8uVxADZMSCCWILefffdVF1PRmFfk/RVsOOMjHsy3m/dutUpk5iV999/H4BPPvnEKZN788gjjwBeH3nR0n/77beAN0ZN2jVu3Dhn21dffXXO9gWbRj89EG2txOa9+eabTplcnx0fKs+ZPIvXXXedUyaxPfJX4kXBtfZI3JCdmnnx4sWeMju+qlOnTgA89dRTabi6tGH3f0rWnpQQOXriiSecbWJtE3mwvVlk/POTYZljxUpsxwtJfbEk2fuLtdHWustYIh4b9evXd8rSY2kB+/zyW/rTzyKUkodESous+lnkAuMd7d9+xwzmObPHGb8U04FtDgXs9svzJWObxPOAO0ZJXKQtR19//TXg3mPbU0XmKTuOWbZJvcxY0FwtQYqiKIqiKIqi5Cj0I0hRFEVRFEVRlBxFtneHE2zznSAmOtv0GRjgaBOYjtgmM1a2vVjYJsyOHTsCUK9ePQASExOdMnFXkCBeO5BYAmTFXWb69OlOWSi5wdkmYgnsE3ci2w1L3NRsuZNykTG/wEy/FasD3QRsc7O4Tojrmx3cbq/KHYrYiTUCsd1bxfXGdgMLJLs9nxIIbrvhiKuFHXwrq7uL7Pi5Y6QUBCxlfilqRUbtMgl2zyqk1k3JngsC++XLL790fg8YMABInpYYYMyYMZ6/tkubuAzv3r0bcN3jAHr16nXOdvndt8xMkf3jjz8CUKxYMQDKlSvnlMlyB3YqcBnnJYmB9Be4QfniTix1AZYvXw6484o9D8tyA1JmB2dLum17zMhoVxo7fe/DDz8MuPOc7R4vz4S4DUofgtt369evd7aJm67Mi3biCEk57OcqJ7IrboLicgmurMixbDdDcW+1r0fmKnH1thNbpIc7nL1kSWDyA3scl+fXT/5TSiwk+O3nl9Y6cD/7nSRwTrb380vOIPv6uWxfiDucXzr1lI6fUj1pr11HfkuyK0nQAa6LpciW7cYrc5G800k6fYDGjRsD7vxjJ9iQdyP7fUaeGzmG7A/+yWHSA7UEKYqiKIqiKIqSo8gxliC/L3u/ADn5KhUtup+W1G+/v/76K5
1bnHWwr23ZsmWAqwn4+OOPnTIJlr388ssBaNWqlVMm/ShaMdEsAGzcuDEDWp0x2FZAuf9iZbDlQVJC2tog0bSJFs4OAE0pfadow0SLYmvsZCE92d/Wyvst6htK2BYN+S0WNtuy47dwaiB+iRFCGdEm20Glgp+GMyWrQUoaQxk3bYuKPAMic/bYmtX6V1L8gvv8VK1aFfBqyjdv3gz4L2As2NZHWdQzGMuX3Xei5ZfnNiXrj43fPcpMC3rdunUBNxlG+/btk9WxE0BcdtllADz99NPJyuT3yy+/DECHDh2csokTJwJuyvF+/fo5ZWKNkPtga4ZlEcfMCKSWseT11193tsl4H5h2HlzLhvy15UHSWttjtTyrYi2ZMWOGUyYphkVj3qdPH6dMklbI2Gh7AgQu/WGnM5d22Yl0xLImZelt4bXPFZiwxLbmBVqH/BYltQkm6ZBfkpjA/fysPX77SZnf+Chjs13fTomfWvyuLdA6lNpkKXZ9GR/lmZMkVuCmUZfECGLxBDc5lliQ7L6QxFDiTeSXQMyev8XStG3bNs/5QC1BiqIoiqIoiqIo6UKOsQTZX/Zi7fGL5QhM1+ynefDzKc1M/+yLifSVaMEk5gDcOCHR5Eh6RHA1oKIFlPihUMNOAysan8D0muBqrmwth2hBxBJkawuD8RUW/3rb2hO4iJytHfaLXQslbKuCxFAELn5qbxP8fNuF7GKxFY28n0Z37dq1zjbRPEp/+Y1Zfn7tIjui1bNlSbTEInv2vbDjHTKaYDTBspAnuM+dxEDY7RZLkB9+KfxlYUCxdEj8D7jWiL59+wL+fRKsBehiYsuWWGHmz58PeJ8rsYIMGzbM2SYLHXfp0gVwLV8AO3fuBGDmzJmAV9s7atQowI3/sS1IEp/yyiuvAN54S/lta9rtFNrpiTwLdv/IMyTtlr/gPldimbH7TiwitiyKfMox7cV2JVW1PIMSIwTu/CBjnN0+KRNZtsdMObdt6Qx8tmxLUJs2bbhQ7ONLexcuXAh4F76VmCeJG7Mt335xjsHEy/il2w609vjFEsr4as+/8tuOY5LYQfGIsb1eJAYsLfi9Y6a0YHLg+H6+/a688krAXY5CUq4DNG3aFHDvlX1NYlEXy6XdFyLf8szY3h1Szx5XZZyQsjJlypzz+tKLnPHmriiKoiiKoiiK8v/RjyBFURRFURRFUXIU2d4dTkymduC5mM7FJGybGQNdH2yzoZg+ZZsdCC8ramd3ZOVgMcfbZlEJZhV3nAceeMApkwA5SagggbPgujqFAnZqT5EDMfna8iCpb203EHFJkABEvyBskVP7WGLGF7cFW9bEtUT63s990y/IMxTwc9nwc4eT3+JmYu8XWD+lNNqhhFyr7UogMiNuQ+C6KokrZkppZf3SvEq/2XK8atUqwA2EtWXq4MGDabqetBBMELB9TYGuFX6uFsEmIBDXup9//hmABg0aOGXz5s0D3PHPdu+RoHVxIbGfc2mPX5C1jB2Z6eJqu0dJ8hpZ9kAS5IDrniaB+eC6cMl1ijs0uK6AQ4cOBdyxC9x+FZlavXq1UyZjo/SrfW/FzSYz3OHkXjz44IPONnG7lLTCkjod3HsmbpJ2cLi4yIl7K7gyInJnpxAXFze/9xN5RsWN2J6rAvfzc8nzu8ZZs2Z5rgsuLLhfsJ8JGWPkftlujuJaJi72tjuc7GenEA98fu3/ByYfsstkrvSbHwOXTfFbJsO+HpEF6X+7fTJmpoXUzt0p1Rf3RnmfA1cupY12OmuRKXG/9HOdlHtjy7fIvPSdnZrdL1mCHEPGj8xY5kMtQYqiKIqiKIqi5CiyvSWoYcOGgPerVoL5AxcEhOQJEWxtgXyxypev34KNos0TDX12Q4IXRYNgB8+JdkG0frYmTgJcRYtsa5ZDCVurI4hs2UH3osGyU8mK5kPq27IVmFzB1vgGLpJqWzok+USzZs0Ar+ZE5NoOkA2lxAC2tlKu3U5MEQy2xSjwmKGIaG1lfLK1m6
J1s+UjGMtBYFAwuOOgbLO1xbt27QLc591ug581LvAepBeilbStT3ItMlbbZZJOWJIa2AHOEyZMALyWQnkmxcpgB7vLvtIvEswObr/IYp5iNQf4/vvvAVcjKgkEwB0XbE2z9K1o3+1FCu0U3xmBnWBErIuS6MAeZ2ShWJlrwb03Mj/YAdFSX/r61VdfdcqefPJJwO2nTZs2OWWyUPfWrVsBr8Vs/PjxnvMG7psRSBvBlbtFixYB8NJLLzllkvhBFpi1rYalS5cGvH0tciOyZcuPjH/Sd/Z7hjyrkkbbtqjIMeQ+SgpicJOC2PIt84Rcl90GwU4RHiySst62LMl4cvfddwPe+UoW2RXLlG1JkOfFHl9SWvUOIkgAACAASURBVFA0MNmDn7eF/LXHtMB3QVv2ZZy02xWYIMC2dKTHe6Gd7EXeO0V+7PFL3jekvbYFz88CKfIiz6zcD3DlQBbKtd/t5Jwy19jvSDIPSH/afee3SK8kwhDrsJ14K6NQS5CiKIqiKIqiKDmKbG8JEu2f/aUuX66iSUhpIVU/rZzs7xeXIF/Ptr9wdkIWQH3kkUcAbxps0dQlJiYC3r6bM2cO4Gp5bK1TKGFrMkRGRKMhGkqALVu2AK5WD9zYJ9FA+aW8FA2zraUS7bpo42wrmmhyJA7L1hQLtvUklCxBKeH37Pk9l6KBFo1UqMcEiTz5WYJEo2trw0Vb7qchDfSDtzWlsk20e7bVRORPNJy25laefVtGxfKS3ohms3Hjxs426RfReNuWsJUrVwJuuyWlP0DXrl0BVxMJrjZS+sW2PEi/S8yE7U0gsTH9+/f3nM9us8RS2tpZ6WO/NOZ+8mvH2WQE9pzZrVs3ALp37w5401rfcMMNAKxYscLZtn79egBatmwJeC2JoimX2EY75bKMoRIbkJSU5JTJoqFyXypWrOiUtW7dGoDrrrsuVdd4IdiWFtGei2XPTmstFrslS5YArsyAO37b8TuBVlj7uZR4IT8LjciIyL7EA4Ir12J5sudfmV/EUmW3QTw4bMuRxPemhZtvvhnwypZcp7RDrD6Q3EPCthoEvsedj0DrUEop9u3xTvrCb26WMde2OkpqbOk7+1rtRb5Ti8xlDz/8sLNNFtuVdPP2MyHPqMwZ9vgl8mBbdMSLSeRmypQpTpmM8WJ5stP+y/uPtM8vZbnfMgPyLmV7VMmx5F3JtiplVHyQWoIURVEURVEURclR6EeQoiiKoiiKoig5imzvDte5c2cAvvnmG2ebmJLlr236TAlxsxETn21eleA+cQnLru5wderUAdwARzs1pKw4LC5XUgfg2muvBdyVh0M1MYLt+iNyI+k7Fy9e7JRJQLZtqg90YbLdtgLdjlJKmmC718h+Yt62U5bLMWxXi1DCDq4Xlz7ZZrsV2PXA269SFlgnVBE3BHFdsANHxf3CllFxGRHZs12tAt1tbJeuwNXYbZcOcZsRufJbXd1+9jPKHU5ccW2XXHGfEJepH3/80SkTNyS/lMDiQmi7XIhLhjxvtiuHuDF9/PHHgH/aXNlmu/CI+4m00w6Ulv63k3fIuWVcEPebzEbGdHE3s8dvcWuz3Vpk3g1MFw7wySefAO5zasudyG6jRo0ArxvWtGnTANfF3S/JiS37F4Nff/3V89cmMEgckgfkg3uvxS3ML6Bd+t92t5a5WI5vy524sIls+qUxlr92fUmykF5IAh97PAlMsWw/SzJ+yTXZKbJl/LJdXlNKmy9lfqmjA8dCv2Ul/NyG/Z5xcQuTZ9y+37YrXWoROZB09QAdOnQA3IQl8h4Kycdde5wW+bHfF2xZAu8yJjLW+yUTk+dYzmeXBboc2v+X9yZb7mRslmPY7y4ZlWBHLUGKoiiKoiiKouQosqUlyA7wE22eHcQY+DVuB19JPT8tk2wT7YR9HNlPvlyrVavmlGV0qs7MRBavk76QlNfgakWk/+30tBLAJ9otSZMaathaZNGsiPzYWhgJ8LU1GaItEi2Yn5VIjmlrt6
RMgm4l8BJgwYIFgBuILMHY4MpdqFrdbESLaqfaFVJaLDWwTqgjFjGRE9s6IZo72+otmjfREttWb9HcBy7GC66sikbOTqUqqVdlEUc7mYC052Jp5OXZ2r59+znr+CUH8dPcp5XA42dGmteMwE6y8tVXXwGuhc3WGs+ePRtwk7OAm6xCPADs51bkTI4h6ckBhgwZArhyZ1uCatWqBbjjpr04q8zFWXlpCrnu7777LlX7pbc15mIi70l+SYFkTLPHjkALTWotKXZ9GZv8lg0QefNrV6A1w343lOPb75fyfuiXuOFCFpp94YUXAO+7k7xzyDuXnXpcrETSRjvhhFhf/LygxNroZw0TS5xt7ZFjBHpKQfKFZm1rlN1ngrwzyvuTndQpoxZ6V0uQoiiKoiiKoig5imxpCWrSpInz2+8LVDTFfguiin+zaA1sTYJYAeQL1s8vUjSRzZs3d8qykyVI/LIl9sm2jEia2UcffRTw9nnfvn0B1xd53LhxGd7WjMDWaMq1iGbc9nOW1KW21kisQmI58kslKfVtLYzIYKAlElyN9xdffAG48guuFsbeFkrYMQYpWXLkefazPogmSe5bqFuE5F76afBEs277p4tWTzR/9pglZdIn9lgn8uu3SK0814FxQ3aZ3QYlNLHHIIkRkTHMHgdlcVg7Pa/EPolVTFKng2vdEZmSBRgBbrzxRiB5XAy4ViWxUEkqaHBT99qL1ipZD4mblfcHcO9x4LsXuHOezI/ni6cKtNr4LQAt1gz7PDKH+KUlD0yRbc+/YrmQdwC73O898UIslSNHjgTg3XffdbaJ1V9Sntsxg2Jx/PnnnwHXiguuZ4A95ovFStprX2egBch+rwmMrfez9gS+w4C/1U3qyVxup5qX5VnSG7UEKYqiKIqiKIqSo9CPIEVRFEVRFEVRchTZ0h3ONvuJedwOsBTE5GmbRcVEJ8FztjlVygLdSMA184lLlL1qe3ZC3FwkvaHt5lC1alXANXn37NnTKZM0tuLiZLuOhRJ+q1OLXPgFQPsFRYrZ2JY7wW91eqkn57HN1OJK4GfGF9OyX0rgUMA2nUtSCHHrsp8921UrEJFTqWO72IUi4kopMuQXvGq7yonLZqALG7iyIrLjJ4+yn52aVuRK7oF9TGmDusOFPvYYdMMNNwAwffp0wOt206NHD8Cb6lbcc0Ru7KQJ9evXB1yXuqefftopE/eXLVu2AO6cAm5yjjFjxgBelypxR0opRbJy8RG37WDHYXtsAf8U/34B8zKW2TIs722yzIBdJmOZjJd+wf3y104vLi6g9nuQzPny13YdXb58OQB33HGHz9WmjLiN2suS3HTTTQD06tULgJYtWzpl4roq7bXfT+R5scd1uSfST3a7A5Ml2PdB3i/EndHvftjvLILM737JLsSFT5KhAPzvf/9LVi89UEuQoiiKoiiKoig5imxpCbLTEsvXrf21KV+xogm1tQ2iRU1pQS3BTsUrv+XL2i8NY6ghWjjbklC8eHEABg8eDHi1KZI0Qfrg1Vdfdcqknmjk7TSPohWRRfSyMn6JB0TrZAfqCi+//LLzOzDY1y9FtsiNn2wFng9g3bp1nvNJQDu42psLWaDtYmL3gQQ+y/Nsl4lGSfrTr+/kb6gnRpCAUXmebMufLCDrZ9EJXHAXkmtS/bTosp/9nItWz9YiChIQr5ag0Me2KNrLHYCbDAHgmmuuAbyByzJOivb5ww8/dMrq1asHuDIiVh9wvThEW2+zYsUKwE2fO2jQIKdMLE225chOuKBkDfwWHBU5k7krMLmBjT1GpVRP5lF7vpb52W8h+1GjRgH+KaAD3xdt/JIOSRtlnhd5BZgxYwYAb7/99jnbnhrkOH7Hk2dJnlWxwII7V9rPi/S/vO8tXbrUKZM07d9//z3gTVgi/SNWMTt1vVh2/ZacEQuV/X4i1mQZb26//XanzLYmpydqCVIURVEURVEUJUehH0GKoiiKoiiKouQosqU7nL1eiJjqbNejwJ
j6WSz1k5/8ZFPGongsUv2zn/2sKcOD5KIqt9566xzPq5ep69+9EohB4B365je/2ZRR1yT+n3/++U3Z97//fUnSe97zHkmlDiXppz/9qaRSvw5jwHhKQEfLPGDl9XGG8cWjAh5++GFJ0qWXXiqptCNJuvbaayWVsdSFPJin8fr4OdSRG35MftuPVcv6JoNHFDHAOBf1pfQEFboWS/aIE/oZz7y+hAzL1SC648dBtMfnX56hEF4Zj+VT0hOUJEmSJEmSJMlA0XeeoC222EJS+40SCxGLevpic8SyuzWct8vIUsTbL2Vura+lAP2YWKKivCQsfG5p9fJeBclxp5Yodosgljnqxa1ytQfJQQ61l4liYLGWY/nw+FWsxlFeSySfXeOWFm+DfmyptFfuS+SB9Ph9cpW4V73iCfK2hnSpn/e5554rqUhmetxxlAMESHuy6OJpp53WlCFzTJxylEs4kURWSTwIUY4d9RVZ6aNj0nZoj/49xkjyLr785S83ZeRs4bHz9kmb3njjjZtt/eoJqus/8vwzpm+55ZYjOjZjhkcHHH744UP2q/MAx9NS7TmOtac6ynn1tkYfjqzueAvxaEeLw0aLrOLFraM1pOKpQka7/m4yGNT9w9sp7WYivKq9xnC8YNGzL/L0eGql0o95BvbnaZ4BIxl9+rMv0s1zwE9+8pMRXc+8kJ6gJEmSJEmSJEkGinwJSpIkSZIkSZJkoOi7cDhCjtwVjoueUCAPayFx2MNtarefuwbrECV3qdcS0O5qxbWH+89XI+YcCB/xbbgXXUK0V6AefbXgug48BIykX0J0kIqVpKeeekpSHKLgidW9Cm3Lw1eoA9zB3u6ogyg0iZBAb4e1i95d0ezPNndF10SuZU+ed2GAXmLZZZdtPuNWJ6laki677DJJ0u677y6pLRPeJZ9dJ0V7+B8hdfy2r0DfC+FwEauttpqkkkzq95NQgy4RhEh6PQrPZExlHPMkdMr4nrc57l29wng/U4cNzgvbbLONJGnbbbeVJF1wwQVNGUnEjrdzqUhyS2Mvc+/jGefBEgleF5x31O9onz5PzJo1S1IZU32uZP7kWB4+g5w41+3Xj6y4h8MNWpI78wN17n2wS5RnMoUN1vfcr7urrB4Lu0QBJgPDuaa6zUglhNzTGqg7xghfToF5g7L55puvKWObt1Oemz772c8O4ypGh/QEJUmSJEmSJEkyUPSNJwhr7fLLLy+pbb3gTbSW6pPKG6vLF2PJ5Bju0amtopH1D+t+ZE3h99y6xaJQ7gki0X7XXXeVJO2///7hdU8k0QKTdZK+c9BBB0kq0qcuJw2RGECvShQ7tKlIRANrR+Qx83ZQtyUXyqjlrL2e6+Rol8jmHGhbLgXPooSemM4xuu7jRBAtWOpWXTwg4OffleBaly2wwALN5xkzZkiSPvzhD0tqL77Yq3RJldIu3HpOe2KMc+se36UdenukzUWy47UX2D2TXVL4/Qp10GUdHq7c7rrrriupeGdPPvnkzt9mSYgDDjhAkjRt2rSmjG1jBWIDUhnb6KezZ89uymgr3g7w0tQRElKZ+xDEecMb3tCUERWAcIQLoFAXP/jBDySVJQqkIrYwKFA/7gnGe33cccdJKovRSmVxZZhM3p/hUnuno4V4fTHpQSB6ho0WZUdYaKuttpIUe3sQM/BnEMo4pkf9RIvW0o8feOCBub+oEdJbT0JJkiRJ8v/ZO88wS6qqbT++5pwIikhOwzDEISMiIDmJIEERUAQRBdFXEQEVUJJcJiSoqIRXkMwQHWAYgRlyZsg5I+ac9fvxXXfVU7vX1HT3nO4+p8+6/8yZ2nWqq3atvavOCs9OkiRJkhEmfwQlSZIkSZIkSdJX9EzOAilThMIp0JTqUBupLl5gSvqVh41JCYlSGKJ0JGAbaSOe8kG6FCFpD6sSOmQVbUm64447JHW3KAD9ONjUFlZIZyXgr3/961VbWyEe9ygqxOsWokJv0oFId/Q0zEi8gHAz67lEqU
ngaYPYOufgaUsUGnLMLbbYYsDfddvnu54e2g1Q2Oz4WgEIIkC0NlAkjFCmX84333zV59tuu63R1iY40S0wBzEmPcWyFN6QBq6t4ilE2CPH8DFKX/D3vE9pI+3B7Yu5Lkpv7HXa5rC29cBIJZGkzTbbTJI0ffp0SdLMmTOrttVXX12S9KUvfanaRgoYa+lcffXVVds111wzjKuYM9iPr43Gc42Uy2iMRevmMSfyf6m2F2zL5yL2I1XJ026wXeZbT5WLzhkxnvEEQkoIvFCoLkmnn366JGm99daTJN1zzz1z/feYl1dcccVq24UXXjjXxx0t3CbL9wtP0cRO+y0dbrBsu+22kur3B3+ukprJO62n0ZGazXuKC5RFIk2M2W222UaSdN5553XmAlrISFCSJEmSJEmSJH1Fz0SCnn76aUnSkUceOdt9PvOZz0iSjj766GobhaRefF+KH0SF/3gNvICQNrzQkZeB1ee9gNU9Vr1EFKWi7/hFz31xXAAC6OuoIJMoSDdHgvBQep/ssssukuqIhd/nCRMmNNqkgdELj/6U3mP38ONNYR8v6CxlZvEUSnXxsnsL8TzzvaiYeSyIIkF+ndgb9hMJbEQ2VkZ7l1xyyQH7Ax6tbubFF1+U1JQcBryYLpzBnBUJYhBNwiPnkcxSGMG9e+z37LPPDjgHopaLLrro4C9qnFHOcR/96Eerz/QZogAnn3xy1bbddttJku69995q2xVXXCEplpgdqfmSyJXbGB5a5NB93iAi6BHI8hnpx+LaGdM+1xHlIdLhGR9kJpDxQeaB/z2PQPZ6JIgicR+zjKspU6ZIqucDqe4P5IV9PCO89IMf/EDSnCMe/O2VVlpJUjPC5tLp3Y7PW+W49OwOt11pfMpiR0TPylKWX6rfZ9jHI7ss78I79sILL1y18WwhO8izisjK8L7nHeSwww6TlJGgJEmSJEmSJEmSjtMzkaAy379N4tEXOsRj5QtGRl7k8u9E0rBtssJlhMT/XhuRdG23EHkE6B9+vbs3DiIvShR1a/s73QL3Ew+s5xGXOa3nnHNO9Xm33XaT1Iz2/OY3v5FUR23ctogiRXUpfI9z8Rz6+++/X1Id4Tj88MMHfO+QQw6ptv3sZz+TVHtc2sbCaFLWRElNTyYe6HJRWaldrpjP5DJ7jUEpq9ttsuHgkR36BM/3888/X7URlV188cWrbXjD2d+jkXzGwx7dg2jxXmwHCWX3DDM+ei0SVM77bfs40f7lNl80mjGJLLsvks1c6nNMGfFz+fORqidlrvP5jXPCA+x1TlGmA1525izvO6IKPPPok+j71DpKtaeZ+hSvb+NYSy+9dLXtpptumu01jiXREh4elWBeIjJ93HHHVW2M8S984QuSpIcffrhqoz/wsPu4xDuP3fmyCUTm/H5zjkTj3dY8GtmtRM/Rcn73944yEtSPlLV9ZDVJdYQVe/BsH2yQ54G/+/I9nmEeUeQ54vMd79E8p1lcWpLOOuusYV5ZO9351E+SJEmSJEmSJBkh8kdQkiRJkiRJkiR9Rc+kw5FiMJiCNS+ojqSAy1WsIwnotnQ72jyEWp7XYNPbujENDnzlXyjTpwabksH3opSSSH66WyBkTiqGF5RiW1E6B+lbURvHYDVlqbZB/l4kKUuomHQ6qU4XYdX1K6+8smojVc5TRErpYk+1Gkt50Ghcu11wzVGBvxdWS00bK4UBXPygW1IB50R0zqSueToVKQqebsN1Y1/RvEbfRwIxURv9S9G6i85wfm6/vcBgnitt+3iaIX1OWuJSSy1VtfHMIR2JMSrVfebPBGybY7o8/khBOpafB89U5ixSqKR6DnIb4VqwtyjVlHSYJ554otpGGgzpTD4HkArIMT19kxQc5kGpTiX0FMLRwt8p+Iz4i4sZ0C9eTH7CCSdIqtPOEMyQarEbxCF8bihTwK6//vrqMyUCPJc8LYltXkZAiiNj2wUqum0pgSg1OhJ0KW0wmtOwKR9ng0mVHQ+U76Jbb7
119Zm0VvrThb94T2Tu9z7n/Se6H+Bjhf04l3322adqy3S4JEmSJEmSJEmSDtAzkaCh4L/iI+9am0egjEq457QtOsQx3KtV0lbM3Y3MmDFjwLaygNALV0t8cVg8HUF7pQAAIABJREFUVm192I0QKeE63auLnKtHXwCviHv98DLhNfa+KCNBXvRLJAhvbCSRjYdwzTXXrNrwNrunnvOnaLhbFk2NxoP3D95H7kMkfsC/HuHBk8649ghFKeUeLcDaDUyaNKn6XEahfbHNSIyAfo0isGzDBgYbEad/iQL7nID32m20W5nTwpqlBzjqQ4g8nN/73vcGbCv7zvuXY3hkE7slqvHII4/M9hw6BQXOPiaRyEXqdurUqVUb0RsfP9x/zj+KxjCm/XnNtWOLUSRo0003ldQUBbj99tslNYv7ub/+HOoEREV8LikXMSZSI9V9xpy+wgorVG3rrLOOJGm11Vartp177rmSakGE5ZZbrmrbeOONJdXX6Qvmsi0SW+C5wJyHBLlUR/Vc4AJRBf71rIVuzl6BSIiJ52/0DKGv6MPRiLh2A1EEm2iqL5D76KOPNtquuuqqqg1xHsaDZ6pgK1Ekm6i4nwNjg3nOs1hc1KiTZCQoSZIkSZIkSZK+YlxGgtyDhWepLRIUeUCjHHo+R14qPPK94CUZLJHXsfSWRzUv4DVFyHbifSnrOLoVPGB4Nj3nnGsg2uMRCLZF9oD9uLcc7x125LaFvRKVimq1OFbkrb7vvvuqz+T2cl6lzPdY4TUG4FHGMtoTUY5PJ/LUl1GObq0Rcs8x148NeN7/lltuKam56C1zYTQPciw8cp73z37YnvcVNspCnhtuuGHVRh92a1RNqr2Lm2yySbWN8co1SbV0OLTVoTnHH3+8pDqC53U/5eKh3k94SaPoG/u1zbdzC9dXytFL0hZbbCFJ2mijjSRJl1566YDveZ9gW1FEkP2jMU0f0E/eP1w7yw/4wtBELDyyjYx3pyNBzNXRtVFX6ZGWEl9W4tprr5VUj10HO/Xsg9I2iAxJ9XVyXkSZpIE1fuuvv37VRoTKnyvYwJNPPimpOad2S/YARFkEjC+3rfK54N/jc7RMQC9k7QyVtoVRWajUozZkk1Dn6PMQkR/qUz1qOHnyZEn1uPBjRrLZd955Z2M/H+MeFeokGQlKkiRJkiRJkqSvyB9BSZIkSZIkSZL0FeMiHa6tgJVwn4fVCbVHq6HzmbQOL/olzByF/8u0k/FKFC6eHd4XZQriSKZ1dBKK/Sgo9WLYsnjSr5eUJE8xwqZIJ4oKibE3T1MjXYE2D+tTLBwVVQMpDeV3pWYR41jiUq8wWJnlMl3B09ra0ujKVMCxlAhvwwUPuDbSGqPCUU95K1P8olRBbMjHNp+jdAns97bbbpPUlPBlf9JRugmknxE08ZQlxulmm21Wbbvhhhsk1WnBbXP7TjvtVH1GPpa0J7dtioijOSAqHuZeMk7bhGjmFuYX/r6PD/pqypQpkppjhdSpSJCoFHyRauEFJJk9xassTPfUTp4hpLnxr1T3q++/0korSWqmn3WCKB2eviMVyO2B8z711FMl1VLZ/j0XPyA9iL7w/mG833vvvZKaz1HGP8f0e/T00083jvXss89WbdinC8WQHorQk8tikz471kTS1cx3bPP5rxTn8GcC8xX31L9Xym73SnpclMoM0TXsvvvukuqUf39voH94T2EMS7VtkN7m9lE+Uz2VkrnNBU7KNEYXiPK0uU6SkaAkSZIkSZIkSfqKcREJKolkR93LyS9VtrmHj8946PwXcxnNiLwFXjg/XvBF1PB0cb1e0PnNb36z8b1owTHoFQEJ7jneNfeIlXLqfu/xykURL77nXl1sqZRt9mPhFXHJ61JIISpId5luPMp8r1u8WkcccUT1mUJyv06IvPFRRLdsiyJB7P+JT3xCknTdddcN9bRHBS80LWXAL7rooqpt++23l9QsJqXYue368bD5vIk9RQW05WLTPpbZb6TkTJ
3omspsgLXXXrtqY0FKFtb0SCNjxKWNd9llF0n1mJw2bVrVdvfdd0uqZWQPOOCAqu3xxx9vfM/tkmgU0QQvRnevZwnzTinW0Ek4J6KMPjfQd9OnT5fUlBQnIuNjs22RVPo/auN7/Ov3GK8yHn2PqGCL7nkeqUW4mUM9ys+9Rvr85ptvrtrwkHO9iAR5m48hbJH+9PcMis/5e9EyCxH0RSl44t/z/uLasEmifVL3iOlEzy76LJJfxu7A32uIDiMT3i3PxblhMMuRrLzyytXnj3zkI5LqyLcvug0IGPn4j0RegOdBuRyDVPexv1Mx9xDZdTsdKbvLSFCSJEmSJEmSJH3FuIgEtdUElYuvSQPrLyLvMt6XSPYT/JiRBO14AdlCqV6ojtzr97znPYM6Runh6xXwPuBhdO9IWRPkuf+RTXEsjuHedTxu7ONeFSJG0aKW9Cf7RJ5i9zCzmGW56OZY4x77T3/605Kkj3/84wP2i/LAS9x7xH6R1xnPJ3Kx7kWeNWvWoM99pPFc6NLb69GyKOqFHTLn+fgr5erdS0fflNK6DjUzt9xyS7VtjTXWkNTMJ8fT5/UanaDNBugnz13Hu1hem1R7vr2WCa87UZADDzywasPjP3HiRElNLzP9iXfZJY6pUyFq7NGlCy64oHHuEZE8fqfA288c5BFk5rYNNthAUjPSwVwV1aKVi4hK9TgjWuh9xzE4F5fOJ2rC337f+9434Bo8muH3vpMwN3hNEmOU++sRfbIBiFy5tDaf3cvdJuVMG2PKn0c8T+h7Pyb90hah8+cR44D9fVxE3+0WmMOiSBBEtTJlPYvPhUQ9ovrIbob5ParLA8/eof6LvnC7K5eA8cwq+rjMopJqu4tqtNhG9Eeq7Zrx5M+ykbK77rXmJEmSJEmSJEmSESB/BCVJkiRJkiRJ0leMi3S4Mi2iTFOSmmE1QnSE3jx1qZSU9RAcx2gLKUerSM/uPHuFK6+8svq8+eabS6r70EOsrNpNwaGH0MuC7l5JGySlIJLOLMPLUQqbbyMdCBvxNvqKcLz/HfaLxAxIteA8IwldDzdzj0jX6BaJbLexxRZbTFJdYB4RjaUoHTYKwwP3g371Qtluwu8p9uH3FEht9JQg5kL6wedGrpvjR0IDpD34vFauru73aa211hpwDFIbOp0OR+qdp3Qi5cy5eSoo18649eJe8OsknY3veSoaacCMPySI/W9uvPHGkqRDDz20auMYUTE6Bf8+b/Js4plTFnd3EkQbotRf7t3MmTMlxWPMhSaY37lOTzXl70TiB9g36XfeVs7FpGNKdZqv9yf3eYsttpAkXXLJJbO/+GFASmT5WWqKJjBWuZekG0n1O4inwGHP9I9fE9uwU58b2iSgy+et/z0++xzJ84FtPm8gHjDWrLvuupKkvffeu9pGH9DXbndcJ3PUHXfcUbVxfaTBeboXdseSAF/96lc7eBVDx+9T+T7lKX6MyygN7uKLL5YUv69ik1HpQpQejZ1FYmJAv/rcFolBMX9jY/5MvvbaawcctxNkJChJkiRJkiRJkr5iXESCSvyXJb9S3ROClyAq2C2LhaNfw1EUA+9NWyTIPTNlVKmbufrqq6vPeBU4b49mrLPOOpKkn/70p5Kasppl0Vybl7+bKAt8PfpX3mv3/uEN8f7BQ8wxvHC19OhEcpD8PbdX7A6bd7lJcG9YKdgwkosvDoWvf/3r1eedd95ZUl0oLg0UOGjzUvk4KxfPi7yqLJDpi7xdc801w72UjuPeM7yZ0X1bcsklJTU9tdhMtBwA2zi+H7OMOnob/Ut/Iakq1fbr3sqhLLA8FN773vdKakqIc3/xCLvkNWOR6I0X3TNGXOae8yZy4R58ZNzxILvNrbnmmpKkn//855KkH//4xwPOHVGAORVb04+cS+ntHwkQV/HIF4t5PvTQQ5LqKJlU24FH5OhrxqQ/M7GtNq8y+/sxGa94lf0Zwj3yCCk2y70aTfwdJFr2IJl7WJCWeU+SPvvZz0
qStt12W0n1HCHVC8SuuuqqkqQzzzyzamPMMg94lJFjjJTk+mAp303Lz7ODBXaPPfbYahvjxBfNZU7i/cKjzmSQRKJOzIHlXCXV45J3QRfZijIvmMsnT54sqSk+0raEwNyQkaAkSZIkSZIkSfqKcREJKnMQ3YvOr1L38paeSfdSlTKz7tEsvc+RfF+bjF+v1gTdd9991WePKpS8//3vl1RHgjxigccCrzO5n90O14tH0r3BXgcgSYsuumj1GS+kL+hHH9DmOfccF++L2x0eKOw6yscnJz2q8YkWS8XjEi2INhbsscce1Wcksr1mg89tUQX6IvIsR5Lg2GJUC9Ot4MGbMmXKgDZs7cEHH6y2ETmIFkmkL7E99yqWsqpRH2E7Lg0fzXFtks9zAx5+j/ZwTkQLvJaBvqDNz5tz9PFAJARPvkdh6Efuh0eQiIK4TZeUksV+Du5lZR5g20hGFaj34V8HGfTbb79dUrMOAw9tFJ0t5y6plk9HXtxtkuuNxit/h7nrsssuq9qQpj7uuOOqbd1a45d0BmrQzjvvvGobEfzrr79eknTTTTdVbSussIKkOlvCF60lCnLVVVdJaj7nqSXDxsaKKLuEejcWyPbnP9fJeftciBy2jz3edaJMlTLC788K9icTxhfr5t2RxaS32mqrqm277baT1HzO85whGj0ateMZCUqSJEmSJEmSpK/IH0FJkiRJkiRJkvQV4yIdrlxF3lMaCNt5URvpEFGKRyml27ZibfT9tpS3KI2pF/A0h3J1eQ+LIlkLnm5Swmrs3U55j704z9PMpDqkLg2UiHXou6jYvBSe8M+E/71gke8R+vYietJGvMCZ/UmriQQYxgJPNeJz1HdRISj9Q3pXNC4joZNIZrob8XPGPvyewlNPPSWpef3IjSOQ4OId3HvSESigler+IoXC05I4Psd0aeRIppzUhk5DSodf0/PPPy+pvrduV4wR0km8rZQlluprIG3m97//fdVGoS9pqJMmTaratt9++8Z5+jOFfuXY/kxhvvTnF+OU8xpMIfTcEo0f0uAgSvv28Vqmsfj+zJOlaIlUX2c0f2L7yGF7OvLHP/7x2V5HLz1rk8GDbZASJklHHXWUJGmjjTaSJC2//PJVG2laX/ziFyVJs2bNqtoWXnhhSfUYdDEojt/2PjOanH/++dVn5lbmBX+m0T/MW/6+Eo1x3h14jvqxGJc8D1yAifcZ2H333avPZ511VqNtl112qT7z3GHOlup5mLHu88hIjeOMBCVJkiRJkiRJ0leMi0gQnj2iPV44FnnW+YUbFVKXiyu65422aJG3aNt45LrrrpNUiyB4sZ17ZKXYkwBIrXY7eLnxjiyxxBJVmy/wKTW9FnjJo6JfjuVtePYR8PDiaOyOokePRrENr77bKx5lv0ccCxEH9/CPJZEXOYrC4nVyeXL6gD50LxXH4Pjed6XkaTRvdAPeN6VHzkHy1QVJyjnLC37x/NEn3qdl4b7PkXhU8eS51Oyee+4pqSmgMlKLKyLv6vebiAzeTxcSwIa4tx7Boi98PCClTeTV7wPy2iweve+++1Ztd999d+M8I8EAzsEjLESGPeLEfpz7aEjaR9G8NqLFwzlvvOduk8hrMx9GC0NHSwWUktp+/5g3ezXbIhk62JTLtRNVYH58+OGHq7b9999fUh0d8gVueVYile1Rcfa/4oorOnsBQ2TDDTeU1BxnjBOu1+ddxg7vYT6v8Mx0cQiOy7H8fYY5lmP6XHj55ZdLkj7/+c/P8Rr8HZFnjB+rfO+OFjnvNBkJSpIkSZIkSZKkr8gfQUmSJEmSJEmS9BU9kw7XVuRYphtEYfK29Xsc9iNEF4UeSWeKNMy7Jb2oE0QpgejvsyJzlJoFnopSrkVCqkm34+FiqRlC9xQPSXr00Uerz6QKeTEloeTIRtiPdB7vc/5O1HdlP3qxNyk7ZeGi1Bwj3UBbypAknXLKKZLqcL+H9kkRol833njjqm369OmS6hC/ry6OkEDbOX
QDnqbG/BeJDXDdPu6wHeYztz0+Y1f+d8r1Wjz9jvU2WH/C092wv9FIYyD1xQujKYjeeeedJUkbbLBB1UbKC88EHzvRdZZ25eN9xx13lFSv11QWADtRcT9zJX0p1emc/szhu6Tn7LXXXlXbSK2gPhj8HKM1fco5y8cy6/fwTPDvlXbnY7JMn4tsLNPh+gfuvz/zmMOwA3/OnXbaaZKaawcBIhukYyIq5G1jbU9larMkLbjggpLqd1JPl+X5z/PAz59x6c+R8nngQk+MVdKGWctPGlqaoI9P0uGi1H/erf2ZNFKlJhkJSpIkSZIkSZKkr+iZSFD5K9x/FZa/EL2YsvQMSAMldd1Tx+dSytQ/R4Xb4NGAXieKBJ1++umSpMMOO2y236MPXcoZj0ybxG83gt2V8rpSHY1gtfitt966asPr7N5KPC0c0+WsKV5kH/e0YsN4vIjCeRs26d7hY445pnGeUu1RdhneXgCvFOe9wAILVG3lKtPzzjtv1Yb0KV4nL3jFJsfSoz4Y3IawC5d+hZVXXnnAtnI1b4/WEn3E5lxCFU8+URO3oTJitsoqq1SfI+/eTjvtJKkpoNBJ/NkwderUxr8OfUYEy23one98p6Rm1BQpZrygPp/tt99+kqRTTz11jucXRRjvueceSdK3vvWtahuS3953fJe/3S226tfEnOjbOM9Ijpy5Cplx/x6ROGzesy0Gs0RF0j8wb7nADduIfvi73UEHHSQpFt0ooyX+nOf42OtYMW3aNElNsQci35tttpkkaZFFFqnaJk6cKKkeSx714ZqibCaEJjxCSwbQ5z73OUlNeXpgfLZlVEyYMKH6TNTK3y95TvF+uPjii1dtLurTSTISlCRJkiRJkiRJX9EzkaCStvxM8iSl+te7563jucfr5L8w8S7x69k9AnhH8dS5J4Hz8Rz0kl7LV468BHgk6U/3pgKLBS633HKzPbbX1nQz3OuoBgNvyLe//W1J0nbbbVe1Ua/gObpl1MYlxIk0RZHLaGFaKL1aXsOEnLfXHRxwwAGS6rz8Uta8W8FTj4fI6zPKCBkSzlJtr9ECrOVCse5pHo1FKQfLtddeW33mvg12/sArR/2L18E88sgjHTm/2267rfp82WWXSWqOl26Rw0e6upSwHmkiW2Lb8ccfP6rn0imefPLJ6jMRSH9W8pnr9Kg3447ntEcgyc7gmRwtO4DXPpIL74XnatIZZsyYIUlaa621qm3MOzwzPapKBBi7i+p7y5pwqc4YKJdUGCu8lvXkk09u/OuwUDaRel/QmfcEfwYyni699FJJ9Vwutc+ZvKtEy8SU45EaSqm+R/5Mot85BhFzaeTeGTMSlCRJkiRJkiRJX5E/gpIkSZIkSZIk6Ste8t8ujB9HBY/lNj/tsiBr8uTJVdsmm2wiqRlyJx0OPCRI+D2SlCUcT9Gny6lyDhQQn3jiia3XMJhuH86tGa1iUUKlXhR95513Sqqlej01kMI9+nqfffYZ0fMbzb4j7Oyymtibp5uV0thRishgziuSlCR9xFMDHnzwQUnSSSedNIirqBkLu4skd50HHnhAUl206QXi9PuFF14oqSlOsvnmm0uq03P8PLFPCtI9ncf7eCgMte8G02+eQoANzZo1S1JdqDq7Y/K5bYmBqI37wTZvY/8ozQub83SpH//4x5LahRG6ea7rdkaj79rsZ6GFFpLUFFthPiOVyMcWz02ew0j5SnUaOvbnggqkwZHG2Yn0pLS74dMtfUf6llTPj88++6ykpv2UoiKRxHokBrXoootKqoUVmHvnhm7pu16k0z9ZMhKUJEmSJEmSJElf0ZWRoCRJkiRJkiRJkpEiI0FJkiRJkiRJkvQV+SMoSZIkSZIkSZK+In8EJUmSJEmSJEnSV+SPoCRJkiRJkiRJ+or8EZQkSZIkSZIkSV+RP4KSJEmSJEmSJOkr8kdQkiRJkiRJkiR9Rf4ISpIkSZIkSZKkr8gfQUmSJEmSJEmS9BX5IyhJkiRJkiRJkr
4ifwQlSZIkSZIkSdJX5I+gJEmSJEmSJEn6ipeN9QlEvOQlL+n4Mbfffvvq8zbbbCNJevTRRyVJd911V9X27LPPSpJe+9rXSpLmnXfeqm2FFVaQJL3uda+TJH3zm9+s2h577LGOn/N///vfIX9nJPrOefvb3y5Jesc73iFJ+u1vf1u1vfSlL5Uk/e1vfxvwvX/961+SpFe96lUD9nnuuec6fp6j2Xf/8z//35fwn//8Z1D7f/CDH5Qk/epXv6q2TZ06tbHPAgssUH3G7i6//PI5HvtlL6uHNH0+VLrR7mDy5MmSpA9/+MPVtle84hWSpMsuu0yS9JrXvKZq23DDDSVJDz74oCTpzDPPrNqeeeaZjp/fUPtutPptmWWWkdS0iTe96U2NfbyNeY9+e+qpp4b09/y6BtMn3Wxz3U6n+m4w89j8888vSVp99dWrbRMnTpTUHHeHHHLIkM8p4phjjqk+//rXv5YkTZ8+XVJtm5L0+9//fljHT7sbPtl3wyf7bvgMp+/aeMl/O33EDjC3N/v888+vPr/vfe8b0nf/9Kc/SZJe/epXS6pf7OfE448/Lkk66qijJEnf//73h/R3I7pxoDzwwAOSpHnmmUeS9Lvf/a5qW3TRRRv7+sP03//+t6T6B8/LX/7yqu2d73xnx89zrPuOF4rNN9+82rbmmmtKqn8IrrrqqlUbPy758YP9SdJpp50mqe77hx56qGqbMmWKpOH/4IkY676LWGyxxSRJ3/jGNyQ17Q6HxVve8hZJTbv7y1/+Ikn65z//KUlaeumlq7bll19eUmcn1dH+ERT96OWHoiSdc845kuo+cjvhb7PN5zrs749//KMkadddd63aeAltO4eh0o021yt0uu+4n34v11lnHUnStttuK6np9Lv55pslST/84Q+rbdjStGnTJDXH689//nNJ0t///ndJ0g477FC1Mf+tuOKKkurnhiTtuOOOkqS11lpLUj1nSrVzY9asWbO9roi0u+EznL7juTjSr52RDS+33HKSpC222EJS/a7WCSKnD9uia027Gz6dtp1Mh0uSJEmSJEmSpK/IH0FJkiRJkiRJkvQV4yod7r777pMkTZgwodr24osvSqpD71KdLsO2N77xjVXbm9/8Zkl1GJ50EKnOSaYGwVOWSDfhWF/96lertuHmR3djyJQ6FtKM/vrXv1ZthLrpO09lKEPDiyyySNVG33WSsei797///dVnUt1Ir5TqOihC9PShVNsUfeg1U294wxsk1bblaUv/+Mc/JEm/+c1vJElnnHFG1eY1R0OhG+3ua1/7miRp5ZVXllTX7km1/dDXnp5Fv2KLb33rW6u273znO5IG1mPNDSOdDheleZTcdNNN1ee3ve1tkup0QLcdbI15kH2kOl2Vv0MtiCR95jOfkSSdeuqpwzq/iG60uV5hpGqCvMbngAMOkFQ/Yx1S3XzOopaHOev1r3991XbKKadIkn7wgx9Ikn70ox9VbaRU89z94he/WLUxx3EsPz/mhSOPPLLaxvzaVuuUdjd8urHv2lLQqOFmvnvkkUeqNp4LRx999JCOOdRzgG7suzaGO68PF9LfqfOV6vrep59+uqN/KyNBSZIkSZIkSZL0FeMiEkSB89VXXy2pGfXB8+mXyfHxDOFNl+pfuuzvnlMK2tmGh8mPQZsfk6jHYNXDoBu9BXjXiIr5deJ1xyPokSD6ju8vtdRSVRviAWPpkZeG75XHM45qoFRHJTzag73wr++P3WEj9JfjxwK8oezvUZCDDz5YUvMeDYZutDvEIRDRePjhh6s2+qyMXkj1XLDQQgtJkiZNmlS1HXrooY1jd4JORoKYS3wclRC5lmoxFhdGKKOPiEdI9XyJt36NNdao2vDE833GtlT3M5GgL3/5y7M9P2l8eka7iZHqu1122aX6jBDOn//8Z0lxZoWrtSFAsu6660pq2h02yz633npr1Ub0mmimCx0QjeR6PQuBCJKrlZ511llzvMa0u+EzN303VPXIoR
6fYxLBlKSZM2dKkmbMmDHge1dccYUk6YgjjpBUi3dI9XznkfK5pRfsLrpHvLchXCJJr3zlKyXVzxHPkKLP+Nefzby7+DsL45c+dxXK8847T5J00UUXDf+iAjISlCRJkiRJkiRJX9GV6wQNFeQxyRV2DxG/KD0Kw2c8rO51L/OH/Rd7mQ/pv5T5NYvX3eU7r7rqKknS+uuvP8Qr6w6QlpTqX/l4BF3qmr6iL7zv6OvIIzASkaDRYMstt5RU11388pe/rNrw4rttYRtcu9cL4VmN7JXvYZvulac/afN1rfbdd19J0rHHHjucy+sqllxySUn1ejXuWS49w953RI64H08++WTV5nVp3UjkLeScqZVYb731qjY8ci5H7LYiNe2RdYKYq7w27w9/+IOkeo7zeg/GMrLZvgwB0aVPf/rTrdeRdC/MJe9973urbXjPsZEoEuRrm7FkBBFbbFOqnw98z+tusTNqNOabb76qDbtjLPscgAfZI6NJ99KJOYG5ySMJPA+JXPr7CTYcRaZ5h/zABz4gqRkJGmoGT3l8zxga6rFGmrboUnSPPvWpT0lqPncYl4zx6N2Fce1t7O9zA8di3S9fy86fKZ0kI0FJkiRJkiRJkvQV+SMoSZIkSZIkSZK+Ylykw02cOFFSnepCepJUp3V4aK8M83nhMZ+jsCVhzSicymfCsB7i79U0ONhuu+0GbKN/PJxK35Hm5ak3ZejaU3ZctriX2GqrrSTV1+sFgdBWTOl9R0g4EjgocbEOoMjQ7XbhhRee7TF6DQoy999/f0nSRz7ykaoNOXLke12OF+nc448/XlIzrYui7W4lmoMoDkU23VMwsZ0oRZX9SWOVpFVWWUVSnQ53yy23VG2IdtCXnv5E2iFzq58n6STf/e53q20uRZt0P55SC2UKmqeck97rNsIzGFv0ccdn2nz+5+9wfE+bLtOBfY5kf0+HQ2b3sccea7napFuInmtt6WPMbVFKF3LKkSgQduPP5gsuuECStPPOO0tq2iQiUNF7XyT0gJ2yX5uwzVgTpby1CfLwHPH3Nz4zLv29jz7mHclnLj1pAAAgAElEQVRTA+kzT8Mu++yFF14Y8Hc6TUaCkiRJkiRJkiTpK8ZFJAjwfD/00EPVNgqqKbSS6iiN/yqFNs8DbXzPPVF4yFic1RcX7HWWWWaZ6jO/0EuZcalZ4CY1+4f9Sy+JFMtB9wLcc+wJL4lUL+znXvlS4MAp+8A9oGUhsXvM8NjjvfX7wTEj6e5ewBdDpK+JKhCFkJpeaanp4aPPiRJ7X/RKEbVHrOiTZ555RlJT+CCSpieSgz26fVB0Sn95lKj0yPkx6VOO5V5+CtR94eBoAcKke/nkJz8pqfnMxDPOHO9Rb+Y/n5cYk9iUt5VLTLhtcazyWet/E5v3cc8x3VuMvG5GgrqPKHozmIiJ2wOfo2jPxhtvLEm6+eabB7Rhw/6cYPkJ3t98sfu9995bUhw1YZu3dZv4QSfhXdDHWdkHfh+5R8hn+/2Lor3YBWP9pJNO6uwFBGQkKEmSJEmSJEmSvmJcRILKXM1oIc5LLrmk2oa3spSPlQZKZEfwC9Zrj/AWjMYv19Emqn2KPCB4lolAeNSB/iSq4Z7BoS7m2S0QnUAO1nNbozonalYi6A+8Iu4pI5pEP3k0g0gn/eo5tNwHt9Neqs/w/qTegP5B+lqK840BrxSeK7c1l9jtZt7znvcM2IZ9uNeNsei1GXjWsSGf88pFLj3yBixeGeW8R3Mk21ZbbbX2i0q6FhZZ5pkmSSussIIk6fnnn5fU9KIzz7jdlfViHhGHclkAqbZBju/2zbHKJS6kOhLuUUmvDUy6gygCxLxFbbfbUfm8mlOUZYMNNmj8nRNPPHHAPlE2BM8F5kLqJeeGthqibiY6R57FjM/nnnuuaisjtD4uGcdRG+88njHE84qsLq95HSkyEpQkSZIkSZIkSV+RP4KSJEmSJEmSJOkrxkU6XCmV6OHHSy+9dMD+hPQIgXpqVl
txXpt04K233tr4v4f422SSewFPISr72PuuTDnyPiCMTTqTCwHcf//9I3HaI8IiiyxSfeZaSMGYNGlS1UZKpq88TXEg4Xi3o/JYnrZEISchaQ8f06+kIiIA4OeAbLvUW+lwSNxK9XVhY57i94tf/EJSLLpBwevSSy8tSbrzzjtH8IxHBmS+pYHpBS5mwFj0lBH6AvvylEHaIqlrxnAkAcv3OJdIknvxxRcf0jUmYw9pl6SkeGrtlltuKalOhytFcCRpwQUXrD6Tssv48+JnPmOvnt6LfbKPf69cviJKxfY0HY7F+Ln99tvD6x4vvOMd72j8/9lnn+3YsTv1PhOJYfA+9uEPf7jxf6kWdOG56EuP8Bz09DnSpNl2xBFHVG2IJERCTOyPiMaECROqtp/85CeSavtZYoklqja3z/J6mEP322+/qq1XU//LJTv8PjAOeT+J3vvY38cs6ar+3ClT5EbjfSUjQUmSJEmSJEmS9BXjIhIEkYQhuJzmYMQPIkp56DbGk0xi5PWLFvPEK430sHtc8OhEkSD35HQ7vgApXk7utXs0kTCm0FKq+zGSHcYDQpGh23LpxffCTjxk9Kd7poia9GqBsHvc8PAusMACkppy5LRFQif0AYWu1157bdXWK145l6hnHOF187GDh9bHK9fIfm6j2C/jNhI/iKLlfObvudQ4Y8HllZPeADEL5hS/h9gbthUtluoeYOYjbMQjkNhItOgpx2gTOcGr7NHgUlhGqu2agvteiwS1vaewGOiee+5ZbStlwrluSdptt90kSQ888ICk2FsfZbhMnjxZ0sBMl+ESvTsdd9xxkqQnnnhCUjMStOKKK0qqr8nnNs7Xo+FEL4kMfuhDH6rauAaimf4OwpImZAp4BGKnnXaSJG2xxRaSaiEkPx8/B45BNMmL+z3C1ItgY36PuD76wJeHwYbZ3+3O+wyYZ8j88MyWkSIjQUmSJEmSJEmS9BX5IyhJkiRJkiRJkr5iXKXDRcII4Gkjc6vVHqWCjaf0txIPfXKdUR+SykVxISt2S3VqBfemV/uLMLs0MN3szDPPrNroC1/PhrStqA8IG2NbkWgC6SNvf/vbq7Z77rlHUp0i4qlJ4ClQvYSv90UKg9sUYIvRugOE3Okzb4vWcupGENSQahsiDcOvgdXOPVWJ68Xm/PqffvppSXW/eXoIaUyl6IxU2y0266ly2KGv8cR6TKTydSNtzw7wwvP9999fUj32n3rqqarttNNOk1T3b6/APMPaUPPNN1/VxvxNCpqvGB+lbWGDUYoVNhiJH5B+GR2zTCN2W47muPKcu5HBrLnlfPvb35Yk7brrrpKku+++u2q78cYbJdUp2973Lo4jzVncADGJT33qU5Kkhx56qGr72te+1vrdNrjXjB9J2nfffSXVwkIHHnhg1cacwfPU19pbbrnlJNXCOFI9/zDfuzARx+JdMErpZC7jHUaqU7J41vp8jG3NmjWr2sZc8pWvfEVSM7Xuc5/7nLqJSPAmgnHJ84f7IdXvHDxr/V2bccnf8bHO+4ynyJGKy7jwVH6fczpJRoKSJEmSJEmSJOkrxlUkqO2XLEXAUu3xbNs/KswsV6xGWta3jUfcqxsVugK/2lmleY011qjaysiR/z9aTbxb8eJ77jmRsl//+tdVG543vza8IG2RyMhzyv7YsJ8DMqhsY2V3/3uRYEAvQARBqq8TiXJfGb704vv10p/lqtb+vW4FYQg/Z+4p3jYvFMbWfC4q5yX3ACMu4X0JZQTIve6lLKzbLB5DCpoladlll5UkzZgxY8DfGU24lkhKvW3+Rh76hz/8YbUNbzXF3BRNS7W3mvu2yy67VG1Tp06VFAtHLLTQQpLqQmyp9rIiCOLRqIMOOmi25zwc8Hgz7pCVl6RHH31UUu35drvDox5FICNpdrztbPO2UqzE23huY69+z3j2uBeaKKnLZncbbXaHKMEZZ5xRbaNfEdzx5SXoH7zvl1xySdVGRGeHHXaQJH3ve9+r2o
hiYOdSLSjA/m77ncCjzjzX1ltvPUm1lLUk3XvvvZKkddddV1LT7s466yxJzeyARRddtPHvJz7xiaoNCXcK99226FeeE0suuWTVxnhmfCIlL0lf/OIXJTUjFkS0GKsejSLK2mswJ/Au6H3HOIuWaCD6VorpSPVzw58niD4x97jIQkaCkiRJkiRJkiRJOsC4igS1EUUbIk9Um3e4jH64d9R/sc7pOL2Ge0zLxRMjyUoWFzv11FOrNjw//Jr3/vGao27HvRx4JPG8ec0DOcVbbbVVtQ2PEl6jaJFetrlHtFxA0L1oSHHjofG+xAPaS/3r4L10LrroIknSb3/722pbueii9x33C6+c55R3+4KenHMkL8z48XuLfXi+Np5x9o8WRC0j3NH+kXw2bW7HtLlHvlzEcaxgHmtbEHvzzTevPlPTc9RRR0mKpcph2rRp1WfqHd797ndLkn76059WbVdeeaUkac0115TU9CBzTK8vYB6hruCmm26q2qgB6RTnnnuupLpuAVl5qa49ie4vES8fk9hgJIMbZRGUbVGtTBldcpgHvfaFSOoXvvCF2f69TlE+F/2Z2ZZxwoLQRx55ZLWNqB8edo9KHHvssZLqMefvNUQ6rr76aknNOh5qiYhg8n+p7jN/hyn7zGtNN9poo9lez2ChxlOqZaMffvhhSc35i8+ML5f95rx9PNPX7O/nyvHLZQakOirucxmw/x577CFJ+v73vz9gH6874x0H+fKZM2dWbd32XhjZZjQ+l19+eUn1+54vlkoEmP6MsjQiCfvIhoFtXv/sy410kowEJUmSJEmSJEnSV+SPoCRJkiRJkiRJ+opxlQ7XJnM677zzVp9Z4XaooUn2p7jLQ6cU/1L4Op6ICl5JU3BpUtKvonSTUg7R+95TK7odT0WjuJmwPMXDUh1S9tQsrr3sQ6dcYVmq7YzUkigFilQUT9Hk+J6y2OuQKuIiFKRMRClf9Cehek+XGalCy06BHLuPlVKe30UTmPei9ALGWJSmUwoGONiX2zF/E7v0eZC50fdHypZC5k7TNo9HqbykqE6ZMqVq23333SU1021IVcLWIonvqO+Y/6655hpJ0iabbFK1kWbE/l7Y7gXXYwmF5scdd1y1bZtttpFUpw3dcsstVVtUEA3MPd5Wphy1iQNEy1FwTD/OOuusI0m6/fbbq22HH364pDh9rhNE8sJtqW9XXHGFpKbEMumQPkdzvrfddpuk+t1CqovzDz74YEnNZyf2SarlBz7wgaqN1N9DDjlkEFc2EAQKpKZ0+nBxsYATTjhBkvSd73xHUlPEioJ8xouLXERp5czppJR6CitphcjZu7AV7zjYt8PfYTy7fDbiFS+88EK1jTRGjsm9kpqpsd1KKU4i1amxvPP4exBEts/YjoR8SoEU34YQEoIYUtNmOklGgpIkSZIkSZIk6SvGfSTIF7eEcmEsj1zwq7TNu4h3yuX+1l9/fUnSN7/5zcbfmN159RKRZx1vsxcX8usd3NPeVjgaeR66lUgqmG0U50p1oaR7lkqPqXvxyja3LYpSIy87XvnIiz8YW+41sJtIzCQqpsZO8d71kkgEC+0iG+pgC9E84/NZGWVtiwQ55QKqbo98jpYaoO9dAroTnuM2Bjuv4tG98MILJTWvG2/7Zz/72Wob7YgS7L333lUbywAMRmzhrrvuqj6vvfbakmo77JboT4QLkzDHIRLhssTTp0+XFM9BfM/nLLIH6DvPNOAYeIzd7rAtnjkeMdhtt90kNQu2I+n3ThLZHWPChQSQmyZC4OfN+XoWAZGizTbbbMDxb7jhhsb/TzrppOozUSKeQy5KscEGG0iS9tlnH0nNyA5zI1F2qRYRILLm0WWPhAwXpOWlWqxhrbXWktQUTcBueI9zkRWiL96f5fzu58r4p5/8WETRGMduyxz/zjvvlNRchgLbjZZeIOvg/PPPr9quuuoqdStt76ksnkv/RO8gUdYA45d9fJ7kueZ9zX6cA0tijCQZCUqSJE
mSJEmSpK8YV5GgCHI23ZvKL1a8G5EsIri3sJSEjSJB4xH3qJXePv+/12lItayqVN+HyPvcSzUrfr3lon1uY2zz/fFSRYtaljLY7o3h77i9lbz44ouN7/v+vbQY7ZzgOv2a2iJd5WKf7qHtdpCs9WhOKSHsdkL+vNthGQ10jyV2FUUM8dhFi6ViV9hzVIfmduiSt2MJNQDRgoV435HBdbgmXxD12muvldT0qA+GaKFPiPqzrKH0yLvXIYw03E+i+y4vzzX5mGx7TpR1Pm0Lqbodld9zWWLqfEeTk08+ufrMWCXy72OJMYfkunvKiXB4FIbvstTE0UcfXbVRQ8Yx/O+w2C3PU7dz5LIjiXMiG35ePMvpV8+ocRn5oYKN+LOMyAz33ucL7jlRFb/PfI8Im1TbC/V83j9EbZgzvQ+4dqJnfr1EI6i79XoYvud9TaST+77qqqtWbZG89lgQRX3KDCevp1p44YUlxc9fjkFU06M95budz1m0eY0c/ckxvOZ8pMhIUJIkSZIkSZIkfUX+CEqSJEmSJEmSpK8YV+lwUUHX9ttvL6kZSicMGsk1l8fytnL1Ww8fk2YThXt7XRjBCw8huhZfoVqS7rvvvurzGmus0WiLwtS9gKejQFSAS2qSy3BCVMBf4gWeZeFhJBvLSvKepkJxa1u6Z69AekIkuVuOY7etMtWrbbX6bgNZ/2gFbvrDUzOi+1yO0yi9KCpopS2y1bJA3VMjuD+eajJSKYikaLAyu1SnHFHo7MXhCLeQunLuuedWbaxMzjiS6mLy/fffX1IzTYeC8Q9+8IOSmtLaFGBzr3wepACe9J4jjjiiamOb9zVpIfSnzwvf/e53NdqQDudzEHbnqZZcQ5lW6fuzz2DTdctUubb5cyRZYoklJDWfWxTg0y/+nOBZEAk6kEJ0zz33VNsuuOACSbWkM8IKknT99ddLkrbeemtJ9fuNVNvkpEmTJDX7B/vhHPz9BJv0bYwbUqJIg5KaY2So8DcQfZGkCRMmSKpl1z29jdTeSP6f1DhPkaP/SaPyuZNrZ38XYChTx10On/25V566yBzt58V8h334/IcwykgRybZHRP1Zptvvtdde1WfS2LApty2uk/vmzxg+M2a9L/jb0TIiPEeYS0eS3nkjSJIkSZIkSZIk6QDjPhL0rne9S1Lz1yYepbZoT1QszP7RYpd4LPBquChAr0sUt3nc3BvmEtFS05vS5oHvpcL9qPg3WowPD33kMY0Why29MINd4I9zwLvonq9SyKOXoXgSD1GbWEc0D/RiH+C5jCSEubc+t2Cbbjultz2SswbvI2wUT5z3N/vhbY0Wc/VzjpYp6AR4J12Kf7XVVpMkLbrooo1zleoIEGIGFE9LdVRpyy23rLbh+eXZ4ZEOPPL77ruvpKYMNhEd7oN7P+mzaIkBxrDL6uOR57653PhYREI430gG1+fGNtEhxnAZ4Y6IRHOiRaDLqKY0chEjIjvIhUu1JDxtvjg7kRnszYVLiCp5ZOTzn/+8pHrO8yhM+fc8asI4PPPMMyU1r7uUk/bn0i9+8QtJzTFOP0YRVZchHy4eDeM66TO3mXL+8bHE/fVCfOwzsimOwd/zKEMp7uFjvVyM24neBcuFc/15j9z+SDHYbKNomRfYYYcdJEkrrbRStQ37oc88A6GM6Pg5lJEgj2RHy10wfugzvw8jRUaCkiRJkiRJkiTpK8ZVJCiCX/3+6xQPUvTLvqwriLyckdeJbeTQeiSo14kiNfRLJPMMSEr6flFEaKQXteskg5Xzxu6ee+65alsZOWqLjkWyw+zv/bX88stLkqZNmyap6YHnvrnXplfBGxf1WRnZ9bFOG/baS3LskTcdbxsez0jCP6qXwi4iydHIO4x3tW1B1eheRIsNRjVsnQBv4de//vUBbfxN95SXi3S6xDLj1SMceD3pf484cZ3UErn0K/cm8soi68uxkDWW6iiWRwqQKqa20J85ZQ3maMD5R9EJv95yUW2HbW32Tf/698
sFad3GuJeejTBSkSDugS90yz2jX9xWpk6dKin2lEdzM5FA7r176zkGNu/zPXMbtu/9VdZaeAQSb7t73UvPvUdzB5ul0Ib3weOPPy5JWm655SQ1a3W4n0RA3VbY/4477qi2cR9KG5MGLsQbLdIZ2WspR+5zBHOQR/64p5wLCzFLdZ1Xpyjr5NreaT0iFd1D6hs33nhjSdJtt91WtbGwLMfwOb18r/EIG/eZZ8xTTz014LzcFjk+Y8Aj38y1nSYjQUmSJEmSJEmS9BX5IyhJkiRJkiRJkr5iXKTDERIsw+VSHZL0MGqbZHWZzhHJ7RL2879HyD2Sk+51PK2tjbII1sPaUSE3eOpAtxOdv4fHwcO4gP1E6QekPBAO9nB1Kffpstsehpea4XyO30uy0LOjTGGIBAHouyhFM1qNvNshrcBtoZRKfvDBB6s2pFsjMZco9YhUmij1tyxa9zRCtpHGEEl4O2Nhf5wjRd/lZ6mWNe4WEFvoZphvvO/ahG2iNGhsi5Qanz9pi8RjSjvyfUhnLMV5RgKkqGfNmlVtQ/QAQQ7Sh6S6DxgnPl6i+Z79SeX05wSpVoxrH89lEXm0TMfs/u/H9O9yrr4/ktpzg79TYD8Ilvizk2cf853fc+YvRE2kWs4aAQV/R2Pup3/d7qIifeAY0Xsm980L/plnSN/y+z1jxowBx58bOJfo3bcNltL4yEc+Um2jXy655BJJzWVNSgEF/3s8G+hfv7fcP9Lg3I7mn39+SU3b57hlWqzUFA/pJL3/dpQkSZIkSZIkSTIExkUkqK3wEa+5LyKJxGObh9JlSkvK4i2p9kDxK9oZ6q/0bsMXIxvKwq/IePr3Ig9U5PXrVvxeRnLAgPdnsJFBvEx4Vdy7iqeEvnOv0wMPPNA4jkuZcq6RDGavUcqURp7lNu8x/RstdtutUIzs9w+7oD+mT59etX384x+X1PRwsh/jNZJyjWRe+cw+bvdEqFh81IumOVZUJJ/0BpGgQClR7zZZCgZJAwuv3QbKBY993JaRWrcjxn65kK9UR0GJJpTtI4GPM+bhcj6W6qgE1xa9W/icXspTO/QHz5fo3YJokbeVzwB/vpRjPfo7Hs3oxOLm/rfKpQCQspekFVZYQVL9HGUxZKkW6bjxxhsHHJf93QbogzahIPrCbbmMurlt0+b9y3sP0cDBZtIMh0UWWURSLZCExLSfG/Oz2x3XycK8Uv1evNlmm0lqisrwjI2W3cB2o4gw+yMYFgnzREtAgB+TOajT5NMpSZIkSZIkSZK+YlxEgkoZV/cybLXVVpKasn3UCc0pL1ZqesOQvzzwwAMlSbvttlvVttZaaw37/LudNllh7y+P/EhNj06J930vRSqiepNI4huvhXs+SslT79e2PsDzFEmzl94RP7+2Wrleg/Eb1QQNRYJ5wQUX7OyJjSCR1x1vMl5olzGNPPKlRLbbS1n3E9U/ss1tHA/sRRddJKk593GfvG6tFxeqTZrgFWZ+8fuLJ9+j3oOpE4K2xcQj2Xa2MZ9K0sorryxJuvLKK6ttUa3mWEAkgH992YSkrqFhzvFoPW1E+ogMSXUEyLMfqMni3cMjCxw3kvEvMwx8zmJ+jJYLiKJDzKu8L3oWktfZDBekrKVapp1FWL1eCwlxrmnmzJlVGxFLr6fabrvtJNXRJX+3K2tKPeLEc6ocn1IdzeQ+RpEgj/7Sztj1OcWjgJ0kI0FJkiRJkiRJkvQV+SMoSZIkSZIkSZK+Ylykw0FUrI9Ms8s1zy1TpkyR1Cw0c/GA8UYpLSvVIVZPtbrnnnsa+3ifR5KeMJKFg50mSrGICnAJKXsfkNZUSnU6UQE7hYeEhr2/SjGAKF2vlE7tRTz8LjXTD7Ap0hTa0v8oUu4FsBef17hWZOURJ5Dqgl+3R+y17COp7qdSIMG/h+34OXAvbr31VklNmfY111xTUtNGMx2uN2gTEiBthjQjUsql+jno80x5LP8/dlamY/o2bD+yHeZPCrklaamllprtuS
fdDTaFjfkczTyHJDtS5JK07LLLSpIeeuihahtSzPPMM4+kWBghWi6gTMmM7DUaH5FwDM9pUkgffvjhqs1FHIYKf3/LLbestiEbTb888cQTVRt/l3TBVVddtWrbeOONJTXLRMqUc0/X557wjHExA55BzAku8kG/IhfuY51x7Kmz/E3S4lycwY/bSTISlCRJkiRJkiRJXzGuIkERUdH0YOSdo+/jdf/0pz8tSZo4ceKA/YciId0reAEqRJLXZV97lCLyNkMUaepWosXpoogO3iAvaqc/8Bq1Sbs77I/3xT1l73//+yVJRx555IBjjocIELztbW+TFEd7omJWKD11Lufc7UTRVqIuRJ4piJVqL1rk4WyTL40EFUpbjc6L4nifH4gK+LwwWDtPxpa2+7TeeutJim2yLToeLZZafs+/Xz4f/Hv8TezchXf4Oy78gVBPJPmdjA2RWA/3nDnK5xPEDPjXi+O5rxtuuGG1jYgR4hMezWBO4hwiGWzOK1qMm/2j54w/5/mbRDHOP//8AfsPhw022KBxPlIdHaF/iLhI9fsp0TQfZ6WEvVRfcxkRkmphHJ47LmZA1A2RBh+DLB3DWPV+4h3Jo8pkznA/XAiDv9NpMhKUJEmSJEmSJElfkT+CkiRJkiRJkiTpK8Z9OlwU9hsK0RoGpIFEKW/jKQ0OPPzquu1Sezqchy/LdAhPZfBwaLfjq02TXhGdPwXiLhZBP9IXUWpgtGI14XdC3q61v8oqqzS+H60yPh4o06w8taUs/nd7Jb2BeSBarb1b4Tp8DiMt4fHHH5dUpzo4bWmQno5Q2p//ndJWvSj1hRdeaPwdipGluNi1bR2YpDfgvmJ3DjboY5K5H5vyNo5Faqqn1mA3zKlegE0bY8DTkhDi8ELvct26ZOyJUhK5T7xX+btCmXrr9/fuu++W1JwDV1ttNUm1jXiqPYIuUUo15xWlw3EO2J9fA3bnqWakg3EsUvTmFtbAuvrqq6ttG220kSRps802k1SLRUj1mnisU+Tnzbj0NZY4X54R3j+898w///ySmu+5pGSfd955kqSDDjqoamNNOdbW9Helsl+l+r6R8u/vkPPNN59GgowEJUmSJEmSJEnSV4z7SNBIQNHwhAkTZruPezN6vSDTPWplobT/v4yIuHRu+T33tMxttG40cU86ROd/wgknSGpGzrCJyDtaepQ82oP3C2883hJJuvDCCxt/1+8B3hv39vQq5Yr1fh/aVvsG7K+X+oL77HMJ0ZdrrrlmwP54PX3F9bZxh61FkUm+F3k/8dwzvmfNmjXgHNrklpPeA5vivnpEFXuIRDR4drg3nOgOYidEFqXaq1/OlQ4RpMmTJ1fbOIbLZkOvP3/HE5FwFFEeZM5dzAAJaJ6jbmPIQkciLOuss86Av82zkeeD20X0XAf+ZiTRzBztz2RsmO91OtPF++Dyyy9v/OsgkkAkaKWVVqraIjlyri+KlD366KOSpJtuukmSNG3atKrNr72E50eUIcXzhvvvx0LY4rLLLqvabrnlltn+nbkhn1RJkiRJkiRJkvQVGQmaA9Ev2OnTp0uS9t5779l+bzzlwXvOLREK+sU9vi+++GLje57PiZcB73Obt7pXwIsSScRefPHFo306kpoS0HhRI3nkXmPFFVeUVNufXycL0+JtZl9JuvPOOyVJCyywgKRmTnm3Qz0Z0qNSHd27+eabB+yPhx2vnTSwTsrHHRHtaCyXufHuNXUPvCTddddd1Wf616NxDz744GyvMekNpk6dKkn63//9X0nNOh7wRSHx5jPuiPpItY1QO+F1luVyCdQUSLXHmOeK1z+wgHnWAfUe999/v6R6TosiO+UiulJtYx71ISJIZMRrypiHiDJ4ZAcb5Lnikc4yQuLP0yjrgGgG83A0V48GyE3z74wZM0b9HJA033bbbUf9bw+WjAQlSZIkSZIkSdJX5I+gJEmSJEmSJEn6ikyHmwNRqtYVV1whSTrppJNm+73xJJXtRdiHHnqoJL
OO0wIAACAASURBVGnSpEmSmgWvZTqcSzn+6Ec/klTLVLpggK8C3e389Kc/rT6ThvXEE0/Mdv+RLhAvi3499Yii0ieffHJEz2E0+NCHPiSpTqvxVcJ32mknSXWaw/LLL1+1IaP6qU99SpJ00UUXjfzJdoiZM2dKaq6aTeplVGxLAexY4pLwcMYZZ4zBmSSdhHFDig9LAEi1MIanBZPyxvPT0+eQVCe9yAvhSS8qpe19Gymxt95669xdVDLqRCIVpOWS0jhUzj777Lk6p6S/yUhQkiRJkiRJkiR9xUv+O55CFkmSJEmSJEmSJHMgI0FJkiRJkiRJkvQV+SMoSZIkSZIkSZK+In8EJUmSJEmSJEnSV+SPoCRJkiRJkiRJ+or8EZQkSZIkSZIkSV+RP4KSJEmSJEmSJOkr8kdQkiRJkiRJkiR9Rf4ISpIkSZIkSZKkr8gfQUmSJEmSJEmS9BX5IyhJkiRJkiRJkr4ifwQlSZIkSZIkSdJX5I+gJEmSJEmSJEn6ipeN9QlEvOQlLxnSPv/973/nuF+0zw477CBJ2nHHHatt9913X+N7iyyySNV29913S5KOOuqoOZ5z2zkNluEcYzB9N1T23nvv6vMuu+wiSfrlL38pSbrllluqts0337zxvde+9rXV58svv1ySNGnSJEnS4osvXrXttNNOkqTbb7+9Y+fcLX3n0B9rr722JOmNb3xj1faHP/xBkvT2t79dkvSqV72qanv66aclSVOnTpUk/etf/xrR8+yWvnvZy+rpCdtaffXVJUnve9/7qrbTTjtNknTttddKkl7zmtdUbfvuu68k6dWvfrUk6eKLL67apkyZIkl64oknOnbOQ+27kei3hRZaqPp84IEHSpKee+45Sc3z+9vf/tY4h7e85S1V2ytf+UpJ0ute9zpJ0jXXXFO1/eQnP2n8vcHOxW10i831It3Sd1tssUX1+eyzz5ZUz10zZ86s2l588UVJ0u9+9ztJ0sSJE6u2xRZbTJK06qqrSpLuuuuuqm3DDTeUJP3+97/v2Dl3S9/1Itl3wyf7bvh04t3ayUhQkiRJkiRJkiR9xUv+2+mfVR1gqL942f9//uf//6b797//3br/uuuuK0n6zne+I0mab775qjY88f/4xz8kSc8//3zV9uijj0qSTj/9dEnSKaecMtu/wblI0n/+8585X0RAt3gL8JhLdWTsbW97m6Smt57Ixktf+lJJ0iWXXFK1XXXVVZJqrx/RH0l697vfLWl8RoIWXnjh6vN73vMeSdKf//znAfvde++9kmpPqEczYOedd5YkHXbYYdW2TvYZjHXfEeU58sgjq20vf/nLJUn//Oc/JdWRCqnu4+gcfvWrXzX+fcUrXlG1cZ3Tpk2TJO21115zfe4jEQmKIi3R92jz69h6660l1ZFbj/bQl4xX+liqI0f08zzzzFO1bbrppnM8v6Ey1jbXy4xm3y2wwAKSmtkQZAc4f/zjHyXVtuXz2QUXXCBJ2nPPPSVJDz30UNX25je/WVIdGX/DG94w4NhPPfWUJOncc8+tth1zzDGSpF/84hdDup60u+GTfTd8su+GT0aCkiRJkiRJkiRJ5oL8EZQkSZIkSZIkSV8xrtLhokshDP+1r32t2jZ58mRJdcoaRdPSwJQu/pXqlJBHHnlEUjMVadddd53t+fF3hpoW1y0hU9L/pDrN4U9/+pOkuoBVkp555hlJ0rPPPiupWdz/l7/8pfEvBbNSLZpAwWwnGIu+i9KCPvaxj1XbllxySUl1apLbVpnC6W2kiNCfM2bMqNrOOeecxt8eD4IcXK+nU77+9a+XVKeperopY5xx7Of/29/+trE/YgCS9KY3vUmS9K1vfUuSdOKJJ871uY+0MMJg7vNZZ51VfSat7cEHH5TUTEN97LHHJNX9vMEGG1RtjE/6aPnll6/aOAaF7Q52O6eU5JKxtrleZjT6bv/995dUp+J6WimppoxNqZ6r/vrXv0qqBTakgePUz4XnA+Iv/swsU+
J3UrnuVkA+BXb32msvSSVYWypF6+rvS71FFQcN5u8XvOAFknqtBsccc4ykdtIEAn89Xeu//du/SSqpsU8++eSujf4gSPvuu+/u2gg0r7X2c6MukjpdVg2H3/L7ypjy9RCYE30NRANfp22Wigaevm5ZgVuWPGSffVxeOb7PjdM1rnfeeWdJpRi1VNZI0lt7wiDkDo23a76Z010OsO7yDOJrJX3cKsZd96tbOugf5g23CnB+Ph7of47v1ozJFAidF55kgLIO/JZfE+fNGPTi7FdccYWkdgFhH3PAmsH5410gFWs785engKfPsGZ40gT6xdcj1n4swiQhqPdbUNZdd93uMwlZ6DO3lN11112Syr30JCxYjEj64N9FbloWP/bxNQlLXMsSxH1gH/eYYj+f0xgP/I4X5nWPlqlkMJ+IQgghhBBCCGGaGHlLEFoUf6vFOsQbfauQaituiLfY2iLk29h/0LXL84P71aI5aKVF7QfaCPretX/D0FctbTlWMOTHNRW0uTYVDQsakFZ6U/Zppcqmz9wPHA1LXQxPKtrmYYhXa2ngvZAkMtKKCappxQuhKWtp0vsVOBwEXE5qy4HH673zne+c8N1llllGUtH8eWrTOg7N+w1tI/OmyxAFkt/4xjdK6tUgewrU6aZlLai3+fxNuYSjjz5a0vzHNrh2mHUFS9vxxx/ftZF6l+O7RZMxyTj3ua+V/ryeM9zSMV0wd/m5sS62UnS31sr6vN1KVKfB9jmVY/HbHjNVF4f1eZAUxd7XzMfzipubX7i/G2ywQbcNSwV90CoPgbXI5+g69bhvw2rtliCO29LWtwrqAttYJ1qpiv0+1OPIC9Ni4fC08JOF81h99dW7bVhhb731Vkm9ax/7M1f5WCf+yr188Mrg+cRlmPWzVaaDe0TBYqwoUonj4fzcytQ6r9pb4eUvf3nX5gXkFxSf55EN4pL9GQSrMxYv+ksqMVM+nukDvufxRfQd+2PxlIq88ixIfJJUxh5y10qH73LH2KCvmbOlqSnS2yKWoBBCCCGEEMJYkZegEEIIIYQQwlgxku5w7rqBmdwD6mh3EzRgXqwTJHhb/Vcq5n6+NwwuXpPFK1bjKkEA4WTdM3Djwaw6FcGVMwkuSe5OxT13dyBADtyVCfNvKyEH/YJ7QyvRBtXsWylWcVn09J3cm1aQ7jDg11K73PRLg90yuSN/rYDZQad1PUB1b6lUpvfkDwT88j13Kardr3ws1y5dPl7POeecuZ7rZBOlTAUtN8Z6m8/DyEDLDY5AX+8fdwOuYT9cgjzhBClyCfhupTjmHFouSK3U1Jx7y012qmEu8r7g/rfSyHMt3q91qn8/b9Zk8GMip60U2bULrK9LpPV1+ZuulPeHH364pJKCWCr3nNTnuH1LE58l/DoYcz72kGHWGnfXp485VmsN4a/PFfwm99HTSreeWVgz6MPTTjutazvqqKMk9aaTnyy453qqZAL2Sbrk95zzYB11meQZxPtnm222kVTSU7urHH3AuHb5YHzxTOcyWif58DW9fh6SijsZrnV+v7n+BYEyBe5uhnsabmMuR8suu6yk4trsrtR89kQKXAMude4qx32g/7lGqawVPAd9/OMf79pwIUZm9ttvv66Ne0PyHUn67Gc/K6kkHTnzzDMn7D/VxBIUQgghhBBCGCtG0hLkRdTAtQu1ZsY15bWWsJX8AI2JW4L4zD6DXPhzfrn88su7z/QH2iov5DgZ6kD1YQHtj2t10eqgyfI0oq0AfmSQoFTXWtKvaFpcu8U2NDSeVAKNDhpI194QJOzFHueVkne28H5qFTalDxizk7ECSOU+MB77afcHFZeF2tLCfZeKJpUgbWliKl2fz+pCf27tQQvNfXGtN/LeCsTul7BiumilXWWsuWaUNN+77babpN40wQTduqZ5q622klQ0kJ4EhqB45gBP4sH+dVC3NDFRT6voYL+SDTPhYcD84usbv9saPy3PCPbj/F1usWhz7X7/WokXwJMk+L
H9+C2r0lSDTLlWGzhHtyRw79DMu9WANh97jKtWMp7a6uapx7GcI2+tAvHMqT630uf+zFLPCXWx3AUFC4Lf52233VZSCaJ3GeN+ci898QXH8DWZPqivVyr3jWN5cgjGPX3t80ZdoNqD9TlG6/mS/fx6mKMXhBNOOEFSbx/stNNOkso85M++PC8wvlrPq37PGY91wVKp3DfG2be//e2ujZTxV1999YRzpn/4PU8SQT9hxZWkT37ykz3n7G0//OEPJxx/KoglKIQQQgghhDBWjKQlCH9QaWJaa6m8gfJ26pYg3uhb2q26kKXHZtTanqlOyzmbuP9uXbyvFVfVok6HOAwFUlu4JhbZ8KJr0E9zRh+61rnW0LkGC21KPy0p5+UaLDS6rh1tFTscBOaVppox2krfWxdS9TbkrV/ftVItDxJ+/5h7iDdxbSbX4XKFjNJHrWO1LG/0M/LUsvAg9671nslivPyuaz/RknK+rYK7O+ywg6RikZCkM844Q1LvuEXDTB+7Pz+aUdKF+7hjDUET6xaMel5oxfi4rDJPck/dyjddIDMe/8i80S8myWWkjm3076FhZpz6+lv/jltIau8BX5co3eAy7Gv3TIEc3HnnnRPaPC55XOF5wdd/ZJo5zZ8piFlpzd+k7fZ01jx3UUjV5YHnQixxHmNy4YUXSioy49YlPDyItfRnSc7PLcGsOcSFudxOhQeMx2T2i8/E6kS8o8epMXe6ZYr5inHpcxXxTViAJmuR5npPOeUUSSX1tR/j9NNPn7A//elycumll07qN+eXWIJCCCGEEEIIY0VegkIIIYQQQghjxUi6w7l7hJsAARM7Zk13XauDU92k7qlnazAdjlJChBYEYpPm+d57753U9zBz4mbj5uNhgPN3eeBaSA3pstZK3wm4g3jAZJ0C1M3//A5uOW6KRobpT2/DVO9uLV75e5ggML8V8Fq7dblbTr/UvjCobnDggdHAnOVpxKGV/AI3Nb9W9qOP3GWJ/uqX6OD888+XJL3qVa/qtrXcgKYLZN1T1NepgH1MktDh05/+tCTpW9/61oRjumvWnDlzJBVXq5tuuqlru+iiiySV/nS3rTrlsMse23Dz8PmklfSDeYG/7s46Xemy6U8/N+al1lhppVjGtYbrbLkl4Y7kMsbvMOf1cyv2NYRz9pS/o+SSPiq03LYZl8i/30PuP+OyVbLkrLPO6raRsIQ1z8cIiU5YS5i/JOmCCy6QVFzfeL6RyhrLeV1zzTUTzs/HCt/lutw9eSZLp/BsNtlntOmA++wJEQaNWIJCCCGEEEIIY8VIWoJWX3317nMr7SjaBN7wW1pi3uxbhSn5nmupsD55GtVRhAJ1/QrOtkCDiIWjX6D6IMJ1enpTrH5oHD1IkcDKVprnVvB4re13DVYrWQJwLDQuHuhYFywcZug7NOKtIootqwX9M8zFi1vJLLjfnv4cXL7qpBH98JSo7N/P2oBlxK0gzAt+XtMVzI+Vwa099XW2UgFfeeWVknoT6DBWWimWv/e970nq7R+CjVtWNO4X5+LpodESkyyB40ilkK8HA9eWzMkWp14YsLCstNJK3TbmOLeUQe1ZIZWEFPS/J/DgGKwJ3oZlkz7wedH7378vTUy8I/Vq4MNggPw///nPn9BWp7eWJqZF9zIUPGth/ZGKDLbmLQpvcsw77rija8NyWVsipZIEpbYyS2VtbaUxR1632GKLrq1fIoMwO8QSFEIIIYQQQhgr8hIUQgghhBBCGCtG0h3OEyO0crvXdRwcTJ2YN929DRc5vudmUdw/VlxxRUnSueeeu5BXMZjgKoFroJuNyT3v9TeAoELcI1qVxwcZXC+8KjLX0HK78GDtmcTdDBkHBIIOMvNKTkB9AdwofOzVLlCtxAgtN55hweULucLlgvoNjrsX4SbE31Zlev767+B+xd+WOyFj34OI2Y96adL0ucO13DxxjeM8PFEN18k1eX2v2tVKKvN8y8WZNYRz8PUF1zruQyuxBffD3X
3WX3/9CefCvElNHK+JMl3gcueuhK1aUjXuSsh+LZdc1g7ukbsutVzNJ0MrcU3rnobZpZ6/pOI+yr3zZwrkCBlx11fG1wte8IJuG66uuKP7se655x5JRZZ9XC611FI9v9Nyb+VcvC4aMuxrDnLHvLHeeut1bUcccYTCYBFLUAghhBBCCGGsGElL0JFHHtl9Ji2iaxDQCreC+wkAZX9vq9Nme8Ay2s6zzz57iq5iMCF48XnPe56kXg0xSQNaliAsSGhRhi1QHa2Ta73BK08DGvd+mtOpoA7899/DUuXWoZZWehhwC4bUq6mjrZUEoW5rMegpsj0YGA3n1ltvLamdIGKFFVbotiG3aCe9EjqWn5aVgbZ+qcVJLOCWJ+SvZWWfLlrWMPDrxarCnO5tyIzP91w7fexWBlLw8j23dpEUYH5TWLc04dyHySS2mCr4LSw2UrHM0Ieefhrt+yabbDLhWOznY5JkEFgDfB1lG+O2ZdVsUVue/BhhcGCc+L3hmQIZ8/m4TqntJQFIWOLWVCxNrXTztTy01kLkyGWttr76cZBXtwTz3Vaa7iuuuGLCb4bZJZagEEIIIYQQwlgxnGrhefC1r32t+0xxz913373bhv8n8UJeQJK3fN7wXRvG51Zqxne9612SZrZY4GxAGle0laSPlKS1115bknTVVVdJ6tWm1P7xHGdYQKvTKr7binlAozTdVoZ+lqa60KI0vJagSy65RJK0zjrrSOq9Dvq4FfeCVnuYY/ROP/307jNa8y984Qtz3Z/4qemGe7Lmmmt22+hv5tjZxmPzZitOb7Kw5sx2wW2sN15Mtl+q9QcffFBSbxHJddddV1LR7ruFBhlefvnlJfUWx2Q9wQKAlVKSbr755rme8xlnnNH/osJAgDz4WHzxi18sqZTfcHnACu5WW2BN9nT8dayejyVkqlWwFW8JLK/+XMN5cayll166a8PDxc+Z8cBa5fGotIXBIZagEEIIIYQQwliRl6AQQgghhBDCWPGIfwxgVPB0B4ESwLn99ttPaLv22msllUA3d39affXVJUmnnXaaJOn444+f1vNckFsz3X03Z84cSdKb3vQmSb1pZo877jhJJTDY+epXvyqpuNF95CMf6douvvjiKT/Pqe470te6i98HPvABSdJuu+0mqZjn/VjTPbzq33FXMCpk//d//3e3DZeSG2+8ca7HHES5A9xs9tprr24bQfi4N7hLKq6xLRex6bhH83usqew3juUyyvlM5nf6nft0J/gYZJkbdAa573B586QSuBXhIudJE377299Kku69915J0z9/DnLfDToL03crr7xyt+3YY4+VVFzk3XWSeYe/PreTpIiyCVJxteR3PMkILtSt9Ou4v7GWECohlWccXN/8mZBnnVaiJ87rfe97X7eNZ53I3YIz1XNCLEEhhBBCCCGEsWIgLUEhhBBCCCGEMF3EEhRCCCGEEEIYK/ISFEIIIYQQQhgr8hIUQgghhBBCGCvyEhRCCCGEEEIYK/ISFEIIIYQQQhgr8hIUQgghhBBCGCvyEhRCCCGEEEIYK/ISFEIIIYQQQhgr8hIUQgghhBBCGCvyEhRCCCGEEEIYK/ISFEIIIYQQQhgr8hIUQgghhBBCGCvyEhRCCCGEEEIYK/ISFEIIIYQQQhgr8hIUQgghhBBCGCvyEhRCCCGEEEIYK/ISFEIIIYQQQhgr8hIUQgghhBBCGCvyEhRCCCGEEEIYK/ISFEIIIYQQQhgr8hIUQgghhBBCGCvy97yw1wAAAIlJREFUEhRCCCGEEEIYK/ISFEIIIYQQQhgr8hIUQgghhBBCGCvyEhRCCCGEEEIYK/ISFEIIIYQQQhgr8hIUQgghhBBCGCvyEhRCCCGEEEIYK/ISFEIIIYQQQhgr8hIUQgghhBBCGCvyEhRCCCGEEEIYK/ISFEIIIYQQQhgr8hIUQgghhBBCGCv+P2djJnJMsW1uAAAAAElFTkSuQmCC\n", + "text/plain": [ + "
    " + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "# takes 5-10 seconds to execute this\n", + "show_MNIST(test_lbl, test_img, fashion=True)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's now see how many times each class appears in the training and testing data:" + ] + }, + { + "cell_type": "code", + "execution_count": 111, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Average of all images in training dataset.\n", + "Apparel 0 : 6000 images.\n", + "Apparel 1 : 6000 images.\n", + "Apparel 2 : 6000 images.\n", + "Apparel 3 : 6000 images.\n", + "Apparel 4 : 6000 images.\n", + "Apparel 5 : 6000 images.\n", + "Apparel 6 : 6000 images.\n", + "Apparel 7 : 6000 images.\n", + "Apparel 8 : 6000 images.\n", + "Apparel 9 : 6000 images.\n" + ] + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAA0IAAACBCAYAAADtygrzAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAAPYQAAD2EBqD+naQAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4wLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvqOYd8AAAIABJREFUeJztnXd4VVXW/7+BdFIgEHqLtCC9VwVEkBJEBUcGR4o4oo6vZXgFeeeVgGIBwd4FQaxY0BdxAEEQURCCSLeAUsQQOtJb2L8/+K1z19135eQkuUkuk/V5Hp5cTt17nV3OWW2HGWMMFEVRFEVRFEVRShClirsAiqIoiqIoiqIoRY1+CCmKoiiKoiiKUuLQDyFFURRFURRFUUoc+iGkKIqiKIqiKEqJQz+EFEVRFEVRFEUpceiHkKIoiqIoiqIoJQ79EFIURVEURVEUpcShH0KKoiiKoiiKopQ49ENIURRFURRFUZQSR5F+CIWFhXn699VXXxXoPtOmTUNYWBjWrVuX67GdO3fG1Vdf7em6u3fvxvjx47Fhw4Ycj9m/fz/Cw8Px2WefAQAmTpyIuXPneit4kCgqOf8nMnPmTD8ZhYeHo3r16hg+fDj++OOPPF+va9eu6Nq1q9+2sLAwjB8/PjgFvsSw5RsdHY3KlSujW7duePzxx7Fv377iLuIlyYYNGzB8+HCkpKQgOjoacXFxaNmyJSZPnoxDhw4Vyj1XrFiB8ePH48iRI4Vy/YKwatUqXH/99ahZsyaioqJQqVIldOjQAaNGjSrysuzYsQNhYWGYOXNmns/96quvQm6s9iLb2rVrIy0tLddr5bV+7777Lp555pn8Fj1ohFL7kvAq/0sVex4JCwtDcnIyunbtinnz5hV38fLFc889h7CwMDRu3LjA1xo2bBji4uJyPU56PymK+xYGBRkbwoNcFldWrlzp9/9HHnkES5cuxZIlS/y2X3755UVWptdeew1hYWGejt29ezcmTJiAunXromnTpuIxn376KWJjY9GjRw8AFz+E/va3v+Haa68NWplzIxTlfKkxY8YMpKam4tSpU/j666/x+OOPY
MT4PimNtFfovjwdv90XeJ+g5AckM54sQUqu4LbMAbV1coPj7n5Uz8zMTAD+9aVnWqtWLQBAcnKys49kx9+Vbdc4uqZdt2CjFiFFURRFURRFUUocIW0RcguUdLMISdYYifxqJt20BvYiVoB7MGlRWkPctONuSBoQSatiL+rlFnwupeQORezn4xaYza0xpInmqX1JHl4W+JPailsKUSoD14CT9s3NulFc1jj7vpKW1C05hBtuySEkrX6wFloMNlJdqc3Rc+YWIZIXb4d23by2Obt/82dB1ijS6vJF+UgDKI0LwbY+2tZQwDf+Stpb6ou0T1qolltZ7KBhyRpMGlKu4SarDfVJaR6j83l/pTlDSlpBmnF+fDD6rpvlWbIIkUykhQ6l42mftGiiPa7lto+QEjZ4uVawxzqpPdDzlayAeV1cViIvdQh2+yAKYi2ndsPHLbJYSGmeCWr3kkXILQ1zXtuD2/slPVP+7kJjoWQRoqQpPOBfWmTeKzSu79ixw9lGYwylb+eJVkiOUtp2kh1PGGO/p/LnTH2N7sOtUmS1IUs4n3/Kly/vt43+D/isQ9LC1lQvnniBj33BJjTfABRFURRFURRFUQoR/RBSFEVRFEVRFKXEEdKucRwvQYKSqwuZ2qTA2PzcN6d99nE8gM/NNS6UkVweyDwpuYvYwaDSCsySa5wti+IK4HdDcgWkbWT25W5wZBbndbEDat2SAkjbJDclKgOZrPn6FSRj7qYRask5pD4rJT0IVvIC6TrS6u6h1AZ5memZ0jbukkFtgbuq2QGwktsgIblcSi4m1NbI5UO6X7AD+SXoHrzN0zZpLRw7qQl3C7ED8QFff5WuRc+B+jx3jaPySK6s9jonfF6icURyUaSy8mQ1wegTbi64UuIByQ1YShZD26S1c+z+7ZbgQEq4YwduAz73Gn68PecEe8xzG6ultX9CgeIuCz0TPldS3yG3N+7+Rm2F2h1fI1JaR4iQXCXzkjxDSrgjJcmQkiXQb2kdoYKMi9T3f//9d2cbyYXuReuOAT5XQ/svIMvOTp4juUPTmM/rS9uoP/I60nkkJ55Yh64hucbRWELvUfxahYFahBRFURRFURRFKXFcMhYhG+lrXkr/R1+YkoZG0pLaGnO3FItc22BvkwJxvQbnhQpUJingmLQifJ+9gjbfZ6eN5F/3oZYswS09tZQyVQqY3r59OwCf9pyfK6VY9VIGqa2Q9oU0Jzw9JWmLJE22pK0uDqjN5Ba47rYquK3JkmRH15KsY3RvKeVtcSIF51I/omfLg1apHbqlRJWsXlIbsC2gkkWIgmS5RYg0kpJ1INgpye3EEXwbyYxrG0kLSvKR2pyEZJ2g8YtkLvVzSa62RUh6HtJyBdQ2uRWkIH3XLQhesghRnWzNMOBLDpGYmOhso/YptTvbUiNZhEjW0hhJ9+bpdClInPcVyWuhqCjucTVUsS2LgK//kuWEW4Soj1Ib4ee5jdvSXOlmEXJL226/2/H+SWWnRAGAz9JB7wO8z/K2nlfovtwaY1vKuOxo7KNxj1vA3axpBB/v6D70/sfT+FP9qG5SqnUaN7h1jKxYvFx2UhZpXC0M1CKkKIqiKIqiKEqJ45KxCOVFw8I1ffRVzL+UyVfSzbfUS4pFaeE7up+kpQgVv2GvC2y6xVzRcdzvlGRMX/hSukbSxnD5SDEaxYmbRUFKEUuaCq4lIW0HLXLJj5fSCbvFa7jFCJHmhLQyKSkpAfeTYm0ki2dht0m3OnGkxRipLZE83eKrpBgLOp63SS/lKw5s2XDLKrU1+ss1kba1D/D1SUl7at9PsqRLfvF0T/JVr1atmrOvcuXKAWW27xMs2VJf5BYhKqekBSUZSLKg324LS0saUvJ95/Jxs0DYGmtJTvxadB9qv9ySXljps6mePA6ItNvUHvhChxs2bADg37eon5L83eQqjYNUXz7H7ty5EwCwdetWv2sDcuyJHeNVlGOdIiOlPqe5i94leDuy36f4M3R7j7MXA
Oa/pZg/N68LexzmVimyWEgWIbJ+8DZZkBghKi+3MNnpsyXZSRZwL9Y0Lh96XjT+eF2mw35v4rKj8YWXwX7f4uNdYcadhsabp6IoiqIoiqIoShGiH0KKoiiKoiiKopQ4wkwI2ojzuno8mdYoUPfyyy939rVs2RKA/4q2dBwFmElpBQkpBbRtjuS/KXg5IyPD2bdr1y4Acspor+m5vZLXIG/bbY+bKcmkyoOhW7duDQC46qqrAPgHv3344YcAgG3btgEAWrRo4ey78cYb/a7/xRdfOPtWr17td63c0j17kUuwZEflpfbTpEkTZ19qaioAn1l81apVzj4y6UorN7ulz5bwkj6bqF+/vvO7TZs2APyTOKxZswaAL5kDN1UXxHUpv8kFSAa8D1J7q1u3rrOtcePGAHxuWDyNKrkASO521OcowPq3335z9m3ZsgWAz8VLSsOcV1nk9Xi31eZpnKpVq5azr06dOgB89dq4caOzj9wVuDsmXUNKoWq7lEhBrnZyBiAwYQpvc/TMeDA9yZn6ieTiEOw2J7kj266+ubnG2a4iUmrmvCYdscvMxwAvLrNSOulgyY62UX+qUKGCs69s2bJ+x/Jxn9xs+DxRsWJFAL5xk694z/suILsC0nzK0+faK9fzpCBUPu6yQy5KdC0pKU9RjnX/aRREdlyG1H5obOJhDJSAg54vbzuSaxxdV3JlpzFNSt5iu3VJ7tXSNaV08rTNHiP4fYqi3dnjiTTeSemz7bICgXX36mbqtkyGlNzMdsvjsivMBE9qEVIURVEURVEUpcQRkhYhRVEURVEURVGUwkQtQoqiKIqiKIqilDj0Q0hRFEVRFEVRlBKHfggpiqIoiqIoilLi0A8hRVEURVEURVFKHPohpCiKoiiKoihKiUM/hBRFURRFURRFKXHoh5CiKIqiKIqiKCUO/RBSFEVRFEVRFKXEoR9CiqIoiqIoiqKUOPRDSFEURVEURVGUEod+CCmKoiiKoiiKUuLQDyFFURRFURRFUUoc+iGkKIqiKIqiKEqJQz+EFEVRFEVRFEUpceiHkKIoiqIoiqIoJQ79EFIURVEURVEUpcShH0KKoiiKoiiKopQ49ENIURRFURRFUZQSh34IKYqiKIqiKIpS4tAPIUVRFEVRFEVRShz6IaQoiqIoiqIoSolDP4QURVEURVEURSlx6IeQoiiKoiiKoiglDv0QUhRFURRFURSlxKEfQoqiKIqiKIqilDj0Q0hRFEVRFEVRlBKHfggpiqIoiqIoilLi0A8hRVEURVEURVFKHPohpCiKoiiKoihKiUM/hBRFURRFURRFKXHoh5CiKIqiKIqiKCUO/RBSFEVRFEVRFKXEoR9CiqIoiqIoiqKUOP4f+pcGnHzZPrEAAAAASUVORK5CYII=\n", + "text/plain": [ + "
    " + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Average of all images in testing dataset.\n", + "Apparel 0 : 1000 images.\n", + "Apparel 1 : 1000 images.\n", + "Apparel 2 : 1000 images.\n", + "Apparel 3 : 1000 images.\n", + "Apparel 4 : 1000 images.\n", + "Apparel 5 : 1000 images.\n", + "Apparel 6 : 1000 images.\n", + "Apparel 7 : 1000 images.\n", + "Apparel 8 : 1000 images.\n", + "Apparel 9 : 1000 images.\n" + ] + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAA0IAAACBCAYAAADtygrzAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAAPYQAAD2EBqD+naQAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4wLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvqOYd8AAAIABJREFUeJztnXd4FdXW/7+BdFJISOgtghCkSO8KiNQEUcEr4pUiXlGvr41XkHt/ElAsINi7IIgVC3oRBAQpgiAE6U1RJIihQ6RDCPv3B++as87OYjhJTpLDzfo8j49hZs7M3mt2mb3aDjLGGCiKoiiKoiiKopQgShV3ARRFURRFURRFUYoaXQgpiqIoiqIoilLi0IWQoiiKoiiKoiglDl0IKYqiKIqiKIpS4tCFkKIoiqIoiqIoJQ5dCCmKoiiKoiiKUuLQhZCiKIqiKIqiKCUOXQgpiqIoiqIoilLi0IWQoiiKoiiKoigljiJdCAUFBfn03+LFiwv0nEmTJiEoKAjr1q275LXt27fH9ddf79N9d+/ejdGjR2PDhg0XvebAgQMIDg7G119/DQAYO3YsZs6c6VvB/URRyfm/kalTp3rJKDg4GFWrVsXgwYPx559/5vl+HTt2RMeOHb2OBQUFYfTo0f4p8GWGLd/w8HBUrFgRnTp1wjPPPIP9+/cXdxEvSzZs2IDBgwcjKSkJ4eHhiIqKQtOmTTF+/HgcPny4UJ65fPlyjB49GllZWYVy/4KwcuVK3HTTTahevTrCwsJQoUIFtGnTBsOGDSvysuzcuRNBQUGYOnVqnn+7ePHigBurfZFtzZo1kZqaesl75bV+H330EV588cX8Ft1vBFL7kvBV/pcr9jwSFBSExMREdOzYEbNmzSru4uWLl19+GUFBQWjQoEGB7zVo0CBERUVd8jrp+6QonlsYFGRsCPZzWVxZsWKF17+ffPJJLFq0CAsXLvQ6ftVVVxVZmd5++20EBQX5dO3u3bsxZswY1K5dG40aNRKv+eqrrxAZGYkuXboAuLAQ+vvf/44bbrjBb2W+FIEo58uNKVOmIDk5GadOncL333+PZ555BkuWLMHGjRtRpkyZ4i7eZQ/JNzs7G/v378eyZcswbtw4TJgwAdOnT/dZOaEA77zzDu677z7UrVsXjz76KK666ipkZ2dj9erVePPNN7FixQp8+eWXfn/u8uXLMWbMGAwaNAhly5b1+/3zy+zZs3HDDTegY8eOGD9+PCpVqoQ9e/Zg9erV+OSTTzBx4sTiLuJli79l27RpU6xYscLnueijjz7Cpk2b8NBDD+Wn+H5B21fgQPOIMQZ79+7Fq6++il69emHmzJno1atXcRcvT7z77rsAgM2bN2PlypVo1apVMZfo8qIgY0ORLoRat27t9e/ExESUKlUq1/GixJcBOCcnB+fOnfPpfp9//jlSUlIQHh5e0K
Llm4LK+ezZsyhdujRKly5dGMUrVE6ePInIyMgC36dBgwZo3rw5AKBTp07IycnBk08+ia+++gq33357ge8fqFBbDwsLK9TncPkCQJ8+ffDwww+jffv2uPnmm7F9+3ZUqFBB/K2/3vF/AytWrMC9996LLl264KuvvvJ6b126dMGwYcMwd+7cYixh0TN+/HgkJSVh3rx5CA72THH9+vXD+PHji7Fklz/+lm1MTIxP81Ig9XltXxc4deoUIiIiirUM9jzSvXt3xMXF4eOPP76sFkKrV6/G+vXrkZKSgtmzZ2Py5Mm6ECpCLssYoddeew0NGzZEVFQUoqOjkZycjMcffzzXdUePHsXQoUNRrlw5lCtXDn379sXevXu9rrFd43799VcEBQVh4sSJeOKJJ1CzZk2EhYVh6dKlaNOmDQDgjjvucMyxY8eOdX575MgRLFq0CH369MG5c+cQFBSEM2fOYPLkyc71/FkbN27EDTfcgLJlyyI8PBxNmjTB+++/71W+BQsWICgoCB9//DEeeughVKhQAREREejUqRPWr19fYFnOnTsXQUFBmD59Oh544AFUqlQJ4eHh+OOPPwAA69evR2pqKsqWLYuIiAg0bdoUH330kdc93nzzTQQFBeWSLd37xx9/dI6lp6ejR48eSExMRFhYGKpUqYJevXp5/fb8+fN46aWX0KhRI4SHhyM+Ph633norMjIyvO7funVrNG/eHN999x1at26NiIgI3HfffQWWiQRN1hkZGRg9erRoRSRz/c6dO/N8/02bNqF3796Ii4tDeHg4GjdujPfee885f+DAAYSGhortfNu2bQgKCsLLL7/sHNu7dy+GDh2KqlWrIjQ0FElJSRgzZozXgp5cdsaPH4+xY8ciKSkJYWFhWLRoUZ7L7w+qV6+OiRMn4tixY3jrrbcAeEztGzduRNeuXREdHY3OnTs7v1mwYAE6d+6MmJgYREZGol27dvjuu++87nvgwAHcfffdqFatGsLCwpCYmIh27dphwYIFzjVr165Famoqypcvj7CwMFSuXBkpKSnYvXt30VQ+nzz99NMICgrC22+/LS5eQ0NDHWv0+fPnMX78eCQnJyMsLAzly5fHgAEDctVx/vz56N27N6pWrYrw8HDUrl0bQ4cOxcGDB51rRo8ejUcffRQAkJSUFFDutocOHUJCQoLXRypRqpRnyps+fTq6du2KSpUqISIiAvXq1cNjjz2GEydOeP2G2uCvv/6Knj17IioqCtWqVcOwYcNw5swZr2szMzPxt7/9DdHR0YiNjcWtt96aa1wELnz49OvXDzVr1kRERARq1qyJ2267LdcYF2j4Klti7ty5aNq0KSIiIpCcnOxovQnJNe5ifb5jx46YPXs2MjIyvFyiihpfZUDuaZeSAeDbeA0AY8aMQatWrRAfH4+YmBg0bdoUkydPhjHmkuV+/fXXERwcjLS0NOfY2bNnMXbsWGdMSExMxODBg3HgwAGv31JdZsyYgSZNmiA8PBxjxoy55DOLmvDwcISGhiIkJMQ55qvMzpw5g2HDhqFixYqIjIzEtddei59++gk1a9bEoEGDCrXckydPBgA8++yzaNu2LT755BOcPHnS6xqarydMmIDnn38eSUlJiIqKQps2bby+sS7GDz/8gISEBKSmpuYa4zi+tgk3Nm/ejM6dO6NMmTJITEzE/fffn6s+p0+fxsiRI5GUlITQ0FBUqVIF//znP3O5WvsybxV0bChSi5A/+OCDD3D//ffjwQcfREpKCoKCgvDrr7/i559/znXtnXfeiV69euHjjz9GRkYGhg8fjgEDBuDbb7+95HNeeOEFJCcn4/nnn0d0dDTq1KmDSZMm4a677sLo0aPRrVs3AEC1atWc38ycORPBwcHo0aMHgoODsWLFCnTo0AHdu3fHyJEjAQCxsbEAgC1btqBt27aoWLEiXn31VcTFxWHatGkYMGAADhw4gEceecSrPCNGjEDz5s3x7rvv4siRI0hLS0OHDh2wfv
161KhRI9/yJIYNG4Zrr70WkyZNwvnz5xEXF4eNGzeiXbt2qFKlCl577TWULVsWU6dOxe23346DBw/igQceyNMzsrKy0LVrVyQnJ+PNN99EYmIi9uzZg4ULF3p1zEGDBmH69Ol4+OGHMWHCBBw4cABjxoxB+/btsW7dOpQrV865NiMjA4MHD8bIkSNRr149cXLyB7/++iuAC9a1/MQKufHzzz+jbdu2KF++PF5++WWUK1cOH3zwAQYNGoR9+/Zh+PDhSExMRGpqKt577z2MGTPGa8KdMmUKQkNDHUvV3r170bJlS5QqVQqjRo1CrVq1sGLFCowdOxY7d+7ElClTvJ7/8ssvo06dOpgwYQJiYmJw5ZVX+rV+eaFnz54oXbo0vv/+e+fY2bNnccMNN2Do0KF47LHHnI+DDz74AAMGDEDv3r3x3nvvISQkBG+99Ra6deuGefPmOQumO+64A2vWrMFTTz2FOnXqICsrC2vWrMGhQ4cAACdOnECXLl2QlJSE1157DRUqVMDevXuxaNEiHDt2rOiF4CM5OTlYuHAhmjVr5jUOXYx7770Xb7/9Nu6//36kpqZi586dePzxx7F48WKsWbMGCQkJAIDffvsNbdq0wV133YXY2Fjs3LkTzz//PNq3b4+NGzciJCQEd911Fw4fPoxXXnkFM2bMQKVKlQAEhrttmzZtMGnSJDzwwAO4/fbb0bRpU68PI2L79u3o2bMnHnroIZQpUwbbtm3DuHHjsGrVqlxuxNnZ2bjhhhswZMgQDBs2DN9//z2efPJJxMbGYtSoUQAuaMivv/56ZGZm4plnnkGdOnUwe/Zs3HrrrbmevXPnTtStWxf9+vVDfHw89uzZgzfeeAMtWrTAli1bnHcRaPgqW+CCEm3YsGF47LHHUKFCBUyaNAlDhgxB7dq1ce2117o+R+rzVatWxd13343ffvutUFw9fcXfMsjLeL1z504MHToU1atXBwD8+OOP+J//+R/8+eefTju0Mcbg0Ucfxcsvv4xJkyY5H/Xnz59H7969sXTpUgwfPhxt27ZFRkYG0tLS0LFjR6xevdrL4rNmzRps3boV/+///T8kJSUFhIs4eTAYY7Bv3z4899xzOHHiBPr37+9c46vMBg8ejOnTp2P48OG47rrrsGXLFtx00004evRoodbh1KlT+Pjjj9GiRQs0aNAAd955J+666y589tlnGDhwYK7rX3vtNSQnJzvxMI8//jh69uyJ33//3fm+tPn0008xYMAA3HnnnXjllVcu6u2T1zYhkZ2djZ49ezp9d/ny5Rg7diwyMjKc2HljDG688UZ89913GDlyJK655hps2LABaWlpWLFiBVasWOEo9nyZt15//fWCjQ2mGBk4cKApU6ZMnn5zzz33mISEBNdr3nnnHQPAPPDAA17Hn376aQPA7N+/3znWrl0707lzZ+ff27dvNwBMnTp1THZ2ttfvV6xYYQCY999/X3xuamqquemmm7yOhYWFmSFDhuS6tm/fviY8PNzs3r3b63jXrl1NVFSUOXr0qDHGmPnz5xsApmXLlub8+fPOdb/99psJDg4299xzj5sojDHucp4zZ44BYLp27Zrr3I033mgiIyPNnj17vI5fd911JiYmxhw/ftwYY8wbb7xhAOS6ju69YsUKY4wxy5YtMwDM3LlzL1rWRYsWGQDmtdde8zq+Y8cOExoaakaNGuUca9WqlQFgfvjhB5fa540pU6YYAObHH3802dnZ5tixY2bWrFkmMTHRREdHm71795q0tDQjdR367e+//+4c69Chg+nQoYPXdQBMWlqa8+9+/fqZsLAws2vXLq/revToYSIjI01WVpYxxpiZM2caAObbb791rjl37pypXLmy6dOnj3Ns6NChJioqymRkZHjdb8KECQaA2bx5szHGmN9//90AMLVq1TJnz57Nk5zyC8koPT39otdUqFDB1KtXzxhzoe0CMO+++67XNSdOnDDx8fGmV69eXsdzcnLM1VdfbVq2bO
kci4qKMg899NBFn7d69WoDwHz11Vf5qVKxsXfvXgPA9OvX75LXbt261QAw9913n9fxlStXGgDmX//6l/i78+fPm+zsbJORkWEAmP/85z/Oueeeey5Xew8EDh48aNq3b28AGAAmJCTEtG3b1jzzzDPm2LFj4m+onkuWLDEAzPr1651z1AY//fRTr9/07NnT1K1b1/k3jYNcRsYY849//MMAMFOmTLlomc+dO2eOHz9uypQpY1566SXnOI2HixYtyoMECg9fZVujRg0THh7uNQadOnXKxMfHm6FDhzrHpPpdrM8bY0xKSoqpUaNGodTNV/wtA1/Ha5ucnByTnZ1tnnjiCVOuXDmv74MaNWqYlJQUc/LkSdOnTx8TGxtrFixY4PX7jz/+2AAwX3zxhdfx9PR0A8C8/vrrXvcrXbq0+fnnn/MgqcKD5hH7v7CwMK9y21xMZps3bzYAzIgRI7yuJxkNHDiw0Ooybdo0A8C8+eabxhhjjh07ZqKiosw111zjdR3N1w0bNjTnzp1zjq9atcoAMB9//LFzjH/zPfvss6Z06dJm3LhxuZ5tf5/kpU1IUN/lY5gxxjz11FMGgFm2bJkxxpi5c+caAGb8+PFe102fPt0AMG+//bYxJm/zVkHGhoB1jaOVPv1n/s+M2bJlSxw8eBC33347Zs6c6Wh1JewEBZTgYNeuXZd8fu/evfNkXTh27Bjmz5+PPn36+HT9woUL0bVrV1SpUsXr+MCBA3H8+HGsXLnS63j//v29TH1XXHEFWrVq5Tc3JqncCxcuRPfu3VGxYsVcZTx69CjS09Pz9Izk5GTExMRg2LBheOedd7Bt27Zc18yaNQulS5dG//79vd5/tWrVcNVVV+VyvalUqRLatm2bp3L4QuvWrRESEoLo6GikpqaiYsWKmDNnzkXjVgrCwoUL0blz51xa/UGDBuHkyZNO8osePXqgYsWKXhrCefPmITMzE3feeadzbNasWejUqRMqV67sJcMePXoAAJYsWeL1nBtuuOGiGs3iwAhuHnb7XL58OQ4fPoyBAwd61fH8+fPo3r070tPTHStjy5YtMXXqVIwdOxY//vgjsrOzve5Vu3ZtxMXFYcSIEXjzzTexZcuWwqtcMUHjhO3i0bJlS9SrV8/LnXD//v245557UK1aNQQHByMkJMSxOm/durXIypxfypUrh6VLlyI9PR3PPvssevfujV9++QUjR45Ew4YNHRe/HTt2oH///qhYsSJKly6NkJAQdOjQAUDuegYFBeWKOWjUqJGXK9uiRYsQHR2da97h2mni+PHjGDFiBGrXro3g4GAEBwcjKioKJ06cCGgZ+ypbAGjcuLGjhQcuuC3VqVPHZ/c/X+fSosbfMsjLeL1w4UJcf/31iI2NddrsqFGjcOjQoVwZNw8dOoTrrrsOq1atwrJly7xcium5ZcuWRa9evbye27hxY1SsWDHXXNuoUSPUqVOnwPLzJ9OmTUN6ejrS09MxZ84cDBw4EP/85z/x6quvOtf4IjOS8d/+9jev+/ft27fQvEyIyZMnIyIiAv369QMAREVF4ZZbbsHSpUuxffv2XNenpKR4WXTou9buV8YYDB06FGlpafjoo48wfPjwS5Ylr23iYthx1DQG0jxEFnd7PrrllltQpkwZZz7Ky7xVEAJ2IVSjRg2EhIQ4/z311FMALghk0qRJ2LFjB26++WaUL18erVu3FgXCXagAOKa2U6dOXfL55OrhK19//TWMMT6nrDxy5Ij4jMqVKwNArgWevRihY24LwbxglyUnJwdHjx7NUxkvRbly5bBkyRLUq1cPjz76KOrVq4eqVaviySefRE5ODgBg3759yMnJQVxcnNf7DwkJwbp167wmGanc/oIG2LVr1yIzMxMbNmxAu3btCuVZhw4d8knOwcHBuOOOO/Dll186frRTp05FpUqVHFdN4IIMv/7661zyq1+/PgAUmQzzw4kTJ3Do0CGn7gAQGRmJmJgYr+v27dsH4MJEZddz3L
hxMMY4aaOnT5+OgQMHYtKkSWjTpg3i4+MxYMAAJ3YjNjYWS5YsQePGjfGvf/0L9evXR+XKlZGWlpZr0RRIJCQkIDIyEr///vslr6U2dLF2RufPnz+Prl27YsaMGRg+fDi+++47rFq1yvFB92XsDBSaN2+OESNG4LPPPkNmZiYefvhh7Ny5E+PHj8fx48dxzTXXYOXKlRg7diwWL16M9PR0zJgxA0DuekZGRuZKgBMWFobTp087/z506JCoKJHG7v79++PVV1/FXXfdhXnz5mHVqlVIT09HYmLiZSFjN9kS9vwLXJCZL/WT+nyg4S8Z+Dper1q1Cl27dgVwIVPkDz/8gPT0dPz73/8GkLvN/vLLL1i5ciV69OghpmTet28fsrKynJga/t/evXsDep4g6tWrh+bNm6N58+bo3r073nrrLXTt2hXDhw9HVlaWzzKj8c/uv8HBweI79Be//vorvv/+e6SkpMAYg6ysLGRlZaFv374AIMaT+fpde/bsWUyfPh3169d3FtWXIq9tQkKSGY2BJOdDhw4hODgYiYmJXtcFBQV5fdf6Om8VlICNEfrmm29w9uxZ599kOQkKCsKQIUMwZMgQHD9+HEuWLEFaWhpSU1Oxfft2VK1a1S/Pz2sQ5hdffOFoHXwhLi4Oe/bsyXU8MzMTAHL5iEsBt3v37vVbJ7XrW7p0acTExPhURvpAsAOHpU7TuHFjfPbZZzh//jzWr1+PyZMnY9SoUYiOjsZDDz3kBKEuW7ZM9GO1/VMLK1iWBlgJXl8eoO7LICFRrlw5n9vC4MGD8dxzz+GTTz7BrbfeipkzZ+Khhx7yklVCQgIaNWrkKA9s+CIDKDwZ5ofZs2cjJyfHa28DqXwkk1deeeWiWadoUktISMCLL76IF198Ebt27cLMmTPx2GOPYf/+/U5GtYYNG+KTTz6BMQYbNmzA1KlT8cQTTyAiIgKPPfaYn2vpH0qXLo3OnTtjzpw52L17t+vYR+PEnj17cl2XmZnpyHPTpk1Yv349pk6d6uWfTjFylyshISFIS0vDCy+8gE2bNmHhwoXIzMzE4sWLHSsQgALtiVSuXDmsWrUq13F77P7rr78wa9YspKWlebWtM2fOFNqeT4WJLVt/EEhjki8URAa+jteffPIJQkJCMGvWLK9F+VdffSX+rk2bNrjlllswZMgQAMAbb7zhFVuakJCAcuXKXTSrZHR0tNe/L5d30qhRI8ybNw+//PKLzzKj8XHfvn1eXjrnzp3z28e2xLvvvgtjDD7//HN8/vnnuc6/9957GDt2bL4y+FLio27duuH666/H3LlzERcX5/qbvLYJCZIZ/zalMZCOlStXDufOncOBAwe8FkPm/9Kgt2jRwuv6S81bBSVgLUKNGjVyVvrNmzcXV4RRUVFISUnByJEjcfr06UJ3abnYyvvkyZOYO3euaMq/mAasc+fOWLBggaPZJqZNm4aoqCi0bNnS67idqW3Hjh1YuXKlXzfDkso4b968XNlCpk2bhpiYGGehULNmTQDItdGs20aypUqVQpMmTfDqq68iIiICa9asAQCkpqbi3Llz2Ldvn9f7p/9IS1acXKy+FAiYVzp37ux8mHGmTZuGyMhIrw/9evXqoVWrVpgyZQo++ugjnDlzBoMHD/b6XWpqKjZt2oRatWqJMrQXQoHCrl278L//+7+IjY3F0KFDXa9t164dypYtiy1btoh1bN68OUJDQ3P9rnr16rj//vvRpUsXp81xgoKCcPXVV+OFF15A2bJlxWsCiZEjR8IYg3/84x9eiiMiOzsbX3/9Na677joAFxJMcNLT07F161bHbYY+duwMdJTFj5MXC3tRIikVAI+7W+XKlfNUT1/p1KkTjh07lmvcs8fuoKAgGGNyPXvSpEmOZTxQ8UW2hYmvFqXCxN8y8HW8pg2++UfxqVOncmWa5QwcOBCffPIJpkyZggEDBni1r9TUVBw6dAg5OTnic+vWrZunegQK69atA3AhsZGvMqPEFd
OnT/c6/vnnn/u8dUpeycnJwXvvvYdatWph0aJFuf4bNmwY9uzZgzlz5uT7GU2aNMGSJUuwe/dudOzY8ZIblvurTXz44Yde/6YxkL5Xab6x56MvvvgCJ06ccM77Om8BBRsbAtYidDEGDx6MmJgYtGvXDhUrVsSePXvw9NNPIy4uDs2aNSvUZ1955ZUIDw/H+++/jzp16qBMmTKoUqUKfvjhB5w9exa9e/fO9ZuGDRti4cKFmDVrFipWrIiYmBjUqVMHo0ePxpw5c9CxY0c8/vjjKFu2LN5//33MmzcPEydOzLXy3rNnD26++WYMGTIEWVlZGDVqFCIjIzFixIhCq++YMWPw7bffomPHjvj3v/+NsmXL4r333sN3332Hl156ycka065dOyQlJeHBBx/EqVOnEB0djc8++wyrV6/2ut8XX3yBqVOnonfv3khKSkJOTg4+/fRTnDp1ytmAtnPnzhgwYABuv/123H///Wjfvj0iIyORmZmJpUuXokWLFo6Gq7jo2bMn4uPjMWTIEDzxxBMIDg7G1KlTnZTjeSUtLc3xEx81ahTi4+Px4YcfYvbs2Rg/fnwuK+Odd96JoUOHIjMzE23bts01OD3xxBOYP38+2rZtiwceeAB169bF6dOnsXPnTnzzzTd48803/WY5zS+bNm1y/I/379+PpUuXYsqUKShdujS+/PLLXCZzm6ioKLzyyisYOHAgDh8+jL59+6J8+fI4cOAA1q9fjwMHDuCNN97AX3/9hU6dOqF///5ITk5GdHQ00tPTMXfuXNx8880ALvhFv/7667jxxhtxxRVXwBiDGTNmICsry2mXgUqbNm3wxhtv4L777kOzZs1w7733on79+sjOzsbatWvx9ttvo0GDBvjyyy9x991345VXXkGpUqXQo0cPJ/tOtWrV8PDDDwO4EMdXq1YtPPbYYzDGID4+Hl9//TXmz5+f69kNGzYEALz00ksYOHAgQkJCULduXZ+0hoVJt27dULVqVfTq1QvJyck4f/481q1bh4kTJyIqKgoPPvggKleujLi4ONxzzz1IS0tDSEgIPvzwwwJtSTBgwAC88MILGDBgAJ566ilceeWV+OabbzBv3jyv62JiYnDttdfiueeeQ0JCAmrWrIklS5Zg8uTJAbUxrYQvsi1MGjZsiBkzZuCNN95As2bNUKpUqYta7gsLf8vA1/E6JSUFzz//PPr374+7774bhw4dwoQJEy6551vfvn0RGRmJvn37OhnKQkND0a9fP3z44Yfo2bMnHnzwQbRs2RIhISHYvXs3Fi1ahN69e+Omm24qiKgKHZpHgAtuVDNmzMD8+fNx0003ISkpyWeZ1a9fH7fddhsmTpyI0qVL47rrrsPmzZsxceJExMbGiqnhC8qcOXOQmZmJcePGiQrtBg0a4NVXX8XkyZN9DrmQqFevHpYuXYrrr78e1157LRYsWHDR+d8fbSI0NBQTJ07E8ePH0aJFCydrXI8ePdC+fXsAF/a469atG0aMGIGjR4+iXbt2Tta4Jk2a4I477gAA1K1b16d5Cyjg2JCvFAt+Ij9Z4959913TqVMnU6FCBRMaGmoqV65s+vXrZzZt2uRcQ1nj1q5d6/VbysC2dOlS59jFssa98MIL4vM/+OADU7duXRMSEmIAmCeffNL069fP6x5pGcLGAAAgAElEQVScn376ybRp08ZEREQYAF7XrV+/3qSmppqYmBgTFhZmGjdubKZNmyaW+aOPPjL333+/SUxMNGFhYaZDhw5mzZo1PsnMl6xxX3/9tXh+7dq1pmfPnk4ZmzRpYj744INc123ZssV07tzZREdHm/Lly5tHHnnEfPnll15Z4zZt2mRuvfVWc8UVV5jw8HBTtmxZ07p161z3O3/+vHnrrbdMixYtTGRkpImMjDS1a9c2gwYN8nqnrVq1Ms2aNfNJBr7iS1YzYy5kamnbtq0pU6aMqVKliklLSzOTJk3KV9Y4Y4zZuHGj6dWrl4mNjTWhoaHm6quvvm
iWqb/++stpT++88454zYEDB8wDDzxgkpKSTEhIiImPjzfNmjUz//73v51sf5SF5rnnnnOtqz+xs/2Ehoaa8uXLmw4dOpinn37aK6OjMZceI5YsWWJSUlJMfHy8CQkJMVWqVDEpKSnms88+M8YYc/r0aXPPPfeYRo0amZiYGBMREWHq1q1r0tLSzIkTJ4wxxmzbts3cdtttplatWiYiIsLExsaali1bmqlTpxaeIPzMunXrzMCBA0316tVNaGioKVOmjGnSpIkZNWqUI9OcnBwzbtw4U6dOHRMSEmISEhLM3//+d/PHH3943WvLli2mS5cuJjo62sTFxZlbbrnF7Nq1S2y3I0eONJUrVzalSpUKmOxm06dPN/379zdXXnmliYqKMiEhIaZ69ermjjvuMFu2bHGuW758uWnTpo2JjIw0iYmJ5q677jJr1qzJleHtYm1Qyh65e/du06dPHxMVFWWio6NNnz59zPLly3Pdk66Li4sz0dHRpnv37mbTpk2mRo0aXhmqAi1rnK+ypaxlNvZ4eLGscRfr84cPHzZ9+/Y1ZcuWNUFBQWL2zsLG3zIwxrfx2pgL3z9169Y1YWFh5oorrjDPPPOMmTx5cq55R3r2okWLTFRUlOnevbs5efKkMcaY7OxsM2HCBHP11Veb8PBwExUVZZKTk83QoUPN9u3bL1mX4kLKGhcbG2saN25snn/+eXP69GnnWl9ldvr0afPII4+Y8uXLm/DwcNO6dWuzYsUKExsbax5++GG/1+HGG280oaGhueY8Tr9+/UxwcLDZu3ev63xtj81SH9q9e7dJTk42NWvWNL/99psxRm6LvrYJCXruhg0bTMeOHU1ERISJj4839957r1c7NuZCBsURI0aYGjVqmJCQEFOpUiVz7733miNHjnhd5+u8VZCxIcgYH3biUi7KmTNnkJiYiHHjxuHee+/1+/0XLFiALl264Msvv8SNN97o9/sriqIoiqIo3ixfvhzt2rXDhx9+KGZ/VP47uOxc4wKNsLCwQt9wS1EURVEURSkc5s+fjxUrVqBZs2aIiIjA+vXr8eyzz+LKK6903KiV/050IaQoiqIoiqKUWGJiYvDtt9/ixRdfxLFjx5CQkIAePXrgmWeeyZU6X/nvQl3jFEVRFEVRFEUpcQRs+mxFURRFURRFUZTCQhdCiqIoiqIoiqKUOHQhpCiKoiiKoihKiUMXQoqiKIqiKIqilDgCMmtcUFBQvq6n/4eEhDjnIiMjAQCVK1d2jtWqVQsAUKVKFQDA8ePHnXMHDx4EAOTk5AAAypQp45yj3XhPnToFAPjll1+cc7///jsA4PDhwwCA06dPO+fOnz8PAMhrXor85LHIq+xsSpcu7fxNuy8nJSU5x26//XYAQMWKFQF4ZAHA2eH57NmzXr8HPDKg8u3evds599lnnwEAMjMzAQDZ2dnOufzm8ihK2dHvgoM93YnaXYUKFZxjDRs2BABUr14dgHcboTZIcqLfA0B8fDwAICsrCwCwdu1a59zOnTsBAH/99RcAb9nRvfJKcbQ7/nuSoyS78uXLAwCOHDninNu/fz8AT/uLi4tzziUkJHg9Z/Pmzc7fO3bsAOBpw1xeRdXuCio3Du1+ztth2bJlAXjkd/311zvnaGyjevPd07ds2QLgwj5mAPDnn386586cOQPAU1d/5NspjjZXWLiVyz7H/01/c1n4MncUluyka6iNhIaGOseioqIAANHR0c4xOl+uXDkAQNu2bZ1zND/Q//nvdu3aBQDYtGmT1zWAZ4yk7Sr4+EnzNccXufw3tbuipihkZ1/Pn2l/90nH+PdbtWrVAHjGRJpPAWDfvn0AgGPHjgGQv0H82Qcvh3YnyZW+D6VzbrjJkB+jv93GvcLI7xaQCyFf4MKnQbdSpUoAgMaNGzvnWrZsCQCoV6+ec4w6RI0aNQB4dxb6KJI+quiZtOjhHwcZGRkAgDVr1gAAVq5c6Zzbvn07AHjtN0QDd6Ak7bMbOuD5GOfyvP
POOwF4PjD5e6BJkmTHPyxpQjtx4gQAz0co4PmwP3ToEADvSS2/i8iigD44abK/6qqrnHPNmjUDADRv3tw5Vr9+fQCeRTkNyICnzjQA84UQyW7v3r0AvBfgq1evBgD89NNPAIB169Y552ixSR+uQODJkdpMRESEc4xkxj/aGzVqBMCjjOAfYvSBRLLjH1ZUX1Jw0IcWACxevBgA8N1333ldA3jacKDJiyC58f5K9b7iiiucYx06dAAADBw4EIB3G7UXylyBRLJo2rQpAOA///mPc47aGCl9pPYVqHLLC/bChC8U7WukcVD6t309f390HR//SLZF2R6lslF/I6UM9UPAo5ygcZAf69SpEwCgY8eOzrmTJ08C8PRXuicA/PbbbwCAr776CoBnrgU88wP9n8+/e/bsAeCtmCNZSYskJXCwP6Td+ot0jh+jOZm+T0jpyK+nb5DExETnXGxsLABP2+Jzga0okz7cJeWF2/WBjKTYpbmB5mk+/0pjoH3OXuDwv/kxGudIycEVIYXZj9U1TlEURVEURVGUEocuhBRFURRFURRFKXFcdq5xZGrjO/2S/3tKSgoAoE6dOs45ijPg5ncy75E/KDe/kcmPzK10DeBxbSMzH48/oHgYek6DBg2cc8uWLQPgccEBcsc1FLfJVDJvkmmU3Bw45IpA5efXE5LLjGTeJLO05H8aaHB3EXLFJPcPcocDPGZ3iqUCPHUnFzfuKkmyIzmR2xHgMc2TCxh35bzyyisBeNzseDzX/PnzAQDbtm1zjtnxHcWFbX4nWQIe9y0eI0R1puu5OZ2OUZ/l/ZnqSf2TuyPWrl0bAPDHH38A8I4VJPedQHOrseMguesuuf927drVOdamTRsAHrc5cv3g9yK5cRcTck0g1zjuCkHPXLVqFQBPbB/gkVtxty9/IsVe2TKzxz7A3f2NjvHf0d98TCU3Hvo/P1dYMqZy83dOcx25XZJ7OeBxMaexCPDMf/Q7aheAZ7wnWfB5glyCaS7nrnFbt271Osa/AUh2PPbUzaVJKV6kGBNpHKJ+Qv/n56S+R22L5g7+HJpT6ZuOtzuaX+j3HIq/lWJJJfcu+1tHcvUPZKRxi/oazSNS36P3wWUuucQRdExyBaZ70rgHeMcE+hu1CCmKoiiKoiiKUuK4bCxC9iqVZ4EjDWiTJk0AeLLYAB7tJdei08qVVqtcg0wrUrdANyoLD2i34RnTJC0/WYlIO1Fcmme3YERa9XPtH9WLVudca0B/U134vUietNLnAe30LuncpTIpFQdSFhrSwJPWnFswqN3xtkiWSJIFD+zlcuTXAB4tMA9mJ8gCKQV0U1acAwcOOMfIGlXclg5b68ytqzExMQC8ZUJ1pzbJ+yy1SWo/POMPPcfOWsjvRTLk/ZnejaTdKmp4GUgO1CdbtGjhnKPEMDy5CVlzpfGM7iVZ0kiG1GaTk5Odc/ReSG48MQxp7bklvbj7bl5wy5QkJQ+gNieNg25WH/v//J6871MblRLQ+LsP21Za3h+of5IVlVsd6RjXElPdaZ7gYxf9bWcZBTxtha6hewOeNk/eFGSR5M8h7T3gacOS5lkpXqTkItTueFuxvwl4P6NzPNEOzbt0D7f+wsclO9MtzUFSmblVVmpbdIyu43Myb+uBiptFiCxm/LvG9qLi2JYgKakElx3JjLLC8nOFKTu1CCmKoiiKoiiKUuK47CxCtDLlcUAUU0DaAMknm2sgbO2opLH0xa9Y0jCRJkKyePAU3hS3QT6Q/tjDpCBIcTm2xhzIneaZ+5Hbmh1J4yLFqFDcRn5z1BcFVDceY1KzZk0AnhSd3CJEspOsOARvP3nRVkr7xJBWjGtxSXPK42/sfbKKC5InaeC4bza3phIkR2ojXPtMbUva48S2REqae9Jucc0i3au4+yXg3QeoL5IliO/PQvEbkizpHtwv3raSSZpOkhu3hNL+ayQPLm+6P+1DBHjGisvBMuQ2F/D3YFt7eJultipZuG3tKf8dtWl+jN
omxdjw91fYFiHexyhddpcuXQB4p2i3PQH431RPac8faQsJaoO8nnb5KE03xb8BHm8LGvMAeU+YkoDbnFncfVBKzWxbR6W+ZI///Bj/1qLxUWqTdIz6IJ+bqZ9J3j72s7knB92fj532NxK3ZPDrAh3JIkTf2DxVPs0N9N74nOlmlSVZcJnQ39QWuOx4vJC/UYuQoiiKoiiKoiglDl0IKYqiKIqiKIpS4gho1zgpoI6C2OrWreuco4BgMt9JqXU5bukN7d29peAuKW2j7QbBy05lJlcqwOMuR2moeRmK03wtpc/mgXG2qdMtnSV3mbFNwry+5JIkBdsVtymfsN2oAM87pBTZ3EQvuVbZiTj4OTuBB8dtZ3sqF5nyuTmbysV30LZNzkUpX8nlUQrMdks7TPBzZJK3k5kAuVOecncIuo5+z8sgpTIuLvg4Q25p5GbLE5mQayZ3Y7Pdf93eN3chsl2DubzJLYL6ME+NvHPnTgBARkaGc4ySdlxu2O2Qj2c019D/uVulPRfwtmsHhHP3M3pv/Bj9TTLksvR323Trk7Vq1QLgaWN8/KbrpMQidB1PnmEnGZLc5iQXKntu5i5U5KrHXXb27dvndf9ASHzib6RvJDvxDpB7zslr/f0lL8kllNoD9SHe/qnP2clJ+N98fqM+RM/hW1TQ/Cw9h9yJbbdWwPP9Rr/j2yzQmMn7Ax2j+Z6Pj4GcLMF2qeTjFsmfZEHzEOCRP/82IqjvkUz4mEXH+LxD5+l98LGBJ33yN2oRUhRFURRFURSlxBHQFiEOrU4pGI6vSEm7IAWmS2kUadXpFmxqa6Y40uZQdspUKe023/yQAthJK1GYm0W54ZY+W0pnaQe48RU+aUyk98ADDAFZ9lJiC8nCVhxWDHqv3OpDmhA6xjUi9D65fGwrDK+HpMUj7LYoXUsaG24NIO2oFDxP76O4NKO29VCy3PI+Qdo1SevpFhxMbZGu4e2Q3g3di7dbqS0WNdLm0WSFpOB1vtkxaValtPVulka3ZAmSNZLaGiXq4ElCqHy8zZFWtrgTdOQVu+5uGwxKiTbcNoi0teD8mBQQTrLm1hQpoUBBsOcwPtbRfCVZT6U51k62wc/ZfcrN8ivJzk4pDni+C7hFiNop/a4oNqMtbOz5kI+bdrINKW0ztRmuhXcbGwh/JY2hdyGlvKb5k7ctOxCftx26jidz4vMf4G3FpWdTXfg5Gq+kbxiSHVlDuEWIzkmbgpIlSNqoO5CRvjNI1tS/aP4BPNZiN4uQ9D0kfYfT35T0hKy6vFyFgVqEFEVRFEVRFEUpcQS0RUhKO0qaMXvlz6+RNLvSxlqSZcdGsuwQ/Pd26meubSC4XyxpIKge3Je1sDWnbitryeeYQzKQNvizN8jjqaapflIaRTu9aaCkzAZya964ltT2Heaae5IFt2q4adLsc1z2dM5tU0zJh5rKxzeHo+uKW8a2BZJrikijxuMh6JhbOnIpRojeCaXXpfThgEc7J6VHltp+USNZhCjuizRzfBwk2UhjlptlyN50FnAfg2iskzYfJCsptwj9+eefl7xnoOC20SN/DzQOUH15u3SLWbO19nxOkKzfBGnNuVcB3zw0v7htIMs1vDSWS3GjUopsG8laJFld7fbKcdtUWbIO0LhM8UmBHJ/hhtQmSRaSBwCfowh7w1lupZBizew2yGVXEDlS+blFiNoStW0+nthxc7yf0T14n6C6S9YG21uHj/HUvulevP70fUJtjJeP5CrJh9I98zJnZmYi0JEs4HaMEM1DgMc7yx4jgNyeJ5JFSIoboufwzbp9+V7PL8U/2yuKoiiKoiiKohQxuhBSFEVRFEVRFKXEEdCucdx0SS4EZPrkZjsyRUq7ApNLjZQWWjKfurkn2CmO3VJrc3MomUi5CZDqQ2ZhXp/i2Albqjf9LaXVldJnkxl069atAIBu3bo552wXRS47O5GCW/ns3xYV1Ga4KwK1RTIb83ZH75rLzpYBbw92e+P/tt3BeP2lVL
cEuQ5wVwlyMbBT0RY1bokfyN3gyJEjuY5JgdmE1LbsgHWegpP6pR3YzcsXCMkS+Lsl9xcaN7iLCbVD7qbh5gZsH5PkJrUP2/1XKh93i7WD1QM5UJ2/bzuJhhTgLbmskeykcdx235YSBUhzG13P+wRtvVAQJLcr6b3yugPernHUjtzSr+enPLxMgEcG9GzuiieNz3TMdlu/XJASGNF3A7meUhILwJNSn/re3r17nXOHDh0C4HmPXHaSS6M9V3E3TN4G84rkYk5jBrk1cvdGe47lYzTVhY81JBca23nd7O1VuFxtV1feB2ncoufRvQHP+MgTl9AzySWTl5m7wgYadnvjMqB+ReNcQkKCc47+pnNubuV8znFLny1tJ1KYW1moRUhRFEVRFEVRlBJHQFqEJM2NvakdXx3aq3+urbI3twPkDS0JO4BT0ojamiaOveEo4AnKliwfdnpf++9AQEr5SCt1ru0grdNPP/0EAOjRo4dzjupkb7QHeAI3CzMYLr/Ylgu3dLdSCmh+zE6DLbVJSYNqB227tUkOaZ+4lrQ4kyVIbVzSHlEfkjZapPJLqYMlTbydEpgnJbGTJUgpuaUyF5VFg+rDNfOk8SQ5SBt5couQ3aekNMZuWwRIWjh6Jj2Pj2ukWeVaWrrO3+me/Ynb9gEkf64FJU08BQ1zLTa1I3oPXIZ2CmjelqiNcwswnacxkm/K/csvv+SliiLShtjUtqSNXak8UpIH6f1Kfcv+nbQ9gzQu2JYqyWoupfW+3CxCthWb14nS5ZP1p127ds45OkbviOZhANi0aRMAjwz4N5L0Tu208Bs2bHDOFSRJh23VAzz1o2fyvkRB81IiLDrGrbE0PpLs+NhEz5SS41A9yTolBfzbqe8BT5/lcxXNK3Q9n6+lJFqBhrTpLb0Hkg/ftoHkKlntbOuSlISCz1ckR3peUXkCqUVIURRFURRFUZQShy6EFEVRFEVRFEUpcQSkaxzBzWJkMqdAvZ07d170d9xET+4M3JxJZjoyzUlJD9wgU6fkjkRmY26SpaDFXbt2Occo0JXc+oorgNjNXYDqyfccIKQAc9oFeMeOHV7XALlNo/wcBRVK+24UN/Y+QtwVyd7Bm9eJzL1uu3ZL+724XU/tlcuHnmPvdcDLx4+R+dptDw9/4+ZmJgVFSruf22Z3KRDddiHkx6Td6MmFwXZZBNxdeQLJNY67L1CZJbciaVxzqw8dk1wa7H2EuLyprHwfIdv9t7iTJUhucJI7CLl8UAA17ZfBj1E9pfdAY4XkvkjvTwr4l5J2SO7D3P0wv0h9kp4v1cltjHbrP1KCF4LXyR4PpLbslsiIu/PZbsCBsDeYjVQ26i/kHlS7dm3n3DXXXAMAaN26NQBvV0m6nsZGPtbRHEvw8ZOeV6FCBefYFVdcAcDjAsrduwrikinNo7ZrHHdBtffq465lJDtp7yjJBZ/+llzL7SQUfMylbzop0QPNv/ybkydTsJ8nfTMWJ279n7sv0jshV2A+vruNDXQvqe9J3yB2CAzvz4X5XRh4I4OiKIqiKIqiKEohE1jL0/9D2l2aVt60S/natWudc2RdycjIACBrUPiqXEqfaJ+zywL4plkiq8j27dudY3/88QcA712FqaxZWVm5yleUGlP7WdKzebIEO9CQy4IsX5SimGs77dU811aRtrO4NcUSdkCvpHEkrQcvP9Wda4BsLSmXnZ3Awy1In58jTZQdZMjLJ1mxitvqZmufJCuXtPO3mwbVLUU0yYAnjqCxREplHAgB1lRHXmb62y1hi6SRk8ZUG+levlzP2zj1D15mO/17USKlh6bycIuHrZUGPJrmatWqAfAOEKZ6ulnM6J48mJuCjaUgayoXlyf9bVvN+f0LgiQft35kW6Dt8hK2lliaR92SxkhypTmDyuDrnC5Zl4ozWYyU5IGP29WrVwcANGvWDADQqlUr5xxZaCRPF2qT1IabNGninKPrKOkBT3hA53gqbmrz9H+eMnv27Nk+1FaG3i
vvZ7YVmVsbqO/QNdL8y2VgJxTi75zajzTH2tuZcGuIneCBt32af7kHkJ3Qi/fxQLNKuln6aawCgKpVqwLwWIT4e7C3GXCzekleMFJ5bAsdULiyC6y3oiiKoiiKoiiKUgQEpEVIglbXlP529+7dzjk6dvDgQQDeK/Y2bdoA8NZKkkbJTq0L+JZuluC/o9UtWXg2btzonCOLEE/dSxoW0hYUhTXEFy2YFDPCLUL0HqjuXD6UPptk4BZbxN8R3d9XGRR2rIFkjZEsQnaMBK8T1Z1r+mz5S9oRyeJh11NKQSmlZpcsQm6a7OKIEbI1x4Cnf1IcBuDRgLqlvJfijeh6en+VK1d2zm3btg2Ap037GiNUVO1PStdqvz9p8023McvXmAs3S5Adq8YtK9TmuEa1sFO2u209wOVDciRNJ+8XZHHk8Qb2ZqnS5rXSeGaPjXzMsFPNSp4AXJ40P9B1XJNekPYnvQvbashlR5pvGqt5HAS3/hH2Rr68PdlWYKmtucXhkkz4mEfH+L3yu6mrr7i1Z7c5hPcNsv40bdrUOdagQQMAnpg0/s6pnUqxLPQ3tVM+95A18+qrrwbgmaMBuZ3a/bhx48bOOV7+vCLFn1H7ofJyKwD1R3omLyPdi3/bUV3onvw5bu2B2rrbc0i+vM9Sf+BjA42xdI6XL1AsQtKG5iQrkjm1TcDjZUVzsj83H+d9lt4NyYxb4Qsz9XhgvBVFURRFURRFUZQiRBdCiqIoiqIoiqKUOALaNU5ycSH3D54Oko6ReZyngZRS4+YFyXXILaiUXKIogQPgSR7ATfn0ty/pugsTN7lQfXm5yUXCTkEOeFz/SAb8d2RqpvryAELbPVAKrC3u9OJkCpZM5tIu1iQLbua/2L35PdwCCCWo7Utp2O00vrz8bs/zF764jkjuMfQ3d42z3RLcXPq4KwK1V7qeBwTbLo3cbSEQkiVQGSSXFal8knuQW/9xc9WUUpsSdrA6d5OQEhHYKdul9LX5QXLvsF1LeduvUaMGAE+74i5d5H4kJXmg8vOxjmQsJSygMtj15veU3gvJkbudkQuT9G75GOoP7DbCy23vAm+nCLax+7fkSiMl8HBrr9SXqf3whDvS1gK2e6u/51opqYWdHh3wuFtRwHm9evWcc1dddRUA73HJTuzCXdHIbUlqW7bbFR8H7a0U+NhKspNSTdM9qey8DPlBco2zj/E+ZScvkLZBkZIO0T15G6G/peQZdl/n5+ytF7icpAQB1C/pXHEmi+FIfYOXjVwTk5KSAHjcKAFPOnVqk7xv+eLul9fvGpInb3d8HvQ3ahFSFEVRFEVRFKXEEdAWIUnrS6txvoEVQatIHtzvtlmipBX3RVvptqkcacwkKwovs62dCJTU0ZIVjmseSXtEq3OqL+CRO13PrXak5aJ7cvlIG9sGCrZ2l2uYbe0lb3f0d14tipKFxNbOSyllpRTkdjpMXg9pozN/aeovhluANm9HVG6utbQDJSVrmiQ7uhfVjQcQUxumVLJce+hW5qJKlkDviFspbE0ef2fU7yQtsVRWW5MnaealNmdveOuWOhrIbYX0F1KftPsYD7Zt2LAhAE+74tpQ+lvSINNz+Phtb+gradOlVMEU7E735uOg1H6pXG5jhr+wrWlcy03Ppbrx/koy59Zvt20Z3BJa2G3KLTEMlznJ08365i+oPZDmHPAk4CDLBbeaUMphKQ07tV3eRmiMklJG22nz+TsieZCc3Pqz1F7dUpzzFPB8XM4rUpIXuy68TiRr+j/vs9LWC25zrG1JlNKYS+Wz0+5Lm3lL6eSLwqvAvrfbZsO8b0hJbWrVqgUAaN68OQBvi5C9ebPb/CvhtsGtBN2LW4T8sV3AxVCLkKIoiqIoiqIoJY6AtghJWkxbM8X/tq0sgLxKtVP28nvZGlRp5SuVz9ZE8HuStoBrG+30ooGCJHNJSyrFxZAVhOrLLUJcCwZ4axRtTXxxW4akdy7FCNntiPvNkwwkjZ30HBupTbptGk
iy57KU/PPJOhQoaTypvFzbThYQroXyxXLqZr0h+PsjbSy1RV4GX9pgYaUet98bt3jYFgL+vski5Ba34obkay+lc6e2SX2fP89t42FJy18QuZE2kzTu/N4kO4oLAjwWISlNtZSalSyFkvad2oqkxSaNOWkzueWAykrjA205AHj6MLcO0N8k/8OHD+eqf0FwsxS6bSvBx2/6m1sN7Pu7WYM5bhpuyfJuw8tsbybsrz5KKfi7devmHKOxStq4md4htTf+fsmSxPsL9SG6l5v1RIqPk+YJWxa8vUrWE/s7i/cVblXPK9IYYLc3Xic7zkaKR3RrR3welcYywparZFWXUrlLVg17DJLG1YLgtvG3m1yluDU+PrZo0QIA0KhRIwDe/dlub9L2Em4x3hJuHjEEj/dXi5CiKIqiKIqiKIof0YWQoiiKoiiKoigljoB2jePYZjfJLc3NJOyGm3nZlwAw/kwy10pmZl/NooWdstdtJ3YpEF9KG0sy44kUyFWGrifXEn5fyWRtm7aLM2WxjW0W5y4CbskS6BhvW1JQqn29hN1upDTPdgA7P8cDTO30y/5OluDru6P6Sm2MysTdS3xJtSlh7++FfhAAABQlSURBVCbO3QPo/lICFrd3VVS4tTmSkRTAz9+pW9KDvKTW5v+2A+a5y4I9DvLy+9sdk94fD6ilupCrEXdLIzcQO1Uu/5vLk1xDqJ5SAhN6HpcBuQHTs7l7B71Tconj7sNSOnK6L7kjcRnydpFfpPYgvUO7P0jJEvLqtkq4tQtpnufu2PY5KQDel2DuvEDtjVIKAx43NimBB71Daq98XKN3KKWTllwybVdMNzcpjv3+uFxp3OBjMP1NcxovX0HmCWletM9JbmmSW5tbAhHbFZDfQ/q2s12+3La2cBtfJfw1h9BzuRsltSXJDc9OlsRdGildO2/D9evXB+Bx3/U1yRIhtS273UnhIW7zDy+z5L7sL9QipCiKoiiKoihKieOysQjZSGmeCcki5GuqYmnVbz/HTYPla6pYNy1BYWuhJS2b9G9pAzG3QECyDtHvuLbT/p2bVaq4kyVw7HJLGwNKKcElLaSt1fK1DbtBWh/S0HJNLWkipZSyxZ0swc3CS5ofrvly65duAdm2xo7/nrRpblbm4sBuc1wzb9dfstZK9/Ilfbb0O9uixn9H/Z2PD5IVsrDSGJO2kILXeXkpGJhSFgO5A9MliwqXJ7UPuic/R2Mc3YtrLil1MqWT5lbzjIwMAB6LEH8HZIGSLAZ0f54swR8WIUljLo11dvvhspCSspB87IQy0jFJM2+nIOfX0RjnlogGKLyxjt4ntx7aVkZpLCGZ8aQ6dA/+Ln1J5SylyLat624JpaRkRVI6crIIcQvXkSNHkF/sjVoB+ZuAsC00kiz4veyxnI9DdnIsN4uQZPWxN2uVfsevlzYMLgj03CpVqjjHyLIj9Vk7cQ1P0pGQkAAAqFmzpnOM7ktjp9Qm3SxC0lxjy+5SCRLscYZbgSQror9Qi5CiKIqiKIqiKCWOgLYI+aqVtbUAku+3tIL1ZVXrlj5b2rCMNCdS7IWvlphAgbRCUkpmSRNC2i07vS7gkYfb+wgUJCuOmyaE5MM1fW6pM/OaZpJw06ZJaailtmhr2CTNV0HIb+pMro10syC4pXOWtIa2zKVN5aQyuGnxCmMTZKnNSWlPbUsjf99SmW1/eDftm5uvtmQRIq2ydE7STEpW0oLIkMYXPs64+bLTpsOSZZ+OSZpyur+0qaykXaa/jx49CgDYs2ePc27v3r0APFp7nqKWNLLcukR9mJ7zxx9/OOfo/gVBmt/c0gVLaduleAxCkrWd1ldqt25zMz2ba6zdUiL7e2PLgwcPAgA2b97sHCMrIGnTufadLPNSPJBbuSWPFao7HZO2tpDGJ1/GM8kiRP2BWykL0mft8gPucTZ2W5TSo3PcPA2kvmo/R5pD7HLmtf7+sghRO6pXr55zjOJ5JOsY1YEsQnzDY7IIcWu6bZHmVk
A7tliKxSPcLJFSLLD9e0D2RJA2PPcXgfs1qiiKoiiKoiiKUkjoQkhRFEVRFEVRlBJHQLvGcXxxR7FTpwKy2c02e7rt/Ou2uzY/R88ms6JbAJ9Uj6JMGZ3XFMdSMLQUnGf/jgcJ0zPdUukGStpsN9c4t1Tr5HoDyLu+u5nm3YLTbXg7pHKRyZm7CdFu5xw7EJfjb/dMN9cru77c7O0WJCy1Gzships7K8fNBaU4EyjY7n9SH5Nc4yTcEkkQbu5L1GYlVxZym+HvznZ7AuQx2B9QMpbMzEznGHfhAbz7IbkrUR/h6e6pvLwPk6sr1V1KGiM9h9zf9u/fD8DbnY3uTzJJTEx0ztF2A5JrHCV6WLVqlXOOu9zlFbd06lKfsccu7uLi5toiJYbJC1K7k1yEJez+7a/5hd7T+vXrnWM01tK74+2BvkekJB3SvOKL27g0T7i5YLmlw5fat50in0OJPvKDlH7dnovcErtIqcR9TTxlu2JK7UEaL22XOLetP/j9qY9IqebzA7lf1qlTxzlGqfqlNk4yo/bG3TWpvdK4AnjGTrpeStLjlppdci+m+UlqR/QcqS1KCWr8sb3HxVCLkKIoiqIoiqIoJY7L1iLEV4e2BoVrBd0CPt00D74EcErX0CqXBzhL2hu3oL6ixJeNwKSAbDdNlvSO7CBESfsulcmXIO/CxG1DSoK0HZeyxrilz7bPSZpQCdsaIm3q6pYQwS2NeWFia5YlDaH0zvO6MaObVYWeI6WUdbMIFXb7s/uKlLKd+hZPUe+Wlt3NCkm4bYQnXUfjwqUSNtgpaf3dznibt9M2S2m8STPK5wkqG78X/U2y4OMZWbtpnOdafkov/NtvvwEADhw4kKvMVJasrCznGFmg+NhBAdKkhd+yZYtzjluv8orbe5Ww2xF/5yRPXzXshC/HpHFQSg8tzbFuqfMLAo3z3CJH79FOWQx42gYd421SSurhFpTvlpLZPuaWuOZSY5it3edjo9SefYXuy71F3LwgbGvVpZJ02PKRZCBZFnxJhCUlj5LmDmofVEc+phTEqkFjDR8fKOmB9N1J8pG2o6CxT7JcuiUXkeZAu548iQsdsz2CAG/LN0Fyp/GFy85t4/SCohYhRVEURVEURVFKHAFtEZJW85JGxPZH5itfXzTI+Y0DkOKApBSZ0mZS9rHisgj54l/LtTf0t1sKSoJrTmyNIj/nZh0LlLTibtpE0gbx9NmS5cvWVksWD8maZiP1C9KScg2Km1WzONKXSxYHKc7EbVM7Xyy2/JxdX2kTPclvvTBTdV4KN0223Se5RcFOtcyv9yVOQup3khxsaxS3TEhlttOh+8siJPURO36Jl43SHkvWNGoL/BjJWPJXty2N/DkUI0SxS3xcoHuRTLhlhWTNYzBIa0pWF26FkPzuC4Ldp6QxWtLou6XBlqzSvozpvnhrcNnZMW38OskqWpA2SHLnmm+Sh73xJC+b5EVhX8PL6wu+bsnh5glwsd/zsnC58jkmr1Cb4u2H5ChZAeg6yXKWF+sYv94Xa7dbnBz3/KDyUdwY4BmTqR/z/nypmE43SHZ8zCfLDqXGlraOIasy/y6muUJqpwSXoR3rw8c72uSZxlca//g9q1atCsDbCmR7s/A6UhsoSDxaXlCLkKIoiqIoiqIoJQ5dCCmKoiiKoiiKUuIIaNc4jm0GlYLtydTGTYAStpuMm5uNtGu6m0lWCkwjJNe44nb98sVkzl0wbNc4t6B+7kpi15ff03ZdcEtdKZXVXzJ0c92RnkWmcqond4EhuCncNsm7JYDwNQDULgM323P52/UoDqT6Uhkv5YrmFnRtu7hx1xM3F0N7HODyCgTXOMmF0nYL4m4SBHc7se/JcXOPtHeA5/2c2hg9m7s7kKuGlLa2KNwxbZdL3h/InUNK20p/S245tiyA3G42lCAB8LjEkasIH+vsdPzcNYVcWKS2R8/mrnsF2bHebayTUtpT35Dc0iTXOLtsvB/aQe68Xbi1O7s/SEkypH
S7BZGThJQmmJ5r910gd0r5S6X39yWxQV7dyH2ZK31N5lQQ9y6SGXd5ooB9KYkJfUfRGMPP0bcWl5ed2ESaK31JzsCxkyxIoQKUKh8Adu/eDcCTNn/fvn3OOT4e5RWSO90f8LR7Sq1NLnKAR640rpA7HOCRq+RKJ40J9GxyAeTjHSXPIBnwZBpUHuqrPF033YOXy3aN47IrSHKYS6EWIUVRFEVRFEVRShz/FRYh+ps0LjxRgZsmRAocL2haayoDX2lLaRsDJVmCG5L1hlblbputElKaSbeN2vJbvqJE0jCRlocHTJJGg1uJbGvapTbdtc+RDCUNKt2ba01Ixr5qRAvLWuSLRlPazNPX4Gtfyu1m/XXbFLg4kcpMmjl6z2R1ADz1cEtn7RYw7pY+W9IIk0ZPCoB1S2Tgr3ZGWniuoSVNp3SO+ieVjVvtpXpSH7aTJvDrJI0waUTp95JVQ7Lw0LwlJe2g5/AAfX9YOqRU61KAvG3B5Zpt+xwvNyFtPi61EVtrL10jpTG268Cf4+bBURDcvCHcNt/15zhbVOOUvxIYURvh4xZZBCSrDz2LLB483bOUQMF+x1J7kL4h7WukOkoWIRqHef//888/AXisGdxCUhBrGvU5vkEzyZPKweVDf7tZf3gqd9uzQkqRTWM+rxP9TUkSuKWWLEA0blG6b8BjLeLjsD3eqUVIURRFURRFURSlkNCFkKIoiqIoiqIoJY7LxjXORsqRL+3E7raPgS/npOeQ+U5yK5B2spf2DQgUfDFzcxcEMnFKgah28Lnk4iHtI+Bvl4WC4mvQKJWb3N+46ZbMxdwcTWZxX/bycUuWILmFkosNd8+j5/kaIFvYuCW8kIKLpX1bfEmeIblDuJXBbV+n4sB+NnezIZnQe+YuGfQ7vvO4255E9u/cgo65TMn1gZ7NXVncAv4vVr/8Qs/g7mUkHykJB/UHGnu4S4bkjmnvcSW5KNJ74K4+NA5IyUromJRAgsZWXmbbHYePMQWRo5u7KtWbu/HY+5Rxl19pfxWqg5SIxG0etduN5F7ttm8bLzNdl18397wQKGNsoEPvhPdZSpxArnE0hgCe90ptjLtdkVuX5HYphRy4hSP48v6o/fF2TuXh/Z/cxyiZjJTUIz/Qc/k+PXbyCe4aZ+8fxENGpP2u7LmVl5XqQHXj9aW/6Z3ycZL6KJWdu7rR++blovdM/Zg/R0pG5S/UIqQoiqIoiqIoSonjsrUISUhBz7ZmCpB3mrZxS63oFkxKx4oiVaw/cAvkpDpxLRtpLUkTIWk9JY0+XU//d7MIFWeKZ0DWIpEMuDbITmHMUxnv3LkTgLfmi66XLBBulg63YF+SMWmJKAUpLyt/DyR3OuavIFg33HYxlyxChBR8Tf2La6vs37qlTJVSuruldy5ObS7Vkfc/+/3t2bPHOUdtTbKuEb4mhnCzCJFVgoJ2uUWIgpolzbxbAoX8QPLhVhI7jTIfZ0gLStdL2lAp8FpKGW0nreB93w6IluQraYbp2fz3pIklGfP6+KNtStZlqb70Dqk8vL60szzXRtvji9S3pLZlW9+kvkxzEE8cQdplac5xS3+uFC30DqS09vR/npqZrBmSZZG8LXz16JFSatu/k74NbQsJ7xc05nLLBfUH6iOS9TQ/UN/niQps6za3plESChr3eGIEKdGE/Q3Cy2ona+HviN6b1M/s5Cq8fDReSHMS9V3ex6VvTX9xeXytK4qiKIqiKIqi+JHLziLky2qer3JJa8BTvJJPorRpqn1Pt3gRrhmgVT9fdQcabvX0FdsqwTfDotU+yZyv9G2tREHipgpLmydZXmwffWkjWNJUcC0JaTK4llRKf50XpBghO2akWrVqua7n2mc7pXZRaEnd+pJb3BPXylGf5Rolwu7HUqwLabJ5u7Of4+ZPbp8vTNw0kNQOScPGNZGUtpVr8khubv70BG+XdowNLwNpPMkaVb58eecclYv3k4JoQd2QLEIkH0kLavvF8/FJ6pN0TLJCkj
xIIyxZuH2JZ+P3pDnE7X1z7bI/YoSkNM92PBDgeed0DW0aCwBbt24F4D0XkAXLjvHgf7tp6CWrFG0k+csvv+S6ntobj5GketjPu9gzlcLH7duJ2hiPGSFLB/VVPh5LGyPbcw1/57YXi9t8JM0F9mafgGfO57EvFDspeYoUJEaI6sK9TGh8oG8uLjs7RbaUKpvLzk6fzWVnW8CluCfJY8Vtg2u6h2TRk2LJ1SKkKIqiKIqiKIriR3QhpCiKoiiKoihKieOydY2TArkoePfTTz91zu3YsQOA9462dppGnkbVdm2TUtdKAaP0N7kMpKenO+fIlMnNopIJtqiQzL50jJsuJVfDpUuXAvCYXbkrwo8//gjAE8y3aNEi5xyl9KXnrFixwjlHgf4kJymgvbjSPJNJmNrWggULnHO7du0C4DGFr1+/PtfvuCuSHZQvJUtwK4/krmW7HfF/S6b5jRs3AsidBr0okNzfyNT++++/O+eoHfz888/Osbp16wIAqlSpAsDb5ZD6rO3KBHhM8uS6wJ+zbds2AEBGRoZXWQA5tW9hwp9D7Z/a1bJly5xz5I5G7iRr1qxxzpHctmzZ4hwjFyXqrzydu+0Oxv9N7ddOOQ145CSlMaby8XFh8+bNXtf5y0XJLYkBIfUxqqdb3+TXS+lk6R25jU++1E3aYoBjl5WPjf5OlkD3pvF79erVzjmaR6k85MYEAIsXLwbg2UUeACpUqADAkzxDcpsjeL1p/qH2w12OqFw0dvG5Oi4uDoB3H6b+Q+0u0LZpKIlIrnHUlui7iqeHpvE6NjbW6/+A57tNcveVXKvombbLpFQ+N9c43mepzJJ7rp3Kn9+rIEhp/G3XNcC38Y4fs79BuHzs7U98dRm3ZcffB/VLKZGZ5NZfmN8qahFSFEVRFEVRFKXEEWQ0alBRFEVRFEVRlBKGWoQURVEURVEURSlx6EJIURRFURRFUZQShy6EFEVRFEVRFEUpcehCSFEURVEURVGUEocuhBRFURRFURRFKXHoQkhRFEVRFEVRlBKHLoQURVEURVEURSlx6EJIURRFURRFUZQShy6EFEVRFEVRFEUpcehCSFEURVEURVGUEocuhBRFURRFURRFKXHoQkhRFEVRFEVRlBKHLoQURVEURVEURSlx6EJIURRFURRFUZQShy6EFEVRFEVRFEUpcehCSFEURVEURVGUEocuhBRFURRFURRFKXHoQkhRFEVRFEVRlBKHLoQURVEURVEURSlx6EJIURRFURRFUZQShy6EFEVRFEVRFEUpcehCSFEURVEURVGUEocuhBRFURRFURRFKXHoQkhRFEVRFEVRlBKHLoQURVEURVEURSlx6EJIURRFURRFUZQShy6EFEVRFEVRFEUpcehCSFEURVEURVGUEocuhBRFURRFURRFKXHoQkhRFEVRFEVRlBKHLoQURVEURVEURSlx6EJIURRFURRFUZQShy6EFEVRFEVRFEUpcfx/XRz/LzMCjxUAAAAASUVORK5CYII=\n", + "text/plain": [ + "
    " + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "print(\"Average of all images in training dataset.\")\n", + "show_ave_MNIST(train_lbl, train_img, fashion=True)\n", + "\n", + "print(\"Average of all images in testing dataset.\")\n", + "show_ave_MNIST(test_lbl, test_img, fashion=True)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "Unlike Digits, in Fashion all items appear the same number of times." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Testing\n", + "\n", + "We will now begin testing our algorithms on Fashion.\n", + "\n", + "First, we need to convert the dataset into the `learning`-compatible `Dataset` class:" + ] + }, + { + "cell_type": "code", + "execution_count": 112, + "metadata": {}, + "outputs": [], + "source": [ + "temp_train_lbl = train_lbl.reshape((60000,1))\n", + "training_examples = np.hstack((train_img, temp_train_lbl))" + ] + }, + { + "cell_type": "code", + "execution_count": 113, + "metadata": {}, + "outputs": [], + "source": [ + "# takes ~10 seconds to execute this\n", + "MNIST_DataSet = DataSet(examples=training_examples, distance=manhattan_distance)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Plurality Learner\n", + "\n", + "The Plurality Learner always returns the class with the most training samples. In this case, `9`." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 114, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "9\n" + ] + } + ], + "source": [ + "pL = PluralityLearner(MNIST_DataSet)\n", + "print(pL(177))" + ] + }, + { + "cell_type": "code", + "execution_count": 115, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Actual class of test image: 0\n" + ] + }, + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 115, + "metadata": {}, + "output_type": "execute_result" + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAP8AAAD8CAYAAAC4nHJkAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4wLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvqOYd8AAAETRJREFUeJzt3V+MnOV1x/Hfwcb/FvPHjv9hzJ8CghouoFioElVFiYxIBeKPFCu+iFwpwrkIgiAuinwTbipQRZJyUSE5xcJICQkooXCBWpCFZJBKhLEgkLptLLMEY3sXYyBevPZi+/Rix9Fids4Z5p133rGf70dC3p0z786z7+6PmdnzPs9j7i4A5Tmj6QEAaAbhBwpF+IFCEX6gUIQfKBThBwpF+IFCEX6gUIQfKNTMfj6YmXE5YQ0uuOCCtrWZM+Mf8cGDB8P60aNHw/rZZ58d1icmJtrWRkZGwmPRHXe3Tu5XKfxmdoukxyTNkPRv7v5Ila93qjKLz/UZZ8QvsI4dO1bp8e+///62tUWLFoXHbtmyJawfOHAgrN98881h/f33329be/TRR8NjMzNmzAjrVc/r6a7rl/1mNkPSv0r6lqSVktaa2cpeDQxAvaq8579e0k533+XuE5J+Ken23gwLQN2qhH+5pA+mfL67dduXmNl6M9tmZtsqPBaAHqvynn+6N7pf+YOeu2+UtFHiD37AIKnyzL9b0oopn18gaU+14QDolyrhf0PS5WZ2iZnNkvQdSS/0ZlgA6tb1y353P2pm90j6T022+ja5++97NrI+y/rhUb87Ww2pasvpzjvvDOt33XVX21rWp1+zZk1Y/+KLL8L62NhYWI+uI3jttdfCY19//fWwXmcrr8rvw6miUp/f3V+U9GKPxgKgj7i8FygU4QcKRfiBQhF+oFCEHygU4QcKZf3csafUy3tvvfXWsP7www+H9YULF4b1jz/+uG2t6rTXQ4cOhfVzzjknrEd9/qVLl4bH7t69O6xv2LAhrL/yyith/XTV6Xx+nvmBQhF+oFCEHygU4QcKRfiBQhF+oFC0+lquu+66sH7vvfe2rV177bXhsUuWLAnrWTstqx8/frxt7dxzzw2Pzezbty+sR8uGS9L+/fu7fuysxTlr1qywvmvXrra17du3h8c+/vjjYf3tt98O602i1QcgRPiBQhF+oFCEHygU4QcKRfiBQhF+oFDF9PkvvPDCsJ5N/5w3b17b2meffRYemy1/ne3imx0fTdvNvnZWz34/Zs+eHdYj2fUL2fLY2diin1k2FTnbnfimm24K601uP06fH0CI8AOFIvxAoQg/UCjCDxSK8AOFIvxAoSr1+c1sWNJBScckHXX3Vcn9G+vzP/
nkk2F99erVYX3v3r1ta1mvu+7tnsfHx9vWzjzzzPDYbOnuI0eOhPWhoaGwPjExEdYjZnG7es6cOWE9+t3OthZfvnx5WH/ppZfC+t133x3W69Rpn7/SFt0tf+fu3a/YAKARvOwHClU1/C7pJTN708zW92JAAPqj6sv+G9x9j5ktlvSymf2Pu2+deofW/xT4HwMwYCo987v7nta/o5Kek3T9NPfZ6O6rsj8GAuivrsNvZkNmNv/Ex5JulvRurwYGoF5VXvYvkfRcqx0zU9Iv3P0/ejIqALUrZj5/Nr86m78dnadsPn/Wa89E6/JL+TbckawPnz12dg1DlbUGsj5/do1C1MvPrs3I9juI1gqQpIsuuiis14n5/ABChB8oFOEHCkX4gUIRfqBQhB8oVC9m9Z0SFixYENazraSj7aCzllU2LTZrBWbt2Kgdl7XqsnZalTZiJhtbtmR5NJVZiqdKZ62+bJp1la3HBwXP/EChCD9QKMIPFIrwA4Ui/EChCD9QKMIPFOq06fNfddVVYT2beppND42Oz6Z/Rst+S3mvPbuOIOqXZ9cIVK1nY4/q2TnPphtn109E12YsXrw4PDbbPnz+/Plh/bLLLgvrO3fuDOv9wDM/UCjCDxSK8AOFIvxAoQg/UCjCDxSK8AOFOm36/CtWrAjrH3zwQVjPes7RvPas57tv376uv7ZUrdde99LsVdYLqHqNQXbe5s6dG9brfOzzzz8/rNPnB9AYwg8UivADhSL8QKEIP1Aowg8UivADhUr7/Ga2SdKtkkbd/erWbQsk/UrSxZKGJa1x90/qG2Zu9erVYT3r42frtEd93WjeeCeyOfGZrNceyfrV2XnLxh71y+veM2DRokVta9leCYcPHw7r2Rbdt912W1jfunVrWO+HTp75n5R0y0m3PShpi7tfLmlL63MAp5A0/O6+VdKBk26+XdLm1sebJd3R43EBqFm37/mXuPteSWr9G6+JBGDg1H5tv5mtl7S+7scB8PV0+8w/YmbLJKn172i7O7r7Rndf5e6runwsADXoNvwvSFrX+nidpOd7MxwA/ZKG38yelvRfkq4ws91m9j1Jj0habWZ/kLS69TmAU0j6nt/d17YpfbPHY6kkWyc96xlfeumlYf3DDz9sW8v2kY/6zZI0NjYW1rN+eNTnr3oNQabK2Kqu2z80NBTWh4eH29auuOKK8NhLLrkkrGfXAaxcuTKsDwKu8AMKRfiBQhF+oFCEHygU4QcKRfiBQlndSzt/6cHM+vdgJ1mwYEFYv++++8L6q6++2rb2wAMPhMdmbZ+RkZGwnm3RnU1HjlRtBWbHZ23QSNZOW7hwYVjfsWNH29ozzzwTHnvllVeG9WeffTasN7k0t7t39EPlmR8oFOEHCkX4gUIRfqBQhB8oFOEHCkX4gUIV0+ev0/79+8P6nj17wnrWp89+RtHxVbbQlvJrDLL6+Ph421rV5bMzS5cubVvL+vinMvr8AEKEHygU4QcKRfiBQhF+oFCEHygU4QcKVft2Xf2S9aurLDGdybZrzpaorir63mbOjH/EVa/zyL5+dB3AkSNHwmOrXgeQLe1dRXZ9Q3Ze+3l9TTs88wOFIvxAoQg/UCjCDxSK8AOFIvxAoQg/UKi0z29mmyTdKmnU3a9u3faQpLslfdS62wZ3f7GuQXYi65tWXZ8+6jm/99574bHZ2LJeebb2fd3bcFcRbY2e9emz6yeyrc137doV1us0CH38TCfP/E9KumWa23/q7te0/ms0+AC+vjT87r5V0oE+jAVAH1V5z3+Pmf3OzDaZ2Xk9GxGAvug2/I9LulTSNZL2Svpxuzua2Xoz22Zm27p8LAA16Cr87j7i7sfc/bikn0m6PrjvRndf5e6ruh0kgN7rKvxmtmzKp3dKerc3wwHQL520+p6WdKOkb5jZbkk/knSjmV0jySUNS/p+jWMEUIM0/O6+dpqbn6hhLLXK5l9n8/lnz57dtpb16Q8dOhTWs3nrVeeOV5Gdl+yxo+
8t+9rZ951d3xD9XObMmRMem12DMMjXVnSKK/yAQhF+oFCEHygU4QcKRfiBQhF+oFCnzdLdmaqtmag1lLWkqi7dXWcrLzsv2WNn7bqo1RdN9+3ka2fHR/Wqy4LT6gNwyiL8QKEIP1Aowg8UivADhSL8QKEIP1Ao+vwdOuuss9rWsim92WNn/eps6e5I1SXNs+OPHj0a1qPrI6Jp0lLea89EvfzssQ8ePFjpsU8FPPMDhSL8QKEIP1Aowg8UivADhSL8QKEIP1CoYvr8VUVz9rM+f9bHz3rtVeuRrI9fdc59VM+Wz/7kk0/CenbeI9n3lWE+P4BTFuEHCkX4gUIRfqBQhB8oFOEHCkX4gUKljVIzWyHpKUlLJR2XtNHdHzOzBZJ+JeliScOS1rh73Jg9hc2fP79trepW0nX28avO18++t2ytgWi+/9DQUHhstrX5xMREWJ81a1bbWpVrBE4XnTzzH5X0gLv/paS/lvQDM1sp6UFJW9z9cklbWp8DOEWk4Xf3ve6+vfXxQUk7JC2XdLukza27bZZ0R12DBNB7X+s9v5ldLOlaSb+VtMTd90qT/4OQtLjXgwNQn47f+JjZWZJ+LemH7v6nTt+Hmtl6Seu7Gx6AunT0zG9mZ2oy+D9399+0bh4xs2Wt+jJJo9Md6+4b3X2Vu6/qxYAB9EYafpt8in9C0g53/8mU0guS1rU+Xifp+d4PD0BdOnnZf4Ok70p6x8zeat22QdIjkp4xs+9J+qOkb9czxN6oOgUzOr5qqy6bFpupc3pp9rWzVmC2tHckawWOj4+H9Wjpblp9HYTf3V+T1O434Ju9HQ6AfuEKP6BQhB8oFOEHCkX4gUIRfqBQhB8oFM3ODlXZJjvrhTcpG1tWr7IEdnZOoz69VG26cTaVuQSD+1sJoFaEHygU4QcKRfiBQhF+oFCEHygU4QcKRZ+/B6rOea96fJM962wtgmjs2bhnz54d1ufOnRvWjx071rZWdQ2F0wHP/EChCD9QKMIPFIrwA4Ui/EChCD9QKMIPFIo+f4eieetV183P+vjZGvN1rtufydblj8YW9eGzY6X8OoBoPYBo++5S8MwPFIrwA4Ui/EChCD9QKMIPFIrwA4Ui/ECh0j6/ma2Q9JSkpZKOS9ro7o+Z2UOS7pb0UeuuG9z9xboGWlXV+dujo6Nta0eOHAmPzdafz2T97ug6gOzYqmsBzJkzp+tjs+sXsmsIsj5/dP3Ep59+Gh5bgk4u8jkq6QF3325m8yW9aWYvt2o/dfdH6xsegLqk4Xf3vZL2tj4+aGY7JC2ve2AA6vW13vOb2cWSrpX029ZN95jZ78xsk5md1+aY9Wa2zcy2VRopgJ7qOPxmdpakX0v6obv/SdLjki6VdI0mXxn8eLrj3H2ju69y91U9GC+AHuko/GZ2piaD/3N3/40kufuIux9z9+OSfibp+vqGCaDX0vDb5J+Ln5C0w91/MuX2ZVPudqekd3s/PAB16eSv/TdI+q6kd8zsrdZtGyStNbNrJLmkYUnfr2WEAyLa7nnevHnhsVWnrmZTfqM2ZtbKy8aWtUizqbHR8VWXNM+W7o6+94mJifDYzOmwxXcnf+1/TdJ0P4WB7ekDyHGFH1Aowg8UivADhSL8QKEIP1Aowg8Uqpilu6v2ZcfGxtrWhoeHw2Ozaa/Z1NZo2fCsnn3trF71OoFINhX68OHDYT0b2+eff962Nj4+Hh5bAp75gUIRfqBQhB8oFOEHCkX4gUIRfqBQhB8olPVzXrKZfSTp/Sk3fUPS/r4N4OsZ1LEN6rgkxtatXo7tIndf1Mkd+xr+rzy42bZBXdtvUMc2qOOSGFu3mhobL/uBQhF+oFBNh39jw48fGdSxDeq4JMbWrUbG1uh7fgDNafqZH0BDGgm/md1iZv9rZjvN7MEmxtCOmQ2b2Ttm9lbTW4y1tkEbNbN3p9y2wMxeNrM/tP6ddpu0hsb2kJl92Dp3b5nZ3zc0thVm9oqZ7TCz35vZfa3bGz13wb
gaOW99f9lvZjMk/Z+k1ZJ2S3pD0lp3/+++DqQNMxuWtMrdG+8Jm9nfShqT9JS7X9267Z8lHXD3R1r/4zzP3f9xQMb2kKSxpndubm0os2zqztKS7pD0D2rw3AXjWqMGzlsTz/zXS9rp7rvcfULSLyXd3sA4Bp67b5V04KSbb5e0ufXxZk3+8vRdm7ENBHff6+7bWx8flHRiZ+lGz10wrkY0Ef7lkj6Y8vluDdaW3y7pJTN708zWNz2YaSxpbZt+Yvv0xQ2P52Tpzs39dNLO0gNz7rrZ8brXmgj/dLv/DFLL4QZ3/ytJ35L0g9bLW3Smo52b+2WanaUHQrc7XvdaE+HfLWnFlM8vkLSngXFMy933tP4dlfScBm/34ZETm6S2/h1teDx/Nkg7N0+3s7QG4NwN0o7XTYT/DUmXm9klZjZL0nckvdDAOL7CzIZaf4iRmQ1JulmDt/vwC5LWtT5eJ+n5BsfyJYOyc3O7naXV8LkbtB2vG7nIp9XK+BdJMyRtcvd/6vsgpmFmf6HJZ3tpcmXjXzQ5NjN7WtKNmpz1NSLpR5L+XdIzki6U9EdJ33b3vv/hrc3YbtTkS9c/79x84j12n8f2N5JelfSOpBPbBG/Q5Pvrxs5dMK61auC8cYUfUCiu8AMKRfiBQhF+oFCEHygU4QcKRfiBQhF+oFCEHyjU/wN4z67WSwjY4gAAAABJRU5ErkJggg==\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + } + ], + "source": [ + "%matplotlib inline\n", + "\n", + "print(\"Actual class of test image:\", test_lbl[177])\n", + "plt.imshow(test_img[177].reshape((28,28)))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Naive-Bayes\n", + "\n", + "The Naive-Bayes classifier is an improvement over the Plurality Learner. It is much more accurate, but a lot slower." + ] + }, + { + "cell_type": "code", + "execution_count": 120, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "0\n" + ] + } + ], + "source": [ + "# takes ~45 Secs. to execute this\n", + "\n", + "nBD = NaiveBayesLearner(MNIST_DataSet, continuous = False)\n", + "print(nBD(test_img[24]))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's check if we got the right output." + ] + }, + { + "cell_type": "code", + "execution_count": 121, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Actual class of test image: 1\n" + ] + }, + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 121, + "metadata": {}, + "output_type": "execute_result" + }, + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAP8AAAD8CAYAAAC4nHJkAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4wLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvqOYd8AAADuVJREFUeJzt3V+IXOd5x/Hfo9Wu/mPLVi0tkiqlQVQ2hiplEcYqxSU4OCUg5yIiuggqhGwuYmiwLmp0E98UTGmU+qIENrWIDImTQOJaF6aNbQpucAleGTlWKieShaqstdYqli1t9G+1u08v9iis5Z33HZ8zM+dIz/cDZmfPM2fn8ax+e2bmPed9zd0FIJ5FdTcAoB6EHwiK8ANBEX4gKMIPBEX4gaAIPxAU4QeCIvxAUIt7+WBmxumEJWzYsCFZ7+vra1lbvDj9Kx4YGEjWZ2ZmkvWpqalkfdGi8seXkydPlt43Mne3du5XKfxm9oikpyX1Sfo3d3+qys/Dwvbu3Zusr1q1qmXt7rvvTu67efPmZP38+fPJ+tjYWLK+YsWKlrXZ2dnkvrt27UrWUU3pP8tm1ifpXyV9XtJ9knab2X2dagxAd1V5z79d0gl3P+nuU5J+JGlnZ9oC0G1Vwr9e0u/mfT9WbPsIMxs2s1EzG63wWAA6rMp7/oU+VPjYB3ruPiJpROIDP6BJqhz5xyRtnPf9BklnqrUDoFeqhP91SVvM7FNmNiDpy5IOdaYtAN1W+mW/u0+b2WOS/lNzQ30H3P3XHesskDvuuCNZX7/+Yx+lfMTFixdb1i5cuJDc95133knWBwcHk/U777wzWV+6dGnL2gMPPJDcNzVMKEmXLl1K1pFWaZzf3V+U9GKHegHQQ5zeCwRF+IGgCD8QFOEHgiL8QFCEHwiqp9fzY2EbN25M1teuXZusX758uWXt2rVryX2vX7+erOcuu+3v70/WUytCTUxMJPfdunVrsn748OFkHWkc+YGgCD8QFOEHgiL8QFCEHwiK8ANBMdTXAGvWrEnWU0N5kvT++++3rOUui12yZEmynrpcWJJWr15dev/UlOOStGPHjmSdob5qOPIDQRF+ICjCDwRF+IGgCD8QFOEHgiL8QFCM8zeAWXpF5cnJyWQ9tQx2aupsKX/Jbm4sPtd7qrdz584l9831jmo48gNBEX4gKMIPBEX4gaAIPxAU4QeCIvxAUJXG+c3slKRJSTOSpt19qBNNRZNbojs3lr54cetfY+56/StXriTrmzZtStavXr2arKeu55+ZmUnum5uLANV04iSfv3H333fg5wDoIV72A0FVDb9L+rmZHTaz4U40BKA3qr7s3+HuZ8zsHkkvmdnb7v7q/DsUfxT4wwA0TKUjv7ufKb5OSHpe0vYF7jPi7kN8GAg0S+nwm9kKM1t147akz0k62qnGAHRXlZf9ayU9XwxDLZb0Q3f/j450BaDrSoff3U9K+osO9hJW7rr13DLaKQMDA8n6vffem6wPDg4m6y+//HKynjqPILe8N7qLoT4gKMIPBEX4gaAIPxAU4QeCIvxAUEzd3QC5S1tT019L6aHCrVu3JvcdHR1N1t98881kfeXKlcn61NRUsp6Se15QDUd+ICjCDwRF+IGgCD8QFOEHgiL8QFCEHwiKcf4GmJ6eTtZXrVqVrKem7t68eXNy3/379yfruXMMhofTM7QdOXKkZS3VdzuPjWp4doGgCD8QFOEHgiL8QFCEHwiK8ANBEX4gKMb5G8DdK+2fmto7dw7BpUuXkvXUOL0kPf7448l66pr8vr6+5L4XLlxI1lENR34gKMIPBEX4gaAIPxAU4QeCIvxAUIQfCCo7zm9mByR9QdKEu99fbLtL0o8lbZZ0StIud/+ge23e3q5du1Zp/9Q4//Lly5P7jo2NJetvv/12sp675j41F0FunD+1vDeqa+fI/31Jj9y07QlJr7j7FkmvFN8DuIVkw+/ur0o6f9PmnZIOFrcPSnq
0w30B6LKy7/nXuvu4JBVf7+lcSwB6oevn9pvZsKT0RG8Aeq7skf+smQ1KUvF1otUd3X3E3YfcfajkYwHogrLhPyRpT3F7j6QXOtMOgF7Jht/MnpP0P5L+3MzGzOyrkp6S9LCZHZf0cPE9gFtI9j2/u+9uUfpsh3sJKzdWvmzZsmQ9Nf+9mSX3PXr0aLKekxuLT43l53q7fPlyqZ7QHs7wA4Ii/EBQhB8IivADQRF+ICjCDwTF1N0NkJteOze1d2qoMHfZ7Icffpis5+Sm1049/tmzZ5P7zs7OluoJ7eHIDwRF+IGgCD8QFOEHgiL8QFCEHwiK8ANBMc7fAKlLciVpamqq9M++evVq6X3bkest9/+WkrvkF9Vw5AeCIvxAUIQfCIrwA0ERfiAowg8ERfiBoBjnb4CBgYFkfenSpcl6aj6AKucItKPqXATd2hd5HPmBoAg/EBThB4Ii/EBQhB8IivADQRF+IKjsOL+ZHZD0BUkT7n5/se1JSV+TdK642z53f7FbTd7ucuPZubn1U0t45+bVr2pycjJZ7+/vL/2zc0uXo5p2nt3vS3pkge3fcfdtxX8EH7jFZMPv7q9KOt+DXgD0UJXXVY+Z2a/M7ICZre5YRwB6omz4vyvp05K2SRqX9O1WdzSzYTMbNbPRko8FoAtKhd/dz7r7jLvPSvqepO2J+464+5C7D5VtEkDnlQq/mQ3O+/aLko52ph0AvdLOUN9zkh6StMbMxiR9S9JDZrZNkks6JenrXewRQBdkw+/uuxfY/EwXegkrNz99bp361Fj6+Ph4qZ7a9d577yXrmzZtKv2zuZ6/uziLAgiK8ANBEX4gKMIPBEX4gaAIPxAUU3c3wJIlS5L13FBgaurv3FBcVbnLjbds2dKylvv/Yonu7uLIDwRF+IGgCD8QFOEHgiL8QFCEHwiK8ANBMc7fALnprXOX9KacP9/duVevX7+erC9e3PqfWF9fX3JfLuntLo78QFCEHwiK8ANBEX4gKMIPBEX4gaAIPxAU4/y3gdR5AFeuXKntsauanp7u2s8GR34gLMIPBEX4gaAIPxAU4QeCIvxAUIQfCCo7zm9mGyU9K2mdpFlJI+7+tJndJenHkjZLOiVpl7t/0L1Wb19Vr1tftKj13/ClS5dW+tlVpXrLSc0FgOra+c1MS9rr7vdKekDSN8zsPklPSHrF3bdIeqX4HsAtIht+dx939zeK25OSjklaL2mnpIPF3Q5KerRbTQLovE/0mszMNkv6jKRfSlrr7uPS3B8ISfd0ujkA3dP2myozWynpp5K+6e4X211HzcyGJQ2Xaw9At7R15Dezfs0F/wfu/rNi81kzGyzqg5ImFtrX3UfcfcjdhzrRMIDOyIbf5g7xz0g65u7755UOSdpT3N4j6YXOtwegW9p52b9D0lckvWVmR4pt+yQ9JeknZvZVSaclfak7Ld7+csNhuaHA1P6XLl0q1VO7cpf0Vllme2pqqvS+yMuG391/IanVb/CznW0HQK9whh8QFOEHgiL8QFCEHwiK8ANBEX4gKK6ZbIDcZbe5paxT4/zdnro79/OrXK588eLF0vsijyM/EBThB4Ii/EBQhB8IivADQRF+ICjCDwTFOH8DzMzMJOu5cf7+/v6WtQ8+6O5s6teuXUvWU8tsDwwMJPetMu038nh2gaAIPxAU4QeCIvxAUIQfCIrwA0ERfiAoxvkbIDf3fZV5/c+dO1eqp3ZVmWsgdQ6AJF2/fr1UT2gPR34gKMIPBEX4gaAIPxAU4QeCIvxAUIQfCCo7zm9mGyU9K2mdpFlJI+7+tJk9Kelrkm4MJO9z9xe71ejt7PLly8l67rp3s1YrqEvvvvtuqZ7alZu3PzXXwJIlS5L7Mm9/d7Vzks+0pL3u/oaZrZJ02MxeKmrfcfd/7l57ALolG353H5c0XtyeNLNjktZ3uzEA3fWJ3vOb2WZJn5H0y2LTY2b2KzM7YGarW+wzbGajZjZaqVMAHdV2+M1spaSfSvqmu1+
U9F1Jn5a0TXOvDL690H7uPuLuQ+4+1IF+AXRIW+E3s37NBf8H7v4zSXL3s+4+4+6zkr4naXv32gTQadnw29xHyc9IOubu++dtH5x3ty9KOtr59gB0Szuf9u+Q9BVJb5nZkWLbPkm7zWybJJd0StLXu9JhALlLdpcvX56sp6b+npycLNVTu3JTd6d6zw1hLlu2rFRPaE87n/b/QtJCA8mM6QO3MM7wA4Ii/EBQhB8IivADQRF+ICjCDwTF1N0NcPr06WT9+PHjyfrq1QteViFJOnnyZKme2vXaa68l6w8++GDL2rp165L7njhxolRPaA9HfiAowg8ERfiBoAg/EBThB4Ii/EBQhB8IylLLO3f8wczOSfq/eZvWSPp9zxr4ZJraW1P7kuitrE72tsnd/6SdO/Y0/B97cLPRps7t19TemtqXRG9l1dUbL/uBoAg/EFTd4R+p+fFTmtpbU/uS6K2sWnqr9T0/gPrUfeQHUJNawm9mj5jZb8zshJk9UUcPrZjZKTN7y8yO1L3EWLEM2oSZHZ237S4ze8nMjhdfW1/P2/venjSzd4vn7oiZ/W1NvW00s/8ys2Nm9msz+/tie63PXaKvWp63nr/sN7M+Sb+V9LCkMUmvS9rt7v/b00ZaMLNTkobcvfYxYTP7a0l/kPSsu99fbPsnSefd/aniD+dqd/+HhvT2pKQ/1L1yc7GgzOD8laUlPSrp71Tjc5foa5dqeN7qOPJvl3TC3U+6+5SkH0naWUMfjefur0o6f9PmnZIOFrcPau4fT8+16K0R3H3c3d8obk9KurGydK3PXaKvWtQR/vWSfjfv+zE1a8lvl/RzMztsZsN1N7OAtcWy6TeWT7+n5n5ull25uZduWlm6Mc9dmRWvO62O8C+0+k+Thhx2uPtfSvq8pG8UL2/RnrZWbu6VBVaWboSyK153Wh3hH5O0cd73GySdqaGPBbn7meLrhKTn1bzVh8/eWCS1+DpRcz9/1KSVmxdaWVoNeO6atOJ1HeF/XdIWM/uUmQ1I+rKkQzX08TFmtqL4IEZmtkLS59S81YcPSdpT3N4j6YUae/mIpqzc3GpladX83DVtxetaTvIphjL+RVKfpAPu/o89b2IBZvZnmjvaS3MzG/+wzt7M7DlJD2nuqq+zkr4l6d8l/UTSn0o6LelL7t7zD95a9PaQ5l66/nHl5hvvsXvc219J+m9Jb0maLTbv09z769qeu0Rfu1XD88YZfkBQnOEHBEX4gaAIPxAU4QeCIvxAUIQfCIrwA0ERfiCo/wciWVon3rz+DgAAAABJRU5ErkJggg==\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + } + ], + "source": [ + "%matplotlib inline\n", + "\n", + "print(\"Actual class of test image:\", test_lbl[24])\n", + "plt.imshow(test_img[24].reshape((28,28)))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### K-Nearest Neighbors\n", + "\n", + "With the dataset in hand, we will first test how the kNN algorithm performs:" + ] + }, + { + "cell_type": "code", + "execution_count": 122, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "1\n" + ] + } + ], + "source": [ + "# takes ~20 Secs. to execute this\n", + "kNN = NearestNeighborLearner(MNIST_DataSet, k=3)\n", + "print(kNN(test_img[211]))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The output is 1, which means the item at index 211 is a trouser. Let's see if the prediction is correct:" + ] + }, + { + "cell_type": "code", + "execution_count": 123, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Actual class of test image: 1\n" + ] + }, + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 123, + "metadata": {}, + "output_type": "execute_result" + }, + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAP8AAAD8CAYAAAC4nHJkAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4wLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvqOYd8AAADt9JREFUeJzt3V9sVOeZx/HfgzFgjAOYLjYJbNNWqEqIUhpZKFFWq6wIVbqqRHrRqFxUrFSVXjRSK/ViI26am5Wials2F6tG7gaVRG3aSm02XKDdRtFKWaSoColIIWEJCEhLDLbB/DGWE/979sKHyiGe95g5Z+aM9Xw/UuTxeebMPJrw85mZ95z3NXcXgHiWVN0AgGoQfiAowg8ERfiBoAg/EBThB4Ii/EBQhB8IivADQS1t5pOZGacT1mHNmjXJent7e92P3dbWVqg+PT2drC9ZUvv4Mjk5mdx3eHg4Wcf83N0Wcr9C4TezxyQ9K6lN0n+4+zNFHg/z2759e7K+fv36uh877w9LV1dXsj46Opqsd3R01KwNDAwk933uueeSdRRT99t+M2uT9O+SvirpXkm7zOzeshoD0FhFPvNvk3Ta3c+4+4SkX0vaWU5bABqtSPjvkvSXOb+fz7Z9gpntMbMjZnakwHMBKFmRz/zzfanwqS/03L1fUr/EF35AKyly5D8vadOc3zdKSn+DA6BlFAn/m5I2m9nnzGyZpG9KOlhOWwAare63/e4+ZWZPSvpvzQ717Xf3d0vrLJA777wzWd+xY0eyvnRp7f+N4+PjdfV00/3335+sX7p0KVm/4447atYeffTR5L7Hjx9P1g8fPpysI63QOL+7H5J0qKReADQRp/cCQRF+ICjCDwRF+IGgCD8QFOEHgmrq9fyYX94luSMjI8n6xMREzdrU1FRy37xzDC5fvpysv/tu+tSO1OOfOXMmuW93d3eyjmI48gNBEX4gKMIPBEX4gaAIPxAU4QeCYqivBaxbty5ZHxwcTNZTw3k9PT3JfVNTa0vSBx98kKznDQWmpC5FlqTNmzfX/djIx5EfCIrwA0ERfiAowg8ERfiBoAg/EBThB4JinL8F5C2xvWzZsrrry5cvL/TYY2NjyXreOQqp53/nnXeS+zLO31gc+YGgCD8QFOEHgiL8QFCEHwiK8ANBEX4gqELj/GZ2TtKopGlJU+7eV0ZT0axduzZZ7+joSNZnZmZq1lavXp3cd+PGjcn6li1bkvW8sfoi8s4hQDFlnOTzD+6eXqQdQMvhbT8QVNHwu6Q/mNlbZranjIYANEfRt/0Pu/uAma2X9KqZ/Z+7vz73DtkfBf4wAC2m0JHf3Qeyn0OSXpa0bZ779Lt7H18GAq2l7vCbWaeZdd28Lekrko6X1RiAxirytr9H0stmdvNxfuXu/1VKVwAaru7wu/sZSV8qsZew8q7nz5vfPiVvnP+hhx5K1g8dOpSsnzx5MllPLdHd29ub3LetrS1ZRzEM9QFBEX4gKMIPBEX4gaAIPxAU4QeCYuruFnD9+vVkPW+o79q1azVreZcDnzhxIlnft29fsr5z585k/dKl2hd8PvDAA8l9jx49mqyjGI78QFCEHwiK8ANBEX4gKMIPBEX4gaAIPxAU4/wtIDVOL0mdnZ3J+pUrV2rWVq5cmdzX3ZP1vGnFV61aVffj9/T0JPcdGBhI1lEMR34gKMIPBEX4gaAIPxAU4QeCIvxAUIQfCIpx/hYwOjqarHd1dSXrS5bU/hu+Zs2a5L551/NPTk4m63mP/9FHH9W97+nTp5N1FMORHwiK8ANBEX4gKMIPBEX4gaAIPxAU4QeCyh3nN7P9kr4macjd78u2dUv6jaS7JZ2T9IS7176oHEl54/xFlqpet25dsn727NlkPTXvvpR/DsLQ0FDNWt7y4W+88UayjmIWcuT/haTHbtn2lKTX3H2zpNey3wEsIrnhd/fXJY3csnmnpAPZ7QOSHi+5LwANVu9n/h53vyB
J2c/15bUEoBkafm6/me2RtKfRzwPg9tR75B80sw2SlP2s+a2Ou/e7e5+799X5XAAaoN7wH5S0O7u9W9Ir5bQDoFlyw29mL0l6Q9IXzey8mX1b0jOSdpjZKUk7st8BLCK5n/ndfVeN0vaSewlrcHAwWc+bWz8lbxz+2LFjyfrly5eT9fHx8WQ9NZa/dGn6n9/FixeTdRTDGX5AUIQfCIrwA0ERfiAowg8ERfiBoJi6uwVcvXo1WZ+ZmUnWu7u7a9aWL1+e3PfDDz9M1sfGxpL1vMdPTSv+/vvvJ/dFY3HkB4Ii/EBQhB8IivADQRF+ICjCDwRF+IGgGOdfBFLLXEvSihUrataWLVtW6LGnpqaS9bwlvDs6OmrWbty4kdwXjcWRHwiK8ANBEX4gKMIPBEX4gaAIPxAU4QeCYpx/Efj444+T9dQ186dOnUruOzw8nKznTa+dx8xq1vLOMUBjceQHgiL8QFCEHwiK8ANBEX4gKMIPBEX4gaByB3HNbL+kr0kacvf7sm1PS/qOpJuDxHvd/VCjmkRaapw/b87/vGWwV65cmay3t7fXvf+VK1eS+6KxFnLk/4Wkx+bZvs/dt2b/EXxgkckNv7u/LmmkCb0AaKIin/mfNLM/mdl+M1tbWkcAmqLe8P9M0hckbZV0QdJPat3RzPaY2REzO1LncwFogLrC7+6D7j7t7jOSfi5pW+K+/e7e5+599TYJoHx1hd/MNsz59euSjpfTDoBmWchQ30uSHpH0GTM7L+lHkh4xs62SXNI5Sd9tYI8AGiA3/O6+a57NzzegF9SQGseXpK6urpq1e+65J7nv9PR0sp53noC7J+updQPGxsaS+6KxOMMPCIrwA0ERfiAowg8ERfiBoAg/EBRTdy8CeZe+rlmzpmZtYmKi0HO3tbUl63lDgamhvrwpydFYHPmBoAg/EBThB4Ii/EBQhB8IivADQRF+ICjG+ReBvEt6U8toX716tex2PiFv6u7UeQLj4+Nlt4PbwJEfCIrwA0ERfiAowg8ERfiBoAg/EBThB4JinH8RMLNkfXJysmats7Oz7HY+YcWKFcl66hyE4eHhmjU0Hkd+ICjCDwRF+IGgCD8QFOEHgiL8QFCEHwgqd5zfzDZJekFSr6QZSf3u/qyZdUv6jaS7JZ2T9IS7pyeYR12uX79e97558+7nyZtLIG9Ngd7e3po1xvmrtZAj/5SkH7r7PZIelPQ9M7tX0lOSXnP3zZJey34HsEjkht/dL7j729ntUUknJN0laaekA9ndDkh6vFFNAijfbX3mN7O7JX1Z0h8l9bj7BWn2D4Sk9WU3B6BxFnxuv5mtkvQ7ST9w9+t555vP2W+PpD31tQegURZ05Dezds0G/5fu/vts86CZbcjqGyQNzbevu/e7e5+795XRMIBy5IbfZg/xz0s64e4/nVM6KGl3dnu3pFfKbw9Aoyzkbf/Dkr4l6ZiZHc227ZX0jKTfmtm3Jf1Z0jca0yLyNHJ67LyhvrwluqempmrWRkZG6uoJ5cgNv7sfllTrA/72ctsB0Cyc4QcERfiBoAg/EBThB4Ii/EBQhB8Iiqm7F4GJiYm6901N670QRabmltLnCVy7dq2unlAOjvxAUIQfCIrwA0ERfiAowg8ERfiBoAg/EBTj/ItA3lh6SkdHR6HnLjr1d+p6/xs3bhR6bBTDkR8IivADQRF+ICjCDwRF+IGgCD8QFOEHgmKcfxEocj1/V1dXiZ18Wt68/dPT0zVrRZYeR3Ec+YGgCD8QFOEHgiL8QFCEHwiK8ANBEX4gqNxxfjPbJOkFSb2SZiT1u/uzZva0pO9IGs7uutfdDzWq0cguXryYrKfmxj98+HCh5169enWynjdfQOqa/VOnTtXVE8qxkJN8piT90N3fNrMuSW+Z2atZbZ+7/2vj2gPQKLnhd/cLki5kt0fN7ISkuxrdGIDGuq3P/GZ2t6QvS/pjtulJM/uTme03s7U19tljZkfM7EihTgGUasHhN7NVkn4n6Qf
ufl3SzyR9QdJWzb4z+Ml8+7l7v7v3uXtfCf0CKMmCwm9m7ZoN/i/d/feS5O6D7j7t7jOSfi5pW+PaBFC23PCbmUl6XtIJd//pnO0b5tzt65KOl98egEZZyLf9D0v6lqRjZnY027ZX0i4z2yrJJZ2T9N2GdAj19vYm66nhtgcffLDQc2/atClZX7t23q96/io17Xje5cbj4+PJOopZyLf9hyXZPCXG9IFFjDP8gKAIPxAU4QeCIvxAUIQfCIrwA0ExdfcicPLkyWR9y5YtNWsvvvhioed+7733kvWzZ88m6+3t7TVrQ0NDdfWEcnDkB4Ii/EBQhB8IivADQRF+ICjCDwRF+IGgzN2b92Rmw5I+mLPpM5IuNa2B29OqvbVqXxK91avM3j7r7n+zkDs2NfyfenKzI606t1+r9taqfUn0Vq+qeuNtPxAU4QeCqjr8/RU/f0qr9taqfUn0Vq9Keqv0Mz+A6lR95AdQkUrCb2aPmdlJMzttZk9V0UMtZnbOzI6Z2dGqlxjLlkEbMrPjc7Z1m9mrZnYq+5meO7u5vT1tZh9mr91RM/vHinrbZGb/Y2YnzOxdM/t+tr3S1y7RVyWvW9Pf9ptZm6T3Je2QdF7Sm5J2uXv6wvEmMbNzkvrcvfIxYTP7e0k3JL3g7vdl234sacTdn8n+cK51939ukd6elnSj6pWbswVlNsxdWVrS45L+SRW+dom+nlAFr1sVR/5tkk67+xl3n5D0a0k7K+ij5bn765JGbtm8U9KB7PYBzf7jaboavbUEd7/g7m9nt0cl3VxZutLXLtFXJaoI/12S/jLn9/NqrSW/XdIfzOwtM9tTdTPz6MmWTb+5fPr6ivu5Ve7Kzc10y8rSLfPa1bPiddmqCP98q/+00pDDw+7+gKSvSvpe9vYWC7OglZubZZ6VpVtCvStel62K8J+XNHcBuI2SBiroY17uPpD9HJL0slpv9eHBm4ukZj9bZiK8Vlq5eb6VpdUCr10rrXhdRfjflLTZzD5nZsskfVPSwQr6+BQz68y+iJGZdUr6ilpv9eGDknZnt3dLeqXCXj6hVVZurrWytCp+7VptxetKTvLJhjL+TVKbpP3u/i9Nb2IeZvZ5zR7tpdmZjX9VZW9m9pKkRzR71degpB9J+k9Jv5X0t5L+LOkb7t70L95q9PaIZt+6/nXl5pufsZvc299J+l9JxyTNZJv3avbzdWWvXaKvXargdeMMPyAozvADgiL8QFCEHwiK8ANBEX4gKMIPBEX4gaAIPxDU/wOD9TqwqkBrGQAAAABJRU5ErkJggg==\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + } + ], + "source": [ + "%matplotlib inline\n", + "\n", + "print(\"Actual class of test image:\", test_lbl[211])\n", + "plt.imshow(test_img[211].reshape((28,28)))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Indeed, the item was a trouser! The algorithm classified the item correctly." + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.7.0" + }, + "pycharm": { + "stem_cell": { + "cell_type": "raw", + "source": [], + "metadata": { + "collapsed": false + } + } + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} \ No newline at end of file diff --git a/logic.ipynb b/logic.ipynb index 3a70f9d17..062ffede2 100644 --- a/logic.ipynb +++ b/logic.ipynb @@ -6,18 +6,16 @@ "collapsed": true }, "source": [ - "# Logic: `logic.py`; Chapters 6-8" + "# Logic" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "This notebook describes the [logic.py](https://github.com/aimacode/aima-python/blob/master/logic.py) module, which covers Chapters 6 (Logical Agents), 7 (First-Order Logic) and 8 (Inference in First-Order Logic) of *[Artificial Intelligence: A Modern Approach](http://aima.cs.berkeley.edu)*. See the [intro notebook](https://github.com/aimacode/aima-python/blob/master/intro.ipynb) for instructions.\n", + "This Jupyter notebook acts as supporting material for topics covered in __Chapter 6 Logical Agents__, __Chapter 7 First-Order Logic__ and __Chapter 8 Inference in First-Order Logic__ of the book *[Artificial Intelligence: A Modern Approach](http://aima.cs.berkeley.edu)*. 
We make use of the implementations in the [logic.py](https://github.com/aimacode/aima-python/blob/master/logic.py) module. See the [intro notebook](https://github.com/aimacode/aima-python/blob/master/intro.ipynb) for instructions.\n", "\n", - "We'll start by looking at `Expr`, the data type for logical sentences, and the convenience function `expr`. We'll be covering two types of knowledge bases, `PropKB` - Propositional logic knowledge base and `FolKB` - First order logic knowledge base. We will construct a propositional knowledge base of a specific situation in the Wumpus World. We will next go through the `tt_entails` function and experiment with it a bit. The `pl_resolution` and `pl_fc_entails` functions will come next. We'll study forward chaining and backward chaining algorithms for `FolKB` and use them on `crime_kb` knowledge base.\n", - "\n", - "But the first step is to load the code:" + "Let's first import everything from the `logic` module." ] }, { @@ -29,7 +27,31 @@ "outputs": [], "source": [ "from utils import *\n", - "from logic import *" + "from logic import *\n", + "from notebook import psource" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## CONTENTS\n", + "- Logical sentences\n", + " - Expr\n", + " - PropKB\n", + " - Knowledge-based agents\n", + " - Inference in propositional knowledge base\n", + " - Truth table enumeration\n", + " - Proof by resolution\n", + " - Forward and backward chaining\n", + " - DPLL\n", + " - WalkSAT\n", + " - SATPlan\n", + " - FolKB\n", + " - Inference in first order knowledge base\n", + " - Unification\n", + " - Forward chaining algorithm\n", + " - Backward chaining algorithm" ] }, { @@ -388,12 +410,12 @@ "\n", "The class `PropKB` can be used to represent a knowledge base of propositional logic sentences.\n", "\n", - "We see that the class `KB` has four methods, apart from `__init__`. A point to note here: the `ask` method simply calls the `ask_generator` method. 
Thus, this one has already been implemented and what you'll have to actually implement when you create your own knowledge base class (if you want to, though I doubt you'll ever need to; just use the ones we've created for you), will be the `ask_generator` function and not the `ask` function itself.\n", + "We see that the class `KB` has four methods, apart from `__init__`. A point to note here: the `ask` method simply calls the `ask_generator` method. Thus, this one has already been implemented, and what you'll have to actually implement when you create your own knowledge base class (though you'll probably never need to, considering the ones we've created for you) will be the `ask_generator` function and not the `ask` function itself.\n", "\n", "The class `PropKB` now.\n", "* `__init__(self, sentence=None)` : The constructor `__init__` creates a single field `clauses` which will be a list of all the sentences of the knowledge base. Note that each one of these sentences will be a 'clause' i.e. a sentence which is made up of only literals and `or`s.\n", "* `tell(self, sentence)` : When you want to add a sentence to the KB, you use the `tell` method. This method takes a sentence, converts it to its CNF, extracts all the clauses, and adds all these clauses to the `clauses` field. So, you need not worry about `tell`ing only clauses to the knowledge base. You can `tell` the knowledge base a sentence in any form that you wish; converting it to CNF and adding the resulting clauses will be handled by the `tell` method.\n", - "* `ask_generator(self, query)` : The `ask_generator` function is used by the `ask` function. It calls the `tt_entails` function, which in turn returns `True` if the knowledge base entails query and `False` otherwise. The `ask_generator` itself returns an empty dict `{}` if the knowledge base entails query and `None` otherwise. This might seem a little bit weird to you. 
After all, it makes more sense just to return a `True` or a `False` instead of the `{}` or `None` But this is done to maintain consistency with the way things are in First-Order Logic, where, an `ask_generator` function, is supposed to return all the substitutions that make the query true. Hence the dict, to return all these substitutions. I will be mostly be using the `ask` function which returns a `{}` or a `False`, but if you don't like this, you can always use the `ask_if_true` function which returns a `True` or a `False`.\n", + "* `ask_generator(self, query)` : The `ask_generator` function is used by the `ask` function. It calls the `tt_entails` function, which in turn returns `True` if the knowledge base entails query and `False` otherwise. The `ask_generator` itself returns an empty dict `{}` if the knowledge base entails query and `None` otherwise. This might seem a little bit weird to you. After all, it makes more sense just to return a `True` or a `False` instead of the `{}` or `None`. But this is done to maintain consistency with the way things are in First-Order Logic, where an `ask_generator` function is supposed to return all the substitutions that make the query true. Hence the dict, to return all these substitutions. I will mostly be using the `ask` function which returns a `{}` or a `False`, but if you don't like this, you can always use the `ask_if_true` function which returns a `True` or a `False`.\n", + "* `retract(self, sentence)` : This function removes all the clauses of the sentence given from the knowledge base. Like the `tell` function, you don't have to pass clauses to remove them from the knowledge base; any sentence will do fine. The function will take care of converting that sentence to clauses and then remove those."
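The `tell`/`ask`/`retract` cycle described above can be imitated with a deliberately tiny stand-in that stores clauses as sets of string literals (a negated symbol is prefixed with `~`) and answers queries by brute-force model checking. This is a sketch of the interface only, with made-up names; it does not reproduce `PropKB`'s CNF conversion:

```python
from itertools import product

class TinyPropKB:
    """A toy clause store mimicking PropKB's tell/ask/retract interface.

    Clauses are frozensets of string literals such as 'P' or '~P';
    the real PropKB additionally converts arbitrary sentences to CNF."""

    def __init__(self):
        self.clauses = set()

    def tell(self, *literals):
        self.clauses.add(frozenset(literals))

    def retract(self, *literals):
        self.clauses.discard(frozenset(literals))

    def ask(self, literal):
        """Return {} if every model of the KB satisfies `literal`, else False."""
        symbols = sorted({lit.lstrip('~') for clause in self.clauses
                          for lit in clause} | {literal.lstrip('~')})

        def holds(lit, model):
            return not model[lit[1:]] if lit.startswith('~') else model[lit]

        for values in product([True, False], repeat=len(symbols)):
            model = dict(zip(symbols, values))
            if all(any(holds(l, model) for l in clause) for clause in self.clauses):
                if not holds(literal, model):
                    return False
        return {}

kb = TinyPropKB()
kb.tell('~P', 'Q')   # the clause (~P | Q), i.e. P ==> Q
kb.tell('P')
print(kb.ask('Q'))   # -> {}  (entailed)
print(kb.ask('~P'))  # -> False  (not entailed)
```

Returning `{}` rather than `True` mirrors the convention discussed above: in First-Order Logic the answer to a query is a substitution, and the empty substitution plays the role of "true".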
] }, @@ -544,141 +566,3045 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Inference in Propositional Knowlwdge Base\n", - "In this section we will look at two algorithms to check if a sentence is entailed by the `KB`. Our goal is to decide whether $\\text{KB} \\vDash \\alpha$ for some sentence $\\alpha$.\n", - "### Truth Table Enumeration\n", - "It is a model-checking approach which, as the name suggests, enumerates all possible models in which the `KB` is true and checks if $\\alpha$ is also true in these models. We list the $n$ symbols in the `KB` and enumerate the $2^{n}$ models in a depth-first manner and check the truth of `KB` and $\\alpha$." + "## Knowledge based agents" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "A knowledge-based agent is a simple generic agent that maintains and handles a knowledge base.\n", + "The knowledge base may initially contain some background knowledge.\n", + "
    \n", + "The purpose of a KB agent is to provide a level of abstraction over knowledge-base manipulation and is to be used as a base class for agents that work on a knowledge base.\n", + "
    \n", + "Given a percept, the KB agent adds the percept to its knowledge base, asks the knowledge base for the best action, and tells the knowledge base that it has in fact taken that action.\n", + "
    \n", + "Our implementation of `KB-Agent` is encapsulated in a class `KB_AgentProgram` which inherits from the `KB` class.\n", + "
    \n", + "Let's have a look." + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def KB_AgentProgram(KB):\n",
    +       "    """A generic logical knowledge-based agent program. [Figure 7.1]"""\n",
    +       "    steps = itertools.count()\n",
    +       "\n",
    +       "    def program(percept):\n",
    +       "        t = next(steps)\n",
    +       "        KB.tell(make_percept_sentence(percept, t))\n",
    +       "        action = KB.ask(make_action_query(t))\n",
    +       "        KB.tell(make_action_sentence(action, t))\n",
    +       "        return action\n",
    +       "\n",
    +       "    def make_percept_sentence(percept, t):\n",
    +       "        return Expr("Percept")(percept, t)\n",
    +       "\n",
    +       "    def make_action_query(t):\n",
    +       "        return expr("ShouldDo(action, {})".format(t))\n",
    +       "\n",
    +       "    def make_action_sentence(action, t):\n",
    +       "        return Expr("Did")(action[expr('action')], t)\n",
    +       "\n",
    +       "    return program\n",
    +       "
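As a complement to the listing above, here is a minimal standalone sketch of the same tell/ask/tell agent loop. `ToyKB`, `toy_kb_agent_program` and the rule table are hypothetical stand-ins for a real knowledge base and real inference; only the control flow mirrors `KB_AgentProgram`.

```python
import itertools

class ToyKB:
    """Hypothetical toy knowledge base: stores sentences as strings and
    answers action queries by looking up a rule table (stands in for inference)."""
    def __init__(self, rules):
        self.facts = []          # sentences told to the KB
        self.rules = rules       # percept -> action lookup
        self.last_percept = None

    def tell(self, sentence):
        self.facts.append(sentence)

    def ask(self, query):
        # Stand-in for real inference: answer the action query by rule lookup.
        return self.rules.get(self.last_percept, 'Wait')

def toy_kb_agent_program(kb):
    """Same loop as KB_AgentProgram: tell the percept, ask for an action,
    tell the KB which action was taken."""
    steps = itertools.count()

    def program(percept):
        t = next(steps)
        kb.last_percept = percept
        kb.tell('Percept({}, {})'.format(percept, t))
        action = kb.ask('ShouldDo(action, {})'.format(t))
        kb.tell('Did({}, {})'.format(action, t))
        return action

    return program

agent = toy_kb_agent_program(ToyKB({'Breeze': 'Retreat', 'Nothing': 'Forward'}))
print(agent('Nothing'))  # Forward
print(agent('Breeze'))   # Retreat
```

Note that, exactly as in the real version, the agent's percept and action history accumulates in `kb.facts`, one sentence per `tell`.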
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(KB_AgentProgram)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The helper functions `make_percept_sentence`, `make_action_query` and `make_action_sentence` are all aptly named and as expected,\n", + "`make_percept_sentence` makes first-order logic sentences about percepts we want our agent to receive,\n", + "`make_action_query` asks the underlying `KB` about the action that should be taken and\n", + "`make_action_sentence` tells the underlying `KB` about the action it has just taken." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Inference in Propositional Knowledge Base\n", + "In this section we will look at two algorithms to check if a sentence is entailed by the `KB`. Our goal is to decide whether $\\text{KB} \\vDash \\alpha$ for some sentence $\\alpha$.\n", + "### Truth Table Enumeration\n", + "It is a model-checking approach which, as the name suggests, enumerates all possible models in which the `KB` is true and checks if $\\alpha$ is also true in these models. We list the $n$ symbols in the `KB` and enumerate the $2^{n}$ models in a depth-first manner and check the truth of `KB` and $\\alpha$." + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def tt_check_all(kb, alpha, symbols, model):\n",
    +       "    """Auxiliary routine to implement tt_entails."""\n",
    +       "    if not symbols:\n",
    +       "        if pl_true(kb, model):\n",
    +       "            result = pl_true(alpha, model)\n",
    +       "            assert result in (True, False)\n",
    +       "            return result\n",
    +       "        else:\n",
    +       "            return True\n",
    +       "    else:\n",
    +       "        P, rest = symbols[0], symbols[1:]\n",
    +       "        return (tt_check_all(kb, alpha, rest, extend(model, P, True)) and\n",
    +       "                tt_check_all(kb, alpha, rest, extend(model, P, False)))\n",
    +       "
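The enumeration idea can also be sketched independently of the `Expr` machinery. In the sketch below, sentences are ordinary Python predicates over a model dict, and `tt_entails_sketch` is a hypothetical helper (not the library function) that enumerates all $2^n$ models directly.

```python
from itertools import product

def tt_entails_sketch(kb, alpha, symbols):
    """Return True iff alpha holds in every model that satisfies kb.
    kb and alpha are predicates over a model dict {symbol: bool}."""
    for values in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if kb(model) and not alpha(model):
            return False  # found a model of the KB where alpha fails
    return True

kb = lambda m: m['P'] and m['Q']         # KB: P & Q
print(tt_entails_sketch(kb, lambda m: m['Q'], ['P', 'Q']))   # True

kb2 = lambda m: m['P'] or m['Q']         # KB: P | Q
print(tt_entails_sketch(kb2, lambda m: m['Q'], ['P', 'Q']))  # False
```

The library's `tt_check_all` does the same enumeration, but recursively and with early branching on each symbol, which lets it share partial models between branches.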
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(tt_check_all)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The algorithm basically computes every line of the truth table $KB\\implies \\alpha$ and checks if it is true everywhere.\n", + "
\n", + "While unassigned symbols remain, the routine recursively constructs every combination of truth values for them; \n", + "once a model is complete, it checks whether `model` is consistent with `kb`.\n", + "The models that satisfy `kb` correspond to the lines of the truth table \n", + "that have a `true` in the KB column, \n", + "and for these lines it checks whether the query evaluates to true\n", + "
    \n", + "`result = pl_true(alpha, model)`.\n", + "
    \n", + "
    \n", + "In short, `tt_check_all` evaluates this logical expression for each `model`\n", + "
    \n", + "`pl_true(kb, model) => pl_true(alpha, model)`\n", + "
\n", + "which holds in every model exactly when its negation\n", + "
\n", + "`pl_true(kb, model) & ~pl_true(alpha, model)` \n", + "
\n", + "is unsatisfiable; that is, when the knowledge base and the negation of the query are jointly inconsistent.\n", + "
    \n", + "
    \n", + "`tt_entails()` just extracts the symbols from the query and calls `tt_check_all()` with the proper parameters.\n" + ] + }, + { + "cell_type": "code", + "execution_count": 23, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def tt_entails(kb, alpha):\n",
    +       "    """Does kb entail the sentence alpha? Use truth tables. For propositional\n",
    +       "    kb's and sentences. [Figure 7.10]. Note that the 'kb' should be an\n",
    +       "    Expr which is a conjunction of clauses.\n",
    +       "    >>> tt_entails(expr('P & Q'), expr('Q'))\n",
    +       "    True\n",
    +       "    """\n",
    +       "    assert not variables(alpha)\n",
    +       "    symbols = list(prop_symbols(kb & alpha))\n",
    +       "    return tt_check_all(kb, alpha, symbols, {})\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(tt_entails)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Keep in mind that for two symbols P and Q, P => Q is false only when P is `True` and Q is `False`.\n", + "Example usage of `tt_entails()`:" + ] + }, + { + "cell_type": "code", + "execution_count": 24, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "True" + ] + }, + "execution_count": 24, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "tt_entails(P & Q, Q)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "P & Q is True only when both P and Q are True. Hence, (P & Q) => Q is True" + ] + }, + { + "cell_type": "code", + "execution_count": 25, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "False" + ] + }, + "execution_count": 25, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "tt_entails(P | Q, Q)" + ] + }, + { + "cell_type": "code", + "execution_count": 26, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "False" + ] + }, + "execution_count": 26, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "tt_entails(P | Q, P)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "If we know that P | Q is true, we cannot infer the truth values of P and Q. \n", + "Hence (P | Q) => Q is False and so is (P | Q) => P." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 27, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "True" + ] + }, + "execution_count": 27, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "(A, B, C, D, E, F, G) = symbols('A, B, C, D, E, F, G')\n", + "tt_entails(A & (B | C) & D & E & ~(F | G), A & D & E & ~F & ~G)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can see that for the KB to be true, A, D, E have to be True and F and G have to be False.\n", + "Nothing can be said about B or C." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Coming back to our problem, note that `tt_entails()` takes an `Expr` which is a conjunction of clauses as the input instead of the `KB` itself. \n", + "You can use the `ask_if_true()` method of `PropKB` which does all the required conversions. \n", + "Let's check what `wumpus_kb` tells us about $P_{1, 1}$." + ] + }, + { + "cell_type": "code", + "execution_count": 28, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "(True, False)" + ] + }, + "execution_count": 28, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "wumpus_kb.ask_if_true(~P11), wumpus_kb.ask_if_true(P11)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Looking at Figure 7.9 we see that in all models in which the knowledge base is `True`, $P_{1, 1}$ is `False`. It makes sense that `ask_if_true()` returns `True` for $\\alpha = \\neg P_{1, 1}$ and `False` for $\\alpha = P_{1, 1}$. This begs the question, what if $\\alpha$ is `True` in only a portion of all models. Do we return `True` or `False`? This doesn't rule out the possibility of $\\alpha$ being `True` but it is not entailed by the `KB` so we return `False` in such cases. We can see this is the case for $P_{2, 2}$ and $P_{3, 1}$." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 29, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "(False, False)" + ] + }, + "execution_count": 29, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "wumpus_kb.ask_if_true(~P22), wumpus_kb.ask_if_true(P22)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Proof by Resolution\n", + "Recall that our goal is to check whether $\\text{KB} \\vDash \\alpha$ i.e. is $\\text{KB} \\implies \\alpha$ true in every model. Suppose we wanted to check if $P \\implies Q$ is valid. We check the satisfiability of $\\neg (P \\implies Q)$, which can be rewritten as $P \\land \\neg Q$. If $P \\land \\neg Q$ is unsatisfiable, then $P \\implies Q$ must be true in all models. This gives us the result \"$\\text{KB} \\vDash \\alpha$ if and only if $\\text{KB} \\land \\neg \\alpha$ is unsatisfiable\".
\n", + "This technique corresponds to proof by contradiction, a standard mathematical proof technique. We assume $\alpha$ to be false and show that this leads to a contradiction with known axioms in $\text{KB}$. We obtain a contradiction by making valid inferences using inference rules. In this proof we use a single inference rule, resolution, which states $(l_1 \lor \dots \lor l_k) \land (m_1 \lor \dots \lor m_n) \land (l_i \iff \neg m_j) \implies l_1 \lor \dots \lor l_{i - 1} \lor l_{i + 1} \lor \dots \lor l_k \lor m_1 \lor \dots \lor m_{j - 1} \lor m_{j + 1} \lor \dots \lor m_n$. Applying resolution yields a new clause, which we add to the KB. We keep doing this until:\n", + "\n", + "* There are no new clauses that can be added, in which case $\text{KB} \nvDash \alpha$.\n", + "* Two clauses resolve to yield the empty clause, in which case $\text{KB} \vDash \alpha$.\n", + "\n", + "The empty clause is equivalent to False because it arises only from resolving two complementary\n", + "unit clauses such as $P$ and $\neg P$, which is a contradiction, as both $P$ and $\neg P$ can't be True at the same time." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "There is one catch, however: the algorithm that implements proof by resolution cannot handle complex sentences. \n", + "Implications and bi-implications have to be simplified into simpler clauses. \n", + "We already know that *every sentence of propositional logic is logically equivalent to a conjunction of clauses*.\n", + "We will use this fact to our advantage and simplify the input sentence into the **conjunctive normal form** (CNF), which is a conjunction of disjunctions of literals.\n", + "For example:\n", + "
    \n", + "$$(A\\lor B)\\land (\\neg B\\lor C\\lor\\neg D)\\land (D\\lor\\neg E)$$\n", + "This is equivalent to the POS (Product of sums) form in digital electronics.\n", + "
    \n", + "Here's an outline of how the conversion is done:\n", + "1. Convert bi-implications to implications\n", + "
    \n", + "$\\alpha\\iff\\beta$ can be written as $(\\alpha\\implies\\beta)\\land(\\beta\\implies\\alpha)$\n", + "
    \n", + "This also applies to compound sentences\n", + "
    \n", + "$\\alpha\\iff(\\beta\\lor\\gamma)$ can be written as $(\\alpha\\implies(\\beta\\lor\\gamma))\\land((\\beta\\lor\\gamma)\\implies\\alpha)$\n", + "
    \n", + "2. Convert implications to their logical equivalents\n", + "
    \n", + "$\\alpha\\implies\\beta$ can be written as $\\neg\\alpha\\lor\\beta$\n", + "
    \n", + "3. Move negation inwards\n", + "
    \n", + "CNF requires atomic literals. Hence, negation cannot appear on a compound statement.\n", + "De Morgan's laws will be helpful here.\n", + "
    \n", + "$\\neg(\\alpha\\land\\beta)\\equiv(\\neg\\alpha\\lor\\neg\\beta)$\n", + "
    \n", + "$\\neg(\\alpha\\lor\\beta)\\equiv(\\neg\\alpha\\land\\neg\\beta)$\n", + "
    \n", + "4. Distribute disjunction over conjunction\n", + "
    \n", + "Disjunction and conjunction are distributive over each other.\n", + "Now that we only have conjunctions, disjunctions and negations in our expression, \n", + "we will distribute disjunctions over conjunctions wherever possible as this will give us a sentence which is a conjunction of simpler clauses, \n", + "which is what we wanted in the first place.\n", + "
    \n", + "We need a term of the form\n", + "
    \n", + "$(\\alpha_{1}\\lor\\alpha_{2}\\lor\\alpha_{3}...)\\land(\\beta_{1}\\lor\\beta_{2}\\lor\\beta_{3}...)\\land(\\gamma_{1}\\lor\\gamma_{2}\\lor\\gamma_{3}...)\\land...$\n", + "
    \n", + "
    \n", + "The `to_cnf` function executes this conversion using helper subroutines." + ] + }, + { + "cell_type": "code", + "execution_count": 30, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def to_cnf(s):\n",
    +       "    """Convert a propositional logical sentence to conjunctive normal form.\n",
    +       "    That is, to the form ((A | ~B | ...) & (B | C | ...) & ...) [p. 253]\n",
    +       "    >>> to_cnf('~(B | C)')\n",
    +       "    (~B & ~C)\n",
    +       "    """\n",
    +       "    s = expr(s)\n",
    +       "    if isinstance(s, str):\n",
    +       "        s = expr(s)\n",
    +       "    s = eliminate_implications(s)  # Steps 1, 2 from p. 253\n",
    +       "    s = move_not_inwards(s)  # Step 3\n",
    +       "    return distribute_and_over_or(s)  # Step 4\n",
    +       "
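As a sanity check on the four conversion steps, we can carry them out by hand for $A \iff (B \lor C)$ and verify the result by brute force. This standalone sketch does not use the library's `to_cnf`; the lambdas simply encode the original sentence and its hand-derived CNF.

```python
from itertools import product

# Hand-applied steps for A <=> (B | C):
# 1-2. eliminate implications: (~A | B | C) & (~(B | C) | A)
# 3.   move NOT inwards:       (~A | B | C) & ((~B & ~C) | A)
# 4.   distribute | over &:    (~A | B | C) & (A | ~B) & (A | ~C)
original = lambda a, b, c: a == (b or c)
cnf      = lambda a, b, c: ((not a) or b or c) and (a or not b) and (a or not c)

# Brute-force equivalence check over all 8 models confirms the conversion.
assert all(original(*m) == cnf(*m) for m in product([True, False], repeat=3))
print("CNF form is equivalent to the original sentence")
```

This mirrors the `to_cnf(A |'<=>'| (B & C))` style of check shown in the cells below, just with the equivalence verified model by model.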
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(to_cnf)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "`to_cnf` calls three subroutines.\n", + "
    \n", + "`eliminate_implications` converts bi-implications and implications to their logical equivalents.\n", + "
    \n", + "`move_not_inwards` removes negations from compound statements and moves them inwards using De Morgan's laws.\n", + "
    \n", + "`distribute_and_over_or` distributes disjunctions over conjunctions.\n", + "
    \n", + "Run the cell below for implementation details." + ] + }, + { + "cell_type": "code", + "execution_count": 31, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def eliminate_implications(s):\n",
    +       "    """Change implications into equivalent form with only &, |, and ~ as logical operators."""\n",
    +       "    s = expr(s)\n",
    +       "    if not s.args or is_symbol(s.op):\n",
    +       "        return s  # Atoms are unchanged.\n",
    +       "    args = list(map(eliminate_implications, s.args))\n",
    +       "    a, b = args[0], args[-1]\n",
    +       "    if s.op == '==>':\n",
    +       "        return b | ~a\n",
    +       "    elif s.op == '<==':\n",
    +       "        return a | ~b\n",
    +       "    elif s.op == '<=>':\n",
    +       "        return (a | ~b) & (b | ~a)\n",
    +       "    elif s.op == '^':\n",
    +       "        assert len(args) == 2  # TODO: relax this restriction\n",
    +       "        return (a & ~b) | (~a & b)\n",
    +       "    else:\n",
    +       "        assert s.op in ('&', '|', '~')\n",
    +       "        return Expr(s.op, *args)\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def move_not_inwards(s):\n",
    +       "    """Rewrite sentence s by moving negation sign inward.\n",
    +       "    >>> move_not_inwards(~(A | B))\n",
    +       "    (~A & ~B)"""\n",
    +       "    s = expr(s)\n",
    +       "    if s.op == '~':\n",
    +       "        def NOT(b):\n",
    +       "            return move_not_inwards(~b)\n",
    +       "        a = s.args[0]\n",
    +       "        if a.op == '~':\n",
    +       "            return move_not_inwards(a.args[0])  # ~~A ==> A\n",
    +       "        if a.op == '&':\n",
    +       "            return associate('|', list(map(NOT, a.args)))\n",
    +       "        if a.op == '|':\n",
    +       "            return associate('&', list(map(NOT, a.args)))\n",
    +       "        return s\n",
    +       "    elif is_symbol(s.op) or not s.args:\n",
    +       "        return s\n",
    +       "    else:\n",
    +       "        return Expr(s.op, *list(map(move_not_inwards, s.args)))\n",
    +       "
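The De Morgan push-down can also be sketched without the `Expr` class. In this hedged sketch, sentences are hypothetical nested tuples — `('~', x)`, `('&', a, b)`, `('|', a, b)`, or a bare symbol string — and only the case analysis mirrors `move_not_inwards`.

```python
def push_not_inwards(s):
    """Move negations inward over a tuple-encoded sentence using
    double-negation elimination and De Morgan's laws."""
    if isinstance(s, str):
        return s                               # bare symbol: unchanged
    op = s[0]
    if op == '~':
        a = s[1]
        if isinstance(a, str):
            return s                           # ~A is already a literal
        if a[0] == '~':
            return push_not_inwards(a[1])      # ~~A  ==>  A
        if a[0] == '&':                        # ~(A & B)  ==>  ~A | ~B
            return ('|',) + tuple(push_not_inwards(('~', x)) for x in a[1:])
        if a[0] == '|':                        # ~(A | B)  ==>  ~A & ~B
            return ('&',) + tuple(push_not_inwards(('~', x)) for x in a[1:])
    return (op,) + tuple(push_not_inwards(x) for x in s[1:])

print(push_not_inwards(('~', ('|', 'A', 'B'))))
# ('&', ('~', 'A'), ('~', 'B'))
```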
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def distribute_and_over_or(s):\n",
    +       "    """Given a sentence s consisting of conjunctions and disjunctions\n",
    +       "    of literals, return an equivalent sentence in CNF.\n",
    +       "    >>> distribute_and_over_or((A & B) | C)\n",
    +       "    ((A | C) & (B | C))\n",
    +       "    """\n",
    +       "    s = expr(s)\n",
    +       "    if s.op == '|':\n",
    +       "        s = associate('|', s.args)\n",
    +       "        if s.op != '|':\n",
    +       "            return distribute_and_over_or(s)\n",
    +       "        if len(s.args) == 0:\n",
    +       "            return False\n",
    +       "        if len(s.args) == 1:\n",
    +       "            return distribute_and_over_or(s.args[0])\n",
    +       "        conj = first(arg for arg in s.args if arg.op == '&')\n",
    +       "        if not conj:\n",
    +       "            return s\n",
    +       "        others = [a for a in s.args if a is not conj]\n",
    +       "        rest = associate('|', others)\n",
    +       "        return associate('&', [distribute_and_over_or(c | rest)\n",
    +       "                               for c in conj.args])\n",
    +       "    elif s.op == '&':\n",
    +       "        return associate('&', list(map(distribute_and_over_or, s.args)))\n",
    +       "    else:\n",
    +       "        return s\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(eliminate_implications)\n", + "psource(move_not_inwards)\n", + "psource(distribute_and_over_or)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's convert some sentences to see how it works\n" + ] + }, + { + "cell_type": "code", + "execution_count": 32, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "((A | ~B) & (B | ~A))" + ] + }, + "execution_count": 32, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "A, B, C, D = expr('A, B, C, D')\n", + "to_cnf(A |'<=>'| B)" + ] + }, + { + "cell_type": "code", + "execution_count": 33, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "((A | ~B | ~C) & (B | ~A) & (C | ~A))" + ] + }, + "execution_count": 33, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "to_cnf(A |'<=>'| (B & C))" + ] + }, + { + "cell_type": "code", + "execution_count": 34, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "(A & (C | B) & (D | B))" + ] + }, + "execution_count": 34, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "to_cnf(A & (B | (C & D)))" + ] + }, + { + "cell_type": "code", + "execution_count": 35, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "((B | ~A | C | ~D) & (A | ~A | C | ~D) & (B | ~B | C | ~D) & (A | ~B | C | ~D))" + ] + }, + "execution_count": 35, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "to_cnf((A |'<=>'| ~B) |'==>'| (C | ~D))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Coming back to our resolution problem, we can see how the `to_cnf` function is utilized here" + ] + }, + { + "cell_type": "code", + "execution_count": 36, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " 
Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def pl_resolution(KB, alpha):\n",
    +       "    """Propositional-logic resolution: say if alpha follows from KB. [Figure 7.12]"""\n",
    +       "    clauses = KB.clauses + conjuncts(to_cnf(~alpha))\n",
    +       "    new = set()\n",
    +       "    while True:\n",
    +       "        n = len(clauses)\n",
    +       "        pairs = [(clauses[i], clauses[j])\n",
    +       "                 for i in range(n) for j in range(i+1, n)]\n",
    +       "        for (ci, cj) in pairs:\n",
    +       "            resolvents = pl_resolve(ci, cj)\n",
    +       "            if False in resolvents:\n",
    +       "                return True\n",
    +       "            new = new.union(set(resolvents))\n",
    +       "        if new.issubset(set(clauses)):\n",
    +       "            return False\n",
    +       "        for c in new:\n",
    +       "            if c not in clauses:\n",
    +       "                clauses.append(c)\n",
    +       "
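The resolution step itself can be illustrated in isolation. In this standalone sketch, clauses are frozensets of string literals with `~` marking negation — a hypothetical representation, not the library's `Expr` clauses — and `resolve` plays the role of `pl_resolve`.

```python
def negate(lit):
    """Return the complementary literal: 'P' <-> '~P'."""
    return lit[1:] if lit.startswith('~') else '~' + lit

def resolve(ci, cj):
    """Return all clauses obtainable by resolving clause ci with clause cj
    on a pair of complementary literals."""
    resolvents = []
    for lit in ci:
        if negate(lit) in cj:
            resolvents.append((ci - {lit}) | (cj - {negate(lit)}))
    return resolvents

# Resolving P | Q with ~P yields Q.
print(resolve(frozenset({'P', 'Q'}), frozenset({'~P'})))  # [frozenset({'Q'})]

# Resolving the complementary unit clauses P and ~P yields the empty
# clause -- the contradiction that makes pl_resolution return True.
print(resolve(frozenset({'P'}), frozenset({'~P'})))       # [frozenset()]
```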
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(pl_resolution)" + ] + }, + { + "cell_type": "code", + "execution_count": 37, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "(True, False)" + ] + }, + "execution_count": 37, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "pl_resolution(wumpus_kb, ~P11), pl_resolution(wumpus_kb, P11)" + ] + }, + { + "cell_type": "code", + "execution_count": 38, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "(False, False)" + ] + }, + "execution_count": 38, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "pl_resolution(wumpus_kb, ~P22), pl_resolution(wumpus_kb, P22)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Forward and backward chaining\n", + "Previously, we said we will look at two algorithms to check if a sentence is entailed by the `KB`. Here's a third one. \n", + "The difference here is that our goal now is to determine if a knowledge base of definite clauses entails a single proposition symbol *q* - the query.\n", + "There is a catch however - the knowledge base can only contain **Horn clauses**.\n", + "
    \n", + "#### Horn Clauses\n", + "Horn clauses can be defined as a *disjunction* of *literals* with **at most** one positive literal. \n", + "
    \n", + "A Horn clause with exactly one positive literal is called a *definite clause*.\n", + "
    \n", + "A Horn clause might look like \n", + "
    \n", + "$\\neg a\\lor\\neg b\\lor\\neg c\\lor\\neg d... \\lor z$\n", + "
    \n", + "This, coincidentally, is also a definite clause.\n", + "
\n", + "Using De Morgan's laws and the definition of implication, the example above can be rewritten as \n", + "
    \n", + "$a\\land b\\land c\\land d ... \\implies z$\n", + "
    \n", + "This seems like a logical representation of how humans process known data and facts. \n", + "Assuming percepts `a`, `b`, `c`, `d` ... to be true simultaneously, we can infer `z` to also be true at that point in time. \n", + "There are some interesting aspects of Horn clauses that make algorithmic inference or *resolution* easier.\n", + "- Definite clauses can be written as implications:\n", + "
    \n", + "The most important simplification a definite clause provides is that it can be written as an implication.\n", + "The premise (or the knowledge that leads to the implication) is a conjunction of positive literals.\n", + "The conclusion (the implied statement) is also a positive literal.\n", + "The sentence thus becomes easier to understand.\n", + "The premise and the conclusion are conventionally called the *body* and the *head* respectively.\n", + "A single positive literal is called a *fact*.\n", + "- Forward chaining and backward chaining can be used for inference from Horn clauses:\n", + "
\n", + "Forward chaining is semantically identical to `AND-OR-Graph-Search` from the chapter on search algorithms.\n", + "Implementation details will be explained shortly.\n", + "- Deciding entailment with Horn clauses is linear in the size of the knowledge base:\n", + "
    \n", + "Surprisingly, the forward and backward chaining algorithms traverse each element of the knowledge base at most once, greatly simplifying the problem.\n", + "
    \n", + "
    \n", + "The function `pl_fc_entails` implements forward chaining to see if a knowledge base `KB` entails a symbol `q`.\n", + "
    \n", + "Before we proceed further, note that `pl_fc_entails` doesn't use an ordinary `KB` instance. \n", + "The knowledge base here is an instance of the `PropDefiniteKB` class, derived from the `PropKB` class, \n", + "but modified to store definite clauses.\n", + "
    \n", + "The main point of difference arises in the inclusion of a helper method to `PropDefiniteKB` that returns a list of clauses in KB that have a given symbol `p` in their premise." + ] + }, + { + "cell_type": "code", + "execution_count": 39, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
        def clauses_with_premise(self, p):\n",
    +       "        """Return a list of the clauses in KB that have p in their premise.\n",
    +       "        This could be cached away for O(1) speed, but we'll recompute it."""\n",
    +       "        return [c for c in self.clauses\n",
    +       "                if c.op == '==>' and p in conjuncts(c.args[0])]\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(PropDefiniteKB.clauses_with_premise)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's now have a look at the `pl_fc_entails` algorithm." + ] + }, + { + "cell_type": "code", + "execution_count": 40, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def pl_fc_entails(KB, q):\n",
    +       "    """Use forward chaining to see if a PropDefiniteKB entails symbol q.\n",
    +       "    [Figure 7.15]\n",
    +       "    >>> pl_fc_entails(horn_clauses_KB, expr('Q'))\n",
    +       "    True\n",
    +       "    """\n",
    +       "    count = {c: len(conjuncts(c.args[0]))\n",
    +       "             for c in KB.clauses\n",
    +       "             if c.op == '==>'}\n",
    +       "    inferred = defaultdict(bool)\n",
    +       "    agenda = [s for s in KB.clauses if is_prop_symbol(s.op)]\n",
    +       "    while agenda:\n",
    +       "        p = agenda.pop()\n",
    +       "        if p == q:\n",
    +       "            return True\n",
    +       "        if not inferred[p]:\n",
    +       "            inferred[p] = True\n",
    +       "            for c in KB.clauses_with_premise(p):\n",
    +       "                count[c] -= 1\n",
    +       "                if count[c] == 0:\n",
    +       "                    agenda.append(c.args[1])\n",
    +       "    return False\n",
    +       "
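The same forward-chaining bookkeeping can be sketched without the `PropDefiniteKB` machinery. In this standalone sketch, definite clauses are hypothetical `(premises, conclusion)` pairs, facts are bare symbol strings, and `fc_entails` reproduces the count/inferred/agenda logic of `pl_fc_entails`.

```python
from collections import defaultdict

def fc_entails(rules, facts, q):
    """Forward chaining over definite clauses given as (premises, conclusion)
    pairs; returns True iff symbol q is entailed."""
    count = {i: len(prem) for i, (prem, _) in enumerate(rules)}
    inferred = defaultdict(bool)
    agenda = list(facts)
    while agenda:
        p = agenda.pop()
        if p == q:
            return True
        if not inferred[p]:
            inferred[p] = True
            for i, (prem, concl) in enumerate(rules):
                if p in prem:
                    count[i] -= 1
                    if count[i] == 0:        # all premises known true
                        agenda.append(concl)
    return False

# The same example clause set used with definite_clauses_KB below.
rules = [(['B', 'F'], 'E'), (['A', 'E', 'F'], 'G'), (['B', 'C'], 'F'),
         (['A', 'B'], 'D'), (['E', 'F'], 'H'), (['H', 'I'], 'J')]
facts = ['A', 'B', 'C']
print(fc_entails(rules, facts, 'G'))  # True
print(fc_entails(rules, facts, 'J'))  # False
```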
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(pl_fc_entails)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The function accepts a knowledge base `KB` (an instance of `PropDefiniteKB`) and a query `q` as inputs.\n", + "
    \n", + "
    \n", + "`count` initially stores the number of symbols in the premise of each sentence in the knowledge base.\n", + "
    \n", + "The `conjuncts` helper function separates a given sentence at conjunctions.\n", + "
\n", + "`inferred` is initialized as a *boolean* defaultdict. \n", + "It is used to mark symbols that have already been processed, so that each symbol is handled at most once.\n", + "
    \n", + "`agenda` initially stores a list of clauses that the knowledge base knows to be true.\n", + "The `is_prop_symbol` helper function checks if the given symbol is a valid propositional logic symbol.\n", + "
    \n", + "
    \n", + "We now iterate through `agenda`, popping a symbol `p` on each iteration.\n", + "If the query `q` is the same as `p`, we know that entailment holds.\n", + "
\n", + "Otherwise, `p` is marked as inferred, and `count` is reduced by one for each implication that has `p` in its premise.\n", + "A conclusion is added to the agenda when its `count` reaches zero, since at that point all the premises of that implication are known to be true.\n", + "
\n", + "`clauses_with_premise` is a helper method of the `PropDefiniteKB` class.\n", + "It returns a list of clauses in the knowledge base that have `p` in their premise.\n", + "
    \n", + "
    \n", + "Now that we have an idea of how this function works, let's see a few examples of its usage, but we first need to define our knowledge base. We assume we know the following clauses to be true." + ] + }, + { + "cell_type": "code", + "execution_count": 41, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "clauses = ['(B & F)==>E', \n", + " '(A & E & F)==>G', \n", + " '(B & C)==>F', \n", + " '(A & B)==>D', \n", + " '(E & F)==>H', \n", + " '(H & I)==>J',\n", + " 'A', \n", + " 'B', \n", + " 'C']" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We will now `tell` this information to our knowledge base." + ] + }, + { + "cell_type": "code", + "execution_count": 42, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "definite_clauses_KB = PropDefiniteKB()\n", + "for clause in clauses:\n", + " definite_clauses_KB.tell(expr(clause))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can now check if our knowledge base entails the following queries." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 43, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "True" + ] + }, + "execution_count": 43, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "pl_fc_entails(definite_clauses_KB, expr('G'))" + ] + }, + { + "cell_type": "code", + "execution_count": 44, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "True" + ] + }, + "execution_count": 44, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "pl_fc_entails(definite_clauses_KB, expr('H'))" + ] + }, + { + "cell_type": "code", + "execution_count": 45, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "False" + ] + }, + "execution_count": 45, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "pl_fc_entails(definite_clauses_KB, expr('I'))" + ] + }, + { + "cell_type": "code", + "execution_count": 46, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "False" + ] + }, + "execution_count": 46, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "pl_fc_entails(definite_clauses_KB, expr('J'))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Effective Propositional Model Checking\n", + "\n", + "The previous segments elucidate the algorithmic procedure for model checking. \n", + "In this segment, we look at ways of making them computationally efficient.\n", + "
\n", + "The problem we are trying to solve is conventionally called the _propositional satisfiability problem_, abbreviated as the _SAT_ problem.\n", + "In layman's terms, a Boolean formula is called satisfiable if there exists a model that satisfies it.\n", + "
    \n", + "The SAT problem was the first problem to be proven _NP-complete_.\n", + "The main characteristics of an NP-complete problem are:\n", + "- Given a solution to such a problem, it is easy to verify if the solution solves the problem.\n", + "- The time required to actually solve the problem using any known algorithm increases exponentially with respect to the size of the problem.\n", + "
    \n", + "
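The first property is easy to demonstrate concretely: given a candidate model, a CNF formula can be checked in a single linear pass. A minimal sketch, representing each clause as a list of `(symbol, positive?)` literal pairs; this representation is chosen here for illustration and is not the one used by `logic.py`:

```python
def verify(clauses, model):
    # CNF holds iff every clause contains at least one literal that is
    # true in the model; this takes time linear in the size of the formula.
    return all(any(model[sym] == positive for sym, positive in clause)
               for clause in clauses)

# (P | ~Q) & (Q | R)
clauses = [[('P', True), ('Q', False)], [('Q', True), ('R', True)]]
verify(clauses, {'P': True, 'Q': False, 'R': True})   # True
verify(clauses, {'P': False, 'Q': True, 'R': False})  # False
```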
\n", + "Due to these properties, heuristic and approximation methods are often applied to find solutions to these problems.\n", + "
\n", + "It is extremely important to be able to solve large-scale SAT problems efficiently because \n", + "many combinatorial problems in computer science can be conveniently reduced to checking the satisfiability of a propositional sentence under some constraints.\n", + "
\n", + "We will introduce two new algorithms that perform propositional model checking in a computationally efficient way.\n", + "
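For contrast, the naive model-checking baseline simply enumerates all $2^n$ assignments; this is the exponential blow-up the algorithms below try to avoid. A hedged sketch, with each clause represented as a Python predicate over the model (an illustrative convention, not the `Expr`-based one in `logic.py`):

```python
from itertools import product

def tt_satisfiable(symbols, clauses):
    # Brute force: try every one of the 2**len(symbols) assignments.
    for values in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if all(clause(model) for clause in clauses):
            return model
    return None

# A & ~B  (satisfiable)
tt_satisfiable(['A', 'B'], [lambda m: m['A'], lambda m: not m['B']])
# -> {'A': True, 'B': False}
```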
    \n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### 1. DPLL (Davis-Putnam-Logeman-Loveland) algorithm\n", + "This algorithm is very similar to Backtracking-Search.\n", + "It recursively enumerates possible models in a depth-first fashion with the following improvements over algorithms like `tt_entails`:\n", + "1. Early termination:\n", + "
    \n", + "In certain cases, the algorithm can detect the truth value of a statement using just a partially completed model.\n", + "For example, $(P\\lor Q)\\land(P\\lor R)$ is true if P is true, regardless of other variables.\n", + "This reduces the search space significantly.\n", + "2. Pure symbol heuristic:\n", + "
\n", + "A symbol that has the same sign (positive or negative) in all clauses is called a _pure symbol_.\n", + "It isn't difficult to see that if a sentence is satisfiable, it has a model in which each pure symbol is assigned such that the clauses containing it become _true_.\n", + "For example, $(P\lor\neg Q)\land(\neg Q\lor\neg R)\land(R\lor P)$ has P and Q as pure symbols,\n", + "so the heuristic can safely assign _true_ to P and _false_ to Q, satisfying every clause they appear in.\n", + "The pure symbol heuristic thus simplifies the problem a bit.\n", + "3. Unit clause heuristic:\n", + "
\n", + "In the context of DPLL, clauses with just one literal, as well as clauses in which all literals but one are already _false_ in the current model, are called unit clauses.\n", + "A unit clause can only be satisfied by assigning the one value that makes its remaining literal true.\n", + "We have no other choice.\n", + "
    \n", + "Assigning one unit clause can create another unit clause.\n", + "For example, when P is false, $(P\\lor Q)$ becomes a unit clause, causing _true_ to be assigned to Q.\n", + "A series of forced assignments derived from previous unit clauses is called _unit propagation_.\n", + "In this way, this heuristic simplifies the problem further.\n", + "
\n", + "The algorithm often employs other tricks to scale up to large problems.\n", + "However, these tricks are currently beyond the scope of this notebook. Refer to section 7.6 of the book for more details.\n", + "
    \n", + "
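The pure-symbol and unit-clause checks described above can each be sketched in a few lines. The toy versions below work over `(symbol, positive?)` literal-pair clauses, a representation chosen purely for illustration; `logic.py`'s own `find_pure_symbol` and `find_unit_clause` operate on `Expr` objects instead:

```python
def find_pure_symbol(symbols, clauses):
    # A symbol is pure if it occurs with only one sign across all clauses.
    for s in symbols:
        signs = {positive for clause in clauses
                 for sym, positive in clause if sym == s}
        if len(signs) == 1:
            return s, signs.pop()
    return None, None

def find_unit_clause(clauses, model):
    # A clause is a unit clause if exactly one literal is unassigned and
    # every assigned literal in it is false; return the forced assignment.
    for clause in clauses:
        unassigned = [(sym, pos) for sym, pos in clause if sym not in model]
        falsified = [(sym, pos) for sym, pos in clause
                     if sym in model and model[sym] != pos]
        if len(unassigned) == 1 and len(falsified) == len(clause) - 1:
            return unassigned[0]  # (symbol, value that makes it true)
    return None, None

# (P | ~Q) & (~Q | ~R) & (R | P)
clauses = [[('P', True), ('Q', False)],
           [('Q', False), ('R', False)],
           [('R', True), ('P', True)]]
find_pure_symbol(['P', 'Q', 'R'], clauses)  # ('P', True)
find_unit_clause(clauses, {'P': False})     # ('Q', False), forced by (P | ~Q)
```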
    \n", + "Let's have a look at the algorithm." + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def dpll(clauses, symbols, model):\n",
    +       "    """See if the clauses are true in a partial model."""\n",
    +       "    unknown_clauses = []  # clauses with an unknown truth value\n",
    +       "    for c in clauses:\n",
    +       "        val = pl_true(c, model)\n",
    +       "        if val is False:\n",
    +       "            return False\n",
    +       "        if val is not True:\n",
    +       "            unknown_clauses.append(c)\n",
    +       "    if not unknown_clauses:\n",
    +       "        return model\n",
    +       "    P, value = find_pure_symbol(symbols, unknown_clauses)\n",
    +       "    if P:\n",
    +       "        return dpll(clauses, removeall(P, symbols), extend(model, P, value))\n",
    +       "    P, value = find_unit_clause(clauses, model)\n",
    +       "    if P:\n",
    +       "        return dpll(clauses, removeall(P, symbols), extend(model, P, value))\n",
    +       "    if not symbols:\n",
    +       "        raise TypeError("Argument should be of the type Expr.")\n",
    +       "    P, symbols = symbols[0], symbols[1:]\n",
    +       "    return (dpll(clauses, symbols, extend(model, P, True)) or\n",
    +       "            dpll(clauses, symbols, extend(model, P, False)))\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(dpll)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The algorithm uses the ideas described above to check satisfiability of a sentence in propositional logic.\n", + "It recursively calls itself, simplifying the problem at each step. It also uses helper functions `find_pure_symbol` and `find_unit_clause` to carry out steps 2 and 3 above.\n", + "
    \n", + "The `dpll_satisfiable` helper function converts the input clauses to _conjunctive normal form_ and calls the `dpll` function with the correct parameters." + ] + }, + { + "cell_type": "code", + "execution_count": 48, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def dpll_satisfiable(s):\n",
    +       "    """Check satisfiability of a propositional sentence.\n",
    +       "    This differs from the book code in two ways: (1) it returns a model\n",
    +       "    rather than True when it succeeds; this is more useful. (2) The\n",
    +       "    function find_pure_symbol is passed a list of unknown clauses, rather\n",
    +       "    than a list of all clauses and the model; this is more efficient."""\n",
    +       "    clauses = conjuncts(to_cnf(s))\n",
    +       "    symbols = list(prop_symbols(s))\n",
    +       "    return dpll(clauses, symbols, {})\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(dpll_satisfiable)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's see a few examples of usage." + ] + }, + { + "cell_type": "code", + "execution_count": 49, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "A, B, C, D = expr('A, B, C, D')" + ] + }, + { + "cell_type": "code", + "execution_count": 50, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{A: True, B: True, C: False, D: True}" + ] + }, + "execution_count": 50, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "dpll_satisfiable(A & B & ~C & D)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This is a simple case to highlight that the algorithm actually works." + ] + }, + { + "cell_type": "code", + "execution_count": 51, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{B: True, C: True, D: False}" + ] + }, + "execution_count": 51, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "dpll_satisfiable((A & B) | (C & ~A) | (B & ~D))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "If a particular symbol isn't present in the solution, \n", + "it means that the solution is independent of the value of that symbol.\n", + "In this case, the solution is independent of A." 
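We can spot-check this independence claim directly: with B, C and D fixed to the values in the model above, the sentence evaluates to true for either value of A. Plain Python booleans are used here purely for illustration:

```python
# (A & B) | (C & ~A) | (B & ~D) as a plain boolean function
f = lambda A, B, C, D: (A and B) or (C and not A) or (B and not D)

# With B=True, C=True, D=False from the model, A can be anything:
all(f(A, True, True, False) for A in (True, False))  # True
```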
+ ] + }, + { + "cell_type": "code", + "execution_count": 52, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{A: True, B: True}" + ] + }, + "execution_count": 52, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "dpll_satisfiable(A |'<=>'| B)" + ] + }, + { + "cell_type": "code", + "execution_count": 53, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{A: False, B: True, C: True}" + ] + }, + "execution_count": 53, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "dpll_satisfiable((A |'<=>'| B) |'==>'| (C & ~A))" + ] + }, + { + "cell_type": "code", + "execution_count": 54, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{B: True, C: True}" + ] + }, + "execution_count": 54, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "dpll_satisfiable((A | (B & C)) |'<=>'| ((A | B) & (A | C)))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### 2. WalkSAT algorithm\n", + "This algorithm is very similar to Hill climbing.\n", + "On every iteration, the algorithm picks an unsatisfied clause and flips a symbol in the clause.\n", + "This is similar to finding a neighboring state in the `hill_climbing` algorithm.\n", + "
\n", + "The symbol to be flipped is decided by an evaluation function that counts the number of unsatisfied clauses.\n", + "Sometimes, symbols are also flipped randomly to avoid local optima; a subtle balance between greediness and randomness is required. Alternatively, some versions of the algorithm restart with a completely new random assignment if no solution has been found for too long, as a way of escaping local minima in the number of unsatisfied clauses.\n", + "
    \n", + "
    \n", + "Let's have a look at the algorithm." + ] + }, + { + "cell_type": "code", + "execution_count": 55, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def WalkSAT(clauses, p=0.5, max_flips=10000):\n",
    +       "    """Checks for satisfiability of all clauses by randomly flipping values of variables\n",
    +       "    """\n",
    +       "    # Set of all symbols in all clauses\n",
    +       "    symbols = {sym for clause in clauses for sym in prop_symbols(clause)}\n",
    +       "    # model is a random assignment of true/false to the symbols in clauses\n",
    +       "    model = {s: random.choice([True, False]) for s in symbols}\n",
    +       "    for i in range(max_flips):\n",
    +       "        satisfied, unsatisfied = [], []\n",
    +       "        for clause in clauses:\n",
    +       "            (satisfied if pl_true(clause, model) else unsatisfied).append(clause)\n",
    +       "        if not unsatisfied:  # if model satisfies all the clauses\n",
    +       "            return model\n",
    +       "        clause = random.choice(unsatisfied)\n",
    +       "        if probability(p):\n",
    +       "            sym = random.choice(list(prop_symbols(clause)))\n",
    +       "        else:\n",
    +       "            # Flip the symbol in clause that maximizes number of sat. clauses\n",
    +       "            def sat_count(sym):\n",
+       "                # Return the number of clauses satisfied after flipping the symbol.\n",
    +       "                model[sym] = not model[sym]\n",
    +       "                count = len([clause for clause in clauses if pl_true(clause, model)])\n",
    +       "                model[sym] = not model[sym]\n",
    +       "                return count\n",
    +       "            sym = argmax(prop_symbols(clause), key=sat_count)\n",
    +       "        model[sym] = not model[sym]\n",
    +       "    # If no solution is found within the flip limit, we return failure\n",
    +       "    return None\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(WalkSAT)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The function takes three arguments:\n", + "
    \n", + "1. The `clauses` we want to satisfy.\n", + "
    \n", + "2. The probability `p` of randomly changing a symbol.\n", + "
    \n", + "3. The maximum number of flips (`max_flips`) the algorithm will run for. If the clauses are still unsatisfied, the algorithm returns `None` to denote failure.\n", + "
    \n", + "The algorithm is identical in concept to Hill climbing and the code isn't difficult to understand.\n", + "
    \n", + "
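As mentioned earlier, one common variant adds random restarts. A hedged sketch of such a wrapper; `solve_with_restarts` is a hypothetical helper (not part of `logic.py`) that assumes any WalkSAT-style solver which starts each run from its own fresh random assignment and returns a model or `None`:

```python
def solve_with_restarts(solver, clauses, restarts=10, **kwargs):
    # Re-run the stochastic solver up to `restarts` times; each run starts
    # from a new random assignment, so failed runs are independent attempts.
    for _ in range(restarts):
        model = solver(clauses, **kwargs)
        if model is not None:
            return model
    return None
```

For example, one could call `solve_with_restarts(WalkSAT, clauses, restarts=5, max_flips=1000)` to give the search five independent chances to escape a bad starting assignment.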
    \n", + "Let's see a few examples of usage." + ] + }, + { + "cell_type": "code", + "execution_count": 56, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "A, B, C, D = expr('A, B, C, D')" + ] + }, + { + "cell_type": "code", + "execution_count": 57, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{A: True, B: True, C: False, D: True}" + ] + }, + "execution_count": 57, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "WalkSAT([A, B, ~C, D], 0.5, 100)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This is a simple case to show that the algorithm converges." + ] + }, + { + "cell_type": "code", + "execution_count": 58, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{A: True, B: True, C: True}" + ] + }, + "execution_count": 58, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "WalkSAT([A & B, A & C], 0.5, 100)" + ] + }, + { + "cell_type": "code", + "execution_count": 59, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{A: True, B: True, C: True, D: True}" + ] + }, + "execution_count": 59, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "WalkSAT([A & B, C & D, C & B], 0.5, 100)" + ] + }, + { + "cell_type": "code", + "execution_count": 60, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "WalkSAT([A & B, C | D, ~(D | B)], 0.5, 1000)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This one doesn't give any output because WalkSAT did not find any model where these clauses hold. We can solve these clauses to see that they together form a contradiction and hence, it isn't supposed to have a solution." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "One point of difference between this algorithm and the `dpll_satisfiable` algorithms is that both these algorithms take inputs differently. 
\n", + "For WalkSAT to take complete sentences as input, \n", + "we can write a helper function that converts the input sentence into conjunctive normal form and then calls WalkSAT with the list of conjuncts of the CNF form of the sentence." ] }, { "cell_type": "code", - "execution_count": 21, + "execution_count": 61, "metadata": { "collapsed": true }, "outputs": [], "source": [ - "%psource tt_check_all" + "def WalkSAT_CNF(sentence, p=0.5, max_flips=10000):\n", + " return WalkSAT(conjuncts(to_cnf(sentence)), p, max_flips)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Note that `tt_entails()` takes an `Expr` which is a conjunction of clauses as the input instead of the `KB` itself. You can use the `ask_if_true()` method of `PropKB` which does all the required conversions. Let's check what `wumpus_kb` tells us about $P_{1, 1}$." + "Now we can call `WalkSAT_CNF` and `dpll_satisfiable` with the same arguments." ] }, { "cell_type": "code", - "execution_count": 22, + "execution_count": 62, "metadata": {}, "outputs": [ { "data": { "text/plain": [ - "(True, False)" + "{A: True, B: True, C: False, D: True}" ] }, - "execution_count": 22, + "execution_count": 62, "metadata": {}, "output_type": "execute_result" } ], "source": [ - "wumpus_kb.ask_if_true(~P11), wumpus_kb.ask_if_true(P11)" + "WalkSAT_CNF((A & B) | (C & ~A) | (B & ~D), 0.5, 1000)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Looking at Figure 7.9 we see that in all models in which the knowledge base is `True`, $P_{1, 1}$ is `False`. It makes sense that `ask_if_true()` returns `True` for $\alpha = \neg P_{1, 1}$ and `False` for $\alpha = P_{1, 1}$. This begs the question, what if $\alpha$ is `True` in only a portion of all models. Do we return `True` or `False`? This doesn't rule out the possibility of $\alpha$ being `True` but it is not entailed by the `KB` so we return `False` in such cases. We can see this is the case for $P_{2, 2}$ and $P_{3, 1}$." + "It works!\n", + "
    \n", + "Notice that the solution generated by WalkSAT doesn't omit variables that the sentence doesn't depend upon. \n", + "If the sentence is independent of a particular variable, the solution contains a random value for that variable because of the stochastic nature of the algorithm.\n", + "
    \n", + "
    \n", + "Let's compare the runtime of WalkSAT and DPLL for a few cases. We will use the `%%timeit` magic to do this." ] }, { "cell_type": "code", - "execution_count": 23, + "execution_count": 63, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "sentence_1 = A |'<=>'| B\n", + "sentence_2 = (A & B) | (C & ~A) | (B & ~D)\n", + "sentence_3 = (A | (B & C)) |'<=>'| ((A | B) & (A | C))" + ] + }, + { + "cell_type": "code", + "execution_count": 64, "metadata": {}, "outputs": [ { - "data": { - "text/plain": [ - "(False, False)" - ] - }, - "execution_count": 23, - "metadata": {}, - "output_type": "execute_result" + "name": "stdout", + "output_type": "stream", + "text": [ + "1.55 ms ± 64.6 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)\n" + ] } ], "source": [ - "wumpus_kb.ask_if_true(~P22), wumpus_kb.ask_if_true(P22)" + "%%timeit\n", + "dpll_satisfiable(sentence_1)\n", + "dpll_satisfiable(sentence_2)\n", + "dpll_satisfiable(sentence_3)" + ] + }, + { + "cell_type": "code", + "execution_count": 65, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "1.02 ms ± 6.92 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)\n" + ] + } + ], + "source": [ + "%%timeit\n", + "WalkSAT_CNF(sentence_1)\n", + "WalkSAT_CNF(sentence_2)\n", + "WalkSAT_CNF(sentence_3)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Proof by Resolution\n", - "Recall that our goal is to check whether $\\text{KB} \\vDash \\alpha$ i.e. is $\\text{KB} \\implies \\alpha$ true in every model. Suppose we wanted to check if $P \\implies Q$ is valid. We check the satisfiability of $\\neg (P \\implies Q)$, which can be rewritten as $P \\land \\neg Q$. If $P \\land \\neg Q$ is unsatisfiable, then $P \\implies Q$ must be true in all models. This gives us the result \"$\\text{KB} \\vDash \\alpha$ if and only if $\\text{KB} \\land \\neg \\alpha$ is unsatisfiable\".
    \n", - "This technique corresponds to proof by contradiction, a standard mathematical proof technique. We assume $\\alpha$ to be false and show that this leads to a contradiction with known axioms in $\\text{KB}$. We obtain a contradiction by making valid inferences using inference rules. In this proof we use a single inference rule, resolution which states $(l_1 \\lor \\dots \\lor l_k) \\land (m_1 \\lor \\dots \\lor m_n) \\land (l_i \\iff \\neg m_j) \\implies l_1 \\lor \\dots \\lor l_{i - 1} \\lor l_{i + 1} \\lor \\dots \\lor l_k \\lor m_1 \\lor \\dots \\lor m_{j - 1} \\lor m_{j + 1} \\lor \\dots \\lor m_n$. Applying the resolution yeilds us a clause which we add to the KB. We keep doing this until:\n", - "
      \n", - "
    • There are no new clauses that can be added, in which case $\\text{KB} \\nvDash \\alpha$.
    • \n", - "
    • Two clauses resolve to yield the empty clause, in which case $\\text{KB} \\vDash \\alpha$.
    • \n", - "
    \n", - "The empty clause is equivalent to False because it arises only from resolving two complementary\n", - "unit clauses such as $P$ and $\\neg P$ which is a contradiction as both $P$ and $\\neg P$ can't be True at the same time." + "On an average, for solvable cases, `WalkSAT` is quite faster than `dpll` because, for a small number of variables, \n", + "`WalkSAT` can reduce the search space significantly. \n", + "Results can be different for sentences with more symbols though.\n", + "Feel free to play around with this to understand the trade-offs of these algorithms better." ] }, { - "cell_type": "code", - "execution_count": 24, - "metadata": { - "collapsed": true - }, - "outputs": [], + "cell_type": "markdown", + "metadata": {}, "source": [ - "%psource pl_resolution" + "### SATPlan" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In this section we show how to make plans by logical inference. The basic idea is very simple. It includes the following three steps:\n", + "1. Constuct a sentence that includes:\n", + " 1. A colection of assertions about the initial state.\n", + " 2. The successor-state axioms for all the possible actions at each time up to some maximum time t.\n", + " 3. The assertion that the goal is achieved at time t.\n", + "2. Present the whole sentence to a SAT solver.\n", + "3. Assuming a model is found, extract from the model those variables that represent actions and are assigned true. Together they represent a plan to achieve the goals.\n", + "\n", + "\n", + "Lets have a look at the algorithm" ] }, { "cell_type": "code", - "execution_count": 25, + "execution_count": 66, "metadata": {}, "outputs": [ { "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def SAT_plan(init, transition, goal, t_max, SAT_solver=dpll_satisfiable):\n",
    +       "    """Converts a planning problem to Satisfaction problem by translating it to a cnf sentence.\n",
    +       "    [Figure 7.22]"""\n",
    +       "\n",
    +       "    # Functions used by SAT_plan\n",
    +       "    def translate_to_SAT(init, transition, goal, time):\n",
    +       "        clauses = []\n",
    +       "        states = [state for state in transition]\n",
    +       "\n",
    +       "        # Symbol claiming state s at time t\n",
    +       "        state_counter = itertools.count()\n",
    +       "        for s in states:\n",
    +       "            for t in range(time+1):\n",
    +       "                state_sym[s, t] = Expr("State_{}".format(next(state_counter)))\n",
    +       "\n",
    +       "        # Add initial state axiom\n",
    +       "        clauses.append(state_sym[init, 0])\n",
    +       "\n",
    +       "        # Add goal state axiom\n",
    +       "        clauses.append(state_sym[goal, time])\n",
    +       "\n",
    +       "        # All possible transitions\n",
    +       "        transition_counter = itertools.count()\n",
    +       "        for s in states:\n",
    +       "            for action in transition[s]:\n",
    +       "                s_ = transition[s][action]\n",
    +       "                for t in range(time):\n",
    +       "                    # Action 'action' taken from state 's' at time 't' to reach 's_'\n",
    +       "                    action_sym[s, action, t] = Expr(\n",
    +       "                        "Transition_{}".format(next(transition_counter)))\n",
    +       "\n",
    +       "                    # Change the state from s to s_\n",
    +       "                    clauses.append(action_sym[s, action, t] |'==>'| state_sym[s, t])\n",
    +       "                    clauses.append(action_sym[s, action, t] |'==>'| state_sym[s_, t + 1])\n",
    +       "\n",
    +       "        # Allow only one state at any time\n",
    +       "        for t in range(time+1):\n",
    +       "            # must be a state at any time\n",
    +       "            clauses.append(associate('|', [state_sym[s, t] for s in states]))\n",
    +       "\n",
    +       "            for s in states:\n",
    +       "                for s_ in states[states.index(s) + 1:]:\n",
    +       "                    # for each pair of states s, s_ only one is possible at time t\n",
    +       "                    clauses.append((~state_sym[s, t]) | (~state_sym[s_, t]))\n",
    +       "\n",
    +       "        # Restrict to one transition per timestep\n",
    +       "        for t in range(time):\n",
    +       "            # list of possible transitions at time t\n",
    +       "            transitions_t = [tr for tr in action_sym if tr[2] == t]\n",
    +       "\n",
    +       "            # make sure at least one of the transitions happens\n",
    +       "            clauses.append(associate('|', [action_sym[tr] for tr in transitions_t]))\n",
    +       "\n",
    +       "            for tr in transitions_t:\n",
    +       "                for tr_ in transitions_t[transitions_t.index(tr) + 1:]:\n",
    +       "                    # there cannot be two transitions tr and tr_ at time t\n",
    +       "                    clauses.append(~action_sym[tr] | ~action_sym[tr_])\n",
    +       "\n",
    +       "        # Combine the clauses to form the cnf\n",
    +       "        return associate('&', clauses)\n",
    +       "\n",
    +       "    def extract_solution(model):\n",
    +       "        true_transitions = [t for t in action_sym if model[action_sym[t]]]\n",
    +       "        # Sort transitions based on time, which is the 3rd element of the tuple\n",
    +       "        true_transitions.sort(key=lambda x: x[2])\n",
    +       "        return [action for s, action, time in true_transitions]\n",
    +       "\n",
    +       "    # Body of SAT_plan algorithm\n",
    +       "    for t in range(t_max):\n",
    +       "        # dictionaries to help extract the solution from model\n",
    +       "        state_sym = {}\n",
    +       "        action_sym = {}\n",
    +       "\n",
    +       "        cnf = translate_to_SAT(init, transition, goal, t)\n",
    +       "        model = SAT_solver(cnf)\n",
    +       "        if model is not False:\n",
    +       "            return extract_solution(model)\n",
    +       "    return None\n",
    +       "
    \n", + "\n", + "\n" + ], "text/plain": [ - "(True, False)" + "" ] }, - "execution_count": 25, "metadata": {}, - "output_type": "execute_result" + "output_type": "display_data" } ], "source": [ - "pl_resolution(wumpus_kb, ~P11), pl_resolution(wumpus_kb, P11)" + "psource(SAT_plan)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's see few examples of its usage. First we define a transition and then call `SAT_plan`." ] }, { "cell_type": "code", - "execution_count": 26, + "execution_count": 67, "metadata": {}, "outputs": [ { - "data": { - "text/plain": [ - "(False, False)" - ] - }, - "execution_count": 26, - "metadata": {}, - "output_type": "execute_result" + "name": "stdout", + "output_type": "stream", + "text": [ + "None\n", + "['Right']\n", + "['Left', 'Left']\n" + ] } ], "source": [ - "pl_resolution(wumpus_kb, ~P22), pl_resolution(wumpus_kb, P22)" + "transition = {'A': {'Left': 'A', 'Right': 'B'},\n", + " 'B': {'Left': 'A', 'Right': 'C'},\n", + " 'C': {'Left': 'B', 'Right': 'C'}}\n", + "\n", + "\n", + "print(SAT_plan('A', transition, 'C', 2)) \n", + "print(SAT_plan('A', transition, 'B', 3))\n", + "print(SAT_plan('C', transition, 'A', 3))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let us do the same for another transition." + ] + }, + { + "cell_type": "code", + "execution_count": 68, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "['Right', 'Down']\n" + ] + } + ], + "source": [ + "transition = {(0, 0): {'Right': (0, 1), 'Down': (1, 0)},\n", + " (0, 1): {'Left': (1, 0), 'Down': (1, 1)},\n", + " (1, 0): {'Right': (1, 0), 'Up': (1, 0), 'Left': (1, 0), 'Down': (1, 0)},\n", + " (1, 1): {'Left': (1, 0), 'Up': (0, 1)}}\n", + "\n", + "\n", + "print(SAT_plan((0, 0), transition, (1, 1), 4))" ] }, { @@ -697,12 +3623,12 @@ "## Criminal KB\n", "In this section we create a `FolKB` based on the following paragraph.
    \n", "The law says that it is a crime for an American to sell weapons to hostile nations. The country Nono, an enemy of America, has some missiles, and all of its missiles were sold to it by Colonel West, who is American.
    \n", - "The first step is to extract the facts and convert them into first-order definite clauses. Extracting the facts from data alone is a challenging task. Fortnately we have a small paragraph and can do extraction and conversion manually. We'll store the clauses in list aptly named `clauses`." + "The first step is to extract the facts and convert them into first-order definite clauses. Extracting the facts from data alone is a challenging task. Fortunately, we have a small paragraph and can do extraction and conversion manually. We'll store the clauses in list aptly named `clauses`." ] }, { "cell_type": "code", - "execution_count": 27, + "execution_count": 69, "metadata": { "collapsed": true }, @@ -717,21 +3643,21 @@ "source": [ "“... it is a crime for an American to sell weapons to hostile nations”
    \n", "The keywords to look for here are 'crime', 'American', 'sell', 'weapon' and 'hostile'. We use predicate symbols to make meaning of them.\n", - "
      \n", - "
    • `Criminal(x)`: `x` is a criminal
    • \n", - "
    • `American(x)`: `x` is an American
    • \n", - "
    • `Sells(x ,y, z)`: `x` sells `y` to `z`
    • \n", - "
    • `Weapon(x)`: `x` is a weapon
    • \n", - "
    • `Hostile(x)`: `x` is a hostile nation
    • \n", - "
    \n", - "Let us now combine them with appropriate variable naming depict the meaning of the sentence. The criminal `x` is also the American `x` who sells weapon `y` to `z`, which is a hostile nation.\n", + "\n", + "* `Criminal(x)`: `x` is a criminal\n", + "* `American(x)`: `x` is an American\n", + "* `Sells(x ,y, z)`: `x` sells `y` to `z`\n", + "* `Weapon(x)`: `x` is a weapon\n", + "* `Hostile(x)`: `x` is a hostile nation\n", + "\n", + "Let us now combine them with appropriate variable naming to depict the meaning of the sentence. The criminal `x` is also the American `x` who sells weapon `y` to `z`, which is a hostile nation.\n", "\n", "$\\text{American}(x) \\land \\text{Weapon}(y) \\land \\text{Sells}(x, y, z) \\land \\text{Hostile}(z) \\implies \\text{Criminal} (x)$" ] }, { "cell_type": "code", - "execution_count": 28, + "execution_count": 70, "metadata": { "collapsed": true }, @@ -752,7 +3678,7 @@ }, { "cell_type": "code", - "execution_count": 29, + "execution_count": 71, "metadata": { "collapsed": true }, @@ -766,14 +3692,14 @@ "metadata": {}, "source": [ "\"Nono ... has some missiles\"
    \n", - "This states the existance of some missile which is owned by Nono. $\\exists x \\text{Owns}(\\text{Nono}, x) \\land \\text{Missile}(x)$. We invoke existential instantiation to introduce a new constant `M1` which is the missile owned by Nono.\n", + "This states the existence of some missile which is owned by Nono. $\\exists x \\text{Owns}(\\text{Nono}, x) \\land \\text{Missile}(x)$. We invoke existential instantiation to introduce a new constant `M1` which is the missile owned by Nono.\n", "\n", "$\\text{Owns}(\\text{Nono}, \\text{M1}), \\text{Missile}(\\text{M1})$" ] }, { "cell_type": "code", - "execution_count": 30, + "execution_count": 72, "metadata": { "collapsed": true }, @@ -797,7 +3723,7 @@ }, { "cell_type": "code", - "execution_count": 31, + "execution_count": 73, "metadata": { "collapsed": true }, @@ -818,7 +3744,7 @@ }, { "cell_type": "code", - "execution_count": 32, + "execution_count": 74, "metadata": { "collapsed": true }, @@ -838,7 +3764,7 @@ }, { "cell_type": "code", - "execution_count": 33, + "execution_count": 75, "metadata": { "collapsed": true }, @@ -857,7 +3783,7 @@ }, { "cell_type": "code", - "execution_count": 34, + "execution_count": 76, "metadata": { "collapsed": true }, @@ -866,12 +3792,173 @@ "crime_kb = FolKB(clauses)" ] }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The `subst` helper function substitutes variables with given values in first-order logic statements.\n", + "This will be useful in later algorithms.\n", + "It's implementation is quite simple and self-explanatory." + ] + }, + { + "cell_type": "code", + "execution_count": 77, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def subst(s, x):\n",
    +       "    """Substitute the substitution s into the expression x.\n",
    +       "    >>> subst({x: 42, y:0}, F(x) + y)\n",
    +       "    (F(42) + 0)\n",
    +       "    """\n",
    +       "    if isinstance(x, list):\n",
    +       "        return [subst(s, xi) for xi in x]\n",
    +       "    elif isinstance(x, tuple):\n",
    +       "        return tuple([subst(s, xi) for xi in x])\n",
    +       "    elif not isinstance(x, Expr):\n",
    +       "        return x\n",
    +       "    elif is_var_symbol(x.op):\n",
    +       "        return s.get(x, x)\n",
    +       "    else:\n",
    +       "        return Expr(x.op, *[subst(s, arg) for arg in x.args])\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(subst)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here's an example of how `subst` can be used." + ] + }, + { + "cell_type": "code", + "execution_count": 78, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "Owns(Nono, M1)" + ] + }, + "execution_count": 78, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "subst({x: expr('Nono'), y: expr('M1')}, expr('Owns(x, y)'))" + ] + }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Inference in First-Order Logic\n", - "In this section we look at a forward chaining and a backward chaining algorithm for `FolKB`. Both the aforementioned algorithms rely on a process called unification, a key component of all first-order inference algorithms." + "In this section we look at a forward chaining and a backward chaining algorithm for `FolKB`. Both aforementioned algorithms rely on a process called unification, a key component of all first-order inference algorithms." 
] }, { @@ -884,7 +3971,7 @@ }, { "cell_type": "code", - "execution_count": 35, + "execution_count": 79, "metadata": {}, "outputs": [ { @@ -893,7 +3980,7 @@ "{x: 3}" ] }, - "execution_count": 35, + "execution_count": 79, "metadata": {}, "output_type": "execute_result" } @@ -904,7 +3991,7 @@ }, { "cell_type": "code", - "execution_count": 36, + "execution_count": 80, "metadata": {}, "outputs": [ { @@ -913,7 +4000,7 @@ "{x: B}" ] }, - "execution_count": 36, + "execution_count": 80, "metadata": {}, "output_type": "execute_result" } @@ -924,7 +4011,7 @@ }, { "cell_type": "code", - "execution_count": 37, + "execution_count": 81, "metadata": {}, "outputs": [ { @@ -933,7 +4020,7 @@ "{x: Bella, y: Dobby}" ] }, - "execution_count": 37, + "execution_count": 81, "metadata": {}, "output_type": "execute_result" } @@ -951,7 +4038,7 @@ }, { "cell_type": "code", - "execution_count": 38, + "execution_count": 82, "metadata": {}, "outputs": [ { @@ -970,12 +4057,12 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We also need to take care we do not unintentionally use same variable name. Unify treats them as a single variable which prevents it from taking multiple value." + "We also need to take care that we do not unintentionally use the same variable name. Unify treats them as a single variable, which prevents it from taking multiple values." ] }, { "cell_type": "code", - "execution_count": 39, + "execution_count": 83, "metadata": {}, "outputs": [ { @@ -995,20 +4082,152 @@ "metadata": {}, "source": [ "### Forward Chaining Algorithm\n", - "We consider the simple forward-chaining algorithm presented in Figure 9.3. We look at each rule in the knoweldge base and see if the premises can be satisfied. This is done by finding a substitution which unifies the each of the premise with a clause in the `KB`. If we are able to unify the premises the conclusion (with the corresponding substitution) is added to the `KB`. 
This inferencing process is repeated until either the query can be answered or till no new sentences can be aded. We test if the newly added clause unifies with the query in which case the substitution yielded by `unify` is an answer to the query. If we run out of sentences to infer, this means the query was a failure.\n", + "We consider the simple forward-chaining algorithm presented in Figure 9.3. We look at each rule in the knowledge base and see if the premises can be satisfied. This is done by finding a substitution which unifies each of the premises with a clause in the `KB`. If we are able to unify the premises, the conclusion (with the corresponding substitution) is added to the `KB`. This inferencing process is repeated until either the query can be answered or no new sentences can be added. We test if the newly added clause unifies with the query, in which case the substitution yielded by `unify` is an answer to the query. If we run out of sentences to infer, this means the query was a failure.\n", "\n", "The function `fol_fc_ask` is a generator which yields all substitutions which validate the query." ] }, { "cell_type": "code", - "execution_count": 40, - "metadata": { - "collapsed": true - }, - "outputs": [], + "execution_count": 84, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def fol_fc_ask(KB, alpha):\n",
    +       "    """A simple forward-chaining algorithm. [Figure 9.3]"""\n",
    +       "    # TODO: Improve efficiency\n",
    +       "    kb_consts = list({c for clause in KB.clauses for c in constant_symbols(clause)})\n",
    +       "    def enum_subst(p):\n",
    +       "        query_vars = list({v for clause in p for v in variables(clause)})\n",
    +       "        for assignment_list in itertools.product(kb_consts, repeat=len(query_vars)):\n",
    +       "            theta = {x: y for x, y in zip(query_vars, assignment_list)}\n",
    +       "            yield theta\n",
    +       "\n",
    +       "    # check if we can answer without new inferences\n",
    +       "    for q in KB.clauses:\n",
    +       "        phi = unify(q, alpha, {})\n",
    +       "        if phi is not None:\n",
    +       "            yield phi\n",
    +       "\n",
    +       "    while True:\n",
    +       "        new = []\n",
    +       "        for rule in KB.clauses:\n",
    +       "            p, q = parse_definite_clause(rule)\n",
    +       "            for theta in enum_subst(p):\n",
    +       "                if set(subst(theta, p)).issubset(set(KB.clauses)):\n",
    +       "                    q_ = subst(theta, q)\n",
    +       "                    if all([unify(x, q_, {}) is None for x in KB.clauses + new]):\n",
    +       "                        new.append(q_)\n",
    +       "                        phi = unify(q_, alpha, {})\n",
    +       "                        if phi is not None:\n",
    +       "                            yield phi\n",
    +       "        if not new:\n",
    +       "            break\n",
    +       "        for clause in new:\n",
    +       "            KB.tell(clause)\n",
    +       "    return None\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "%psource fol_fc_ask" + "psource(fol_fc_ask)" ] }, { @@ -1020,7 +4239,7 @@ }, { "cell_type": "code", - "execution_count": 41, + "execution_count": 85, "metadata": {}, "outputs": [ { @@ -1045,7 +4264,7 @@ }, { "cell_type": "code", - "execution_count": 42, + "execution_count": 86, "metadata": {}, "outputs": [ { @@ -1087,13 +4306,117 @@ }, { "cell_type": "code", - "execution_count": 43, - "metadata": { - "collapsed": true - }, - "outputs": [], + "execution_count": 87, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def fol_bc_or(KB, goal, theta):\n",
    +       "    for rule in KB.fetch_rules_for_goal(goal):\n",
    +       "        lhs, rhs = parse_definite_clause(standardize_variables(rule))\n",
    +       "        for theta1 in fol_bc_and(KB, lhs, unify(rhs, goal, theta)):\n",
    +       "            yield theta1\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "%psource fol_bc_or" + "psource(fol_bc_or)" ] }, { @@ -1106,13 +4429,122 @@ }, { "cell_type": "code", - "execution_count": 44, - "metadata": { - "collapsed": true - }, - "outputs": [], + "execution_count": 88, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def fol_bc_and(KB, goals, theta):\n",
    +       "    if theta is None:\n",
    +       "        pass\n",
    +       "    elif not goals:\n",
    +       "        yield theta\n",
    +       "    else:\n",
    +       "        first, rest = goals[0], goals[1:]\n",
    +       "        for theta1 in fol_bc_or(KB, subst(theta, first), theta):\n",
    +       "            for theta2 in fol_bc_and(KB, rest, theta1):\n",
    +       "                yield theta2\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "%psource fol_bc_and" + "psource(fol_bc_and)" ] }, { @@ -1124,7 +4556,7 @@ }, { "cell_type": "code", - "execution_count": 45, + "execution_count": 89, "metadata": { "collapsed": true }, @@ -1136,7 +4568,7 @@ }, { "cell_type": "code", - "execution_count": 46, + "execution_count": 90, "metadata": {}, "outputs": [ { @@ -1145,7 +4577,7 @@ "{v_5: x, x: Nono}" ] }, - "execution_count": 46, + "execution_count": 90, "metadata": {}, "output_type": "execute_result" } @@ -1172,7 +4604,7 @@ }, { "cell_type": "code", - "execution_count": 47, + "execution_count": 91, "metadata": {}, "outputs": [ { @@ -1181,7 +4613,7 @@ "(P ==> ~Q)" ] }, - "execution_count": 47, + "execution_count": 91, "metadata": {}, "output_type": "execute_result" } @@ -1199,7 +4631,7 @@ }, { "cell_type": "code", - "execution_count": 48, + "execution_count": 92, "metadata": {}, "outputs": [ { @@ -1208,7 +4640,7 @@ "(P ==> ~Q)" ] }, - "execution_count": 48, + "execution_count": 92, "metadata": {}, "output_type": "execute_result" } @@ -1226,7 +4658,7 @@ }, { "cell_type": "code", - "execution_count": 49, + "execution_count": 93, "metadata": {}, "outputs": [ { @@ -1235,7 +4667,7 @@ "PartialExpr('==>', P)" ] }, - "execution_count": 49, + "execution_count": 93, "metadata": {}, "output_type": "execute_result" } @@ -1255,7 +4687,7 @@ }, { "cell_type": "code", - "execution_count": 50, + "execution_count": 94, "metadata": {}, "outputs": [ { @@ -1264,7 +4696,7 @@ "(P ==> ~Q)" ] }, - "execution_count": 50, + "execution_count": 94, "metadata": {}, "output_type": "execute_result" } @@ -1294,7 +4726,7 @@ }, { "cell_type": "code", - "execution_count": 51, + "execution_count": 95, "metadata": {}, "outputs": [ { @@ -1303,7 +4735,7 @@ "(~(P & Q) ==> (~P | ~Q))" ] }, - "execution_count": 51, + "execution_count": 95, "metadata": {}, "output_type": "execute_result" } @@ -1321,7 +4753,7 @@ }, { 
"cell_type": "code", - "execution_count": 52, + "execution_count": 96, "metadata": {}, "outputs": [ { @@ -1330,7 +4762,7 @@ "(~(P & Q) ==> (~P | ~Q))" ] }, - "execution_count": 52, + "execution_count": 96, "metadata": {}, "output_type": "execute_result" } @@ -1349,7 +4781,7 @@ }, { "cell_type": "code", - "execution_count": 53, + "execution_count": 97, "metadata": {}, "outputs": [ { @@ -1358,7 +4790,7 @@ "(((P & Q) ==> P) | Q)" ] }, - "execution_count": 53, + "execution_count": 97, "metadata": {}, "output_type": "execute_result" } @@ -1376,7 +4808,7 @@ }, { "cell_type": "code", - "execution_count": 54, + "execution_count": 98, "metadata": {}, "outputs": [ { @@ -1385,7 +4817,7 @@ "((P & Q) ==> (P | Q))" ] }, - "execution_count": 54, + "execution_count": 98, "metadata": {}, "output_type": "execute_result" } @@ -1403,11 +4835,133 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, - "outputs": [], + "execution_count": 99, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "
    \n", + "\n", + "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "data": { + "text/html": [ + "" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ "from notebook import Canvas_fol_bc_ask\n", "canvas_bc_ask = Canvas_fol_bc_ask('canvas_bc_ask', crime_kb, expr('Criminal(x)'))" diff --git a/logic.py b/logic.py index 5810e633f..1624d55a5 100644 --- a/logic.py +++ b/logic.py @@ -1,4 +1,5 @@ -"""Representations and Inference for Logic (Chapters 7-9, 12) +""" +Representations and Inference for Logic. (Chapters 7-9, 12) Covers both Propositional and First-Order Logic. First we have four important data types: @@ -31,24 +32,23 @@ diff, simp Symbolic differentiation and simplification """ -from utils import ( - removeall, unique, first, argmax, probability, - isnumber, issequence, Expr, expr, subexpressions -) -import agents - +import heapq import itertools import random -from collections import defaultdict +from collections import defaultdict, Counter -# ______________________________________________________________________________ +import networkx as nx +from agents import Agent, Glitter, Bump, Stench, Breeze, Scream +from csp import parse_neighbors, UniversalDict +from search import astar_search, PlanRoute +from utils import remove_all, unique, first, probability, isnumber, issequence, Expr, expr, subexpressions, extend -class KB: +class KB: """A knowledge base to which you can tell and ask sentences. To create a KB, first subclass this class and implement - tell, ask_generator, and retract. Why ask_generator instead of ask? + tell, ask_generator, and retract. Why ask_generator instead of ask? 
The book is a bit vague on what ask means -- For a Propositional Logic KB, ask(P & Q) returns True or False, but for an FOL KB, something like ask(Brother(x, y)) might return many substitutions @@ -57,7 +57,8 @@ class KB: first one or returns False.""" def __init__(self, sentence=None): - raise NotImplementedError + if sentence: + self.tell(sentence) def tell(self, sentence): """Add the sentence to the KB.""" @@ -80,9 +81,8 @@ class PropKB(KB): """A KB for propositional logic. Inefficient, with no indexing.""" def __init__(self, sentence=None): + super().__init__(sentence) self.clauses = [] - if sentence: - self.tell(sentence) def tell(self, sentence): """Add the sentence's clauses to the KB.""" @@ -105,44 +105,57 @@ def retract(self, sentence): if c in self.clauses: self.clauses.remove(c) + # ______________________________________________________________________________ -def KB_AgentProgram(KB): - """A generic logical knowledge-based agent program. [Figure 7.1]""" +def KBAgentProgram(kb): + """ + [Figure 7.1] + A generic logical knowledge-based agent program. + """ steps = itertools.count() def program(percept): t = next(steps) - KB.tell(make_percept_sentence(percept, t)) - action = KB.ask(make_action_query(t)) - KB.tell(make_action_sentence(action, t)) + kb.tell(make_percept_sentence(percept, t)) + action = kb.ask(make_action_query(t)) + kb.tell(make_action_sentence(action, t)) return action def make_percept_sentence(percept, t): - return Expr("Percept")(percept, t) + return Expr('Percept')(percept, t) def make_action_query(t): - return expr("ShouldDo(action, {})".format(t)) + return expr('ShouldDo(action, {})'.format(t)) def make_action_sentence(action, t): - return Expr("Did")(action[expr('action')], t) + return Expr('Did')(action[expr('action')], t) return program def is_symbol(s): - """A string s is a symbol if it starts with an alphabetic char.""" + """A string s is a symbol if it starts with an alphabetic char. 
+ >>> is_symbol('R2D2') + True + """ return isinstance(s, str) and s[:1].isalpha() def is_var_symbol(s): - """A logic variable symbol is an initial-lowercase string.""" + """A logic variable symbol is an initial-lowercase string. + >>> is_var_symbol('EXE') + False + """ return is_symbol(s) and s[0].islower() def is_prop_symbol(s): - """A proposition logic symbol is an initial-uppercase string.""" + """A proposition logic symbol is an initial-uppercase string. + >>> is_prop_symbol('exe') + False + """ return is_symbol(s) and s[0].isupper() @@ -156,7 +169,7 @@ def variables(s): def is_definite_clause(s): """Returns True for exprs s of the form A & B & ... & C ==> D, - where all literals are positive. In clause form, this is + where all literals are positive. In clause form, this is ~A | ~B | ... | ~C | D, where exactly one clause is positive. >>> is_definite_clause(expr('Farmer(Mac)')) True @@ -165,8 +178,7 @@ def is_definite_clause(s): return True elif s.op == '==>': antecedent, consequent = s.args - return (is_symbol(consequent.op) and - all(is_symbol(arg.op) for arg in conjuncts(antecedent))) + return is_symbol(consequent.op) and all(is_symbol(arg.op) for arg in conjuncts(antecedent)) else: return False @@ -182,16 +194,18 @@ def parse_definite_clause(s): # Useful constant Exprs used in examples and code: -A, B, C, D, E, F, G, P, Q, x, y, z = map(Expr, 'ABCDEFGPQxyz') +A, B, C, D, E, F, G, P, Q, a, x, y, z, u = map(Expr, 'ABCDEFGPQaxyzu') # ______________________________________________________________________________ def tt_entails(kb, alpha): - """Does kb entail the sentence alpha? Use truth tables. For propositional - kb's and sentences. [Figure 7.10]. Note that the 'kb' should be an - Expr which is a conjunction of clauses. + """ + [Figure 7.10] + Does kb entail the sentence alpha? Use truth tables. For propositional + kb's and sentences. Note that the 'kb' should be an Expr which is a + conjunction of clauses. 
>>> tt_entails(expr('P & Q'), expr('Q')) True """ @@ -258,7 +272,10 @@ def pl_true(exp, model={}): """Return True if the propositional logic expression is true in the model, and False if it is false. If the model does not specify the value for every proposition, this may return None to indicate 'not obvious'; - this may happen even when the expression is tautological.""" + this may happen even when the expression is tautological. + >>> pl_true(P, {}) is None + True + """ if exp in (True, False): return exp op, args = exp.op, exp.args @@ -304,7 +321,8 @@ def pl_true(exp, model={}): elif op == '^': # xor or 'not equivalent' return pt != qt else: - raise ValueError("illegal operator in logic expression" + str(exp)) + raise ValueError('Illegal operator in logic expression' + str(exp)) + # ______________________________________________________________________________ @@ -312,8 +330,10 @@ def pl_true(exp, model={}): def to_cnf(s): - """Convert a propositional logical sentence to conjunctive normal form. - That is, to the form ((A | ~B | ...) & (B | C | ...) & ...) [p. 253] + """ + [Page 253] + Convert a propositional logical sentence to conjunctive normal form. + That is, to the form ((A | ~B | ...) & (B | C | ...) & ...) >>> to_cnf('~(B | C)') (~B & ~C) """ @@ -349,11 +369,13 @@ def eliminate_implications(s): def move_not_inwards(s): """Rewrite sentence s by moving negation sign inward. >>> move_not_inwards(~(A | B)) - (~A & ~B)""" + (~A & ~B) + """ s = expr(s) if s.op == '~': def NOT(b): return move_not_inwards(~b) + a = s.args[0] if a.op == '~': return move_not_inwards(a.args[0]) # ~~A ==> A @@ -419,7 +441,10 @@ def associate(op, args): def dissociate(op, args): """Given an associative op, return a flattened list result such - that Expr(op, *result) means the same as Expr(op, *args).""" + that Expr(op, *result) means the same as Expr(op, *args). 
+ >>> dissociate('&', [A & B]) + [A, B] + """ result = [] def collect(subargs): @@ -428,6 +453,7 @@ def collect(subargs): collect(arg.args) else: result.append(arg) + collect(args) return result @@ -451,17 +477,23 @@ def disjuncts(s): """ return dissociate('|', [s]) + # ______________________________________________________________________________ -def pl_resolution(KB, alpha): - """Propositional-logic resolution: say if alpha follows from KB. [Figure 7.12]""" - clauses = KB.clauses + conjuncts(to_cnf(~alpha)) +def pl_resolution(kb, alpha): + """ + [Figure 7.12] + Propositional-logic resolution: say if alpha follows from KB. + >>> pl_resolution(horn_clauses_KB, A) + True + """ + clauses = kb.clauses + conjuncts(to_cnf(~alpha)) new = set() while True: n = len(clauses) pairs = [(clauses[i], clauses[j]) - for i in range(n) for j in range(i+1, n)] + for i in range(n) for j in range(i + 1, n)] for (ci, cj) in pairs: resolvents = pl_resolve(ci, cj) if False in resolvents: @@ -480,11 +512,10 @@ def pl_resolve(ci, cj): for di in disjuncts(ci): for dj in disjuncts(cj): if di == ~dj or ~di == dj: - dnew = unique(removeall(di, disjuncts(ci)) + - removeall(dj, disjuncts(cj))) - clauses.append(associate('|', dnew)) + clauses.append(associate('|', unique(remove_all(di, disjuncts(ci)) + remove_all(dj, disjuncts(cj))))) return clauses + # ______________________________________________________________________________ @@ -507,84 +538,207 @@ def retract(self, sentence): def clauses_with_premise(self, p): """Return a list of the clauses in KB that have p in their premise. This could be cached away for O(1) speed, but we'll recompute it.""" - return [c for c in self.clauses - if c.op == '==>' and p in conjuncts(c.args[0])] + return [c for c in self.clauses if c.op == '==>' and p in conjuncts(c.args[0])] -def pl_fc_entails(KB, q): - """Use forward chaining to see if a PropDefiniteKB entails symbol q. 
+def pl_fc_entails(kb, q): + """ [Figure 7.15] + Use forward chaining to see if a PropDefiniteKB entails symbol q. >>> pl_fc_entails(horn_clauses_KB, expr('Q')) True """ - count = {c: len(conjuncts(c.args[0])) - for c in KB.clauses - if c.op == '==>'} + count = {c: len(conjuncts(c.args[0])) for c in kb.clauses if c.op == '==>'} inferred = defaultdict(bool) - agenda = [s for s in KB.clauses if is_prop_symbol(s.op)] + agenda = [s for s in kb.clauses if is_prop_symbol(s.op)] while agenda: p = agenda.pop() if p == q: return True if not inferred[p]: inferred[p] = True - for c in KB.clauses_with_premise(p): + for c in kb.clauses_with_premise(p): count[c] -= 1 if count[c] == 0: agenda.append(c.args[1]) return False -""" [Figure 7.13] +""" +[Figure 7.13] Simple inference in a wumpus world example """ -wumpus_world_inference = expr("(B11 <=> (P12 | P21)) & ~B11") - +wumpus_world_inference = expr('(B11 <=> (P12 | P21)) & ~B11') -""" [Figure 7.16] +""" +[Figure 7.16] Propositional Logic Forward Chaining example """ horn_clauses_KB = PropDefiniteKB() -for s in "P==>Q; (L&M)==>P; (B&L)==>M; (A&P)==>L; (A&B)==>L; A;B".split(';'): - horn_clauses_KB.tell(expr(s)) +for clause in ['P ==> Q', + '(L & M) ==> P', + '(B & L) ==> M', + '(A & P) ==> L', + '(A & B) ==> L', + 'A', 'B']: + horn_clauses_KB.tell(expr(clause)) + +""" +Definite clauses KB example +""" +definite_clauses_KB = PropDefiniteKB() +for clause in ['(B & F) ==> E', + '(A & E & F) ==> G', + '(B & C) ==> F', + '(A & B) ==> D', + '(E & F) ==> H', + '(H & I) ==>J', + 'A', 'B', 'C']: + definite_clauses_KB.tell(expr(clause)) + + +# ______________________________________________________________________________ +# Heuristics for SAT Solvers + + +def no_branching_heuristic(symbols, clauses): + return first(symbols), True + + +def min_clauses(clauses): + min_len = min(map(lambda c: len(c.args), clauses), default=2) + return filter(lambda c: len(c.args) == (min_len if min_len > 1 else 2), clauses) + + +def moms(symbols, clauses): + 
""" + MOMS (Maximum Occurrence in clauses of Minimum Size) heuristic + Returns the literal with the most occurrences in all clauses of minimum size + """ + scores = Counter(l for c in min_clauses(clauses) for l in prop_symbols(c)) + return max(symbols, key=lambda symbol: scores[symbol]), True + + +def momsf(symbols, clauses, k=0): + """ + MOMS alternative heuristic + If f(x) the number of occurrences of the variable x in clauses with minimum size, + we choose the variable maximizing [f(x) + f(-x)] * 2^k + f(x) * f(-x) + Returns x if f(x) >= f(-x) otherwise -x + """ + scores = Counter(l for c in min_clauses(clauses) for l in disjuncts(c)) + P = max(symbols, + key=lambda symbol: (scores[symbol] + scores[~symbol]) * pow(2, k) + scores[symbol] * scores[~symbol]) + return P, True if scores[P] >= scores[~P] else False + + +def posit(symbols, clauses): + """ + Freeman's POSIT version of MOMs + Counts the positive x and negative x for each variable x in clauses with minimum size + Returns x if f(x) >= f(-x) otherwise -x + """ + scores = Counter(l for c in min_clauses(clauses) for l in disjuncts(c)) + P = max(symbols, key=lambda symbol: scores[symbol] + scores[~symbol]) + return P, True if scores[P] >= scores[~P] else False + + +def zm(symbols, clauses): + """ + Zabih and McAllester's version of MOMs + Counts the negative occurrences only of each variable x in clauses with minimum size + """ + scores = Counter(l for c in min_clauses(clauses) for l in disjuncts(c) if l.op == '~') + return max(symbols, key=lambda symbol: scores[~symbol]), True + + +def dlis(symbols, clauses): + """ + DLIS (Dynamic Largest Individual Sum) heuristic + Choose the variable and value that satisfies the maximum number of unsatisfied clauses + Like DLCS but we only consider the literal (thus Cp and Cn are individual) + """ + scores = Counter(l for c in clauses for l in disjuncts(c)) + P = max(symbols, key=lambda symbol: scores[symbol]) + return P, True if scores[P] >= scores[~P] else False + + +def 
dlcs(symbols, clauses): + """ + DLCS (Dynamic Largest Combined Sum) heuristic + Cp the number of clauses containing literal x + Cn the number of clauses containing literal -x + Here we select the variable maximizing Cp + Cn + Returns x if Cp >= Cn otherwise -x + """ + scores = Counter(l for c in clauses for l in disjuncts(c)) + P = max(symbols, key=lambda symbol: scores[symbol] + scores[~symbol]) + return P, True if scores[P] >= scores[~P] else False + + +def jw(symbols, clauses): + """ + Jeroslow-Wang heuristic + For each literal compute J(l) = \sum{l in clause c} 2^{-|c|} + Return the literal maximizing J + """ + scores = Counter() + for c in clauses: + for l in prop_symbols(c): + scores[l] += pow(2, -len(c.args)) + return max(symbols, key=lambda symbol: scores[symbol]), True + + +def jw2(symbols, clauses): + """ + Two Sided Jeroslow-Wang heuristic + Compute J(l) also counts the negation of l = J(x) + J(-x) + Returns x if J(x) >= J(-x) otherwise -x + """ + scores = Counter() + for c in clauses: + for l in disjuncts(c): + scores[l] += pow(2, -len(c.args)) + P = max(symbols, key=lambda symbol: scores[symbol] + scores[~symbol]) + return P, True if scores[P] >= scores[~P] else False + # ______________________________________________________________________________ # DPLL-Satisfiable [Figure 7.17] -def dpll_satisfiable(s): +def dpll_satisfiable(s, branching_heuristic=no_branching_heuristic): """Check satisfiability of a propositional sentence. This differs from the book code in two ways: (1) it returns a model rather than True when it succeeds; this is more useful. (2) The function find_pure_symbol is passed a list of unknown clauses, rather - than a list of all clauses and the model; this is more efficient.""" - clauses = conjuncts(to_cnf(s)) - symbols = list(prop_symbols(s)) - return dpll(clauses, symbols, {}) + than a list of all clauses and the model; this is more efficient. 
+ >>> dpll_satisfiable(A |'<=>'| B) == {A: True, B: True} + True + """ + return dpll(conjuncts(to_cnf(s)), prop_symbols(s), {}, branching_heuristic) -def dpll(clauses, symbols, model): +def dpll(clauses, symbols, model, branching_heuristic=no_branching_heuristic): """See if the clauses are true in a partial model.""" unknown_clauses = [] # clauses with an unknown truth value for c in clauses: val = pl_true(c, model) if val is False: return False - if val is not True: + if val is None: unknown_clauses.append(c) if not unknown_clauses: return model P, value = find_pure_symbol(symbols, unknown_clauses) if P: - return dpll(clauses, removeall(P, symbols), extend(model, P, value)) + return dpll(clauses, remove_all(P, symbols), extend(model, P, value), branching_heuristic) P, value = find_unit_clause(clauses, model) if P: - return dpll(clauses, removeall(P, symbols), extend(model, P, value)) - if not symbols: - raise TypeError("Argument should be of the type Expr.") - P, symbols = symbols[0], symbols[1:] - return (dpll(clauses, symbols, extend(model, P, True)) or - dpll(clauses, symbols, extend(model, P, False))) + return dpll(clauses, remove_all(P, symbols), extend(model, P, value), branching_heuristic) + P, value = branching_heuristic(symbols, unknown_clauses) + return (dpll(clauses, remove_all(P, symbols), extend(model, P, value), branching_heuristic) or + dpll(clauses, remove_all(P, symbols), extend(model, P, not value), branching_heuristic)) def find_pure_symbol(symbols, clauses): @@ -635,7 +789,7 @@ def unit_clause_assign(clause, model): if model[sym] == positive: return None, None # clause already True elif P: - return None, None # more than 1 unbound variable + return None, None # more than 1 unbound variable else: P, value = sym, positive return P, value @@ -654,12 +808,282 @@ def inspect_literal(literal): else: return literal, True + +# ______________________________________________________________________________ +# CDCL - Conflict-Driven Clause Learning with 
1UIP Learning Scheme, +# 2WL Lazy Data Structure, VSIDS Branching Heuristic & Restarts + + +def no_restart(conflicts, restarts, queue_lbd, sum_lbd): + return False + + +def luby(conflicts, restarts, queue_lbd, sum_lbd, unit=512): + # in the state-of-art tested with unit value 1, 2, 4, 6, 8, 12, 16, 32, 64, 128, 256 and 512 + def _luby(i): + k = 1 + while True: + if i == (1 << k) - 1: + return 1 << (k - 1) + elif (1 << (k - 1)) <= i < (1 << k) - 1: + return _luby(i - (1 << (k - 1)) + 1) + k += 1 + + return unit * _luby(restarts) == len(queue_lbd) + + +def glucose(conflicts, restarts, queue_lbd, sum_lbd, x=100, k=0.7): + # in the state-of-art tested with (x, k) as (50, 0.8) and (100, 0.7) + # if there were at least x conflicts since the last restart, and then the average LBD of the last + # x learnt clauses was at least k times higher than the average LBD of all learnt clauses + return len(queue_lbd) >= x and sum(queue_lbd) / len(queue_lbd) * k > sum_lbd / conflicts + + +def cdcl_satisfiable(s, vsids_decay=0.95, restart_strategy=no_restart): + """ + >>> cdcl_satisfiable(A |'<=>'| B) == {A: True, B: True} + True + """ + clauses = TwoWLClauseDatabase(conjuncts(to_cnf(s))) + symbols = prop_symbols(s) + scores = Counter() + G = nx.DiGraph() + model = {} + dl = 0 + conflicts = 0 + restarts = 1 + sum_lbd = 0 + queue_lbd = [] + while True: + conflict = unit_propagation(clauses, symbols, model, G, dl) + if conflict: + if dl == 0: + return False + conflicts += 1 + dl, learn, lbd = conflict_analysis(G, dl) + queue_lbd.append(lbd) + sum_lbd += lbd + backjump(symbols, model, G, dl) + clauses.add(learn, model) + scores.update(l for l in disjuncts(learn)) + for symbol in scores: + scores[symbol] *= vsids_decay + if restart_strategy(conflicts, restarts, queue_lbd, sum_lbd): + backjump(symbols, model, G) + queue_lbd.clear() + restarts += 1 + else: + if not symbols: + return model + dl += 1 + assign_decision_literal(symbols, model, scores, G, dl) + + +def 
assign_decision_literal(symbols, model, scores, G, dl): + P = max(symbols, key=lambda symbol: scores[symbol] + scores[~symbol]) + value = True if scores[P] >= scores[~P] else False + symbols.remove(P) + model[P] = value + G.add_node(P, val=value, dl=dl) + + +def unit_propagation(clauses, symbols, model, G, dl): + def check(c): + if not model or clauses.get_first_watched(c) == clauses.get_second_watched(c): + return True + w1, _ = inspect_literal(clauses.get_first_watched(c)) + if w1 in model: + return c in (clauses.get_neg_watched(w1) if model[w1] else clauses.get_pos_watched(w1)) + w2, _ = inspect_literal(clauses.get_second_watched(c)) + if w2 in model: + return c in (clauses.get_neg_watched(w2) if model[w2] else clauses.get_pos_watched(w2)) + + def unit_clause(watching): + w, p = inspect_literal(watching) + G.add_node(w, val=p, dl=dl) + G.add_edges_from(zip(prop_symbols(c) - {w}, itertools.cycle([w])), antecedent=c) + symbols.remove(w) + model[w] = p + + def conflict_clause(c): + G.add_edges_from(zip(prop_symbols(c), itertools.cycle('K')), antecedent=c) + + while True: + bcp = False + for c in filter(check, clauses.get_clauses()): + # we need only visit each clause when one of its two watched literals is assigned to 0 because, until + # this happens, we can guarantee that there cannot be more than n-2 literals in the clause assigned to 0 + first_watched = pl_true(clauses.get_first_watched(c), model) + second_watched = pl_true(clauses.get_second_watched(c), model) + if first_watched is None and clauses.get_first_watched(c) == clauses.get_second_watched(c): + unit_clause(clauses.get_first_watched(c)) + bcp = True + break + elif first_watched is False and second_watched is not True: + if clauses.update_second_watched(c, model): + bcp = True + else: + # if the only literal with a non-zero value is the other watched literal then + if second_watched is None: # if it is free, then the clause is a unit clause + unit_clause(clauses.get_second_watched(c)) + bcp = True + 
break + else: # else (it is False) the clause is a conflict clause + conflict_clause(c) + return True + elif second_watched is False and first_watched is not True: + if clauses.update_first_watched(c, model): + bcp = True + else: + # if the only literal with a non-zero value is the other watched literal then + if first_watched is None: # if it is free, then the clause is a unit clause + unit_clause(clauses.get_first_watched(c)) + bcp = True + break + else: # else (it is False) the clause is a conflict clause + conflict_clause(c) + return True + if not bcp: + return False + + +def conflict_analysis(G, dl): + conflict_clause = next(G[p]['K']['antecedent'] for p in G.pred['K']) + P = next(node for node in G.nodes() - 'K' if G.nodes[node]['dl'] == dl and G.in_degree(node) == 0) + first_uip = nx.immediate_dominators(G, P)['K'] + G.remove_node('K') + conflict_side = nx.descendants(G, first_uip) + while True: + for l in prop_symbols(conflict_clause).intersection(conflict_side): + antecedent = next(G[p][l]['antecedent'] for p in G.pred[l]) + conflict_clause = pl_binary_resolution(conflict_clause, antecedent) + # the literal block distance is calculated by taking the decision levels from variables of all + # literals in the clause, and counting how many different decision levels were in this set + lbd = [G.nodes[l]['dl'] for l in prop_symbols(conflict_clause)] + if lbd.count(dl) == 1 and first_uip in prop_symbols(conflict_clause): + return 0 if len(lbd) == 1 else heapq.nlargest(2, lbd)[-1], conflict_clause, len(set(lbd)) + + +def pl_binary_resolution(ci, cj): + for di in disjuncts(ci): + for dj in disjuncts(cj): + if di == ~dj or ~di == dj: + return pl_binary_resolution(associate('|', remove_all(di, disjuncts(ci))), + associate('|', remove_all(dj, disjuncts(cj)))) + return associate('|', unique(disjuncts(ci) + disjuncts(cj))) + + +def backjump(symbols, model, G, dl=0): + delete = {node for node in G.nodes() if G.nodes[node]['dl'] > dl} + G.remove_nodes_from(delete) + for 
node in delete: + del model[node] + symbols |= delete + + +class TwoWLClauseDatabase: + + def __init__(self, clauses): + self.__twl = {} + self.__watch_list = defaultdict(lambda: [set(), set()]) + for c in clauses: + self.add(c, None) + + def get_clauses(self): + return self.__twl.keys() + + def set_first_watched(self, clause, new_watching): + if len(clause.args) > 2: + self.__twl[clause][0] = new_watching + + def set_second_watched(self, clause, new_watching): + if len(clause.args) > 2: + self.__twl[clause][1] = new_watching + + def get_first_watched(self, clause): + if len(clause.args) == 2: + return clause.args[0] + if len(clause.args) > 2: + return self.__twl[clause][0] + return clause + + def get_second_watched(self, clause): + if len(clause.args) == 2: + return clause.args[-1] + if len(clause.args) > 2: + return self.__twl[clause][1] + return clause + + def get_pos_watched(self, l): + return self.__watch_list[l][0] + + def get_neg_watched(self, l): + return self.__watch_list[l][1] + + def add(self, clause, model): + self.__twl[clause] = self.__assign_watching_literals(clause, model) + w1, p1 = inspect_literal(self.get_first_watched(clause)) + w2, p2 = inspect_literal(self.get_second_watched(clause)) + self.__watch_list[w1][0].add(clause) if p1 else self.__watch_list[w1][1].add(clause) + if w1 != w2: + self.__watch_list[w2][0].add(clause) if p2 else self.__watch_list[w2][1].add(clause) + + def remove(self, clause): + w1, p1 = inspect_literal(self.get_first_watched(clause)) + w2, p2 = inspect_literal(self.get_second_watched(clause)) + del self.__twl[clause] + self.__watch_list[w1][0].discard(clause) if p1 else self.__watch_list[w1][1].discard(clause) + if w1 != w2: + self.__watch_list[w2][0].discard(clause) if p2 else self.__watch_list[w2][1].discard(clause) + + def update_first_watched(self, clause, model): + # if a non-zero literal different from the other watched literal is found + found, new_watching = self.__find_new_watching_literal(clause, 
self.get_first_watched(clause), model) + if found: # then it will replace the watched literal + w, p = inspect_literal(self.get_second_watched(clause)) + self.__watch_list[w][0].remove(clause) if p else self.__watch_list[w][1].remove(clause) + self.set_second_watched(clause, new_watching) + w, p = inspect_literal(new_watching) + self.__watch_list[w][0].add(clause) if p else self.__watch_list[w][1].add(clause) + return True + + def update_second_watched(self, clause, model): + # if a non-zero literal different from the other watched literal is found + found, new_watching = self.__find_new_watching_literal(clause, self.get_second_watched(clause), model) + if found: # then it will replace the watched literal + w, p = inspect_literal(self.get_first_watched(clause)) + self.__watch_list[w][0].remove(clause) if p else self.__watch_list[w][1].remove(clause) + self.set_first_watched(clause, new_watching) + w, p = inspect_literal(new_watching) + self.__watch_list[w][0].add(clause) if p else self.__watch_list[w][1].add(clause) + return True + + def __find_new_watching_literal(self, clause, other_watched, model): + # if a non-zero literal different from the other watched literal is found + if len(clause.args) > 2: + for l in disjuncts(clause): + if l != other_watched and pl_true(l, model) is not False: + # then it is returned + return True, l + return False, None + + def __assign_watching_literals(self, clause, model=None): + if len(clause.args) > 2: + if model is None or not model: + return [clause.args[0], clause.args[-1]] + else: + return [next(l for l in disjuncts(clause) if pl_true(l, model) is None), + next(l for l in disjuncts(clause) if pl_true(l, model) is False)] + + # ______________________________________________________________________________ # Walk-SAT [Figure 7.18] def WalkSAT(clauses, p=0.5, max_flips=10000): """Checks for satisfiability of all clauses by randomly flipping values of variables + >>> WalkSAT([A & ~A], 0.5, 100) is None + True """ # Set of all 
symbols in all clauses symbols = {sym for clause in clauses for sym in prop_symbols(clause)} @@ -682,30 +1106,527 @@ def sat_count(sym): count = len([clause for clause in clauses if pl_true(clause, model)]) model[sym] = not model[sym] return count - sym = argmax(prop_symbols(clause), key=sat_count) + + sym = max(prop_symbols(clause), key=sat_count) model[sym] = not model[sym] # If no solution is found within the flip limit, we return failure return None + # ______________________________________________________________________________ +# Map Coloring SAT Problems -class HybridWumpusAgent(agents.Agent): - """An agent for the wumpus world that does logical inference. [Figure 7.20]""" +def MapColoringSAT(colors, neighbors): + """Make a SAT for the problem of coloring a map with different colors + for any two adjacent regions. Arguments are a list of colors, and a + dict of {region: [neighbor,...]} entries. This dict may also be + specified as a string of the form defined by parse_neighbors.""" + if isinstance(neighbors, str): + neighbors = parse_neighbors(neighbors) + colors = UniversalDict(colors) + clauses = [] + for state in neighbors.keys(): + clause = [expr(state + '_' + c) for c in colors[state]] + clauses.append(clause) + for t in itertools.combinations(clause, 2): + clauses.append([~t[0], ~t[1]]) + visited = set() + adj = set(neighbors[state]) - visited + visited.add(state) + for n_state in adj: + for col in colors[n_state]: + clauses.append([expr('~' + state + '_' + col), expr('~' + n_state + '_' + col)]) + return associate('&', map(lambda c: associate('|', c), clauses)) + + +australia_sat = MapColoringSAT(list('RGB'), """SA: WA NT Q NSW V; NT: WA Q; NSW: Q V; T: """) + +france_sat = MapColoringSAT(list('RGBY'), + """AL: LO FC; AQ: MP LI PC; AU: LI CE BO RA LR MP; BO: CE IF CA FC RA + AU; BR: NB PL; CA: IF PI LO FC BO; CE: PL NB NH IF BO AU LI PC; FC: BO + CA LO AL RA; IF: NH PI CA BO CE; LI: PC CE AU MP AQ; LO: CA AL FC; LR: + MP AU RA PA; MP: AQ LI AU LR; 
NB: NH CE PL BR; NH: PI IF CE NB; NO: + PI; PA: LR RA; PC: PL CE LI AQ; PI: NH NO CA IF; PL: BR NB CE PC; RA: + AU BO FC PA LR""") + +usa_sat = MapColoringSAT(list('RGBY'), + """WA: OR ID; OR: ID NV CA; CA: NV AZ; NV: ID UT AZ; ID: MT WY UT; + UT: WY CO AZ; MT: ND SD WY; WY: SD NE CO; CO: NE KA OK NM; NM: OK TX AZ; + ND: MN SD; SD: MN IA NE; NE: IA MO KA; KA: MO OK; OK: MO AR TX; + TX: AR LA; MN: WI IA; IA: WI IL MO; MO: IL KY TN AR; AR: MS TN LA; + LA: MS; WI: MI IL; IL: IN KY; IN: OH KY; MS: TN AL; AL: TN GA FL; + MI: OH IN; OH: PA WV KY; KY: WV VA TN; TN: VA NC GA; GA: NC SC FL; + PA: NY NJ DE MD WV; WV: MD VA; VA: MD DC NC; NC: SC; NY: VT MA CT NJ; + NJ: DE; DE: MD; MD: DC; VT: NH MA; MA: NH RI CT; CT: RI; ME: NH; + HI: ; AK: """) - def __init__(self): - raise NotImplementedError +# ______________________________________________________________________________ + + +# Expr functions for WumpusKB and HybridWumpusAgent + +def facing_east(time): + return Expr('FacingEast', time) + + +def facing_west(time): + return Expr('FacingWest', time) + + +def facing_north(time): + return Expr('FacingNorth', time) + + +def facing_south(time): + return Expr('FacingSouth', time) + + +def wumpus(x, y): + return Expr('W', x, y) + + +def pit(x, y): + return Expr('P', x, y) + + +def breeze(x, y): + return Expr('B', x, y) + + +def stench(x, y): + return Expr('S', x, y) + + +def wumpus_alive(time): + return Expr('WumpusAlive', time) + + +def have_arrow(time): + return Expr('HaveArrow', time) + + +def percept_stench(time): + return Expr('Stench', time) + + +def percept_breeze(time): + return Expr('Breeze', time) + + +def percept_glitter(time): + return Expr('Glitter', time) + + +def percept_bump(time): + return Expr('Bump', time) + + +def percept_scream(time): + return Expr('Scream', time) + + +def move_forward(time): + return Expr('Forward', time) + + +def shoot(time): + return Expr('Shoot', time) + + +def turn_left(time): + return Expr('TurnLeft', time) + + +def turn_right(time): + 
return Expr('TurnRight', time) + + +def ok_to_move(x, y, time): + return Expr('OK', x, y, time) + + +def location(x, y, time=None): + if time is None: + return Expr('L', x, y) + else: + return Expr('L', x, y, time) + + +# Symbols + +def implies(lhs, rhs): + return Expr('==>', lhs, rhs) + + +def equiv(lhs, rhs): + return Expr('<=>', lhs, rhs) + + +# Helper Function + +def new_disjunction(sentences): + t = sentences[0] + for i in range(1, len(sentences)): + t |= sentences[i] + return t + + +# ______________________________________________________________________________ + + +class WumpusKB(PropKB): + """ + Create a Knowledge Base that contains the a temporal "Wumpus physics" and temporal rules with time zero. + """ + + def __init__(self, dimrow): + super().__init__() + self.dimrow = dimrow + self.tell(~wumpus(1, 1)) + self.tell(~pit(1, 1)) + + for y in range(1, dimrow + 1): + for x in range(1, dimrow + 1): + + pits_in = list() + wumpus_in = list() + + if x > 1: # West room exists + pits_in.append(pit(x - 1, y)) + wumpus_in.append(wumpus(x - 1, y)) + + if y < dimrow: # North room exists + pits_in.append(pit(x, y + 1)) + wumpus_in.append(wumpus(x, y + 1)) + + if x < dimrow: # East room exists + pits_in.append(pit(x + 1, y)) + wumpus_in.append(wumpus(x + 1, y)) + + if y > 1: # South room exists + pits_in.append(pit(x, y - 1)) + wumpus_in.append(wumpus(x, y - 1)) + + self.tell(equiv(breeze(x, y), new_disjunction(pits_in))) + self.tell(equiv(stench(x, y), new_disjunction(wumpus_in))) + + # Rule that describes existence of at least one Wumpus + wumpus_at_least = list() + for x in range(1, dimrow + 1): + for y in range(1, dimrow + 1): + wumpus_at_least.append(wumpus(x, y)) + + self.tell(new_disjunction(wumpus_at_least)) + + # Rule that describes existence of at most one Wumpus + for i in range(1, dimrow + 1): + for j in range(1, dimrow + 1): + for u in range(1, dimrow + 1): + for v in range(1, dimrow + 1): + if i != u or j != v: + self.tell(~wumpus(i, j) | ~wumpus(u, v)) + 
+ # Temporal rules at time zero + self.tell(location(1, 1, 0)) + for i in range(1, dimrow + 1): + for j in range(1, dimrow + 1): + self.tell(implies(location(i, j, 0), equiv(percept_breeze(0), breeze(i, j)))) + self.tell(implies(location(i, j, 0), equiv(percept_stench(0), stench(i, j)))) + if i != 1 or j != 1: + self.tell(~location(i, j, 0)) + + self.tell(wumpus_alive(0)) + self.tell(have_arrow(0)) + self.tell(facing_east(0)) + self.tell(~facing_north(0)) + self.tell(~facing_south(0)) + self.tell(~facing_west(0)) + + def make_action_sentence(self, action, time): + actions = [move_forward(time), shoot(time), turn_left(time), turn_right(time)] + + for a in actions: + if action is a: + self.tell(action) + else: + self.tell(~a) + + def make_percept_sentence(self, percept, time): + # Glitter, Bump, Stench, Breeze, Scream + flags = [0, 0, 0, 0, 0] + + # Things perceived + if isinstance(percept, Glitter): + flags[0] = 1 + self.tell(percept_glitter(time)) + elif isinstance(percept, Bump): + flags[1] = 1 + self.tell(percept_bump(time)) + elif isinstance(percept, Stench): + flags[2] = 1 + self.tell(percept_stench(time)) + elif isinstance(percept, Breeze): + flags[3] = 1 + self.tell(percept_breeze(time)) + elif isinstance(percept, Scream): + flags[4] = 1 + self.tell(percept_scream(time)) + + # Things not perceived + for i in range(len(flags)): + if flags[i] == 0: + if i == 0: + self.tell(~percept_glitter(time)) + elif i == 1: + self.tell(~percept_bump(time)) + elif i == 2: + self.tell(~percept_stench(time)) + elif i == 3: + self.tell(~percept_breeze(time)) + elif i == 4: + self.tell(~percept_scream(time)) + + def add_temporal_sentences(self, time): + if time == 0: + return + t = time - 1 + + # current location rules + for i in range(1, self.dimrow + 1): + for j in range(1, self.dimrow + 1): + self.tell(implies(location(i, j, time), equiv(percept_breeze(time), breeze(i, j)))) + self.tell(implies(location(i, j, time), equiv(percept_stench(time), stench(i, j)))) + s = list() + 
s.append(equiv(location(i, j, time), location(i, j, time) & ~move_forward(time) | percept_bump(time))) + if i != 1: + s.append(location(i - 1, j, t) & facing_east(t) & move_forward(t)) + if i != self.dimrow: + s.append(location(i + 1, j, t) & facing_west(t) & move_forward(t)) + if j != 1: + s.append(location(i, j - 1, t) & facing_north(t) & move_forward(t)) + if j != self.dimrow: + s.append(location(i, j + 1, t) & facing_south(t) & move_forward(t)) + + # add sentence about location i,j + self.tell(new_disjunction(s)) + + # add sentence about safety of location i,j + self.tell(equiv(ok_to_move(i, j, time), ~pit(i, j) & ~wumpus(i, j) & wumpus_alive(time))) + + # Rules about current orientation + + a = facing_north(t) & turn_right(t) + b = facing_south(t) & turn_left(t) + c = facing_east(t) & ~turn_left(t) & ~turn_right(t) + s = equiv(facing_east(time), a | b | c) + self.tell(s) + + a = facing_north(t) & turn_left(t) + b = facing_south(t) & turn_right(t) + c = facing_west(t) & ~turn_left(t) & ~turn_right(t) + s = equiv(facing_west(time), a | b | c) + self.tell(s) + + a = facing_east(t) & turn_left(t) + b = facing_west(t) & turn_right(t) + c = facing_north(t) & ~turn_left(t) & ~turn_right(t) + s = equiv(facing_north(time), a | b | c) + self.tell(s) + + a = facing_west(t) & turn_left(t) + b = facing_east(t) & turn_right(t) + c = facing_south(t) & ~turn_left(t) & ~turn_right(t) + s = equiv(facing_south(time), a | b | c) + self.tell(s) + + # Rules about last action + self.tell(equiv(move_forward(t), ~turn_right(t) & ~turn_left(t))) + + # Rule about the arrow + self.tell(equiv(have_arrow(time), have_arrow(t) & ~shoot(t))) + + # Rule about Wumpus (dead or alive) + self.tell(equiv(wumpus_alive(time), wumpus_alive(t) & ~percept_scream(time))) + + def ask_if_true(self, query): + return pl_resolution(self, query) + + +# ______________________________________________________________________________ + + +class WumpusPosition: + def __init__(self, x, y, orientation): + self.X = x 
+ self.Y = y + self.orientation = orientation + + def get_location(self): + return self.X, self.Y + + def set_location(self, x, y): + self.X = x + self.Y = y + + def get_orientation(self): + return self.orientation + + def set_orientation(self, orientation): + self.orientation = orientation + + def __eq__(self, other): + if other.get_location() == self.get_location() and other.get_orientation() == self.get_orientation(): + return True + else: + return False + + +# ______________________________________________________________________________ + + +class HybridWumpusAgent(Agent): + """ + [Figure 7.20] + An agent for the wumpus world that does logical inference. + """ + + def __init__(self, dimentions): + self.dimrow = dimentions + self.kb = WumpusKB(self.dimrow) + self.t = 0 + self.plan = list() + self.current_position = WumpusPosition(1, 1, 'UP') + super().__init__(self.execute) + + def execute(self, percept): + self.kb.make_percept_sentence(percept, self.t) + self.kb.add_temporal_sentences(self.t) + + temp = list() + + for i in range(1, self.dimrow + 1): + for j in range(1, self.dimrow + 1): + if self.kb.ask_if_true(location(i, j, self.t)): + temp.append(i) + temp.append(j) + + if self.kb.ask_if_true(facing_north(self.t)): + self.current_position = WumpusPosition(temp[0], temp[1], 'UP') + elif self.kb.ask_if_true(facing_south(self.t)): + self.current_position = WumpusPosition(temp[0], temp[1], 'DOWN') + elif self.kb.ask_if_true(facing_west(self.t)): + self.current_position = WumpusPosition(temp[0], temp[1], 'LEFT') + elif self.kb.ask_if_true(facing_east(self.t)): + self.current_position = WumpusPosition(temp[0], temp[1], 'RIGHT') + + safe_points = list() + for i in range(1, self.dimrow + 1): + for j in range(1, self.dimrow + 1): + if self.kb.ask_if_true(ok_to_move(i, j, self.t)): + safe_points.append([i, j]) + + if self.kb.ask_if_true(percept_glitter(self.t)): + goals = list() + goals.append([1, 1]) + self.plan.append('Grab') + actions = 
self.plan_route(self.current_position, goals, safe_points) + self.plan.extend(actions) + self.plan.append('Climb') + + if len(self.plan) == 0: + unvisited = list() + for i in range(1, self.dimrow + 1): + for j in range(1, self.dimrow + 1): + for k in range(self.t): + if self.kb.ask_if_true(location(i, j, k)): + unvisited.append([i, j]) + unvisited_and_safe = list() + for u in unvisited: + for s in safe_points: + if u not in unvisited_and_safe and s == u: + unvisited_and_safe.append(u) + + temp = self.plan_route(self.current_position, unvisited_and_safe, safe_points) + self.plan.extend(temp) + + if len(self.plan) == 0 and self.kb.ask_if_true(have_arrow(self.t)): + possible_wumpus = list() + for i in range(1, self.dimrow + 1): + for j in range(1, self.dimrow + 1): + if not self.kb.ask_if_true(wumpus(i, j)): + possible_wumpus.append([i, j]) + + temp = self.plan_shot(self.current_position, possible_wumpus, safe_points) + self.plan.extend(temp) + + if len(self.plan) == 0: + not_unsafe = list() + for i in range(1, self.dimrow + 1): + for j in range(1, self.dimrow + 1): + if not self.kb.ask_if_true(ok_to_move(i, j, self.t)): + not_unsafe.append([i, j]) + temp = self.plan_route(self.current_position, not_unsafe, safe_points) + self.plan.extend(temp) + + if len(self.plan) == 0: + start = list() + start.append([1, 1]) + temp = self.plan_route(self.current_position, start, safe_points) + self.plan.extend(temp) + self.plan.append('Climb') + + action = self.plan[0] + self.plan = self.plan[1:] + self.kb.make_action_sentence(action, self.t) + self.t += 1 + + return action + + def plan_route(self, current, goals, allowed): + problem = PlanRoute(current, goals, allowed, self.dimrow) + return astar_search(problem).solution() + + def plan_shot(self, current, goals, allowed): + shooting_positions = set() + + for loc in goals: + x = loc[0] + y = loc[1] + for i in range(1, self.dimrow + 1): + if i < x: + shooting_positions.add(WumpusPosition(i, y, 'EAST')) + if i > x: + 
shooting_positions.add(WumpusPosition(i, y, 'WEST')) + if i < y: + shooting_positions.add(WumpusPosition(x, i, 'NORTH')) + if i > y: + shooting_positions.add(WumpusPosition(x, i, 'SOUTH')) + + # Can't have a shooting position from any of the rooms the Wumpus could reside + orientations = ['EAST', 'WEST', 'NORTH', 'SOUTH'] + for loc in goals: + for orientation in orientations: + shooting_positions.remove(WumpusPosition(loc[0], loc[1], orientation)) + + actions = list() + actions.extend(self.plan_route(current, shooting_positions, allowed)) + actions.append('Shoot') + return actions -def plan_route(current, goals, allowed): - raise NotImplementedError # ______________________________________________________________________________ -def SAT_plan(init, transition, goal, t_max, SAT_solver=dpll_satisfiable): - """Converts a planning problem to Satisfaction problem by translating it to a cnf sentence. - [Figure 7.22]""" +def SAT_plan(init, transition, goal, t_max, SAT_solver=cdcl_satisfiable): + """ + [Figure 7.22] + Converts a planning problem to Satisfaction problem by translating it to a cnf sentence. 
+ >>> transition = {'A': {'Left': 'A', 'Right': 'B'}, 'B': {'Left': 'A', 'Right': 'C'}, 'C': {'Left': 'B', 'Right': 'C'}} + >>> SAT_plan('A', transition, 'C', 1) is None + True + """ # Functions used by SAT_plan def translate_to_SAT(init, transition, goal, time): @@ -715,14 +1636,16 @@ def translate_to_SAT(init, transition, goal, time): # Symbol claiming state s at time t state_counter = itertools.count() for s in states: - for t in range(time+1): - state_sym[s, t] = Expr("State_{}".format(next(state_counter))) + for t in range(time + 1): + state_sym[s, t] = Expr('S_{}'.format(next(state_counter))) # Add initial state axiom clauses.append(state_sym[init, 0]) # Add goal state axiom - clauses.append(state_sym[goal, time]) + clauses.append(state_sym[first(clause[0] for clause in state_sym + if set(conjuncts(clause[0])).issuperset(conjuncts(goal))), time]) \ + if isinstance(goal, Expr) else clauses.append(state_sym[goal, time]) # All possible transitions transition_counter = itertools.count() @@ -731,15 +1654,14 @@ def translate_to_SAT(init, transition, goal, time): s_ = transition[s][action] for t in range(time): # Action 'action' taken from state 's' at time 't' to reach 's_' - action_sym[s, action, t] = Expr( - "Transition_{}".format(next(transition_counter))) + action_sym[s, action, t] = Expr('T_{}'.format(next(transition_counter))) # Change the state from s to s_ - clauses.append(action_sym[s, action, t] |'==>'| state_sym[s, t]) - clauses.append(action_sym[s, action, t] |'==>'| state_sym[s_, t + 1]) + clauses.append(action_sym[s, action, t] | '==>' | state_sym[s, t]) + clauses.append(action_sym[s, action, t] | '==>' | state_sym[s_, t + 1]) # Allow only one state at any time - for t in range(time+1): + for t in range(time + 1): # must be a state at any time clauses.append(associate('|', [state_sym[s, t] for s in states])) @@ -771,7 +1693,7 @@ def extract_solution(model): return [action for s, action, time in true_transitions] # Body of SAT_plan algorithm - for t in 
range(t_max): + for t in range(t_max + 1): # dictionaries to help extract the solution from model state_sym = {} action_sym = {} @@ -787,9 +1709,14 @@ def extract_solution(model): def unify(x, y, s={}): - """Unify expressions x,y with substitution s; return a substitution that + """ + [Figure 9.1] + Unify expressions x,y with substitution s; return a substitution that would make x,y equal, or None if x,y can not unify. x and y can be - variables (e.g. Expr('x')), constants, lists, or Exprs. [Figure 9.1]""" + variables (e.g. Expr('x')), constants, lists, or Exprs. + >>> unify(x, 3, {}) + {x: 3} + """ if s is None: return None elif x == y: @@ -823,7 +1750,9 @@ def unify_var(var, x, s): elif occur_check(var, x, s): return None else: - return extend(s, var, x) + new_s = extend(s, var, x) + cascade_substitution(new_s) + return new_s def occur_check(var, x, s): @@ -842,13 +1771,6 @@ def occur_check(var, x, s): return False -def extend(s, var, val): - """Copy the substitution s and extend it by setting var to val; return copy.""" - s2 = s.copy() - s2[var] = val - return s2 - - def subst(s, x): """Substitute the substitution s into the expression x. >>> subst({x: 42, y:0}, F(x) + y) @@ -866,6 +1788,99 @@ def subst(s, x): return Expr(x.op, *[subst(s, arg) for arg in x.args]) +def cascade_substitution(s): + """This method allows to return a correct unifier in normal form + and perform a cascade substitution to s. + For every mapping in s perform a cascade substitution on s.get(x) + and if it is replaced with a function ensure that all the function + terms are correct updates by passing over them again. 
+ >>> s = {x: y, y: G(z)} + >>> cascade_substitution(s) + >>> s == {x: G(z), y: G(z)} + True + """ + + for x in s: + s[x] = subst(s, s.get(x)) + if isinstance(s.get(x), Expr) and not is_variable(s.get(x)): + # Ensure Function Terms are correct updates by passing over them again + s[x] = subst(s, s.get(x)) + + +def unify_mm(x, y, s={}): + """Unify expressions x,y with substitution s using an efficient rule-based + unification algorithm by Martelli & Montanari; return a substitution that + would make x,y equal, or None if x,y can not unify. x and y can be + variables (e.g. Expr('x')), constants, lists, or Exprs. + >>> unify_mm(x, 3, {}) + {x: 3} + """ + + set_eq = extend(s, x, y) + s = set_eq.copy() + while True: + trans = 0 + for x, y in set_eq.items(): + if x == y: + # if x = y this mapping is deleted (rule b) + del s[x] + elif not is_variable(x) and is_variable(y): + # if x is not a variable and y is a variable, rewrite it as y = x in s (rule a) + if s.get(y, None) is None: + s[y] = x + del s[x] + else: + # if a mapping already exist for variable y then apply + # variable elimination (there is a chance to apply rule d) + s[x] = vars_elimination(y, s) + elif not is_variable(x) and not is_variable(y): + # in which case x and y are not variables, if the two root function symbols + # are different, stop with failure, else apply term reduction (rule c) + if x.op is y.op and len(x.args) == len(y.args): + term_reduction(x, y, s) + del s[x] + else: + return None + elif isinstance(y, Expr): + # in which case x is a variable and y is a function or a variable (e.g. 
F(z) or y), + # if y is a function, we must check if x occurs in y, then stop with failure, else + # try to apply variable elimination to y (rule d) + if occur_check(x, y, s): + return None + s[x] = vars_elimination(y, s) + if y == s.get(x): + trans += 1 + else: + trans += 1 + if trans == len(set_eq): + # if no transformation has been applied, stop with success + return s + set_eq = s.copy() + + +def term_reduction(x, y, s): + """Apply term reduction to x and y if both are functions and the two root function + symbols are equals (e.g. F(x1, x2, ..., xn) and F(x1', x2', ..., xn')) by returning + a new mapping obtained by replacing x: y with {x1: x1', x2: x2', ..., xn: xn'} + """ + for i in range(len(x.args)): + if x.args[i] in s: + s[s.get(x.args[i])] = y.args[i] + else: + s[x.args[i]] = y.args[i] + + +def vars_elimination(x, s): + """Apply variable elimination to x: if x is a variable and occurs in s, return + the term mapped by x, else if x is a function recursively applies variable + elimination to each term of the function.""" + if not isinstance(x, Expr): + return x + if is_variable(x): + return s.get(x, x) + return Expr(x.op, *[vars_elimination(arg, s) for arg in x.args]) + + def standardize_variables(sentence, dic=None): """Replace all the variables in sentence with new variables.""" if dic is None: @@ -880,12 +1895,25 @@ def standardize_variables(sentence, dic=None): dic[sentence] = v return v else: - return Expr(sentence.op, - *[standardize_variables(a, dic) for a in sentence.args]) + return Expr(sentence.op, *[standardize_variables(a, dic) for a in sentence.args]) standardize_variables.counter = itertools.count() + +# ______________________________________________________________________________ + + +def parse_clauses_from_dimacs(dimacs_cnf): + """Converts a string into CNF clauses according to the DIMACS format used in SAT competitions""" + return map(lambda c: associate('|', c), + map(lambda c: [expr('~X' + str(abs(l))) if l < 0 else expr('X' + str(l)) 
for l in c], + map(lambda line: map(int, line.split()), + filter(None, ' '.join( + filter(lambda line: line[0] not in ('c', 'p'), + filter(None, dimacs_cnf.strip().replace('\t', ' ').split('\n')))).split(' 0'))))) + + # ______________________________________________________________________________ @@ -901,16 +1929,18 @@ class FolKB(KB): False """ - def __init__(self, initial_clauses=[]): + def __init__(self, clauses=None): + super().__init__() self.clauses = [] # inefficient: no indexing - for clause in initial_clauses: - self.tell(clause) + if clauses: + for clause in clauses: + self.tell(clause) def tell(self, sentence): if is_definite_clause(sentence): self.clauses.append(sentence) else: - raise Exception("Not a definite clause: {}".format(sentence)) + raise Exception('Not a definite clause: {}'.format(sentence)) def ask_generator(self, query): return fol_bc_ask(self, query) @@ -922,10 +1952,14 @@ def fetch_rules_for_goal(self, goal): return self.clauses -def fol_fc_ask(KB, alpha): - """A simple forward-chaining algorithm. [Figure 9.3]""" - # TODO: Improve efficiency - kb_consts = list({c for clause in KB.clauses for c in constant_symbols(clause)}) +def fol_fc_ask(kb, alpha): + """ + [Figure 9.3] + A simple forward-chaining algorithm. 
+ """ + # TODO: improve efficiency + kb_consts = list({c for clause in kb.clauses for c in constant_symbols(clause)}) + def enum_subst(p): query_vars = list({v for clause in p for v in variables(clause)}) for assignment_list in itertools.product(kb_consts, repeat=len(query_vars)): @@ -933,96 +1967,96 @@ def enum_subst(p): yield theta # check if we can answer without new inferences - for q in KB.clauses: - phi = unify(q, alpha, {}) + for q in kb.clauses: + phi = unify_mm(q, alpha) if phi is not None: yield phi while True: new = [] - for rule in KB.clauses: + for rule in kb.clauses: p, q = parse_definite_clause(rule) for theta in enum_subst(p): - if set(subst(theta, p)).issubset(set(KB.clauses)): + if set(subst(theta, p)).issubset(set(kb.clauses)): q_ = subst(theta, q) - if all([unify(x, q_, {}) is None for x in KB.clauses + new]): + if all([unify_mm(x, q_) is None for x in kb.clauses + new]): new.append(q_) - phi = unify(q_, alpha, {}) + phi = unify_mm(q_, alpha) if phi is not None: yield phi if not new: break for clause in new: - KB.tell(clause) + kb.tell(clause) return None -def fol_bc_ask(KB, query): - """A simple backward-chaining algorithm for first-order logic. [Figure 9.6] - KB should be an instance of FolKB, and query an atomic sentence.""" - return fol_bc_or(KB, query, {}) +def fol_bc_ask(kb, query): + """ + [Figure 9.6] + A simple backward-chaining algorithm for first-order logic. + KB should be an instance of FolKB, and query an atomic sentence. 
+ """ + return fol_bc_or(kb, query, {}) -def fol_bc_or(KB, goal, theta): - for rule in KB.fetch_rules_for_goal(goal): +def fol_bc_or(kb, goal, theta): + for rule in kb.fetch_rules_for_goal(goal): lhs, rhs = parse_definite_clause(standardize_variables(rule)) - for theta1 in fol_bc_and(KB, lhs, unify(rhs, goal, theta)): + for theta1 in fol_bc_and(kb, lhs, unify_mm(rhs, goal, theta)): yield theta1 -def fol_bc_and(KB, goals, theta): +def fol_bc_and(kb, goals, theta): if theta is None: pass elif not goals: yield theta else: first, rest = goals[0], goals[1:] - for theta1 in fol_bc_or(KB, subst(theta, first), theta): - for theta2 in fol_bc_and(KB, rest, theta1): + for theta1 in fol_bc_or(kb, subst(theta, first), theta): + for theta2 in fol_bc_and(kb, rest, theta1): yield theta2 -# A simple KB that defines the relevant conditions of the Wumpus World as in Fig 7.4. +# A simple KB that defines the relevant conditions of the Wumpus World as in Figure 7.4. # See Sec. 7.4.3 wumpus_kb = PropKB() P11, P12, P21, P22, P31, B11, B21 = expr('P11, P12, P21, P22, P31, B11, B21') wumpus_kb.tell(~P11) -wumpus_kb.tell(B11 | '<=>' | ((P12 | P21))) -wumpus_kb.tell(B21 | '<=>' | ((P11 | P22 | P31))) +wumpus_kb.tell(B11 | '<=>' | (P12 | P21)) +wumpus_kb.tell(B21 | '<=>' | (P11 | P22 | P31)) wumpus_kb.tell(~B11) wumpus_kb.tell(B21) -test_kb = FolKB( - map(expr, ['Farmer(Mac)', - 'Rabbit(Pete)', - 'Mother(MrsMac, Mac)', - 'Mother(MrsRabbit, Pete)', - '(Rabbit(r) & Farmer(f)) ==> Hates(f, r)', - '(Mother(m, c)) ==> Loves(m, c)', - '(Mother(m, r) & Rabbit(r)) ==> Rabbit(m)', - '(Farmer(f)) ==> Human(f)', - # Note that this order of conjuncts - # would result in infinite recursion: - # '(Human(h) & Mother(m, h)) ==> Human(m)' - '(Mother(m, h) & Human(h)) ==> Human(m)' - ])) - -crime_kb = FolKB( - map(expr, ['(American(x) & Weapon(y) & Sells(x, y, z) & Hostile(z)) ==> Criminal(x)', - 'Owns(Nono, M1)', - 'Missile(M1)', - '(Missile(x) & Owns(Nono, x)) ==> Sells(West, x, Nono)', - 'Missile(x) ==> 
Weapon(x)', - 'Enemy(x, America) ==> Hostile(x)', - 'American(West)', - 'Enemy(Nono, America)' - ])) +test_kb = FolKB(map(expr, ['Farmer(Mac)', + 'Rabbit(Pete)', + 'Mother(MrsMac, Mac)', + 'Mother(MrsRabbit, Pete)', + '(Rabbit(r) & Farmer(f)) ==> Hates(f, r)', + '(Mother(m, c)) ==> Loves(m, c)', + '(Mother(m, r) & Rabbit(r)) ==> Rabbit(m)', + '(Farmer(f)) ==> Human(f)', + # Note that this order of conjuncts + # would result in infinite recursion: + # '(Human(h) & Mother(m, h)) ==> Human(m)' + '(Mother(m, h) & Human(h)) ==> Human(m)'])) + +crime_kb = FolKB(map(expr, ['(American(x) & Weapon(y) & Sells(x, y, z) & Hostile(z)) ==> Criminal(x)', + 'Owns(Nono, M1)', + 'Missile(M1)', + '(Missile(x) & Owns(Nono, x)) ==> Sells(West, x, Nono)', + 'Missile(x) ==> Weapon(x)', + 'Enemy(x, America) ==> Hostile(x)', + 'American(West)', + 'Enemy(Nono, America)'])) + # ______________________________________________________________________________ # Example application (not in the book). -# You can use the Expr class to do symbolic differentiation. This used to be +# You can use the Expr class to do symbolic differentiation. This used to be # a part of AI; now it is considered a separate field, Symbolic Algebra. 
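The comment above introduces symbolic differentiation with the Expr class. As a quick standalone illustration of the recursive rules that diff and simp apply, here is a minimal sketch over plain Python tuples; it does not use the repository's Expr type, and the helper names d and simp merely mirror the functions defined in logic.py:

```python
# Standalone sketch of recursive symbolic differentiation with on-the-fly
# simplification. An expression is a number (constant), a string (variable),
# or a binary tuple (op, u, v). Illustration only; not logic.py's Expr code.

def simp(e):
    """Apply a few local simplification rules (0+u -> u, 0*u -> 0, 1*u -> u)."""
    if not isinstance(e, tuple):
        return e
    op, u, v = e
    if op == '+':
        if u == 0:
            return v
        if v == 0:
            return u
    elif op == '-' and v == 0:
        return u
    elif op == '*':
        if u == 0 or v == 0:
            return 0
        if u == 1:
            return v
        if v == 1:
            return u
    return e

def d(y, x):
    """Differentiate y with respect to variable x, simplifying as we go."""
    if isinstance(y, (int, float)):
        return 0                   # derivative of a constant is 0
    if isinstance(y, str):
        return 1 if y == x else 0  # dx/dx = 1, dy/dx = 0 for other variables
    op, u, v = y
    if op in ('+', '-'):           # sum and difference rules
        return simp((op, d(u, x), d(v, x)))
    if op == '*':                  # product rule: (u*v)' = u'*v + u*v'
        return simp(('+', simp(('*', d(u, x), v)), simp(('*', u, d(v, x)))))
    raise ValueError('Unknown op: {}'.format(op))
```

For example, `d(('*', 'x', 'x'), 'x')` yields `('+', 'x', 'x')`, i.e. 2x, and `d(('+', 'x', 3), 'x')` simplifies all the way down to `1` because the constant's derivative vanishes.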
@@ -1049,14 +2083,14 @@ def diff(y, x):
     elif op == '/':
         return (v * diff(u, x) - u * diff(v, x)) / (v * v)
     elif op == '**' and isnumber(x.op):
-        return (v * u ** (v - 1) * diff(u, x))
+        return v * u ** (v - 1) * diff(u, x)
     elif op == '**':
         return (v * u ** (v - 1) * diff(u, x) +
                 u ** v * Expr('log')(u) * diff(v, x))
     elif op == 'log':
         return diff(u, x) / u
     else:
-        raise ValueError("Unknown op: {} in diff({}, {})".format(op, y, x))
+        raise ValueError('Unknown op: {} in diff({}, {})'.format(op, y, x))


 def simp(x):
@@ -1117,11 +2151,14 @@ def simp(x):
         if u == 1:
             return 0
     else:
-        raise ValueError("Unknown op: " + op)
+        raise ValueError('Unknown op: ' + op)
     # If we fall through to here, we can not simplify further
     return Expr(op, *args)


 def d(y, x):
-    """Differentiate and then simplify."""
+    """Differentiate and then simplify.
+    >>> d(x * x - x, x)
+    ((2 * x) - 1)
+    """
     return simp(diff(y, x))
diff --git a/logic4e.py b/logic4e.py
new file mode 100644
index 000000000..75608ad74
--- /dev/null
+++ b/logic4e.py
@@ -0,0 +1,1665 @@
+"""Representations and Inference for Logic (Chapters 7-10)
+
+Covers both Propositional and First-Order Logic. First we have four
+important data types:
+
+    KB            Abstract class holds a knowledge base of logical expressions
+    KB_Agent      Abstract class subclasses agents.Agent
+    Expr          A logical expression, imported from utils.py
+    substitution  Implemented as a dictionary of var:value pairs, {x:1, y:x}
+
+Be careful: some functions take an Expr as argument, and some take a KB.
+
+Logical expressions can be created with Expr, or with expr, imported from
+utils, which adds the capability to write a string that uses the
+connectives ==>, <==, <=>, or <=/=>. But be careful: these have the
+operator precedence of commas; you may need to add parens to make precedence work.
+See logic.ipynb for examples.
+ +Then we implement various functions for doing logical inference: + + pl_true Evaluate a propositional logical sentence in a model + tt_entails Say if a statement is entailed by a KB + pl_resolution Do resolution on propositional sentences + dpll_satisfiable See if a propositional sentence is satisfiable + WalkSAT Try to find a solution for a set of clauses + +And a few other functions: + + to_cnf Convert to conjunctive normal form + unify Do unification of two FOL sentences + diff, simp Symbolic differentiation and simplification +""" +import itertools +import random +from collections import defaultdict + +from agents import Agent, Glitter, Bump, Stench, Breeze, Scream +from search import astar_search, PlanRoute +from utils4e import remove_all, unique, first, probability, isnumber, issequence, Expr, expr, subexpressions + + +# ______________________________________________________________________________ +# Chapter 7 Logical Agents +# 7.1 Knowledge Based Agents + + +class KB: + """ + A knowledge base to which you can tell and ask sentences. + To create a KB, subclass this class and implement tell, ask_generator, and retract. + Ask_generator: + For a Propositional Logic KB, ask(P & Q) returns True or False, but for an + FOL KB, something like ask(Brother(x, y)) might return many substitutions + such as {x: Cain, y: Abel}, {x: Abel, y: Cain}, {x: George, y: Jeb}, etc. + So ask_generator generates these one at a time, and ask either returns the + first one or returns False. 
+ """ + + def __init__(self, sentence=None): + raise NotImplementedError + + def tell(self, sentence): + """Add the sentence to the KB.""" + raise NotImplementedError + + def ask(self, query): + """Return a substitution that makes the query true, or, failing that, return False.""" + return first(self.ask_generator(query), default=False) + + def ask_generator(self, query): + """Yield all the substitutions that make query true.""" + raise NotImplementedError + + def retract(self, sentence): + """Remove sentence from the KB.""" + raise NotImplementedError + + +class PropKB(KB): + """A KB for propositional logic. Inefficient, with no indexing.""" + + def __init__(self, sentence=None): + self.clauses = [] + if sentence: + self.tell(sentence) + + def tell(self, sentence): + """Add the sentence's clauses to the KB.""" + self.clauses.extend(conjuncts(to_cnf(sentence))) + + def ask_generator(self, query): + """Yield the empty substitution {} if KB entails query; else no results.""" + if tt_entails(Expr('&', *self.clauses), query): + yield {} + + def ask_if_true(self, query): + """Return True if the KB entails query, else return False.""" + for _ in self.ask_generator(query): + return True + return False + + def retract(self, sentence): + """Remove the sentence's clauses from the KB.""" + for c in conjuncts(to_cnf(sentence)): + if c in self.clauses: + self.clauses.remove(c) + + +def KB_AgentProgram(KB): + """A generic logical knowledge-based agent program. 
[Figure 7.1]""" + steps = itertools.count() + + def program(percept): + t = next(steps) + KB.tell(make_percept_sentence(percept, t)) + action = KB.ask(make_action_query(t)) + KB.tell(make_action_sentence(action, t)) + return action + + def make_percept_sentence(percept, t): + return Expr("Percept")(percept, t) + + def make_action_query(t): + return expr("ShouldDo(action, {})".format(t)) + + def make_action_sentence(action, t): + return Expr("Did")(action[expr('action')], t) + + return program + + +# _____________________________________________________________________________ +# 7.2 The Wumpus World + + +# Expr functions for WumpusKB and HybridWumpusAgent + + +def facing_east(time): + return Expr('FacingEast', time) + + +def facing_west(time): + return Expr('FacingWest', time) + + +def facing_north(time): + return Expr('FacingNorth', time) + + +def facing_south(time): + return Expr('FacingSouth', time) + + +def wumpus(x, y): + return Expr('W', x, y) + + +def pit(x, y): + return Expr('P', x, y) + + +def breeze(x, y): + return Expr('B', x, y) + + +def stench(x, y): + return Expr('S', x, y) + + +def wumpus_alive(time): + return Expr('WumpusAlive', time) + + +def have_arrow(time): + return Expr('HaveArrow', time) + + +def percept_stench(time): + return Expr('Stench', time) + + +def percept_breeze(time): + return Expr('Breeze', time) + + +def percept_glitter(time): + return Expr('Glitter', time) + + +def percept_bump(time): + return Expr('Bump', time) + + +def percept_scream(time): + return Expr('Scream', time) + + +def move_forward(time): + return Expr('Forward', time) + + +def shoot(time): + return Expr('Shoot', time) + + +def turn_left(time): + return Expr('TurnLeft', time) + + +def turn_right(time): + return Expr('TurnRight', time) + + +def ok_to_move(x, y, time): + return Expr('OK', x, y, time) + + +def location(x, y, time=None): + if time is None: + return Expr('L', x, y) + else: + return Expr('L', x, y, time) + + +# Symbols + + +def implies(lhs, rhs): + return 
Expr('==>', lhs, rhs)
+
+
+def equiv(lhs, rhs):
+    return Expr('<=>', lhs, rhs)
+
+
+# Helper Function
+
+
+def new_disjunction(sentences):
+    t = sentences[0]
+    for i in range(1, len(sentences)):
+        t |= sentences[i]
+    return t
+
+
+# ______________________________________________________________________________
+# 7.4 Propositional Logic
+
+
+def is_symbol(s):
+    """A string s is a symbol if it starts with an alphabetic char.
+    >>> is_symbol('R2D2')
+    True
+    """
+    return isinstance(s, str) and s[:1].isalpha()
+
+
+def is_var_symbol(s):
+    """A logic variable symbol is an initial-lowercase string.
+    >>> is_var_symbol('EXE')
+    False
+    """
+    return is_symbol(s) and s[0].islower()
+
+
+def is_prop_symbol(s):
+    """A propositional logic symbol is an initial-uppercase string.
+    >>> is_prop_symbol('exe')
+    False
+    """
+    return is_symbol(s) and s[0].isupper()
+
+
+def variables(s):
+    """Return a set of the variables in expression s.
+    >>> variables(expr('F(x, x) & G(x, y) & H(y, z) & R(A, z, 2)')) == {x, y, z}
+    True
+    """
+    return {x for x in subexpressions(s) if is_variable(x)}
+
+
+def is_definite_clause(s):
+    """
+    Returns True for exprs s of the form A & B & ... & C ==> D,
+    where all literals are positive. In clause form, this is
+    ~A | ~B | ... | ~C | D, where exactly one literal is positive.
+ >>> is_definite_clause(expr('Farmer(Mac)')) + True + """ + if is_symbol(s.op): + return True + elif s.op == '==>': + antecedent, consequent = s.args + return (is_symbol(consequent.op) and + all(is_symbol(arg.op) for arg in conjuncts(antecedent))) + else: + return False + + +def parse_definite_clause(s): + """Return the antecedents and the consequent of a definite clause.""" + assert is_definite_clause(s) + if is_symbol(s.op): + return [], s + else: + antecedent, consequent = s.args + return conjuncts(antecedent), consequent + + +# Useful constant Exprs used in examples and code: +A, B, C, D, E, F, G, P, Q, x, y, z = map(Expr, 'ABCDEFGPQxyz') + + +# ______________________________________________________________________________ +# 7.4.4 A simple inference procedure + + +def tt_entails(kb, alpha): + """ + Does kb entail the sentence alpha? Use truth tables. For propositional + kb's and sentences. [Figure 7.10]. Note that the 'kb' should be an + Expr which is a conjunction of clauses. + >>> tt_entails(expr('P & Q'), expr('Q')) + True + """ + assert not variables(alpha) + symbols = list(prop_symbols(kb & alpha)) + return tt_check_all(kb, alpha, symbols, {}) + + +def tt_check_all(kb, alpha, symbols, model): + """Auxiliary routine to implement tt_entails.""" + if not symbols: + if pl_true(kb, model): + result = pl_true(alpha, model) + assert result in (True, False) + return result + else: + return True + else: + P, rest = symbols[0], symbols[1:] + return (tt_check_all(kb, alpha, rest, extend(model, P, True)) and + tt_check_all(kb, alpha, rest, extend(model, P, False))) + + +def prop_symbols(x): + """Return the set of all propositional symbols in x.""" + if not isinstance(x, Expr): + return set() + elif is_prop_symbol(x.op): + return {x} + else: + return {symbol for arg in x.args for symbol in prop_symbols(arg)} + + +def constant_symbols(x): + """Return the set of all constant symbols in x.""" + if not isinstance(x, Expr): + return set() + elif is_prop_symbol(x.op) and 
not x.args: + return {x} + else: + return {symbol for arg in x.args for symbol in constant_symbols(arg)} + + +def predicate_symbols(x): + """ + Return a set of (symbol_name, arity) in x. + All symbols (even functional) with arity > 0 are considered. + """ + if not isinstance(x, Expr) or not x.args: + return set() + pred_set = {(x.op, len(x.args))} if is_prop_symbol(x.op) else set() + pred_set.update({symbol for arg in x.args for symbol in predicate_symbols(arg)}) + return pred_set + + +def tt_true(s): + """Is a propositional sentence a tautology? + >>> tt_true('P | ~P') + True + """ + s = expr(s) + return tt_entails(True, s) + + +def pl_true(exp, model={}): + """ + Return True if the propositional logic expression is true in the model, + and False if it is false. If the model does not specify the value for + every proposition, this may return None to indicate 'not obvious'; + this may happen even when the expression is tautological. + >>> pl_true(P, {}) is None + True + """ + if exp in (True, False): + return exp + op, args = exp.op, exp.args + if is_prop_symbol(op): + return model.get(exp) + elif op == '~': + p = pl_true(args[0], model) + if p is None: + return None + else: + return not p + elif op == '|': + result = False + for arg in args: + p = pl_true(arg, model) + if p is True: + return True + if p is None: + result = None + return result + elif op == '&': + result = True + for arg in args: + p = pl_true(arg, model) + if p is False: + return False + if p is None: + result = None + return result + p, q = args + if op == '==>': + return pl_true(~p | q, model) + elif op == '<==': + return pl_true(p | ~q, model) + pt = pl_true(p, model) + if pt is None: + return None + qt = pl_true(q, model) + if qt is None: + return None + if op == '<=>': + return pt == qt + elif op == '^': # xor or 'not equivalent' + return pt != qt + else: + raise ValueError("illegal operator in logic expression" + str(exp)) + + +# 
______________________________________________________________________________ +# 7.5 Propositional Theorem Proving + + +def to_cnf(s): + """Convert a propositional logical sentence to conjunctive normal form. + That is, to the form ((A | ~B | ...) & (B | C | ...) & ...) [p. 253] + >>> to_cnf('~(B | C)') + (~B & ~C) + """ + s = expr(s) + if isinstance(s, str): + s = expr(s) + s = eliminate_implications(s) # Steps 1, 2 from p. 253 + s = move_not_inwards(s) # Step 3 + return distribute_and_over_or(s) # Step 4 + + +def eliminate_implications(s): + """Change implications into equivalent form with only &, |, and ~ as logical operators.""" + s = expr(s) + if not s.args or is_symbol(s.op): + return s # Atoms are unchanged. + args = list(map(eliminate_implications, s.args)) + a, b = args[0], args[-1] + if s.op == '==>': + return b | ~a + elif s.op == '<==': + return a | ~b + elif s.op == '<=>': + return (a | ~b) & (b | ~a) + elif s.op == '^': + assert len(args) == 2 # TODO: relax this restriction + return (a & ~b) | (~a & b) + else: + assert s.op in ('&', '|', '~') + return Expr(s.op, *args) + + +def move_not_inwards(s): + """Rewrite sentence s by moving negation sign inward. + >>> move_not_inwards(~(A | B)) + (~A & ~B) + """ + s = expr(s) + if s.op == '~': + def NOT(b): + return move_not_inwards(~b) + + a = s.args[0] + if a.op == '~': + return move_not_inwards(a.args[0]) # ~~A ==> A + if a.op == '&': + return associate('|', list(map(NOT, a.args))) + if a.op == '|': + return associate('&', list(map(NOT, a.args))) + return s + elif is_symbol(s.op) or not s.args: + return s + else: + return Expr(s.op, *list(map(move_not_inwards, s.args))) + + +def distribute_and_over_or(s): + """Given a sentence s consisting of conjunctions and disjunctions + of literals, return an equivalent sentence in CNF. 
+ >>> distribute_and_over_or((A & B) | C) + ((A | C) & (B | C)) + """ + s = expr(s) + if s.op == '|': + s = associate('|', s.args) + if s.op != '|': + return distribute_and_over_or(s) + if len(s.args) == 0: + return False + if len(s.args) == 1: + return distribute_and_over_or(s.args[0]) + conj = first(arg for arg in s.args if arg.op == '&') + if not conj: + return s + others = [a for a in s.args if a is not conj] + rest = associate('|', others) + return associate('&', [distribute_and_over_or(c | rest) + for c in conj.args]) + elif s.op == '&': + return associate('&', list(map(distribute_and_over_or, s.args))) + else: + return s + + +def associate(op, args): + """Given an associative op, return an expression with the same + meaning as Expr(op, *args), but flattened -- that is, with nested + instances of the same op promoted to the top level. + >>> associate('&', [(A&B),(B|C),(B&C)]) + (A & B & (B | C) & B & C) + >>> associate('|', [A|(B|(C|(A&B)))]) + (A | B | C | (A & B)) + """ + args = dissociate(op, args) + if len(args) == 0: + return _op_identity[op] + elif len(args) == 1: + return args[0] + else: + return Expr(op, *args) + + +_op_identity = {'&': True, '|': False, '+': 0, '*': 1} + + +def dissociate(op, args): + """Given an associative op, return a flattened list result such + that Expr(op, *result) means the same as Expr(op, *args). + >>> dissociate('&', [A & B]) + [A, B] + """ + result = [] + + def collect(subargs): + for arg in subargs: + if arg.op == op: + collect(arg.args) + else: + result.append(arg) + + collect(args) + return result + + +def conjuncts(s): + """Return a list of the conjuncts in the sentence s. + >>> conjuncts(A & B) + [A, B] + >>> conjuncts(A | B) + [(A | B)] + """ + return dissociate('&', [s]) + + +def disjuncts(s): + """Return a list of the disjuncts in the sentence s. 
+ >>> disjuncts(A | B) + [A, B] + >>> disjuncts(A & B) + [(A & B)] + """ + return dissociate('|', [s]) + + +# ______________________________________________________________________________ + + +def pl_resolution(KB, alpha): + """ + Propositional-logic resolution: say if alpha follows from KB. [Figure 7.12] + >>> pl_resolution(horn_clauses_KB, A) + True + """ + clauses = KB.clauses + conjuncts(to_cnf(~alpha)) + new = set() + while True: + n = len(clauses) + pairs = [(clauses[i], clauses[j]) + for i in range(n) for j in range(i + 1, n)] + for (ci, cj) in pairs: + resolvents = pl_resolve(ci, cj) + if False in resolvents: + return True + new = new.union(set(resolvents)) + if new.issubset(set(clauses)): + return False + for c in new: + if c not in clauses: + clauses.append(c) + + +def pl_resolve(ci, cj): + """Return all clauses that can be obtained by resolving clauses ci and cj.""" + clauses = [] + for di in disjuncts(ci): + for dj in disjuncts(cj): + if di == ~dj or ~di == dj: + dnew = unique(remove_all(di, disjuncts(ci)) + + remove_all(dj, disjuncts(cj))) + clauses.append(associate('|', dnew)) + return clauses + + +# ______________________________________________________________________________ +# 7.5.4 Forward and backward chaining + + +class PropDefiniteKB(PropKB): + """A KB of propositional definite clauses.""" + + def tell(self, sentence): + """Add a definite clause to this KB.""" + assert is_definite_clause(sentence), "Must be definite clause" + self.clauses.append(sentence) + + def ask_generator(self, query): + """Yield the empty substitution if KB implies query; else nothing.""" + if pl_fc_entails(self.clauses, query): + yield {} + + def retract(self, sentence): + self.clauses.remove(sentence) + + def clauses_with_premise(self, p): + """Return a list of the clauses in KB that have p in their premise. 
+ This could be cached away for O(1) speed, but we'll recompute it.""" + return [c for c in self.clauses + if c.op == '==>' and p in conjuncts(c.args[0])] + + +def pl_fc_entails(KB, q): + """Use forward chaining to see if a PropDefiniteKB entails symbol q. + [Figure 7.15] + >>> pl_fc_entails(horn_clauses_KB, expr('Q')) + True + """ + count = {c: len(conjuncts(c.args[0])) + for c in KB.clauses + if c.op == '==>'} + inferred = defaultdict(bool) + agenda = [s for s in KB.clauses if is_prop_symbol(s.op)] + while agenda: + p = agenda.pop() + if p == q: + return True + if not inferred[p]: + inferred[p] = True + for c in KB.clauses_with_premise(p): + count[c] -= 1 + if count[c] == 0: + agenda.append(c.args[1]) + return False + + +""" [Figure 7.13] +Simple inference in a wumpus world example +""" +wumpus_world_inference = expr("(B11 <=> (P12 | P21)) & ~B11") + +""" [Figure 7.16] +Propositional Logic Forward Chaining example +""" +horn_clauses_KB = PropDefiniteKB() +for s in "P==>Q; (L&M)==>P; (B&L)==>M; (A&P)==>L; (A&B)==>L; A;B".split(';'): + horn_clauses_KB.tell(expr(s)) + +""" +Definite clauses KB example +""" +definite_clauses_KB = PropDefiniteKB() +for clause in ['(B & F)==>E', '(A & E & F)==>G', '(B & C)==>F', '(A & B)==>D', '(E & F)==>H', '(H & I)==>J', 'A', 'B', + 'C']: + definite_clauses_KB.tell(expr(clause)) + + +# ______________________________________________________________________________ +# 7.6 Effective Propositional Model Checking +# DPLL-Satisfiable [Figure 7.17] + + +def dpll_satisfiable(s): + """Check satisfiability of a propositional sentence. + This differs from the book code in two ways: (1) it returns a model + rather than True when it succeeds; this is more useful. (2) The + function find_pure_symbol is passed a list of unknown clauses, rather + than a list of all clauses and the model; this is more efficient. 
+ >>> dpll_satisfiable(A |'<=>'| B) == {A: True, B: True} + True + """ + clauses = conjuncts(to_cnf(s)) + symbols = list(prop_symbols(s)) + return dpll(clauses, symbols, {}) + + +def dpll(clauses, symbols, model): + """See if the clauses are true in a partial model.""" + unknown_clauses = [] # clauses with an unknown truth value + for c in clauses: + val = pl_true(c, model) + if val is False: + return False + if val is not True: + unknown_clauses.append(c) + if not unknown_clauses: + return model + P, value = find_pure_symbol(symbols, unknown_clauses) + if P: + return dpll(clauses, remove_all(P, symbols), extend(model, P, value)) + P, value = find_unit_clause(clauses, model) + if P: + return dpll(clauses, remove_all(P, symbols), extend(model, P, value)) + if not symbols: + raise TypeError("Argument should be of the type Expr.") + P, symbols = symbols[0], symbols[1:] + return (dpll(clauses, symbols, extend(model, P, True)) or + dpll(clauses, symbols, extend(model, P, False))) + + +def find_pure_symbol(symbols, clauses): + """ + Find a symbol and its value if it appears only as a positive literal + (or only as a negative) in clauses. + >>> find_pure_symbol([A, B, C], [A|~B,~B|~C,C|A]) + (A, True) + """ + for s in symbols: + found_pos, found_neg = False, False + for c in clauses: + if not found_pos and s in disjuncts(c): + found_pos = True + if not found_neg and ~s in disjuncts(c): + found_neg = True + if found_pos != found_neg: + return s, found_pos + return None, None + + +def find_unit_clause(clauses, model): + """ + Find a forced assignment if possible from a clause with only 1 + variable not bound in the model. + >>> find_unit_clause([A|B|C, B|~C, ~A|~B], {A:True}) + (B, False) + """ + for clause in clauses: + P, value = unit_clause_assign(clause, model) + if P: + return P, value + return None, None + + +def unit_clause_assign(clause, model): + """Return a single variable/value pair that makes clause true in + the model, if possible. 
+    >>> unit_clause_assign(A|B|C, {A:True})
+    (None, None)
+    >>> unit_clause_assign(B|~C, {A:True})
+    (None, None)
+    >>> unit_clause_assign(~A|~B, {A:True})
+    (B, False)
+    """
+    P, value = None, None
+    for literal in disjuncts(clause):
+        sym, positive = inspect_literal(literal)
+        if sym in model:
+            if model[sym] == positive:
+                return None, None  # clause already True
+        elif P:
+            return None, None  # more than 1 unbound variable
+        else:
+            P, value = sym, positive
+    return P, value
+
+
+def inspect_literal(literal):
+    """The symbol in this literal, and the value it should take to
+    make the literal true.
+    >>> inspect_literal(P)
+    (P, True)
+    >>> inspect_literal(~P)
+    (P, False)
+    """
+    if literal.op == '~':
+        return literal.args[0], False
+    else:
+        return literal, True
+
+
+# ______________________________________________________________________________
+# 7.6.2 Local search algorithms
+# Walk-SAT [Figure 7.18]
+
+
+def WalkSAT(clauses, p=0.5, max_flips=10000):
+    """
+    Checks for satisfiability of all clauses by randomly flipping values of variables.
+    >>> WalkSAT([A & ~A], 0.5, 100) is None
+    True
+    """
+    # Set of all symbols in all clauses
+    symbols = {sym for clause in clauses for sym in prop_symbols(clause)}
+    # model is a random assignment of true/false to the symbols in clauses
+    model = {s: random.choice([True, False]) for s in symbols}
+    for i in range(max_flips):
+        satisfied, unsatisfied = [], []
+        for clause in clauses:
+            (satisfied if pl_true(clause, model) else unsatisfied).append(clause)
+        if not unsatisfied:  # if model satisfies all the clauses
+            return model
+        clause = random.choice(unsatisfied)
+        if probability(p):
+            sym = random.choice(list(prop_symbols(clause)))
+        else:
+            # Flip the symbol in clause that maximizes number of sat. clauses
+            def sat_count(sym):
+                # Return the number of clauses satisfied after flipping the symbol.
+ model[sym] = not model[sym] + count = len([clause for clause in clauses if pl_true(clause, model)]) + model[sym] = not model[sym] + return count + + sym = max(prop_symbols(clause), key=sat_count) + model[sym] = not model[sym] + # If no solution is found within the flip limit, we return failure + return None + + +# ______________________________________________________________________________ +# 7.7 Agents Based on Propositional Logic +# 7.7.1 The current state of the world + + +class WumpusKB(PropKB): + """ + Create a Knowledge Base that contains the atemporal "Wumpus physics" and temporal rules with time zero. + """ + + def __init__(self, dimrow): + super().__init__() + self.dimrow = dimrow + self.tell(~wumpus(1, 1)) + self.tell(~pit(1, 1)) + + for y in range(1, dimrow + 1): + for x in range(1, dimrow + 1): + + pits_in = list() + wumpus_in = list() + + if x > 1: # West room exists + pits_in.append(pit(x - 1, y)) + wumpus_in.append(wumpus(x - 1, y)) + + if y < dimrow: # North room exists + pits_in.append(pit(x, y + 1)) + wumpus_in.append(wumpus(x, y + 1)) + + if x < dimrow: # East room exists + pits_in.append(pit(x + 1, y)) + wumpus_in.append(wumpus(x + 1, y)) + + if y > 1: # South room exists + pits_in.append(pit(x, y - 1)) + wumpus_in.append(wumpus(x, y - 1)) + + self.tell(equiv(breeze(x, y), new_disjunction(pits_in))) + self.tell(equiv(stench(x, y), new_disjunction(wumpus_in))) + + # Rule that describes existence of at least one Wumpus + wumpus_at_least = list() + for x in range(1, dimrow + 1): + for y in range(1, dimrow + 1): + wumpus_at_least.append(wumpus(x, y)) + + self.tell(new_disjunction(wumpus_at_least)) + + # Rule that describes existence of at most one Wumpus + for i in range(1, dimrow + 1): + for j in range(1, dimrow + 1): + for u in range(1, dimrow + 1): + for v in range(1, dimrow + 1): + if i != u or j != v: + self.tell(~wumpus(i, j) | ~wumpus(u, v)) + + # Temporal rules at time zero + self.tell(location(1, 1, 0)) + for i in range(1, dimrow + 1): 
+ for j in range(1, dimrow + 1): + self.tell(implies(location(i, j, 0), equiv(percept_breeze(0), breeze(i, j)))) + self.tell(implies(location(i, j, 0), equiv(percept_stench(0), stench(i, j)))) + if i != 1 or j != 1: + self.tell(~location(i, j, 0)) + + self.tell(wumpus_alive(0)) + self.tell(have_arrow(0)) + self.tell(facing_east(0)) + self.tell(~facing_north(0)) + self.tell(~facing_south(0)) + self.tell(~facing_west(0)) + + def make_action_sentence(self, action, time): + actions = [move_forward(time), shoot(time), turn_left(time), turn_right(time)] + + for a in actions: + if action is a: + self.tell(action) + else: + self.tell(~a) + + def make_percept_sentence(self, percept, time): + # Glitter, Bump, Stench, Breeze, Scream + flags = [0, 0, 0, 0, 0] + + # Things perceived + if isinstance(percept, Glitter): + flags[0] = 1 + self.tell(percept_glitter(time)) + elif isinstance(percept, Bump): + flags[1] = 1 + self.tell(percept_bump(time)) + elif isinstance(percept, Stench): + flags[2] = 1 + self.tell(percept_stench(time)) + elif isinstance(percept, Breeze): + flags[3] = 1 + self.tell(percept_breeze(time)) + elif isinstance(percept, Scream): + flags[4] = 1 + self.tell(percept_scream(time)) + + # Things not perceived + for i in range(len(flags)): + if flags[i] == 0: + if i == 0: + self.tell(~percept_glitter(time)) + elif i == 1: + self.tell(~percept_bump(time)) + elif i == 2: + self.tell(~percept_stench(time)) + elif i == 3: + self.tell(~percept_breeze(time)) + elif i == 4: + self.tell(~percept_scream(time)) + + def add_temporal_sentences(self, time): + if time == 0: + return + t = time - 1 + + # current location rules + for i in range(1, self.dimrow + 1): + for j in range(1, self.dimrow + 1): + self.tell(implies(location(i, j, time), equiv(percept_breeze(time), breeze(i, j)))) + self.tell(implies(location(i, j, time), equiv(percept_stench(time), stench(i, j)))) + + s = list() + + s.append( + equiv( + location(i, j, time), location(i, j, time) & ~move_forward(time) | 
percept_bump(time))) + + if i != 1: + s.append(location(i - 1, j, t) & facing_east(t) & move_forward(t)) + + if i != self.dimrow: + s.append(location(i + 1, j, t) & facing_west(t) & move_forward(t)) + + if j != 1: + s.append(location(i, j - 1, t) & facing_north(t) & move_forward(t)) + + if j != self.dimrow: + s.append(location(i, j + 1, t) & facing_south(t) & move_forward(t)) + + # add sentence about location i,j + self.tell(new_disjunction(s)) + + # add sentence about safety of location i,j + self.tell( + equiv(ok_to_move(i, j, time), ~pit(i, j) & ~wumpus(i, j) & wumpus_alive(time)) + ) + + # Rules about current orientation + + a = facing_north(t) & turn_right(t) + b = facing_south(t) & turn_left(t) + c = facing_east(t) & ~turn_left(t) & ~turn_right(t) + s = equiv(facing_east(time), a | b | c) + self.tell(s) + + a = facing_north(t) & turn_left(t) + b = facing_south(t) & turn_right(t) + c = facing_west(t) & ~turn_left(t) & ~turn_right(t) + s = equiv(facing_west(time), a | b | c) + self.tell(s) + + a = facing_east(t) & turn_left(t) + b = facing_west(t) & turn_right(t) + c = facing_north(t) & ~turn_left(t) & ~turn_right(t) + s = equiv(facing_north(time), a | b | c) + self.tell(s) + + a = facing_west(t) & turn_left(t) + b = facing_east(t) & turn_right(t) + c = facing_south(t) & ~turn_left(t) & ~turn_right(t) + s = equiv(facing_south(time), a | b | c) + self.tell(s) + + # Rules about last action + self.tell(equiv(move_forward(t), ~turn_right(t) & ~turn_left(t))) + + # Rule about the arrow + self.tell(equiv(have_arrow(time), have_arrow(t) & ~shoot(t))) + + # Rule about Wumpus (dead or alive) + self.tell(equiv(wumpus_alive(time), wumpus_alive(t) & ~percept_scream(time))) + + def ask_if_true(self, query): + return pl_resolution(self, query) + + +# ______________________________________________________________________________ + + +class WumpusPosition: + def __init__(self, x, y, orientation): + self.X = x + self.Y = y + self.orientation = orientation + + def 
get_location(self): + return self.X, self.Y + + def set_location(self, x, y): + self.X = x + self.Y = y + + def get_orientation(self): + return self.orientation + + def set_orientation(self, orientation): + self.orientation = orientation + + def __eq__(self, other): + if (other.get_location() == self.get_location() and + other.get_orientation() == self.get_orientation()): + return True + else: + return False + + +# ______________________________________________________________________________ +# 7.7.2 A hybrid agent + + +class HybridWumpusAgent(Agent): + """An agent for the wumpus world that does logical inference. [Figure 7.20]""" + + def __init__(self, dimentions): + self.dimrow = dimentions + self.kb = WumpusKB(self.dimrow) + self.t = 0 + self.plan = list() + self.current_position = WumpusPosition(1, 1, 'UP') + super().__init__(self.execute) + + def execute(self, percept): + self.kb.make_percept_sentence(percept, self.t) + self.kb.add_temporal_sentences(self.t) + + temp = list() + + for i in range(1, self.dimrow + 1): + for j in range(1, self.dimrow + 1): + if self.kb.ask_if_true(location(i, j, self.t)): + temp.append(i) + temp.append(j) + + if self.kb.ask_if_true(facing_north(self.t)): + self.current_position = WumpusPosition(temp[0], temp[1], 'UP') + elif self.kb.ask_if_true(facing_south(self.t)): + self.current_position = WumpusPosition(temp[0], temp[1], 'DOWN') + elif self.kb.ask_if_true(facing_west(self.t)): + self.current_position = WumpusPosition(temp[0], temp[1], 'LEFT') + elif self.kb.ask_if_true(facing_east(self.t)): + self.current_position = WumpusPosition(temp[0], temp[1], 'RIGHT') + + safe_points = list() + for i in range(1, self.dimrow + 1): + for j in range(1, self.dimrow + 1): + if self.kb.ask_if_true(ok_to_move(i, j, self.t)): + safe_points.append([i, j]) + + if self.kb.ask_if_true(percept_glitter(self.t)): + goals = list() + goals.append([1, 1]) + self.plan.append('Grab') + actions = self.plan_route(self.current_position, goals, safe_points) + 
self.plan.extend(actions) + self.plan.append('Climb') + + if len(self.plan) == 0: + unvisited = list() + for i in range(1, self.dimrow + 1): + for j in range(1, self.dimrow + 1): + for k in range(self.t): + if self.kb.ask_if_true(location(i, j, k)): + unvisited.append([i, j]) + unvisited_and_safe = list() + for u in unvisited: + for s in safe_points: + if u not in unvisited_and_safe and s == u: + unvisited_and_safe.append(u) + + temp = self.plan_route(self.current_position, unvisited_and_safe, safe_points) + self.plan.extend(temp) + + if len(self.plan) == 0 and self.kb.ask_if_true(have_arrow(self.t)): + possible_wumpus = list() + for i in range(1, self.dimrow + 1): + for j in range(1, self.dimrow + 1): + if not self.kb.ask_if_true(wumpus(i, j)): + possible_wumpus.append([i, j]) + + temp = self.plan_shot(self.current_position, possible_wumpus, safe_points) + self.plan.extend(temp) + + if len(self.plan) == 0: + not_unsafe = list() + for i in range(1, self.dimrow + 1): + for j in range(1, self.dimrow + 1): + if not self.kb.ask_if_true(ok_to_move(i, j, self.t)): + not_unsafe.append([i, j]) + temp = self.plan_route(self.current_position, not_unsafe, safe_points) + self.plan.extend(temp) + + if len(self.plan) == 0: + start = list() + start.append([1, 1]) + temp = self.plan_route(self.current_position, start, safe_points) + self.plan.extend(temp) + self.plan.append('Climb') + + action = self.plan[0] + self.plan = self.plan[1:] + self.kb.make_action_sentence(action, self.t) + self.t += 1 + + return action + + def plan_route(self, current, goals, allowed): + problem = PlanRoute(current, goals, allowed, self.dimrow) + return astar_search(problem).solution() + + def plan_shot(self, current, goals, allowed): + shooting_positions = set() + + for loc in goals: + x = loc[0] + y = loc[1] + for i in range(1, self.dimrow + 1): + if i < x: + shooting_positions.add(WumpusPosition(i, y, 'EAST')) + if i > x: + shooting_positions.add(WumpusPosition(i, y, 'WEST')) + if i < y: + 
shooting_positions.add(WumpusPosition(x, i, 'NORTH'))
+                if i > y:
+                    shooting_positions.add(WumpusPosition(x, i, 'SOUTH'))
+
+        # A shooting position cannot be in any of the rooms the Wumpus could reside in
+        orientations = ['EAST', 'WEST', 'NORTH', 'SOUTH']
+        for loc in goals:
+            for orientation in orientations:
+                shooting_positions.remove(WumpusPosition(loc[0], loc[1], orientation))
+
+        actions = list()
+        actions.extend(self.plan_route(current, shooting_positions, allowed))
+        actions.append('Shoot')
+        return actions
+
+
+# ______________________________________________________________________________
+# 7.7.4 Making plans by propositional inference
+
+
+def SAT_plan(init, transition, goal, t_max, SAT_solver=dpll_satisfiable):
+    """Convert a planning problem to a satisfiability problem by translating it to a CNF sentence.
+    [Figure 7.22]
+    >>> transition = {'A': {'Left': 'A', 'Right': 'B'}, 'B': {'Left': 'A', 'Right': 'C'}, 'C': {'Left': 'B', 'Right': 'C'}}
+    >>> SAT_plan('A', transition, 'C', 2) is None
+    True
+    """
+
+    # Functions used by SAT_plan
+    def translate_to_SAT(init, transition, goal, time):
+        clauses = []
+        states = [state for state in transition]
+
+        # Symbol claiming state s at time t
+        state_counter = itertools.count()
+        for s in states:
+            for t in range(time + 1):
+                state_sym[s, t] = Expr("State_{}".format(next(state_counter)))
+
+        # Add initial state axiom
+        clauses.append(state_sym[init, 0])
+
+        # Add goal state axiom
+        clauses.append(state_sym[goal, time])
+
+        # All possible transitions
+        transition_counter = itertools.count()
+        for s in states:
+            for action in transition[s]:
+                s_ = transition[s][action]
+                for t in range(time):
+                    # Action 'action' taken from state 's' at time 't' to reach 's_'
+                    action_sym[s, action, t] = Expr(
+                        "Transition_{}".format(next(transition_counter)))
+
+                    # Change the state from s to s_
+                    clauses.append(action_sym[s, action, t] | '==>' | state_sym[s, t])
+                    clauses.append(action_sym[s, action, t] | '==>' | state_sym[s_, t +
1]) + + # Allow only one state at any time + for t in range(time + 1): + # must be a state at any time + clauses.append(associate('|', [state_sym[s, t] for s in states])) + + for s in states: + for s_ in states[states.index(s) + 1:]: + # for each pair of states s, s_ only one is possible at time t + clauses.append((~state_sym[s, t]) | (~state_sym[s_, t])) + + # Restrict to one transition per timestep + for t in range(time): + # list of possible transitions at time t + transitions_t = [tr for tr in action_sym if tr[2] == t] + + # make sure at least one of the transitions happens + clauses.append(associate('|', [action_sym[tr] for tr in transitions_t])) + + for tr in transitions_t: + for tr_ in transitions_t[transitions_t.index(tr) + 1:]: + # there cannot be two transitions tr and tr_ at time t + clauses.append(~action_sym[tr] | ~action_sym[tr_]) + + # Combine the clauses to form the cnf + return associate('&', clauses) + + def extract_solution(model): + true_transitions = [t for t in action_sym if model[action_sym[t]]] + # Sort transitions based on time, which is the 3rd element of the tuple + true_transitions.sort(key=lambda x: x[2]) + return [action for s, action, time in true_transitions] + + # Body of SAT_plan algorithm + for t in range(t_max): + # dictionaries to help extract the solution from model + state_sym = {} + action_sym = {} + + cnf = translate_to_SAT(init, transition, goal, t) + model = SAT_solver(cnf) + if model is not False: + return extract_solution(model) + return None + + +# ______________________________________________________________________________ +# Chapter 9 Inference in First Order Logic +# 9.2 Unification and First Order Inference +# 9.2.1 Unification + + +def unify(x, y, s={}): + """Unify expressions x,y with substitution s; return a substitution that + would make x,y equal, or None if x,y can not unify. x and y can be + variables (e.g. Expr('x')), constants, lists, or Exprs. 
[Figure 9.1] + >>> unify(x, 3, {}) + {x: 3} + """ + if s is None: + return None + elif x == y: + return s + elif is_variable(x): + return unify_var(x, y, s) + elif is_variable(y): + return unify_var(y, x, s) + elif isinstance(x, Expr) and isinstance(y, Expr): + return unify(x.args, y.args, unify(x.op, y.op, s)) + elif isinstance(x, str) or isinstance(y, str): + return None + elif issequence(x) and issequence(y) and len(x) == len(y): + if not x: + return s + return unify(x[1:], y[1:], unify(x[0], y[0], s)) + else: + return None + + +def is_variable(x): + """A variable is an Expr with no args and a lowercase symbol as the op.""" + return isinstance(x, Expr) and not x.args and x.op[0].islower() + + +def unify_var(var, x, s): + if var in s: + return unify(s[var], x, s) + elif x in s: + return unify(var, s[x], s) + elif occur_check(var, x, s): + return None + else: + return extend(s, var, x) + + +def occur_check(var, x, s): + """Return true if variable var occurs anywhere in x + (or in subst(s, x), if s has a binding for x).""" + if var == x: + return True + elif is_variable(x) and x in s: + return occur_check(var, s[x], s) + elif isinstance(x, Expr): + return (occur_check(var, x.op, s) or + occur_check(var, x.args, s)) + elif isinstance(x, (list, tuple)): + return first(e for e in x if occur_check(var, e, s)) + else: + return False + + +def extend(s, var, val): + """Copy the substitution s and extend it by setting var to val; return copy. + >>> extend({x: 1}, y, 2) == {x: 1, y: 2} + True + """ + s2 = s.copy() + s2[var] = val + return s2 + + +# 9.2.2 Storage and retrieval + + +class FolKB(KB): + """A knowledge base consisting of first-order definite clauses. + >>> kb0 = FolKB([expr('Farmer(Mac)'), expr('Rabbit(Pete)'), + ... 
expr('(Rabbit(r) & Farmer(f)) ==> Hates(f, r)')]) + >>> kb0.tell(expr('Rabbit(Flopsie)')) + >>> kb0.retract(expr('Rabbit(Pete)')) + >>> kb0.ask(expr('Hates(Mac, x)'))[x] + Flopsie + >>> kb0.ask(expr('Wife(Pete, x)')) + False + """ + + def __init__(self, initial_clauses=None): + self.clauses = [] # inefficient: no indexing + if initial_clauses: + for clause in initial_clauses: + self.tell(clause) + + def tell(self, sentence): + if is_definite_clause(sentence): + self.clauses.append(sentence) + else: + raise Exception("Not a definite clause: {}".format(sentence)) + + def ask_generator(self, query): + return fol_bc_ask(self, query) + + def retract(self, sentence): + self.clauses.remove(sentence) + + def fetch_rules_for_goal(self, goal): + return self.clauses + + +# ______________________________________________________________________________ +# 9.3 Forward Chaining +# 9.3.2 A simple forward-chaining algorithm + + +def fol_fc_ask(KB, alpha): + """A simple forward-chaining algorithm. [Figure 9.3]""" + kb_consts = list({c for clause in KB.clauses for c in constant_symbols(clause)}) + + def enum_subst(p): + query_vars = list({v for clause in p for v in variables(clause)}) + for assignment_list in itertools.product(kb_consts, repeat=len(query_vars)): + theta = {x: y for x, y in zip(query_vars, assignment_list)} + yield theta + + # check if we can answer without new inferences + for q in KB.clauses: + phi = unify(q, alpha, {}) + if phi is not None: + yield phi + + while True: + new = [] + for rule in KB.clauses: + p, q = parse_definite_clause(rule) + for theta in enum_subst(p): + if set(subst(theta, p)).issubset(set(KB.clauses)): + q_ = subst(theta, q) + if all([unify(x, q_, {}) is None for x in KB.clauses + new]): + new.append(q_) + phi = unify(q_, alpha, {}) + if phi is not None: + yield phi + if not new: + break + for clause in new: + KB.tell(clause) + return None + + +def subst(s, x): + """Substitute the substitution s into the expression x. 
+ >>> subst({x: 42, y:0}, F(x) + y) + (F(42) + 0) + """ + if isinstance(x, list): + return [subst(s, xi) for xi in x] + elif isinstance(x, tuple): + return tuple([subst(s, xi) for xi in x]) + elif not isinstance(x, Expr): + return x + elif is_var_symbol(x.op): + return s.get(x, x) + else: + return Expr(x.op, *[subst(s, arg) for arg in x.args]) + + +def standardize_variables(sentence, dic=None): + """Replace all the variables in sentence with new variables.""" + if dic is None: + dic = {} + if not isinstance(sentence, Expr): + return sentence + elif is_var_symbol(sentence.op): + if sentence in dic: + return dic[sentence] + else: + v = Expr('v_{}'.format(next(standardize_variables.counter))) + dic[sentence] = v + return v + else: + return Expr(sentence.op, + *[standardize_variables(a, dic) for a in sentence.args]) + + +standardize_variables.counter = itertools.count() + + +# __________________________________________________________________ +# 9.4 Backward Chaining + + +def fol_bc_ask(KB, query): + """A simple backward-chaining algorithm for first-order logic. [Figure 9.6] + KB should be an instance of FolKB, and query an atomic sentence.""" + return fol_bc_or(KB, query, {}) + + +def fol_bc_or(KB, goal, theta): + for rule in KB.fetch_rules_for_goal(goal): + lhs, rhs = parse_definite_clause(standardize_variables(rule)) + for theta1 in fol_bc_and(KB, lhs, unify(rhs, goal, theta)): + yield theta1 + + +def fol_bc_and(KB, goals, theta): + if theta is None: + pass + elif not goals: + yield theta + else: + first, rest = goals[0], goals[1:] + for theta1 in fol_bc_or(KB, subst(theta, first), theta): + for theta2 in fol_bc_and(KB, rest, theta1): + yield theta2 + + +# ______________________________________________________________________________ +# A simple KB that defines the relevant conditions of the Wumpus World as in Fig 7.4. +# See Sec. 
7.4.3 +wumpus_kb = PropKB() + +P11, P12, P21, P22, P31, B11, B21 = expr('P11, P12, P21, P22, P31, B11, B21') +wumpus_kb.tell(~P11) +wumpus_kb.tell(B11 | '<=>' | (P12 | P21)) +wumpus_kb.tell(B21 | '<=>' | (P11 | P22 | P31)) +wumpus_kb.tell(~B11) +wumpus_kb.tell(B21) + +test_kb = FolKB( + map(expr, ['Farmer(Mac)', + 'Rabbit(Pete)', + 'Mother(MrsMac, Mac)', + 'Mother(MrsRabbit, Pete)', + '(Rabbit(r) & Farmer(f)) ==> Hates(f, r)', + '(Mother(m, c)) ==> Loves(m, c)', + '(Mother(m, r) & Rabbit(r)) ==> Rabbit(m)', + '(Farmer(f)) ==> Human(f)', + # Note that this order of conjuncts + # would result in infinite recursion: + # '(Human(h) & Mother(m, h)) ==> Human(m)' + '(Mother(m, h) & Human(h)) ==> Human(m)'])) + +crime_kb = FolKB( + map(expr, ['(American(x) & Weapon(y) & Sells(x, y, z) & Hostile(z)) ==> Criminal(x)', + 'Owns(Nono, M1)', + 'Missile(M1)', + '(Missile(x) & Owns(Nono, x)) ==> Sells(West, x, Nono)', + 'Missile(x) ==> Weapon(x)', + 'Enemy(x, America) ==> Hostile(x)', + 'American(West)', + 'Enemy(Nono, America)'])) + + +# ______________________________________________________________________________ + +# Example application (not in the book). +# You can use the Expr class to do symbolic differentiation. This used to be +# a part of AI; now it is considered a separate field, Symbolic Algebra. + + +def diff(y, x): + """Return the symbolic derivative, dy/dx, as an Expr. + However, you probably want to simplify the results with simp. 
+ >>> diff(x * x, x) + ((x * 1) + (x * 1)) + """ + if y == x: + return 1 + elif not y.args: + return 0 + else: + u, op, v = y.args[0], y.op, y.args[-1] + if op == '+': + return diff(u, x) + diff(v, x) + elif op == '-' and len(y.args) == 1: + return -diff(u, x) + elif op == '-': + return diff(u, x) - diff(v, x) + elif op == '*': + return u * diff(v, x) + v * diff(u, x) + elif op == '/': + return (v * diff(u, x) - u * diff(v, x)) / (v * v) + elif op == '**' and isnumber(x.op): + return (v * u ** (v - 1) * diff(u, x)) + elif op == '**': + return (v * u ** (v - 1) * diff(u, x) + + u ** v * Expr('log')(u) * diff(v, x)) + elif op == 'log': + return diff(u, x) / u + else: + raise ValueError("Unknown op: {} in diff({}, {})".format(op, y, x)) + + +def simp(x): + """Simplify the expression x.""" + if isnumber(x) or not x.args: + return x + args = list(map(simp, x.args)) + u, op, v = args[0], x.op, args[-1] + if op == '+': + if v == 0: + return u + if u == 0: + return v + if u == v: + return 2 * u + if u == -v or v == -u: + return 0 + elif op == '-' and len(args) == 1: + if u.op == '-' and len(u.args) == 1: + return u.args[0] # --y ==> y + elif op == '-': + if v == 0: + return u + if u == 0: + return -v + if u == v: + return 0 + if u == -v or v == -u: + return 0 + elif op == '*': + if u == 0 or v == 0: + return 0 + if u == 1: + return v + if v == 1: + return u + if u == v: + return u ** 2 + elif op == '/': + if u == 0: + return 0 + if v == 0: + return Expr('Undefined') + if u == v: + return 1 + if u == -v or v == -u: + return 0 + elif op == '**': + if u == 0: + return 0 + if v == 0: + return 1 + if u == 1: + return 1 + if v == 1: + return u + elif op == 'log': + if u == 1: + return 0 + else: + raise ValueError("Unknown op: " + op) + # If we fall through to here, we can not simplify further + return Expr(op, *args) + + +def d(y, x): + """Differentiate and then simplify. 
+ >>> d(x * x - x, x) + ((2 * x) - 1) + """ + return simp(diff(y, x)) diff --git a/making_simple_decision4e.py b/making_simple_decision4e.py new file mode 100644 index 000000000..4a35f94bd --- /dev/null +++ b/making_simple_decision4e.py @@ -0,0 +1,168 @@ +"""Making Simple Decisions (Chapter 15)""" + +import random + +from agents import Agent +from probability import BayesNet +from utils4e import vector_add, weighted_sample_with_replacement + + +class DecisionNetwork(BayesNet): + """An abstract class for a decision network as a wrapper for a BayesNet. + Represents an agent's current state, its possible actions, reachable states + and utilities of those states.""" + + def __init__(self, action, infer): + """action: a single action node + infer: the preferred method to carry out inference on the given BayesNet""" + super().__init__() + self.action = action + self.infer = infer + + def best_action(self): + """Return the best action in the network""" + return self.action + + def get_utility(self, action, state): + """Return the utility for a particular action and state in the network""" + raise NotImplementedError + + def get_expected_utility(self, action, evidence): + """Compute the expected utility given an action and evidence""" + u = 0.0 + prob_dist = self.infer(action, evidence, self).prob + for item, _ in prob_dist.items(): + u += prob_dist[item] * self.get_utility(action, item) + + return u + + +class InformationGatheringAgent(Agent): + """A simple information gathering agent. The agent works by repeatedly selecting + the observation with the highest information value, until the cost of the next + observation is greater than its expected benefit. 
[Figure 16.9]"""
+
+    def __init__(self, decnet, infer, initial_evidence=None):
+        """decnet: a decision network
+        infer: the preferred method to carry out inference on the given decision network
+        initial_evidence: initial evidence"""
+        super().__init__()
+        self.decnet = decnet
+        self.infer = infer
+        self.observation = initial_evidence or []
+        self.variables = self.decnet.nodes
+
+    def integrate_percept(self, percept):
+        """Integrate the given percept into the decision network"""
+        raise NotImplementedError
+
+    def execute(self, percept):
+        """Execute the information gathering algorithm"""
+        self.observation = self.integrate_percept(percept)
+        vpis = self.vpi_cost_ratio(self.variables)
+        # index of the variable with the highest VPI-to-cost ratio
+        j = vpis.index(max(vpis))
+        variable = self.variables[j]
+
+        if self.vpi(variable) > self.cost(variable):
+            return self.request(variable)
+
+        return self.decnet.best_action()
+
+    def request(self, variable):
+        """Return the value of the given random variable as the next percept"""
+        raise NotImplementedError
+
+    def cost(self, var):
+        """Return the cost of obtaining evidence through tests, consultants or questions"""
+        raise NotImplementedError
+
+    def vpi_cost_ratio(self, variables):
+        """Return the VPI to cost ratio for the given variables"""
+        v_by_c = []
+        for var in variables:
+            v_by_c.append(self.vpi(var) / self.cost(var))
+        return v_by_c
+
+    def vpi(self, variable):
+        """Return VPI for a given variable"""
+        vpi = 0.0
+        prob_dist = self.infer(variable, self.observation, self.decnet).prob
+        for item, _ in prob_dist.items():
+            post_prob = prob_dist[item]
+            new_observation = list(self.observation)
+            new_observation.append(item)
+            expected_utility = self.decnet.get_expected_utility(variable, new_observation)
+            vpi += post_prob * expected_utility
+
+        vpi -= self.decnet.get_expected_utility(variable, self.observation)
+        return vpi
+
+
+# _________________________________________________________________________
+# Chapter 25 Robotics
+# TODO: Implement continuous map for MonteCarlo similar to
Fig25.10 from the book
+
+
+class MCLmap:
+    """Map which provides probability distributions and sensor readings.
+    Consists of discrete cells which are either an obstacle or empty"""
+
+    def __init__(self, m):
+        self.m = m
+        self.nrows = len(m)
+        self.ncols = len(m[0])
+        # list of empty spaces in the map
+        self.empty = [(i, j) for i in range(self.nrows) for j in range(self.ncols) if not m[i][j]]
+
+    def sample(self):
+        """Returns a random kinematic state possible in the map"""
+        pos = random.choice(self.empty)
+        # 0N 1E 2S 3W
+        orient = random.choice(range(4))
+        kin_state = pos + (orient,)
+        return kin_state
+
+    def ray_cast(self, sensor_num, kin_state):
+        """Returns distance to nearest obstacle or map boundary in the direction of sensor"""
+        pos = kin_state[:2]
+        orient = kin_state[2]
+        # sensor layout when orientation is 0 (towards North)
+        #  0
+        # 3R1
+        #  2
+        delta = ((sensor_num % 2 == 0) * (sensor_num - 1), (sensor_num % 2 == 1) * (2 - sensor_num))
+        # sensor direction changes based on orientation
+        for _ in range(orient):
+            delta = (delta[1], -delta[0])
+        range_count = 0
+        # note: the column bound must use ncols, not nrows
+        while (0 <= pos[0] < self.nrows) and (0 <= pos[1] < self.ncols) and (not self.m[pos[0]][pos[1]]):
+            pos = vector_add(pos, delta)
+            range_count += 1
+        return range_count
+
+
+def monte_carlo_localization(a, z, N, P_motion_sample, P_sensor, m, S=None):
+    """Monte Carlo localization algorithm from Fig 25.9"""
+
+    def ray_cast(sensor_num, kin_state, m):
+        return m.ray_cast(sensor_num, kin_state)
+
+    M = len(z)
+    W = [0] * N
+    S_ = [0] * N
+    W_ = [0] * N
+    v = a['v']
+    w = a['w']
+
+    if S is None:
+        S = [m.sample() for _ in range(N)]
+
+    for i in range(N):
+        S_[i] = P_motion_sample(S[i], v, w)
+        W_[i] = 1
+        for j in range(M):
+            z_ = ray_cast(j, S_[i], m)
+            W_[i] = W_[i] * P_sensor(z[j], z_)
+
+    S = weighted_sample_with_replacement(N, S_, W_)
+    return S
diff --git a/mdp.ipynb b/mdp.ipynb
index ee9b0ba85..b9952f528 100644
--- a/mdp.ipynb
+++ b/mdp.ipynb
@@ -4,21 +4,20 @@
   "cell_type": "markdown",
"metadata": {}, "source": [ - "# Markov decision processes (MDPs)\n", + "# Making Complex Decisions\n", + "---\n", "\n", - "This IPy notebook acts as supporting material for topics covered in **Chapter 17 Making Complex Decisions** of the book* Artificial Intelligence: A Modern Approach*. We makes use of the implementations in mdp.py module. This notebook also includes a brief summary of the main topics as a review. Let us import everything from the mdp module to get started." + "This Jupyter notebook acts as supporting material for topics covered in **Chapter 17 Making Complex Decisions** of the book* Artificial Intelligence: A Modern Approach*. We make use of the implementations in mdp.py module. This notebook also includes a brief summary of the main topics as a review. Let us import everything from the mdp module to get started." ] }, { "cell_type": "code", "execution_count": 1, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "from mdp import *\n", - "from notebook import psource, pseudocode" + "from notebook import psource, pseudocode, plot_pomdp_utility" ] }, { @@ -30,7 +29,12 @@ "* Overview\n", "* MDP\n", "* Grid MDP\n", - "* Value Iteration Visualization" + "* Value Iteration\n", + " * Value Iteration Visualization\n", + "* Policy Iteration\n", + "* POMDPs\n", + "* POMDP Value Iteration\n", + " - Value Iteration Visualization" ] }, { @@ -59,18 +63,206 @@ "source": [ "## MDP\n", "\n", - "To begin with let us look at the implementation of MDP class defined in mdp.py The docstring tells us what all is required to define a MDP namely - set of states,actions, initial state, transition model, and a reward function. Each of these are implemented as methods. Do not close the popup so that you can follow along the description of code below." 
+ "To begin with let us look at the implementation of MDP class defined in mdp.py The docstring tells us what all is required to define a MDP namely - set of states, actions, initial state, transition model, and a reward function. Each of these are implemented as methods. Do not close the popup so that you can follow along the description of code below." ] }, { "cell_type": "code", "execution_count": 2, - "metadata": { - "collapsed": true - }, - "outputs": [], + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    class MDP:\n",
    +       "\n",
    +       "    """A Markov Decision Process, defined by an initial state, transition model,\n",
    +       "    and reward function. We also keep track of a gamma value, for use by\n",
    +       "    algorithms. The transition model is represented somewhat differently from\n",
    +       "    the text. Instead of P(s' | s, a) being a probability number for each\n",
    +       "    state/state/action triplet, we instead have T(s, a) return a\n",
    +       "    list of (p, s') pairs. We also keep track of the possible states,\n",
    +       "    terminal states, and actions for each state. [page 646]"""\n",
    +       "\n",
    +       "    def __init__(self, init, actlist, terminals, transitions = {}, reward = None, states=None, gamma=.9):\n",
    +       "        if not (0 < gamma <= 1):\n",
    +       "            raise ValueError("An MDP must have 0 < gamma <= 1")\n",
    +       "\n",
    +       "        if states:\n",
    +       "            self.states = states\n",
    +       "        else:\n",
    +       "            ## collect states from transitions table\n",
    +       "            self.states = self.get_states_from_transitions(transitions)\n",
    +       "            \n",
    +       "        \n",
    +       "        self.init = init\n",
    +       "        \n",
    +       "        if isinstance(actlist, list):\n",
    +       "            ## if actlist is a list, all states have the same actions\n",
    +       "            self.actlist = actlist\n",
    +       "        elif isinstance(actlist, dict):\n",
    +       "            ## if actlist is a dict, different actions for each state\n",
    +       "            self.actlist = actlist\n",
    +       "        \n",
    +       "        self.terminals = terminals\n",
    +       "        self.transitions = transitions\n",
    +       "        if self.transitions == {}:\n",
    +       "            print("Warning: Transition table is empty.")\n",
    +       "        self.gamma = gamma\n",
    +       "        if reward:\n",
    +       "            self.reward = reward\n",
    +       "        else:\n",
    +       "            self.reward = {s : 0 for s in self.states}\n",
    +       "        #self.check_consistency()\n",
    +       "\n",
    +       "    def R(self, state):\n",
    +       "        """Return a numeric reward for this state."""\n",
    +       "        return self.reward[state]\n",
    +       "\n",
    +       "    def T(self, state, action):\n",
    +       "        """Transition model. From a state and an action, return a list\n",
    +       "        of (probability, result-state) pairs."""\n",
    +       "        if(self.transitions == {}):\n",
    +       "            raise ValueError("Transition model is missing")\n",
    +       "        else:\n",
    +       "            return self.transitions[state][action]\n",
    +       "\n",
    +       "    def actions(self, state):\n",
    +       "        """Set of actions that can be performed in this state. By default, a\n",
    +       "        fixed list of actions, except for terminal states. Override this\n",
    +       "        method if you need to specialize by state."""\n",
    +       "        if state in self.terminals:\n",
    +       "            return [None]\n",
    +       "        else:\n",
    +       "            return self.actlist\n",
    +       "\n",
    +       "    def get_states_from_transitions(self, transitions):\n",
    +       "        if isinstance(transitions, dict):\n",
    +       "            s1 = set(transitions.keys())\n",
    +       "            s2 = set([tr[1] for actions in transitions.values() \n",
    +       "                              for effects in actions.values() for tr in effects])\n",
    +       "            return s1.union(s2)\n",
    +       "        else:\n",
    +       "            print('Could not retrieve states from transitions')\n",
    +       "            return None\n",
    +       "\n",
    +       "    def check_consistency(self):\n",
    +       "        # check that all states in transitions are valid\n",
    +       "        assert set(self.states) == self.get_states_from_transitions(self.transitions)\n",
    +       "        # check that init is a valid state\n",
    +       "        assert self.init in self.states\n",
    +       "        # check reward for each state\n",
    +       "        #assert set(self.reward.keys()) == set(self.states)\n",
    +       "        assert set(self.reward.keys()) == set(self.states)\n",
    +       "        # check that all terminals are valid states\n",
    +       "        assert all([t in self.states for t in self.terminals])\n",
    +       "        # check that probability distributions for all actions sum to 1\n",
    +       "        for s1, actions in self.transitions.items():\n",
    +       "            for a in actions.keys():\n",
    +       "                s = 0\n",
    +       "                for o in actions[a]:\n",
    +       "                    s += o[0]\n",
    +       "                assert abs(s - 1) < 0.001\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "%psource MDP" + "psource(MDP)" ] }, { @@ -101,21 +293,21 @@ }, { "cell_type": "code", - "execution_count": 2, + "execution_count": 3, "metadata": { "collapsed": true }, "outputs": [], "source": [ - "# Transition Matrix as nested dict. State -> Actions in state -> States by each action -> Probabilty\n", + "# Transition Matrix as nested dict. State -> Actions in state -> List of (Probability, State) tuples\n", "t = {\n", " \"A\": {\n", - " \"X\": {\"A\":0.3, \"B\":0.7},\n", - " \"Y\": {\"A\":1.0}\n", + " \"X\": [(0.3, \"A\"), (0.7, \"B\")],\n", + " \"Y\": [(1.0, \"A\")]\n", " },\n", " \"B\": {\n", - " \"X\": {\"End\":0.8, \"B\":0.2},\n", - " \"Y\": {\"A\":1.0}\n", + " \"X\": {(0.8, \"End\"), (0.2, \"B\")},\n", + " \"Y\": {(1.0, \"A\")}\n", " },\n", " \"End\": {}\n", "}\n", @@ -133,29 +325,26 @@ }, { "cell_type": "code", - "execution_count": 3, + "execution_count": 4, "metadata": { "collapsed": true }, "outputs": [], "source": [ "class CustomMDP(MDP):\n", - "\n", - " def __init__(self, transition_matrix, rewards, terminals, init, gamma=.9):\n", + " def __init__(self, init, terminals, transition_matrix, reward = None, gamma=.9):\n", " # All possible actions.\n", " actlist = []\n", " for state in transition_matrix.keys():\n", - " actlist.extend(transition_matrix.keys())\n", + " actlist.extend(transition_matrix[state])\n", " actlist = list(set(actlist))\n", - "\n", - " MDP.__init__(self, init, actlist, terminals=terminals, gamma=gamma)\n", - " self.t = transition_matrix\n", - " self.reward = rewards\n", - " for state in self.t:\n", - " self.states.add(state)\n", + " MDP.__init__(self, init, actlist, terminals, transition_matrix, reward, gamma=gamma)\n", "\n", " def T(self, state, action):\n", - " return [(new_state, prob) for new_state, prob in self.t[state][action].items()]" + " if action is None:\n", + " return [(0.0, state)]\n", + " 
else: \n",
+    "            return self.transitions[state][action]"
   ]
  },
  {
@@ -167,20 +356,20 @@
  },
  {
   "cell_type": "code",
-   "execution_count": 4,
+   "execution_count": 5,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
-    "our_mdp = CustomMDP(t, rewards, terminals, init, gamma=.9)"
+    "our_mdp = CustomMDP(init, terminals, t, rewards, gamma=.9)"
   ]
  },
 {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-    "With this we have sucessfully represented our MDP. Later we will look at ways to solve this MDP."
+    "With this we have successfully represented our MDP. Later we will look at ways to solve this MDP."
   ]
  },
 {
@@ -189,18 +378,176 @@
   "source": [
    "## GRID MDP\n",
    "\n",
-    "Now we look at a concrete implementation that makes use of the MDP as base class. The GridMDP class in the mdp module is used to represent a grid world MDP like the one shown in in **Fig 17.1** of the AIMA Book. The code should be easy to understand if you have gone through the CustomMDP example."
+    "Now we look at a concrete implementation that makes use of the MDP as base class. The GridMDP class in the mdp module is used to represent a grid world MDP like the one shown in **Fig 17.1** of the AIMA Book. We assume for now that the environment is _fully observable_, so that the agent always knows where it is. The code should be easy to understand if you have gone through the CustomMDP example."
   ]
  },
 {
   "cell_type": "code",
   "execution_count": 6,
-   "metadata": {
-    "collapsed": true
-   },
-   "outputs": [],
+   "metadata": {},
+   "outputs": [
+    {
+     "data": {
+      "text/html": [
+       "\n",
+       "\n",
+       "\n",
+       "\n",
+       "        Codestin Search App\n",
+       "        \n",
+       "        \n",
+       "\n",
+       "\n",
+       "

    \n", + "\n", + "
    class GridMDP(MDP):\n",
    +       "\n",
    +       "    """A two-dimensional grid MDP, as in [Figure 17.1]. All you have to do is\n",
    +       "    specify the grid as a list of lists of rewards; use None for an obstacle\n",
    +       "    (unreachable state). Also, you should specify the terminal states.\n",
    +       "    An action is an (x, y) unit vector; e.g. (1, 0) means move east."""\n",
    +       "\n",
    +       "    def __init__(self, grid, terminals, init=(0, 0), gamma=.9):\n",
    +       "        grid.reverse()  # because we want row 0 on bottom, not on top\n",
    +       "        reward = {}\n",
    +       "        states = set()\n",
    +       "        self.rows = len(grid)\n",
    +       "        self.cols = len(grid[0])\n",
    +       "        self.grid = grid\n",
    +       "        for x in range(self.cols):\n",
    +       "            for y in range(self.rows):\n",
    +       "                if grid[y][x] is not None:\n",
    +       "                    states.add((x, y))\n",
    +       "                    reward[(x, y)] = grid[y][x]\n",
    +       "        self.states = states\n",
    +       "        actlist = orientations\n",
    +       "        transitions = {}\n",
    +       "        for s in states:\n",
    +       "            transitions[s] = {}\n",
    +       "            for a in actlist:\n",
    +       "                transitions[s][a] = self.calculate_T(s, a)\n",
    +       "        MDP.__init__(self, init, actlist=actlist,\n",
    +       "                     terminals=terminals, transitions = transitions, \n",
    +       "                     reward = reward, states = states, gamma=gamma)\n",
    +       "\n",
    +       "    def calculate_T(self, state, action):\n",
    +       "        if action is None:\n",
    +       "            return [(0.0, state)]\n",
    +       "        else:\n",
    +       "            return [(0.8, self.go(state, action)),\n",
    +       "                    (0.1, self.go(state, turn_right(action))),\n",
    +       "                    (0.1, self.go(state, turn_left(action)))]\n",
    +       "    \n",
    +       "    def T(self, state, action):\n",
    +       "        if action is None:\n",
    +       "            return [(0.0, state)]\n",
    +       "        else:\n",
    +       "            return self.transitions[state][action]\n",
    +       " \n",
    +       "    def go(self, state, direction):\n",
    +       "        """Return the state that results from going in this direction."""\n",
    +       "        state1 = vector_add(state, direction)\n",
    +       "        return state1 if state1 in self.states else state\n",
    +       "\n",
    +       "    def to_grid(self, mapping):\n",
    +       "        """Convert a mapping from (x, y) to v into a [[..., v, ...]] grid."""\n",
    +       "        return list(reversed([[mapping.get((x, y), None)\n",
    +       "                               for x in range(self.cols)]\n",
    +       "                              for y in range(self.rows)]))\n",
    +       "\n",
    +       "    def to_arrows(self, policy):\n",
    +       "        chars = {\n",
    +       "            (1, 0): '>', (0, 1): '^', (-1, 0): '<', (0, -1): 'v', None: '.'}\n",
    +       "        return self.to_grid({s: chars[a] for (s, a) in policy.items()})\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "%psource GridMDP" + "psource(GridMDP)" ] }, { @@ -234,16 +581,16 @@ }, { "cell_type": "code", - "execution_count": 5, + "execution_count": 7, "metadata": {}, "outputs": [ { "data": { "text/plain": [ - "" + "" ] }, - "execution_count": 5, + "execution_count": 7, "metadata": {}, "output_type": "execute_result" } @@ -258,20 +605,136 @@ "collapsed": true }, "source": [ - "# Value Iteration\n", + "# VALUE ITERATION\n", "\n", "Now that we have looked how to represent MDPs. Let's aim at solving them. Our ultimate goal is to obtain an optimal policy. We start with looking at Value Iteration and a visualisation that should help us understanding it better.\n", "\n", - "We start by calculating Value/Utility for each of the states. The Value of each state is the expected sum of discounted future rewards given we start in that state and follow a particular policy pi.The algorithm Value Iteration (**Fig. 17.4** in the book) relies on finding solutions of the Bellman's Equation. The intuition Value Iteration works is because values propagate. This point will we more clear after we encounter the visualisation. For more information you can refer to **Section 17.2** of the book. \n" + "We start by calculating Value/Utility for each of the states. The Value of each state is the expected sum of discounted future rewards given we start in that state and follow a particular policy $\\pi$. The value or the utility of a state is given by\n", + "\n", + "$$U(s)=R(s)+\\gamma\\max_{a\\epsilon A(s)}\\sum_{s'} P(s'\\ |\\ s,a)U(s')$$\n", + "\n", + "This is called the Bellman equation. The algorithm Value Iteration (**Fig. 17.4** in the book) relies on finding solutions of this Equation. The intuition Value Iteration works is because values propagate through the state space by means of local updates. 
This point will become clearer after we encounter the visualisation. For more information you can refer to **Section 17.2** of the book. \n" ] }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, - "outputs": [], + "execution_count": 8, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def value_iteration(mdp, epsilon=0.001):\n",
    +       "    """Solving an MDP by value iteration. [Figure 17.4]"""\n",
    +       "    U1 = {s: 0 for s in mdp.states}\n",
    +       "    R, T, gamma = mdp.R, mdp.T, mdp.gamma\n",
    +       "    while True:\n",
    +       "        U = U1.copy()\n",
    +       "        delta = 0\n",
    +       "        for s in mdp.states:\n",
    +       "            U1[s] = R(s) + gamma * max([sum([p * U[s1] for (p, s1) in T(s, a)])\n",
    +       "                                        for a in mdp.actions(s)])\n",
    +       "            delta = max(delta, abs(U1[s] - U[s]))\n",
    +       "        if delta < epsilon * (1 - gamma) / gamma:\n",
    +       "            return U\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ "psource(value_iteration)" ] @@ -280,12 +743,35 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "It takes as inputs two parameters, an MDP to solve and epsilon the maximum error allowed in the utility of any state. It returns a dictionary containing utilities where the keys are the states and values represent utilities. Let us solve the **sequencial_decision_enviornment** GridMDP." + "It takes as inputs two parameters, an MDP to solve and epsilon, the maximum error allowed in the utility of any state. It returns a dictionary containing utilities where the keys are the states and values represent utilities.
Value Iteration starts with arbitrary initial values for the utilities, calculates the right side of the Bellman equation and plugs it into the left-hand side, thereby updating the utility of each state from the utilities of its neighbors. \n", + "This is repeated until equilibrium is reached. \n", + "It works on the principle of _Dynamic Programming_ - using precomputed information to simplify the subsequent computation. \n", + "If $U_i(s)$ is the utility value for state $s$ at the $i$th iteration, the iteration step, called the Bellman update, looks like this:\n", + "\n", + "$$ U_{i+1}(s) \\leftarrow R(s) + \\gamma \\max_{a \\epsilon A(s)} \\sum_{s'} P(s'\\ |\\ s,a)U_{i}(s') $$\n", + "\n", + "As you might have noticed, `value_iteration` has an infinite loop. How do we decide when to stop iterating? \n", + "The concept of _contraction_ successfully explains the convergence of value iteration. \n", + "Refer to **Section 17.2.3** of the book for a detailed explanation. \n", + "In the algorithm, we calculate a value $\\delta$ that measures the difference in the utilities of the current time step and the previous time step. \n", + "\n", + "$$\\delta = \\max{(\\delta, \\begin{vmatrix}U_{i + 1}(s) - U_i(s)\\end{vmatrix})}$$\n", + "\n", + "This value of $\\delta$ decreases as the values of $U_i$ converge.\n", + "We terminate the algorithm if the $\\delta$ value is less than a threshold value determined by the hyperparameter _epsilon_.\n", + "\n", + "$$\\delta \\lt \\epsilon \\frac{(1 - \\gamma)}{\\gamma}$$\n", + "\n", + "To summarize, the Bellman update is a _contraction_ by a factor of $\\gamma$ on the space of utility vectors. 
\n", + "Hence, from the properties of contractions in general, it follows that `value_iteration` always converges to a unique solution of the Bellman equations whenever $gamma$ is less than 1.\n", + "We then terminate the algorithm when a reasonable approximation is achieved.\n", + "In practice, it often occurs that the policy $pi$ becomes optimal long before the utility function converges. For the given 4 x 3 environment with $gamma = 0.9$, the policy $pi$ is optimal when $i = 4$ (at the 4th iteration), even though the maximum error in the utility function is stil 0.46. This can be clarified from **figure 17.6** in the book. Hence, to increase computational efficiency, we often use another method to solve MDPs called Policy Iteration which we will see in the later part of this notebook. \n", + "
    For now, let us solve the **sequential_decision_environment** GridMDP using `value_iteration`." ] }, { "cell_type": "code", - "execution_count": 6, + "execution_count": 9, "metadata": {}, "outputs": [ { @@ -304,7 +790,7 @@ " (3, 2): 1.0}" ] }, - "execution_count": 6, + "execution_count": 9, "metadata": {}, "output_type": "execute_result" } @@ -322,7 +808,7 @@ }, { "cell_type": "code", - "execution_count": 2, + "execution_count": 10, "metadata": {}, "outputs": [ { @@ -351,7 +837,7 @@ "" ] }, - "execution_count": 2, + "execution_count": 10, "metadata": {}, "output_type": "execute_result" } @@ -360,6 +846,30 @@ "pseudocode(\"Value-Iteration\")" ] }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### AIMA3e\n", + "__function__ VALUE-ITERATION(_mdp_, _ε_) __returns__ a utility function \n", + " __inputs__: _mdp_, an MDP with states _S_, actions _A_(_s_), transition model _P_(_s′_ | _s_, _a_), \n", + "      rewards _R_(_s_), discount _γ_ \n", + "   _ε_, the maximum error allowed in the utility of any state \n", + " __local variables__: _U_, _U′_, vectors of utilities for states in _S_, initially zero \n", + "        _δ_, the maximum change in the utility of any state in an iteration \n", + "\n", + " __repeat__ \n", + "   _U_ ← _U′_; _δ_ ← 0 \n", + "   __for each__ state _s_ in _S_ __do__ \n", + "     _U′_\\[_s_\\] ← _R_(_s_) + _γ_ max_a_ ∈ _A_(_s_) Σ _P_(_s′_ | _s_, _a_) _U_\\[_s′_\\] \n", + "     __if__ | _U′_\\[_s_\\] − _U_\\[_s_\\] | > _δ_ __then__ _δ_ ← | _U′_\\[_s_\\] − _U_\\[_s_\\] | \n", + " __until__ _δ_ < _ε_(1 − _γ_)/_γ_ \n", + " __return__ _U_ \n", + "\n", + "---\n", + "__Figure ??__ The value iteration algorithm for calculating utilities of states. The termination condition is from Equation (__??__)." 
+ ] + }, { "cell_type": "markdown", "metadata": {}, @@ -371,7 +881,7 @@ }, { "cell_type": "code", - "execution_count": 7, + "execution_count": 11, "metadata": { "collapsed": true }, @@ -399,7 +909,7 @@ }, { "cell_type": "code", - "execution_count": 8, + "execution_count": 12, "metadata": { "collapsed": true }, @@ -412,7 +922,7 @@ }, { "cell_type": "code", - "execution_count": 9, + "execution_count": 13, "metadata": { "collapsed": true }, @@ -426,16 +936,16 @@ }, { "cell_type": "code", - "execution_count": 10, + "execution_count": 14, "metadata": { "scrolled": true }, "outputs": [ { "data": { - "image/png": "iVBORw0KGgoAAAANSUhEUgAAATcAAADuCAYAAABcZEBhAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAADVdJREFUeJzt239o2/edx/HX9+prSRfbbQqLrK9d2iKzcporX2kcnyAH\nV0i8/JjbP7pL/MfcboGQXEaYYab5Y1cYgbZXzuFwmgbcCyX5xwn0D3s4P6rQMAiInKCJ/pjDgWpk\nsL6KU9zN9Vw36WK++8OKUjeO5XWW9M17zwcY/NXnY/h834hnpUh1fN8XAFjzD9U+AACUA3EDYBJx\nA2AScQNgEnEDYBJxA2AScQNgEnEDYBJxA2BSzV+zeXZW/O8MQBmtrXWqfYTg8/0VDYlXbgBMIm4A\nTCJuAEwibgBMIm4ATCJuAEwibgBMIm4ATCJuAEwibgBMIm4ATCJuAEwibgBMIm4ATCJuAEwibgBM\nIm4ATCJuAEwibgBMIm4ATCJuAEwibgBMIm4ATCJuAEwibgBMIm4ATCJuAEwibgBMIm4ATCJuAEwi\nbgBMCmzcfN9Xb+8BxWIRtbc/p3T6ypL7rl79RBs3tigWi6i394B831+03t/fp9paR1NTU5U4dsUw\nn9KY0f39XNL3Jf3wPuu+pAOSIpKek/TNyZ2Q1Fz4OVHGM/6tAhu3ROKcxsYySqcz6u8fUE/PviX3\n9fTs05Ej7yudzmhsLKMLF84X13K5CV28mFBT05OVOnbFMJ/SmNH9vSbp/DLr5yRlCj8Dku5M7g+S\nfiPp/ySlCr//sWyn/NsENm5nzgyrq6tbjuOora1d09PTmpy8vmjP5OR1zczMqK2tXY7jqKurWyMj\nQ8X1gwd7dOjQO3Icp9LHLzvmUxozur9/lbRumfVhSd2SHEntkqYlXZf0kaTNhb99vPD7cpGspsDG\nLZ/35LpNxWvXbVQ+7y2xp7F4HQ7f3TMyMqxw2FVLS6wyB64w5lMaM/ruPElN37huLDx2v8eDqKba\nByiHubk59fW9qaGhRLWPEkjMpzRm9OAL1Cu3gYGjisdbFY+3KhRqkOdNFNc8L6dw2F20Pxx25Xm5\n4nU+v7Anmx3T+HhW8XhM0ehT8rycNm16XjduTFbsXsqB+ZTGjFaHK2niG9e5wmP3ezyIAhW3PXv2\nK5lMK5lMa8eOlzU4eFK+7yuVuqz6+nqFQg2L9odCDaqrq1MqdVm+72tw8KS2b39J0WiLstnPNDo6\nrtHRcbluoy5duqL160NVurPVwXxKY0aro1PSSS18anpZUr2kBkkdkhJa+BDhj4XfO6p0xlIC+7a0\no2ObEomzisUiWrPmUR079kFxLR5vVTKZliQdPvye9u59TTdvfqXNm7dqy5a
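To make the Bellman update and the termination test described above concrete outside the mdp module, here is a minimal, self-contained sketch of value iteration on the two-state toy MDP built earlier. The reward numbers are illustrative assumptions (they are not the notebook's `rewards` dict); only the update rule and the stopping test mirror the text.

```python
# Minimal, self-contained value iteration on the toy A/B/End MDP from earlier.
# Reward numbers are illustrative; the update and the
# delta < epsilon * (1 - gamma) / gamma stopping test follow the text above.
t = {
    "A": {"X": [(0.3, "A"), (0.7, "B")], "Y": [(1.0, "A")]},
    "B": {"X": [(0.8, "End"), (0.2, "B")], "Y": [(1.0, "A")]},
    "End": {},
}
R = {"A": -0.5, "B": 5.0, "End": 0.0}
terminals = {"End"}


def value_iteration(t, R, terminals, gamma=0.9, epsilon=0.001):
    U1 = {s: 0.0 for s in t}
    while True:
        U = U1.copy()
        delta = 0.0
        for s in t:
            if s in terminals:
                U1[s] = R[s]  # terminal states keep their reward
            else:
                # Bellman update: best action's expected discounted utility
                U1[s] = R[s] + gamma * max(
                    sum(p * U[s1] for (p, s1) in outcomes)
                    for outcomes in t[s].values())
            delta = max(delta, abs(U1[s] - U[s]))
        if delta < epsilon * (1 - gamma) / gamma:
            return U1


U = value_iteration(t, R, terminals)
print(sorted(U.items()))
```

With these numbers the utilities order as B > A > End, since from B the agent can keep returning to the high-reward loop through A.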
t1TpyRTGf0pjR/XVJ\n+p2kKS38u9lvJP25sLZX0jZJZ7XwVZBHJd2Z3DpJ/ylpQ+H6DS3/wUQ1Od/+Ts9yZme18s0A/mpr\na219KlsWvr+iIQXqbSkArBbiBsAk4gbAJOIGwCTiBsAk4gbAJOIGwCTiBsAk4gbAJOIGwCTiBsAk\n4gbAJOIGwCTiBsAk4gbAJOIGwCTiBsAk4gbAJOIGwCTiBsAk4gbAJOIGwCTiBsAk4gbAJOIGwCTi\nBsAk4gbAJOIGwCTiBsAk4gbAJOIGwKSaah/AkrXf86t9hMCb/dKp9hECzRHPoVJWOiFeuQEwibgB\nMIm4ATCJuAEwibgBMIm4ATCJuAEwibgBMIm4ATCJuAEwibgBMIm4ATCJuAEwibgBMIm4ATCJuAEw\nibgBMIm4ATCJuAEwibgBMIm4ATCJuAEwibgBMIm4ATCJuAEwibgBMIm4ATCJuAEwibgBMIm4ATCJ\nuAEwKbBx831fvb0HFItF1N7+nNLpK0vuu3r1E23c2KJYLKLe3gPyfX/Ren9/n2prHU1NTVXi2BVz\n/vx5/eDZZxVpbtbbb799z/qtW7e0c9cuRZqbtbG9XePj48W1t956S5HmZv3g2Wf10UcfVfDUlcVz\nqJT/l/Qvkh6R9N/L7MtK2igpImmnpK8Lj98qXEcK6+PlOuh3Eti4JRLnNDaWUTqdUX//gHp69i25\nr6dnn44ceV/pdEZjYxlduHC+uJbLTejixYSamp6s1LErYn5+Xvt/8QudO3tW10ZHNXjqlK5du7Zo\nz/Hjx/X4Y4/p00xGPb/8pV4/eFCSdO3aNZ06fVqjv/+9zp87p//Yv1/z8/PVuI2y4zlUyjpJ/ZJ+\nVWLf65J6JH0q6XFJxwuPHy9cf1pYf708x/yOAhu3M2eG1dXVLcdx1NbWrunpaU1OXl+0Z3LyumZm\nZtTW1i7HcdTV1a2RkaHi+sGDPTp06B05jlPp45dVKpVSJBLRM888o4cffli7du7U8PDwoj3Dv/2t\nXn31VUnSK6+8oo8//li+72t4eFi7du7UI488oqefflqRSESpVKoat1F2PIdK+b6kDZL+cZk9vqSL\nkl4pXL8q6c58hgvXKqx/XNgfDIGNWz7vyXWbiteu26h83ltiT2PxOhy+u2dkZFjhsKuWllhlDlxB\nnuepqfHufTc2NsrzvHv3NC3Mr6amRvX19fr8888XPS5Jja57z99awXNoNXwu6TFJNYXrRkl3ZuhJ\nujPfGkn1hf3BUFN6y4Nnbm5OfX1vamgoUe2j4AHFc+jBF6hXbgMDRxWPtyoeb1Uo1CDPmyiueV5O\n4bC7aH847MrzcsXrfH5hTzY7pvHxrOLxmKLRp+R5OW3a9Lxu3Jis2L2Uk+u6msjdve9cLifXde/d\nM7Ewv9u3b+uLL77QE088sehxScp53j1/+yDjOVTKUUmthZ/8CvY/IWla0u3CdU7SnRm6ku7M97ak\nLwr7gyFQcduzZ7+SybSSybR27HhZg4Mn5fu+UqnLqq+vVyjUsGh/KNSguro6pVKX5fu+BgdPavv2\nlxSNtiib/Uyjo+MaHR2X6zbq0qUrWr8+VKU7W10bNmxQJpNRNpvV119/rVOnT6uzs3PRns4f/1gn\nTpyQJH344Yd68cUX5TiOOjs7der0ad26dUvZbFaZTEZtbW3VuI2y4DlUyn5J6cJPeAX7HUn/JunD\nwvUJSS8Vfu8sXKuw/mJhfzAE9m1pR8c2JRJnFYtFtGbNozp27IPiWjzeqmQyLUk6fPg97d37mm7e\n/EqbN2/Vli1bq3XkiqmpqdG7R46o40c/0vz8vH7+s58pGo3qjTfe0AsvvKDOzk7t3r1bP+3uVqS5\nWevWrdOpwUFJUjQa1b//5Cf6p2hUNTU1Ovruu3rooYeqfEflwXOolElJL0ia0cLrnP+RdE1SnaRt\nkv5XCwH8L0m7JP1a0j9L2l34+92Sfqq
Fr4Ksk3Sqgmcvzfn2d3qWMzsboI9CAmjt9xhPKbNfBue/\n7EFUW1vtEwSf76/s5WGg3pYCwGohbgBMIm4ATCJuAEwibgBMIm4ATCJuAEwibgBMIm4ATCJuAEwi\nbgBMIm4ATCJuAEwibgBMIm4ATCJuAEwibgBMIm4ATCJuAEwibgBMIm4ATCJuAEwibgBMIm4ATCJu\nAEwibgBMIm4ATCJuAEwibgBMIm4ATCJuAEyqqfYBLJn90qn2EfCA+9Ofqn0CO3jlBsAk4gbAJOIG\nwCTiBsAk4gbAJOIGwCTiBsAk4gbAJOIGwCTiBsAk4gbAJOIGwCTiBsAk4gbAJOIGwCTiBsAk4gbA\nJOIGwCTiBsAk4gbAJOIGwCTiBsAk4gbAJOIGwCTiBsAk4gbAJOIGwCTiBsAk4gbAJOIGwCTiBsAk\n4gbApMDGzfd99fYeUCwWUXv7c0qnryy57+rVT7RxY4tisYh6ew/I9/1F6/39faqtdTQ1NVWJY1cM\n8ymNGS3P+nwCG7dE4pzGxjJKpzPq7x9QT8++Jff19OzTkSPvK53OaGwsowsXzhfXcrkJXbyYUFPT\nk5U6dsUwn9KY0fKszyewcTtzZlhdXd1yHEdtbe2anp7W5OT1RXsmJ69rZmZGbW3tchxHXV3dGhkZ\nKq4fPNijQ4fekeM4lT5+2TGf0pjR8qzPJ7Bxy+c9uW5T8dp1G5XPe0vsaSxeh8N394yMDCscdtXS\nEqvMgSuM+ZTGjJZnfT411T5AOczNzamv700NDSWqfZRAYj6lMaPlPQjzCdQrt4GBo4rHWxWPtyoU\napDnTRTXPC+ncNhdtD8cduV5ueJ1Pr+wJ5sd0/h4VvF4TNHoU/K8nDZtel43bkxW7F7KgfmUxoyW\n9/c0n0DFbc+e/Uom00om09qx42UNDp6U7/tKpS6rvr5eoVDDov2hUIPq6uqUSl2W7/saHDyp7dtf\nUjTaomz2M42Ojmt0dFyu26hLl65o/fpQle5sdTCf0pjR8v6e5hPYt6UdHduUSJxVLBbRmjWP6tix\nD4pr8Xirksm0JOnw4fe0d+9runnzK23evFVbtmyt1pErivmUxoyWZ30+zre/s7Kc2VmtfDMAlMHa\ntVrRR7OBelsKAKuFuAEwibgBMIm4ATCJuAEwibgBMIm4ATCJuAEwibgBMIm4ATCJuAEwibgBMIm4\nATCJuAEwibgBMIm4ATCJuAEwibgBMIm4ATCJuAEwibgBMIm4ATCJuAEwibgBMIm4ATCJuAEwibgB\nMIm4ATCJuAEwibgBMIm4ATDJ8X2/2mcAgFXHKzcAJhE3ACYRNwAmETcAJhE3ACYRNwAmETcAJhE3\nACYRNwAmETcAJv0F9s8EDYqi1wAAAAAASUVORK5CYII=\n", + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAATcAAADuCAYAAABcZEBhAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4wLCBo\ndHRwOi8vbWF0cGxvdGxpYi5vcmcvpW3flQAADYxJREFUeJzt211oW2eex/Hf2Xpb0onWrVkm1otL\nW2SmrNaVtzS2K8jCFhJPXsbtRWcTX4zbmUBINkMYw5jmYrYwhNJuMWTjaTCYDSW5cQK9iEOcpDad\nLAREVtBEF+OwoDEyWEdxirvjelw36cScubCi1PWLvK0lnfnP9wMGHz2P4dEf8fWRnDie5wkArPmb\nah8AAMqBuAEwibgBMIm4ATCJuAEwibgBMIm4ATCJuAEwibgBMKnm/7N5bk78dwagjDYHnGofwf88\nb11D4s4NgEnEDYBJxA2AScQNgEnEDYBJxA2AScQNgEnEDYBJxA2AScQNgEnEDYBJxA2AScQNgEnE\nDYBJxA2AScQNgEnEDYBJxA2AScQNgEnEDYBJxA2AScQNgEnEDYBJxA2AScQNgEnEDYBJxA2AScQN\ngEnEDYBJxA2AScQNgEm+jZvneerpOaJ4PKq2tueVTt9Ycd/Nm5+otbVJ8XhUPT1H5HnekvUTJ3oV\nCDianp6uxLErhvmUxoxW9zNJ35f0j6use5KOSIpKel7S1yd3WlJj4et0Gc/4Xfk2biMjlzU+nlE6\nnVFf34C6uw+tuK+7+5D6+gaUTmc0Pp7R6OiV4louN6mrV0fV0PBUpY5dMcynNGa0ujckXVlj/bKk\nTOFrQNKDyf2fpF9L+h9JqcL3fyjbKb8b38ZteHhInZ1dchxHLS1tmpmZ0dTU7SV7pqZua3Z2Vq2t\nL8lxHHV2dunixfPF9aNHu3Xs2HtyHKfSxy875lMaM1rdP0uqW2N9SFKXJEdSm6QZSbclfSRpe+Fn\nnyx8v1Ykq8m3ccvnXYXDDcXrcDiifN5dYU+keB0KPdwzPHxBoVBYTU3xyhy4wphPaczo23MlNXzt\nOlJ4bLXH/aim2gdYzTc/95C07Lfnanvm5+fV2/u2zp8fKdv5qo35lMaMvr3lU1m8i1vtcT/y1Z3b\nwMBJJRLNSiSaFQyG5LqTxTXXzSkYDC3ZHw5H5Lq54nU+v7gnmx3XxERWiURcsdjTct2ctm17QXfu\nTFXsuZQD8ymNGW2MiKTJr13nJIXWeNyPfBW3AwcOK5lMK5lMa8+eVzU4eEae5ymVuq7a2lrV1weX\n7K+vDyoQCCiVui7P8zQ4eEa7d7+iWKxJ2eynGhub0NjYhMLhiK5du6EtW+qr9Mw2BvMpjRltjA5J\nZ7R4p3ZdUq2koKR2SSNa/CPCHwrft1fpjKX49m1pe/sujYxcUjwe1aZNj6u//4PiWiLRrGQyLUk6\nfrxfBw++obt3v9T27Tu1Y8fOah25ophPacxodZ2S/lvStBbvxn4t6U+FtYOSdkm6pMV/CvK4pAeT\nq5P075K2Fq7f0tp/mKgmZ6XPHFYzN7fiW24AG2RzwK+fYPmI561rSL56WwoAG4W4ATCJuAEwibgB\nMIm4ATCJuAEwibgBMIm4ATCJuAEwibgBMIm4ATCJuAEwibgBMIm4ATCJuAEwibgBMIm4ATCJuAEw\nibgBMIm4ATCJuAEwibgBMIm4ATCJuAEwibgBMIm4ATCJuAEwibgBMIm4ATCJuAEwibgBMKmm2gew\nZPP3vGofwffmvnCqfQRfc8RrqJT1Tog7NwAmETcAJhE3ACYRNwAmETcAJhE3ACYRNwAmETcAJhE3\nACYRNwAmETcAJhE3ACYRNwAmETcAJhE3ACYRNwAmETcAJhE3ACYRNwAmETcAJhE3ACYRNwAmETcA\nJhE3ACYRNwAmETcAJhE3ACYRNwAmETcAJhE3ACYRNwAm+TZun
uepp+eI4vGo2tqeVzp9Y8V9N29+\notbWJsXjUfX0HJHneUvWT5zoVSDgaHp6uhLHrpgrV67oB889p2hjo959991l6/fu3dPeffsUbWxU\na1ubJiYmimvvvPOOoo2N+sFzz+mjjz6q4Kkri9dQKf8r6SVJj0nqXWNfVlKrpEZJeyV9VXj8XuE6\nWlifKNdBvxXfxm1k5LLGxzNKpzPq6xtQd/ehFfd1dx9SX9+A0umMxsczGh29UlzL5SZ19eqoGhqe\nqtSxK2JhYUGHf/5zXb50SbfGxjR49qxu3bq1ZM+pU6f05BNP6PeZjLp/8Qu9efSoJOnWrVs6e+6c\nxn73O125fFn/dviwFhYWqvE0yo7XUCl1kvok/bLEvjcldUvKSHpS0qnC46cK178vrL9ZnmN+S76N\n2/DwkDo7u+Q4jlpa2jQzM6OpqdtL9kxN3dbs7KxaW1+S4zjq7OzSxYvni+tHj3br2LH35DhOpY9f\nVqlUStFoVM8++6weffRR7du7V0NDQ0v2DF24oNdff12S9Nprr+njjz+W53kaGhrSvr179dhjj+mZ\nZ55RNBpVKpWqxtMoO15DpXxf0lZJf7vGHk/SbyW9Vrh+XdKD+QwVrlVY/7iw3x98G7d83lU43FC8\nDocjyufdFfZEiteh0MM9w8MXFAqF1dQUr8yBK8h1XTVEHj7vSCQi13WX72lYnF9NTY1qa2v12Wef\nLXlckiLh8LKftYLX0Eb4TNITkmoK1xFJD2boSnow3xpJtYX9/lBTekt1fPNzD0nLfnuutmd+fl69\nvW/r/PmRsp2vmr7LbNbzs1bwGtoIK92JOetYqz5f3bkNDJxUItGsRKJZwWBIrjtZXHPdnILB0JL9\n4XBErpsrXufzi3uy2XFNTGSVSMQViz0t181p27YXdOfOVMWeSzlFIhFN5h4+71wup1AotHzP5OL8\n7t+/r88//1x1dXVLHpeknOsu+9m/ZLyGSjkpqbnwlV/H/r+XNCPpfuE6J+nBDCOSHsz3vqTPtfg5\nnj/4Km4HDhxWMplWMpnWnj2vanDwjDzPUyp1XbW1taqvDy7ZX18fVCAQUCp1XZ7naXDwjHbvfkWx\nWJOy2U81NjahsbEJhcMRXbt2Q1u21FfpmW2srVu3KpPJKJvN6quvvtLZc+fU0dGxZE/Hj36k06dP\nS5I+/PBDvfzyy3IcRx0dHTp77pzu3bunbDarTCajlpaWajyNsuA1VMphSenC13p+qTmS/kXSh4Xr\n05JeKXzfUbhWYf1l+enOzbdvS9vbd2lk5JLi8ag2bXpc/f0fFNcSiWYlk2lJ0vHj/Tp48A3dvful\ntm/fqR07dlbryBVTU1Oj93/zG7X/8IdaWFjQz376U8ViMb311lt68cUX1dHRof379+snXV2KNjaq\nrq5OZwcHJUmxWEz/+uMf6x9iMdXU1Ojk++/rkUceqfIzKg9eQ6VMSXpR0qwW73P+U9ItSX8naZek\n/9JiAP9D0j5Jv5L0T5L2F35+v6SfaPGfgtRJOlvBs5fmrPSZw2rm5nz0pxAf2vw9xlPK3Bf++c3u\nR4FAtU/gf563vttDX70tBYCNQtwAmETcAJhE3ACYRNwAmETcAJhE3ACYRNwAmETcAJhE3ACYRNwA\nmETcAJhE3ACYRNwAmETcAJhE3ACYRNwAmETcAJhE3ACYRNwAmETcAJhE3ACYRNwAmETcAJhE3ACY\nRNwAmETcAJhE3ACYRNwAmETcAJhE3ACYVFPtA1gy94VT7SPgL9wf/1jtE9jBnRsAk4gbAJOIGwCT\niBsAk4gbAJOIGwCTiBsAk4gbAJOIGwCTiBsAk4gbAJOIGwCTiBsAk4gbAJOIGwCTiBsAk4gbAJOI\nGwCTiBsAk4gbAJOIGwCTiBsAk4gbAJOIGwCTiBsAk4gbAJOIGwCTiBsAk4gbAJOIGwCTiBsAk4gb\nAJN8GzfP89TTc0TxeFRtb
c8rnb6x4r6bNz9Ra2uT4vGoenqOyPO8JesnTvQqEHA0PT1diWNXDPMp\njRmtzfp8fBu3kZHLGh/PKJ3OqK9vQN3dh1bc1919SH19A0qnMxofz2h09EpxLZeb1NWro2poeKpS\nx64Y5lMaM1qb9fn4Nm7Dw0Pq7OyS4zhqaWnTzMyMpqZuL9kzNXVbs7Ozam19SY7jqLOzSxcvni+u\nHz3arWPH3pPjOJU+ftkxn9KY0dqsz8e3ccvnXYXDDcXrcDiifN5dYU+keB0KPdwzPHxBoVBYTU3x\nyhy4wphPacxobdbnU1PtA6zmm+/rJS377bDanvn5efX2vq3z50fKdr5qYz6lMaO1WZ+Pr+7cBgZO\nKpFoViLRrGAwJNedLK65bk7BYGjJ/nA4ItfNFa/z+cU92ey4JiaySiTiisWeluvmtG3bC7pzZ6pi\nz6UcmE9pzGhtf03z8VXcDhw4rGQyrWQyrT17XtXg4Bl5nqdU6rpqa2tVXx9csr++PqhAIKBU6ro8\nz9Pg4Bnt3v2KYrEmZbOfamxsQmNjEwqHI7p27Ya2bKmv0jPbGMynNGa0tr+m+fj2bWl7+y6NjFxS\nPB7Vpk2Pq7//g+JaItGsZDItSTp+vF8HD76hu3e/1PbtO7Vjx85qHbmimE9pzGht1ufjrPSeejVz\nc1r/ZgAog82bta4/zfrqbSkAbBTiBsAk4gbAJOIGwCTiBsAk4gbAJOIGwCTiBsAk4gbAJOIGwCTi\nBsAk4gbAJOIGwCTiBsAk4gbAJOIGwCTiBsAk4gbAJOIGwCTiBsAk4gbAJOIGwCTiBsAk4gbAJOIG\nwCTiBsAk4gbAJOIGwCTiBsAk4gbAJOIGwCTH87xqnwEANhx3bgBMIm4ATCJuAEwibgBMIm4ATCJu\nAEwibgBMIm4ATCJuAEwibgBM+jPdN0cNjYpeKAAAAABJRU5ErkJggg==\n", "text/plain": [ - "" + "" ] }, "metadata": {}, @@ -445,13 +955,13 @@ "name": "stderr", "output_type": "stream", "text": [ - "Widget Javascript not detected. It may not be installed or enabled properly.\n" + "The installed widget Javascript is the wrong version. 
It must satisfy the semver range ~2.1.4.\n" ] }, { "data": { "application/vnd.jupyter.widget-view+json": { - "model_id": "9aed96e7288d4ed59df439f68399dc12" + "model_id": "77e9849e074841e49d8b0ebc8191507c" } }, "metadata": {}, @@ -469,7 +979,7 @@ "\n", "visualize_callback = make_visualize(iteration_slider)\n", "\n", - "visualize_button = widgets.ToggleButton(desctiption = \"Visualize\", value = False)\n", + "visualize_button = widgets.ToggleButton(description = \"Visualize\", value = False)\n", "time_select = widgets.ToggleButtons(description='Extra Delay:',options=['0', '0.1', '0.2', '0.5', '0.7', '1.0'])\n", "a = widgets.interactive(visualize_callback, Visualize = visualize_button, time_step=time_select)\n", "display(a)" @@ -479,7 +989,1986 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Move the slider above to observe how the utility changes across iterations. It is also possible to move the slider using arrow keys or to jump to the value by directly editing the number with a double click. The **Visualize Button** will automatically animate the slider for you. The **Extra Delay Box** allows you to set time delay in seconds upto one second for each time step." + "Move the slider above to observe how the utility changes across iterations. It is also possible to move the slider using arrow keys or to jump to the value by directly editing the number with a double click. The **Visualize Button** will automatically animate the slider for you. The **Extra Delay Box** allows you to set time delay in seconds upto one second for each time step. There is also an interactive editor for grid-world problems `grid_mdp.py` in the gui folder for you to play around with." + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "# POLICY ITERATION\n", + "\n", + "We have already seen that value iteration converges to the optimal policy long before it accurately estimates the utility function. 
\n", + "If one action is clearly better than all the others, then the exact magnitude of the utilities in the states involved need not be precise. \n", + "The policy iteration algorithm works on this insight. \n", + "The algorithm executes two fundamental steps:\n", + "* **Policy evaluation**: Given a policy _πᵢ_, calculate _Uᵢ = U(πᵢ)_, the utility of each state if _πᵢ_ were to be executed.\n", + "* **Policy improvement**: Calculate a new policy _πᵢ₊₁_ using one-step look-ahead based on the utility values calculated.\n", + "\n", + "The algorithm terminates when the policy improvement step yields no change in the utilities. \n", + "Refer to **Figure 17.6** in the book to see how this is an improvement over value iteration.\n", + "We now have a simplified version of the Bellman equation\n", + "\n", + "$$U_i(s) = R(s) + \\gamma \\sum_{s'}P(s'\\ |\\ s, \\pi_i(s))U_i(s')$$\n", + "\n", + "An important observation in this equation is that this equation doesn't have the `max` operator, which makes it linear.\n", + "For _n_ states, we have _n_ linear equations with _n_ unknowns, which can be solved exactly in time _**O(n³)**_.\n", + "For more implementational details, have a look at **Section 17.3**.\n", + "Let us now look at how the expected utility is found and how `policy_iteration` is implemented." + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def expected_utility(a, s, U, mdp):\n",
    +       "    """The expected utility of doing a in state s, according to the MDP and U."""\n",
    +       "    return sum([p * U[s1] for (p, s1) in mdp.T(s, a)])\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(expected_utility)" + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def policy_iteration(mdp):\n",
    +       "    """Solve an MDP by policy iteration [Figure 17.7]"""\n",
    +       "    U = {s: 0 for s in mdp.states}\n",
    +       "    pi = {s: random.choice(mdp.actions(s)) for s in mdp.states}\n",
    +       "    while True:\n",
    +       "        U = policy_evaluation(pi, U, mdp)\n",
    +       "        unchanged = True\n",
    +       "        for s in mdp.states:\n",
    +       "            a = argmax(mdp.actions(s), key=lambda a: expected_utility(a, s, U, mdp))\n",
    +       "            if a != pi[s]:\n",
    +       "                pi[s] = a\n",
    +       "                unchanged = False\n",
    +       "        if unchanged:\n",
    +       "            return pi\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(policy_iteration)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
    Fortunately, it is not necessary to do _exact_ policy evaluation. \n", + "The utilities can instead be reasonably approximated by performing some number of simplified value iteration steps.\n", + "The simplified Bellman update equation for the process is\n", + "\n", + "$$U_{i+1}(s) \\leftarrow R(s) + \\gamma\\sum_{s'}P(s'\\ |\\ s,\\pi_i(s))U_{i}(s')$$\n", + "\n", + "and this is repeated _k_ times to produce the next utility estimate. This is called _modified policy iteration_." + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "
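The simplified update above can be sketched in a few lines: _k_ in-place sweeps of the max-free Bellman update under a fixed policy approach the exact utilities geometrically. The toy MDP, the fixed policy (A → "X", B → "Y") and the reward numbers are illustrative assumptions.

```python
# Sketch of the simplified (max-free) Bellman update behind modified policy
# iteration: k in-place sweeps under a fixed policy.  All numbers here are
# illustrative; only the update rule mirrors the equation above.
gamma, k = 0.9, 100
R = {"A": -0.5, "B": 5.0, "End": 0.0}
T_pi = {                      # outcome lists under the fixed policy
    "A": [(0.3, "A"), (0.7, "B")],   # action "X" in A
    "B": [(1.0, "A")],               # action "Y" in B
    "End": [],
}


def policy_evaluation(T_pi, R, gamma, k):
    """Approximate the utilities of a fixed policy with k update sweeps."""
    U = {s: 0.0 for s in T_pi}
    for _ in range(k):
        for s in T_pi:
            U[s] = R[s] + gamma * sum(p * U[s1] for (p, s1) in T_pi[s])
    return U


U = policy_evaluation(T_pi, R, gamma, k)
print(sorted(U.items()))
```

Because each sweep contracts the error by a factor of γ, even a modest _k_ brings the estimate close to the exact linear-system solution.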

    \n", + "\n", + "
    def policy_evaluation(pi, U, mdp, k=20):\n",
    +       "    """Return an updated utility mapping U from each state in the MDP to its\n",
    +       "    utility, using an approximation (modified policy iteration)."""\n",
    +       "    R, T, gamma = mdp.R, mdp.T, mdp.gamma\n",
    +       "    for i in range(k):\n",
    +       "        for s in mdp.states:\n",
    +       "            U[s] = R(s) + gamma * sum([p * U[s1] for (p, s1) in T(s, pi[s])])\n",
    +       "    return U\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(policy_evaluation)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let us now solve **`sequential_decision_environment`** using `policy_iteration`." + ] + }, + { + "cell_type": "code", + "execution_count": 18, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{(0, 0): (0, 1),\n", + " (0, 1): (0, 1),\n", + " (0, 2): (1, 0),\n", + " (1, 0): (1, 0),\n", + " (1, 2): (1, 0),\n", + " (2, 0): (0, 1),\n", + " (2, 1): (0, 1),\n", + " (2, 2): (1, 0),\n", + " (3, 0): (-1, 0),\n", + " (3, 1): None,\n", + " (3, 2): None}" + ] + }, + "execution_count": 18, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "policy_iteration(sequential_decision_environment)" + ] + }, + { + "cell_type": "code", + "execution_count": 19, + "metadata": {}, + "outputs": [ + { + "data": { + "text/markdown": [ + "### AIMA3e\n", + "__function__ POLICY-ITERATION(_mdp_) __returns__ a policy \n", + " __inputs__: _mdp_, an MDP with states _S_, actions _A_(_s_), transition model _P_(_s′_ | _s_, _a_) \n", + " __local variables__: _U_, a vector of utilities for states in _S_, initially zero \n", + "        _π_, a policy vector indexed by state, initially random \n", + "\n", + " __repeat__ \n", + "   _U_ ← POLICY\\-EVALUATION(_π_, _U_, _mdp_) \n", + "   _unchanged?_ ← true \n", + "   __for each__ state _s_ __in__ _S_ __do__ \n", + "     __if__ max_a_ ∈ _A_(_s_) Σ_s′_ _P_(_s′_ | _s_, _a_) _U_\\[_s′_\\] > Σ_s′_ _P_(_s′_ | _s_, _π_\\[_s_\\]) _U_\\[_s′_\\] __then do__ \n", + "       _π_\\[_s_\\] ← argmax_a_ ∈ _A_(_s_) Σ_s′_ _P_(_s′_ | _s_, _a_) _U_\\[_s′_\\] \n", + "       _unchanged?_ ← false \n", + " __until__ _unchanged?_ \n", + " __return__ _π_ \n", + "\n", + "---\n", + "__Figure ??__ The policy iteration algorithm for calculating an optimal policy." 
+ ], + "text/plain": [ + "" + ] + }, + "execution_count": 19, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "pseudocode('Policy-Iteration')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### AIMA3e\n", + "__function__ POLICY-ITERATION(_mdp_) __returns__ a policy \n", + " __inputs__: _mdp_, an MDP with states _S_, actions _A_(_s_), transition model _P_(_s′_ | _s_, _a_) \n", + " __local variables__: _U_, a vector of utilities for states in _S_, initially zero \n", + "        _π_, a policy vector indexed by state, initially random \n", + "\n", + " __repeat__ \n", + "   _U_ ← POLICY\\-EVALUATION(_π_, _U_, _mdp_) \n", + "   _unchanged?_ ← true \n", + "   __for each__ state _s_ __in__ _S_ __do__ \n", + "     __if__ max_a_ ∈ _A_(_s_) Σ_s′_ _P_(_s′_ | _s_, _a_) _U_\\[_s′_\\] > Σ_s′_ _P_(_s′_ | _s_, _π_\\[_s_\\]) _U_\\[_s′_\\] __then do__ \n", + "       _π_\\[_s_\\] ← argmax_a_ ∈ _A_(_s_) Σ_s′_ _P_(_s′_ | _s_, _a_) _U_\\[_s′_\\] \n", + "       _unchanged?_ ← false \n", + " __until__ _unchanged?_ \n", + " __return__ _π_ \n", + "\n", + "---\n", + "__Figure ??__ The policy iteration algorithm for calculating an optimal policy." + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "## Sequential Decision Problems\n", + "\n", + "Now that we have the tools required to solve MDPs, let us see how Sequential Decision Problems can be solved step by step and how a few built-in tools in the GridMDP class help us better analyse the problem at hand. \n", + "As always, we will work with the grid world from **Figure 17.1** from the book.\n", + "![title](images/grid_mdp.jpg)\n", + "
    This is the environment for our agent.\n", + "We assume for now that the environment is _fully observable_, so that the agent always knows where it is.\n", + "We also assume that the transitions are **Markovian**, that is, the probability of reaching state $s'$ from state $s$ depends only on $s$ and not on the history of earlier states.\n", + "Almost all stochastic decision problems can be reframed as a Markov Decision Process just by tweaking the definition of a _state_ for that particular problem.\n", + "
    \n", + "However, the actions of our agent in this environment are unreliable. In other words, the motion of our agent is stochastic. \n", + "

    \n", + "More specifically, the agent may\n", + "* move correctly in the intended direction with probability _0.8_,\n", + "* move $90^\\circ$ to the right of the intended direction with probability _0.1_, or\n", + "* move $90^\\circ$ to the left of the intended direction with probability _0.1_.\n", + "

    \n", + "The agent stays put if it bumps into a wall.\n", + "![title](images/grid_mdp_agent.jpg)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "These properties of the agent make up the transition model and are hardcoded into the `GridMDP` class, as you can see below." + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
        def T(self, state, action):\n",
    +       "        if action is None:\n",
    +       "            return [(0.0, state)]\n",
    +       "        else:\n",
    +       "            return self.transitions[state][action]\n",
    +       "
    \n", + "\n", + "\n" ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(GridMDP.T)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To completely define our task environment, we need to specify the utility function for the agent. \n", + "This is the function that gives the agent a rough estimate of how good being in a particular state is, or how much _reward_ an agent receives by being in that state.\n", + "The agent then tries to maximize the reward it gets.\n", + "As the decision problem is sequential, the utility function will depend on a sequence of states rather than on a single state.\n", + "For now, we simply stipulate that in each state $s$, the agent receives a finite reward $R(s)$.\n", + "\n", + "For any given state, the actions the agent can take are encoded as given below:\n", + "- Move Up: (0, 1)\n", + "- Move Down: (0, -1)\n", + "- Move Left: (-1, 0)\n", + "- Move Right: (1, 0)\n", + "- Do nothing: `None`\n", + "\n", + "We now wonder what a valid solution to the problem might look like. \n", + "We cannot have fixed action sequences as the environment is stochastic and we can eventually end up in an undesirable state.\n", + "Therefore, a solution must specify what the agent should do for _any_ state the agent might reach.\n", + "
    \n", + "Such a solution is known as a **policy** and is usually denoted by $\\pi$.\n", + "
    \n", + "The **optimal policy** is the policy that yields the highest expected utility and is usually denoted by $\\pi^*$.\n", + "
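Concretely, a policy can be represented as a plain mapping from states to actions. The following standalone sketch uses hypothetical states and the action encoding listed above (it is an illustration, not the library's internal representation):

```python
# A policy maps every state the agent might reach to an action.
# Hypothetical (x, y) states; actions use the encoding from above:
# (0, 1) = up, (1, 0) = right, None = do nothing.
policy = {
    (0, 0): (0, 1),   # from the start state, move up
    (0, 1): (0, 1),
    (0, 2): (1, 0),   # along the top row, move right
    (3, 2): None,     # terminal state: do nothing
}

# The arrow characters used for display:
chars = {(1, 0): '>', (0, 1): '^', (-1, 0): '<', (0, -1): 'v', None: '.'}
print(''.join(chars[policy[s]] for s in sorted(policy)))  # '^^>.'
```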
    \n", + "The `GridMDP` class has a useful method `to_arrows` that outputs a grid showing the direction the agent should move, given a policy.\n", + "We will use this later to better understand the properties of the environment." + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
        def to_arrows(self, policy):\n",
    +       "        chars = {\n",
    +       "            (1, 0): '>', (0, 1): '^', (-1, 0): '<', (0, -1): 'v', None: '.'}\n",
    +       "        return self.to_grid({s: chars[a] for (s, a) in policy.items()})\n",
    +       "
    \n", + "\n", + "\n" ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(GridMDP.to_arrows)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This method maps the actions that the agent can take (described above) to characters representing arrows, and shows them in a grid format for human visualization purposes. \n", + "It converts the received policy from a `dictionary` to a grid using the `to_grid` method." + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
        def to_grid(self, mapping):\n",
    +       "        """Convert a mapping from (x, y) to v into a [[..., v, ...]] grid."""\n",
    +       "        return list(reversed([[mapping.get((x, y), None)\n",
    +       "                               for x in range(self.cols)]\n",
    +       "                              for y in range(self.rows)]))\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(GridMDP.to_grid)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now that we have all the tools required and a good understanding of the agent and the environment, we consider some cases and see how the agent should behave for each case." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Case 1\n", + "---\n", + "R(s) = -0.04 in all states except terminal states" + ] + }, + { + "cell_type": "code", + "execution_count": 23, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "# Note that this environment is also initialized in mdp.py by default\n", + "sequential_decision_environment = GridMDP([[-0.04, -0.04, -0.04, +1],\n", + " [-0.04, None, -0.04, -1],\n", + " [-0.04, -0.04, -0.04, -0.04]],\n", + " terminals=[(3, 2), (3, 1)])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We will use the `best_policy` function to find the best policy for this environment.\n", + "But, as you can see, `best_policy` requires a utility function as well.\n", + "We already know that the utility function can be found by `value_iteration`.\n", + "Hence, our best policy is:" + ] + }, + { + "cell_type": "code", + "execution_count": 24, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "pi = best_policy(sequential_decision_environment, value_iteration(sequential_decision_environment, .001))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can now use the `to_arrows` method to see how our agent should pick its actions in the environment." 
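Under the hood, `value_iteration` repeatedly applies the Bellman update $U(s) \leftarrow R(s) + \gamma \max_a \sum_{s'} P(s'\,|\,s,a)\,U(s')$ until the utilities converge. A minimal standalone sketch on a hypothetical two-state MDP (not the library's implementation; all values here are made up for illustration):

```python
# Hypothetical 2-state MDP: T[s][a] is a list of (probability, next_state),
# R is the reward per state.
R = {'A': 0.0, 'B': 1.0}
T = {
    'A': {'stay': [(0.9, 'A'), (0.1, 'B')], 'go': [(0.9, 'B'), (0.1, 'A')]},
    'B': {'stay': [(0.9, 'B'), (0.1, 'A')], 'go': [(0.9, 'A'), (0.1, 'B')]},
}
gamma, epsilon = 0.9, 1e-6

U = {s: 0.0 for s in R}
while True:
    delta = 0.0
    for s in R:
        # Bellman update: U(s) = R(s) + gamma * max_a sum_s' P(s'|s,a) U(s')
        u_new = R[s] + gamma * max(
            sum(p * U[s1] for p, s1 in T[s][a]) for a in T[s])
        delta = max(delta, abs(u_new - U[s]))
        U[s] = u_new
    if delta < epsilon * (1 - gamma) / gamma:
        break

print(round(U['A'], 3), round(U['B'], 3))
```

As expected, the state with the reward ends up with the higher utility, and the utility of the other state is pulled up by the discounted chance of reaching it.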
+ ] + }, + { + "cell_type": "code", + "execution_count": 25, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "> > > .\n", + "^ None ^ .\n", + "^ > ^ <\n" + ] + } + ], + "source": [ + "from utils import print_table\n", + "print_table(sequential_decision_environment.to_arrows(pi))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This is exactly the output we expected\n", + "
    \n", + "![title](images/-0.04.jpg)\n", + "
    \n", + "Notice that, because the cost of taking a step is fairly small compared with the penalty for ending up in `(4, 2)` by accident, the optimal policy is conservative. \n", + "In state `(3, 1)` it recommends taking the long way round, rather than taking the shorter way and risking getting a large negative reward of -1 in `(4, 2)`." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Case 2\n", + "---\n", + "R(s) = -0.4 in all states except in terminal states" + ] + }, + { + "cell_type": "code", + "execution_count": 26, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "sequential_decision_environment = GridMDP([[-0.4, -0.4, -0.4, +1],\n", + " [-0.4, None, -0.4, -1],\n", + " [-0.4, -0.4, -0.4, -0.4]],\n", + " terminals=[(3, 2), (3, 1)])" + ] + }, + { + "cell_type": "code", + "execution_count": 27, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "> > > .\n", + "^ None ^ .\n", + "^ > ^ <\n" + ] + } + ], + "source": [ + "pi = best_policy(sequential_decision_environment, value_iteration(sequential_decision_environment, .001))\n", + "from utils import print_table\n", + "print_table(sequential_decision_environment.to_arrows(pi))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This is exactly the output we expected\n", + "![title](images/-0.4.jpg)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As the reward for each state is now more negative, life is certainly more unpleasant.\n", + "The agent takes the shortest route to the +1 state and is willing to risk falling into the -1 state by accident." 
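The difference between the two cases comes down to the expected utility of each move under the stochastic motion model. A tiny standalone helper (hypothetical, not part of `mdp.py`) makes the 0.8/0.1/0.1 mixture explicit:

```python
def expected_utility(u_intended, u_right, u_left):
    """Expected utility of a single move under the 0.8/0.1/0.1 motion model.

    Hypothetical helper: u_intended, u_right and u_left are the utilities of
    the cells reached by moving as intended, slipping 90 degrees right, and
    slipping 90 degrees left."""
    return 0.8 * u_intended + 0.1 * u_right + 0.1 * u_left

# With a pit (utility -1) lying 90 degrees to the right, even a high-utility
# intended cell is dragged down by the 10% chance of slipping into the pit:
print(expected_utility(0.8, -1.0, 0.5))  # 0.8*0.8 - 0.1 + 0.05 = 0.59
```

When the living reward is only -0.04, this 10% risk outweighs the small cost of a detour; at -0.4 the step cost dominates and the risk becomes worth taking.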
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Case 3\n", + "---\n", + "R(s) = -4 in all states except terminal states" + ] + }, + { + "cell_type": "code", + "execution_count": 28, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "sequential_decision_environment = GridMDP([[-4, -4, -4, +1],\n", + " [-4, None, -4, -1],\n", + " [-4, -4, -4, -4]],\n", + " terminals=[(3, 2), (3, 1)])" + ] + }, + { + "cell_type": "code", + "execution_count": 29, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "> > > .\n", + "^ None > .\n", + "> > > ^\n" + ] + } + ], + "source": [ + "pi = best_policy(sequential_decision_environment, value_iteration(sequential_decision_environment, .001))\n", + "from utils import print_table\n", + "print_table(sequential_decision_environment.to_arrows(pi))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This is exactly the output we expected\n", + "![title](images/-4.jpg)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The living reward for each state is now lower than the least rewarding terminal. Life is so _painful_ that the agent heads for the nearest exit as even the worst exit is less painful than any living state." 
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Case 4\n", + "---\n", + "R(s) = 4 in all states except terminal states" + ] + }, + { + "cell_type": "code", + "execution_count": 30, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "sequential_decision_environment = GridMDP([[4, 4, 4, +1],\n", + " [4, None, 4, -1],\n", + " [4, 4, 4, 4]],\n", + " terminals=[(3, 2), (3, 1)])" + ] + }, + { + "cell_type": "code", + "execution_count": 31, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "> > < .\n", + "> None < .\n", + "> > > v\n" + ] + } + ], + "source": [ + "pi = best_policy(sequential_decision_environment, value_iteration(sequential_decision_environment, .001))\n", + "from utils import print_table\n", + "print_table(sequential_decision_environment.to_arrows(pi))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In this case, the output we expect is\n", + "![title](images/4.jpg)\n", + "
    \n", + "Life is positively enjoyable here, and the agent avoids _both_ exits.\n", + "Even though the output we get is not exactly what we want, it is definitely not wrong.\n", + "The scenario here requires the agent to do anything but reach a terminal state, as this is the only way the agent can maximize its reward (total reward tends to infinity), and the program does just that.\n", + "
    \n", + "Currently, the `GridMDP` class doesn't support an explicit marker for a \"do whatever you like\" action or a \"don't care\" condition.\n", + "You can, however, extend the class to do so.\n", + "
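One way such an extension could look is sketched below. This is a purely hypothetical standalone snippet: it mimics the character-mapping idea of `to_arrows` with an extra "don't care" sentinel, without depending on the `mdp` module:

```python
# Hypothetical standalone sketch (not library code): adding a "don't care"
# marker '*' alongside the usual arrow characters.
DONT_CARE = 'any'  # hypothetical sentinel for "do whatever you like"

chars = {(1, 0): '>', (0, 1): '^', (-1, 0): '<', (0, -1): 'v',
         None: '.', DONT_CARE: '*'}

policy = {(0, 0): (1, 0), (1, 0): DONT_CARE, (2, 0): None}
arrows = {s: chars[a] for s, a in policy.items()}
print(arrows)  # {(0, 0): '>', (1, 0): '*', (2, 0): '.'}
```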
    \n", + "For in-depth knowledge about sequential decision problems, refer to **Section 17.1** in the AIMA book." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## POMDP\n", + "---\n", + "Partially Observable Markov Decision Problems\n", + "\n", + "To recap, a Markov decision process or MDP is defined as:\n", + "- a sequential decision problem for a fully observable, stochastic environment with a Markovian transition model and additive rewards.\n", + "\n", + "An MDP consists of a set of states (with an initial state $s_0$); a set $A(s)$ of actions\n", + "in each state; a transition model $P(s' | s, a)$; and a reward function $R(s)$.\n", + "\n", + "The MDP seeks to make sequential decisions to occupy states so as to maximise some combination of the reward function $R(s)$.\n", + "\n", + "The characteristic problem of the MDP is hence to identify the optimal policy function $\\pi^*(s)$ that provides the _utility-maximising_ action $a$ to be taken when the current state is $s$.\n", + "\n", + "### Belief vector\n", + "\n", + "**Note**: The book refers to the _belief vector_ as the _belief state_. We use the former terminology here to retain our ability to refer to the belief vector as a _probability distribution over states_.\n", + "\n", + "The solution of an MDP is subject to certain properties of the problem which are assumed and justified in [Section 17.1]. One critical assumption is that the agent is **fully aware of its current state at all times**.\n", + "\n", + "A tedious (but rewarding, as we will see) way of expressing this is in terms of the **belief vector** $b$ of the agent. The belief vector is a function mapping states to probabilities or certainties of being in those states.\n", + "\n", + "Consider an agent that is fully aware that it is in state $s_i$ in the state space $(s_1, s_2, ... 
b(s_n))$ given by the function $b(s)$:\n", + "\\begin{align*}\n", + "b(s) &= 0 \\quad \\text{if }s \\neq s_i \\\\ &= 1 \\quad \\text{if } s = s_i\n", + "\\end{align*}\n", + "\n", + "Note that $b(s)$ is a probability distribution that necessarily sums to $1$ over all $s$.\n", + "\n" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "### POMDPs - a conceptual outline\n", + "\n", + "The POMDP really has only two modifications to the **problem formulation** compared to the MDP.\n", + "\n", + "- **Belief state** - In the real world, the current state of an agent is often not known with complete certainty. This makes the concept of a belief vector extremely relevant. It allows the agent to represent different degrees of certainty with which it _believes_ it is in each state.\n", + "\n", + "- **Evidence percepts** - In the real world, agents often have certain kinds of evidence, collected from sensors. They can use the probability distribution of observed evidence, conditional on state, to consolidate their information. This is a known distribution $P(e\\ |\\ s)$ - $e$ being an evidence, and $s$ being the state it is conditional on.\n", + "\n", + "Consider the world we used for the MDP. \n", + "\n", + "![title](images/grid_mdp.jpg)\n", + "\n", + "#### Using the belief vector\n", + "An agent beginning at $(1, 1)$ may not be certain that it is indeed in $(1, 1)$. Consider a belief vector $b$ such that:\n", + "\\begin{align*}\n", + " b((1,1)) &= 0.8 \\\\\n", + " b((2,1)) &= 0.1 \\\\\n", + " b((1,2)) &= 0.1 \\\\\n", + " b(s) &= 0 \\quad \\quad \\forall \\text{ other } s\n", + "\\end{align*}\n", + "\n", + "By horizontally catenating each row, we can represent this as an 11-dimensional vector (omitting $(2, 2)$).\n", + "\n", + "Thus, taking $s_1 = (1, 1)$, $s_2 = (1, 2)$, ... 
$s_{11} = (4,3)$, we have $b$:\n", + "\n", + "$b = (0.8, 0.1, 0, 0, 0.1, 0, 0, 0, 0, 0, 0)$ \n", + "\n", + "This fully captures how certain the agent is of its own state.\n", + "\n", + "#### Using evidence\n", + "The evidence observed here could be the number of adjacent 'walls' or 'dead ends' observed by the agent. We assume that the agent cannot 'orient' the walls - only count them.\n", + "\n", + "In this case, $e$ can take only two values, 1 and 2. This gives $P(e\\ |\\ s)$ as:\n", + "\\begin{align*}\n", + "    P(e=2\\ |\\ s) &= \\frac{1}{7} \\quad \\forall \\quad s \\in \\{s_1, s_2, s_4, s_5, s_8, s_9, s_{11}\\}\\\\\n", + "    P(e=1\\ |\\ s) &= \\frac{1}{4} \\quad \\forall \\quad s \\in \\{s_3, s_6, s_7, s_{10}\\} \\\\\n", + "    P(e\\ |\\ s) &= 0 \\quad \\forall \\quad \\text{ other } s, e\n", + "\\end{align*}\n", + "\n", + "Note that the implications of the evidence on the state must be known **a priori** to the agent. Ways of reliably learning this distribution from percepts are beyond the scope of this notebook." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### POMDPs - a rigorous outline\n", + "\n", + "A POMDP is thus a sequential decision problem for a *partially* observable, stochastic environment with a Markovian transition model, a known 'sensor model' for inferring state from observation, and additive rewards. 
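The belief vector from the example above is easy to sanity-check in code (a standalone sketch):

```python
# Belief vector from the example: states s1..s11 row by row,
# omitting the obstacle at (2, 2).
b = [0.8, 0.1, 0, 0, 0.1, 0, 0, 0, 0, 0, 0]

# A belief vector is a probability distribution over the states:
assert abs(sum(b) - 1.0) < 1e-9

# The most likely state is s1 = (1, 1):
print(b.index(max(b)))  # 0
```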
\n", + "\n", + "Practically, a POMDP has the following, which an MDP also has:\n", + "- a set of states, each denoted by $s$\n", + "- a set of actions available in each state, $A(s)$\n", + "- a reward accrued on attaining some state, $R(s)$\n", + "- a transition probability $P(s'\\ |\\ s, a)$ of action $a$ changing the state from $s$ to $s'$\n", + "\n", + "And the following, which an MDP does not:\n", + "- a sensor model $P(e\\ |\\ s)$ on evidence conditional on states\n", + "\n", + "Additionally, the POMDP is now uncertain of its current state hence has:\n", + "- a belief vector $b$ representing the certainty of being in each state (as a probability distribution)\n", + "\n", + "\n", + "#### New uncertainties\n", + "\n", + "It is useful to intuitively appreciate the new uncertainties that have arisen in the agent's awareness of its own state.\n", + "\n", + "- At any point, the agent has belief vector $b$, the distribution of its believed likelihood of being in each state $s$.\n", + "- For each of these states $s$ that the agent may **actually** be in, it has some set of actions given by $A(s)$.\n", + "- Each of these actions may transport it to some other state $s'$, assuming an initial state $s$, with probability $P(s'\\ |\\ s, a)$\n", + "- Once the action is performed, the agent receives a percept $e$. $P(e\\ |\\ s)$ now tells it the chances of having perceived $e$ for each state $s$. 
The agent must use this information to update its new belief state appropriately.\n", + "\n", + "#### Evolution of the belief vector - the `FORWARD` function\n", + "\n", + "The new belief vector $b'(s')$ after an action $a$ on the belief vector $b(s)$ and the noting of evidence $e$ is:\n", + "$$ b'(s') = \\alpha P(e\\ |\\ s') \\sum_s P(s'\\ |\\ s, a) b(s)$$ \n", + "\n", + "where $\\alpha$ is a normalising constant (to retain the interpretation of $b$ as a probability distribution).\n", + "\n", + "This equation simply sums, over every possible previous state $s$, the likelihood of reaching state $s'$ times the initial likelihood of being in $s$. This sum is then weighted by the likelihood that the observed evidence is consistent with the new state $s'$. \n", + "\n", + "This function is represented as `b' = FORWARD(b, a, e)`.\n", + "\n", + "#### Probability distribution of the evolving belief vector\n", + "\n", + "The goal here is to find $P(b'\\ |\\ b, a)$ - the probability that action $a$ transforms belief vector $b$ into belief vector $b'$. 
The following steps illustrate this -\n", + "\n", + "The probability of observing evidence $e$ when action $a$ is enacted on belief vector $b$ can be distributed over each possible new state $s'$ resulting from it:\n", + "\\begin{align*}\n", + "    P(e\\ |\\ b, a) &= \\sum_{s'} P(e\\ |\\ b, a, s') P(s'\\ |\\ b, a) \\\\\n", + "    &= \\sum_{s'} P(e\\ |\\ s') P(s'\\ |\\ b, a) \\\\\n", + "    &= \\sum_{s'} P(e\\ |\\ s') \\sum_s P(s'\\ |\\ s, a) b(s)\n", + "\\end{align*}\n", + "\n", + "The probability of getting belief vector $b'$ from $b$ by application of action $a$ can thus be summed over all possible evidences $e$:\n", + "\\begin{align*}\n", + "    P(b'\\ |\\ b, a) &= \\sum_{e} P(b'\\ |\\ b, a, e) P(e\\ |\\ b, a) \\\\\n", + "    &= \\sum_{e} P(b'\\ |\\ b, a, e) \\sum_{s'} P(e\\ |\\ s') \\sum_s P(s'\\ |\\ s, a) b(s)\n", + "\\end{align*}\n", + "\n", + "where $P(b'\\ |\\ b, a, e) = 1$ if $b' = $ `FORWARD(b, a, e)` and $= 0$ otherwise.\n", + "\n", + "Given initial and final belief states $b$ and $b'$, the transition probabilities still depend on the action $a$ and observed evidence $e$. Some belief states may be achievable by certain actions, but have non-zero probabilities for states prohibited by the evidence $e$. The above condition thus ensures that only valid combinations of $(b', b, a, e)$ are considered.\n", + "\n", + "#### A modified reward space\n", + "\n", + "For MDPs, the reward space was simple - one reward per available state. However, for a belief vector $b(s)$, the expected reward is now:\n", + "$$\\rho(b) = \\sum_s b(s) R(s)$$\n", + "\n", + "Thus, as the belief vector can take infinitely many values of the distribution over states, so can the reward for each belief vector vary over a hyperplane in the belief space, or space of states (planes in an $N$-dimensional space are formed by a linear combination of the axes)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now that we know the basics, let's have a look at the `POMDP` class." 
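The `FORWARD` update can be written out in a few lines. The following is a standalone sketch on a toy two-state world (the transition and sensor matrices are hypothetical, and this is not the library's implementation):

```python
def forward(b, a, e, T, E):
    """b'(s') = alpha * P(e|s') * sum_s P(s'|s,a) * b(s)."""
    n = len(b)
    # Predict: push the belief through the transition model.
    predicted = [sum(T[a][s][s1] * b[s] for s in range(n)) for s1 in range(n)]
    # Update: weight by the sensor model, then normalise by alpha.
    unnorm = [E[s1][e] * predicted[s1] for s1 in range(n)]
    alpha = 1.0 / sum(unnorm)
    return [alpha * u for u in unnorm]

# Toy 2-state example: action 0 keeps the current state with probability 0.9;
# the sensor reports the true state with probability 0.6.
T = {0: [[0.9, 0.1], [0.1, 0.9]]}
E = [[0.6, 0.4], [0.4, 0.6]]
b1 = forward([0.5, 0.5], 0, 0, T, E)
print(b1)  # belief shifts towards state 0 after observing e = 0
```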
+ ] + }, + { + "cell_type": "code", + "execution_count": 32, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    class POMDP(MDP):\n",
    +       "\n",
    +       "    """A Partially Observable Markov Decision Process, defined by\n",
    +       "    a transition model P(s'|s,a), actions A(s), a reward function R(s),\n",
    +       "    and a sensor model P(e|s). We also keep track of a gamma value,\n",
    +       "    for use by algorithms. The transition and the sensor models\n",
    +       "    are defined as matrices. We also keep track of the possible states\n",
    +       "    and actions for each state. [page 659]."""\n",
    +       "\n",
    +       "    def __init__(self, actions, transitions=None, evidences=None, rewards=None, states=None, gamma=0.95):\n",
    +       "        """Initialize variables of the pomdp"""\n",
    +       "\n",
    +       "        if not (0 < gamma <= 1):\n",
    +       "            raise ValueError('A POMDP must have 0 < gamma <= 1')\n",
    +       "\n",
    +       "        self.states = states\n",
    +       "        self.actions = actions\n",
    +       "\n",
    +       "        # transition model cannot be undefined\n",
    +       "        self.t_prob = transitions or {}\n",
    +       "        if not self.t_prob:\n",
    +       "            print('Warning: Transition model is undefined')\n",
    +       "        \n",
    +       "        # sensor model cannot be undefined\n",
    +       "        self.e_prob = evidences or {}\n",
    +       "        if not self.e_prob:\n",
    +       "            print('Warning: Sensor model is undefined')\n",
    +       "        \n",
    +       "        self.gamma = gamma\n",
    +       "        self.rewards = rewards\n",
    +       "\n",
    +       "    def remove_dominated_plans(self, input_values):\n",
    +       "        """\n",
    +       "        Remove dominated plans.\n",
    +       "        This method finds all the lines contributing to the\n",
    +       "        upper surface and removes those which don't.\n",
    +       "        """\n",
    +       "\n",
    +       "        values = [val for action in input_values for val in input_values[action]]\n",
    +       "        values.sort(key=lambda x: x[0], reverse=True)\n",
    +       "\n",
    +       "        best = [values[0]]\n",
    +       "        y1_max = max(val[1] for val in values)\n",
    +       "        tgt = values[0]\n",
    +       "        prev_b = 0\n",
    +       "        prev_ix = 0\n",
    +       "        while tgt[1] != y1_max:\n",
    +       "            min_b = 1\n",
    +       "            min_ix = 0\n",
    +       "            for i in range(prev_ix + 1, len(values)):\n",
    +       "                if values[i][0] - tgt[0] + tgt[1] - values[i][1] != 0:\n",
    +       "                    trans_b = (values[i][0] - tgt[0]) / (values[i][0] - tgt[0] + tgt[1] - values[i][1])\n",
    +       "                    if 0 <= trans_b <= 1 and trans_b > prev_b and trans_b < min_b:\n",
    +       "                        min_b = trans_b\n",
    +       "                        min_ix = i\n",
    +       "            prev_b = min_b\n",
    +       "            prev_ix = min_ix\n",
    +       "            tgt = values[min_ix]\n",
    +       "            best.append(tgt)\n",
    +       "\n",
    +       "        return self.generate_mapping(best, input_values)\n",
    +       "\n",
    +       "    def remove_dominated_plans_fast(self, input_values):\n",
    +       "        """\n",
    +       "        Remove dominated plans using approximations.\n",
    +       "        Resamples the upper boundary at intervals of 100 and\n",
    +       "        finds the maximum values at these points.\n",
    +       "        """\n",
    +       "\n",
    +       "        values = [val for action in input_values for val in input_values[action]]\n",
    +       "        values.sort(key=lambda x: x[0], reverse=True)\n",
    +       "\n",
    +       "        best = []\n",
    +       "        sr = 100\n",
    +       "        for i in range(sr + 1):\n",
    +       "            x = i / float(sr)\n",
    +       "            maximum = (values[0][1] - values[0][0]) * x + values[0][0]\n",
    +       "            tgt = values[0]\n",
    +       "            for value in values:\n",
    +       "                val = (value[1] - value[0]) * x + value[0]\n",
    +       "                if val > maximum:\n",
    +       "                    maximum = val\n",
    +       "                    tgt = value\n",
    +       "\n",
    +       "            if all(any(tgt != v) for v in best):\n",
    +       "                best.append(tgt)\n",
    +       "\n",
    +       "        return self.generate_mapping(best, input_values)\n",
    +       "\n",
    +       "    def generate_mapping(self, best, input_values):\n",
    +       "        """Generate mappings after removing dominated plans"""\n",
    +       "\n",
    +       "        mapping = defaultdict(list)\n",
    +       "        for value in best:\n",
    +       "            for action in input_values:\n",
    +       "                if any(all(value == v) for v in input_values[action]):\n",
    +       "                    mapping[action].append(value)\n",
    +       "\n",
    +       "        return mapping\n",
    +       "\n",
    +       "    def max_difference(self, U1, U2):\n",
    +       "        """Find maximum difference between two utility mappings"""\n",
    +       "\n",
    +       "        for k, v in U1.items():\n",
    +       "            sum1 = 0\n",
    +       "            for element in U1[k]:\n",
    +       "                sum1 += sum(element)\n",
    +       "            sum2 = 0\n",
    +       "            for element in U2[k]:\n",
    +       "                sum2 += sum(element)\n",
    +       "        return abs(sum1 - sum2)\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(POMDP)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The `POMDP` class includes all variables of the `MDP` class and additionally also stores the sensor model in `e_prob`.\n", + "
    \n", + "
    \n", + "`remove_dominated_plans`, `remove_dominated_plans_fast`, `generate_mapping` and `max_difference` are helper methods for `pomdp_value_iteration` which will be explained shortly." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To understand how we can model a partially observable MDP, let's take a simple example.\n", + "Let's consider a simple two state world.\n", + "The states are labelled 0 and 1, with the reward at state 0 being 0 and at state 1 being 1.\n", + "
    \n", + "There are two actions:\n", + "
    \n", + "`Stay`: stays put with probability 0.9 and\n", + "`Go`: switches to the other state with probability 0.9.\n", + "
    \n", + "For now, let's assume the discount factor `gamma` to be 1.\n", + "
    \n", + "The sensor reports the correct state with probability 0.6.\n", + "
    \n", + "This is a simple problem with a trivial solution.\n", + "Obviously the agent should `Stay` when it thinks it is in state 1 and `Go` when it thinks it is in state 0.\n", + "
    \n", + "The belief space can be viewed as one-dimensional because the two probabilities must sum to 1." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's model this POMDP using the `POMDP` class." + ] + }, + { + "cell_type": "code", + "execution_count": 33, + "metadata": {}, + "outputs": [], + "source": [ + "# transition probability P(s'|s,a)\n", + "t_prob = [[[0.9, 0.1], [0.1, 0.9]], [[0.1, 0.9], [0.9, 0.1]]]\n", + "# evidence function P(e|s)\n", + "e_prob = [[[0.6, 0.4], [0.4, 0.6]], [[0.6, 0.4], [0.4, 0.6]]]\n", + "# reward function\n", + "rewards = [[0.0, 0.0], [1.0, 1.0]]\n", + "# discount factor\n", + "gamma = 0.95\n", + "# actions\n", + "actions = ('0', '1')\n", + "# states\n", + "states = ('0', '1')" + ] + }, + { + "cell_type": "code", + "execution_count": 34, + "metadata": {}, + "outputs": [], + "source": [ + "pomdp = POMDP(actions, t_prob, e_prob, rewards, states, gamma)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We have defined our `POMDP` object." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## POMDP VALUE ITERATION\n", + "Defining a POMDP is useless unless we can find a way to solve it. As POMDPs can have infinitely many belief states, we cannot calculate one utility value for each state as we did in `value_iteration` for MDPs.\n", + "
    \n", + "Instead of thinking about policies, we should think about conditional plans and how the expected utility of executing a fixed conditional plan varies with the initial belief state.\n", + "
    \n", + "If we bound the depth of the conditional plans, then there are only finitely many such plans and the continuous space of belief states will generally be divided into _regions_, each corresponding to a particular conditional plan that is optimal in that region. The utility function, being the maximum of a collection of hyperplanes, will be piecewise linear and convex.\n", + "
    \n", + "For the one-step plans `Stay` and `Go`, the utility values are as follows:\n", + "
    \n", + "
    \n", + "$$\\alpha_{|Stay|}(0) = R(0) + \\gamma(0.9R(0) + 0.1R(1)) = 0.1$$\n", + "$$\\alpha_{|Stay|}(1) = R(1) + \\gamma(0.9R(1) + 0.1R(0)) = 1.9$$\n", + "$$\\alpha_{|Go|}(0) = R(0) + \\gamma(0.9R(1) + 0.1R(0)) = 0.9$$\n", + "$$\\alpha_{|Go|}(1) = R(1) + \\gamma(0.9R(0) + 0.1R(1)) = 1.1$$" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The utility function can be found by `pomdp_value_iteration`.\n", + "
    \n", + "To summarize, it generates a set of all plans consisting of an action and, for each possible next percept, a plan in U with computed utility vectors.\n", + "The dominated plans are then removed from this set and the process is repeated till the maximum difference between the utility functions of two consecutive iterations reaches a value less than a threshold value." + ] + }, + { + "cell_type": "code", + "execution_count": 35, + "metadata": {}, + "outputs": [ + { + "data": { + "text/markdown": [ + "### AIMA3e\n", + "__function__ POMDP-VALUE-ITERATION(_pomdp_, _ε_) __returns__ a utility function \n", + " __inputs__: _pomdp_, a POMDP with states _S_, actions _A_(_s_), transition model _P_(_s′_ | _s_, _a_), \n", + "      sensor model _P_(_e_ | _s_), rewards _R_(_s_), discount _γ_ \n", + "     _ε_, the maximum error allowed in the utility of any state \n", + " __local variables__: _U_, _U′_, sets of plans _p_ with associated utility vectors _αp_ \n", + "\n", + " _U′_ ← a set containing just the empty plan \\[\\], with _α\\[\\]_(_s_) = _R_(_s_) \n", + " __repeat__ \n", + "   _U_ ← _U′_ \n", + "   _U′_ ← the set of all plans consisting of an action and, for each possible next percept, \n", + "     a plan in _U_ with utility vectors computed according to Equation(__??__) \n", + "   _U′_ ← REMOVE\\-DOMINATED\\-PLANS(_U′_) \n", + " __until__ MAX\\-DIFFERENCE(_U_, _U′_) < _ε_(1 − _γ_) ⁄ _γ_ \n", + " __return__ _U_ \n", + "\n", + "---\n", + "__Figure ??__ A high\\-level sketch of the value iteration algorithm for POMDPs. The REMOVE\\-DOMINATED\\-PLANS step and MAX\\-DIFFERENCE test are typically implemented as linear programs." + ], + "text/plain": [ + "" + ] + }, + "execution_count": 35, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "pseudocode('POMDP-Value-Iteration')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's have a look at the `pomdp_value_iteration` function." 
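The four one-step utility values given above can be verified numerically (taking $\gamma = 1$, as assumed earlier for the hand calculation):

```python
# Numerical check of the one-step plan utilities (gamma = 1).
R = [0.0, 1.0]  # R(0) = 0, R(1) = 1
gamma = 1.0

# Stay keeps the state with probability 0.9; Go switches with probability 0.9.
alpha_stay = [R[s] + gamma * (0.9 * R[s] + 0.1 * R[1 - s]) for s in (0, 1)]
alpha_go = [R[s] + gamma * (0.9 * R[1 - s] + 0.1 * R[s]) for s in (0, 1)]

print(alpha_stay, alpha_go)  # [0.1, 1.9] [0.9, 1.1]
```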
+ ] + }, + { + "cell_type": "code", + "execution_count": 36, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def pomdp_value_iteration(pomdp, epsilon=0.1):\n",
    +       "    """Solving a POMDP by value iteration."""\n",
    +       "\n",
    +       "    U = {'':[[0]* len(pomdp.states)]}\n",
    +       "    count = 0\n",
    +       "    while True:\n",
    +       "        count += 1\n",
    +       "        prev_U = U\n",
    +       "        values = [val for action in U for val in U[action]]\n",
    +       "        value_matxs = []\n",
    +       "        for i in values:\n",
    +       "            for j in values:\n",
    +       "                value_matxs.append([i, j])\n",
    +       "\n",
    +       "        U1 = defaultdict(list)\n",
    +       "        for action in pomdp.actions:\n",
    +       "            for u in value_matxs:\n",
    +       "                u1 = Matrix.matmul(Matrix.matmul(pomdp.t_prob[int(action)], Matrix.multiply(pomdp.e_prob[int(action)], Matrix.transpose(u))), [[1], [1]])\n",
    +       "                u1 = Matrix.add(Matrix.scalar_multiply(pomdp.gamma, Matrix.transpose(u1)), [pomdp.rewards[int(action)]])\n",
    +       "                U1[action].append(u1[0])\n",
    +       "\n",
    +       "        U = pomdp.remove_dominated_plans_fast(U1)\n",
    +       "        # replace with U = pomdp.remove_dominated_plans(U1) for accurate calculations\n",
    +       "        \n",
    +       "        if count > 10:\n",
    +       "            if pomdp.max_difference(U, prev_U) < epsilon * (1 - pomdp.gamma) / pomdp.gamma:\n",
    +       "                return U\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(pomdp_value_iteration)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This function uses two aptly named helper methods from the `POMDP` class, `remove_dominated_plans` and `max_difference`." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's try solving a simple one-dimensional POMDP using value-iteration.\n", + "
    \n", + "Consider the problem of a user listening to voicemails.\n", + "At the end of each message, they can either _save_ or _delete_ a message.\n", + "This forms the unobservable state _S = {save, delete}_.\n", + "It is the task of the POMDP solver to guess which goal the user has.\n", + "
    \n", + "The belief space has two elements, _b(s = save)_ and _b(s = delete)_.\n", + "For example, for the belief state _b = (1, 0)_, the left end of the line segment indicates _b(s = save) = 1_ and _b(s = delete) = 0_.\n", + "The intermediate points represent varying degrees of certainty in the user's goal.\n", + "
    \n", + "The machine has three available actions: it can _ask_ what the user wishes to do in order to infer his or her current goal, or it can _doSave_ or _doDelete_ and move to the next message.\n", + "If the user says _save_, then an error may occur with probability 0.2, whereas if the user says _delete_, an error may occur with a probability 0.3.\n", + "
    \n", + "The machine receives a large positive reward (+5) for getting the user's goal correct, a very large negative reward (-20) for taking the action _doDelete_ when the user wanted _save_, and a smaller but still significant negative reward (-10) for taking the action _doSave_ when the user wanted _delete_. \n", + "There is also a small negative reward for taking the _ask_ action (-1).\n", + "The discount factor is set to 0.95 for this example.\n", + "
    \n", + "Let's define the POMDP." + ] + }, + { + "cell_type": "code", + "execution_count": 37, + "metadata": {}, + "outputs": [], + "source": [ + "# transition function P(s'|s,a)\n", + "t_prob = [[[0.65, 0.35], [0.65, 0.35]], [[0.65, 0.35], [0.65, 0.35]], [[1.0, 0.0], [0.0, 1.0]]]\n", + "# evidence function P(e|s)\n", + "e_prob = [[[0.5, 0.5], [0.5, 0.5]], [[0.5, 0.5], [0.5, 0.5]], [[0.8, 0.2], [0.3, 0.7]]]\n", + "# reward function\n", + "rewards = [[5, -10], [-20, 5], [-1, -1]]\n", + "\n", + "gamma = 0.95\n", + "actions = ('0', '1', '2')\n", + "states = ('0', '1')\n", + "\n", + "pomdp = POMDP(actions, t_prob, e_prob, rewards, states, gamma)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We have defined the `POMDP` object.\n", + "Let's run `pomdp_value_iteration` to find the utility function." + ] + }, + { + "cell_type": "code", + "execution_count": 38, + "metadata": {}, + "outputs": [], + "source": [ + "utility = pomdp_value_iteration(pomdp, epsilon=0.1)" + ] + }, + { + "cell_type": "code", + "execution_count": 39, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAYEAAAD8CAYAAACRkhiPAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvNQv5yAAAIABJREFUeJzsnXd81dX9/5+fm733JiEBEkYYYRNkrwABW9yjWq2tP7WuVq24v0WtiKNaF1K0al3ggNIEhIDKEhlhRCBkD7L33rnn98eH+ykrZN2bm3Gej0cekuQzziee+359zjnv83orQggkEolEMjDRmbsBEolEIjEfUgQkEolkACNFQCKRSAYwUgQkEolkACNFQCKRSAYwUgQkEolkACNFQCKRSAYwUgQkEolkACNFQCKRSAYwluZuwPl4enqK4OBgczdDIpFI+hTx8fElQgivrpzbq0QgODiYI0eOmLsZEolE0qdQFCWrq+fK6SCJRCIZwEgRkEgkkgGMFAGJRCIZwEgRkEgkkgGMFAGJRCIZwEgRkEgkkgGMFAGJRCIZwEgRkEgkkgGMFAGJRCIZwEgRkEgkkgHMgBaBF198kfDwcMaOHUtERAQHDx40d5MkfYBNmzahKApnzpy54nGOjo491CJJW1hYWBAREUF4eDjjxo3j9ddfR6/XX/GczMxMRo8e3e4xn3/+uTGbajYGrAgcOHCAmJgYjh49SkJCAjt37iQwMNDczZL0Ab744gtmzJjBl19+ae6mSNrBzs6O48ePc+rUKeLi4ti6dSt//etfu31dKQL9gPz8fDw9PbGxsQHA09MTf39/Vq1axeTJkxk9ejR33303QggSExOZMmWKdm5mZiZjx44FID4+ntmzZzNx4kSioqLIz883y/NIeoaamhr279/PBx98oIlAfn4+s2bNIiIigtGjR7N3794LzikpKSEyMpLY2FhzNFlyDm9vb9atW8fbb7+NEILW1lYee+wxJk+ezNixY3n//fcvOaetY1auXMnevXuJiIjg73//e4eu1WsRQvSar4kTJ4qeorq6WowbN06EhoaKe++9V/z4449CCCFKS0u1Y37zm9+ILVu2CCGEGDdunEhLSxNCCLF69Wrx/PPPi6amJhEZGSmKioqEEEJ8+eWX4s477+yxZ5D0PP/+97/F7373OyGEEJGRkSI+Pl68+uqr4oUXXhBCCNHS0iKqqqqEEEI4ODiIgoICMWXKFLFjxw6ztXkg4+DgcMnPXF1dRUFBgXj//ffF888/L4QQoqGhQUycOFGkp6eLjIwMER4eLoQQbR7zww8/iOjoaO2abR3XUwBHRBfjrlGspBVF+RBYBhQJIUaf+5k7sAEIBjKBG4QQ5ca4nzFwdHQkPj6evXv38sMPP3DjjTeyevVqnJycWLNmDXV1dZSVlREeHs7y5cu54YYb2LhxIytXrmTDhg1s2LCBpKQkTp48ycKFCwH1rcHPz8/MTyYxJV988QUPP/wwADfddBNffPEFy5cv53e/+x3Nzc38+te/JiIiAoDm5mbmz5/PO++8w+zZs83ZbMl5qDETduzYQUJCAl9//TUAlZWVpKSkEBYWph3b1jHW1tYXXLOt40JCQnrikbpHV9Xj/C9gFjABOHnez9YAK8/9eyXwcnvXCRsdJhpbGk2ilO3x1VdfiQULFghvb2+RnZ0thBDiueeeE88995wQQojU1FQxfvx4kZSUJCZMmCCEECIhIUFMmzbNLO2V9DwlJSXC1tZWBAUFicGDB4tBgwaJwMBAodfrRW5urli3bp0YPXq0+Pjjj4UQQtjb24vbb79dPPHEE2Zu+cDl4pFAWlqacHd3F3q9XlxzzTXiu+++u+Sc80cCbR1z8UigreN6gvz87o0EjLImIITYA5Rd9ONfAR+f+/fHwK/bu05yaTKeazy5buN1fHT8IwprCo3RvMuSlJRESkqK9v3x48cZPnw4oK4P1NTUaKoOMHToUCwsLHj
++ee58cYbARg+fDjFxcUcOHAAUN/8Tp06ZbI2S8zL119/ze23305WVhaZmZmcPXuWkJAQ9uzZg7e3N3/4wx+46667OHr0KACKovDhhx9y5swZVq9ebebWS4qLi7nnnnu4//77URSFqKgo3nvvPZqbmwFITk6mtrb2gnPaOsbJyYnq6up2jzMlubnw0EPQ3cGGKSuL+Qgh8gGEEPmKoni3d8JQ96HMGz2P2JRYvkn8BgWFyQGTWRa6jOiwaMb7jkdRFKM0rqamhgceeICKigosLS0ZNmwY69atw9XVlTFjxhAcHMzkyZMvOOfGG2/kscceIyMjAwBra2u+/vprHnzwQSorK2lpaeHhhx8mPDzcKG2U9C6++OILVq5cecHPrr32Wu644w4cHBywsrLC0dGRTz75RPu9hYUFX375JcuXL8fZ2Zn77ruvp5s9oKmvryciIoLm5mYsLS257bbb+POf/wzA73//ezIzM5kwYQJCCLy8vNi8efMF57d1zNixY7G0tGTcuHHccccdPPTQQ+1ey1hkZ8PLL8P69aDXw223wb/+1fXrKeLc/Fh3URQlGIgR/1sTqBBCuJ73+3IhhNtlzrsbuBsgKChoYlZWFkIIjhccJyY5htiUWA7lHkIg8HfyZ+mwpSwLW8aCIQtwsHYwStslEomkt5OZCS+99L+Af+edsHKlOhJQFCVeCDGpK9c1pQgkAXPOjQL8gB+FEMOvdI1JkyaJy9UYLqwpZFvqNmJTYtmeup3qpmpsLGyYEzyHZWHLiA6NJsStDyzAmIGdZeos3QJ3dzO3RCLpOgO5H6elwd/+Bp98Ajod/P738PjjEBT0v2N6qwi8ApQKIVYrirIScBdC/OVK12hLBM6nqbWJfdn7iEmOISY5hpQydV5/lNcookOjWRa2jOmB07HUmXKmq+8w59gxAH4cP97MLZFIus5A7MdJSWrw/+wzsLKCu++Gv/wFAgIuPdbsIqAoyhfAHMATKASeAzYDG4EgIBu4Xghx8eLxBXREBC4muTSZ2ORYYlNi2Z21mxZ9C662riwetphloctYPGwxHvYeXXiq/sFA/PBI+h8DqR+fPg0vvghffgk2NnDvvfDoo3Cl7PPuiIBRXpeFEDe38av5xrj+lQjzCCMsMow/Rf6JqsYq4tLiiEmJYWvKVr48+SU6RUfkoEhtlDDae7TRFpclEonEWPzyC7zwAnz1Fdjbq4H/kUfAu92Umu7Rr+ZMnG2cuXbUtVw76lr0Qs+RvCPa4vKT3z/Jk98/SZBLkCYIc4PnYmdlZ+5mSySSAczx4/D88/Dtt+DkBE88AX/6E3h69sz9e5UItOfu1xl0io4pAVOYEjCFVXNXkVedx9aUrcQkx/DxiY9578h72FnaMX/IfC0FdZDzIKPdXyKRSK7EkSNq8N+yBVxc4Nln1bz/nl77NtrCsDFQFEUA6HQ6fH19iYqK4qmnnmLo0KFGvU9DSwO7M3eri8spMWRWZAIwzmecNkqYEjAFC52FUe9rDpLq6gAYbm9v5pZIJF2nP/Xjn3+GVatg2zZwc1Pf+h94AFxd2z+3Lcy+MGwsDCLQFs7OzkRERPDII4+wfPlyo8ztCyFILEnUpo32Z++nVbTiae/JkmFLWBa2jEVDF+Fq243/QxKJZMCzb58a/OPiwMNDnfO/7z5wdu7+tfuNCLi6uopp06aRnZ1NWloaTU1N7Z5jZWVFSEgIt9xyC3/6059w7uZftLy+nO1p24lJjmFb6jbK6suwUCyYOXimNkoY7jG8zywu/7ekBIDlPTXBKJGYgL7aj4WA3bvV4P/DD+oi72OPwT33gDFrDvUbETh/JGBjY4OXlxe+vr5YW1tTX19PRkYGlZWVtNdmRVHw9PRk9uzZPPfcc+1WCWqLVn0rP+f8TGxKLDHJMfxS9AsAQ9yGaOsIswfPxsbSpkvX7wkGUmqdpP/S1/qxELBrlxr89+4FX191g9fdd6uZP8am34hASEiIWLRoEQkJCWRkZFB
WVqYZMhmwtrbGw8MDV1dXdDod5eXlFBYW0tra2u717ezsGD16NA888AC33HILFhadm/PPrswmNjmWmJQYvs/4noaWBhysHFg0dBHRodEsDV2Kn1PvspLuax8eieRy9JV+LARs364G/wMH1I1dK1fCXXeBnQkTEfuNCFxus1hZWRlxcXHs2bOH48ePk5GRQUlJySXiYGVlhYuLC/b29uj1ekpLS6mvr2/3nhYWFvj7+3Pttdfy7LPP4uZ2ib3RZalrruP7jO81UcipygFgot9EloUtY1nYMib4TUCnmLd4W1/58EgkV6K392MhIDZWDf6HD6uWDk88ofr72PTAREG/FoG2qKqqYufOnezZs4djx46RlpZGSUkJjY2NFxxnaWmJg4MDNjY21NfXU1tb26FUVFdXVyIjI3n22WeZNm3aFY8VQpBQmKBNG/2c8zMCgY+DD9Gh0USHRbNwyEKcbJw69GzGpLd/eCSSjtBb+7Fer6Z4rloFx46pZm5PPgm33w4X1Z0xKQNSBNqitraW77//nt27d3P06FFSU1MpLi6moaHhguMsLCywtbVFp9NRX19PS0tLu9e2sbFh2LBh3H333dx3331YWl5+m0VxbTHfpX5HbEos36V+R2VjJVY6K+YEz9EWl4e6GzfttS1664dHIukMva0f6/Xq5q7nn4eEBBg2DJ56Cm69VfX56WmkCHSAhoYGdu/ezY8//kh8fDwpKSkUFhZeMmWk0+mwsrJCr9dfMuV0OXQ6Hd7e3ixdupQXXnjhkvKSza3N/HT2J21PwpmSMwAM9xiuOaDOCJqBlYVpes7Zc+IXaGtrkutLJD1Bb+nHra2wcaNq73D6NAwfDk8/DTfdBG28E/YIUgS6QVNTE/v27eOHH37gyJEjJCcnU1BQQN25zSkGFEVBp9Oh1+vbzU4CcHBwYMKECTz11FNERUVpP08rSyM2RTW8+zHzR5pam3C2cSZqaBTLwpaxZNgSvBy8jP6cEomk67S0wBdfqMZuSUkwahQ88wxcfz10Mr/EJPQbEXBxcRGPP/44V199NeHh4WbNxW9paeHgwYPs3LmTw4cPk5ycTF5e3mVLximK0iFhsLS0JCgoiN/+9rc8/vjjNNHEzvSdmigU1BSgoDB10FQtBXWcz7hu/R02FBUBcKOpXagkEhNirn7c3AyffqoG/7Q0GDtWDf7XXKN6+/cW+o0IXGnHsJWVFa6urvj7+zN27FiWL1/OwoULce3OXusuoNfriY+PZ+fOnRw8eJAzZ86Ql5d3Qb3RzqAoCi4uLsybN49bH7qVX/S/EJMSw5E8dUQ0yHmQurgcGs38IfOxt+pcknFvm0uVSLpCT/fjpib4+GPVzz8zEyZMUIP/1Vf3ruBvYECIQEfQ6XTY2dnh6elJSEgIM2bMYMWKFYwbN67TewI6i16vJyEhgbi4OA4ePEhiYiI5OTlUV1d3aJRwMVbWVvgM9sF3iS9nfM9Q01SDraUtc4PnamsJg10Ht3sdKQKS/kBP9ePGRvjwQ7WM49mzMHkyPPccLF0KvdkkoF+JgCFrx9bWFmtray0Dp7GxkZqaGhobGzu0MexKGAqC+/v7M3r0aBYvXszVV1+Nuwns+4QQJCYmsmPHDn7++WdOnz7N2bNnqaqq6rxrqgI6Bx36kXqYD6ODRmvTRtMGTbtsNTUpApL+gKn7cX29Wrj95ZchNxciI9Xgv2hR7w7+BvqNCDg4OAhfX18qKyupra2lqanpioHS0tISKyurC8SiqamJhoYGWlpauvQGbkBRFGxtbXFzc2PIkCFMnTqVa6+9lsmTJ7eZGtpZkpOTiYuL48CBA5w6dYrs7GwqKio6Lw6WoAvUMf+e+dy55E6ihkXhbqcKmhQBSX/AVP24rg7efx/WrIGCApg5Uw3+8+b1jeBvoN+IwOWygxobG0lOTubMmTOkpKSQlZVFTk4OhYWFlJaWUlVVpQnGlZ7FwsICS0tLLYC3tLTQ0tLS7VGFhYUFjo6O+Pn
5MXLkSKKiorjmmmvw8up6hk9WVhbbt29n//79nDx5kuzsbMrLyzvXVh24ertif+2vCPntH9k3eXKX2yORmBtji0BNDbz3Hrz6KhQVqUH/2Wdh9myjXL7H6dci0Bnq6+s5c+YMiYmJpKWlkZmZSV5eHgUFBZSVlVFVVUVdXR3Nzc3tCoZhDaG1tbXDaaFtoSgKVlZWuLm5MXjwYCZPnsy1117LzJkzOzWqyM3NJS4ujn379pGQkEBmZiZlZWWdEgdnZ2eioqJ4++238ZYZQ5I+Qsk5R2HPbm7DraqCd96B116D0lJ1uueZZ2DGDGO00nxIEegC1dXVJCYmkpSURGpqKtnZ2eTk5FBUVERZWRnV1dWaYFwJ3blUASFEt4TCcC07Ozt8fHwYOXIk8+bN44YbbmDQoCtXPCsqKiIuLo69e/eSkJCgWWh0dFrJysqK4cOHs3r1aqKjo7v1DBJJb6SiAt56C/7+dygvVxd6n3kG2nGE6bVUVFTw1VdfsWvXLk6fPs0vv/wiRcCUlJeXc/r0aZKSkkhLSyM7O5u8vDyKioooLy+nurqa+vr6Du0wNgaWlpY4OzszePBgJkyYwIoVK4iKirpkVFFWVsZTGzeSfPAg9clJnDx9kuqKjqeyenp6cvPNN/Paa69hZY698BLJOT7KzwfgDr/OufSWlcGbb6pflZVqiuczz8CkLoXLnqG5uZm9e/eyefNm4uPjtenghoaGK436pQj0FoqLizXBSE9P1wSjuLhYEwzDwrUpURQFGxsbWp2csAsO5rGrr+bWW28lJCSEyspK/vXtv/h629ecOHqCmuwa6KB+2djYMHHiRN577z3Gjh1r0meQSAx0dk2gpER963/rLaiuVjd3Pf009Ib8iIyMDDZu3MjevXtJSUmhqKiI2traDiezWFpaYm9vj4eHB8HBwQwdOpT169f3DxFQFEVYWVlhZWWFjY0Ntra22Nvb4+joiKOjIy4uLri6uuLm5oanpydeXl54e3vj6+uLv78/fn5+2NnZ9YmqX0IICgoKNMHIyMjQBKOkpITy8nJqamraU/9uY2FhgZ2dHU4eTuj8dRRZFdHc2AzpQCnQgRklRVHw8fHhkUce4dFHHzVZWyUDl46KQFGROt//zjtq5s/116vBf8yYnmgl1NXVsWXLFrZv384vv/xCTk4OlZWV7WY6GtDpdNjY2ODg4ICzs7MW7ywsLCgtLaW4uJiKigrq6uoufpHsHyJga2srvL29qa+vp7Gxkaampi5l8Oh0OiwsLC4REwcHh0vExMPDAy8vL3x8fPD19cXPzw8/Pz8cHBx6jZjo9XrOnj3L6dOnSUlJISMjg7Nnz5Kfn09JSQkVFRVG20PRJjrQWejQt+ihg13G2tqa6dOn8/nnn19irCeRdIb2RCA/H155BdauVTd83XST6uo5apTx2iCE4MiRI3z77bccOnSI9PR0rW5JR0b2Bv8xQ1q7ra0tlpaWtLS0UFtb290Xvt4rAoqiZALVQCvQcqWGXmk6qLm5mZKSEvLz88nPz6ewsJDi4mJKSkooLS2lvLycyspKqqurqampoa6ujrq6ugvEpLNZPoqiXLAX4XJi4uLigru7uyYm3t7e+Pj44OfnR0BAAI6Ojj0qJnq9noyMDE0wXjt0iMbCQkIbGjTBqK2tpbGxsfP7EYyAoih4eHjw2GOP8eijj2oL6xLJlWhLBHJy1Bz/detUk7dbb1WDf1hY5+9RXFzMV199xffff09iYiKFhYVUV1e3m01owBDkDZ93vV5v9M+Y4QXX1tYWJycnfHx8GDZsGF999VWvF4FJQoiS9o7tiTWBlpYWSktLLxCToqIiSkpKKCsr08SkqqqKmpoaamtrLxiZNDc3d0lMzh+Z2NjYtDkycXd3v0BM/P398ff3x9nZuUticqU3qJaWFlJTU0lMTCQlJYXMzExycnIoKCigtLSUyspKampq2t2DYQysrKwYO3Ysr7/+OjNmzJDiILmAi/txdjasXg0
ffKB6+99+u1rJa9iwy5/f0tJCXFwcMTExHD16VNuY2dDQYJaXofOxsrLC1tYWZ2dnfHx8CA0NZcKECUyePJkJEybg4uLS7jV6dYpobxMBY6HX6ykrKyMvL4+8vLwLRiYGMamoqLhETBoaGi6Y5uqqmBhGJnZ2djg4OODk5ISTk9Ml01wuHh54+/gQHBDAoEGDcHFx6ZKYNDU1kZycTGJiIqmpqWRkZJCYnkja2TRKS0tpqmlSF5dN8Hny9vbm1ltv5emnnzaJtYek91N3bpqkMNuCl16Cjz5Sf/6736k1fJubU9iwYQM//fQTKSkpFBcXa/PmPT3lbWFhgb29Pc7Ozvj6+hIaGsqkSZOYOnUqERERODo6Gv2evV0EMoBy1Jnk94UQ69o6ti+JgLHQ6/WUl5dfMjIpLi6+YJqrLTFpbm7usphYWlpqIxODmDg6Ol6wIOXh4YGnp+cF01z+/v64u7tfICZFtUVsS9lGbEos2xK3UZNbg0WZBcH6YLyavbCutaaqrIrCwkLKy8tpbGw02ofTwcEBb29vJk+ezF133cX8+fNNbhgo6TlqampYu3Yna9e6k5Y2HWjFwuJftLb+DThr8vsbpl9cXFzw8/MjNDSUqVOnEhkZybhx47DtBQWbersI+Ash8hRF8QbigAeEEHvO+/3dwN0AQUFBE7Oyskzanv6KEILKykry8vI0QSkuLmZ7Rga15eX4NzVdMDKpqakxqphYW1trYmLvYA/WUKfUUSbKqNHVgD14eXoxMXQi88bMY96YeQwOHIy1tTVJSUnaLm+DLUhqaio5OTlGW+jW6XQ4OzsTHBzMwoULeeihhwgICDDKtSVdRwjBgQMH2LRpE4cOHSIlJYWysrLzpiDDgKeAW4Em4H3gFSCvy/c0vPwYgnpYWBjTp0/nqquuIjw8HOueLA5sJHq1CFxwM0X5P6BGCPHq5X4/EEcCpqaz+dVCCKqrq8nNzb1kzcSQutqemHR2CH5x1oQmJufSg52dnXFxccHOzk6zBbm4LKixsLS0xMXFhbFjx3Lbbbdxyy23YGNjY5J7DQSys7N588032bJlCzk5OZ0YAY5CDf43AfXAe8CrQOElRxrSKl1cXPD399eC+rx58xg+fLjRDB97M71WBBRFcQB0Qojqc/+OA1YJIb673PFSBIyPuVxEhRDU1tZqI5OCggKKiorIyc/hRMYJknOTyS3KpamuCRrBRthgrbeGFmhtbtVGJuZetGsLa2tr3N3dmTJlCo8//jjTp083d5N6jMrKSj788EO2bt1KQkICZWVlRtz8OAZ4GrgOqMPLayPTp//ML2GuuE6bxpEVK3pN6nZvojeLwBBg07lvLYHPhRAvtnW8FAHj05utpPVCT3xePDHJMcSmxBKfHw9AoHOgVjhnXsg89E168vPzNTNAw8iktLSUsrIyKioqtPTg2tpa6urqqK+vp6amxuQ7s6+EoihYW1vj5eVFZGQkDz/8MNOmTet1mU/19fVs2bKF//znPxw/fpzCwkItjdiYWFtb4+Pjw4QJE5g7dy7R0dEMHToURVE4dgyefx42bQInJ3jwQXj4YfD0VM/tzf24N9BrRaCzSBEwPn3pw5NXncfWlK3EpsQSlxZHbXMtdpZ2zAuZp4lCoEtgl6+vLjCu5b333iMrK8ukO7E7i8GSfPjw4cyZM4fIyEjGjx+Pv79/p32b6urq2L59O9u3byc+Pl4rYtTU1GT0Z9bpdDg4ODBixAjuvfdebrvttk5Nvxw+rAb///4XXFzUwP/QQ+DmduFxfakfmwMpApI26asfnsaWRnZn7SYmOYaY5BgyKjIAGOszVqumNjVgKha67mcBnTp1imeffZbvv/+eysrKTq1nGIKgYa+Hs7MzDg4OVFRUkJaWRlVVlVlHI11Fp9Ph6OjIoEGDCA8PZ86cOVx//fXdqpNxPgcOqMF/2zY14P/5z/DAA6oQXI6+2o97CikCkn6NEIIzJWe0aaN92ftoFa142HmwJHQ
Jy0KXETUsCldbV6Pds7a2ljVr1vDvf/+bs2fPdjqQW1hY4O7uTnBwMGPHjuWqq65i0aJFBAQEUFVVRVxcHDt37uTHH38kPT2dpnN++T2JYUe8g4MDnp6e+Pn5aenBHh4eF+yC9/X1JSAgAF9f326lRO7dC6tWwc6d6lTPI4/AH/+oTgFJuo4UAcmAory+nB1pO4hJiWFrylbK6suwUCyYETRDmzYa4TnC6AuIQgj++9//smbNGo4dO0ZdXZ1Rr9+X0Ol0mqWKwZ/Lzs7uArNHNze3c7vgPaioiGDXrumcOuWNh0cLjzwieOABK0ywb2pAIkVA0iavZmcD8GhQkJlbYhpa9a0czD2ojRISChMAGOI2RJs2mj14NjaWHUvzNLi77tu3jz179mhOkGVlZVoZU3Nx/pu7t7c3YWFhzJw5kxtuuIHAwEAOHjzIp59+yp49e8jJyaG6utpo2VXn+2hZWFhc4o/T2trahtnjAuBZYCZqbv/LwD9R0z6vbPbo5OSkjUyyrKywd3Pj6qFD8fb21jYt+vv7Y29vb5Rn7MtIEZC0yUCbS82uzGZrylZikmPYlbGLhuYGbOtsCW8Mx7nAmfrceory1WJABk8oY38GDEHSWAZijo6OBAcHEx4ezrRp01i4cCGjRo3q0kinoqKCr7/+mi1btnDixAmKi4tpaGgw2t9ADejW2Nr+moaGx2hoGI+1dSEhIRsJCfkea2v171FdXX2B2aNhr0lzc3OX/LnOF5OLzR7PF5OLnYPP3wXf02aPxkSKgKRN+psI6PV60tLSOHToED/99BMnT54kNzeXsrIyzSvGmBkwhk1shsVfRVEuCVodCfSGdFFbW1taW1s7VVhIUZRLAqKiKDg5OTFo0CBGjhzJlClTWLhwIePGjet2CqoQgmPHjrFhwwZ2795NWloalZWVHayctwz1zX8ykAX8DfgIdbfvhZz/d/Xw8NDWHYKDgxk2bBhhYWF4enpSUlLCbfv20Vxayl329hQXF3fI7LGr/lyWlpYXWKpcSUwutlQZNGgQTk5OPS4mUgQkbdLbRaClpYWkpCQOHTrE4cOHOXXqFDk5OZSXl2s1no2+YUwHWAD2YOliqQYe72DsmuwoLfnqFwU+AAAgAElEQVRf4Y6uVnsaNmwY06ZN47rrrmPMmDHtBoS8vDxeeukltmzZQm5ubodEzGBZfPGxiqLg6OiIv78/I0aMYMqUKSxYsIBJkyYZfX9CbW3tuf0F/2XPHjcKC/+AXh+BWpHoReDfdLhk3RVQFAWsrLCwsyPo3IK1j48PgwYNIiQkhGHDhjFy5EhCQ0Mvm54qhLjA7LGgoMDsZo+GXfDni4mhpolBDDtj9ihFQNImPS0CjY2NnDp1imPHjhEfH8+pU6c4e/asVg3JFEFdURTtrdLDw0ObMw4MDCQkJISAgABycnKIj4/n5MmT5OXlUVlZ2eGpIIMtgaurK4MGDSIiIoKlS5eyZMkSk1lKNDc38/HHH7N27VpOnz7dIZsMw9/BcP7Fz+bg4ICfnx/Dhw9n0qRJLFiwgMjIyC6b7en18M03aqrnL7+oNs5PPw233AKGrQ1CCJKSkti4cSO7d+8mMTHR6AaC53N+ZS4XFxc8PT3x9fUlMDCQ4OBgQkNDGTVqFEOGDOm0KAohKC8v13bBn2+pYti42BNmj/b29tjb21/gHPz1119LEZBcniUJ6kLpti7WA66vrychIYFjx45x4sSJC97U6+vrTfOmzv9qJDs6OuLu7q6lMBre/sLCwhg5ciSBgYHEx8dfUO2prKysU9WerKyssLGzwcrZiibXJmr8aiAchg8ZTnRoNMvCljEjaAZWFp3btGUKDh8+zOrVq9mzZw+lpaUdCiiGRdeWlpbLBl97e3t8fX0JCwtj4sSJzJ07l1mzZrW5Sa21FTZuhBdegNOnYcQINfjfeCN01aanvr6eHTt28J///Efz+6+urjbpHgsLCwtNMFxdXfHy8tIEw9DHRo0aRWBgoFF
HURebPRYWFlJYWHhJgSxDPQ/DLvh2xESKgKRj1NTUcPz4cY4dO8bJkyc5ffo0OTk5VFRUmDSow//e0gxvL+cH9uDgYO1DFxQUpH3oioqKtLnpxMRECgoKqKmp6XC1J0MNZXd3dwYPHszkyZP51a9+dcXCNenl6cQmxxKTEsOPmT/S1NqEs40zUUOjWBa2jCXDluDlYJxNU8agpKSEN954gw0bNpCVldWhuXuD572VlRVNTU3U1dVd8v/d1tYWX19fhg0bxsSJE5k5cy6FhXN5+WVrkpMhPByeeQauuw56yrk7NTWVb7/9lh9//JEzZ85QWFjYI4VhDHbSjo6OmmD4+fkRFBRESEgIw4cPZ+TIkfj5+fWYLcj5Zo+jRo2SIjAQEUJQUVGhvaWfPn2axMRE8vLyTP6mbkCn013w4bg4sIeGhhIeHn5BYDfQ0tLC9u3biY2N5dixY52u9mRYtHV2dsbf35/Ro0ezYMECVqxY0aFqTB2hpqmGnek7iU2OJTYllvyafBQUpg6aqo0SxvmM63VZJS0tLXz77be89dZbnDhxgurq6g6dZyhwDurbeXV1Na2tCvAbVFfPYShKAu7u7zBmTCoTJkQwZ84c5s2bh4ODg8mepzM0NDSwZ88eNm/ezJEjR8jKyqKioqLH0nstLS21z4RhFOvv709QUBBDhw4lNDSU0aNHG233Ncg1gX6DEIKSkhKOHj3KiRMnOHPmDMnJyeTm5mpz6oZayabk/GGym5ubtmgVFBSkBfaRI0cSHBx8xbee06dP880331xQ7ckwTdNevzNkadjb2+Pp6cmwYcO46qqruO666xgxwvgbwTqCXug5XnBcs7I4nHcYgACnAE0Q5oXMw8G6dwTDy3Hy5EleeeUV4uLiKCgoaOf/gxXwW+BJIAQbm5O4uPyD1tbNVFaWXzJVYzDLGzJkCOPHj2fWrFksXLhQE5Xu8HxmJgDPBAd3+1oGzp49y5YtW4iLi9NGmbW1td3OLju/b7bXzw2lJZ2cnHBzc8Pb2xt/f38GDx7MkCFDGD58OOHh4bhdbKZ06T2lCPRG9Ho9BQUFxMfH88svv3DmzBnS0tK0oG4IiD1hl3y5+U8fHx9t/rOjgd1AZWUlmzdvJi4ujpMnT5Kfn6+ZlHXkeQztcXV1JSgoiHHjxrF8+XIWLFjQZ/z7C2oKtGpq29O2U9NUg42FDfNC5hEdGk10WDTBrsHmbma7VFVVsXbtWj755BPS0tJoaBDA74CVQBBwEFgFbNXOURQFNzc3fH19sbGxoampidLSUkpLSy+ZjrKyssLT05OQkBAiIiKYOXMmixYt6lSpUHNluTU3N7N//37+85//cPDgQTIyMqioqDDKwrZOp9M+a0KIdsXHysoKOzs7nJ2dtVrkhpTaVatWSRHoCVpbW8nNzeXIkSMkJCSQmppKeno6ubm5VFZWatMvPfU3vTiwn58JYQjsf9XrsfX3Z8/EiR2+bmtrq9bxjxw5QmZmprbY2pG3JMNiq6OjIz4+PowaNUozIPPx8enOI/damlqb2JO1R1tLSC1LBSDcK5xlYctYFraMaYOmYanrvQVO6uvhn/+El1+GvDwYObIMO7tXSE9/n4qK8g5dw9bWlkGDBuHj44O1tTW1tbXk5uZSXFx8yXSMpaUlHh4emr/SjBkziIqKumwf6e2pznl5eWzbto0dO3ZoGWjGsDI/P9UU1BfLNvbCSBHoCk1NTWRlZWmpjKmpqWRmZpKXl3fBm3pP/o2uFNjPT3Hr6Bv75T48Z8+e5ZtvvtEWW4uKiqitre2wgBnmPD08PAgJCWHKlCmsWLGCKVOm9DqvfHORXJqsWVnsydpDi74FN1s3loQuITo0msXDFuNu1/E3YVNSWwvvvw9r1kBhIcyaBc89B3PnwsWzbqmpqbz++uvExsaSn5/foUVoRVHw9PQkNDQULy8vLC0tKSkpIT09naKiokvqFhjM9wYPHsyYMWO46qqrWB8QgI2PT68VgY7Q3Nz
M4cOHiYmJ4aeffiI9PZ3S0lLq6+uNEWOkCIC6IJSenk58fDynT58mPT2drKwsLagbdmka45k7M+/XkcBumIrpboH0hoYGtm3bxrZt2zhx4gTH0tJoqalB6eACsU6nw9raGmdnZwICAhgzZgxRUVEsX74cJ2n12CUqGyqJS48jJlk1vCuuK0an6Lgq8CptLWGUV9dsILpDTQ28+y68+ioUF8P8+Wq2z+zZnbtObW0tH3/8MR988AFJSUnU1tZ26DxbW1sGDx5MeHg4vr6+6PV6UlJSSE1N1bJ+LkCnw8PNjaCgIMaMGcP06dOJiooi2IjrBL2B4uJidu7cydatWzlx4gT5+fkd2bHdP0WgtraW1NRULahnZGSQk5Oj/VGM+aZuqHMLalA3fLVFW4H9/F2Mhjf27gZ2A0IIjh8/zubNmzlw4ACpqamUlpZqC8YdeUbDYqvBgGz69OnccMMNWoUniWnRCz2Hcw9ro4RjBepIbbDLYG3aaE7wHGwtu27X3B5VVfD22/D661BaClFRavC/6irj3UMIwe7du3nzzTfZu3cv5eXlHX4J8fDwICwsjAkTJuDn50dlZSX/2rePmqwsOGcPcvE5Li4uBAYGEh4eTmRkJIsWLWL48OHGe6BeRktLCydOnCA2Npa9e/eyc+fO/iECOp1OQPtv1h28lra13mDm1d5uPUO6oyGwG/xMLjcVY6zAfj6lpaVs2rSJXbt2cerUKfLz86murqapqanDOfG2tra4ubkRGBjIhAkTuPrqq5k9e3afWWwdaORU5WjV1Ham76SuuQ57K3sWDFmgLi6HRhPgHGCUe1VUwD/+AX//u/rv6Gg1+E+dapTLd4js7GzefvttNm3aRHZ2dofTNm1tbbURwLRp0wgICCApKYnDhw+TnJxMfn7+JSMQRVFwcXHR/JWmTZvGokWLCA8P73cvPP0mO0hRlMs2xvCWbmFhcYFDY2tra7vFyA0blAx57BcHdsMbe0hIiEkCu4GWlhZ2797Nli1biI+PJysri/LychoaGjq12Ork5ISvry/h4eHMnTuXa665Bm9vb5O1W9JzNLQ08GPmj1oKalZlFgARvhGaLfZk/8mdrqZWVgZvvAFvvqmOAn71KzX4dyJXwKQ0NjbyxRdfsH79ehISEqipqemwnYe7uzuhoaFMmjSJefPm4e7uzt69ezl06BBnzpzRFmjPx2C+FxAQwIgRI5g6dSoLFy4kIiKiz65p9RsRsLCwEJaWlu2mTbYV2C82lBoyZIhJA/v5pKWl8c0337Bv3z6SkpK0xdbOGJDZ2dnh4eHB0KFDmTp1Ktdccw3jx4/vVsd8Ij0dgJeGDOnyNSQ9jxCC08WntWmj/Wf3oxd6vOy9WBq6lOjQaBYNXYSLbdub4kpK1Cmft95S5/+vvVa1d4iI6MEH6SJCCA4cOMC7777Lrl27KCwqQnQwldqQoRQeHs6MGTOIjo6mvr6enTt38vPPP3PmzBlyc3Oprq6+5LPp5OSEn5+fZr43f/58Jk+e3GNxpKv0GxHQ6XTC09PzAuMncwZ2A7W1tcTGxrJt2zYSEhK0lNCO5sQbFlsNQ9Nx48axePFilixZgqOJSyv19tQ6Sccoqy9je+p2YlJi2JayjfKGcix1lswMmqmtJYR5hAFqhs9rr6mLvnV1cMMNavAfPdrMD9ENDP34Sz8/1q5dy1dffUV6evqli8dtoCgK7u7uDB06lEmTJjF//nwWLVpEZmYmO3bs4Oeff9YsVKqqqi5rvufr68uIESO08yMjIy/rWmoO+o0ImGOfQGtrK0ePHmXTpk0cPHjwgrStzhiQOTg44OXlxfDhw5k1axbXXHMNISEhZp97lCLQ/2jRt/Bzzs/aKOFk0UkAgi0icTv6N05vm0lzk46bb1Z46ikYOdLMDTYCV+rHTU1NbNq0iX/+85/Ex8dTWVnZ4XVFW1tb/P39GTVqFNOnT2f58uWEh4eTnJysicOpU6fIzs6msrLykpc+e3t7fHx
8NPO9efPmMXPmTKytrbv/0J1AisAVKCgoYNOmTfzwww+cOnWKwsJCampqurTYGhwczMSJE1m+fPkVXRZ7E1IE+j8/n8ph5V/L2Lt5BPpWHYz5FIf5/yBqagjLQpexNHQpPo59e5NeZ/uxoTDO2rVr+e6778jPz+9UER83NzdCQkK0wL5kyRKcnZ1JT09nx44dWkEjg9/Vxet6tra2+Pj4aOZ7c+bMYc6cOdjZ2XXuwTvIgBWB5uZmdu7cSUxMDMeOHdOMohobGzu82GrIiff19dUMyK6++mo8PT278yi9BikC/ZesLFi9Gj78UPX2/+1v4aFH6sjU7dJGCbnVuQBM9p/MsrBlRIdGM95vPDqlby2AGqsfl5WVsX79er788kuSkpI6tVHLxsYGPz8/Ro4cSWRkJMuWLdMquZ09e5YdO3awf/9+fvnlF7KysigrK7skDtnY2ODt7c3QoUOZMGECs2fPZt68ed2eFu7VIqAoymLgTdRaTuuFEKvbOvZ8ERBCcObMGTZt2sS+ffs0AzJDTnxnqj0ZDMimTZvGihUrGDNmTK9f6DEWvzl9GoBPR40yc0skxiI9HV56CT76SN3R+7vfwcqVcPGeKSEEJwpPaFYWB3MOIhD4OfqxNHQpy8KWsWDIAhytTbsuZQxM2Y9bW1v573//ywcffMCBAwc6vKcB/peGGhISwvjx45k3bx5Lly7VDN8KCgqIi4tj7969JCQkkJmZSWlp6RXN9yIiIpg1axYLFizA1dW1o+3onSKgKIoFkAwsBHKAw8DNQojTlztep9MJQ/pnexhy+g2LrREREURHRzN//nyTL7ZKJOYgJQX+9jf497/V4i2//z08/jgEBnbs/OLaYralqoZ336V+R1VjFdYW1swJnqOloA5xk1lkBk6dOsW6deuIiYnh7NmzHayxrGJtba0tJEdGRrJ06dILSnyWlpZq4nD8+HEyMjIoKSm5rPmeh4cHQ4YMYdy4ccycOZOFCxdeMlPRm0UgEvg/IUTUue+fABBCvNTG8eLcfzUDMm9vb0aMGMHs2bO55pprCAwMNPtiq0TSk5w5Ay++CJ9/DtbWcM898Nhj4O/f9Ws2tzazL3sfsSmxxCTHkFSaBMBIz5GalcX0wOm9oppab6KqqopPPvmEzz77jJMnT1JbW9vh6SRFUXB2diY4OJhx48Yxd+5coqOjL6grUFlZyc6dO9mzZw/Hjh0jPT29TfM9d3d3zXxv/fr1vVYErgMWCyF+f+7724CpQoj7L3f8xIkTRXx8vMnaMxB5OCUFgDdCQ83cEklnOXVKLeG4YQPY2cF998Ejj4Cvr/HvlVqWqk0b7c7cTbO+GVdbV62a2uJhi/G0N986WW/ux3q9nri4OD788EN2795NSUlJp2oSWFlZaRlGkZGRLF68mOnTp1+wP6i2tpZdu3axe/dujh07RmpqKsXFxeenyPZaEbgeiLpIBKYIIR4475i7gbsBgoKCJmZlZZmsPQMRuTDc9zhxQg3+X38Njo5w//3w5z+DEQtRXZHqxmri0uO0amqFtYXoFB3TBk3Tpo3GeI/p0RF5X+zHaWlprF+/ns2bN5ORkXGJW2p7ODs7a7U2Zs+ezdVXX32JzXZDQwM//PADS5cu7bUi0KnpoN5eT6Av0hc/PAOVo0fh+edh82ZwdoYHH4SHHwYPD/O1SS/0xOfFa9NG8fnqSD3QOfCCamp2VqZJfTTQX/pxXV0dn3/+OZ999hlHjx697K7lK2FlZYWXlxdhYWFMnTqVqKgoZs6ciZWVVa8VAUvUheH5QC7qwvAtQohTlzteioDx6S8fnv7MoUNq8I+JAVdXNfA/+CC0U1HQLORX52uGdzvSdlDbXIutpS3zQ+Zr1dSCXIKMft/+3I/1ej379+9n/fr1qkVGYWFXitF0WQRMuudZCNGiKMr9wHbUFNEP2xIAiWSgceAArFoF330H7u7qFND994NL23ZAZsfPyY+7JtzFXRPuorGlkT1Ze1TDuxR
1XwJbYazPWG2UMDVgaqcN7wYaOp2OmTNnMnPmzAt+npOTwwcffMCmTZtITk6mvr7eJPfv05vFJO1zd5Ka9bGuH3ur9zX27FHf/HfuBE9PePRRddG3L9fsEUKQVJqkbVLbm7WXVtGKh52HVk0tamgUbnZdG97IfqxSX1/P119/zWeffcbhw4cpLy83TCf1zumgziJFQNJfEQJ+/BH++lfYvRt8fNQ0z3vuAQcHc7fO+FQ0VLA9dTuxKbFsTdlKaX0pFooFM4JmaKOEEZ4jZLq3ERBCoNPppAhIJL0RIdQ3/lWrYN8+8PNTN3j94Q9gb2/u1vUMrfpWDuUe0kYJJwpPABDiGqI5oM4ePBsbS1n4qKv02s1inUWKgPGRw2jzIARs26YG/4MHYdAg1drhrrvA1nSVI/sEZyvPEpuipp/uSt9FfUs9DlYOLBy6kOjQaJaGLsXf6cKdcLIfX5nuiEDvMMOWmIzki+qxSkyLEPDf/6rBPz4eBg+G999Xzd1khU+VQJdA7pl0D/dMuof65np+yPxBGyVsPrMZgAl+E7Q9CZP8J8l+bEKkCEgkRkCvV/P7n38ejh+HIUPggw/gttugDziOmw07KzuWhi5laehShBCcLDqp7Ul4Ye8LrNqzCh8HHxT3qbj7zKRq1FCcbZzN3ex+hZwO6uf05/zq3kBrK3zzjRr8T56E0FC1itctt6gmb5KuU1pXynep3xGTEsPXSbG0NFdjpbNi1uBZ2lrCMPdh5m5mr0CuCUjaRIqAaWhtVT19XngBEhNhxAi1ePuNN8IAcSnvUWbHH6ay/ASLRDKxKbGcLlaNiMM8wrRpoxlBM7C26NmKXr0FuSYgaZMIaattVFpaVDfPF1+E5GS1bu+GDWoRdxn8Tcd4Z1dwns2a0N+zZuEaMsoztGmjtw+/zes/v46zjTNRQ6OIDo1mSegSvB28zd3sPoEcCUgkHaC5WfXxf/FFtajLuHHw7LPw61+Drm8V6ep31DTVsCt9lyYK+TX5KChMCZiiVVOL8I3o13sS5HSQRGIiGhvh44/VYi5ZWTBxohr8ly9Xq3pJehdCCI4VHNNssQ/nHkYgCHAK0LyN5ofMx8G6f+3QkyIgaRNZXrJrNDSotXtXr4azZ2HqVDX4L1kig7856Go/LqwpZFvqNmKSY9iRtoPqpmpsLGyYGzJXW0sIdg02QYt7FrkmIGmTnE56mA906uth3TpYswby8uCqq2D9eli4UAZ/c9LVfuzj6MMdEXdwR8QdNLU2sTdrrzZtdP+2+7l/2/2Ee4VrVhaRgZFY6gZWWBxYTyuRtEFtLaxdC6+8AoWFMHs2fPopzJkjg39/wdrCmvlD5jN/yHxej3qd5NJkbdro9Z9fZ81Pa3CzdWPxsMVaNTV3O3dzN9vkSBGQDGiqq+Hdd+HVV6GkBObPV7N9Zs82d8skpibMI4ywyDD+FPknKhsq1WpqKbHEJsfyxckv0Ck6pgdO16aNwr3C++XishQByYCkshLefhtefx3KymDxYjXPf/p0c7dMYg5cbF24btR1XDfqOvRCz5G8I2qdhOQYVu5aycpdKxnsMlibNpobMhdby/5hAiVFoJ8T2ZsrlJiB8nL4xz/gjTegogKWLVOD/5Qp5m6Z5Er0ZD/WKTqmBExhSsAUVs1dRW5VrlZN7aMTH/HukXext7Jnfsh8LQU1wDmgx9pnbGR2kGRAUFqqBv5//AOqqtT8/qefVlM+JZKO0tDSwO7M3Vo1tcyKTAAifCO0UcJk/8k9Xk1NpohKJG1QXKxO+bz9NtTUwHXXqcF/3Dhzt0zS1xFCkFiSqDmg7s/eT6toxcveiyWhS1gWuoxFQxfhYmv6UYwUAUmbXHvyJADfjB5t5pb0LIWF6mLvu++qaZ833ghPPaXaPEj6Hn2hH5fVl2nV1LalbqOsvgxLnSUzg2Zqo4QwjzCTLC7LfQKSNiltbjZ3E3qUvDw1zXPtWmhqUt08n3pKNXiT9F36Qj92t3Pn5jE3c/O
Ym2nVt/Jzzs/aKOHRuEd5NO5RhroN1RxQZw2e1SsM76QISPoFOTnw8svwz3+qJm+33QZPPqlaO0skPY2FzoKrgq7iqqCreGnBS2RVZLE1ZSsxKTG8H/8+bx58E0drRxYNXaRVU/N19DVLW6UISPo0WVmqtcOHH6qFXe64A554Qi3qIpH0Fga7Dubeyfdy7+R7qWuu4/uM77VRwreJ3wIw2X+yNm003m88OqVnnAmlCEj6JOnpqqnbxx+rO3rvukut4Tt4sLlbJpFcGXsre21KSAhBQmGCJgh/3f1X/m/3/+Hr6KsJwoIhC3C0Np0lvBSBfs58NzdzN8GopKSods6ffqpW7rrnHvjLXyAw0Nwtk5iS/taPDSiKwjjfcYzzHcdTs56iuLb4f9XUTn/NB8c+wNrCmjnBc1QX1NBohroPNW4bTJUdpCjK/wF/AIrP/ehJIcTWK50js4MkbZGYqAb/L75QC7b/v/8Hjz0G/v7mbplEYhqaW5vZf3a/5m90puQMACM8R2hWFlcFXoWVhVXvTBE9JwI1QohXO3qOFAHJxZw8qZZw3LgR7Ozgj3+ERx4BHx9zt0wi6VnSytI0B9TdWbtpam3CxcaFxcMWs+H6DTJFVHJ5liQkALBt7Fgzt6RznDihFm//5htwdFTn+//0J/DyMnfLJOagr/ZjYzLUfSgPTn2QB6c+SHVjNTvTd6qGdymx3bquqUXgfkVRbgeOAI8IIcpNfD/JRdS3tpq7CZ0iPl4N/v/5Dzg7q74+Dz0EHh7mbpnEnPS1fmxqnGycWDFyBStGrkAv9Fg82nWbim7lICmKslNRlJOX+foV8B4wFIgA8oHX2rjG3YqiHFEU5UhxcfHlDpEMAA4eVM3cJk2C3bvhr39V0z9XrZICIJFcie6mknZrJCCEWNCR4xRF+ScQ08Y11gHrQF0T6E57JH2Pn35SA/327eDuri7+3n+/OgqQSCSmx2S7ERRF8Tvv2xXASVPdS9L32LMHFixQyzceParu9s3MVHf5SgGQSHoOU64JrFEUJQIQQCbw/0x4L0kbLOtFcylCwA8/qG/+u3erGT6vvaamezo4mLt1kt5Mb+rH/Q3pIioxOUJAXJwa/PfvV3P7H38c/vAHNe1TIpF0j+7sE+gZcwrJgEQI2LoVIiMhKkpd6H3nHUhLgwcflAIgkfQGpAj0c+YcO8acY8d69J5CqCmekydDdDQUFMD770NqKtx3H9j2j9Kskh7EHP14oCBFQGI09Hp1c9f48Wr5xooK1d0zJQXuvlu1e5BIJL0LKQKSbtPaChs2wNixavnGujrV3fPMGbjzTrCyMncLJRJJW0gRkHSZlhb47DO1ZONNN6kjgc8+U83ebr9ddfmUSCS9GykCkk7T0qK+6Y8aBb/5jRrsN25Uzd5uuQUsur6DXSKR9DDyXa2fc4O3t9Gu1dQE//63WswlPR0iIuDbb+FXvwKdfJ2QmBBj9mPJhUgR6OfcFxDQ7Ws0NsJHH8FLL6lpnpMmwRtvqF4/itL9Nkok7WGMfiy5PPL9rZ9T19pKXRcdGBsa1Lz+YcPUCl6+vmre/6FDsHy5FABJz9Gdfiy5MnIk0M9Zes6H/cfx4zt8Tl0d/POfqp9Pfr7q7/Phh6rXjwz8EnPQlX4s6RhSBCQatbWwdi288goUFsKcOWq2z5w5MvhLJP0VKQISqqvVaZ/XXoOSEvWNf+NGmDXL3C2TSCSmRorAAKayEt56C/7+dygrg8WL1Upe06ebu2USiaSnkCIwACkvhzffVL8qKtRF3qefhilTzN0yiUTS00gR6Ofc4eur/bu0VH3rf+stqKqCFSvU4D9hghkbKJF0gPP7scS4SBHo59zh50dxMaxcqc7719aq/j5PP616/UgkfYE7/PzaP0jSJaQI9GMKCuD5l1v5aJ2O+nqFm26Cp56C8HBzt0wi6RwlTU0AeFpbm7kl/Q8pAv2QvDxYs0b18G9o0uGzpJz4V90ZMcLcLZNIusZ1p04Bcp+
AKZAi0I84e1bd4LV+vWrydvvtcOpXp7ELamLECHdzN08ikfRCpG1EPyAzU2czTncAAA2TSURBVLV1GDpUffu//XZITlZ3+doFNZm7eRKJpBcjRwJ9mLQ01dTt449VF8/f/14t4D54sLlbJpFI+gpSBPogycnw4ouqpYOlJdx7L/zlLzBokLlbJpFI+hpSBPoQiYlq8P/iC7Ve74MPwmOPwZWy5+6VFrySfoDsx6ZDikAf4Jdf4IUX4KuvwN4eHnlE/fLxaf/cG2UxDkk/QPZj09GthWFFUa5XFOWUoih6RVEmXfS7JxRFSVUUJUlRlKjuNXNgcvw4XHutuqlr2zZ44gl1EXjNmo4JAMDZhgbONjSYtJ0SiamR/dh0dHckcBK4Bnj//B8qijIKuAkIB/yBnYqihAkhZFWIDnDkCDz/PGzZAi4u8Oyz8NBD4N6FLM/bEhMBmV8t6dvIfmw6uiUCQohEAOVSs/lfAV8KIRqBDEVRUoEpwIHu3K+/c/AgrFqlVu9yc1P//cAD4Opq7pZJJJL+iqnWBAKAn8/7PufczySXYf9+NeDv2AEeHmoh9z/+EZydzd0yiUTS32lXBBRF2QlczsLvKSHEf9o67TI/E21c/27gboCgoKD2mtOv2L1bDf7ffw9eXupc/733gqOjuVsmkUgGCu2KgBBiQReumwMEnvf9ICCvjeuvA9YBTJo06bJC0Z8QQg36q1bBnj1q8fbXX4e77wYHB3O3TiKRDDRMNR20BfhcUZTXUReGQ4FDJrpXn0AIdbpn1Sr46Sfw94d//EPd5WtnZ7r7PhIY2P5BEkkvR/Zj09EtEVAUZQXwFuAFxCqKclwIESWEOKUoykbgNNAC/HGgZgYJoS70rloFhw5BYCC8+y7ceSfY2pr+/ss9PU1/E4nExMh+bDq6mx20CdjUxu9eBF7szvX7MkKoKZ6rVsHRoxAcDOvWwW9/Cz1piZ5UVwfAcHv7nrupRGJkZD82HXLHsJHR6+Hbb9UdvidOqM6eH34Iv/kNWFn1fHv+X1ISIPOrJX0b2Y9Nh7SSNhKtrfDll+ru3uuvh/p6+OQTOHNGnfoxhwBIJBJJe0gR6CYtLfDpp2rJxptvVkcCn38Op0/DbbepLp8SiUTSW5Ei0EWam+Gjj2DkSDXYW1vDxo1w8qQqBhYW5m6hRCKRtI98T+0kTU3qNM/f/gYZGTB+PGzaBFdfrRZ2kUgkkr6EFIEO0tgI//qXWskrOxsmTVLz/KOj4VLrpN7D07LMmKQfIPux6ZAi0A4NDWrh9tWrITcXpk1T6/hGRfXu4G9gQVesRyWSXobsx6ZDikAb1NWpef1r1kB+PsyYoa4BzJ/fN4K/gePV1QBEODmZuSUSSdeR/dh0SBG4iJoaWLsWXnkFiopg7lw122f27L4V/A08nJoKyPxqSd9G9mPTIUXgHNXV8M478NprUFICCxfCM8/AzJnmbplEIpGYjgEvApWV8NZb8Pe/Q1kZLFmiBv/ISHO3TCKRSEzPgBWB8nJ44w14801VCJYvV4P/5MnmbplEIpH0HANOBEpL1bf+f/xDnQJasUIN/nKqUSKRDEQGjAgUFanz/e+8o2b+XH89PP00jBlj7paZlr8NGWLuJkgk3Ub2Y9PR70WgoEDN9HnvPXXD1003wVNPwahR5m5ZzzDdxcXcTZBIuo3sx6aj34pAbq6a479unWr18JvfwJNPwvDh5m5Zz/JTZSUgP0SSvo3sx6aj34lAdja8/LK6y1evh9tvhyeegGHDzN0y8/Bkejog86slfRvZj01HvxGBzEzV1+df/1K/v/NOWLkSQkLM2iyJRCLp1fR5EUhLUx09P/lEdfH8wx/g8cchKMjcLZNIJJLeT58VgaQkNfh/9plateu+++Avf4GAAHO3TCKRSPoOfU4ETp+GF19USzna2MBDD8Gjj/L/27vfGCuuMo7j31+pQAh/w9qUWBAaoeFPiVZC2jdVQ6MNUYhNVUwaW20kFOUFEqMNSW3
AvrFpNEZrwdigjVr+NBTQEixarTFuBUNKKRUC2BYQslIUX7Si4OOLmXY3ZHfv7M7OzN6Z3ychmd2Ze+6Th3Pvs3PmzBmmTKk6MjOz9tM2ReCll5KHt2/dCmPGJF/8a9bANddUHdnw9p2mXhG3WnE/Ls6wLwIHDsD69cnTu8aNS2b6rF4NHR1VR9YevPSu1YH7cXGGbRHYty/58t+1CyZMgAceSIZ+/GyJgdl7/jzgh3JYe3M/Lk6uIiDpU8CDwGxgYUTsT38/HXgFOJIe2hkRK7K02dkJ69bB7t0waVKyvWoVTJyYJ9Lm+uZrrwH+8Fh7cz8uTt4zgUPAHcCGXvYdj4j3D6Sxo0eTJZwnT07m/K9cCePH54zQzMz6lKsIRMQrABqiR2699Vayzs+KFTB27JA0aWZm/biqwLZnSDog6XeSMj2f68Ybk1k/LgBmZuVoeSYgaS9wbS+71kbEjj5edgaYFhFvSPog8LSkuRHxr17aXw4sB5jm23zNzErVsghExG0DbTQiLgIX0+0/SzoOzAL293LsRmAjwIIFC2Kg72X929C0ZVOtltyPi1PIFFFJ7wbOR8RlSdcDM4ETRbyX9e+GMWOqDsEsN/fj4uS6JiDpk5JOAbcAv5S0J911K3BQ0ovANmBFRJzPF6oNxq5z59h17lzVYZjl4n5cnLyzg7YD23v5/VPAU3natqHxyMmTAHzCt1hbG3M/Lk6Rs4PMzGyYcxEwM2swFwEzswZzETAza7Bhu4qoDY0nZs+uOgSz3NyPi+MiUHNTR4+uOgSz3NyPi+PhoJrb3NXF5q6uqsMwy8X9uDg+E6i5H5w+DcBn/BxOa2Pux8XxmYCZWYO5CJiZNZiLgJlZg7kImJk1mC8M19y2uXOrDsEsN/fj4rgI1FzHyJFVh2CWm/txcTwcVHObzpxh05kzVYdhlov7cXFcBGpu09mzbDp7tuowzHJxPy6Oi4CZWYO5CJiZNZiLgJlZg7kImJk1mKeI1twz8+dXHYJZbu7HxXERqLkxI0ZUHYJZbu7HxfFwUM09evo0j6bL8Jq1K/fj4rgI1NyWri62+GEc1ubcj4uTqwhIeljSXyQdlLRd0sQe++6XdEzSEUkfyx+qmZkNtbxnAs8C8yJiPnAUuB9A0hxgGTAXuB14VJIH9czMhplcRSAifhURl9IfO4Hr0u2lwJMRcTEi/gocAxbmeS8zMxt6Q3lN4AvA7nT7PcDJHvtOpb8zM7NhpOUUUUl7gWt72bU2Inakx6wFLgE/fftlvRwffbS/HFie/nhR0qFWMTVEB3BuqBrr7T+kjQxpLtpco3NxRT9udC6ucMNgX9iyCETEbf3tl3Q38HFgUUS8/UV/Cpja47DrgL/10f5GYGPa1v6IWJAh7tpzLro5F92ci27ORTdJ+wf72ryzg24HvgYsiYg3e+zaCSyTNErSDGAm8Kc872VmZkMv7x3D3wNGAc9KAuiMiBUR8bKkLcBhkmGiL0XE5ZzvZWZmQyxXEYiI9/Wz7yHgoQE2uTFPPDXjXHRzLro5F92ci26DzoW6h/HNzKxpvGyEmVmDVVIEJN2eLidxTNLXe9k/StLmdP8LkqaXH2U5MuTiK5IOp0tz/FrSe6uIswytctHjuDslhaTazgzJkgtJn077xsuSflZ2jGXJ8BmZJuk5SQfSz8niKuIsmqTHJXX1NY1eie+meToo6aZMDUdEqf+AEcBx4HpgJPAiMOeKY1YCj6Xby4DNZcc5jHLxEWBMun1fk3ORHjcOeJ7kDvUFVcddYb+YCRwAJqU/X1N13BXmYiNwX7o9B3i16rgLysWtwE3AoT72Lya5YVfAzcALWdqt4kxgIXAsIk5ExH+AJ0mWmehpKfDjdHsbsEjp9KOaaZmLiHguuqff9lyao26y9AuA9cC3gH+XGVzJsuTii8D3I+IfABFR1yU2s+QigPHp9gT6uCep3UXE88D5fg5ZCvwkEp3ARElTWrVbRRHIsqTEO8dEsjbRBWByKdG
Va6DLa9xL99IcddMyF5I+AEyNiF+UGVgFsvSLWcAsSX+Q1Jnes1NHWXLxIHCXpFPAM8CqckIbdga1XE8VTxbLsqRE5mUn2txAlte4C1gAfKjQiKrTby4kXQV8G7inrIAqlKVfXE0yJPRhkrPD30uaFxH/LDi2smXJxWeBTRHxiKRbgCfSXPyv+PCGlUF9b1ZxJpBlSYl3jpF0NckpXn+nQe0q0/Iakm4D1pLcmX2xpNjK1ioX44B5wG8lvUoy5rmzpheHs35GdkTEfyNZqfcISVGomyy5uBfYAhARfwRGk6wr1DSZl+vpqYoisA+YKWmGpJEkF353XnHMTuDudPtO4DeRXvmomZa5SIdANpAUgLqO+0KLXETEhYjoiIjpETGd5PrIkogY9Jopw1iWz8jTJJMGkNRBMjx0otQoy5ElF68DiwAkzSYpAn8vNcrhYSfwuXSW0M3AhYg40+pFpQ8HRcQlSV8G9pBc+X88kmUm1gH7I2In8COSU7pjJGcAy8qOswwZc/EwMBbYml4bfz0illQWdEEy5qIRMuZiD/BRSYeBy8BXI+KN6qIuRsZcrAF+KGk1yfDHPXX8o1HSz0mG/zrS6x/fAN4FEBGPkVwPWUzy/JY3gc9nareGuTIzs4x8x7CZWYO5CJiZNZiLgJlZg7kImJk1mIuAmVmDuQiYmTWYi4CZWYO5CJiZNdj/AfYEjbWN5IUkAAAAAElFTkSuQmCC\n", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%matplotlib inline\n", + "plot_pomdp_utility(utility)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "---\n", + "## Appendix\n", + "\n", + "Surprisingly, it turns out that there are six other optimal policies for various ranges of R(s). \n", + "You can try to find them out for yourself.\n", + "See **Exercise 17.5**.\n", + "To help you with this, we have a GridMDP editor in `grid_mdp.py` in the GUI folder. \n", + "
    \n", + "Here's a brief tutorial about how to use it\n", + "
    \n", + "Let us use it to solve `Case 2` above\n", + "1. Run `python gui/grid_mdp.py` from the master directory.\n", + "2. Enter the dimensions of the grid (3 x 4 in this case), and click on `'Build a GridMDP'`\n", + "3. Click on `Initialize` in the `Edit` menu.\n", + "4. Set the reward as -0.4 and click `Apply`. Exit the dialog. \n", + "![title](images/ge0.jpg)\n", + "
    \n", + "5. Select cell (1, 1) and check the `Wall` radio button. `Apply` and exit the dialog.\n", + "![title](images/ge1.jpg)\n", + "
    \n", + "6. Select cells (4, 1) and (4, 2) and check the `Terminal` radio button for both. Set the rewards appropriately and click on `Apply`. Exit the dialog. Your window should look something like this.\n", + "![title](images/ge2.jpg)\n", + "
    \n", + "7. You are all set up now. Click on `Build and Run` in the `Build` menu and watch the heatmap calculate the utility function.\n", + "![title](images/ge4.jpg)\n", + "
    \n", + "Green shades indicate positive utilities and brown shades indicate negative utilities. \n", + "The values of the utility function and arrow diagram will pop up in separate dialogs after the algorithm converges." ] } ], @@ -499,7 +2988,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.5.3" + "version": "3.6.4" }, "widgets": { "state": { diff --git a/mdp.py b/mdp.py index 6637108e5..1003e26b5 100644 --- a/mdp.py +++ b/mdp.py @@ -1,65 +1,132 @@ -"""Markov Decision Processes (Chapter 17) +""" +Markov Decision Processes (Chapter 17) First we define an MDP, and the special case of a GridMDP, in which states are laid out in a 2-dimensional grid. We also represent a policy -as a dictionary of {state:action} pairs, and a Utility function as a -dictionary of {state:number} pairs. We then define the value_iteration -and policy_iteration algorithms.""" - -from utils import argmax, vector_add, orientations, turn_right, turn_left +as a dictionary of {state: action} pairs, and a Utility function as a +dictionary of {state: number} pairs. We then define the value_iteration +and policy_iteration algorithms. +""" import random +from collections import defaultdict +import numpy as np + +from utils import vector_add, orientations, turn_right, turn_left -class MDP: +class MDP: """A Markov Decision Process, defined by an initial state, transition model, and reward function. We also keep track of a gamma value, for use by algorithms. The transition model is represented somewhat differently from the text. Instead of P(s' | s, a) being a probability number for each state/state/action triplet, we instead have T(s, a) return a list of (p, s') pairs. We also keep track of the possible states, - terminal states, and actions for each state. [page 646]""" + terminal states, and actions for each state. 
[Page 646]""" - def __init__(self, init, actlist, terminals, transitions={}, states=None, gamma=.9): + def __init__(self, init, actlist, terminals, transitions=None, reward=None, states=None, gamma=0.9): if not (0 < gamma <= 1): raise ValueError("An MDP must have 0 < gamma <= 1") - if states: - self.states = states - else: - self.states = set() + # collect states from transitions table if not passed. + self.states = states or self.get_states_from_transitions(transitions) + self.init = init - self.actlist = actlist + + if isinstance(actlist, list): + # if actlist is a list, all states have the same actions + self.actlist = actlist + + elif isinstance(actlist, dict): + # if actlist is a dict, different actions for each state + self.actlist = actlist + self.terminals = terminals - self.transitions = transitions + self.transitions = transitions or {} + if not self.transitions: + print("Warning: Transition table is empty.") + self.gamma = gamma - self.reward = {} + + self.reward = reward or {s: 0 for s in self.states} + + # self.check_consistency() def R(self, state): """Return a numeric reward for this state.""" + return self.reward[state] def T(self, state, action): """Transition model. From a state and an action, return a list of (probability, result-state) pairs.""" - if(self.transitions == {}): + + if not self.transitions: raise ValueError("Transition model is missing") else: return self.transitions[state][action] def actions(self, state): - """Set of actions that can be performed in this state. By default, a + """Return a list of actions that can be performed in this state. By default, a fixed list of actions, except for terminal states. 
Override this method if you need to specialize by state.""" + if state in self.terminals: return [None] else: return self.actlist + def get_states_from_transitions(self, transitions): + if isinstance(transitions, dict): + s1 = set(transitions.keys()) + s2 = set(tr[1] for actions in transitions.values() + for effects in actions.values() + for tr in effects) + return s1.union(s2) + else: + print('Could not retrieve states from transitions') + return None -class GridMDP(MDP): + def check_consistency(self): + # check that all states in transitions are valid + assert set(self.states) == self.get_states_from_transitions(self.transitions) + + # check that init is a valid state + assert self.init in self.states + + # check reward for each state + assert set(self.reward.keys()) == set(self.states) + + # check that all terminals are valid states + assert all(t in self.states for t in self.terminals) + + # check that probability distributions for all actions sum to 1 + for s1, actions in self.transitions.items(): + for a in actions.keys(): + s = 0 + for o in actions[a]: + s += o[0] + assert abs(s - 1) < 0.001 + + +class MDP2(MDP): + """ + Inherits from MDP. Handles terminal states, and transitions to and from terminal states better. + """ + + def __init__(self, init, actlist, terminals, transitions, reward=None, gamma=0.9): + MDP.__init__(self, init, actlist, terminals, transitions, reward, gamma=gamma) + + def T(self, state, action): + if action is None: + return [(0.0, state)] + else: + return self.transitions[state][action] + + +class GridMDP(MDP): """A two-dimensional grid MDP, as in [Figure 17.1]. All you have to do is specify the grid as a list of lists of rewards; use None for an obstacle (unreachable state). Also, you should specify the terminal states. 
@@ -67,41 +134,56 @@ class GridMDP(MDP): def __init__(self, grid, terminals, init=(0, 0), gamma=.9): grid.reverse() # because we want row 0 on bottom, not on top - MDP.__init__(self, init, actlist=orientations, - terminals=terminals, gamma=gamma) - self.grid = grid + reward = {} + states = set() self.rows = len(grid) self.cols = len(grid[0]) + self.grid = grid for x in range(self.cols): for y in range(self.rows): - self.reward[x, y] = grid[y][x] - if grid[y][x] is not None: - self.states.add((x, y)) - - def T(self, state, action): - if action is None: - return [(0.0, state)] - else: + if grid[y][x]: + states.add((x, y)) + reward[(x, y)] = grid[y][x] + self.states = states + actlist = orientations + transitions = {} + for s in states: + transitions[s] = {} + for a in actlist: + transitions[s][a] = self.calculate_T(s, a) + MDP.__init__(self, init, actlist=actlist, + terminals=terminals, transitions=transitions, + reward=reward, states=states, gamma=gamma) + + def calculate_T(self, state, action): + if action: return [(0.8, self.go(state, action)), (0.1, self.go(state, turn_right(action))), (0.1, self.go(state, turn_left(action)))] + else: + return [(0.0, state)] + + def T(self, state, action): + return self.transitions[state][action] if action else [(0.0, state)] def go(self, state, direction): """Return the state that results from going in this direction.""" + state1 = vector_add(state, direction) return state1 if state1 in self.states else state def to_grid(self, mapping): """Convert a mapping from (x, y) to v into a [[..., v, ...]] grid.""" + return list(reversed([[mapping.get((x, y), None) for x in range(self.cols)] for y in range(self.rows)])) def to_arrows(self, policy): - chars = { - (1, 0): '>', (0, 1): '^', (-1, 0): '<', (0, -1): 'v', None: '.'} + chars = {(1, 0): '>', (0, 1): '^', (-1, 0): '<', (0, -1): 'v', None: '.'} return self.to_grid({s: chars[a] for (s, a) in policy.items()}) + # 
______________________________________________________________________________ @@ -114,49 +196,55 @@ def to_arrows(self, policy): [-0.04, -0.04, -0.04, -0.04]], terminals=[(3, 2), (3, 1)]) + # ______________________________________________________________________________ def value_iteration(mdp, epsilon=0.001): """Solving an MDP by value iteration. [Figure 17.4]""" + U1 = {s: 0 for s in mdp.states} R, T, gamma = mdp.R, mdp.T, mdp.gamma while True: U = U1.copy() delta = 0 for s in mdp.states: - U1[s] = R(s) + gamma * max([sum([p * U[s1] for (p, s1) in T(s, a)]) - for a in mdp.actions(s)]) + U1[s] = R(s) + gamma * max(sum(p * U[s1] for (p, s1) in T(s, a)) + for a in mdp.actions(s)) delta = max(delta, abs(U1[s] - U[s])) - if delta < epsilon * (1 - gamma) / gamma: + if delta <= epsilon * (1 - gamma) / gamma: return U def best_policy(mdp, U): """Given an MDP and a utility function U, determine the best policy, - as a mapping from state to action. (Equation 17.4)""" + as a mapping from state to action. 
[Equation 17.4]""" + pi = {} for s in mdp.states: - pi[s] = argmax(mdp.actions(s), key=lambda a: expected_utility(a, s, U, mdp)) + pi[s] = max(mdp.actions(s), key=lambda a: expected_utility(a, s, U, mdp)) return pi def expected_utility(a, s, U, mdp): """The expected utility of doing a in state s, according to the MDP and U.""" - return sum([p * U[s1] for (p, s1) in mdp.T(s, a)]) + + return sum(p * U[s1] for (p, s1) in mdp.T(s, a)) + # ______________________________________________________________________________ def policy_iteration(mdp): """Solve an MDP by policy iteration [Figure 17.7]""" + U = {s: 0 for s in mdp.states} pi = {s: random.choice(mdp.actions(s)) for s in mdp.states} while True: U = policy_evaluation(pi, U, mdp) unchanged = True for s in mdp.states: - a = argmax(mdp.actions(s), key=lambda a: expected_utility(a, s, U, mdp)) + a = max(mdp.actions(s), key=lambda a: expected_utility(a, s, U, mdp)) if a != pi[s]: pi[s] = a unchanged = False @@ -167,13 +255,207 @@ def policy_iteration(mdp): def policy_evaluation(pi, U, mdp, k=20): """Return an updated utility mapping U from each state in the MDP to its utility, using an approximation (modified policy iteration).""" + R, T, gamma = mdp.R, mdp.T, mdp.gamma for i in range(k): for s in mdp.states: - U[s] = R(s) + gamma * sum([p * U[s1] for (p, s1) in T(s, pi[s])]) + U[s] = R(s) + gamma * sum(p * U[s1] for (p, s1) in T(s, pi[s])) return U +class POMDP(MDP): + """A Partially Observable Markov Decision Process, defined by + a transition model P(s'|s,a), actions A(s), a reward function R(s), + and a sensor model P(e|s). We also keep track of a gamma value, + for use by algorithms. The transition and the sensor models + are defined as matrices. We also keep track of the possible states + and actions for each state. 
[Page 659].""" + + def __init__(self, actions, transitions=None, evidences=None, rewards=None, states=None, gamma=0.95): + """Initialize variables of the pomdp""" + + if not (0 < gamma <= 1): + raise ValueError('A POMDP must have 0 < gamma <= 1') + + self.states = states + self.actions = actions + + # transition model cannot be undefined + self.t_prob = transitions or {} + if not self.t_prob: + print('Warning: Transition model is undefined') + + # sensor model cannot be undefined + self.e_prob = evidences or {} + if not self.e_prob: + print('Warning: Sensor model is undefined') + + self.gamma = gamma + self.rewards = rewards + + def remove_dominated_plans(self, input_values): + """ + Remove dominated plans. + This method finds all the lines contributing to the + upper surface and removes those which don't. + """ + + values = [val for action in input_values for val in input_values[action]] + values.sort(key=lambda x: x[0], reverse=True) + + best = [values[0]] + y1_max = max(val[1] for val in values) + tgt = values[0] + prev_b = 0 + prev_ix = 0 + while tgt[1] != y1_max: + min_b = 1 + min_ix = 0 + for i in range(prev_ix + 1, len(values)): + if values[i][0] - tgt[0] + tgt[1] - values[i][1] != 0: + trans_b = (values[i][0] - tgt[0]) / (values[i][0] - tgt[0] + tgt[1] - values[i][1]) + if 0 <= trans_b <= 1 and trans_b > prev_b and trans_b < min_b: + min_b = trans_b + min_ix = i + prev_b = min_b + prev_ix = min_ix + tgt = values[min_ix] + best.append(tgt) + + return self.generate_mapping(best, input_values) + + def remove_dominated_plans_fast(self, input_values): + """ + Remove dominated plans using approximations. + Resamples the upper boundary at intervals of 100 and + finds the maximum values at these points. 
+ """ + + values = [val for action in input_values for val in input_values[action]] + values.sort(key=lambda x: x[0], reverse=True) + + best = [] + sr = 100 + for i in range(sr + 1): + x = i / float(sr) + maximum = (values[0][1] - values[0][0]) * x + values[0][0] + tgt = values[0] + for value in values: + val = (value[1] - value[0]) * x + value[0] + if val > maximum: + maximum = val + tgt = value + + if all(any(tgt != v) for v in best): + best.append(np.array(tgt)) + + return self.generate_mapping(best, input_values) + + def generate_mapping(self, best, input_values): + """Generate mappings after removing dominated plans""" + + mapping = defaultdict(list) + for value in best: + for action in input_values: + if any(all(value == v) for v in input_values[action]): + mapping[action].append(value) + + return mapping + + def max_difference(self, U1, U2): + """Find maximum difference between two utility mappings""" + + for k, v in U1.items(): + sum1 = 0 + for element in U1[k]: + sum1 += sum(element) + sum2 = 0 + for element in U2[k]: + sum2 += sum(element) + return abs(sum1 - sum2) + + +class Matrix: + """Matrix operations class""" + + @staticmethod + def add(A, B): + """Add two matrices A and B""" + + res = [] + for i in range(len(A)): + row = [] + for j in range(len(A[0])): + row.append(A[i][j] + B[i][j]) + res.append(row) + return res + + @staticmethod + def scalar_multiply(a, B): + """Multiply scalar a to matrix B""" + + for i in range(len(B)): + for j in range(len(B[0])): + B[i][j] = a * B[i][j] + return B + + @staticmethod + def multiply(A, B): + """Multiply two matrices A and B element-wise""" + + matrix = [] + for i in range(len(B)): + row = [] + for j in range(len(B[0])): + row.append(B[i][j] * A[j][i]) + matrix.append(row) + + return matrix + + @staticmethod + def matmul(A, B): + """Inner-product of two matrices""" + + return [[sum(ele_a * ele_b for ele_a, ele_b in zip(row_a, col_b)) for col_b in list(zip(*B))] for row_a in A] + + @staticmethod + def 
transpose(A): + """Transpose a matrix""" + + return [list(i) for i in zip(*A)] + + +def pomdp_value_iteration(pomdp, epsilon=0.1): + """Solving a POMDP by value iteration.""" + + U = {'': [[0] * len(pomdp.states)]} + count = 0 + while True: + count += 1 + prev_U = U + values = [val for action in U for val in U[action]] + value_matxs = [] + for i in values: + for j in values: + value_matxs.append([i, j]) + + U1 = defaultdict(list) + for action in pomdp.actions: + for u in value_matxs: + u1 = Matrix.matmul(Matrix.matmul(pomdp.t_prob[int(action)], + Matrix.multiply(pomdp.e_prob[int(action)], Matrix.transpose(u))), + [[1], [1]]) + u1 = Matrix.add(Matrix.scalar_multiply(pomdp.gamma, Matrix.transpose(u1)), [pomdp.rewards[int(action)]]) + U1[action].append(u1[0]) + + U = pomdp.remove_dominated_plans_fast(U1) + # replace with U = pomdp.remove_dominated_plans(U1) for accurate calculations + + if count > 10: + if pomdp.max_difference(U, prev_U) < epsilon * (1 - pomdp.gamma) / pomdp.gamma: + return U + + __doc__ += """ >>> pi = best_policy(sequential_decision_environment, value_iteration(sequential_decision_environment, .01)) @@ -192,3 +474,19 @@ def policy_evaluation(pi, U, mdp, k=20): ^ None ^ . 
^ > ^ < """ # noqa + +""" +s = { 'a' : { 'plan1' : [(0.2, 'a'), (0.3, 'b'), (0.3, 'c'), (0.2, 'd')], + 'plan2' : [(0.4, 'a'), (0.15, 'b'), (0.45, 'c')], + 'plan3' : [(0.2, 'a'), (0.5, 'b'), (0.3, 'c')], + }, + 'b' : { 'plan1' : [(0.2, 'a'), (0.6, 'b'), (0.2, 'c'), (0.1, 'd')], + 'plan2' : [(0.6, 'a'), (0.2, 'b'), (0.1, 'c'), (0.1, 'd')], + 'plan3' : [(0.3, 'a'), (0.3, 'b'), (0.4, 'c')], + }, + 'c' : { 'plan1' : [(0.3, 'a'), (0.5, 'b'), (0.1, 'c'), (0.1, 'd')], + 'plan2' : [(0.5, 'a'), (0.3, 'b'), (0.1, 'c'), (0.1, 'd')], + 'plan3' : [(0.1, 'a'), (0.3, 'b'), (0.1, 'c'), (0.5, 'd')], + }, + } +""" diff --git a/mdp4e.py b/mdp4e.py new file mode 100644 index 000000000..f8871bdc9 --- /dev/null +++ b/mdp4e.py @@ -0,0 +1,516 @@ +""" +Markov Decision Processes (Chapter 16) + +First we define an MDP, and the special case of a GridMDP, in which +states are laid out in a 2-dimensional grid. We also represent a policy +as a dictionary of {state: action} pairs, and a Utility function as a +dictionary of {state: number} pairs. We then define the value_iteration +and policy_iteration algorithms. +""" + +import random +from collections import defaultdict + +import numpy as np + +from utils4e import vector_add, orientations, turn_right, turn_left + + +class MDP: + """A Markov Decision Process, defined by an initial state, transition model, + and reward function. We also keep track of a gamma value, for use by + algorithms. The transition model is represented somewhat differently from + the text. Instead of P(s' | s, a) being a probability number for each + state/state/action triplet, we instead have T(s, a) return a + list of (p, s') pairs. We also keep track of the possible states, + terminal states, and actions for each state. 
[Page 646]""" + + def __init__(self, init, actlist, terminals, transitions=None, reward=None, states=None, gamma=0.9): + if not (0 < gamma <= 1): + raise ValueError("An MDP must have 0 < gamma <= 1") + + # collect states from transitions table if not passed. + self.states = states or self.get_states_from_transitions(transitions) + + self.init = init + + if isinstance(actlist, list): + # if actlist is a list, all states have the same actions + self.actlist = actlist + + elif isinstance(actlist, dict): + # if actlist is a dict, different actions for each state + self.actlist = actlist + + self.terminals = terminals + self.transitions = transitions or {} + if not self.transitions: + print("Warning: Transition table is empty.") + + self.gamma = gamma + + self.reward = reward or {s: 0 for s in self.states} + + # self.check_consistency() + + def R(self, state): + """Return a numeric reward for this state.""" + + return self.reward[state] + + def T(self, state, action): + """Transition model. From a state and an action, return a list + of (probability, result-state) pairs.""" + + if not self.transitions: + raise ValueError("Transition model is missing") + else: + return self.transitions[state][action] + + def actions(self, state): + """Return a list of actions that can be performed in this state. By default, a + fixed list of actions, except for terminal states. 
Override this + method if you need to specialize by state.""" + + if state in self.terminals: + return [None] + else: + return self.actlist + + def get_states_from_transitions(self, transitions): + if isinstance(transitions, dict): + s1 = set(transitions.keys()) + s2 = set(tr[1] for actions in transitions.values() + for effects in actions.values() + for tr in effects) + return s1.union(s2) + else: + print('Could not retrieve states from transitions') + return None + + def check_consistency(self): + + # check that all states in transitions are valid + assert set(self.states) == self.get_states_from_transitions(self.transitions) + + # check that init is a valid state + assert self.init in self.states + + # check reward for each state + assert set(self.reward.keys()) == set(self.states) + + # check that all terminals are valid states + assert all(t in self.states for t in self.terminals) + + # check that probability distributions for all actions sum to 1 + for s1, actions in self.transitions.items(): + for a in actions.keys(): + s = 0 + for o in actions[a]: + s += o[0] + assert abs(s - 1) < 0.001 + + +class MDP2(MDP): + """ + Inherits from MDP. Handles terminal states, and transitions to and from terminal states better. + """ + + def __init__(self, init, actlist, terminals, transitions, reward=None, gamma=0.9): + MDP.__init__(self, init, actlist, terminals, transitions, reward, gamma=gamma) + + def T(self, state, action): + if action is None: + return [(0.0, state)] + else: + return self.transitions[state][action] + + +class GridMDP(MDP): + """A two-dimensional grid MDP, as in [Figure 16.1]. All you have to do is + specify the grid as a list of lists of rewards; use None for an obstacle + (unreachable state). Also, you should specify the terminal states. + An action is an (x, y) unit vector; e.g. 
(1, 0) means move east.""" + + def __init__(self, grid, terminals, init=(0, 0), gamma=.9): + grid.reverse() # because we want row 0 on bottom, not on top + reward = {} + states = set() + self.rows = len(grid) + self.cols = len(grid[0]) + self.grid = grid + for x in range(self.cols): + for y in range(self.rows): + if grid[y][x]: + states.add((x, y)) + reward[(x, y)] = grid[y][x] + self.states = states + actlist = orientations + transitions = {} + for s in states: + transitions[s] = {} + for a in actlist: + transitions[s][a] = self.calculate_T(s, a) + MDP.__init__(self, init, actlist=actlist, + terminals=terminals, transitions=transitions, + reward=reward, states=states, gamma=gamma) + + def calculate_T(self, state, action): + if action: + return [(0.8, self.go(state, action)), + (0.1, self.go(state, turn_right(action))), + (0.1, self.go(state, turn_left(action)))] + else: + return [(0.0, state)] + + def T(self, state, action): + return self.transitions[state][action] if action else [(0.0, state)] + + def go(self, state, direction): + """Return the state that results from going in this direction.""" + + state1 = tuple(vector_add(state, direction)) + return state1 if state1 in self.states else state + + def to_grid(self, mapping): + """Convert a mapping from (x, y) to v into a [[..., v, ...]] grid.""" + + return list(reversed([[mapping.get((x, y), None) + for x in range(self.cols)] + for y in range(self.rows)])) + + def to_arrows(self, policy): + chars = {(1, 0): '>', (0, 1): '^', (-1, 0): '<', (0, -1): 'v', None: '.'} + return self.to_grid({s: chars[a] for (s, a) in policy.items()}) + + +# ______________________________________________________________________________ + + +""" [Figure 16.1] +A 4x3 grid environment that presents the agent with a sequential decision problem. 
+""" + +sequential_decision_environment = GridMDP([[-0.04, -0.04, -0.04, +1], + [-0.04, None, -0.04, -1], + [-0.04, -0.04, -0.04, -0.04]], + terminals=[(3, 2), (3, 1)]) + + +# ______________________________________________________________________________ +# 16.1.3 The Bellman equation for utilities + + +def q_value(mdp, s, a, U): + if not a: + return mdp.R(s) + res = 0 + for p, s_prime in mdp.T(s, a): + res += p * (mdp.R(s) + mdp.gamma * U[s_prime]) + return res + + +# TODO: DDN in figure 16.4 and 16.5 + +# ______________________________________________________________________________ +# 16.2 Algorithms for MDPs +# 16.2.1 Value Iteration + + +def value_iteration(mdp, epsilon=0.001): + """Solving an MDP by value iteration. [Figure 16.6]""" + + U1 = {s: 0 for s in mdp.states} + R, T, gamma = mdp.R, mdp.T, mdp.gamma + while True: + U = U1.copy() + delta = 0 + for s in mdp.states: + # U1[s] = R(s) + gamma * max(sum(p * U[s1] for (p, s1) in T(s, a)) + # for a in mdp.actions(s)) + U1[s] = max(q_value(mdp, s, a, U) for a in mdp.actions(s)) + delta = max(delta, abs(U1[s] - U[s])) + if delta <= epsilon * (1 - gamma) / gamma: + return U + + +# ______________________________________________________________________________ +# 16.2.2 Policy Iteration + + +def best_policy(mdp, U): + """Given an MDP and a utility function U, determine the best policy, + as a mapping from state to action.""" + + pi = {} + for s in mdp.states: + pi[s] = max(mdp.actions(s), key=lambda a: q_value(mdp, s, a, U)) + return pi + + +def expected_utility(a, s, U, mdp): + """The expected utility of doing a in state s, according to the MDP and U.""" + + return sum(p * U[s1] for (p, s1) in mdp.T(s, a)) + + +def policy_iteration(mdp): + """Solve an MDP by policy iteration [Figure 17.7]""" + + U = {s: 0 for s in mdp.states} + pi = {s: random.choice(mdp.actions(s)) for s in mdp.states} + while True: + U = policy_evaluation(pi, U, mdp) + unchanged = True + for s in mdp.states: + a_star = max(mdp.actions(s), 
key=lambda a: q_value(mdp, s, a, U)) + # a = max(mdp.actions(s), key=lambda a: expected_utility(a, s, U, mdp)) + if q_value(mdp, s, a_star, U) > q_value(mdp, s, pi[s], U): + pi[s] = a_star + unchanged = False + if unchanged: + return pi + + +def policy_evaluation(pi, U, mdp, k=20): + """Return an updated utility mapping U from each state in the MDP to its + utility, using an approximation (modified policy iteration).""" + + R, T, gamma = mdp.R, mdp.T, mdp.gamma + for i in range(k): + for s in mdp.states: + U[s] = R(s) + gamma * sum(p * U[s1] for (p, s1) in T(s, pi[s])) + return U + + +# ___________________________________________________________________ +# 16.4 Partially Observed MDPs + + +class POMDP(MDP): + """A Partially Observable Markov Decision Process, defined by + a transition model P(s'|s,a), actions A(s), a reward function R(s), + and a sensor model P(e|s). We also keep track of a gamma value, + for use by algorithms. The transition and the sensor models + are defined as matrices. We also keep track of the possible states + and actions for each state. [Page 659].""" + + def __init__(self, actions, transitions=None, evidences=None, rewards=None, states=None, gamma=0.95): + """Initialize variables of the pomdp""" + + if not (0 < gamma <= 1): + raise ValueError('A POMDP must have 0 < gamma <= 1') + + self.states = states + self.actions = actions + + # transition model cannot be undefined + self.t_prob = transitions or {} + if not self.t_prob: + print('Warning: Transition model is undefined') + + # sensor model cannot be undefined + self.e_prob = evidences or {} + if not self.e_prob: + print('Warning: Sensor model is undefined') + + self.gamma = gamma + self.rewards = rewards + + def remove_dominated_plans(self, input_values): + """ + Remove dominated plans. + This method finds all the lines contributing to the + upper surface and removes those which don't. 
+ """ + + values = [val for action in input_values for val in input_values[action]] + values.sort(key=lambda x: x[0], reverse=True) + + best = [values[0]] + y1_max = max(val[1] for val in values) + tgt = values[0] + prev_b = 0 + prev_ix = 0 + while tgt[1] != y1_max: + min_b = 1 + min_ix = 0 + for i in range(prev_ix + 1, len(values)): + if values[i][0] - tgt[0] + tgt[1] - values[i][1] != 0: + trans_b = (values[i][0] - tgt[0]) / (values[i][0] - tgt[0] + tgt[1] - values[i][1]) + if 0 <= trans_b <= 1 and trans_b > prev_b and trans_b < min_b: + min_b = trans_b + min_ix = i + prev_b = min_b + prev_ix = min_ix + tgt = values[min_ix] + best.append(tgt) + + return self.generate_mapping(best, input_values) + + def remove_dominated_plans_fast(self, input_values): + """ + Remove dominated plans using approximations. + Resamples the upper boundary at intervals of 100 and + finds the maximum values at these points. + """ + + values = [val for action in input_values for val in input_values[action]] + values.sort(key=lambda x: x[0], reverse=True) + + best = [] + sr = 100 + for i in range(sr + 1): + x = i / float(sr) + maximum = (values[0][1] - values[0][0]) * x + values[0][0] + tgt = values[0] + for value in values: + val = (value[1] - value[0]) * x + value[0] + if val > maximum: + maximum = val + tgt = value + + if all(any(tgt != v) for v in best): + best.append(np.array(tgt)) + + return self.generate_mapping(best, input_values) + + def generate_mapping(self, best, input_values): + """Generate mappings after removing dominated plans""" + + mapping = defaultdict(list) + for value in best: + for action in input_values: + if any(all(value == v) for v in input_values[action]): + mapping[action].append(value) + + return mapping + + def max_difference(self, U1, U2): + """Find maximum difference between two utility mappings""" + + for k, v in U1.items(): + sum1 = 0 + for element in U1[k]: + sum1 += sum(element) + sum2 = 0 + for element in U2[k]: + sum2 += sum(element) + return abs(sum1 
- sum2) + + +class Matrix: + """Matrix operations class""" + + @staticmethod + def add(A, B): + """Add two matrices A and B""" + + res = [] + for i in range(len(A)): + row = [] + for j in range(len(A[0])): + row.append(A[i][j] + B[i][j]) + res.append(row) + return res + + @staticmethod + def scalar_multiply(a, B): + """Multiply scalar a to matrix B""" + + for i in range(len(B)): + for j in range(len(B[0])): + B[i][j] = a * B[i][j] + return B + + @staticmethod + def multiply(A, B): + """Multiply two matrices A and B element-wise""" + + matrix = [] + for i in range(len(B)): + row = [] + for j in range(len(B[0])): + row.append(B[i][j] * A[j][i]) + matrix.append(row) + + return matrix + + @staticmethod + def matmul(A, B): + """Inner-product of two matrices""" + + return [[sum(ele_a * ele_b for ele_a, ele_b in zip(row_a, col_b)) for col_b in list(zip(*B))] for row_a in A] + + @staticmethod + def transpose(A): + """Transpose a matrix""" + + return [list(i) for i in zip(*A)] + + +def pomdp_value_iteration(pomdp, epsilon=0.1): + """Solving a POMDP by value iteration.""" + + U = {'': [[0] * len(pomdp.states)]} + count = 0 + while True: + count += 1 + prev_U = U + values = [val for action in U for val in U[action]] + value_matxs = [] + for i in values: + for j in values: + value_matxs.append([i, j]) + + U1 = defaultdict(list) + for action in pomdp.actions: + for u in value_matxs: + u1 = Matrix.matmul(Matrix.matmul(pomdp.t_prob[int(action)], + Matrix.multiply(pomdp.e_prob[int(action)], Matrix.transpose(u))), + [[1], [1]]) + u1 = Matrix.add(Matrix.scalar_multiply(pomdp.gamma, Matrix.transpose(u1)), [pomdp.rewards[int(action)]]) + U1[action].append(u1[0]) + + U = pomdp.remove_dominated_plans_fast(U1) + # replace with U = pomdp.remove_dominated_plans(U1) for accurate calculations + + if count > 10: + if pomdp.max_difference(U, prev_U) < epsilon * (1 - pomdp.gamma) / pomdp.gamma: + return U + + +__doc__ += """ +>>> pi = best_policy(sequential_decision_environment, 
value_iteration(sequential_decision_environment, .01)) + +>>> sequential_decision_environment.to_arrows(pi) +[['>', '>', '>', '.'], ['^', None, '^', '.'], ['^', '>', '^', '<']] + +>>> from utils import print_table + +>>> print_table(sequential_decision_environment.to_arrows(pi)) +> > > . +^ None ^ . +^ > ^ < + +>>> print_table(sequential_decision_environment.to_arrows(policy_iteration(sequential_decision_environment))) +> > > . +^ None ^ . +^ > ^ < +""" # noqa + +""" +s = { 'a' : { 'plan1' : [(0.2, 'a'), (0.3, 'b'), (0.3, 'c'), (0.2, 'd')], + 'plan2' : [(0.4, 'a'), (0.15, 'b'), (0.45, 'c')], + 'plan3' : [(0.2, 'a'), (0.5, 'b'), (0.3, 'c')], + }, + 'b' : { 'plan1' : [(0.2, 'a'), (0.6, 'b'), (0.2, 'c'), (0.1, 'd')], + 'plan2' : [(0.6, 'a'), (0.2, 'b'), (0.1, 'c'), (0.1, 'd')], + 'plan3' : [(0.3, 'a'), (0.3, 'b'), (0.4, 'c')], + }, + 'c' : { 'plan1' : [(0.3, 'a'), (0.5, 'b'), (0.1, 'c'), (0.1, 'd')], + 'plan2' : [(0.5, 'a'), (0.3, 'b'), (0.1, 'c'), (0.1, 'd')], + 'plan3' : [(0.1, 'a'), (0.3, 'b'), (0.1, 'c'), (0.5, 'd')], + }, + } +""" diff --git a/mdp_apps.ipynb b/mdp_apps.ipynb new file mode 100644 index 000000000..da3ae7b06 --- /dev/null +++ b/mdp_apps.ipynb @@ -0,0 +1,1825 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# APPLICATIONS OF MARKOV DECISION PROCESSES\n", + "---\n", + "In this notebook we will take a look at some indicative applications of markov decision processes. 
\n", + "We will cover content from [`mdp.py`](https://github.com/aimacode/aima-python/blob/master/mdp.py), for **Chapter 17 Making Complex Decisions** of Stuart Russel's and Peter Norvig's book [*Artificial Intelligence: A Modern Approach*](http://aima.cs.berkeley.edu/).\n" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": {}, + "outputs": [], + "source": [ + "from mdp import *\n", + "from notebook import psource, pseudocode" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## CONTENTS\n", + "- Simple MDP\n", + " - State dependent reward function\n", + " - State and action dependent reward function\n", + " - State, action and next state dependent reward function\n", + "- Grid MDP\n", + " - Pathfinding problem\n", + "- POMDP\n", + " - Two state POMDP" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## SIMPLE MDP\n", + "---\n", + "### State dependent reward function\n", + "\n", + "Markov Decision Processes are formally described as processes that follow the Markov property which states that \"The future is independent of the past given the present\". \n", + "MDPs formally describe environments for reinforcement learning and we assume that the environment is *fully observable*. \n", + "Let us take a toy example MDP and solve it using the functions in `mdp.py`.\n", + "This is a simple example adapted from a [similar problem](http://www0.cs.ucl.ac.uk/staff/D.Silver/web/Teaching_files/MDP.pdf) by Dr. David Silver, tweaked to fit the limitations of the current functions.\n", + "![title](images/mdp-b.png)\n", + "\n", + "Let's say you're a student attending lectures in a university.\n", + "There are three lectures you need to attend on a given day.\n", + "
    \n", + "Attending the first lecture gives you 4 points of reward.\n", + "After the first lecture, you have a 0.6 probability to continue into the second one, yielding 6 more points of reward.\n", + "
    \n", + "But, with a probability of 0.4, you get distracted and start using Facebook instead and get a reward of -1.\n", + "From then onwards, you really can't let go of Facebook and there's just a 0.1 probability that you will concentrate back on the lecture.\n", + "
    \n", + "After the second lecture, you have an equal chance of attending the next lecture or just falling asleep.\n", + "Falling asleep is the terminal state and yields you no reward, but continuing on to the final lecture gives you a big reward of 10 points.\n", + "
    \n", + "From there on, you have a 40% chance of going to study and reach the terminal state, \n", + "but a 60% chance of going to the pub with your friends instead. \n", + "You end up drunk and don't know which lecture to attend, so you go to one of the lectures according to the probabilities given above.\n", + "
    \n", + "We now have an outline of our stochastic environment and we need to maximize our reward by solving this MDP.\n", + "
    \n", + "
    \n", + "We first have to define our Transition Matrix as a nested dictionary to fit the requirements of the MDP class." + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "t = {\n", + " 'leisure': {\n", + " 'facebook': {'leisure':0.9, 'class1':0.1},\n", + " 'quit': {'leisure':0.1, 'class1':0.9},\n", + " 'study': {},\n", + " 'sleep': {},\n", + " 'pub': {}\n", + " },\n", + " 'class1': {\n", + " 'study': {'class2':0.6, 'leisure':0.4},\n", + " 'facebook': {'class2':0.4, 'leisure':0.6},\n", + " 'quit': {},\n", + " 'sleep': {},\n", + " 'pub': {}\n", + " },\n", + " 'class2': {\n", + " 'study': {'class3':0.5, 'end':0.5},\n", + " 'sleep': {'end':0.5, 'class3':0.5},\n", + " 'facebook': {},\n", + " 'quit': {},\n", + " 'pub': {},\n", + " },\n", + " 'class3': {\n", + " 'study': {'end':0.6, 'class1':0.08, 'class2':0.16, 'class3':0.16},\n", + " 'pub': {'end':0.4, 'class1':0.12, 'class2':0.24, 'class3':0.24},\n", + " 'facebook': {},\n", + " 'quit': {},\n", + " 'sleep': {}\n", + " },\n", + " 'end': {}\n", + "}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We now need to define the reward for each state." + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "rewards = {\n", + " 'class1': 4,\n", + " 'class2': 6,\n", + " 'class3': 10,\n", + " 'leisure': -1,\n", + " 'end': 0\n", + "}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This MDP has only one terminal state." + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "terminals = ['end']" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's now set the initial state to Class 1." 
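Before solving, it is worth sanity-checking the model. Below is a hypothetical standalone sketch (the `t` dictionary above is reproduced inline, and the helper name `check_transition_model` is our own invention) that verifies every available action's outcomes form a probability distribution over known states:

```python
# Standalone sanity check for the student-MDP transition model.
# Empty dicts mark actions that are unavailable in that state.
t = {
    'leisure': {'facebook': {'leisure': 0.9, 'class1': 0.1},
                'quit': {'leisure': 0.1, 'class1': 0.9},
                'study': {}, 'sleep': {}, 'pub': {}},
    'class1': {'study': {'class2': 0.6, 'leisure': 0.4},
               'facebook': {'class2': 0.4, 'leisure': 0.6},
               'quit': {}, 'sleep': {}, 'pub': {}},
    'class2': {'study': {'class3': 0.5, 'end': 0.5},
               'sleep': {'end': 0.5, 'class3': 0.5},
               'facebook': {}, 'quit': {}, 'pub': {}},
    'class3': {'study': {'end': 0.6, 'class1': 0.08, 'class2': 0.16, 'class3': 0.16},
               'pub': {'end': 0.4, 'class1': 0.12, 'class2': 0.24, 'class3': 0.24},
               'facebook': {}, 'quit': {}, 'sleep': {}},
    'end': {}
}

def check_transition_model(t):
    """Assert that every available action's outcomes form a probability
    distribution over known states; return the number of available actions."""
    available = 0
    for state, actions in t.items():
        for action, outcomes in actions.items():
            if outcomes:  # skip unavailable actions
                available += 1
                assert abs(sum(outcomes.values()) - 1.0) < 1e-9, (state, action)
                assert all(s1 in t for s1 in outcomes), (state, action)
    return available

print(check_transition_model(t), "available (state, action) pairs")
```

A check like this catches the most common modelling mistake (rows that do not sum to 1) before it silently skews the utilities.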
+ ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "init = 'class1'" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We will write a CustomMDP class to extend the MDP class for the problem at hand. \n", + "This class will implement the `T` method to implement the transition model. This is the exact same class as given in [`mdp.ipynb`](https://github.com/aimacode/aima-python/blob/master/mdp.ipynb#MDP)." + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "class CustomMDP(MDP):\n", + "\n", + " def __init__(self, transition_matrix, rewards, terminals, init, gamma=.9):\n", + " # All possible actions.\n", + " actlist = []\n", + " for state in transition_matrix.keys():\n", + " actlist.extend(transition_matrix[state])\n", + " actlist = list(set(actlist))\n", + " print(actlist)\n", + "\n", + " MDP.__init__(self, init, actlist, terminals=terminals, gamma=gamma)\n", + " self.t = transition_matrix\n", + " self.reward = rewards\n", + " for state in self.t:\n", + " self.states.add(state)\n", + "\n", + " def T(self, state, action):\n", + " if action is None:\n", + " return [(0.0, state)]\n", + " else: \n", + " return [(prob, new_state) for new_state, prob in self.t[state][action].items()]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We now need an instance of this class." + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "['quit', 'sleep', 'study', 'pub', 'facebook']\n" + ] + } + ], + "source": [ + "mdp = CustomMDP(t, rewards, terminals, init, gamma=.9)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The utility of each state can be found by `value_iteration`." 
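To make the computation concrete, here is a hypothetical standalone sketch of the same backup, U(s) = R(s) + γ·max over a of Σ P(s'|s,a)·U(s'), written directly against the nested dictionaries and independent of `mdp.py` (the dictionaries are reproduced inline and the function name `value_iteration_dicts` is our own); it should agree with the utilities that `value_iteration` prints for this MDP:

```python
# Hypothetical standalone sketch of value iteration for the student MDP,
# independent of mdp.py. Backup used (state-only rewards):
#     U(s) = R(s) + gamma * max_a sum_s' P(s'|s,a) * U(s')
t = {
    'leisure': {'facebook': {'leisure': 0.9, 'class1': 0.1},
                'quit': {'leisure': 0.1, 'class1': 0.9},
                'study': {}, 'sleep': {}, 'pub': {}},
    'class1': {'study': {'class2': 0.6, 'leisure': 0.4},
               'facebook': {'class2': 0.4, 'leisure': 0.6},
               'quit': {}, 'sleep': {}, 'pub': {}},
    'class2': {'study': {'class3': 0.5, 'end': 0.5},
               'sleep': {'end': 0.5, 'class3': 0.5},
               'facebook': {}, 'quit': {}, 'pub': {}},
    'class3': {'study': {'end': 0.6, 'class1': 0.08, 'class2': 0.16, 'class3': 0.16},
               'pub': {'end': 0.4, 'class1': 0.12, 'class2': 0.24, 'class3': 0.24},
               'facebook': {}, 'quit': {}, 'sleep': {}},
    'end': {}
}
rewards = {'class1': 4, 'class2': 6, 'class3': 10, 'leisure': -1, 'end': 0}

def value_iteration_dicts(t, rewards, gamma=0.9, epsilon=1e-4):
    """Repeat the Bellman backup until the largest change is small enough."""
    U = {s: 0.0 for s in t}
    while True:
        U_new, delta = {}, 0.0
        for s in t:
            # An empty outcome dict (unavailable action) contributes an empty
            # sum (0); the terminal state has no actions, so only its reward
            # remains via max(..., default=0.0).
            q = [sum(p * U[s1] for s1, p in outcomes.items())
                 for outcomes in t[s].values()]
            U_new[s] = rewards[s] + gamma * max(q, default=0.0)
            delta = max(delta, abs(U_new[s] - U[s]))
        U = U_new
        if delta < epsilon * (1 - gamma) / gamma:
            return U

U = value_iteration_dicts(t, rewards)
print({s: round(u, 3) for s, u in sorted(U.items())})
```

The termination test `delta < epsilon * (1 - gamma) / gamma` is the standard bound guaranteeing the returned utilities are within `epsilon` of the true ones.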
+ ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{'class1': 16.90340650279542,\n", + " 'class2': 14.597383430869879,\n", + " 'class3': 19.10533144728953,\n", + " 'end': 0.0,\n", + " 'leisure': 13.946891353066082}" + ] + }, + "execution_count": 8, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "value_iteration(mdp)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now that we can compute the utility values, we can find the best policy." + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "pi = best_policy(mdp, value_iteration(mdp, .01))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "`pi` stores the best action for each state." + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "{'class2': 'sleep', 'class3': 'pub', 'end': None, 'class1': 'study', 'leisure': 'quit'}\n" + ] + } + ], + "source": [ + "print(pi)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can confirm that this is the best policy by verifying this result against `policy_iteration`." + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{'class1': 'study',\n", + " 'class2': 'sleep',\n", + " 'class3': 'pub',\n", + " 'end': None,\n", + " 'leisure': 'quit'}" + ] + }, + "execution_count": 11, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "policy_iteration(mdp)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "Everything looks perfect, but let us look at another possibility for an MDP.\n", + "
    \n",
+    "So far, we have only dealt with rewards that the agent earns while it is **on** a particular state.\n",
+    "What if we want the reward for a state to depend on the action that the agent takes next?\n",
+    "In that case, the agent earns the reward _during its transition_ to the next state.\n",
+    "
    \n",
+    "For the sake of clarity, we will call this reward the _transition reward_, and we will call this kind of MDP a _dynamic_ MDP. \n",
+    "This is not a conventional term; we use it here only to keep the two kinds of MDP distinct.\n",
+    "
    \n",
+    "The next section deals with how to create and solve a dynamic MDP."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### State and action dependent reward function\n",
+    "Let us consider a very similar problem, but this time we do not have rewards _on_ states; \n",
+    "instead, we have rewards on the transitions between states. \n",
+    "This state diagram will make it clearer.\n",
+    "![title](images/mdp-c.png)\n",
+    "\n",
+    "The scenario is very similar to the previous problem, but the same state now yields a different reward depending on the action taken.\n",
+    "
    \n", + "To deal with this, we just need to change the `R` method of the `MDP` class, but to prevent confusion, we will write a new similar class `DMDP`." + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "class DMDP:\n", + "\n", + " \"\"\"A Markov Decision Process, defined by an initial state, transition model,\n", + " and reward model. We also keep track of a gamma value, for use by\n", + " algorithms. The transition model is represented somewhat differently from\n", + " the text. Instead of P(s' | s, a) being a probability number for each\n", + " state/state/action triplet, we instead have T(s, a) return a\n", + " list of (p, s') pairs. The reward function is very similar.\n", + " We also keep track of the possible states,\n", + " terminal states, and actions for each state.\"\"\"\n", + "\n", + " def __init__(self, init, actlist, terminals, transitions={}, rewards={}, states=None, gamma=.9):\n", + " if not (0 < gamma <= 1):\n", + " raise ValueError(\"An MDP must have 0 < gamma <= 1\")\n", + "\n", + " if states:\n", + " self.states = states\n", + " else:\n", + " self.states = set()\n", + " self.init = init\n", + " self.actlist = actlist\n", + " self.terminals = terminals\n", + " self.transitions = transitions\n", + " self.rewards = rewards\n", + " self.gamma = gamma\n", + "\n", + " def R(self, state, action):\n", + " \"\"\"Return a numeric reward for this state and this action.\"\"\"\n", + " if (self.rewards == {}):\n", + " raise ValueError('Reward model is missing')\n", + " else:\n", + " return self.rewards[state][action]\n", + "\n", + " def T(self, state, action):\n", + " \"\"\"Transition model. 
From a state and an action, return a list\n", + " of (probability, result-state) pairs.\"\"\"\n", + " if(self.transitions == {}):\n", + " raise ValueError(\"Transition model is missing\")\n", + " else:\n", + " return self.transitions[state][action]\n", + "\n", + " def actions(self, state):\n", + " \"\"\"Set of actions that can be performed in this state. By default, a\n", + " fixed list of actions, except for terminal states. Override this\n", + " method if you need to specialize by state.\"\"\"\n", + " if state in self.terminals:\n", + " return [None]\n", + " else:\n", + " return self.actlist" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The transition model will be the same" + ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "t = {\n", + " 'leisure': {\n", + " 'facebook': {'leisure':0.9, 'class1':0.1},\n", + " 'quit': {'leisure':0.1, 'class1':0.9},\n", + " 'study': {},\n", + " 'sleep': {},\n", + " 'pub': {}\n", + " },\n", + " 'class1': {\n", + " 'study': {'class2':0.6, 'leisure':0.4},\n", + " 'facebook': {'class2':0.4, 'leisure':0.6},\n", + " 'quit': {},\n", + " 'sleep': {},\n", + " 'pub': {}\n", + " },\n", + " 'class2': {\n", + " 'study': {'class3':0.5, 'end':0.5},\n", + " 'sleep': {'end':0.5, 'class3':0.5},\n", + " 'facebook': {},\n", + " 'quit': {},\n", + " 'pub': {},\n", + " },\n", + " 'class3': {\n", + " 'study': {'end':0.6, 'class1':0.08, 'class2':0.16, 'class3':0.16},\n", + " 'pub': {'end':0.4, 'class1':0.12, 'class2':0.24, 'class3':0.24},\n", + " 'facebook': {},\n", + " 'quit': {},\n", + " 'sleep': {}\n", + " },\n", + " 'end': {}\n", + "}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The reward model will be a dictionary very similar to the transition dictionary with a reward for every action for every state." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "r = {\n", + " 'leisure': {\n", + " 'facebook':-1,\n", + " 'quit':0,\n", + " 'study':0,\n", + " 'sleep':0,\n", + " 'pub':0\n", + " },\n", + " 'class1': {\n", + " 'study':-2,\n", + " 'facebook':-1,\n", + " 'quit':0,\n", + " 'sleep':0,\n", + " 'pub':0\n", + " },\n", + " 'class2': {\n", + " 'study':-2,\n", + " 'sleep':0,\n", + " 'facebook':0,\n", + " 'quit':0,\n", + " 'pub':0\n", + " },\n", + " 'class3': {\n", + " 'study':10,\n", + " 'pub':1,\n", + " 'facebook':0,\n", + " 'quit':0,\n", + " 'sleep':0\n", + " },\n", + " 'end': {\n", + " 'study':0,\n", + " 'pub':0,\n", + " 'facebook':0,\n", + " 'quit':0,\n", + " 'sleep':0\n", + " }\n", + "}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The MDP has only one terminal state" + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "terminals = ['end']" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's now set the initial state to Class 1." + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "init = 'class1'" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We will write a CustomDMDP class to extend the DMDP class for the problem at hand.\n", + "This class will implement everything that the previous CustomMDP class implements along with a new reward model." 
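As a quick numeric illustration of the action-dependent backup, this hypothetical snippet evaluates q(s, a) = R(s, a) + γ Σ P(s'|s,a) U(s') for the single state `class3`, plugging in the relevant entries of the `t` and `r` dictionaries above together with the (rounded) utilities that `value_iteration_dmdp` computes further below:

```python
# Hypothetical one-step check of the action-dependent Bellman backup
#     q(s, a) = R(s, a) + gamma * sum_s' P(s'|s,a) * U(s')
# for state 'class3' only. U holds the converged utilities reported later
# in this notebook, rounded to 4 decimal places.
gamma = 0.9
U = {'class1': 2.0757, 'class2': 5.7726, 'class3': 12.8279,
     'leisure': 1.8475, 'end': 0.0}

t_class3 = {'study': {'end': 0.6, 'class1': 0.08, 'class2': 0.16, 'class3': 0.16},
            'pub': {'end': 0.4, 'class1': 0.12, 'class2': 0.24, 'class3': 0.24}}
r_class3 = {'study': 10, 'pub': 1}

q = {a: r_class3[a] + gamma * sum(p * U[s1] for s1, p in t_class3[a].items())
     for a in t_class3}
print({a: round(v, 3) for a, v in q.items()})
```

Because `'study'` attains the max and U is (approximately) a fixed point, `q['study']` reproduces `U['class3']`, which is why the best policy picks `'study'` in `class3`.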
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 17,
+   "metadata": {
+    "collapsed": true
+   },
+   "outputs": [],
+   "source": [
+    "class CustomDMDP(DMDP):\n",
+    "    \n",
+    "    def __init__(self, transition_matrix, rewards, terminals, init, gamma=.9):\n",
+    "        actlist = []\n",
+    "        for state in transition_matrix.keys():\n",
+    "            actlist.extend(transition_matrix[state])\n",
+    "        actlist = list(set(actlist))\n",
+    "        print(actlist)\n",
+    "        \n",
+    "        DMDP.__init__(self, init, actlist, terminals=terminals, gamma=gamma)\n",
+    "        self.t = transition_matrix\n",
+    "        self.rewards = rewards\n",
+    "        for state in self.t:\n",
+    "            self.states.add(state)\n",
+    "    \n",
+    "    def T(self, state, action):\n",
+    "        if action is None:\n",
+    "            return [(0.0, state)]\n",
+    "        else:\n",
+    "            return [(prob, new_state) for new_state, prob in self.t[state][action].items()]\n",
+    "    \n",
+    "    def R(self, state, action):\n",
+    "        if action is None:\n",
+    "            return 0\n",
+    "        else:\n",
+    "            return self.rewards[state][action]"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "One thing we haven't addressed yet is that the existing `value_iteration` algorithm won't work now that the reward model has changed.\n",
+    "The algorithm we need is nonetheless quite similar to the one we already have."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "The Bellman update equation is now defined as follows\n",
+    "\n",
+    "$$U(s)=\\max_{a\\in A(s)}\\bigg[R(s, a) + \\gamma\\sum_{s'}P(s'\\ |\\ s,a)U(s')\\bigg]$$\n",
+    "\n",
+    "It is not difficult to see that the update equation we have been using until now is just a special case of this more general equation. \n",
+    "We also need to take the max over the reward function now, as the reward is action dependent as well.\n",
+    "
    \n", + "We will use this to write a function to carry out value iteration, very similar to the one we are familiar with." + ] + }, + { + "cell_type": "code", + "execution_count": 18, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def value_iteration_dmdp(dmdp, epsilon=0.001):\n", + " U1 = {s: 0 for s in dmdp.states}\n", + " R, T, gamma = dmdp.R, dmdp.T, dmdp.gamma\n", + " while True:\n", + " U = U1.copy()\n", + " delta = 0\n", + " for s in dmdp.states:\n", + " U1[s] = max([(R(s, a) + gamma*sum([(p*U[s1]) for (p, s1) in T(s, a)])) for a in dmdp.actions(s)])\n", + " delta = max(delta, abs(U1[s] - U[s]))\n", + " if delta < epsilon * (1 - gamma) / gamma:\n", + " return U" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We're all set.\n", + "Let's instantiate our class." + ] + }, + { + "cell_type": "code", + "execution_count": 19, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "['quit', 'sleep', 'study', 'pub', 'facebook']\n" + ] + } + ], + "source": [ + "dmdp = CustomDMDP(t, r, terminals, init, gamma=.9)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Calculate utility values by calling `value_iteration_dmdp`." + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{'class1': 2.0756895004431364,\n", + " 'class2': 5.772550326127298,\n", + " 'class3': 12.827904448229472,\n", + " 'end': 0.0,\n", + " 'leisure': 1.8474896554396596}" + ] + }, + "execution_count": 20, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "value_iteration_dmdp(dmdp)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "These are the expected utility values for our new MDP.\n", + "
    \n", + "As you might have guessed, we cannot use the old `best_policy` function to find the best policy.\n", + "So we will write our own.\n", + "But, before that we need a helper function to calculate the expected utility value given a state and an action." + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def expected_utility_dmdp(a, s, U, dmdp):\n", + " return dmdp.R(s, a) + dmdp.gamma*sum([(p*U[s1]) for (p, s1) in dmdp.T(s, a)])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now we write our modified `best_policy` function." + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "from utils import argmax\n", + "def best_policy_dmdp(dmdp, U):\n", + " pi = {}\n", + " for s in dmdp.states:\n", + " pi[s] = argmax(dmdp.actions(s), key=lambda a: expected_utility_dmdp(a, s, U, dmdp))\n", + " return pi" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Find the best policy." + ] + }, + { + "cell_type": "code", + "execution_count": 23, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "{'class2': 'sleep', 'class3': 'study', 'end': None, 'class1': 'facebook', 'leisure': 'quit'}\n" + ] + } + ], + "source": [ + "pi = best_policy_dmdp(dmdp, value_iteration_dmdp(dmdp, .01))\n", + "print(pi)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "From this, we can infer that `value_iteration_dmdp` tries to minimize the negative reward. \n", + "Since we don't have rewards for states now, the algorithm takes the action that would try to avoid getting negative rewards and take the lesser of two evils if all rewards are negative.\n", + "You might also want to have state rewards alongside transition rewards. 
\n", + "Perhaps you can do that yourself now that the difficult part has been done.\n", + "
    " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### State, action and next-state dependent reward function\n", + "\n", + "For truly stochastic environments, \n", + "we have noticed that taking an action from a particular state doesn't always do what we want it to. \n", + "Instead, for every action taken from a particular state, \n", + "it might be possible to reach a different state each time depending on the transition probabilities. \n", + "What if we want different rewards for each state, action and next-state triplet? \n", + "Mathematically, we now want a reward function of the form R(s, a, s') for our MDP. \n", + "This section shows how we can tweak the MDP class to achieve this.\n", + "
    \n", + "\n", + "Let's now take a different problem statement. \n", + "The one we are working with is a bit too simple.\n", + "Consider a taxi that serves three adjacent towns A, B, and C.\n", + "Each time the taxi discharges a passenger, the driver must choose from three possible actions:\n", + "1. Cruise the streets looking for a passenger.\n", + "2. Go to the nearest taxi stand.\n", + "3. Wait for a radio call from the dispatcher with instructions.\n", + "
    \n",
+    "These choices are subject to the constraint that the taxi driver cannot take the third action in town B, because of the distance and poor radio reception there.\n",
+    "\n",
+    "Let's model our MDP.\n",
+    "
    \n", + "The MDP has three states, namely A, B and C.\n", + "
    \n", + "It has three actions, namely 1, 2 and 3.\n", + "
    \n", + "Action sets:\n", + "
    \n", + "$K_{a}$ = {1, 2, 3}\n", + "
    \n", + "$K_{b}$ = {1, 2}\n", + "
    \n", + "$K_{c}$ = {1, 2, 3}\n", + "
    \n", + "\n", + "We have the following transition probability matrices:\n", + "
    \n", + "
    \n", + "Action 1: Cruising streets \n", + "
    \n", + "$\\\\\n", + " P^{1} = \n", + " \\left[ {\\begin{array}{ccc}\n", + " \\frac{1}{2} & \\frac{1}{4} & \\frac{1}{4} \\\\\n", + " \\frac{1}{2} & 0 & \\frac{1}{2} \\\\\n", + " \\frac{1}{4} & \\frac{1}{4} & \\frac{1}{2} \\\\\n", + " \\end{array}}\\right] \\\\\n", + " \\\\\n", + " $\n", + "
    \n", + "
    \n", + "Action 2: Waiting at the taxi stand \n", + "
    \n", + "$\\\\\n", + " P^{2} = \n", + " \\left[ {\\begin{array}{ccc}\n", + " \\frac{1}{16} & \\frac{3}{4} & \\frac{3}{16} \\\\\n", + " \\frac{1}{16} & \\frac{7}{8} & \\frac{1}{16} \\\\\n", + " \\frac{1}{8} & \\frac{3}{4} & \\frac{1}{8} \\\\\n", + " \\end{array}}\\right] \\\\\n", + " \\\\\n", + " $\n", + "
    \n", + "
    \n", + "Action 3: Waiting for dispatch \n", + "
    \n", + "$\\\\\n", + " P^{3} =\n", + " \\left[ {\\begin{array}{ccc}\n", + " \\frac{1}{4} & \\frac{1}{8} & \\frac{5}{8} \\\\\n", + " 0 & 1 & 0 \\\\\n", + " \\frac{3}{4} & \\frac{1}{16} & \\frac{3}{16} \\\\\n", + " \\end{array}}\\right] \\\\\n", + " \\\\\n", + " $\n", + "
    \n", + "
    \n", + "For the sake of readability, we will call the states A, B and C and the actions 'cruise', 'stand' and 'dispatch'.\n", + "We will now build the transition model as a dictionary using these matrices." + ] + }, + { + "cell_type": "code", + "execution_count": 24, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "t = {\n", + " 'A': {\n", + " 'cruise': {'A':0.5, 'B':0.25, 'C':0.25},\n", + " 'stand': {'A':0.0625, 'B':0.75, 'C':0.1875},\n", + " 'dispatch': {'A':0.25, 'B':0.125, 'C':0.625}\n", + " },\n", + " 'B': {\n", + " 'cruise': {'A':0.5, 'B':0, 'C':0.5},\n", + " 'stand': {'A':0.0625, 'B':0.875, 'C':0.0625},\n", + " 'dispatch': {'A':0, 'B':1, 'C':0}\n", + " },\n", + " 'C': {\n", + " 'cruise': {'A':0.25, 'B':0.25, 'C':0.5},\n", + " 'stand': {'A':0.125, 'B':0.75, 'C':0.125},\n", + " 'dispatch': {'A':0.75, 'B':0.0625, 'C':0.1875}\n", + " }\n", + "}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The reward matrices for the problem are as follows:\n", + "
    \n", + "
    \n", + "Action 1: Cruising streets \n", + "
    \n", + "$\\\\\n", + " R^{1} = \n", + " \\left[ {\\begin{array}{ccc}\n", + " 10 & 4 & 8 \\\\\n", + " 14 & 0 & 18 \\\\\n", + " 10 & 2 & 8 \\\\\n", + " \\end{array}}\\right] \\\\\n", + " \\\\\n", + " $\n", + "
    \n", + "
    \n", + "Action 2: Waiting at the taxi stand \n", + "
    \n", + "$\\\\\n", + " R^{2} = \n", + " \\left[ {\\begin{array}{ccc}\n", + " 8 & 2 & 4 \\\\\n", + " 8 & 16 & 8 \\\\\n", + " 6 & 4 & 2\\\\\n", + " \\end{array}}\\right] \\\\\n", + " \\\\\n", + " $\n", + "
    \n", + "
    \n", + "Action 3: Waiting for dispatch \n", + "
    \n", + "$\\\\\n", + " R^{3} = \n", + " \\left[ {\\begin{array}{ccc}\n", + " 4 & 6 & 4 \\\\\n", + " 0 & 0 & 0 \\\\\n", + " 4 & 0 & 8\\\\\n", + " \\end{array}}\\right] \\\\\n", + " \\\\\n", + " $\n", + "
    \n", + "
    \n",
+    "We now build the reward model as a dictionary using these matrices."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 25,
+   "metadata": {
+    "collapsed": true
+   },
+   "outputs": [],
+   "source": [
+    "r = {\n",
+    "    'A': {\n",
+    "        'cruise': {'A':10, 'B':4, 'C':8},\n",
+    "        'stand': {'A':8, 'B':2, 'C':4},\n",
+    "        'dispatch': {'A':4, 'B':6, 'C':4}\n",
+    "    },\n",
+    "    'B': {\n",
+    "        'cruise': {'A':14, 'B':0, 'C':18},\n",
+    "        'stand': {'A':8, 'B':16, 'C':8},\n",
+    "        'dispatch': {'A':0, 'B':0, 'C':0}\n",
+    "    },\n",
+    "    'C': {\n",
+    "        'cruise': {'A':10, 'B':2, 'C':18},\n",
+    "        'stand': {'A':6, 'B':4, 'C':2},\n",
+    "        'dispatch': {'A':4, 'B':0, 'C':8}\n",
+    "    }\n",
+    "}"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {
+    "collapsed": true
+   },
+   "source": [
+    "The Bellman update equation is now defined as follows\n",
+    "\n",
+    "$$U(s)=\\max_{a\\in A(s)}\\sum_{s'}P(s'\\ |\\ s,a)\\big(R(s, a, s') + \\gamma U(s')\\big)$$\n",
+    "\n",
+    "It is not difficult to see that all the update equations we have used until now are just special cases of this more general equation. \n",
+    "If we did not have next-state-dependent rewards, the first term inside the summation would sum up exactly to R(s, a), the reward for taking a particular action in a state, and we would get the update equation used in the previous problem.\n",
+    "If we did not have action-dependent rewards either, the first term inside the summation would sum up to R(s), the state reward, and we would get the first update equation used in `mdp.ipynb`.\n",
+    "
    \n",
+    "For example, suppose a state yields the same reward of **r** units regardless of the action taken or the state reached, and suppose some action taken in that state can lead to 4 possible next states with transition probabilities 0.1, 0.2, 0.3 and 0.4.\n",
+    "Since the probabilities of the possible next states always sum to 1, the first term inside the summation evaluates to (0.1 + 0.2 + 0.3 + 0.4)r = r, which is exactly R(s) in the first update equation.\n",
+    "
    \n",
+    "There are many ways to write value iteration for this situation, but we will go with the most intuitive method: \n",
+    "one that can be implemented with minor alterations to the existing `value_iteration` algorithm.\n",
+    "
    \n", + "Our `DMDP` class will be slightly different.\n", + "More specifically, the `R` method will have one more index to go through now that we have three levels of nesting in the reward model.\n", + "We will call the new class `DMDP2` as I have run out of creative names." + ] + }, + { + "cell_type": "code", + "execution_count": 26, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "class DMDP2:\n", + "\n", + " \"\"\"A Markov Decision Process, defined by an initial state, transition model,\n", + " and reward model. We also keep track of a gamma value, for use by\n", + " algorithms. The transition model is represented somewhat differently from\n", + " the text. Instead of P(s' | s, a) being a probability number for each\n", + " state/state/action triplet, we instead have T(s, a) return a\n", + " list of (p, s') pairs. The reward function is very similar.\n", + " We also keep track of the possible states,\n", + " terminal states, and actions for each state.\"\"\"\n", + "\n", + " def __init__(self, init, actlist, terminals, transitions={}, rewards={}, states=None, gamma=.9):\n", + " if not (0 < gamma <= 1):\n", + " raise ValueError(\"An MDP must have 0 < gamma <= 1\")\n", + "\n", + " if states:\n", + " self.states = states\n", + " else:\n", + " self.states = set()\n", + " self.init = init\n", + " self.actlist = actlist\n", + " self.terminals = terminals\n", + " self.transitions = transitions\n", + " self.rewards = rewards\n", + " self.gamma = gamma\n", + "\n", + " def R(self, state, action, state_):\n", + " \"\"\"Return a numeric reward for this state, this action and the next state_\"\"\"\n", + " if (self.rewards == {}):\n", + " raise ValueError('Reward model is missing')\n", + " else:\n", + " return self.rewards[state][action][state_]\n", + "\n", + " def T(self, state, action):\n", + " \"\"\"Transition model. 
From a state and an action, return a list\n",
+    "        of (probability, result-state) pairs.\"\"\"\n",
+    "        if(self.transitions == {}):\n",
+    "            raise ValueError(\"Transition model is missing\")\n",
+    "        else:\n",
+    "            return self.transitions[state][action]\n",
+    "\n",
+    "    def actions(self, state):\n",
+    "        \"\"\"Set of actions that can be performed in this state. By default, a\n",
+    "        fixed list of actions, except for terminal states. Override this\n",
+    "        method if you need to specialize by state.\"\"\"\n",
+    "        if state in self.terminals:\n",
+    "            return [None]\n",
+    "        else:\n",
+    "            return self.actlist"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Only the `R` method is different from the previous `DMDP` class.\n",
+    "
    \n",
+    "As before, we will write a custom subclass that implements the transition model and the reward model.\n",
+    "
    \n", + "We call this class `CustomDMDP2`." + ] + }, + { + "cell_type": "code", + "execution_count": 27, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "class CustomDMDP2(DMDP2):\n", + " \n", + " def __init__(self, transition_matrix, rewards, terminals, init, gamma=.9):\n", + " actlist = []\n", + " for state in transition_matrix.keys():\n", + " actlist.extend(transition_matrix[state])\n", + " actlist = list(set(actlist))\n", + " print(actlist)\n", + " \n", + " DMDP2.__init__(self, init, actlist, terminals=terminals, gamma=gamma)\n", + " self.t = transition_matrix\n", + " self.rewards = rewards\n", + " for state in self.t:\n", + " self.states.add(state)\n", + " \n", + " def T(self, state, action):\n", + " if action is None:\n", + " return [(0.0, state)]\n", + " else:\n", + " return [(prob, new_state) for new_state, prob in self.t[state][action].items()]\n", + " \n", + " def R(self, state, action, state_):\n", + " if action is None:\n", + " return 0\n", + " else:\n", + " return self.rewards[state][action][state_]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can finally write value iteration for this problem.\n", + "The latest update equation will be used." 
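Before writing it, here is a hypothetical standalone sketch of the collapse discussed earlier: for fixed s and a, Σ over s' of P(s'|s,a)·R(s,a,s') reduces the three-index reward model to an expected immediate reward R(s, a). The `t` and `r` dictionaries above are reproduced inline so the sketch runs on its own:

```python
# Standalone sketch: collapse R(s, a, s') into expected immediate rewards
#     q0(s, a) = sum_s' P(s'|s,a) * R(s, a, s')
# for the taxi MDP defined above.
t = {
    'A': {'cruise': {'A': 0.5, 'B': 0.25, 'C': 0.25},
          'stand': {'A': 0.0625, 'B': 0.75, 'C': 0.1875},
          'dispatch': {'A': 0.25, 'B': 0.125, 'C': 0.625}},
    'B': {'cruise': {'A': 0.5, 'B': 0, 'C': 0.5},
          'stand': {'A': 0.0625, 'B': 0.875, 'C': 0.0625},
          'dispatch': {'A': 0, 'B': 1, 'C': 0}},
    'C': {'cruise': {'A': 0.25, 'B': 0.25, 'C': 0.5},
          'stand': {'A': 0.125, 'B': 0.75, 'C': 0.125},
          'dispatch': {'A': 0.75, 'B': 0.0625, 'C': 0.1875}},
}
r = {
    'A': {'cruise': {'A': 10, 'B': 4, 'C': 8},
          'stand': {'A': 8, 'B': 2, 'C': 4},
          'dispatch': {'A': 4, 'B': 6, 'C': 4}},
    'B': {'cruise': {'A': 14, 'B': 0, 'C': 18},
          'stand': {'A': 8, 'B': 16, 'C': 8},
          'dispatch': {'A': 0, 'B': 0, 'C': 0}},
    'C': {'cruise': {'A': 10, 'B': 2, 'C': 18},
          'stand': {'A': 6, 'B': 4, 'C': 2},
          'dispatch': {'A': 4, 'B': 0, 'C': 8}},
}

# Expected one-step reward for every (state, action) pair.
expected_reward = {s: {a: sum(p * r[s][a][s1] for s1, p in t[s][a].items())
                       for a in t[s]}
                   for s in t}
print(expected_reward)
```

With these expected rewards in hand, the R(s, a, s') update is exactly the R(s, a) update of the previous section, which is why only minor alterations to value iteration are needed.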
+ ] + }, + { + "cell_type": "code", + "execution_count": 28, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def value_iteration_taxi_mdp(dmdp2, epsilon=0.001):\n", + " U1 = {s: 0 for s in dmdp2.states}\n", + " R, T, gamma = dmdp2.R, dmdp2.T, dmdp2.gamma\n", + " while True:\n", + " U = U1.copy()\n", + " delta = 0\n", + " for s in dmdp2.states:\n", + " U1[s] = max([sum([(p*(R(s, a, s1) + gamma*U[s1])) for (p, s1) in T(s, a)]) for a in dmdp2.actions(s)])\n", + " delta = max(delta, abs(U1[s] - U[s]))\n", + " if delta < epsilon * (1 - gamma) / gamma:\n", + " return U" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "These algorithms can be made more pythonic by using cleverer list comprehensions.\n", + "We can also write the variants of value iteration in such a way that all problems are solved using the same base class, regardless of the reward function and the number of arguments it takes.\n", + "Quite a few things can be done to refactor the code and reduce repetition, but we have done it this way for the sake of clarity.\n", + "Perhaps you can try this as an exercise.\n", + "
    \n", + "We now need to define terminals and initial state." + ] + }, + { + "cell_type": "code", + "execution_count": 29, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "terminals = ['end']\n", + "init = 'A'" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's instantiate our class." + ] + }, + { + "cell_type": "code", + "execution_count": 30, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "['stand', 'dispatch', 'cruise']\n" + ] + } + ], + "source": [ + "dmdp2 = CustomDMDP2(t, r, terminals, init, gamma=.9)" + ] + }, + { + "cell_type": "code", + "execution_count": 31, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{'A': 124.4881543573768, 'B': 137.70885410461636, 'C': 129.08041190693115}" + ] + }, + "execution_count": 31, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "value_iteration_taxi_mdp(dmdp2)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "These are the expected utility values for the states of our MDP.\n", + "Let's proceed to write a helper function to find the expected utility and another to find the best policy." + ] + }, + { + "cell_type": "code", + "execution_count": 32, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def expected_utility_dmdp2(a, s, U, dmdp2):\n", + " return sum([(p*(dmdp2.R(s, a, s1) + dmdp2.gamma*U[s1])) for (p, s1) in dmdp2.T(s, a)])" + ] + }, + { + "cell_type": "code", + "execution_count": 33, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "from utils import argmax\n", + "def best_policy_dmdp2(dmdp2, U):\n", + " pi = {}\n", + " for s in dmdp2.states:\n", + " pi[s] = argmax(dmdp2.actions(s), key=lambda a: expected_utility_dmdp2(a, s, U, dmdp2))\n", + " return pi" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Find the best policy." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 34, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "{'C': 'cruise', 'A': 'stand', 'B': 'stand'}\n" + ] + } + ], + "source": [ + "pi = best_policy_dmdp2(dmdp2, value_iteration_taxi_mdp(dmdp2, .01))\n", + "print(pi)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We have successfully adapted the existing code to a different scenario yet again.\n", + "The takeaway from this section is that many reinforcement learning problems can be formulated as MDPs and solved for the best policy with simple yet efficient tools." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## GRID MDP\n", + "---\n", + "### Pathfinding Problem\n", + "Markov Decision Processes can be used to find the best path through a maze. Let us consider this simple maze.\n", + "![title](images/maze.png)\n", + "\n", + "This environment can be formulated as a `GridMDP`.\n", + "
\n", + "To make the grid matrix, we will set the state reward to -0.1 for every ordinary state.\n", + "
\n", + "State (1, 1) will have a reward of -5 to signify that this state is to be avoided.\n", + "
    \n", + "State (9, 9) will have a reward of +5.\n", + "This will be the terminal state.\n", + "
    \n", + "The matrix can be generated using the GridMDP editor or we can write it ourselves." + ] + }, + { + "cell_type": "code", + "execution_count": 35, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "grid = [\n", + " [None, None, None, None, None, None, None, None, None, None, None], \n", + " [None, -0.1, -0.1, -0.1, -0.1, -0.1, -0.1, -0.1, None, +5.0, None], \n", + " [None, -0.1, None, None, None, None, None, None, None, -0.1, None], \n", + " [None, -0.1, -0.1, -0.1, -0.1, -0.1, -0.1, -0.1, -0.1, -0.1, None], \n", + " [None, -0.1, None, None, None, None, None, None, None, None, None], \n", + " [None, -0.1, None, -0.1, -0.1, -0.1, -0.1, -0.1, -0.1, -0.1, None], \n", + " [None, -0.1, None, None, None, None, None, -0.1, None, -0.1, None], \n", + " [None, -0.1, -0.1, -0.1, -0.1, -0.1, -0.1, -0.1, None, -0.1, None], \n", + " [None, None, None, None, None, -0.1, None, -0.1, None, -0.1, None], \n", + " [None, -5.0, -0.1, -0.1, -0.1, -0.1, None, -0.1, None, -0.1, None], \n", + " [None, None, None, None, None, None, None, None, None, None, None]\n", + "]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We have only one terminal state, (9, 9)" + ] + }, + { + "cell_type": "code", + "execution_count": 36, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "terminals = [(9, 9)]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We define our maze environment below" + ] + }, + { + "cell_type": "code", + "execution_count": 37, + "metadata": {}, + "outputs": [], + "source": [ + "maze = GridMDP(grid, terminals)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To solve the maze, we can use the `best_policy` function along with `value_iteration`." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 38, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "pi = best_policy(maze, value_iteration(maze))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This is the heatmap generated by the GridMDP editor using `value_iteration` on this environment:\n", + "
    \n", + "![title](images/mdp-d.png)\n", + "
\n", + "Let's print out the best policy." + ] + }, + { + "cell_type": "code", + "execution_count": 39, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "None None None None None None None None None None None\n", + "None v < < < < < < None . None\n", + "None v None None None None None None None ^ None\n", + "None > > > > > > > > ^ None\n", + "None ^ None None None None None None None None None\n", + "None ^ None > > > > v < < None\n", + "None ^ None None None None None v None ^ None\n", + "None ^ < < < < < < None ^ None\n", + "None None None None None ^ None ^ None ^ None\n", + "None > > > > ^ None ^ None ^ None\n", + "None None None None None None None None None None None\n" + ] + } + ], + "source": [ + "from utils import print_table\n", + "print_table(maze.to_arrows(pi))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As you can infer, we can find the path to the terminal state starting from any given state using this policy.\n", + "All maze problems can be solved by formulating them as MDPs." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## POMDP\n", + "### Two-state POMDP\n", + "Let's consider a problem where we have two doors, one to our left and one to our right.\n", + "One of these doors opens to a room with a tiger in it, and the other one opens to an empty hall.\n", + "
\n", + "We will call our two states `0` and `1`, corresponding to the tiger being behind the `left` and the `right` door respectively.\n", + "
    \n", + "The possible actions we can take are as follows:\n", + "
    \n", + "1. __Open-left__: Open the left door.\n", + "Represented by `0`.\n", + "2. __Open-right__: Open the right door.\n", + "Represented by `1`.\n", + "3. __Listen__: Listen carefully to one side and possibly hear the tiger breathing.\n", + "Represented by `2`.\n", + "\n", + "
    \n", + "The possible observations we can get are as follows:\n", + "
    \n", + "1. __TL__: Tiger seems to be at the left door.\n", + "2. __TR__: Tiger seems to be at the right door.\n", + "\n", + "
    \n", + "The reward function is as follows:\n", + "
    \n", + "We get +10 reward for opening the door to the empty hall and we get -100 reward for opening the other door and setting the tiger free.\n", + "
\n", + "Listening incurs a reward of -1.\n", + "
    \n", + "We want to minimize our chances of setting the tiger free.\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Our transition probabilities can be defined as:\n", + "
    \n", + "
    \n", + "Action `0` (Open left door)\n", + "$\\\\\n", + " P(0) = \n", + " \\left[ {\\begin{array}{cc}\n", + " 0.5 & 0.5 \\\\\n", + " 0.5 & 0.5 \\\\\n", + " \\end{array}}\\right] \\\\\n", + " \\\\\n", + " $\n", + " \n", + "Action `1` (Open right door)\n", + "$\\\\\n", + " P(1) = \n", + " \\left[ {\\begin{array}{cc}\n", + " 0.5 & 0.5 \\\\\n", + " 0.5 & 0.5 \\\\\n", + " \\end{array}}\\right] \\\\\n", + " \\\\\n", + " $\n", + " \n", + "Action `2` (Listen)\n", + "$\\\\\n", + " P(2) = \n", + " \\left[ {\\begin{array}{cc}\n", + " 1.0 & 0.0 \\\\\n", + " 0.0 & 1.0 \\\\\n", + " \\end{array}}\\right] \\\\\n", + " \\\\\n", + " $\n", + " \n", + "
    \n", + "
    \n", + "Our observation probabilities can be defined as:\n", + "
    \n", + "
    \n", + "$\\\\\n", + " O(0) = \n", + " \\left[ {\\begin{array}{ccc}\n", + " Open left & TL & TR \\\\\n", + " Tiger: left & 0.5 & 0.5 \\\\\n", + " Tiger: right & 0.5 & 0.5 \\\\\n", + " \\end{array}}\\right] \\\\\n", + " \\\\\n", + " $\n", + "\n", + "$\\\\\n", + " O(1) = \n", + " \\left[ {\\begin{array}{ccc}\n", + " Open right & TL & TR \\\\\n", + " Tiger: left & 0.5 & 0.5 \\\\\n", + " Tiger: right & 0.5 & 0.5 \\\\\n", + " \\end{array}}\\right] \\\\\n", + " \\\\\n", + " $\n", + "\n", + "$\\\\\n", + " O(2) = \n", + " \\left[ {\\begin{array}{ccc}\n", + " Listen & TL & TR \\\\\n", + " Tiger: left & 0.85 & 0.15 \\\\\n", + " Tiger: right & 0.15 & 0.85 \\\\\n", + " \\end{array}}\\right] \\\\\n", + " \\\\\n", + " $\n", + "\n", + "
    \n", + "
    \n", + "The rewards of this POMDP are defined as:\n", + "
    \n", + "
    \n", + "$\\\\\n", + " R(0) = \n", + " \\left[ {\\begin{array}{cc}\n", + " Openleft & Reward \\\\\n", + " Tiger: left & -100 \\\\\n", + " Tiger: right & +10 \\\\\n", + " \\end{array}}\\right] \\\\\n", + " \\\\\n", + " $\n", + " \n", + "$\\\\\n", + " R(1) = \n", + " \\left[ {\\begin{array}{cc}\n", + " Openright & Reward \\\\\n", + " Tiger: left & +10 \\\\\n", + " Tiger: right & -100 \\\\\n", + " \\end{array}}\\right] \\\\\n", + " \\\\\n", + " $\n", + " \n", + "$\\\\\n", + " R(2) = \n", + " \\left[ {\\begin{array}{cc}\n", + " Listen & Reward \\\\\n", + " Tiger: left & -1 \\\\\n", + " Tiger: right & -1 \\\\\n", + " \\end{array}}\\right] \\\\\n", + " \\\\\n", + " $\n", + " \n", + "
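Before initializing these variables, it may help to see how the `Listen` observation matrix $O(2)$ above drives belief updates. The helper below is a stand-alone illustration via Bayes' rule, not part of the `POMDP` class:

```python
# Hypothetical helper, not from the notebook's code: Bayes update of the
# belief b = [P(tiger-left), P(tiger-right)] after a "listen" action. Per
# the P(2) matrix, listening does not move the tiger, so only the
# observation likelihoods from O(2) matter.
def belief_update(belief, obs_likelihoods):
    unnormalized = [o * b for o, b in zip(obs_likelihoods, belief)]
    total = sum(unnormalized)
    return [u / total for u in unnormalized]

b = [0.5, 0.5]                       # uniform prior over (left, right)
b = belief_update(b, [0.85, 0.15])   # hear TL: P(TL|left)=0.85, P(TL|right)=0.15
b = belief_update(b, [0.85, 0.15])   # a second TL observation sharpens the belief
```

One `TL` observation moves a uniform belief to [0.85, 0.15]; a second one sharpens it to roughly [0.97, 0.03], which is why paying the -1 listening cost before opening a door is worthwhile.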
\n", + "Based on these matrices, we will initialize our variables." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's first define our transition model." + ] + }, + { + "cell_type": "code", + "execution_count": 40, + "metadata": {}, + "outputs": [], + "source": [ + "t_prob = [[[0.5, 0.5], \n", + " [0.5, 0.5]], \n", + " \n", + " [[0.5, 0.5], \n", + " [0.5, 0.5]], \n", + " \n", + " [[1.0, 0.0], \n", + " [0.0, 1.0]]]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Followed by the observation model." + ] + }, + { + "cell_type": "code", + "execution_count": 41, + "metadata": {}, + "outputs": [], + "source": [ + "e_prob = [[[0.5, 0.5], \n", + " [0.5, 0.5]], \n", + " \n", + " [[0.5, 0.5], \n", + " [0.5, 0.5]], \n", + " \n", + " [[0.85, 0.15], \n", + " [0.15, 0.85]]]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "And the reward model." + ] + }, + { + "cell_type": "code", + "execution_count": 42, + "metadata": {}, + "outputs": [], + "source": [ + "rewards = [[-100, 10], \n", + " [10, -100], \n", + " [-1, -1]]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's now define our states and actions.\n", + "
    \n", + "We will use `gamma` = 0.95 for this example.\n", + "
    " + ] + }, + { + "cell_type": "code", + "execution_count": 43, + "metadata": {}, + "outputs": [], + "source": [ + "# 0: open-left, 1: open-right, 2: listen\n", + "actions = ('0', '1', '2')\n", + "# 0: left, 1: right\n", + "states = ('0', '1')\n", + "\n", + "gamma = 0.95" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We have all the required variables to instantiate an object of the `POMDP` class." + ] + }, + { + "cell_type": "code", + "execution_count": 44, + "metadata": {}, + "outputs": [], + "source": [ + "pomdp = POMDP(actions, t_prob, e_prob, rewards, states, gamma)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can now find the utility function by running `pomdp_value_iteration` on our `pomdp` object." + ] + }, + { + "cell_type": "code", + "execution_count": 45, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "defaultdict(list,\n", + " {'0': [array([-83.05169196, 26.94830804])],\n", + " '1': [array([ 26.94830804, -83.05169196])],\n", + " '2': [array([23.55049363, -0.76359097]),\n", + " array([23.55049363, -0.76359097]),\n", + " array([23.55049363, -0.76359097]),\n", + " array([23.55049363, -0.76359097]),\n", + " array([23.24120177, 1.56028929]),\n", + " array([23.24120177, 1.56028929]),\n", + " array([23.24120177, 1.56028929]),\n", + " array([20.0874279 , 15.03900771]),\n", + " array([20.0874279 , 15.03900771]),\n", + " array([20.0874279 , 15.03900771]),\n", + " array([20.0874279 , 15.03900771]),\n", + " array([17.91696135, 17.91696135]),\n", + " array([17.91696135, 17.91696135]),\n", + " array([17.91696135, 17.91696135]),\n", + " array([17.91696135, 17.91696135]),\n", + " array([17.91696135, 17.91696135]),\n", + " array([15.03900771, 20.0874279 ]),\n", + " array([15.03900771, 20.0874279 ]),\n", + " array([15.03900771, 20.0874279 ]),\n", + " array([15.03900771, 20.0874279 ]),\n", + " array([ 1.56028929, 23.24120177]),\n", + " array([ 1.56028929, 23.24120177]),\n", + " array([ 
1.56028929, 23.24120177]),\n", + " array([-0.76359097, 23.55049363]),\n", + " array([-0.76359097, 23.55049363]),\n", + " array([-0.76359097, 23.55049363]),\n", + " array([-0.76359097, 23.55049363])]})" + ] + }, + "execution_count": 45, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "utility = pomdp_value_iteration(pomdp, epsilon=3)\n", + "utility" + ] + }, + { + "cell_type": "code", + "execution_count": 46, + "metadata": {}, + "outputs": [], + "source": [ + "import matplotlib.pyplot as plt\n", + "%matplotlib inline\n", + "\n", + "def plot_utility(utility):\n", + " open_left = utility['0'][0]\n", + " open_right = utility['1'][0]\n", + " listen_left = utility['2'][0]\n", + " listen_right = utility['2'][-1]\n", + " left = (open_left[0] - listen_left[0]) / (open_left[0] - listen_left[0] + listen_left[1] - open_left[1])\n", + " right = (open_right[0] - listen_right[0]) / (open_right[0] - listen_right[0] + listen_right[1] - open_right[1])\n", + " \n", + " colors = ['g', 'b', 'k']\n", + " for action in utility:\n", + " for value in utility[action]:\n", + " plt.plot(value, color=colors[int(action)])\n", + " plt.vlines([left, right], -10, 35, linestyles='dashed', colors='c')\n", + " plt.ylim(-10, 35)\n", + " plt.xlim(0, 1)\n", + " plt.text(left/2 - 0.35, 30, 'open-left')\n", + " plt.text((right + left)/2 - 0.04, 30, 'listen')\n", + " plt.text((right + 1)/2 + 0.22, 30, 'open-right')\n", + " plt.show()" + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAYEAAAD8CAYAAACRkhiPAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvNQv5yAAAIABJREFUeJzsnXlcVNX7xz9nZthXWRSU1Q0UJBREQITUci/XMpc0yg03yi3zq4ZkmltuZWaWmfnT1EpLLS0zFTUVRRQXElkURUUUBVkHnt8fA8Q4wzZzZwHO+/WalzJz7znPXc793HOec56HERE4HA6H0zgR6doADofD4egOLgIcDofTiOEiwOFwOI0YLgIcDofTiOEiwOFwOI0YLgIcDofTiFFbBBhjxoyxs4yxeMbYFcbYorLvv2WMpTDGLpZ9fNU3l8PhcDhCIhGgjEIAPYgolzFmACCGMfZb2W+ziWiPAHVwOBwORwOoLQIkW22WW/anQdmHr0DjcDicegATYsUwY0wM4DyA1gA+J6L3GWPfAgiCrKdwBMBcIipUsu8EABMAwMzMzM/T01NtexoKiXl5AAAPU1MdW8Lh6Ae8TSjn/PnzD4nIXpV9BRGBisIYswbwM4BpALIA3ANgCGATgJtEFF3d/v7+/hQbGyuYPfWdF+PiAAB/d+yoY0s4HP2AtwnlMMbOE5G/KvsKOjuIiLIB/A2gDxFlkIxCAFsABAhZF4fD4XDUR22fAGPMHkAxEWUzxkwAvARgGWPMkYgyGGMMwCAACerW1diY7+qqaxM4HL2CtwnhEWJ2kCOArWV+ARGAXUS0nzH2V5lAMAAXAUwSoK5GxUs2Nro2gcPRK3ibEB4hZgddAqAwQEdEPdQtu7FzMScHAOBrYaFjSzgc/YC3CeERoifA0RDvJiUB4E4wDqcc3iaEh4eN4HA4nEYMFwEOh8NpxHAR4HA4nEYMFwEOh8NpxHDHsB6zpGVLXZvA4egVvE0IDxcBPSbYykrXJnA4egVvE8LDh4P0mFNPnuDUkye6NoPD0Rt4mxAe3hPQY+YlJwPgc6I5nHJ4mxAe3hPgcDicRgwXAQ6Hw2nEcBFQg7feegt79lSfPfP69evw9fVFx44dcf78eWzYsEFL1nGqw9zcHABw9+5dDBs2rMrtsrOz+TXjqMTGjRvx3XffVbvNt99+i6lTpyr9bcmSJZowSwEuAhpm7969GDhwIOLi4mBra8sfKHpG8+bNqxVyLgIcVZBKpZg0aRLGjBmjchlcBFTk008/hbe3N7y9vbFmzRqkpqbC09MTY8eOhY+PD4YNG4a8shR158+fR1hYGPz8/NC7d29kZGQAAF588UW8//77CAgIQNu2bXHixIka61VW1sGDB7FmzRps3rwZ3bt3x9y5c3Hz5k34+vpi9uzZNZa5pnVrrGndWr0TwqmW1NRUeHt7AwCuXLmCgIAA+Pr6wsfHBzdu3FB6zVasWIHOnTvDx8cHH374YUU57dq1w/jx4+Hl5YVevXohPz9fZ8fVUKlrm9Dm8+DFF1/EvHnzEBYWhrVr1yIqKgorV64EAJw7dw4+Pj4ICgrC7NmzK+45QNYb7dOnD9q0aYM5c+YAAObOnYv8/Hz4+vpi1KhRKp2rWkNEevPx8/MjdYiNjSVvb2/Kzc2lnJwcat++PV24cIEAUExMDBERhYeH04oVK6ioqIiCgoLowYMHRES0c+dOCg8PJyKisLAwmjFjBhERHThwgHr27Km0vrFjx9Lu3burLevDDz+kFStWEBFRSkoKeXl5qXWMHGEwMzMjIvlrMnXqVPr++++JiKiwsJDy8vIUrtmhQ4do/PjxVFpaSiUlJdS/f386duwYpaSkkFgspri4OCIieu2112jbtm1aPipOZbT9PAgLC6OIiIiKvyu3fS8vLzp58iQREb3//vsV99SWLVvI3d2dsrOzKT8/n1xcXOjWrVtE9N89WhsAxJKKz10hMosZAzgOwAi
yKad7iOhDxpg7gJ0AbABcAPAmERWpW191xMTEYPDgwTAzMwMADBkyBCdOnICzszO6du0KABg9ejTWrVuHPn36ICEhAS+//DIAoKSkBI6OjhVlDRkyBADg5+eH1NTUautNTEystixV+fPRIwA8kYa2CAoKwscff4z09HQMGTIEbdq0Udjm8OHDOHz4MDqWTVHMzc3FjRs34OLiAnd3d/j6+gKo3X3DqTt1aRO6eB4MHz5c4bvs7Gzk5OQgODgYADBy5Ejs37+/4veePXvCqmwRXPv27ZGWlgZnZ+caj08ohFgnUAigBxHlMsYMAMQwxn4DMAPAaiLayRjbCOAdAF8IUF+VyARREVmGS/m/iQheXl44ffq00n2MjIwAAGKxGFKpFAAQHh6OuLg4NG/eHAcPHpSrt7qyVGVxWhoALgLaYuTIkejSpQsOHDiA3r17Y/PmzWj5XJgCIsIHH3yAiRMnyn2fmppacc8AsvuGDwcJT13ahC6eB+WCUxs7ni/7+fK1hdo+gbLeSG7ZnwZlHwLQA0C5x20rZHmGNUpoaCj27t2LvLw8PHv2DD///DO6deuGW7duVVzcHTt2ICQkBB4eHsjMzKz4vri4GFeuXKm2/C1btuDixYtyAgCg1mVZWFggpywzEkf/SE5ORsuWLTF9+nS8+uqruHTpksI16927N7755hvk5spu+Tt37uDBgwe6MplTDbp6HjxPkyZNYGFhgX/++QcAsHPnzlrZb2BggOLi4lptqw6COIYZY2LG2EUADwD8AeAmgGwiKpe0dAAthKirOjp16oS33noLAQEB6NKlC8aNG4cmTZqgXbt22Lp1K3x8fPDo0SNERETA0NAQe/bswfvvv48XXngBvr6+OHXqlEr11rYsW1tbdO3aFd7e3rVyDHO0yw8//ABvb2/4+vri+vXrGDNmjMI169WrF0aOHImgoCB06NABw4YN48Kup+jqeaCMr7/+GhMmTEBQUBCIqGL4pzomTJgAHx8fjTuGWU1dlToVxpg1gJ8BLASwhYhal33vDOAgEXVQss8EABMAwMXFxS+trLsnFKmpqRgwYAASEhIELVcbvBgXB4AvkedwylG3TejqeZCbm1uxNuWTTz5BRkYG1q5dK1j5jLHzROSvyr6CThElomwAfwMIBGDNGCv3OTgBuFvFPpuIyJ+I/O3t7YU0h8PhcPSCAwcOwNfXF97e3jhx4gTmz5+va5MqULsnwBizB1BMRNmMMRMAhwEsAzAWwI+VHMOXiKjaVTf+/v4UGxurlj0NicSy+csepqY6toTD0Q94m1COOj0BIWYHOQLYyhgTQ9az2EVE+xljVwHsZIwtBhAH4GsB6mpU8Budw5GHtwnhUVsEiOgSAIUBOiJKBhCgbvmNmV8fPgQAvGJnp2NLOBz9gLcJ4eH5BPSYVbdvA+A3PIdTDm8TwtPgYgdxOBwOp/ZwEeBwOJxGDBcBDofDacRwEeBwOJxGDHcM6zHb2rXTtQkcjl7B24TwcBHQY5yNjXVtAoejV/A2ITx8OEiP+eHBA/zAI1RyOBXwNiE8etUTuHdP1xboF1/cuQMAGN60qY4t4XD0A94mFPm/y/+n1v561RO4cweoRTpfDofD4QC4m3MXEQci1CpDr0TA0BCYMAEoLNS1JRwOh6P/RP4eiUKpeg9MvRIBFxfg+nVg2TJdW8LhcDj6zf5/92PP1T1YGLZQrXL0SgSsrIA33gA+/hhITNS1NRwOh6Of5BblYvKByfCy98Ks4FlqlaVXjmEAWLMG+P13YOJE4OhR4Lmc0I2KPV5eujaBw9EreJuQseCvBbj99DZOvn0ShmJDtcrSq54AADRrBixfDhw7BmzZomtrdIudoSHsDNW7wBxOQ4K3CeD83fNYd3YdJvlNQrBzsNrl6Z0IAMA77wDdugGzZgGNeUrwtxkZ+DYjQ9dmcDh6Q2NvE9JSKcb/Oh7NzJph6UtLBSlTbRFgjDkzxo4yxq4xxq4wxiLLvo9ijN1hjF0
s+/SrtVEi4MsvgdxcYMYMdS2sv3x77x6+5YsnOJwKGnubWHdmHeLuxWFd33WwNrYWpEwhegJSADOJqB1kCeanMMbal/22moh8yz4H61Jou3bABx8A27cDhw4JYCWHw+HUY1KzU7Hg6AIMaDsAQ9sNFaxctUWAiDKI6ELZ/3MAXAPQQt1yAZkItG0LREQAZfmlORwOp9FBRJhycAoYGD7v9zmYgDNmBPUJMMbcIMs3fKbsq6mMsUuMsW8YY02q2GcCYyyWMRabmZkp95uxMbBpE5CSAkRHC2kph8Ph1B92X92NgzcOYnGPxXCxchG0bMFEgDFmDuBHAO8S0VMAXwBoBcAXQAaAVcr2I6JNRORPRP729vYKv4eFAW+/DaxcCVy6JJS1HA6HUz94nP8Y03+bDj9HP0wLmCZ4+YyI1C+EMQMA+wEcIqJPlfzuBmA/EXlXV46/vz/FxsYqfP/oEeDpCbi7A6dOAWKx2ibXC/JKSgAApo3lgDmcGmiMbWLirxOxOW4zzo0/h06OnZRuwxg7T0T+qpQvxOwgBuBrANcqCwBjzLHSZoMBJKhah40NsHo1cPYs8MUXqtta3zAVixvVzc7h1ERjaxMxt2Kw6cImvNvl3SoFQF3U7gkwxkIAnABwGUBp2dfzAIyAbCiIAKQCmEhE1U7wraonAABEQJ8+wOnTwLVrQAtBXM/6zYaysLmTG8PBcji1oDG1iaKSInT8siOeFT1DwuQEmBuaV7mtOj0BtcNGEFEMAGWu6jpNCQWA/Pz8Kn9jTNYL8PYGpk0DfvqprqXXP3aVrZRrDDc8h1MbGlObWH5yOa5mXsWBkQeqFYCHDx+qVY9erRi+evUqJBIJunbtintKFoS0bAl8+CHw88/A3r06MJDD4XC0wL9Z/2Lx8cV43et19GujuM62oKAAw4cPh7GxMZRNqKkLeiUCAFBSUoJTp07B0dERhoaGGDx4MHJzcyt+nzED8PEBpk4Fnj7VoaEcDoejAYgIk/ZPgrHEGGv7rK34XiqV4t1334W5uTlMTEywa9cuFAqQfEXvRKAyxcXF2Lt3LywsLGBqaoqIiAgwJsVXXwF37wLz5+vaQg6HwxGWrfFbcTT1KJa/vBwO5g5Ys2YNbG1tYWBggLVr1+LZs2eC1qdXIuDp6YnQ0FAYGRkp/Jafn4+NGzfCwMAAvXs3QUDAx/jsM9mMIQ6Hw2kIZD7LxMzDM+Fp6omoV6PAGMN7772HR48eKWwrkUjQoUMH7Nq1S606BVknIBSVZwfdunULM2bMwOHDh5GTk1PlPmJxM3z11XKEh4/RlpkcDocjOKdPn0bvL3sjxyUH2AggU3EbIyMjdOnSBcuWLUNgYGDF9zpdJ6ApXFxcsGfPHjx9+hQ5OTmYOnUq7OzsFLYrKbmPt98eC5FIhFatWuH48eM6sJbD4XDqTmpqKgICAiAWixE8Ohg57jlADOQEwMLCAkOHDkVaWhoKCgpw7NgxOQFQF70VgcqYm5tj/fr1yMzMhFQqxSeffAJXV1e5IEpEhOTkZISFhYExBl9fXyTW8xyVK2/dwspbt3RtBoejNzSENpGdnY0+ffrAwMAA7u7uOHfuHEpFpcAAAFkATgB2dnaYOnUqcnJy8PTpU+zZswcuLsLGDCqnXohAZcRiMd5//32kpqaitLQUn322E4x5A5BfRRgfHw9PT08wxtCjRw+159Lqgv1ZWdiflaVrMzgcvaG+tgmpVIoxY8bAxMQETZo0waFDhyCVSv/bIAyADTDeYTykBVJkZmZi/fr1MDeven2AUNQ7EXieKVOGY+3aywCkWLToNEJCQmD4XPq5o0ePwt7eHgYGBhg+fDgKCgp0YyyHw2lU/O9//4OlpSUMDAywbds2hWePt7c3ln+7HJIwCd7yfQub5m2CWMthMeq9CADA5MlAQADw2WeB2LfvBAoLC5GcnIxBgwbBzMysYjupVIpdu3bBxMQEJiYmmDlzprwaczgcjpps2rQJtra2YIxhyZI
lchNbxGIxQkJCcPr0aRAR4i/F4yfpT7A2tsbKl1fqxN4GIQJiMfDVV7Joo3PmyL5zd3fHzz//jNzcXGRnZyMiIgLW1v+lYysoKMCnn34KAwODCp8Dh8PhqMLvv/8OBwcHMMYwceJEuSmdRkZGGDRoEJKTkyGVSnHixIkKx+7G2I34J/0frO69GramtjqxvUGIACBbRTxzJvD118CxY/K/WVlZYcOGDXj8+DGkUik++ugjuZlGz549w/Tp08EYg4WFBXbv3q1l65VjIhbDpBFFTORwakKf2sSlS5fg7OwMxhj69u2L+/fvV/xWvrg1OzsbBQUF+Pnnn+Hu7i63/52nd/DBkQ/wcsuXMarDKG2b/x9EpDcfPz8/Uodnz4jc3Yk8PIgKCmq3z7Zt28jBwYEgi3Yq97G0tKQ//vhDLZs4HE7DIT09nVxdXZU+L8zMzOjDDz8kqVRaq7KG/DCEjBcbU1JWktp2AYglFZ+7DaYnAACmprJIo4mJwNKltdtn9OjRyMjIABHh2LFjaFEpOuHTp0/x8ssvgzGGJk2a4OTJkxqynMPh6Cv37t2Du7s7GGNwcnJCWlpaxW8WFhb46quvQETIzc1FVFRUrRy7vyT+gp+u/YSFoQvRyqaVJs2vGVXVQxMfdXsC5YwcSWRoSHTtmuplXL9+nZycnJQqvrW1NR0+fFgQW6sjOiWFolNSNF4Ph1Nf0FabuHnzJrm5uVU5QrBv3z6Vy35a8JScPnUi7w3eVCQtEsRe6LInwBhzZowdZYxdY4xdYYxFln1vwxj7gzF2o+xfpYnmNcHq1YCZGTBhAlBaWvP2yvDw8MDt27dBREhNTYWTk1PFb9nZ2ejVq1dFD2Hnzp0CWS7PkcePceTxY42UzeHURzTZJs6ePQs3NzcwxtCqVSukpqZW/GZpaYnDhw+DiPDkyRO8+uqrKtez4OgC3Hl6B1+98hUMxAYCWK4eQgwHSQHMJKJ2AAIBTGGMtQcwF8ARImoD4EjZ39Vy/fp1DBs2DPPmzcPOnTuV5hSoDU2bAitWACdOAN98o1IRcri6ulYIwvXr1+Ho+F/mzOzsbIwYMQKMMVhbW2P58uUoKcuDyuFw9Jvdu3fDyckJjDF06dJFYajnl19+qXjwv/zyy2rXd+7OOaw7sw4R/hEIdFIt9EN2djb27duHRYsWYcSIEejWrZtaNgkeQI4xtg/AZ2WfF4kooyzf8N9E5FHDvtUawxiDSCSCRCKBkZERTE1NYWVlhWbNmsHZ2Rnt2rVDQEAAgoKCYGZmju7dgfh44Pp1oFkz4Y6xnNOnT2Pw4MFyswLKMTc3x1tvvYWlS5eqvOrvxbg4AMDfHTuqZSeH01BQt02UlJRgzZo1WLFihdJ2a2Zmhs8//xxjx45Vy05lSEul6PxVZ9zPvY9rU67BCEaIjY3FmTNncOXKFdy6dQv37t3D48eP8ezZMxQWFqK4uBilpaWoxXNa5QBygooAY8wNwHEA3gBuEZF1pd8eE5HCkBBjbAKACQBgbm7u161bN9y7dw9ZWVnIyclBQUEBiouLUVJSUpsTUZVlEItFMDAwgLGxMczMzGBjYwNHR0e4urqiQ4cOCAwMRMeOHSGRqJZxc8+ePQrzg8sxMTFB//79sWbNGjnHc01wEeBw5FGlTeTn52Pu3Ln4/vvvq2yf//vf//C///1PJZukUikSExNx8uRJXL58GcnJycjIyEBWVhZyc3ORn5+P4uJiSAOkQC8APwC4Vrc6yl+An3+GNWvWDO7u7vjqq690LwKMMXMAxwB8TEQ/McayayMClaku0fzzFBQU4OzZszhz5gyuXbuG1NRU3L9/H9nZ2cjNzUVhYSGkUilKSkoh8+fUHZFIBLFYXHHiLS0tYWNjAycnJ7i7u8PX1xchISFo3bq13H6rVq3CRx99hCdPniiUaWhoiKCgICxfvhwBAQHV1j80IQEA8KO3t0r2czgNjdq2iTt37uDdd9/FwYMHkZeXp/C7kZERwsPDsX7
9erkXv3v37uHvv//GpUuX8O+//+L27dvIysrCkydPkJ+fj6KiIpSUlKC0rs5GawCTASQDot0iSMSy0QwzMzNYWlqiWbNmcHFxgYeHBzp37oyQkJA6jSCoE0paEBFgjBkA2A/gEBF9WvZdIuo4HFQXEagthYWAry9QUAAkJAD5+Q9x8uRJnD9/Hjdu3MCtW7eQmZmJJ0+eIC8vD0VFRZBKpRWe87rCGANjDBKJBAYGBigtLUV+fr7SbcViMby8vLBw4UIMHTpU3UPlcBo158+fx+zZs3HixIkqw8EYGBhAIpGgtLQUUqm0tkMtSil/STQ0NISJiQksLS1hZ2cHZ2dntGrVCh07dkRISAhatGiBfv/XDzG3YnB18lU4Wzmrc5hK0akIMFk8560AHhHRu5W+XwEgi4g+YYzNBWBDRHOqK0sTIgAAx48DYWHA7NnA8uWqlZGUlIRTp04hPj4eSUlJuHv3Lh4+fIicnJyK7p5KbwjPIZFIYGpqCjMzMzRp0gT29vZwc3ODl5cX/Pz8EBwcDGNjY7Xq4HDqC1KpFHFxcfjnn39w+fJlpKWlISMjA48ePcKzZ8/w7NkzFBcXq1UHY0yux29hYQFbW1s4ODigVatWFcPF7du3V2m4eGfCToz4cQTW9F6DyMBItWytCl2LQAiAEwAuAyh/As4DcAbALgAuAG4BeI2IFAfkKqEpEQCA8eOBLVuA2FhZz0AblN/AZ8+eRUJCApKTk3Hv3j08fPgQ9+/fF2QW0fM3sLm5OWxsbNC8eXO4ubnBx8cHwcHBKt/AHI5QJCUlISYmBhcvXkRKSgrS09PlfH9FRUUoLS1V+0UKQMX0bUdHRzRt2rTiRapz584ICAjQ2ovU4/zH8PzcEy5WLvjnnX8gFmkm5IXOh4OEQpMi8Pgx4OkJuLoCp0/Lgs7pA1euXMGoUaNw+fLlKm9+kUgEMzMzGBsbo7CwsMLfIWRX1sLCAnZ2dnBycqroyoaGhsqtj+BwAODhw4c4fvw4Ll68iMTERNy+fRsPHz5EdnZ2xbi5UEOqhoaGMDIyQklJCXJzc6uN+uvi4oLPP/8cAwYMUOfwBGXCrxPwTdw3iJ0QC18Hzb19chGoJTt2ACNHAmvXAtOna6walfnrr78wbtw4pKamVtl4jI2NERwcjFWrVsH3uS7NvXv3EBMTgwsXLuDGjRtIT0/Hw4cP5fwd6gxZlc9QKBcPMzMzWFlZoWnTpnBxcUHbtm3h5+eHkJAQuYitHP2koKAAp06dwvnz53HlyhWkpqYiMzNTboqiUC8bBgYGCi8bLVu2xAsvvIDg4GCFyRX37t3Du+++i99++w1Pnz6tsnw7Ozt89NFHmDRpkkr2aZITaScQ+m0oZgXNwopeKzRaFxeBWkIE9OsHxMQAV68CzsL7ZwTju+++wzvvvQepkilt5UgkEnh7e2PRokUqr2CUSqW4efMmYmJicOnSJSQnJ1f4OypPb1Nnim7l6W3lMyLKp7e5ubnB29sb/v7+CAwM5ENWKiCVSnH16lWcOnUKly5dQmpqKu7evYtHjx4hNzdXkGnWz09RLB92dHBwQMuWLeHt7Y2AgAC1pllfvHgRM2fOxKlTp6pN/CQyNcX7kZFYsmSJSvVog0JpIXy/9EV+cT6uTL4CM0OzmndSAy4CdSAlBfDyAnr1Avbu1WhValM+J7rn/v1YtWqV0imn5YhEIri5uWHatGmYNm2axrMTFRQUIC4uDmfOnEFCQgLS0tJw7949PHr0CHl5eRUPHnXeIiv7O0xMTGBubg47Ozs4ODigdevW8PHxQWBgIDw8PBqMeKSnp+P48eOIi4vDzZs3K3pz5RMQhOrNVV5wWT4BwdXVFe3bt4efnx+CgoK0ktpw//79WLhwIS5fvlztUI+xsTFee+01JE+bBolEovdrZ6KPRePDvz/EwZEH0bdNX43X12BEoKYVwxwOh6P32AKIgGxB2I9aq1VlEWhQoaQ5HA5H57wCoBjA77o2pHbolQj
4+flpLWz12bMExghTpug+hLa6n9u3byMoKEhhCKi8218ZY2Nj9OzZE/Hx8Tq3W9Of4uJixMfH44svvsCkSZMqFu6Ym5tDLBZDtsRFeMpntpiamsLBwQEBAQEIDw/H6tWrcerUKeTn5+v83Gj6k5GRgZEjR8LKykrpuXn+u7Zt2+LUqVM6t1vdz9cXvgbcgE2vbQLlaq9ete5XdQsQEm34BCoTGQmsXw+cOgUEqhbQT6O8e+MGAGBNmza13ufChQt48803ce3aNbmbo3ylZOXVyxKJBD4+Pli0aJFeTaurCWWzoDIzM/H06VPBZkEp+xBRhY9D3bns1YUkad26NTp06KA0JIk+c+nSJcycORMnT56Uu88MDAwgFosVnL3NmzfH2rVrMWzYsFrXoUqb0BYPnj2A52ee8G7qjb/f+hsipr137AbjE9C2COTkAO3bA02aAOfPAwa6D+0th7oB5Pbv34/Jkyfj9u3bct+bmZnB0NAQjyvFZReJRGjZsiUiIyMRERGhcccyAOTm5uL06dM4e/YsEhMTkZaWhvv371dMadXEeghLS0vY2trCyckJbdq0qZii6ObmpvbxPHz4X0iSxMTECnHS1Px5ExMTWFtbw97eHk5OTvDw8ICfnx+6du0ql0Nbkxw8eBALFizApUuX5By75b2tp0+fyh2rjY0N5s2bh5kzZ6pUnz4HVRz902jsurIL8ZPi0c6+nVbr5iKgBvv2AYMGydJRzq0x44F2EfKG37hxIxYsWICHDx9WfMcYg52dHYyMjHD37l25t1sHBwe8+eabWLx4MQwNDastu6qV0c9PURRqplBDWxldvpL28uXLciFJnj59Kje9U6gZQeUhScqDlnl5eaFLly61WklbUlKCTZs24dNPP0VycrKcTfb29jA0NMS9e/fkVsObmZlh3LhxWLlypdrXRl9F4FDSIfTZ3gcLQxdiUfdFWq+fi4CaDB0KHDwoCzDXSsfpPiujqRv+/fffx4YNG5Cbm1vxnUgkQquyg09KSpJ7WDPGKrr0ssis+vFAasxUjqlz9erVivDFjx8/Flx4RSIRGGMV5VX+3dHREba2trh+/bpcDB8jIyMMHDgQW7duFfQ66qMI5BXnwXvenOb2AAAgAElEQVSDNwzEBoifFA9jifbvW3VEoH69MmmIdeuAP/4AIiKAQ4cADfkLtUp2djaOHTtWbbRUkUhU0ahLS0txo2y89XmICEVFRQrflz/MKw9NlEdR9PDwgK+vL0JDQ7U2NNGYkEgk6Ny5Mzp37lyn/Wo7BFe+sKy6uftEhLt37+Lu3bsKv5WWluLAgQNwdnZWOgTXkEKSRB+LRkp2Cv4e+7dOBEBdeE+gjM8/B6ZOBb7/Hhg1SicmKDAhMRHSggK89eRJjXkT1B03LxeE59/wRSIROnXqBAcHB5w4cUJuwZqBgQF8fHwQHR2Nfv36qXWsHN2TkJCAWbNm4fjx43KOXRMTE/j7+8PMzAxHjx5FYWGh3H7ls6wYYxoLSeLp6YmOHTtil709jK2tscmj2qj0WuPS/Uvo9GUnjH1hLL4e+LXO7ODDQQJQUgJ07QrcvClLR2lrK2z55Uv7y0Pi3rx5U/AMalVlH6prBrWkpCSMGDECFy5ckGvUlpaWiIiIQGlpKbZt2yaXA7p8OGnmzJmYOHGiSvZztM+hQ4cwf/58xMfHyw3nWFlZ4ZVXXkFwcDCio6PlrjVjDG5ubti8eTN69OhRZdnlGbf++ecfXLp0CUlJSRVRdDURksTY2BimpqYV4SxcXV01GpKkpLQEwd8EI+VxCq5PvQ4bExtBy68LXAQE4tIlwM8PePPN6hPUp6enIyYmBnFxcbhx4wbu3LlT4cjT1NL+qnIpa3ppf0xMDN5++20FP4G9vT0WL16MoqIirF27VsFJ6OjoiLFjx2LRokU1OpY52uXLL7/EqlWrcPPmTaWTAXr27InJkycjJSVF7po3a9YMy5cvx5gxYzRqX0FBAWJjYxEbG4uEhIS
K3m95DoE65t5VSlUhSZo3b46WLVvCx8cHISEhaNWqVZXi8dnZzzDtt2n4fvD3GOWj2+EDLgIqkpubi5iYGJw7dw6JiYm4desWrly5j0ePnsLU9Bmk0sKKh7lQUxStrKxga2sLZ2dntG3btuJmUzY+OiExEQD0puu7Z88eREZGKowBu7i44Msvv4RUKlUaB8ba2hqvvvoqVq9eDRsb3b0tNVaKioqwaNEifPvtt3LXrvK04B49emD06NGIj4+XEwYrKyvMnDkTCxYs0IXpClTXJrKzsytezq5fv45bt27hwYMHePLkCZ49eyb3cqZue5bYSFDwTgGMHxrD55IPnJ2c0aZNG3Tq1AkhISFwcHBQ6zjris5FgDH2DYABAB4QkXfZd1EAxgPILNtsHhEdrK4cVUVAKpXi3LlzOHPmDK5evYqUlBTcu3evIiSupoKZ2drawtHREa1bt4aXlxe6du0qaDAzfZwJUc6qVavw8ccfy601YIzBy8sL27dvR2lpKWbNmoWTJ0/KLRIyNTVFaGgoPv30U7Rrp9251I2JR48eYcaMGfjll1/krpFEIkGHDh0QHR2NwMBAvP766zh+/LjclE4TExOMHj0aGzZs0LvptppoE6mpqTh16lTdgvYNB9AawAYAj6souIyagvZ5eHio3bPXBxEIBZAL4LvnRCCXiFbWthx/f3/avn17xZzpmzdv4u7duwrZh4QOa9ykSRM4ODhUTFEUibpgxgx/LFhgjOholaoRBH0WgXKkUilmzZqFzZs349mzZxXfi8ViBAUFYffu3QCAGTNm4ODBgwqOZV9fX3z88cd4+eWXtW57Q+PatWuYMWMGjh8/Lpdc3djYGF27dsXKlSvh6emJ8PBw/Pzzz3IOXgMDA/Tt2xfbt2/XSvRQVdGHNrH3+l4M/mEwIr0i4fnQUy/CdxsYGKgsAkLGrnADkFDp7ygAs+pYBtX2wxgjkUhEhoaGZG5uTk2bNqW2bdtSSEgIvfHGGxQVFUW//vorPX78mFRh9GgiAwOiK1dU2l0Qwi5coLALF3RnQB3Jz8+nYcOGkaGhody1MjQ0pMGDB1N+fj7l5eXRe++9R82aNZPbRiQSUdu2bWnz5s26Pox6xZEjRyggIEDhnFtZWdGIESMoIyODiouLacaMGWRmZia3jVgspqCgILp9+7auD6PW6LpNPCl4Qi1WtSCfL3yoSFpU5/3z8/PpyJEjtHz5cho7dix1796d2rdvT46OjmRpaUlGRkYkFoupLKJyXT6xpOqzW9UdFQpSLgKpAC4B+AZAkyr2mwAgFkAsY4xsbW3J3d2d/P39adCgQTRr1izaunUrpaSk1PmEq8ODB0Q2NkQhIUQlJVqtugJd3/DqkJmZSWFhYSQWi+VuVlNTU4qIiKDi4mKSSqW0evVqatmyJYlEIrntWrRoQfPnz6fCwkJdH4resXnzZvLw8FA4Z82aNaP33nuP8vLyiIho3bp1ZGNjo/Dy1L59ezp//ryOj0I1dN0mph2cRiyK0T+3/9FqvZmZmfTjjz/SggUL6PXXX6egoCBq06YN2dvbk7m5ud6KQDMAYsgilX4M4JuayvDz89PUOVSJb76RnaFNm3RTf+S//1Lkv//qpnIBSUhIIB8fH4W3G2tra1q6dGnFdj/99BP5+vqSRCKR265JkyYUHh5OWVlZOjwK3VFYWEgLFy6kFi1aKPSeWrZsSatXryapVEpERD/++CM1b95c4U3R2dmZfv31Vx0fifrosk2cST9DLIrR1ANTdVJ/deilCNT2t8offROB0lKiF18ksrIiysjQtTUNgyNHjpC7u7uCIDg4OND27dsrtouNjaXu3buTsbGxQk+iX79+lJiYqMOj0DxZWVn09ttvK7zJSyQS8vX1pZ9++qli27Nnz1Lbtm0VzqmdnR198cUXOjyKhkORtIh8vvChFqta0JOCJ7o2RwG9FAEAjpX+/x6AnTWVoW8iQESUmEhkZEQ0fLiuLWl4bNmyhZo2bao
wXNGqVSs6duxYxXbp6en0+uuvk6WlpYKvITAwkI4cOaLDoxCOxMRE6t+/P5mamsodp7GxMXXv3p1iY2Mrtk1JSaHOnTsrDAmZm5vTnDlzdHgUDZNlMcsIUaCfrv5U88Y6QOciAGAHgAzI8umkA3gHwDYAl8t8Ar9UFoWqPvooAkRE0dGyM3XwoHbrHXXlCo3SpWdai0RHR5OVlZXCcEfHjh3p+vXrFdvl5eVRZGSkgniIRCLy9PSkLVu26O4gVODo0aMUGBio4Ni1tLSk119/ndLT0yu2zcnJob59+5KBgYGCSLz55ptUXFyswyPRDrpoE8mPkslksQkN2jlIq/XWBZ2LgFAffRWBwkKidu2IXF2JcnO1V6+unWC6oLi4mMaPH08mJiYKwyA9e/akzMzMim2lUimtXLmS3Nzc5IZCGGPUokULWrhwYcVYuT6xdetW8vT0VHCaN23alCIjIyscu0Sy8zF27FiFYTGJREK9evVSefZbfUXbbaK0tJR6b+tNFkss6PYT/Z1FxUVAC5w4ITtbM2dqr87GKAKVycnJoQEDBii8+RoZGdEbb7xB+fn5ctvv2bOHfHx8FBzLNjY2NG7cOMrOztbJcUilUoqKiiInJycFsXJzc6OVK1cqiNWCBQsUhr9EIhH5+/vTjRs3dHIc+oC228T2S9sJUaB1/6zTWp2qwEVAS0yYQCQWE2nrHmzsIlCZ27dvU1BQkMLbs7m5Oc2aNUth+zNnzlBYWBgZGRnJbW9mZkYDBgygpKQkjdqbnZ1N48aNU+rY9fHxoT179ijss3nzZrK3t1fwkbRt25ZOnTqlUXvrC9psE1l5WWS/3J4CvgogaYn+9Sgrw0VASzx6RNSsGZG/P5E2Rhm4CCjn/Pnz1K5dO4XZMDY2NrRuneIbW1paGg0bNowsLCwUHMvBwcFyTmh1SEpKoldffVVhUZaRkRGFhYXRmTNnFPb57bffyMXFRWFKZ/PmzWn37t2C2NWQ0GabeGffOyReJKaLGRe1Up86cBHQIjt3ys7amjWar2vuzZs09+ZNzVdUj/n111/J2dlZ4SHaokUL2rt3r8L2OTk5NHXqVIU3brFYTO3ataNt27bVqf5jx45RcHCwgmPXwsKChg0bRmlpaQr7xMfHk7e3t4KINWnShFauXKnyuWgMaKtN/J3yNyEKNOdw/ZhpxUVAi5SWEvXtS2RmRqSkfXN0yGeffUa2trZKh1POnj2rsL1UKqVly5YpdSw7OztTdHS0Usfytm3bqF27dgpDU/b29jR16lTKyclR2CcjI4NCQkIU9jEzM6PIyMhGMbOnvlBQXEAe6z3IfY07PSt6pmtzagUXAS2TkkJkakr0yisyUeDoH3PmzClfTi/nWA0ICKgyBMmuXbvI29tb4UFtbW1N/v7+1KJFC6WO3WXLlikVi/z8fBo8eLDSWErDhg1TcGxz9IMPj35IiAL9fuN3XZtSa7gI6ICVK2VnT4l/TzCGXL5MQy5f1lwFjYDi4mJ68803FaZYGhgYUN++fZW+tRMR/fHHH+Tg4KAwzFQ+1LN+/foq64uIiFBY8CUWiyksLExuiiun7mi6TVzLvEaGHxnSyB9HaqwOTcBFQAcUFxN17Ejk6EikqZmH3DEsLI8fP6ZevXopTCE1Njam8PBwunHjBg0cOFChB2FoaEi2trYKaxcMDQ2pa9euFBMTQ0uXLiVra2uFoSgfHx9KSEjQ9aE3GDTZJkpKS6jbN92oySdN6H7ufY3UoSm4COiIc+eIRCKiiAjNlM9FQHPcuHGD/P39lb7pl7/tDxkyhJKTk+X2y8nJoYiICLKzs6tyX3d39wYTykLf0GSb+Or8V4Qo0Obz9S+cuToiIAJHZfz9genTgY0bgdOndW0Np7b88MMPGDx4MOLKEpQoIycnBydOnMBff/0l9/2FCxdw+PBhZGVlVblvamoqwsPDsXTpUrmMXRz95X7ufcz+YzZCXUPxdse3dW2OVuEioCYffQQ4OQETJgD
RACJSgLs9lMRpfcLkYj/QH0BELsUVgY8ijTIelgO/CloJXQt8oqqHGvpQ5LeDVPWkiEwEVuOe/D+tqjtE5I/AVlVdDjyFu6TbjbsCGBV1nFEImYs/AecALwbPxver6jBvQWdIyFzkhJC5WA38TEQqgVPA71X1iL+oMyNkLn4HLBSRybjbH2OS+J9GEXked/uvIHj+8QfgbABVnY97HjIYN3/LMeA3obabwFwZY4wJyXoMG2NMDrMiYIwxOcyKgDHG5DArAsYYk8OsCBhjTA6zImCMMTnMioAxxuSw/wJvKdH74RNWdQAAAABJRU5ErkJggg==\n", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "plot_utility(utility)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Hence, we get a piecewise-continuous utility function consistent with the given POMDP." + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.4" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/neural_nets.ipynb b/neural_nets.ipynb index a6bb6f43b..1291da547 100644 --- a/neural_nets.ipynb +++ b/neural_nets.ipynb @@ -14,9 +14,7 @@ { "cell_type": "code", "execution_count": 1, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "from learning import *\n", @@ -32,13 +30,13 @@ "\n", "### Overview\n", "\n", - "Although the Perceptron may seem like a good way to make classifications, it is a linear classifier (which, roughly, means it can only draw straight lines to divide spaces) and therefore it can be stumped by more complex problems. We can extend Perceptron to solve this issue, by employing multiple layers of its functionality. The construct we are left with is called a Neural Network, or a Multi-Layer Perceptron, and it is a non-linear classifier. 
It achieves that by combining the results of linear functions on each layer of the network.\n", + "Although the Perceptron may seem like a good way to make classifications, it is a linear classifier (which, roughly, means it can only draw straight lines to divide spaces) and therefore it can be stumped by more complex problems. To solve this issue we can extend Perceptron by employing multiple layers of its functionality. The construct we are left with is called a Neural Network, or a Multi-Layer Perceptron, and it is a non-linear classifier. It achieves that by combining the results of linear functions on each layer of the network.\n", "\n", - "Similar to the Perceptron, this network also has an input and output layer. However it can also have a number of hidden layers. These hidden layers are responsible for the non-linearity of the network. The layers are comprised of nodes. Each node in a layer (excluding the input one), holds some values, called *weights*, and takes as input the output values of the previous layer. The node then calculates the dot product of its inputs and its weights and then activates it with an *activation function* (sometimes a sigmoid). Its output is fed to the nodes of the next layer. Note that sometimes the output layer does not use an activation function, or uses a different one from the rest of the network. The process of passing the outputs down the layer is called *feed-forward*.\n", + "Similar to the Perceptron, this network also has an input and output layer; however, it can also have a number of hidden layers. These hidden layers are responsible for the non-linearity of the network. The layers are comprised of nodes. Each node in a layer (excluding the input one), holds some values, called *weights*, and takes as input the output values of the previous layer. The node then calculates the dot product of its inputs and its weights and then activates it with an *activation function* (e.g. sigmoid activation function). 
Its output is then fed to the nodes of the next layer. Note that sometimes the output layer does not use an activation function, or uses a different one from the rest of the network. The process of passing the outputs down the layer is called *feed-forward*.\n", "\n", - "After the input values are fed-forward into the network, the resulting output can be used for classification. The problem at hand now is how to train the network (ie. adjust the weights in the nodes). To accomplish that we utilize the *Backpropagation* algorithm. In short, it does the opposite of what we were doing up to now. Instead of feeding the input forward, it will feed the error backwards. So, after we make a classification, we check whether it is correct or not, and how far off we were. We then take this error and propagate it backwards in the network, adjusting the weights of the nodes accordingly. We will run the algorithm on the given input/dataset for a fixed amount of time, or until we are satisfied with the results. The number of times we will iterate over the dataset is called *epochs*. In a later section we take a detailed look at how this algorithm works.\n", + "After the input values are fed-forward into the network, the resulting output can be used for classification. The problem at hand now is how to train the network (i.e. adjust the weights in the nodes). To accomplish that we utilize the *Backpropagation* algorithm. In short, it does the opposite of what we were doing up to this point. Instead of feeding the input forward, it will track the error backwards. So, after we make a classification, we check whether it is correct or not, and how far off we were. We then take this error and propagate it backwards in the network, adjusting the weights of the nodes accordingly. We will run the algorithm on the given input/dataset for a fixed amount of time, or until we are satisfied with the results. The number of times we will iterate over the dataset is called *epochs*. 
In a later section we take a detailed look at how this algorithm works.\n", "\n", - "NOTE: Sometimes we add to the input of each layer another node, called *bias*. This is a constant value that will be fed to the next layer, usually set to 1. The bias generally helps us \"shift\" the computed function to the left or right." + "NOTE: Sometimes we add another node to the input of each layer, called *bias*. This is a constant value that will be fed to the next layer, usually set to 1. The bias generally helps us \"shift\" the computed function to the left or right." ] }, { @@ -60,14 +58,153 @@ "\n", "The NeuralNetLearner returns the `predict` function which, in short, can receive an example and feed-forward it into our network to generate a prediction.\n", "\n", - "In more detail, the example values are first passed to the input layer and then they are passed through the rest of the layers. Each node calculates the dot product of its inputs and its weights, activates it and pushes it to the next layer. The final prediction is the node with the maximum value from the output layer." + "In more detail, the example values are first passed to the input layer and then they are passed through the rest of the layers. Each node calculates the dot product of its inputs and its weights, activates it and pushes it to the next layer. The final prediction is the node in the output layer with the maximum value." ] }, { "cell_type": "code", - "execution_count": null, + "execution_count": 2, "metadata": {}, - "outputs": [], + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def NeuralNetLearner(dataset, hidden_layer_sizes=None,\n",
+       "                     learning_rate=0.01, epochs=100, activation=sigmoid):\n",
    +       "    """Layered feed-forward network.\n",
    +       "    hidden_layer_sizes: List of number of hidden units per hidden layer\n",
    +       "    learning_rate: Learning rate of gradient descent\n",
    +       "    epochs: Number of passes over the dataset\n",
    +       "    """\n",
    +       "\n",
    +       "    hidden_layer_sizes = hidden_layer_sizes or [3]  # default value\n",
    +       "    i_units = len(dataset.inputs)\n",
    +       "    o_units = len(dataset.values[dataset.target])\n",
    +       "\n",
    +       "    # construct a network\n",
    +       "    raw_net = network(i_units, hidden_layer_sizes, o_units, activation)\n",
    +       "    learned_net = BackPropagationLearner(dataset, raw_net,\n",
    +       "                                         learning_rate, epochs, activation)\n",
    +       "\n",
    +       "    def predict(example):\n",
    +       "        # Input nodes\n",
    +       "        i_nodes = learned_net[0]\n",
    +       "\n",
    +       "        # Activate input layer\n",
    +       "        for v, n in zip(example, i_nodes):\n",
    +       "            n.value = v\n",
    +       "\n",
    +       "        # Forward pass\n",
    +       "        for layer in learned_net[1:]:\n",
    +       "            for node in layer:\n",
    +       "                inc = [n.value for n in node.inputs]\n",
    +       "                in_val = dotproduct(inc, node.weights)\n",
    +       "                node.value = node.activation(in_val)\n",
    +       "\n",
    +       "        # Hypothesis\n",
    +       "        o_nodes = learned_net[-1]\n",
    +       "        prediction = find_max_node(o_nodes)\n",
    +       "        return prediction\n",
    +       "\n",
    +       "    return predict\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ "psource(NeuralNetLearner)" ] @@ -80,9 +217,9 @@ "\n", "### Overview\n", "\n", - "In both the Perceptron and the Neural Network, we are using the Backpropagation algorithm to train our weights. Basically it achieves that by propagating the errors from our last layer into our first layer, this is why it is called Backpropagation. In order to use Backpropagation, we need a cost function. This function is responsible for indicating how good our neural network is for a given example. One common cost function is the *Mean Squared Error* (MSE). This cost function has the following format:\n", + "In both the Perceptron and the Neural Network, we are using the Backpropagation algorithm to train our model by updating the weights. This is achieved by propagating the errors from our last layer (output layer) back to our first layer (input layer), this is why it is called Backpropagation. In order to use Backpropagation, we need a cost function. This function is responsible for indicating how good our neural network is for a given example. One common cost function is the *Mean Squared Error* (MSE). This cost function has the following format:\n", "\n", - "$$MSE=\\frac{1}{2} \\sum_{i=1}^{n}(y - \\hat{y})^{2}$$\n", + "$$MSE=\\frac{1}{n} \\sum_{i=1}^{n}(y - \\hat{y})^{2}$$\n", "\n", "Where `n` is the number of training examples, $\\hat{y}$ is our prediction and $y$ is the correct prediction for the example.\n", "\n", @@ -169,21 +306,204 @@ "source": [ "### Implementation\n", "\n", - "First, we feed-forward the examples in our neural network. After that, we calculate the gradient for each layer weights. Once that is complete, we update all the weights using gradient descent. After running these for a given number of epochs, the function returns the trained Neural Network." + "First, we feed-forward the examples in our neural network. 
After that, we calculate the gradient for each layers' weights by using the chain rule. Once that is complete, we update all the weights using gradient descent. After running these for a given number of epochs, the function returns the trained Neural Network." ] }, { "cell_type": "code", - "execution_count": null, + "execution_count": 4, "metadata": {}, - "outputs": [], + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def BackPropagationLearner(dataset, net, learning_rate, epochs, activation=sigmoid):\n",
    +       "    """[Figure 18.23] The back-propagation algorithm for multilayer networks"""\n",
    +       "    # Initialise weights\n",
    +       "    for layer in net:\n",
    +       "        for node in layer:\n",
    +       "            node.weights = random_weights(min_value=-0.5, max_value=0.5,\n",
    +       "                                          num_weights=len(node.weights))\n",
    +       "\n",
    +       "    examples = dataset.examples\n",
    +       "    '''\n",
    +       "    As of now dataset.target gives an int instead of list,\n",
+       "    Changing the dataset class will have an effect on all the learners.\n",
    +       "    Will be taken care of later.\n",
    +       "    '''\n",
    +       "    o_nodes = net[-1]\n",
    +       "    i_nodes = net[0]\n",
    +       "    o_units = len(o_nodes)\n",
    +       "    idx_t = dataset.target\n",
    +       "    idx_i = dataset.inputs\n",
    +       "    n_layers = len(net)\n",
    +       "\n",
    +       "    inputs, targets = init_examples(examples, idx_i, idx_t, o_units)\n",
    +       "\n",
    +       "    for epoch in range(epochs):\n",
    +       "        # Iterate over each example\n",
    +       "        for e in range(len(examples)):\n",
    +       "            i_val = inputs[e]\n",
    +       "            t_val = targets[e]\n",
    +       "\n",
    +       "            # Activate input layer\n",
    +       "            for v, n in zip(i_val, i_nodes):\n",
    +       "                n.value = v\n",
    +       "\n",
    +       "            # Forward pass\n",
    +       "            for layer in net[1:]:\n",
    +       "                for node in layer:\n",
    +       "                    inc = [n.value for n in node.inputs]\n",
    +       "                    in_val = dotproduct(inc, node.weights)\n",
    +       "                    node.value = node.activation(in_val)\n",
    +       "\n",
    +       "            # Initialize delta\n",
    +       "            delta = [[] for _ in range(n_layers)]\n",
    +       "\n",
    +       "            # Compute outer layer delta\n",
    +       "\n",
    +       "            # Error for the MSE cost function\n",
    +       "            err = [t_val[i] - o_nodes[i].value for i in range(o_units)]\n",
    +       "\n",
+       "            # The activation function used is either ReLU or sigmoid\n",
+       "            if activation == sigmoid:\n",
    +       "                delta[-1] = [sigmoid_derivative(o_nodes[i].value) * err[i] for i in range(o_units)]\n",
    +       "            else:\n",
    +       "                delta[-1] = [relu_derivative(o_nodes[i].value) * err[i] for i in range(o_units)]\n",
    +       "\n",
    +       "            # Backward pass\n",
    +       "            h_layers = n_layers - 2\n",
    +       "            for i in range(h_layers, 0, -1):\n",
    +       "                layer = net[i]\n",
    +       "                h_units = len(layer)\n",
    +       "                nx_layer = net[i+1]\n",
    +       "\n",
    +       "                # weights from each ith layer node to each i + 1th layer node\n",
    +       "                w = [[node.weights[k] for node in nx_layer] for k in range(h_units)]\n",
    +       "\n",
    +       "                if activation == sigmoid:\n",
    +       "                    delta[i] = [sigmoid_derivative(layer[j].value) * dotproduct(w[j], delta[i+1])\n",
    +       "                            for j in range(h_units)]\n",
    +       "                else:\n",
    +       "                    delta[i] = [relu_derivative(layer[j].value) * dotproduct(w[j], delta[i+1])\n",
    +       "                            for j in range(h_units)]\n",
    +       "\n",
+       "            # Update weights\n",
    +       "            for i in range(1, n_layers):\n",
    +       "                layer = net[i]\n",
    +       "                inc = [node.value for node in net[i-1]]\n",
    +       "                units = len(layer)\n",
    +       "                for j in range(units):\n",
    +       "                    layer[j].weights = vector_add(layer[j].weights,\n",
    +       "                                                  scalar_vector_product(\n",
    +       "                                                  learning_rate * delta[i][j], inc))\n",
    +       "\n",
    +       "    return net\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ "psource(BackPropagationLearner)" ] }, { "cell_type": "code", - "execution_count": 4, + "execution_count": 5, "metadata": {}, "outputs": [ { @@ -204,11 +524,16 @@ }, { "cell_type": "markdown", - "metadata": {}, + "metadata": { + "pycharm": { + "name": "#%% md\n" + } + }, "source": [ - "The output should be 0, which means the item should get classified in the first class, \"setosa\". Note that since the algorithm is non-deterministic (because of the random initial weights) the classification might be wrong. Usually though it should be correct.\n", + "The output should be 0, which means the item should get classified in the first class, \"setosa\". Note that since the algorithm is non-deterministic (because of the random initial weights) the classification might be wrong. Usually though, it should be correct.\n", "\n", - "To increase accuracy, you can (most of the time) add more layers and nodes. Unfortunately the more layers and nodes you have, the greater the computation cost." + "To increase accuracy, you can (most of the time) add more layers and nodes. Unfortunately, increasing the number of layers or nodes also increases the computation cost and might result in overfitting.\n", + "\n" ] } ], @@ -228,9 +553,18 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.5.3" + "version": "3.5.2" + }, + "pycharm": { + "stem_cell": { + "cell_type": "raw", + "source": [], + "metadata": { + "collapsed": false + } + } } }, "nbformat": 4, "nbformat_minor": 2 -} +} \ No newline at end of file diff --git a/nlp.ipynb b/nlp.ipynb index f95d8283c..9656c1ea0 100644 --- a/nlp.ipynb +++ b/nlp.ipynb @@ -79,17 +79,17 @@ "source": [ "### Probabilistic Context-Free Grammar\n", "\n", - "While a simple CFG can be very useful, we might want to know the chance of each rule occuring. 
Above, we do not know if `S` is more likely to be replaced by `aSb` or `ε`. **Probabilistic Context-Free Grammars (PCFG)** are built to fill exactly that need. Each rule has a probability, given in brackets, and the probabilities of a rule sum up to 1:\n", + "While a simple CFG can be very useful, we might want to know the chance of each rule occurring. Above, we do not know if `S` is more likely to be replaced by `aSb` or `ε`. **Probabilistic Context-Free Grammars (PCFG)** are built to fill exactly that need. Each rule has a probability, given in brackets, and the probabilities of a rule sum up to 1:\n", "\n", "```\n", "S -> aSb [0.7] | ε [0.3]\n", "```\n", "\n", - "Now we know it is more likely for `S` to be replaced by `aSb` than by `e`.\n", + "Now we know it is more likely for `S` to be replaced by `aSb` than by `ε`.\n", "\n", "An issue with *PCFGs* is how we will assign the various probabilities to the rules. We could use our knowledge as humans to assign the probabilities, but that is a laborious and prone to error task. Instead, we can *learn* the probabilities from data. Data is categorized as labeled (with correctly parsed sentences, usually called a **treebank**) or unlabeled (given only lexical and syntactic category names).\n", "\n", - "With labeled data, we can simply count the occurences. For the above grammar, if we have 100 `S` rules and 30 of them are of the form `S -> ε`, we assign a probability of 0.3 to the transformation.\n", + "With labeled data, we can simply count the occurrences. For the above grammar, if we have 100 `S` rules and 30 of them are of the form `S -> ε`, we assign a probability of 0.3 to the transformation.\n", "\n", "With unlabeled data we have to learn both the grammar rules and the probability of each rule. We can go with many approaches, one of them the **inside-outside** algorithm. 
It uses a dynamic programming approach, that first finds the probability of a substring being generated by each rule, and then estimates the probability of each rule." ] @@ -119,7 +119,7 @@ "source": [ "### Lexicon\n", "\n", - "The lexicon of a language is defined as a list of allowable words. These words are grouped into the usual classes: `verbs`, `nouns`, `adjectives`, `adverbs`, `pronouns`, `names`, `articles`, `prepositions` and `conjuctions`. For the first five classes it is impossible to list all words, since words are continuously being added in the classes. Recently \"google\" was added to the list of verbs, and words like that will continue to pop up and get added to the lists. For that reason, these first five categories are called **open classes**. The rest of the categories have much fewer words and much less development. While words like \"thou\" were commonly used in the past but have declined almost completely in usage, most changes take many decades or centuries to manifest, so we can safely assume the categories will remain static for the foreseeable future. Thus, these categories are called **closed classes**.\n", + "The lexicon of a language is defined as a list of allowable words. These words are grouped into the usual classes: `verbs`, `nouns`, `adjectives`, `adverbs`, `pronouns`, `names`, `articles`, `prepositions` and `conjunctions`. For the first five classes it is impossible to list all words, since words are continuously being added in the classes. Recently \"google\" was added to the list of verbs, and words like that will continue to pop up and get added to the lists. For that reason, these first five categories are called **open classes**. The rest of the categories have much fewer words and much less development. 
Words like \"thou\" were commonly used in the past but have since declined almost completely in usage; still, most changes take many decades or centuries to manifest, so we can safely assume the categories will remain static for the foreseeable future. Thus, these categories are called **closed classes**.\n", "\n", "An example lexicon for a PCFG (note that other classes can also be used according to the language, like `digits`, or `RelPro` for relative pronoun):\n", "\n", @@ -133,7 +133,7 @@ "Name -> john [0.05] | mary [0.05] | peter [0.01] | ...\n", "Article -> the [0.35] | a [0.25] | an [0.025] | ...\n", "Preposition -> to [0.25] | in [0.2] | at [0.1] | ...\n", - "Conjuction -> and [0.5] | or [0.2] | but [0.2] | ...\n", + "Conjunction -> and [0.5] | or [0.2] | but [0.2] | ...\n", "Digit -> 1 [0.3] | 2 [0.2] | 0 [0.2] | ...\n", "```" ] @@ -147,7 +147,7 @@ "With grammars we combine words from the lexicon into valid phrases. A grammar is composed of **grammar rules**. Each rule transforms the left-hand side of the rule into the right-hand side. For example, `A -> B` means that `A` transforms into `B`. Let's build a grammar for the language we started building with the lexicon.
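Such rules can be stored as a plain mapping from each left-hand side to its alternative right-hand sides; a minimal sketch along the lines of the `Rules` helper used below:\n", "\n", "```python\n", "def parse_rules(**rules):\n", "    \"\"\"Split 'NP VP | S Conjunction S' style strings into a mapping\n", "    from a left-hand side symbol to its alternative rhs sequences.\n", "    (A simplified sketch; the repository's own `Rules` works the same way.)\"\"\"\n", "    return {lhs: [alt.strip().split() for alt in rhs.split('|')]\n", "            for lhs, rhs in rules.items()}\n", "\n", "print(parse_rules(S=\"NP VP | S Conjunction S\"))\n", "# {'S': [['NP', 'VP'], ['S', 'Conjunction', 'S']]}\n", "```\n", "\n", "Each alternative becomes a list of symbols, so a parser can look up every way a nonterminal may expand.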
We will use a PCFG.\n", "\n", "```\n", - "S -> NP VP [0.9] | S Conjuction S [0.1]\n", + "S -> NP VP [0.9] | S Conjunction S [0.1]\n", "\n", "NP -> Pronoun [0.3] | Name [0.1] | Noun [0.1] | Article Noun [0.25] |\n", " Article Adjs Noun [0.05] | Digit [0.05] | NP PP [0.1] |\n", @@ -216,9 +216,9 @@ "name": "stdout", "output_type": "stream", "text": [ - "Lexicon {'Adverb': ['here', 'lightly', 'now'], 'Verb': ['is', 'say', 'are'], 'Digit': ['1', '2', '0'], 'RelPro': ['that', 'who', 'which'], 'Conjuction': ['and', 'or', 'but'], 'Name': ['john', 'mary', 'peter'], 'Pronoun': ['me', 'you', 'he'], 'Article': ['the', 'a', 'an'], 'Noun': ['robot', 'sheep', 'fence'], 'Adjective': ['good', 'new', 'sad'], 'Preposition': ['to', 'in', 'at']}\n", + "Lexicon {'Adverb': ['here', 'lightly', 'now'], 'Verb': ['is', 'say', 'are'], 'Digit': ['1', '2', '0'], 'RelPro': ['that', 'who', 'which'], 'Conjunction': ['and', 'or', 'but'], 'Name': ['john', 'mary', 'peter'], 'Pronoun': ['me', 'you', 'he'], 'Article': ['the', 'a', 'an'], 'Noun': ['robot', 'sheep', 'fence'], 'Adjective': ['good', 'new', 'sad'], 'Preposition': ['to', 'in', 'at']}\n", "\n", - "Rules: {'RelClause': [['RelPro', 'VP']], 'Adjs': [['Adjective'], ['Adjective', 'Adjs']], 'NP': [['Pronoun'], ['Name'], ['Noun'], ['Article', 'Noun'], ['Article', 'Adjs', 'Noun'], ['Digit'], ['NP', 'PP'], ['NP', 'RelClause']], 'S': [['NP', 'VP'], ['S', 'Conjuction', 'S']], 'VP': [['Verb'], ['VP', 'NP'], ['VP', 'Adjective'], ['VP', 'PP'], ['VP', 'Adverb']], 'PP': [['Preposition', 'NP']]}\n" + "Rules: {'RelClause': [['RelPro', 'VP']], 'Adjs': [['Adjective'], ['Adjective', 'Adjs']], 'NP': [['Pronoun'], ['Name'], ['Noun'], ['Article', 'Noun'], ['Article', 'Adjs', 'Noun'], ['Digit'], ['NP', 'PP'], ['NP', 'RelClause']], 'S': [['NP', 'VP'], ['S', 'Conjunction', 'S']], 'VP': [['Verb'], ['VP', 'NP'], ['VP', 'Adjective'], ['VP', 'PP'], ['VP', 'Adverb']], 'PP': [['Preposition', 'NP']]}\n" ] } ], @@ -233,14 +233,14 @@ " Name = \"john | mary | peter\",\n", " 
Article = \"the | a | an\",\n", " Preposition = \"to | in | at\",\n", - " Conjuction = \"and | or | but\",\n", + " Conjunction = \"and | or | but\",\n", " Digit = \"1 | 2 | 0\"\n", ")\n", "\n", "print(\"Lexicon\", lexicon)\n", "\n", "rules = Rules(\n", - " S = \"NP VP | S Conjuction S\",\n", + " S = \"NP VP | S Conjunction S\",\n", " NP = \"Pronoun | Name | Noun | Article Noun \\\n", " | Article Adjs Noun | Digit | NP PP | NP RelClause\",\n", " VP = \"Verb | VP NP | VP Adjective | VP PP | VP Adverb\",\n", @@ -393,9 +393,9 @@ "name": "stdout", "output_type": "stream", "text": [ - "Lexicon {'Noun': [('robot', 0.4), ('sheep', 0.4), ('fence', 0.2)], 'Name': [('john', 0.4), ('mary', 0.4), ('peter', 0.2)], 'Adverb': [('here', 0.6), ('lightly', 0.1), ('now', 0.3)], 'Digit': [('0', 0.35), ('1', 0.35), ('2', 0.3)], 'Adjective': [('good', 0.5), ('new', 0.2), ('sad', 0.3)], 'Pronoun': [('me', 0.3), ('you', 0.4), ('he', 0.3)], 'Article': [('the', 0.5), ('a', 0.25), ('an', 0.25)], 'Preposition': [('to', 0.4), ('in', 0.3), ('at', 0.3)], 'Verb': [('is', 0.5), ('say', 0.3), ('are', 0.2)], 'Conjuction': [('and', 0.5), ('or', 0.2), ('but', 0.3)], 'RelPro': [('that', 0.5), ('who', 0.3), ('which', 0.2)]}\n", + "Lexicon {'Noun': [('robot', 0.4), ('sheep', 0.4), ('fence', 0.2)], 'Name': [('john', 0.4), ('mary', 0.4), ('peter', 0.2)], 'Adverb': [('here', 0.6), ('lightly', 0.1), ('now', 0.3)], 'Digit': [('0', 0.35), ('1', 0.35), ('2', 0.3)], 'Adjective': [('good', 0.5), ('new', 0.2), ('sad', 0.3)], 'Pronoun': [('me', 0.3), ('you', 0.4), ('he', 0.3)], 'Article': [('the', 0.5), ('a', 0.25), ('an', 0.25)], 'Preposition': [('to', 0.4), ('in', 0.3), ('at', 0.3)], 'Verb': [('is', 0.5), ('say', 0.3), ('are', 0.2)], 'Conjunction': [('and', 0.5), ('or', 0.2), ('but', 0.3)], 'RelPro': [('that', 0.5), ('who', 0.3), ('which', 0.2)]}\n", "\n", - "Rules: {'S': [(['NP', 'VP'], 0.6), (['S', 'Conjuction', 'S'], 0.4)], 'RelClause': [(['RelPro', 'VP'], 1.0)], 'VP': [(['Verb'], 0.3), (['VP', 'NP'], 0.2), 
(['VP', 'Adjective'], 0.25), (['VP', 'PP'], 0.15), (['VP', 'Adverb'], 0.1)], 'Adjs': [(['Adjective'], 0.5), (['Adjective', 'Adjs'], 0.5)], 'PP': [(['Preposition', 'NP'], 1.0)], 'NP': [(['Pronoun'], 0.2), (['Name'], 0.05), (['Noun'], 0.2), (['Article', 'Noun'], 0.15), (['Article', 'Adjs', 'Noun'], 0.1), (['Digit'], 0.05), (['NP', 'PP'], 0.15), (['NP', 'RelClause'], 0.1)]}\n" + "Rules: {'S': [(['NP', 'VP'], 0.6), (['S', 'Conjunction', 'S'], 0.4)], 'RelClause': [(['RelPro', 'VP'], 1.0)], 'VP': [(['Verb'], 0.3), (['VP', 'NP'], 0.2), (['VP', 'Adjective'], 0.25), (['VP', 'PP'], 0.15), (['VP', 'Adverb'], 0.1)], 'Adjs': [(['Adjective'], 0.5), (['Adjective', 'Adjs'], 0.5)], 'PP': [(['Preposition', 'NP'], 1.0)], 'NP': [(['Pronoun'], 0.2), (['Name'], 0.05), (['Noun'], 0.2), (['Article', 'Noun'], 0.15), (['Article', 'Adjs', 'Noun'], 0.1), (['Digit'], 0.05), (['NP', 'PP'], 0.15), (['NP', 'RelClause'], 0.1)]}\n" ] } ], @@ -410,14 +410,14 @@ " Name = \"john [0.4] | mary [0.4] | peter [0.2]\",\n", " Article = \"the [0.5] | a [0.25] | an [0.25]\",\n", " Preposition = \"to [0.4] | in [0.3] | at [0.3]\",\n", - " Conjuction = \"and [0.5] | or [0.2] | but [0.3]\",\n", + " Conjunction = \"and [0.5] | or [0.2] | but [0.3]\",\n", " Digit = \"0 [0.35] | 1 [0.35] | 2 [0.3]\"\n", ")\n", "\n", "print(\"Lexicon\", lexicon)\n", "\n", "rules = ProbRules(\n", - " S = \"NP VP [0.6] | S Conjuction S [0.4]\",\n", + " S = \"NP VP [0.6] | S Conjunction S [0.4]\",\n", " NP = \"Pronoun [0.2] | Name [0.05] | Noun [0.2] | Article Noun [0.15] \\\n", " | Article Adjs Noun [0.1] | Digit [0.05] | NP PP [0.15] | NP RelClause [0.1]\",\n", " VP = \"Verb [0.3] | VP NP [0.2] | VP Adjective [0.25] | VP PP [0.15] | VP Adverb [0.1]\",\n", @@ -1034,7 +1034,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.5.3" + "version": "3.5.2" } }, "nbformat": 4, diff --git a/nlp.py b/nlp.py index f34d088b5..03aabf54b 100644 --- a/nlp.py +++ b/nlp.py @@ -5,6 +5,7 @@ import 
urllib.request import re + # ______________________________________________________________________________ # Grammars and Lexicons @@ -89,7 +90,7 @@ def ProbRules(**rules): rules[lhs] = [] rhs_separate = [alt.strip().split() for alt in rhs.split('|')] for r in rhs_separate: - prob = float(r[-1][1:-1]) # remove brackets, convert to float + prob = float(r[-1][1:-1]) # remove brackets, convert to float rhs_rule = (r[:-1], prob) rules[lhs].append(rhs_rule) @@ -106,7 +107,7 @@ def ProbLexicon(**rules): rules[lhs] = [] rhs_separate = [word.strip().split() for word in rhs.split('|')] for r in rhs_separate: - prob = float(r[-1][1:-1]) # remove brackets, convert to float + prob = float(r[-1][1:-1]) # remove brackets, convert to float word = r[:-1][0] rhs_rule = (word, prob) rules[lhs].append(rhs_rule) @@ -212,9 +213,9 @@ def __repr__(self): Lexicon(Adj='happy | handsome | hairy', N='man')) -E_Prob = ProbGrammar('E_Prob', # The Probabilistic Grammar from the notebook +E_Prob = ProbGrammar('E_Prob', # The Probabilistic Grammar from the notebook ProbRules( - S="NP VP [0.6] | S Conjuction S [0.4]", + S="NP VP [0.6] | S Conjunction S [0.4]", NP="Pronoun [0.2] | Name [0.05] | Noun [0.2] | Article Noun [0.15] \ | Article Adjs Noun [0.1] | Digit [0.05] | NP PP [0.15] | NP RelClause [0.1]", VP="Verb [0.3] | VP NP [0.2] | VP Adjective [0.25] | VP PP [0.15] | VP Adverb [0.1]", @@ -232,37 +233,47 @@ def __repr__(self): Name="john [0.4] | mary [0.4] | peter [0.2]", Article="the [0.5] | a [0.25] | an [0.25]", Preposition="to [0.4] | in [0.3] | at [0.3]", - Conjuction="and [0.5] | or [0.2] | but [0.3]", + Conjunction="and [0.5] | or [0.2] | but [0.3]", Digit="0 [0.35] | 1 [0.35] | 2 [0.3]" )) - - -E_Chomsky = Grammar('E_Prob_Chomsky', # A Grammar in Chomsky Normal Form - Rules( - S='NP VP', - NP='Article Noun | Adjective Noun', - VP='Verb NP | Verb Adjective', - ), - Lexicon( - Article='the | a | an', - Noun='robot | sheep | fence', - Adjective='good | new | sad', - Verb='is | say | are' 
- )) - -E_Prob_Chomsky = ProbGrammar('E_Prob_Chomsky', # A Probabilistic Grammar in CNF +E_Chomsky = Grammar('E_Prob_Chomsky', # A Grammar in Chomsky Normal Form + Rules( + S='NP VP', + NP='Article Noun | Adjective Noun', + VP='Verb NP | Verb Adjective', + ), + Lexicon( + Article='the | a | an', + Noun='robot | sheep | fence', + Adjective='good | new | sad', + Verb='is | say | are' + )) + +E_Prob_Chomsky = ProbGrammar('E_Prob_Chomsky', # A Probabilistic Grammar in CNF ProbRules( - S='NP VP [1]', - NP='Article Noun [0.6] | Adjective Noun [0.4]', - VP='Verb NP [0.5] | Verb Adjective [0.5]', + S='NP VP [1]', + NP='Article Noun [0.6] | Adjective Noun [0.4]', + VP='Verb NP [0.5] | Verb Adjective [0.5]', ), ProbLexicon( - Article='the [0.5] | a [0.25] | an [0.25]', - Noun='robot [0.4] | sheep [0.4] | fence [0.2]', - Adjective='good [0.5] | new [0.2] | sad [0.3]', - Verb='is [0.5] | say [0.3] | are [0.2]' + Article='the [0.5] | a [0.25] | an [0.25]', + Noun='robot [0.4] | sheep [0.4] | fence [0.2]', + Adjective='good [0.5] | new [0.2] | sad [0.3]', + Verb='is [0.5] | say [0.3] | are [0.2]' )) +E_Prob_Chomsky_ = ProbGrammar('E_Prob_Chomsky_', + ProbRules( + S='NP VP [1]', + NP='NP PP [0.4] | Noun Verb [0.6]', + PP='Preposition NP [1]', + VP='Verb NP [0.7] | VP PP [0.3]', + ), + ProbLexicon( + Noun='astronomers [0.18] | eyes [0.32] | stars [0.32] | telescopes [0.18]', + Verb='saw [0.5] | \'\' [0.5]', + Preposition='with [1]' + )) # ______________________________________________________________________________ @@ -270,9 +281,8 @@ def __repr__(self): class Chart: - """Class for parsing sentences using a chart data structure. - >>> chart = Chart(E0); + >>> chart = Chart(E0) >>> len(chart.parses('the stench is in 2 2')) 1 """ @@ -299,7 +309,7 @@ def parses(self, words, S='S'): def parse(self, words, S='S'): """Parse a list of words; according to the grammar. 
Leave results in the chart.""" - self.chart = [[] for i in range(len(words)+1)] + self.chart = [[] for i in range(len(words) + 1)] self.add_edge([0, 0, 'S_', [], [S]]) for i in range(len(words)): self.scanner(i, words[i]) @@ -321,7 +331,7 @@ def scanner(self, j, word): """For each edge expecting a word of this category here, extend the edge.""" for (i, j, A, alpha, Bb) in self.chart[j]: if Bb and self.grammar.isa(word, Bb[0]): - self.add_edge([i, j+1, A, alpha + [(Bb[0], word)], Bb[1:]]) + self.add_edge([i, j + 1, A, alpha + [(Bb[0], word)], Bb[1:]]) def predictor(self, edge): """Add to chart any rules for B that could help extend this edge.""" @@ -355,13 +365,13 @@ def CYK_parse(words, grammar): # Combine first and second parts of right-hand sides of rules, # from short to long. - for length in range(2, N+1): - for start in range(N-length+1): + for length in range(2, N + 1): + for start in range(N - length + 1): for len1 in range(1, length): # N.B. the book incorrectly has N instead of length len2 = length - len1 for (X, Y, Z, p) in grammar.cnf_rules(): P[X, start, length] = max(P[X, start, length], - P[Y, start, len1] * P[Z, start+len1, len2] * p) + P[Y, start, len1] * P[Z, start + len1, len2] * p) return P @@ -433,7 +443,7 @@ def onlyWikipediaURLS(urls): """Some example HTML page data is from wikipedia. This function converts relative wikipedia links to full wikipedia URLs""" wikiURLs = [url for url in urls if url.startswith('/wiki/')] - return ["https://en.wikipedia.org"+url for url in wikiURLs] + return ["https://en.wikipedia.org" + url for url in wikiURLs] # ______________________________________________________________________________ @@ -473,17 +483,18 @@ def normalize(pages): """Normalize divides each page's score by the sum of the squares of all pages' scores (separately for both the authority and hub scores). 
""" - summed_hub = sum(page.hub**2 for _, page in pages.items()) - summed_auth = sum(page.authority**2 for _, page in pages.items()) + summed_hub = sum(page.hub ** 2 for _, page in pages.items()) + summed_auth = sum(page.authority ** 2 for _, page in pages.items()) for _, page in pages.items(): - page.hub /= summed_hub**0.5 - page.authority /= summed_auth**0.5 + page.hub /= summed_hub ** 0.5 + page.authority /= summed_auth ** 0.5 class ConvergenceDetector(object): """If the hub and authority values of the pages are no longer changing, we have reached a convergence and further iterations will have no effect. This detects convergence so that we can stop the HITS algorithm as early as possible.""" + def __init__(self): self.hub_history = None self.auth_history = None @@ -497,10 +508,10 @@ def detect(self): if self.hub_history is None: self.hub_history, self.auth_history = [], [] else: - diffsHub = [abs(x-y) for x, y in zip(curr_hubs, self.hub_history[-1])] - diffsAuth = [abs(x-y) for x, y in zip(curr_auths, self.auth_history[-1])] - aveDeltaHub = sum(diffsHub)/float(len(pagesIndex)) - aveDeltaAuth = sum(diffsAuth)/float(len(pagesIndex)) + diffsHub = [abs(x - y) for x, y in zip(curr_hubs, self.hub_history[-1])] + diffsAuth = [abs(x - y) for x, y in zip(curr_auths, self.auth_history[-1])] + aveDeltaHub = sum(diffsHub) / float(len(pagesIndex)) + aveDeltaAuth = sum(diffsAuth) / float(len(pagesIndex)) if aveDeltaHub < 0.01 and aveDeltaAuth < 0.01: # may need tweaking return True if len(self.hub_history) > 2: # prevent list from getting long @@ -511,13 +522,13 @@ def detect(self): return False -def getInlinks(page): +def getInLinks(page): if not page.inlinks: page.inlinks = determineInlinks(page) return [addr for addr, p in pagesIndex.items() if addr in page.inlinks] -def getOutlinks(page): +def getOutLinks(page): if not page.outlinks: page.outlinks = findOutlinks(page) return [addr for addr, p in pagesIndex.items() if addr in page.outlinks] @@ -527,12 +538,12 @@ def 
getOutlinks(page): # HITS Algorithm class Page(object): - def __init__(self, address, inlinks=None, outlinks=None, hub=0, authority=0): + def __init__(self, address, inLinks=None, outLinks=None, hub=0, authority=0): self.address = address self.hub = hub self.authority = authority - self.inlinks = inlinks - self.outlinks = outlinks + self.inlinks = inLinks + self.outlinks = outLinks pagesContent = {} # maps Page relative or absolute URL/location to page's HTML content @@ -551,8 +562,8 @@ def HITS(query): hub = {p: pages[p].hub for p in pages} for p in pages: # p.authority ← ∑i Inlinki(p).Hub - pages[p].authority = sum(hub[x] for x in getInlinks(pages[p])) + pages[p].authority = sum(hub[x] for x in getInLinks(pages[p])) # p.hub ← ∑i Outlinki(p).Authority - pages[p].hub = sum(authority[x] for x in getOutlinks(pages[p])) + pages[p].hub = sum(authority[x] for x in getOutLinks(pages[p])) normalize(pages) return pages diff --git a/nlp4e.py b/nlp4e.py new file mode 100644 index 000000000..095f54357 --- /dev/null +++ b/nlp4e.py @@ -0,0 +1,523 @@ +"""Natural Language Processing (Chapter 22)""" + +from collections import defaultdict +from utils4e import weighted_choice +import copy +import operator +import heapq +from search import Problem + + +# ______________________________________________________________________________ +# 22.2 Grammars + + +def Rules(**rules): + """Create a dictionary mapping symbols to alternative sequences. + >>> Rules(A = "B C | D E") + {'A': [['B', 'C'], ['D', 'E']]} + """ + for (lhs, rhs) in rules.items(): + rules[lhs] = [alt.strip().split() for alt in rhs.split('|')] + return rules + + +def Lexicon(**rules): + """Create a dictionary mapping symbols to alternative words. 
+ >>> Lexicon(Article = "the | a | an") + {'Article': ['the', 'a', 'an']} + """ + for (lhs, rhs) in rules.items(): + rules[lhs] = [word.strip() for word in rhs.split('|')] + return rules + + +class Grammar: + + def __init__(self, name, rules, lexicon): + """A grammar has a set of rules and a lexicon.""" + self.name = name + self.rules = rules + self.lexicon = lexicon + self.categories = defaultdict(list) + for lhs in lexicon: + for word in lexicon[lhs]: + self.categories[word].append(lhs) + + def rewrites_for(self, cat): + """Return a sequence of possible rhs's that cat can be rewritten as.""" + return self.rules.get(cat, ()) + + def isa(self, word, cat): + """Return True iff word is of category cat""" + return cat in self.categories[word] + + def cnf_rules(self): + """Returns the tuple (X, Y, Z) for rules in the form: + X -> Y Z""" + cnf = [] + for X, rules in self.rules.items(): + for (Y, Z) in rules: + cnf.append((X, Y, Z)) + + return cnf + + def generate_random(self, S='S'): + """Replace each token in S by a random entry in grammar (recursively).""" + import random + + def rewrite(tokens, into): + for token in tokens: + if token in self.rules: + rewrite(random.choice(self.rules[token]), into) + elif token in self.lexicon: + into.append(random.choice(self.lexicon[token])) + else: + into.append(token) + return into + + return ' '.join(rewrite(S.split(), [])) + + def __repr__(self): + return '<Grammar {}>'.format(self.name) + + +def ProbRules(**rules): + """Create a dictionary mapping symbols to alternative sequences, + with probabilities.
+ >>> ProbRules(A = "B C [0.3] | D E [0.7]") + {'A': [(['B', 'C'], 0.3), (['D', 'E'], 0.7)]} + """ + for (lhs, rhs) in rules.items(): + rules[lhs] = [] + rhs_separate = [alt.strip().split() for alt in rhs.split('|')] + for r in rhs_separate: + prob = float(r[-1][1:-1]) # remove brackets, convert to float + rhs_rule = (r[:-1], prob) + rules[lhs].append(rhs_rule) + + return rules + + +def ProbLexicon(**rules): + """Create a dictionary mapping symbols to alternative words, + with probabilities. + >>> ProbLexicon(Article = "the [0.5] | a [0.25] | an [0.25]") + {'Article': [('the', 0.5), ('a', 0.25), ('an', 0.25)]} + """ + for (lhs, rhs) in rules.items(): + rules[lhs] = [] + rhs_separate = [word.strip().split() for word in rhs.split('|')] + for r in rhs_separate: + prob = float(r[-1][1:-1]) # remove brackets, convert to float + word = r[:-1][0] + rhs_rule = (word, prob) + rules[lhs].append(rhs_rule) + + return rules + + +class ProbGrammar: + + def __init__(self, name, rules, lexicon): + """A grammar has a set of rules and a lexicon. + Each rule has a probability.""" + self.name = name + self.rules = rules + self.lexicon = lexicon + self.categories = defaultdict(list) + + for lhs in lexicon: + for word, prob in lexicon[lhs]: + self.categories[word].append((lhs, prob)) + + def rewrites_for(self, cat): + """Return a sequence of possible rhs's that cat can be rewritten as.""" + return self.rules.get(cat, ()) + + def isa(self, word, cat): + """Return True iff word is of category cat""" + return cat in [c for c, _ in self.categories[word]] + + def cnf_rules(self): + """Returns the tuple (X, Y, Z, p) for rules in the form: + X -> Y Z [p]""" + cnf = [] + for X, rules in self.rules.items(): + for (Y, Z), p in rules: + cnf.append((X, Y, Z, p)) + + return cnf + + def generate_random(self, S='S'): + """Replace each token in S by a random entry in grammar (recursively). 
+ Returns a tuple of (sentence, probability).""" + + def rewrite(tokens, into): + for token in tokens: + if token in self.rules: + non_terminal, prob = weighted_choice(self.rules[token]) + into[1] *= prob + rewrite(non_terminal, into) + elif token in self.lexicon: + terminal, prob = weighted_choice(self.lexicon[token]) + into[0].append(terminal) + into[1] *= prob + else: + into[0].append(token) + return into + + rewritten_as, prob = rewrite(S.split(), [[], 1]) + return (' '.join(rewritten_as), prob) + + def __repr__(self): + return '<ProbGrammar {}>'.format(self.name) + + +E0 = Grammar('E0', + Rules( # Grammar for E_0 [Figure 22.2] + S='NP VP | S Conjunction S', + NP='Pronoun | Name | Noun | Article Noun | Digit Digit | NP PP | NP RelClause', + VP='Verb | VP NP | VP Adjective | VP PP | VP Adverb', + PP='Preposition NP', + RelClause='That VP'), + + Lexicon( # Lexicon for E_0 [Figure 22.3] + Noun="stench | breeze | glitter | nothing | wumpus | pit | pits | gold | east", + Verb="is | see | smell | shoot | fell | stinks | go | grab | carry | kill | turn | feel", # noqa + Adjective="right | left | east | south | back | smelly | dead", + Adverb="here | there | nearby | ahead | right | left | east | south | back", + Pronoun="me | you | I | it", + Name="John | Mary | Boston | Aristotle", + Article="the | a | an", + Preposition="to | in | on | near", + Conjunction="and | or | but", + Digit="0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9", + That="that" + )) + +E_ = Grammar('E_', # Trivial Grammar and lexicon for testing + Rules( + S='NP VP', + NP='Art N | Pronoun', + VP='V NP'), + + Lexicon( + Art='the | a', + N='man | woman | table | shoelace | saw', + Pronoun='I | you | it', + V='saw | liked | feel' + )) + +E_NP_ = Grammar('E_NP_', # Another Trivial Grammar for testing + Rules(NP='Adj NP | N'), + Lexicon(Adj='happy | handsome | hairy', + N='man')) + +E_Prob = ProbGrammar('E_Prob', # The Probabilistic Grammar from the notebook + ProbRules( + S="NP VP [0.6] | S Conjunction S [0.4]", + NP="Pronoun
[0.2] | Name [0.05] | Noun [0.2] | Article Noun [0.15] \ + | Article Adjs Noun [0.1] | Digit [0.05] | NP PP [0.15] | NP RelClause [0.1]", + VP="Verb [0.3] | VP NP [0.2] | VP Adjective [0.25] | VP PP [0.15] | VP Adverb [0.1]", + Adjs="Adjective [0.5] | Adjective Adjs [0.5]", + PP="Preposition NP [1]", + RelClause="RelPro VP [1]" + ), + ProbLexicon( + Verb="is [0.5] | say [0.3] | are [0.2]", + Noun="robot [0.4] | sheep [0.4] | fence [0.2]", + Adjective="good [0.5] | new [0.2] | sad [0.3]", + Adverb="here [0.6] | lightly [0.1] | now [0.3]", + Pronoun="me [0.3] | you [0.4] | he [0.3]", + RelPro="that [0.5] | who [0.3] | which [0.2]", + Name="john [0.4] | mary [0.4] | peter [0.2]", + Article="the [0.5] | a [0.25] | an [0.25]", + Preposition="to [0.4] | in [0.3] | at [0.3]", + Conjunction="and [0.5] | or [0.2] | but [0.3]", + Digit="0 [0.35] | 1 [0.35] | 2 [0.3]" + )) + +E_Chomsky = Grammar('E_Prob_Chomsky', # A Grammar in Chomsky Normal Form + Rules( + S='NP VP', + NP='Article Noun | Adjective Noun', + VP='Verb NP | Verb Adjective', + ), + Lexicon( + Article='the | a | an', + Noun='robot | sheep | fence', + Adjective='good | new | sad', + Verb='is | say | are' + )) + +E_Prob_Chomsky = ProbGrammar('E_Prob_Chomsky', # A Probabilistic Grammar in CNF + ProbRules( + S='NP VP [1]', + NP='Article Noun [0.6] | Adjective Noun [0.4]', + VP='Verb NP [0.5] | Verb Adjective [0.5]', + ), + ProbLexicon( + Article='the [0.5] | a [0.25] | an [0.25]', + Noun='robot [0.4] | sheep [0.4] | fence [0.2]', + Adjective='good [0.5] | new [0.2] | sad [0.3]', + Verb='is [0.5] | say [0.3] | are [0.2]' + )) +E_Prob_Chomsky_ = ProbGrammar('E_Prob_Chomsky_', + ProbRules( + S='NP VP [1]', + NP='NP PP [0.4] | Noun Verb [0.6]', + PP='Preposition NP [1]', + VP='Verb NP [0.7] | VP PP [0.3]', + ), + ProbLexicon( + Noun='astronomers [0.18] | eyes [0.32] | stars [0.32] | telescopes [0.18]', + Verb='saw [0.5] | \'\' [0.5]', + Preposition='with [1]' + )) + + +# 
______________________________________________________________________________ +# 22.3 Parsing + + +class Chart: + """Class for parsing sentences using a chart data structure. + >>> chart = Chart(E0) + >>> len(chart.parses('the stench is in 2 2')) + 1 + """ + + def __init__(self, grammar, trace=False): + """A datastructure for parsing a string; and methods to do the parse. + self.chart[i] holds the edges that end just before the i'th word. + Edges are 5-element lists of [start, end, lhs, [found], [expects]].""" + self.grammar = grammar + self.trace = trace + + def parses(self, words, S='S'): + """Return a list of parses; words can be a list or string.""" + if isinstance(words, str): + words = words.split() + self.parse(words, S) + # Return all the parses that span the whole input + # 'span the whole input' => begin at 0, end at len(words) + return [[i, j, S, found, []] + for (i, j, lhs, found, expects) in self.chart[len(words)] + # assert j == len(words) + if i == 0 and lhs == S and expects == []] + + def parse(self, words, S='S'): + """Parse a list of words; according to the grammar. 
+ Leave results in the chart.""" + self.chart = [[] for i in range(len(words) + 1)] + self.add_edge([0, 0, 'S_', [], [S]]) + for i in range(len(words)): + self.scanner(i, words[i]) + return self.chart + + def add_edge(self, edge): + """Add edge to chart, and see if it extends or predicts another edge.""" + start, end, lhs, found, expects = edge + if edge not in self.chart[end]: + self.chart[end].append(edge) + if self.trace: + print('Chart: added {}'.format(edge)) + if not expects: + self.extender(edge) + else: + self.predictor(edge) + + def scanner(self, j, word): + """For each edge expecting a word of this category here, extend the edge.""" + for (i, j, A, alpha, Bb) in self.chart[j]: + if Bb and self.grammar.isa(word, Bb[0]): + self.add_edge([i, j + 1, A, alpha + [(Bb[0], word)], Bb[1:]]) + + def predictor(self, edge): + """Add to chart any rules for B that could help extend this edge.""" + (i, j, A, alpha, Bb) = edge + B = Bb[0] + if B in self.grammar.rules: + for rhs in self.grammar.rewrites_for(B): + self.add_edge([j, j, B, [], rhs]) + + def extender(self, edge): + """See what edges can be extended by this edge.""" + (j, k, B, _, _) = edge + for (i, j, A, alpha, B1b) in self.chart[j]: + if B1b and B == B1b[0]: + self.add_edge([i, k, A, alpha + [edge], B1b[1:]]) + + +# ______________________________________________________________________________ +# CYK Parsing + + +class Tree: + def __init__(self, root, *args): + self.root = root + self.leaves = [leaf for leaf in args] + + +def CYK_parse(words, grammar): + """ [Figure 22.6] """ + # We use 0-based indexing instead of the book's 1-based. + P = defaultdict(float) + T = defaultdict(Tree) + + # Insert lexical categories for each word. 
+ for (i, word) in enumerate(words): + for (X, p) in grammar.categories[word]: + P[X, i, i] = p + T[X, i, i] = Tree(X, word) + + # Construct X(i:k) from Y(i:j) and Z(j+1:k), shortest span first + for i, j, k in subspan(len(words)): + for (X, Y, Z, p) in grammar.cnf_rules(): + PYZ = P[Y, i, j] * P[Z, j + 1, k] * p + if PYZ > P[X, i, k]: + P[X, i, k] = PYZ + T[X, i, k] = Tree(X, T[Y, i, j], T[Z, j + 1, k]) + + return T + + +def subspan(N): + """Yield all (i, j, k) triples covering a span (i, k) with i <= j < k, + 0-based to match the indexing used in CYK_parse.""" + for length in range(2, N + 1): + for i in range(N + 1 - length): + k = i + length - 1 + for j in range(i, k): + yield (i, j, k) + + +# using search algorithms in the searching part + + +class TextParsingProblem(Problem): + def __init__(self, initial, grammar, goal='S'): + """ + :param initial: the initial state of words in a list. + :param grammar: a grammar object + :param goal: the goal state, usually S + """ + super(TextParsingProblem, self).__init__(initial, goal) + self.grammar = grammar + self.combinations = defaultdict(list) # maps each rhs (joined as a string) to the lhs symbols producing it + # backward lookup of rules + for rule in grammar.rules: + for comb in grammar.rules[rule]: + self.combinations[' '.join(comb)].append(rule) + + def actions(self, state): + actions = [] + categories = self.grammar.categories + # first replace each word with its possible lexical categories + for i in range(len(state)): + word = state[i] + if word in categories: + for X in categories[word]: + state[i] = X + actions.append(copy.copy(state)) + state[i] = word + # once all words are replaced by categories, reduce sequences of categories using the rules.
+ if not actions: + for start in range(len(state)): + for end in range(start, len(state) + 1): + # try combinations between (start, end) + articles = ' '.join(state[start:end]) + for c in self.combinations[articles]: + actions.append(state[:start] + [c] + state[end:]) + return actions + + def result(self, state, action): + return action + + def h(self, state): + # heuristic function + return len(state) + + +def astar_search_parsing(words, grammar): + """bottom-up parsing using A* search to find whether a list of words is a sentence""" + # init the problem + problem = TextParsingProblem(words, grammar, 'S') + state = problem.initial + # init the searching frontier + frontier = [(len(state) + problem.h(state), state)] + heapq.heapify(frontier) + + while frontier: + # search the frontier node with lowest cost first + cost, state = heapq.heappop(frontier) + actions = problem.actions(state) + for action in actions: + new_state = problem.result(state, action) + # update the new frontier node to the frontier + if new_state == [problem.goal]: + return problem.goal + if new_state != state: + heapq.heappush(frontier, (len(new_state) + problem.h(new_state), new_state)) + return False + + +def beam_search_parsing(words, grammar, b=3): + """bottom-up text parsing using beam search""" + # init problem + problem = TextParsingProblem(words, grammar, 'S') + # init frontier + frontier = [(len(problem.initial), problem.initial)] + heapq.heapify(frontier) + + # explore the current frontier and keep b new states with lowest cost + def explore(frontier): + new_frontier = [] + for cost, state in frontier: + # expand the possible children states of current state + if not problem.goal_test(' '.join(state)): + actions = problem.actions(state) + for action in actions: + new_state = problem.result(state, action) + if [len(new_state), new_state] not in new_frontier and new_state != state: + new_frontier.append([len(new_state), new_state]) + else: + return problem.goal + heapq.heapify(new_frontier)
+ # only keep b states + return heapq.nsmallest(b, new_frontier) + + while frontier: + frontier = explore(frontier) + if frontier == problem.goal: + return frontier + return False + + +# ______________________________________________________________________________ +# 22.4 Augmented Grammar + + +g = Grammar("arithmetic_expression", # A Grammar of Arithmetic Expression + rules={ + 'Number_0': 'Digit_0', 'Number_1': 'Digit_1', 'Number_2': 'Digit_2', + 'Number_10': 'Number_1 Digit_0', 'Number_11': 'Number_1 Digit_1', + 'Number_100': 'Number_10 Digit_0', + 'Exp_5': ['Number_5', '( Exp_5 )', 'Exp_1, Operator_+ Exp_4', 'Exp_2, Operator_+ Exp_3', + 'Exp_0, Operator_+ Exp_5', 'Exp_3, Operator_+ Exp_2', 'Exp_4, Operator_+ Exp_1', + 'Exp_5, Operator_+ Exp_0', 'Exp_1, Operator_* Exp_5'], # more possible combinations + 'Operator_+': operator.add, 'Operator_-': operator.sub, 'Operator_*': operator.mul, + 'Operator_/': operator.truediv, + 'Digit_0': 0, 'Digit_1': 1, 'Digit_2': 2, 'Digit_3': 3, 'Digit_4': 4 + }, + lexicon={}) + +g2 = Grammar("Ali loves Bob", # An example grammar for the Ali-loves-Bob sentence + rules={ + "S_loves_ali_bob": "NP_ali, VP_x_loves_x_bob", "S_loves_bob_ali": "NP_bob, VP_x_loves_x_ali", + "VP_x_loves_x_bob": "Verb_xy_loves_xy NP_bob", "VP_x_loves_x_ali": "Verb_xy_loves_xy NP_ali", + "NP_bob": "Name_bob", "NP_ali": "Name_ali" + }, + lexicon={ + "Name_ali": "Ali", "Name_bob": "Bob", "Verb_xy_loves_xy": "loves" + }) diff --git a/nlp_apps.ipynb b/nlp_apps.ipynb index 016a53bd6..2f4796b7a 100644 --- a/nlp_apps.ipynb +++ b/nlp_apps.ipynb @@ -15,14 +15,17 @@ "source": [ "## CONTENTS\n", "\n", - "* Language Recognition" + "* Language Recognition\n", + "* Author Recognition\n", + "* The Federalist Papers\n", + "* Text Classification" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## LANGUAGE RECOGNITION\n", + "# LANGUAGE RECOGNITION\n", "\n", "A very useful application of text models (you can read more on them on the [`text
notebook`](https://github.com/aimacode/aima-python/blob/master/text.ipynb)) is categorizing text into a language. In fact, with enough data we can categorize correctly mostly any text. That is because different languages have certain characteristics that set them apart. For example, in German it is very usual for 'c' to be followed by 'h' while in English we see 't' followed by 'h' a lot.\n", "\n", @@ -36,9 +39,7 @@ { "cell_type": "code", "execution_count": 1, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "from utils import open_data\n", @@ -49,7 +50,7 @@ "\n", "P_flatland = NgramCharModel(2, wordseq)\n", "\n", - "faust = open_data(\"faust.txt\").read()\n", + "faust = open_data(\"GE-text/faust.txt\").read()\n", "wordseq = words(faust)\n", "\n", "P_faust = NgramCharModel(2, wordseq)" @@ -67,9 +68,7 @@ { "cell_type": "code", "execution_count": 2, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "from learning import NaiveBayesLearner\n", @@ -91,9 +90,7 @@ { "cell_type": "code", "execution_count": 3, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "def recognize(sentence, nBS, n):\n", @@ -106,6 +103,8 @@ " for b, p in P_sentence.dictionary.items():\n", " ngrams += [b]*p\n", " \n", + " print(ngrams)\n", + " \n", " return nBS(ngrams)" ] }, @@ -121,6 +120,13 @@ "execution_count": 4, "metadata": {}, "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[(' ', 'i'), ('i', 'c'), ('c', 'h'), (' ', 'b'), ('b', 'i'), ('i', 'n'), ('i', 'n'), (' ', 'e'), ('e', 'i'), (' ', 'p'), ('p', 'l'), ('l', 'a'), ('a', 't'), ('t', 'z')]\n" + ] + }, { "data": { "text/plain": [ @@ -141,6 +147,13 @@ "execution_count": 5, "metadata": {}, "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[(' ', 't'), ('t', 'u'), ('u', 'r'), ('r', 't'), ('t', 'l'), ('l', 'e'), ('e', 's'), (' ', 'f'), ('f', 'l'), ('l', 'y'), (' ', 'h'), ('h', 'i'), 
('i', 'g'), ('g', 'h')]\n" + ] + }, { "data": { "text/plain": [ @@ -161,6 +174,13 @@ "execution_count": 6, "metadata": {}, "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[(' ', 'd'), ('d', 'e'), ('e', 'r'), ('e', 'r'), (' ', 'p'), ('p', 'e'), ('e', 'l'), ('l', 'i'), ('i', 'k'), ('k', 'a'), ('a', 'n'), (' ', 'i'), ('i', 's'), ('s', 't'), (' ', 'h'), ('h', 'i'), ('i', 'e')]\n" + ] + }, { "data": { "text/plain": [ @@ -181,6 +201,13 @@ "execution_count": 7, "metadata": {}, "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[(' ', 'a'), ('a', 'n'), ('n', 'd'), (' ', 't'), (' ', 't'), ('t', 'h'), ('t', 'h'), ('h', 'u'), ('u', 's'), ('h', 'e'), (' ', 'w'), ('w', 'i'), ('i', 'z'), ('z', 'a'), ('a', 'r'), ('r', 'd'), (' ', 's'), ('s', 'p'), ('p', 'o'), ('o', 'k'), ('k', 'e')]\n" + ] + }, { "data": { "text/plain": [ @@ -202,6 +229,789 @@ "source": [ "You can add more languages if you want, the algorithm works for as many as you like! Also, you can play around with *n*. Here we used 2, but other numbers work too (even though 2 suffices). The algorithm is not perfect, but it has high accuracy even for small samples like the ones we used. That is because English and German are very different languages. The closer together languages are (for example, Norwegian and Swedish share a lot of common ground) the lower the accuracy of the classifier." ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## AUTHOR RECOGNITION\n", + "\n", + "Another similar application to language recognition is recognizing who is more likely to have written a sentence, given text written by them. Here we will try and predict text from Edwin Abbott and Jane Austen. 
They wrote *Flatland* and *Pride and Prejudice* respectively.\n", + "\n", + "We are optimistic we can determine who wrote what based on the fact that Abbott wrote his novella at a much later date than Austen, which means there will be linguistic differences between the two works. Indeed, *Flatland* uses more modern and direct language while *Pride and Prejudice* is written in a more archaic tone containing more sophisticated wording.\n", + "\n", + "As with Language Recognition, we will first import the two datasets. This time though we are not looking for connections between characters, since that wouldn't give very good results. Why? Because both authors use English, and English follows a set of patterns, as we showed earlier. Trying to determine authorship based on these patterns would not be very efficient.\n", + "\n", + "Instead, we will abstract our querying to a higher level. We will use words instead of characters. That way we can more accurately pick out the differences between their writing styles and thus have a better chance at guessing the correct author.\n", + "\n", + "Let's go right ahead and import our data:" + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": {}, + "outputs": [], + "source": [ + "from utils import open_data\n", + "from text import *\n", + "\n", + "flatland = open_data(\"EN-text/flatland.txt\").read()\n", + "wordseq = words(flatland)\n", + "\n", + "P_Abbott = UnigramWordModel(wordseq, 5)\n", + "\n", + "pride = open_data(\"EN-text/pride.txt\").read()\n", + "wordseq = words(pride)\n", + "\n", + "P_Austen = UnigramWordModel(wordseq, 5)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This time we set the `default` parameter of the model to 5, instead of 0.
If we leave it at 0, then when we get a sentence containing a word we have not seen from that particular author, the chance of that sentence coming from that author is exactly 0 (since to get the probability, we multiply all the separate probabilities; if one is 0 then the result is also 0). To avoid that, we tell the model to use a default count of 5 for any word it has not seen before.\n", + "\n", + "Next we will build the Naive Bayes Classifier:" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": {}, + "outputs": [], + "source": [ + "from learning import NaiveBayesLearner\n", + "\n", + "dist = {('Abbott', 1): P_Abbott, ('Austen', 1): P_Austen}\n", + "\n", + "nBS = NaiveBayesLearner(dist, simple=True)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now that we have built our classifier, we will start classifying. First, we need to convert the given sentence to the format the classifier needs. That is, a list of words." + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": {}, + "outputs": [], + "source": [ + "def recognize(sentence, nBS):\n", + " sentence = sentence.lower()\n", + " sentence_words = words(sentence)\n", + " \n", + " return nBS(sentence_words)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "First we will input a sentence that is something Abbott would write. Note the use of 'square' and the simpler language." + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "'Abbott'" + ] + }, + "execution_count": 11, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "recognize(\"the square is mad\", nBS)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The classifier correctly guessed Abbott.\n", + "\n", + "Next we will input a more sophisticated sentence, similar to the style of Austen."
+ ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "'Austen'" + ] + }, + "execution_count": 12, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "recognize(\"a most peculiar acquaintance\", nBS)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The classifier guessed correctly again.\n", + "\n", + "You can try more sentences on your own. Unfortunately though, since the datasets are pretty small, chances are the guesses will not always be correct." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## THE FEDERALIST PAPERS\n", + "\n", + "Let's now take a look at a harder problem, classifying the authors of the [Federalist Papers](https://en.wikipedia.org/wiki/The_Federalist_Papers). The *Federalist Papers* are a series of papers written by Alexander Hamilton, James Madison and John Jay towards establishing the United States Constitution.\n", + "\n", + "What is interesting about these papers is that they were all written under a pseudonym, \"Publius\", to keep the identity of the authors a secret. Only after Hamilton's death, when a list was found written by him detailing the authorship of the papers, did the rest of the world learn what papers each of the authors wrote. After the list was published, Madison chimed in to make a couple of corrections: Hamilton, Madison said, hastily wrote down the list and assigned some papers to the wrong author!\n", + "\n", + "Here we will try and find out who really wrote these mysterious papers.\n", + "\n", + "To solve this we will learn from the undisputed papers to predict the disputed ones. 
First, let's read the texts from the file:" + ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": {}, + "outputs": [], + "source": [ + "from utils import open_data\n", + "from text import *\n", + "\n", + "federalist = open_data(\"EN-text/federalist.txt\").read()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's see how the text looks. We will print the first 500 characters:" + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "'The Project Gutenberg EBook of The Federalist Papers, by \\nAlexander Hamilton and John Jay and James Madison\\n\\nThis eBook is for the use of anyone anywhere at no cost and with\\nalmost no restrictions whatsoever. You may copy it, give it away or\\nre-use it under the terms of the Project Gutenberg License included\\nwith this eBook or online at www.gutenberg.net\\n\\n\\nTitle: The Federalist Papers\\n\\nAuthor: Alexander Hamilton\\n John Jay\\n James Madison\\n\\nPosting Date: December 12, 2011 [EBook #18]'" + ] + }, + "execution_count": 14, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "federalist[:500]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "It seems that the text file opens with a license agreement, hardly useful in our case. In fact, the license spans 113 words, while there is also a licensing agreement at the end of the file, which spans 3098 words. We need to remove them. To do so, we will first convert the text into words, to make our lives easier." 
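As an aside, the `words` helper from `text.py` used below is essentially a lowercasing regex tokenizer. A minimal stand-alone approximation (an illustration, not the exact library code) can be sketched as:

```python
import re

def words(text):
    """Split text into lowercase alphanumeric tokens, discarding punctuation."""
    return re.findall(r"[a-zA-Z0-9]+", text.lower())

print(words("You may copy it, give it away or re-use it!"))
# ['you', 'may', 'copy', 'it', 'give', 'it', 'away', 'or', 're', 'use', 'it']
```

Note that hyphenated words are split into pieces, which is one reason word counts like the ones below are approximate.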
+ ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": {}, + "outputs": [], + "source": [ + "wordseq = words(federalist)\n", + "wordseq = wordseq[114:-3098]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's now take a look at the first 100 words:" + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "'federalist no 1 general introduction for the independent journal hamilton to the people of the state of new york after an unequivocal experience of the inefficacy of the subsisting federal government you are called upon to deliberate on a new constitution for the united states of america the subject speaks its own importance comprehending in its consequences nothing less than the existence of the union the safety and welfare of the parts of which it is composed the fate of an empire in many respects the most interesting in the world it has been frequently remarked that it seems to'" + ] + }, + "execution_count": 16, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "' '.join(wordseq[:100])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Much better.\n", + "\n", + "As with any Natural Language Processing problem, it is prudent to do some text pre-processing and clean our data before we start building our model. Remember that all the papers are signed as 'Publius', so we can safely remove that word, since it doesn't give us any information as to the real author.\n", + "\n", + "NOTE: Since we are only removing a single word from each paper, this step can be skipped. We add it here to show that processing the data in our hands is something we should always be considering. Oftentimes pre-processing the data in just the right way is the difference between a robust model and a flimsy one." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": {}, + "outputs": [], + "source": [ + "wordseq = [w for w in wordseq if w != 'publius']" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now we have to separate the text from a block of words into papers and assign them to their authors. We can see that each paper starts with the word 'federalist', so we will split the text on that word.\n", + "\n", + "The disputed papers are the papers from 49 to 58, from 18 to 20 and paper 64. We want to leave these papers unassigned. Also, note that there are two versions of paper 70; both from Hamilton.\n", + "\n", + "Finally, to keep the implementation intuitive, we add a `None` object at the start of the `papers` list to make the list index match up with the paper numbering (for example, `papers[5]` now corresponds to paper no. 5 instead of paper no. 6 under Python's 0-indexing)." + ] + }, + { + "cell_type": "code", + "execution_count": 18, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "(4, 16, 52)" + ] + }, + "execution_count": 18, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "import re\n", + "\n", + "papers = re.split(r'federalist\\s', ' '.join(wordseq))\n", + "papers = [p for p in papers if p not in ['', ' ']]\n", + "papers = [None] + papers\n", + "\n", + "disputed = list(range(49, 58+1)) + [18, 19, 20, 64]\n", + "jay, madison, hamilton = [], [], []\n", + "for i, p in enumerate(papers):\n", + " if i in disputed or i == 0:\n", + " continue\n", + " \n", + " if 'jay' in p:\n", + " jay.append(p)\n", + " elif 'madison' in p:\n", + " madison.append(p)\n", + " else:\n", + " hamilton.append(p)\n", + "\n", + "len(jay), len(madison), len(hamilton)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As we can see, from the undisputed papers Jay wrote 4, Madison 16 and Hamilton 51 (+1 duplicate of paper 70). Let's now build our word models.
The Unigram Word Model again will come in handy." + ] + }, + { + "cell_type": "code", + "execution_count": 19, + "metadata": {}, + "outputs": [], + "source": [ + "hamilton = ''.join(hamilton)\n", + "hamilton_words = words(hamilton)\n", + "P_hamilton = UnigramWordModel(hamilton_words, default=1)\n", + "\n", + "madison = ''.join(madison)\n", + "madison_words = words(madison)\n", + "P_madison = UnigramWordModel(madison_words, default=1)\n", + "\n", + "jay = ''.join(jay)\n", + "jay_words = words(jay)\n", + "P_jay = UnigramWordModel(jay_words, default=1)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now it is time to build our new Naive Bayes Learner. It is very similar to the one found in `learning.py`, but with an important difference: it doesn't classify an example, but instead returns the probability of the example belonging to each class. This will allow us to not only see to whom a paper belongs to, but also the probability of authorship as well. \n", + "We will build two versions of Learners, one will multiply probabilities as is and other will add the logarithms of them.\n", + "\n", + "Finally, since we are dealing with long text and the string of probability multiplications is long, we will end up with the results being rounded to 0 due to floating point underflow. 
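Why the straightforward product underflows can be seen with a standalone sketch (independent of the notebook code): a double-precision float cannot represent the product of a few hundred small probabilities, while the sum of their logarithms stays in a comfortable range.

```python
import math

# 100 word probabilities of 1/10000 each: the true product is 1e-400,
# far below the smallest positive double (~5e-324), so it underflows to 0.0.
probs = [1e-4] * 100
product = 1.0
for p in probs:
    product *= p

# Summing logarithms preserves the same comparison, without underflow.
log_sum = sum(math.log(p) for p in probs)

print(product)  # 0.0 -- all information lost
print(log_sum)  # approximately -921.03
```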
To work around this problem we will use the built-in Python library `decimal`, which allows us to set the decimal precision much higher than normal.\n", + "\n", + "Note that the logarithmic learner will compute a negative likelihood since the logarithm of values less than 1 will be negative.\n", + "Thus, the author whose value has the smallest magnitude is the most likely to have written that paper.\n", + "\n" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": {}, + "outputs": [], + "source": [ + "import random\n", + "import decimal\n", + "import math\n", + "from decimal import Decimal\n", + "\n", + "decimal.getcontext().prec = 100\n", + "\n", + "def precise_product(numbers):\n", + " result = 1\n", + " for x in numbers:\n", + " result *= Decimal(x)\n", + " return result\n", + "\n", + "def log_product(numbers):\n", + " result = 0.0\n", + " for x in numbers:\n", + " result += math.log(x)\n", + " return result\n", + "\n", + "def NaiveBayesLearner(dist):\n", + " \"\"\"A simple naive bayes classifier that takes as input a dictionary of\n", + " Counter distributions and can then be used to find the probability\n", + " of a given item belonging to each class.\n", + " The input dictionary is in the following form:\n", + " ClassName: Counter\"\"\"\n", + " attr_dist = {c_name: count_prob for c_name, count_prob in dist.items()}\n", + "\n", + " def predict(example):\n", + " \"\"\"Predict the probabilities for each class.\"\"\"\n", + " def class_prob(target, e):\n", + " attr = attr_dist[target]\n", + " return precise_product([attr[a] for a in e])\n", + "\n", + " pred = {t: class_prob(t, example) for t in dist.keys()}\n", + "\n", + " total = sum(pred.values())\n", + " for k, v in pred.items():\n", + " pred[k] = v / total\n", + "\n", + " return pred\n", + "\n", + " return predict\n", + "\n", + "def NaiveBayesLearnerLog(dist):\n", + " \"\"\"A simple naive bayes classifier that takes as input a dictionary of\n", + " Counter distributions and can then be used to 
find the probability\n", + " of a given item belonging to each class. It will compute the likelihood by adding the logarithms of probabilities.\n", + " The input dictionary is in the following form:\n", + " ClassName: Counter\"\"\"\n", + " attr_dist = {c_name: count_prob for c_name, count_prob in dist.items()}\n", + "\n", + " def predict(example):\n", + " \"\"\"Predict the probabilities for each class.\"\"\"\n", + " def class_prob(target, e):\n", + " attr = attr_dist[target]\n", + " return log_product([attr[a] for a in e])\n", + "\n", + " pred = {t: class_prob(t, example) for t in dist.keys()}\n", + "\n", + " total = -sum(pred.values())\n", + " for k, v in pred.items():\n", + " pred[k] = v/total\n", + "\n", + " return pred\n", + "\n", + " return predict\n", + "\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next we will build our Learner. Note that even though Hamilton wrote the most papers, that doesn't make it more probable that he wrote the rest, so all the class probabilities will be equal. We can change them if we have some external knowledge, which for this tutorial we do not have." + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": {}, + "outputs": [], + "source": [ + "dist = {('Madison', 1): P_madison, ('Hamilton', 1): P_hamilton, ('Jay', 1): P_jay}\n", + "nBS = NaiveBayesLearner(dist)\n", + "nBSL = NaiveBayesLearnerLog(dist)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As usual, the `recognize` function will take as input a string and after removing capitalization and splitting it into words, will feed it into the Naive Bayes Classifier." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": {}, + "outputs": [], + "source": [ + "def recognize(sentence, nBS):\n", + " return nBS(words(sentence.lower()))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now we can start predicting the disputed papers:" + ] + }, + { + "cell_type": "code", + "execution_count": 23, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "Straightforward Naive Bayes Learner\n", + "\n", + "Paper No. 49: Hamilton: 0.0000 Madison: 1.0000 Jay: 0.0000\n", + "Paper No. 50: Hamilton: 0.0000 Madison: 0.0000 Jay: 1.0000\n", + "Paper No. 51: Hamilton: 0.0000 Madison: 1.0000 Jay: 0.0000\n", + "Paper No. 52: Hamilton: 0.0000 Madison: 1.0000 Jay: 0.0000\n", + "Paper No. 53: Hamilton: 0.0000 Madison: 1.0000 Jay: 0.0000\n", + "Paper No. 54: Hamilton: 0.0000 Madison: 1.0000 Jay: 0.0000\n", + "Paper No. 55: Hamilton: 0.0000 Madison: 1.0000 Jay: 0.0000\n", + "Paper No. 56: Hamilton: 0.0000 Madison: 1.0000 Jay: 0.0000\n", + "Paper No. 57: Hamilton: 0.0000 Madison: 1.0000 Jay: 0.0000\n", + "Paper No. 58: Hamilton: 0.0000 Madison: 1.0000 Jay: 0.0000\n", + "Paper No. 18: Hamilton: 0.0000 Madison: 0.0000 Jay: 1.0000\n", + "Paper No. 19: Hamilton: 0.0000 Madison: 0.0000 Jay: 1.0000\n", + "Paper No. 20: Hamilton: 0.0000 Madison: 1.0000 Jay: 0.0000\n", + "Paper No. 64: Hamilton: 1.0000 Madison: 0.0000 Jay: 0.0000\n", + "\n", + "Logarithmic Naive Bayes Learner\n", + "\n", + "Paper No. 49: Hamilton: -0.330591 Madison: -0.327717 Jay: -0.341692\n", + "Paper No. 50: Hamilton: -0.333119 Madison: -0.328454 Jay: -0.338427\n", + "Paper No. 51: Hamilton: -0.330246 Madison: -0.325758 Jay: -0.343996\n", + "Paper No. 52: Hamilton: -0.331094 Madison: -0.327491 Jay: -0.341415\n", + "Paper No. 53: Hamilton: -0.330942 Madison: -0.328364 Jay: -0.340693\n", + "Paper No. 54: Hamilton: -0.329566 Madison: -0.327157 Jay: -0.343277\n", + "Paper No. 
55: Hamilton: -0.330821 Madison: -0.328143 Jay: -0.341036\n", + "Paper No. 56: Hamilton: -0.330333 Madison: -0.327496 Jay: -0.342171\n", + "Paper No. 57: Hamilton: -0.330625 Madison: -0.328602 Jay: -0.340772\n", + "Paper No. 58: Hamilton: -0.330271 Madison: -0.327215 Jay: -0.342515\n", + "Paper No. 18: Hamilton: -0.337781 Madison: -0.330932 Jay: -0.331287\n", + "Paper No. 19: Hamilton: -0.335635 Madison: -0.331774 Jay: -0.332590\n", + "Paper No. 20: Hamilton: -0.334911 Madison: -0.331866 Jay: -0.333223\n", + "Paper No. 64: Hamilton: -0.331004 Madison: -0.332968 Jay: -0.336028\n" + ] + } + ], + "source": [ + "print('\\nStraightforward Naive Bayes Learner\\n')\n", + "for d in disputed:\n", + " probs = recognize(papers[d], nBS)\n", + " results = ['{}: {:.4f}'.format(name, probs[(name, 1)]) for name in 'Hamilton Madison Jay'.split()]\n", + " print('Paper No. {}: {}'.format(d, ' '.join(results)))\n", + "\n", + "print('\\nLogarithmic Naive Bayes Learner\\n')\n", + "for d in disputed:\n", + " probs = recognize(papers[d], nBSL)\n", + " results = ['{}: {:.6f}'.format(name, probs[(name, 1)]) for name in 'Hamilton Madison Jay'.split()]\n", + " print('Paper No. {}: {}'.format(d, ' '.join(results)))\n", + "\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can see that both learners classify the papers identically. Because of underflow in the straightforward learner, only one author remains with a positive value. The log learner is more accurate with marginal differences between all the authors. \n", + "\n", + "This is a simple approach to the problem and thankfully researchers are fairly certain that papers 49-58 were all written by Madison, while 18-20 were written in collaboration between Hamilton and Madison, with Madison being credited for most of the work. Our classifier is not that far off. It correctly identifies the papers written by Madison, even the ones in collaboration with Hamilton.\n", + "\n", + "Unfortunately, it misses paper 64. 
Consensus is that the paper was written by John Jay, while our classifier believes it was written by Hamilton. The classifier is wrong there because it does not have much information on Jay's writing; only 4 papers. This is one of the problems with using unbalanced datasets such as this one, where information on some classes is sparser than information on the rest. To avoid this, we can add more writings for Jay and Madison to end up with an equal amount of data for each author." + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "## Text Classification" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Text Classification** is the task of assigning a category to a document based on its content. It is one of the most popular and fundamental tasks of Natural Language Processing, and it can be applied to a variety of texts, from *short documents* (tweets, customer reviews, etc.) to *long documents* (emails, media articles, etc.).\n", + "\n", + "We have already seen examples of Text Classification in the tasks above: Language Identification, Author Recognition and Federalist Paper Identification.\n", + "\n", + "### Applications\n", + "Some of the broad applications of Text Classification are:\n", + "- Language Identification\n", + "- Author Recognition\n", + "- Sentiment Analysis\n", + "- Spam Mail Detection\n", + "- Topic Labelling\n", + "- Word Sense Disambiguation\n", + "\n", + "### Use Cases\n", + "Some of the use cases of Text Classification are:\n", + "- Social Media Monitoring\n", + "- Brand Monitoring\n", + "- Auto-tagging of user queries\n", + "\n", + "For Text Classification, we will use the Naive Bayes Classifier.
The reasons for using the Naive Bayes Classifier are:\n", + "- It is a probabilistic classifier, so it calculates the probability of each category\n", + "- It is fast, reliable and accurate\n", + "- Naive Bayes Classifiers have already been used to solve many Natural Language Processing (NLP) applications.\n", + "\n", + "Here we will cover an example of **Word Sense Disambiguation** as an application of Text Classification: resolving the ambiguity of a word that has two different meanings.\n", + "\n", + "Specifically, we will determine whether the word *apple* in a sentence refers to the `fruit` or to the `company`." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Step 1: Defining the dataset** \n", + "\n", + "The dataset is defined here so that everything is clear and it can easily be tested with other data as well." + ] + }, + { + "cell_type": "code", + "execution_count": 24, + "metadata": {}, + "outputs": [], + "source": [ + "train_data = [\n", + " \"Apple targets big business with new iOS 7 features. Finally... A corp iTunes account!\",\n", + " \"apple inc is searching for people to help and try out all their upcoming tablet within our own net page No.\",\n", + " \"Microsoft to bring Xbox and PC games to Apple, Android phones: Report: Microsoft Corp\",\n", + " \"When did green skittles change from lime to green apple?\",\n", + " \"Myra Oltman is the best. I told her I wanted to learn how to make apple pie, so she made me a kit!\",\n", + " \"Surreal Sat in a sewing room, surrounded by crap, listening to beautiful music eating apple pie.\"\n", + "]\n", + "\n", + "train_target = [\n", + " \"company\",\n", + " \"company\",\n", + " \"company\",\n", + " \"fruit\",\n", + " \"fruit\",\n", + " \"fruit\",\n", + "]\n", + "\n", + "class_0 = \"company\"\n", + "class_1 = \"fruit\"\n", + "\n", + "test_data = [\n", + " \"Apple Inc.
supplier Foxconn demos its own iPhone-compatible smartwatch\",\n", + " \"I now know how to make a delicious apple pie thanks to the best teachers ever\"\n", + "]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Step 2: Preprocessing the dataset**\n", + "\n", + "In this step, we do some preprocessing on the dataset, such as breaking the sentences into words and converting them to lower case.\n", + "\n", + "We already have a `words(sent)` function defined in `text.py` which does the task of splitting a sentence into words." + ] + }, + { + "cell_type": "code", + "execution_count": 25, + "metadata": {}, + "outputs": [], + "source": [ + "train_data_processed = [words(i) for i in train_data]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Step 3: Feature Extraction from the text**\n", + "\n", + "Now we extract features from the text, namely the set of words used in each of the two categories, i.e. `company` and `fruit`.\n", + "\n", + "The frequency of a word helps in calculating the probability of that word belonging to a particular class." + ] + }, + { + "cell_type": "code", + "execution_count": 26, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Number of words in `company` class: 49\n", + "Number of words in `fruit` class: 49\n" + ] + } + ], + "source": [ + "words_0 = []\n", + "words_1 = []\n", + "\n", + "for sent, tag in zip(train_data_processed, train_target):\n", + " if(tag == class_0):\n", + " words_0 += sent\n", + " elif(tag == class_1):\n", + " words_1 += sent\n", + " \n", + "print(\"Number of words in `{}` class: {}\".format(class_0, len(words_0)))\n", + "print(\"Number of words in `{}` class: {}\".format(class_1, len(words_1)))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As you might have observed, our dataset is balanced, i.e. we have an equal number of words in both classes."
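The per-class word tally above can also be done directly with the standard library; this sketch uses a hypothetical two-sentence dataset (not the notebook's) and `collections.Counter` to count word frequencies per class:

```python
from collections import Counter

# Hypothetical mini training set of (label, sentence) pairs.
train = [("company", "apple launches new tablet"),
         ("fruit", "apple pie with green apple")]

class_counts = {}
for label, sentence in train:
    class_counts.setdefault(label, Counter()).update(sentence.lower().split())

print(class_counts["fruit"]["apple"])    # 2
print(class_counts["company"]["apple"])  # 1
print(class_counts["company"]["pie"])    # 0 -- Counter returns 0 for unseen words
```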
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Step 4: Building the Naive Bayes Model**\n", + "\n", + "Using the Naive Bayes classifier, we can calculate the probability of each word in the `company` and `fruit` classes and then multiply the probabilities to get the probability of a sentence belonging to each of the given classes. But if a word is not in our dictionary, the probability of that word belonging to that class becomes zero. For example, the word *Foxconn* is not in the dictionary of either class. Because of this, the probability of the word *Foxconn* being in either class is zero, and since all the probabilities are multiplied, the probability of the whole sentence belonging to either class becomes zero as well.\n", + "\n", + "To solve this problem we use **smoothing**, i.e. giving a minimum non-zero probability to every word we come across.\n", + "\n", + "The `UnigramWordModel` class implements smoothing via an additional argument: the minimum frequency given to every word, even one that is new to the dictionary." + ] + }, + { + "cell_type": "code", + "execution_count": 27, + "metadata": {}, + "outputs": [], + "source": [ + "model_words_0 = UnigramWordModel(words_0, 1)\n", + "model_words_1 = UnigramWordModel(words_1, 1)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now we will build the Naive Bayes model. For that, we construct `dist` as we did earlier in the Author Recognition task."
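The smoothing idea can be made concrete with a short generic sketch of add-k (Laplace) smoothing; this illustrates the principle, not the exact internals of `UnigramWordModel`:

```python
from collections import Counter

def smoothed_prob(word, word_list, k=1):
    """P(word | class) with add-k smoothing; unseen words get a small non-zero mass."""
    counts = Counter(word_list)
    total = sum(counts.values())
    vocab = len(counts)
    # Reserve one extra vocabulary slot to stand in for all unseen words.
    return (counts[word] + k) / (total + k * (vocab + 1))

corpus = "apple pie and apple tart".split()
print(smoothed_prob("apple", corpus))    # 0.3  (seen twice)
print(smoothed_prob("foxconn", corpus))  # 0.1  (unseen, but non-zero)
```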
+ ] + }, + { + "cell_type": "code", + "execution_count": 28, + "metadata": {}, + "outputs": [], + "source": [ + "from learning import NaiveBayesLearner\n", + "\n", + "dist = {('company', 1): model_words_0, ('fruit', 1): model_words_1}\n", + "\n", + "nBS = NaiveBayesLearner(dist, simple=True)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Step 5: Predicting the class of a sentence**\n", + "\n", + "Now we will write a function that preprocesses the test sentences and then predicts the class of each one." + ] + }, + { + "cell_type": "code", + "execution_count": 29, + "metadata": {}, + "outputs": [], + "source": [ + "def recognize(sentence, nBS):\n", + " sentence_words = words(sentence)\n", + " return nBS(sentence_words)" + ] + }, + { + "cell_type": "code", + "execution_count": 30, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Apple Inc. supplier Foxconn demos its own iPhone-compatible smartwatch\t-company\n", + "I now know how to make a delicious apple pie thanks to the best teachers ever\t-fruit\n" + ] + } + ], + "source": [ + "# predicting the class of sentences in the test set\n", + "for i in test_data:\n", + " print(i + \"\\t-\" + recognize(i, nBS))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You might have observed that the predictions made by the model are correct and we are able to differentiate between sentences of different classes. You can try more sentences on your own. Unfortunately though, since the datasets are pretty small, chances are the guesses will not always be correct.\n", + "\n", + "As you might have observed, the above method is very similar to Author Recognition, which is also a type of Text Classification. Most Text Classification tasks share the same underlying structure and follow a similar procedure."
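To tie the five steps together, here is a self-contained sketch of the whole pipeline (independent of `learning.py` and `text.py`, with made-up mini-sentences, add-one smoothing and log probabilities):

```python
import math
from collections import Counter

def train_nb(data):
    """data: list of (label, text) pairs. Returns per-class word counts and vocabulary size."""
    counts = {}
    for label, text in data:
        counts.setdefault(label, Counter()).update(text.lower().split())
    vocab = len({w for c in counts.values() for w in c})
    return counts, vocab

def classify(text, counts, vocab):
    """Pick the class maximizing the sum of smoothed log word probabilities."""
    def score(label):
        c = counts[label]
        total = sum(c.values())
        return sum(math.log((c[w] + 1) / (total + vocab + 1))
                   for w in text.lower().split())
    return max(counts, key=score)

# Hypothetical training sentences, mirroring the company/fruit setup above.
data = [("company", "apple unveils new iphone and tablet"),
        ("company", "microsoft brings xbox games to apple phones"),
        ("fruit", "green apple skittles and apple pie"),
        ("fruit", "she baked a delicious apple pie")]
counts, vocab = train_nb(data)
print(classify("apple pie recipe", counts, vocab))     # fruit
print(classify("apple tablet launch", counts, vocab))  # company
```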
+ ] } ], "metadata": { @@ -220,7 +1030,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.5.3" + "version": "3.6.7" } }, "nbformat": 4, diff --git a/notebook.py b/notebook.py index 2894a8bfb..7f0306335 100644 --- a/notebook.py +++ b/notebook.py @@ -1,21 +1,23 @@ +import time +from collections import defaultdict from inspect import getsource -from utils import argmax, argmin -from games import TicTacToe, alphabeta_player, random_player, Fig52Extended, infinity -from logic import parse_definite_clause, standardize_variables, unify, subst -from learning import DataSet -from IPython.display import HTML, display -from collections import Counter, defaultdict - +import ipywidgets as widgets import matplotlib.pyplot as plt +import networkx as nx import numpy as np +from IPython.display import HTML +from IPython.display import display +from PIL import Image +from matplotlib import lines -import os, struct -import array -import time +from games import TicTacToe, alpha_beta_player, random_player, Fig52Extended +from learning import DataSet +from logic import parse_definite_clause, standardize_variables, unify_mm, subst +from search import GraphProblem, romania_map -#______________________________________________________________________________ +# ______________________________________________________________________________ # Magic Words @@ -46,6 +48,7 @@ def psource(*functions): except ImportError: print(source_code) + # ______________________________________________________________________________ # Iris Visualization @@ -54,7 +57,6 @@ def show_iris(i=0, j=1, k=2): """Plots the iris dataset in a 3D plot. 
The three axes are given by i, j and k, which correspond to three of the four iris features.""" - from mpl_toolkits.mplot3d import Axes3D plt.rcParams.update(plt.rcParamsDefault) @@ -79,7 +81,6 @@ def show_iris(i=0, j=1, k=2): b_versicolor = [v[j] for v in buckets["versicolor"]] c_versicolor = [v[k] for v in buckets["versicolor"]] - for c, m, sl, sw, pl in [('b', 's', a_setosa, b_setosa, c_setosa), ('g', '^', a_virginica, b_virginica, c_virginica), ('r', 'o', a_versicolor, b_versicolor, c_versicolor)]: @@ -91,15 +92,18 @@ def show_iris(i=0, j=1, k=2): plt.show() + # ______________________________________________________________________________ # MNIST -def load_MNIST(path="aima-data/MNIST"): +def load_MNIST(path="aima-data/MNIST/Digits", fashion=False): import os, struct import array import numpy as np - from collections import Counter + + if fashion: + path = "aima-data/MNIST/Fashion" plt.rcParams.update(plt.rcParamsDefault) plt.rcParams['figure.figsize'] = (10.0, 8.0) @@ -125,32 +129,41 @@ def load_MNIST(path="aima-data/MNIST"): te_lbl = array.array("b", test_lbl_file.read()) test_lbl_file.close() - #print(len(tr_img), len(tr_lbl), tr_size) - #print(len(te_img), len(te_lbl), te_size) + # print(len(tr_img), len(tr_lbl), tr_size) + # print(len(te_img), len(te_lbl), te_size) - train_img = np.zeros((tr_size, tr_rows*tr_cols), dtype=np.int16) + train_img = np.zeros((tr_size, tr_rows * tr_cols), dtype=np.int16) train_lbl = np.zeros((tr_size,), dtype=np.int8) for i in range(tr_size): - train_img[i] = np.array(tr_img[i*tr_rows*tr_cols : (i+1)*tr_rows*tr_cols]).reshape((tr_rows*te_cols)) + train_img[i] = np.array(tr_img[i * tr_rows * tr_cols: (i + 1) * tr_rows * tr_cols]).reshape((tr_rows * tr_cols)) train_lbl[i] = tr_lbl[i] - test_img = np.zeros((te_size, te_rows*te_cols), dtype=np.int16) + test_img = np.zeros((te_size, te_rows * te_cols), dtype=np.int16) test_lbl = np.zeros((te_size,), dtype=np.int8) for i in range(te_size): - test_img[i] = 
np.array(te_img[i*te_rows*te_cols : (i+1)*te_rows*te_cols]).reshape((te_rows*te_cols)) + test_img[i] = np.array(te_img[i * te_rows * te_cols: (i + 1) * te_rows * te_cols]).reshape((te_rows * te_cols)) test_lbl[i] = te_lbl[i] - return(train_img, train_lbl, test_img, test_lbl) + return (train_img, train_lbl, test_img, test_lbl) + +digit_classes = [str(i) for i in range(10)] +fashion_classes = ["T-shirt/top", "Trouser", "Pullover", "Dress", "Coat", + "Sandal", "Shirt", "Sneaker", "Bag", "Ankle boot"] + + +def show_MNIST(labels, images, samples=8, fashion=False): + if not fashion: + classes = digit_classes + else: + classes = fashion_classes -def show_MNIST(labels, images, samples=8): - classes = ["0", "1", "2", "3", "4", "5", "6", "7", "8", "9"] num_classes = len(classes) for y, cls in enumerate(classes): idxs = np.nonzero([i == y for i in labels]) idxs = np.random.choice(idxs[0], samples, replace=False) - for i , idx in enumerate(idxs): + for i, idx in enumerate(idxs): plt_idx = i * num_classes + y + 1 plt.subplot(samples, num_classes, plt_idx) plt.imshow(images[idx].reshape((28, 28))) @@ -161,24 +174,31 @@ def show_MNIST(labels, images, samples=8): plt.show() -def show_ave_MNIST(labels, images): - classes = ["0", "1", "2", "3", "4", "5", "6", "7", "8", "9"] +def show_ave_MNIST(labels, images, fashion=False): + if not fashion: + item_type = "Digit" + classes = digit_classes + else: + item_type = "Apparel" + classes = fashion_classes + num_classes = len(classes) for y, cls in enumerate(classes): idxs = np.nonzero([i == y for i in labels]) - print("Digit", y, ":", len(idxs[0]), "images.") + print(item_type, y, ":", len(idxs[0]), "images.") - ave_img = np.mean(np.vstack([images[i] for i in idxs[0]]), axis = 0) - #print(ave_img.shape) + ave_img = np.mean(np.vstack([images[i] for i in idxs[0]]), axis=0) + # print(ave_img.shape) - plt.subplot(1, num_classes, y+1) + plt.subplot(1, num_classes, y + 1) plt.imshow(ave_img.reshape((28, 28))) plt.axis("off") plt.title(cls) 
plt.show() + # ______________________________________________________________________________ # MDP @@ -197,7 +217,7 @@ def plot_grid_step(iteration): for column in range(columns): current_row.append(data[(column, row)]) grid.append(current_row) - grid.reverse() # output like book + grid.reverse() # output like book fig = plt.imshow(grid, cmap=plt.cm.bwr, interpolation='nearest') plt.axis('off') @@ -213,18 +233,20 @@ def plot_grid_step(iteration): return plot_grid_step + def make_visualize(slider): """Takes as input a slider and returns a callback function for timer and animation.""" - def visualize_callback(Visualize, time_step): - if Visualize is True: + def visualize_callback(visualize, time_step): + if visualize is True: for i in range(slider.min, slider.max + 1): slider.value = i time.sleep(float(time_step)) return visualize_callback + # ______________________________________________________________________________ @@ -242,7 +264,7 @@ class Canvas: """Inherit from this class to manage the HTML canvas element in jupyter notebooks. To create an object of this class any_name_xyz = Canvas("any_name_xyz") The first argument given must be the name of the object being created.
- IPython must be able to refernce the variable name that is being passed.""" + IPython must be able to reference the variable name that is being passed.""" def __init__(self, varname, width=800, height=600, cid=None): self.name = varname @@ -261,10 +283,10 @@ def mouse_move(self, x, y): raise NotImplementedError def execute(self, exec_str): - """Stores the command to be exectued to a list which is used later during update()""" + """Stores the command to be executed to a list which is used later during update()""" if not isinstance(exec_str, str): print("Invalid execution argument:", exec_str) - self.alert("Recieved invalid execution command format") + self.alert("Received invalid execution command format") prefix = "{0}_canvas_object.".format(self.cid) self.exec_list.append(prefix + exec_str + ';') @@ -358,12 +380,13 @@ def display_html(html_string): class Canvas_TicTacToe(Canvas): """Play a 3x3 TicTacToe game on HTML canvas""" + def __init__(self, varname, player_1='human', player_2='random', width=300, height=350, cid=None): - valid_players = ('human', 'random', 'alphabeta') + valid_players = ('human', 'random', 'alpha_beta') if player_1 not in valid_players or player_2 not in valid_players: raise TypeError("Players must be one of {}".format(valid_players)) - Canvas.__init__(self, varname, width, height, cid) + super().__init__(varname, width, height, cid) self.ttt = TicTacToe() self.state = self.ttt.initial self.turn = 0 @@ -375,20 +398,20 @@ def __init__(self, varname, player_1='human', player_2='random', def mouse_click(self, x, y): player = self.players[self.turn] if self.ttt.terminal_test(self.state): - if 0.55 <= x/self.width <= 0.95 and 6/7 <= y/self.height <= 6/7+1/8: + if 0.55 <= x / self.width <= 0.95 and 6 / 7 <= y / self.height <= 6 / 7 + 1 / 8: self.state = self.ttt.initial self.turn = 0 self.draw_board() return if player == 'human': - x, y = int(3*x/self.width) + 1, int(3*y/(self.height*6/7)) + 1 + x, y = int(3 * x / self.width) + 1, int(3 * y / 
(self.height * 6 / 7)) + 1 if (x, y) not in self.ttt.actions(self.state): # Invalid move return move = (x, y) - elif player == 'alphabeta': - move = alphabeta_player(self.ttt, self.state) + elif player == 'alpha_beta': + move = alpha_beta_player(self.ttt, self.state) else: move = random_player(self.ttt, self.state) self.state = self.ttt.result(self.state, move) @@ -398,11 +421,11 @@ def mouse_click(self, x, y): def draw_board(self): self.clear() self.stroke(0, 0, 0) - offset = 1/20 - self.line_n(0 + offset, (1/3)*6/7, 1 - offset, (1/3)*6/7) - self.line_n(0 + offset, (2/3)*6/7, 1 - offset, (2/3)*6/7) - self.line_n(1/3, (0 + offset)*6/7, 1/3, (1 - offset)*6/7) - self.line_n(2/3, (0 + offset)*6/7, 2/3, (1 - offset)*6/7) + offset = 1 / 20 + self.line_n(0 + offset, (1 / 3) * 6 / 7, 1 - offset, (1 / 3) * 6 / 7) + self.line_n(0 + offset, (2 / 3) * 6 / 7, 1 - offset, (2 / 3) * 6 / 7) + self.line_n(1 / 3, (0 + offset) * 6 / 7, 1 / 3, (1 - offset) * 6 / 7) + self.line_n(2 / 3, (0 + offset) * 6 / 7, 2 / 3, (1 - offset) * 6 / 7) board = self.state.board for mark in board: @@ -414,64 +437,65 @@ def draw_board(self): # End game message utility = self.ttt.utility(self.state, self.ttt.to_move(self.ttt.initial)) if utility == 0: - self.text_n('Game Draw!', offset, 6/7 + offset) + self.text_n('Game Draw!', offset, 6 / 7 + offset) else: - self.text_n('Player {} wins!'.format("XO"[utility < 0]), offset, 6/7 + offset) + self.text_n('Player {} wins!'.format("XO"[utility < 0]), offset, 6 / 7 + offset) # Find the 3 and draw a line self.stroke([255, 0][self.turn], [0, 255][self.turn], 0) for i in range(3): if all([(i + 1, j + 1) in self.state.board for j in range(3)]) and \ - len({self.state.board[(i + 1, j + 1)] for j in range(3)}) == 1: - self.line_n(i/3 + 1/6, offset*6/7, i/3 + 1/6, (1 - offset)*6/7) + len({self.state.board[(i + 1, j + 1)] for j in range(3)}) == 1: + self.line_n(i / 3 + 1 / 6, offset * 6 / 7, i / 3 + 1 / 6, (1 - offset) * 6 / 7) if all([(j + 1, i + 1) in 
self.state.board for j in range(3)]) and \ - len({self.state.board[(j + 1, i + 1)] for j in range(3)}) == 1: - self.line_n(offset, (i/3 + 1/6)*6/7, 1 - offset, (i/3 + 1/6)*6/7) + len({self.state.board[(j + 1, i + 1)] for j in range(3)}) == 1: + self.line_n(offset, (i / 3 + 1 / 6) * 6 / 7, 1 - offset, (i / 3 + 1 / 6) * 6 / 7) if all([(i + 1, i + 1) in self.state.board for i in range(3)]) and \ - len({self.state.board[(i + 1, i + 1)] for i in range(3)}) == 1: - self.line_n(offset, offset*6/7, 1 - offset, (1 - offset)*6/7) + len({self.state.board[(i + 1, i + 1)] for i in range(3)}) == 1: + self.line_n(offset, offset * 6 / 7, 1 - offset, (1 - offset) * 6 / 7) if all([(i + 1, 3 - i) in self.state.board for i in range(3)]) and \ - len({self.state.board[(i + 1, 3 - i)] for i in range(3)}) == 1: - self.line_n(offset, (1 - offset)*6/7, 1 - offset, offset*6/7) + len({self.state.board[(i + 1, 3 - i)] for i in range(3)}) == 1: + self.line_n(offset, (1 - offset) * 6 / 7, 1 - offset, offset * 6 / 7) # restart button self.fill(0, 0, 255) - self.rect_n(0.5 + offset, 6/7, 0.4, 1/8) + self.rect_n(0.5 + offset, 6 / 7, 0.4, 1 / 8) self.fill(0, 0, 0) - self.text_n('Restart', 0.5 + 2*offset, 13/14) + self.text_n('Restart', 0.5 + 2 * offset, 13 / 14) else: # Print which player's turn it is self.text_n("Player {}'s move({})".format("XO"[self.turn], self.players[self.turn]), - offset, 6/7 + offset) + offset, 6 / 7 + offset) self.update() def draw_x(self, position): self.stroke(0, 255, 0) - x, y = [i-1 for i in position] - offset = 1/15 - self.line_n(x/3 + offset, (y/3 + offset)*6/7, x/3 + 1/3 - offset, (y/3 + 1/3 - offset)*6/7) - self.line_n(x/3 + 1/3 - offset, (y/3 + offset)*6/7, x/3 + offset, (y/3 + 1/3 - offset)*6/7) + x, y = [i - 1 for i in position] + offset = 1 / 15 + self.line_n(x / 3 + offset, (y / 3 + offset) * 6 / 7, x / 3 + 1 / 3 - offset, (y / 3 + 1 / 3 - offset) * 6 / 7) + self.line_n(x / 3 + 1 / 3 - offset, (y / 3 + offset) * 6 / 7, x / 3 + offset, (y / 3 + 1 / 3 - offset) * 
6 / 7) def draw_o(self, position): self.stroke(255, 0, 0) - x, y = [i-1 for i in position] - self.arc_n(x/3 + 1/6, (y/3 + 1/6)*6/7, 1/9, 0, 360) + x, y = [i - 1 for i in position] + self.arc_n(x / 3 + 1 / 6, (y / 3 + 1 / 6) * 6 / 7, 1 / 9, 0, 360) -class Canvas_minimax(Canvas): - """Minimax for Fig52Extended on HTML canvas""" +class Canvas_min_max(Canvas): + """MinMax for Fig52Extended on HTML canvas""" + def __init__(self, varname, util_list, width=800, height=600, cid=None): - Canvas.__init__(self, varname, width, height, cid) - self.utils = {node:util for node, util in zip(range(13, 40), util_list)} + super().__init__(varname, width, height, cid) + self.utils = {node: util for node, util in zip(range(13, 40), util_list)} self.game = Fig52Extended() self.game.utils = self.utils self.nodes = list(range(40)) - self.l = 1/40 + self.l = 1 / 40 self.node_pos = {} for i in range(4): base = len(self.node_pos) - row_size = 3**i + row_size = 3 ** i for node in [base + j for j in range(row_size)]: - self.node_pos[node] = ((node - base)/row_size + 1/(2*row_size) - self.l/2, - self.l/2 + (self.l + (1 - 5*self.l)/3)*i) + self.node_pos[node] = ((node - base) / row_size + 1 / (2 * row_size) - self.l / 2, + self.l / 2 + (self.l + (1 - 5 * self.l) / 3) * i) self.font("12px Arial") self.node_stack = [] self.explored = {node for node in self.utils} @@ -480,20 +504,21 @@ def __init__(self, varname, util_list, width=800, height=600, cid=None): self.draw_graph() self.stack_manager = self.stack_manager_gen() - def minimax(self, node): + def min_max(self, node): game = self.game player = game.to_move(node) + def max_value(node): if game.terminal_test(node): return game.utility(node, player) self.change_list.append(('a', node)) self.change_list.append(('h',)) - max_a = argmax(game.actions(node), key=lambda x: min_value(game.result(node, x))) + max_a = max(game.actions(node), key=lambda x: min_value(game.result(node, x))) max_node = game.result(node, max_a) self.utils[node] = 
self.utils[max_node] x1, y1 = self.node_pos[node] x2, y2 = self.node_pos[max_node] - self.change_list.append(('l', (node, max_node - 3*node - 1))) + self.change_list.append(('l', (node, max_node - 3 * node - 1))) self.change_list.append(('e', node)) self.change_list.append(('p',)) self.change_list.append(('h',)) @@ -504,12 +529,12 @@ def min_value(node): return game.utility(node, player) self.change_list.append(('a', node)) self.change_list.append(('h',)) - min_a = argmin(game.actions(node), key=lambda x: max_value(game.result(node, x))) + min_a = min(game.actions(node), key=lambda x: max_value(game.result(node, x))) min_node = game.result(node, min_a) self.utils[node] = self.utils[min_node] x1, y1 = self.node_pos[node] x2, y2 = self.node_pos[min_node] - self.change_list.append(('l', (node, min_node - 3*node - 1))) + self.change_list.append(('l', (node, min_node - 3 * node - 1))) self.change_list.append(('e', node)) self.change_list.append(('p',)) self.change_list.append(('h',)) @@ -518,7 +543,7 @@ def min_value(node): return max_value(node) def stack_manager_gen(self): - self.minimax(0) + self.min_max(0) for change in self.change_list: if change[0] == 'a': self.node_stack.append(change[1]) @@ -547,7 +572,7 @@ def draw_graph(self): for node in self.node_stack: x, y = self.node_pos[node] self.fill(200, 200, 0) - self.rect_n(x - self.l/5, y - self.l/5, self.l*7/5, self.l*7/5) + self.rect_n(x - self.l / 5, y - self.l / 5, self.l * 7 / 5, self.l * 7 / 5) for node in self.nodes: x, y = self.node_pos[node] if node in self.explored: @@ -561,12 +586,12 @@ def draw_graph(self): self.line_n(x + self.l, y + self.l, x, y + self.l) self.fill(0, 0, 0) if node in self.explored: - self.text_n(self.utils[node], x + self.l/10, y + self.l*9/10) + self.text_n(self.utils[node], x + self.l / 10, y + self.l * 9 / 10) # draw edges for i in range(13): - x1, y1 = self.node_pos[i][0] + self.l/2, self.node_pos[i][1] + self.l + x1, y1 = self.node_pos[i][0] + self.l / 2, self.node_pos[i][1] + 
self.l for j in range(3): - x2, y2 = self.node_pos[i*3 + j + 1][0] + self.l/2, self.node_pos[i*3 + j + 1][1] + x2, y2 = self.node_pos[i * 3 + j + 1][0] + self.l / 2, self.node_pos[i * 3 + j + 1][1] if i in [1, 2, 3]: self.stroke(200, 0, 0) else: @@ -579,22 +604,23 @@ def draw_graph(self): self.update() -class Canvas_alphabeta(Canvas): +class Canvas_alpha_beta(Canvas): """Alpha-beta pruning for Fig52Extended on HTML canvas""" + def __init__(self, varname, util_list, width=800, height=600, cid=None): - Canvas.__init__(self, varname, width, height, cid) - self.utils = {node:util for node, util in zip(range(13, 40), util_list)} + super().__init__(varname, width, height, cid) + self.utils = {node: util for node, util in zip(range(13, 40), util_list)} self.game = Fig52Extended() self.game.utils = self.utils self.nodes = list(range(40)) - self.l = 1/40 + self.l = 1 / 40 self.node_pos = {} for i in range(4): base = len(self.node_pos) - row_size = 3**i + row_size = 3 ** i for node in [base + j for j in range(row_size)]: - self.node_pos[node] = ((node - base)/row_size + 1/(2*row_size) - self.l/2, - 3*self.l/2 + (self.l + (1 - 6*self.l)/3)*i) + self.node_pos[node] = ((node - base) / row_size + 1 / (2 * row_size) - self.l / 2, + 3 * self.l / 2 + (self.l + (1 - 6 * self.l) / 3) * i) self.font("12px Arial") self.node_stack = [] self.explored = {node for node in self.utils} @@ -605,27 +631,27 @@ def __init__(self, varname, util_list, width=800, height=600, cid=None): self.draw_graph() self.stack_manager = self.stack_manager_gen() - def alphabeta_search(self, node): + def alpha_beta_search(self, node): game = self.game player = game.to_move(node) - # Functions used by alphabeta + # Functions used by alpha_beta def max_value(node, alpha, beta): if game.terminal_test(node): self.change_list.append(('a', node)) self.change_list.append(('h',)) self.change_list.append(('p',)) return game.utility(node, player) - v = -infinity + v = -np.inf self.change_list.append(('a', node)) - 
self.change_list.append(('ab',node, v, beta)) + self.change_list.append(('ab', node, v, beta)) self.change_list.append(('h',)) for a in game.actions(node): min_val = min_value(game.result(node, a), alpha, beta) if v < min_val: v = min_val max_node = game.result(node, a) - self.change_list.append(('ab',node, v, beta)) + self.change_list.append(('ab', node, v, beta)) if v >= beta: self.change_list.append(('h',)) self.pruned.add(node) @@ -633,8 +659,8 @@ def max_value(node, alpha, beta): alpha = max(alpha, v) self.utils[node] = v if node not in self.pruned: - self.change_list.append(('l', (node, max_node - 3*node - 1))) - self.change_list.append(('e',node)) + self.change_list.append(('l', (node, max_node - 3 * node - 1))) + self.change_list.append(('e', node)) self.change_list.append(('p',)) self.change_list.append(('h',)) return v @@ -645,16 +671,16 @@ def min_value(node, alpha, beta): self.change_list.append(('h',)) self.change_list.append(('p',)) return game.utility(node, player) - v = infinity + v = np.inf self.change_list.append(('a', node)) - self.change_list.append(('ab',node, alpha, v)) + self.change_list.append(('ab', node, alpha, v)) self.change_list.append(('h',)) for a in game.actions(node): max_val = max_value(game.result(node, a), alpha, beta) if v > max_val: v = max_val min_node = game.result(node, a) - self.change_list.append(('ab',node, alpha, v)) + self.change_list.append(('ab', node, alpha, v)) if v <= alpha: self.change_list.append(('h',)) self.pruned.add(node) @@ -662,16 +688,16 @@ def min_value(node, alpha, beta): beta = min(beta, v) self.utils[node] = v if node not in self.pruned: - self.change_list.append(('l', (node, min_node - 3*node - 1))) - self.change_list.append(('e',node)) + self.change_list.append(('l', (node, min_node - 3 * node - 1))) + self.change_list.append(('e', node)) self.change_list.append(('p',)) self.change_list.append(('h',)) return v - return max_value(node, -infinity, infinity) + return max_value(node, -np.inf, np.inf) def 
stack_manager_gen(self): - self.alphabeta_search(0) + self.alpha_beta_search(0) for change in self.change_list: if change[0] == 'a': self.node_stack.append(change[1]) @@ -706,7 +732,7 @@ def draw_graph(self): self.fill(200, 100, 100) else: self.fill(200, 200, 0) - self.rect_n(x - self.l/5, y - self.l/5, self.l*7/5, self.l*7/5) + self.rect_n(x - self.l / 5, y - self.l / 5, self.l * 7 / 5, self.l * 7 / 5) for node in self.nodes: x, y = self.node_pos[node] if node in self.explored: @@ -723,12 +749,12 @@ def draw_graph(self): self.line_n(x + self.l, y + self.l, x, y + self.l) self.fill(0, 0, 0) if node in self.explored and node not in self.pruned: - self.text_n(self.utils[node], x + self.l/10, y + self.l*9/10) + self.text_n(self.utils[node], x + self.l / 10, y + self.l * 9 / 10) # draw edges for i in range(13): - x1, y1 = self.node_pos[i][0] + self.l/2, self.node_pos[i][1] + self.l + x1, y1 = self.node_pos[i][0] + self.l / 2, self.node_pos[i][1] + self.l for j in range(3): - x2, y2 = self.node_pos[i*3 + j + 1][0] + self.l/2, self.node_pos[i*3 + j + 1][1] + x2, y2 = self.node_pos[i * 3 + j + 1][0] + self.l / 2, self.node_pos[i * 3 + j + 1][1] if i in [1, 2, 3]: self.stroke(200, 0, 0) else: @@ -743,21 +769,22 @@ def draw_graph(self): if node not in self.explored: x, y = self.node_pos[node] alpha, beta = self.ab[node] - self.text_n(alpha, x - self.l/2, y - self.l/10) - self.text_n(beta, x + self.l, y - self.l/10) + self.text_n(alpha, x - self.l / 2, y - self.l / 10) + self.text_n(beta, x + self.l, y - self.l / 10) self.update() class Canvas_fol_bc_ask(Canvas): """fol_bc_ask() on HTML canvas""" + def __init__(self, varname, kb, query, width=800, height=600, cid=None): - Canvas.__init__(self, varname, width, height, cid) + super().__init__(varname, width, height, cid) self.kb = kb self.query = query - self.l = 1/20 - self.b = 3*self.l + self.l = 1 / 20 + self.b = 3 * self.l bc_out = list(self.fol_bc_ask()) - if len(bc_out) is 0: + if len(bc_out) == 0: self.valid = False 
else: self.valid = True @@ -775,10 +802,11 @@ def __init__(self, varname, kb, query, width=800, height=600, cid=None): def fol_bc_ask(self): KB = self.kb query = self.query + def fol_bc_or(KB, goal, theta): for rule in KB.fetch_rules_for_goal(goal): lhs, rhs = parse_definite_clause(standardize_variables(rule)) - for theta1 in fol_bc_and(KB, lhs, unify(rhs, goal, theta)): + for theta1 in fol_bc_and(KB, lhs, unify_mm(rhs, goal, theta)): yield ([(goal, theta1[0])], theta1[1]) def fol_bc_and(KB, goals, theta): @@ -811,22 +839,22 @@ def dfs(node, depth): return (depth, pos) dfs(graph, 0) - y_off = 0.85/len(table) + y_off = 0.85 / len(table) for i, row in enumerate(table): - x_off = 0.95/len(row) + x_off = 0.95 / len(row) for j, node in enumerate(row): - pos[(i, j)] = (0.025 + j*x_off + (x_off - self.b)/2, 0.025 + i*y_off + (y_off - self.l)/2) + pos[(i, j)] = (0.025 + j * x_off + (x_off - self.b) / 2, 0.025 + i * y_off + (y_off - self.l) / 2) for p, c in links: x1, y1 = pos[p] x2, y2 = pos[c] - edges.add((x1 + self.b/2, y1 + self.l, x2 + self.b/2, y2)) + edges.add((x1 + self.b / 2, y1 + self.l, x2 + self.b / 2, y2)) self.table = table self.pos = pos self.edges = edges def mouse_click(self, x, y): - x, y = x/self.width, y/self.height + x, y = x / self.width, y / self.height for node in self.pos: xs, ys = self.pos[node] xe, ye = xs + self.b, ys + self.l @@ -852,7 +880,7 @@ def draw_table(self): self.line_n(x, y + self.l, x + self.b, y + self.l) self.fill(0, 0, 0) self.text_n(self.table[i][j], x + 0.01, y + self.l - 0.01) - #draw edges + # draw edges for x1, y1, x2, y2 in self.edges: self.line_n(x1, y1, x2, y2) else: @@ -868,3 +896,227 @@ def draw_table(self): self.fill(0, 0, 0) self.text_n(self.table[self.context[0]][self.context[1]] if self.context else "Click for text", 0.025, 0.975) self.update() + + +############################################################################################################ + +##################### Functions to assist plotting in 
search.ipynb #################### + + ############################################################################################################ + + +def show_map(graph_data, node_colors=None): + G = nx.Graph(graph_data['graph_dict']) + node_colors = node_colors or graph_data['node_colors'] + node_positions = graph_data['node_positions'] + node_label_pos = graph_data['node_label_positions'] + edge_weights = graph_data['edge_weights'] + + # set the size of the plot + plt.figure(figsize=(18, 13)) + # draw the graph (both nodes and edges) with locations from romania_locations + nx.draw(G, pos={k: node_positions[k] for k in G.nodes()}, + node_color=[node_colors[node] for node in G.nodes()], linewidths=0.3, edgecolors='k') + + # draw labels for nodes + node_label_handles = nx.draw_networkx_labels(G, pos=node_label_pos, font_size=14) + + # add a white bounding box behind the node labels + [label.set_bbox(dict(facecolor='white', edgecolor='none')) for label in node_label_handles.values()] + + # add edge labels to the graph + nx.draw_networkx_edge_labels(G, pos=node_positions, edge_labels=edge_weights, font_size=14) + + # add a legend + white_circle = lines.Line2D([], [], color="white", marker='o', markersize=15, markerfacecolor="white") + orange_circle = lines.Line2D([], [], color="orange", marker='o', markersize=15, markerfacecolor="orange") + red_circle = lines.Line2D([], [], color="red", marker='o', markersize=15, markerfacecolor="red") + gray_circle = lines.Line2D([], [], color="gray", marker='o', markersize=15, markerfacecolor="gray") + green_circle = lines.Line2D([], [], color="green", marker='o', markersize=15, markerfacecolor="green") + plt.legend((white_circle, orange_circle, red_circle, gray_circle, green_circle), + ('Un-explored', 'Frontier', 'Currently Exploring', 'Explored', 'Final Solution'), + numpoints=1, prop={'size': 16}, loc=(.8, .75)) + + # show the plot; not strictly needed in notebooks, since nx.draw displays the graph itself. 
+ plt.show() + + +# helper functions for visualisations + +def final_path_colors(initial_node_colors, problem, solution): + "Return a node_colors dict of the final path provided the problem and solution." + + # get initial node colors + final_colors = dict(initial_node_colors) + # color all the nodes in solution and starting node to green + final_colors[problem.initial] = "green" + for node in solution: + final_colors[node] = "green" + return final_colors + + +def display_visual(graph_data, user_input, algorithm=None, problem=None): + initial_node_colors = graph_data['node_colors'] + if user_input is False: + def slider_callback(iteration): + # don't show graph for the first time running the cell calling this function + try: + show_map(graph_data, node_colors=all_node_colors[iteration]) + except: + pass + + def visualize_callback(visualize): + if visualize is True: + button.value = False + + global all_node_colors + + iterations, all_node_colors, node = algorithm(problem) + solution = node.solution() + all_node_colors.append(final_path_colors(all_node_colors[0], problem, solution)) + + slider.max = len(all_node_colors) - 1 + + for i in range(slider.max + 1): + slider.value = i + # time.sleep(.5) + + slider = widgets.IntSlider(min=0, max=1, step=1, value=0) + slider_visual = widgets.interactive(slider_callback, iteration=slider) + display(slider_visual) + + button = widgets.ToggleButton(value=False) + button_visual = widgets.interactive(visualize_callback, visualize=button) + display(button_visual) + + if user_input is True: + node_colors = dict(initial_node_colors) + if isinstance(algorithm, dict): + assert set(algorithm.keys()).issubset({"Breadth First Tree Search", + "Depth First Tree Search", + "Breadth First Search", + "Depth First Graph Search", + "Best First Graph Search", + "Uniform Cost Search", + "Depth Limited Search", + "Iterative Deepening Search", + "Greedy Best First Search", + "A-star Search", + "Recursive Best First Search"}) + + algo_dropdown = 
widgets.Dropdown(description="Search algorithm: ", + options=sorted(list(algorithm.keys())), + value="Breadth First Tree Search") + display(algo_dropdown) + elif algorithm is None: + print("No algorithm to run.") + return 0 + + def slider_callback(iteration): + # don't show graph for the first time running the cell calling this function + try: + show_map(graph_data, node_colors=all_node_colors[iteration]) + except: + pass + + def visualize_callback(visualize): + if visualize is True: + button.value = False + + problem = GraphProblem(start_dropdown.value, end_dropdown.value, romania_map) + global all_node_colors + + user_algorithm = algorithm[algo_dropdown.value] + + iterations, all_node_colors, node = user_algorithm(problem) + solution = node.solution() + all_node_colors.append(final_path_colors(all_node_colors[0], problem, solution)) + + slider.max = len(all_node_colors) - 1 + + for i in range(slider.max + 1): + slider.value = i + # time.sleep(.5) + + start_dropdown = widgets.Dropdown(description="Start city: ", + options=sorted(list(node_colors.keys())), value="Arad") + display(start_dropdown) + + end_dropdown = widgets.Dropdown(description="Goal city: ", + options=sorted(list(node_colors.keys())), value="Fagaras") + display(end_dropdown) + + button = widgets.ToggleButton(value=False) + button_visual = widgets.interactive(visualize_callback, visualize=button) + display(button_visual) + + slider = widgets.IntSlider(min=0, max=1, step=1, value=0) + slider_visual = widgets.interactive(slider_callback, iteration=slider) + display(slider_visual) + + +# Function to plot NQueensCSP in csp.py and NQueensProblem in search.py +def plot_NQueens(solution): + n = len(solution) + board = np.array([2 * int((i + j) % 2) for j in range(n) for i in range(n)]).reshape((n, n)) + im = Image.open('images/queen_s.png') + height = im.size[1] + im = np.array(im).astype(np.float) / 255 + fig = plt.figure(figsize=(7, 7)) + ax = fig.add_subplot(111) + ax.set_title('{} Queens'.format(n)) + 
plt.imshow(board, cmap='binary', interpolation='nearest') + # NQueensCSP gives a solution as a dictionary + if isinstance(solution, dict): + for (k, v) in solution.items(): + newax = fig.add_axes([0.064 + (k * 0.112), 0.062 + ((7 - v) * 0.112), 0.1, 0.1], zorder=1) + newax.imshow(im) + newax.axis('off') + # NQueensProblem gives a solution as a list + elif isinstance(solution, list): + for (k, v) in enumerate(solution): + newax = fig.add_axes([0.064 + (k * 0.112), 0.062 + ((7 - v) * 0.112), 0.1, 0.1], zorder=1) + newax.imshow(im) + newax.axis('off') + fig.tight_layout() + plt.show() + + +# Function to plot a heatmap, given a grid +def heatmap(grid, cmap='binary', interpolation='nearest'): + fig = plt.figure(figsize=(7, 7)) + ax = fig.add_subplot(111) + ax.set_title('Heatmap') + plt.imshow(grid, cmap=cmap, interpolation=interpolation) + fig.tight_layout() + plt.show() + + +# Generates a gaussian kernel +def gaussian_kernel(l=5, sig=1.0): + ax = np.arange(-l // 2 + 1., l // 2 + 1.) + xx, yy = np.meshgrid(ax, ax) + kernel = np.exp(-(xx ** 2 + yy ** 2) / (2. 
* sig ** 2)) + return kernel + + +# Plots utility function for a POMDP +def plot_pomdp_utility(utility): + save = utility['0'][0] + delete = utility['1'][0] + ask_save = utility['2'][0] + ask_delete = utility['2'][-1] + left = (save[0] - ask_save[0]) / (save[0] - ask_save[0] + ask_save[1] - save[1]) + right = (delete[0] - ask_delete[0]) / (delete[0] - ask_delete[0] + ask_delete[1] - delete[1]) + + colors = ['g', 'b', 'k'] + for action in utility: + for value in utility[action]: + plt.plot(value, color=colors[int(action)]) + plt.vlines([left, right], -20, 10, linestyles='dashed', colors='c') + plt.ylim(-20, 13) + plt.xlim(0, 1) + plt.text(left / 2 - 0.05, 10, 'Save') + plt.text((right + left) / 2 - 0.02, 10, 'Ask') + plt.text((right + 1) / 2 - 0.07, 10, 'Delete') + plt.show() diff --git a/notebook4e.py b/notebook4e.py new file mode 100644 index 000000000..5b03081c6 --- /dev/null +++ b/notebook4e.py @@ -0,0 +1,1158 @@ +import time +from collections import defaultdict +from inspect import getsource + +import ipywidgets as widgets +import matplotlib.pyplot as plt +import networkx as nx +import numpy as np +from IPython.display import HTML +from IPython.display import display +from PIL import Image +from matplotlib import lines +from matplotlib.colors import ListedColormap + +from games import TicTacToe, alpha_beta_player, random_player, Fig52Extended +from learning import DataSet +from logic import parse_definite_clause, standardize_variables, unify_mm, subst +from search import GraphProblem, romania_map + + +# ______________________________________________________________________________ +# Magic Words + + +def pseudocode(algorithm): + """Print the pseudocode for the given algorithm.""" + from urllib.request import urlopen + from IPython.display import Markdown + + algorithm = algorithm.replace(' ', '-') + url = "https://raw.githubusercontent.com/aimacode/aima-pseudocode/master/md/{}.md".format(algorithm) + f = urlopen(url) + md = f.read().decode('utf-8') + md = 
md.split('\n', 1)[-1].strip() + md = '#' + md + return Markdown(md) + + +def psource(*functions): + """Print the source code for the given function(s).""" + source_code = '\n\n'.join(getsource(fn) for fn in functions) + try: + from pygments.formatters import HtmlFormatter + from pygments.lexers import PythonLexer + from pygments import highlight + + display(HTML(highlight(source_code, PythonLexer(), HtmlFormatter(full=True)))) + + except ImportError: + print(source_code) + + +def plot_model_boundary(dataset, attr1, attr2, model=None): + # prepare data + examples = np.asarray(dataset.examples) + X = np.asarray([examples[:, attr1], examples[:, attr2]]) + y = examples[:, dataset.target] + h = 0.1 + + # create color maps + cmap_light = ListedColormap(['#FFAAAA', '#AAFFAA', '#00AAFF']) + cmap_bold = ListedColormap(['#FF0000', '#00FF00', '#00AAFF']) + + # calculate min, max and limits + x_min, x_max = X[0].min() - 1, X[0].max() + 1 + y_min, y_max = X[1].min() - 1, X[1].max() + 1 + # mesh the grid + xx, yy = np.meshgrid(np.arange(x_min, x_max, h), + np.arange(y_min, y_max, h)) + Z = [] + for grid in zip(xx.ravel(), yy.ravel()): + # put them back to the example + grid = np.round(grid, decimals=1).tolist() + Z.append(model(grid)) + # Put the result into a color plot + Z = np.asarray(Z) + Z = Z.reshape(xx.shape) + plt.figure() + plt.pcolormesh(xx, yy, Z, cmap=cmap_light) + + # Plot also the training points + plt.scatter(X[0], X[1], c=y, cmap=cmap_bold) + plt.xlim(xx.min(), xx.max()) + plt.ylim(yy.min(), yy.max()) + plt.show() + + +# ______________________________________________________________________________ +# Iris Visualization + + +def show_iris(i=0, j=1, k=2): + """Plots the iris dataset in a 3D plot. 
+ The three axes are given by i, j and k, + which correspond to three of the four iris features.""" + + plt.rcParams.update(plt.rcParamsDefault) + + fig = plt.figure() + ax = fig.add_subplot(111, projection='3d') + + iris = DataSet(name="iris") + buckets = iris.split_values_by_classes() + + features = ["Sepal Length", "Sepal Width", "Petal Length", "Petal Width"] + f1, f2, f3 = features[i], features[j], features[k] + + a_setosa = [v[i] for v in buckets["setosa"]] + b_setosa = [v[j] for v in buckets["setosa"]] + c_setosa = [v[k] for v in buckets["setosa"]] + + a_virginica = [v[i] for v in buckets["virginica"]] + b_virginica = [v[j] for v in buckets["virginica"]] + c_virginica = [v[k] for v in buckets["virginica"]] + + a_versicolor = [v[i] for v in buckets["versicolor"]] + b_versicolor = [v[j] for v in buckets["versicolor"]] + c_versicolor = [v[k] for v in buckets["versicolor"]] + + for c, m, sl, sw, pl in [('b', 's', a_setosa, b_setosa, c_setosa), + ('g', '^', a_virginica, b_virginica, c_virginica), + ('r', 'o', a_versicolor, b_versicolor, c_versicolor)]: + ax.scatter(sl, sw, pl, c=c, marker=m) + + ax.set_xlabel(f1) + ax.set_ylabel(f2) + ax.set_zlabel(f3) + + plt.show() + + +# ______________________________________________________________________________ +# MNIST + + +def load_MNIST(path="aima-data/MNIST/Digits", fashion=False): + import os, struct + import array + import numpy as np + + if fashion: + path = "aima-data/MNIST/Fashion" + + plt.rcParams.update(plt.rcParamsDefault) + plt.rcParams['figure.figsize'] = (10.0, 8.0) + plt.rcParams['image.interpolation'] = 'nearest' + plt.rcParams['image.cmap'] = 'gray' + + train_img_file = open(os.path.join(path, "train-images-idx3-ubyte"), "rb") + train_lbl_file = open(os.path.join(path, "train-labels-idx1-ubyte"), "rb") + test_img_file = open(os.path.join(path, "t10k-images-idx3-ubyte"), "rb") + test_lbl_file = open(os.path.join(path, 't10k-labels-idx1-ubyte'), "rb") + + magic_nr, tr_size, tr_rows, tr_cols = 
struct.unpack(">IIII", train_img_file.read(16)) + tr_img = array.array("B", train_img_file.read()) + train_img_file.close() + magic_nr, tr_size = struct.unpack(">II", train_lbl_file.read(8)) + tr_lbl = array.array("b", train_lbl_file.read()) + train_lbl_file.close() + + magic_nr, te_size, te_rows, te_cols = struct.unpack(">IIII", test_img_file.read(16)) + te_img = array.array("B", test_img_file.read()) + test_img_file.close() + magic_nr, te_size = struct.unpack(">II", test_lbl_file.read(8)) + te_lbl = array.array("b", test_lbl_file.read()) + test_lbl_file.close() + + # print(len(tr_img), len(tr_lbl), tr_size) + # print(len(te_img), len(te_lbl), te_size) + + train_img = np.zeros((tr_size, tr_rows * tr_cols), dtype=np.int16) + train_lbl = np.zeros((tr_size,), dtype=np.int8) + for i in range(tr_size): + train_img[i] = np.array(tr_img[i * tr_rows * tr_cols: (i + 1) * tr_rows * tr_cols]).reshape((tr_rows * tr_cols)) + train_lbl[i] = tr_lbl[i] + + test_img = np.zeros((te_size, te_rows * te_cols), dtype=np.int16) + test_lbl = np.zeros((te_size,), dtype=np.int8) + for i in range(te_size): + test_img[i] = np.array(te_img[i * te_rows * te_cols: (i + 1) * te_rows * te_cols]).reshape((te_rows * te_cols)) + test_lbl[i] = te_lbl[i] + + return (train_img, train_lbl, test_img, test_lbl) + + +digit_classes = [str(i) for i in range(10)] +fashion_classes = ["T-shirt/top", "Trouser", "Pullover", "Dress", "Coat", + "Sandal", "Shirt", "Sneaker", "Bag", "Ankle boot"] + + +def show_MNIST(labels, images, samples=8, fashion=False): + if not fashion: + classes = digit_classes + else: + classes = fashion_classes + + num_classes = len(classes) + + for y, cls in enumerate(classes): + idxs = np.nonzero([i == y for i in labels]) + idxs = np.random.choice(idxs[0], samples, replace=False) + for i, idx in enumerate(idxs): + plt_idx = i * num_classes + y + 1 + plt.subplot(samples, num_classes, plt_idx) + plt.imshow(images[idx].reshape((28, 28))) + plt.axis("off") + if i == 0: + plt.title(cls) + + 
plt.show() + + +def show_ave_MNIST(labels, images, fashion=False): + if not fashion: + item_type = "Digit" + classes = digit_classes + else: + item_type = "Apparel" + classes = fashion_classes + + num_classes = len(classes) + + for y, cls in enumerate(classes): + idxs = np.nonzero([i == y for i in labels]) + print(item_type, y, ":", len(idxs[0]), "images.") + + ave_img = np.mean(np.vstack([images[i] for i in idxs[0]]), axis=0) + # print(ave_img.shape) + + plt.subplot(1, num_classes, y + 1) + plt.imshow(ave_img.reshape((28, 28))) + plt.axis("off") + plt.title(cls) + + plt.show() + + +# ______________________________________________________________________________ +# MDP + + +def make_plot_grid_step_function(columns, rows, U_over_time): + """An ipywidgets interactive function supports a single parameter as input. + This function creates and returns such a function by taking as input + the other parameters.""" + + def plot_grid_step(iteration): + data = U_over_time[iteration] + data = defaultdict(lambda: 0, data) + grid = [] + for row in range(rows): + current_row = [] + for column in range(columns): + current_row.append(data[(column, row)]) + grid.append(current_row) + grid.reverse() # output like book + fig = plt.imshow(grid, cmap=plt.cm.bwr, interpolation='nearest') + + plt.axis('off') + fig.axes.get_xaxis().set_visible(False) + fig.axes.get_yaxis().set_visible(False) + + for col in range(len(grid)): + for row in range(len(grid[0])): + magic = grid[col][row] + fig.axes.text(row, col, "{0:.2f}".format(magic), va='center', ha='center') + + plt.show() + + return plot_grid_step + + +def make_visualize(slider): + """Takes as input a slider and returns a callback function + for timer and animation.""" + + def visualize_callback(visualize, time_step): + if visualize is True: + for i in range(slider.min, slider.max + 1): + slider.value = i + time.sleep(float(time_step)) + + return visualize_callback + + +# ______________________________________________________________________________ + 
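The grid assembly done inside `plot_grid_step` can be exercised without any plotting. Below is a minimal headless sketch; the helper name `utilities_to_grid` and the sample utilities are hypothetical, but the defaulting and row-reversal logic mirrors the code in the diff:

```python
from collections import defaultdict


def utilities_to_grid(columns, rows, U):
    # Missing states default to 0, and rows are reversed so that
    # state (0, 0) ends up at the bottom-left, as in the book's figures.
    data = defaultdict(lambda: 0, U)
    grid = [[data[(column, row)] for column in range(columns)]
            for row in range(rows)]
    grid.reverse()
    return grid


# Hypothetical 2x2 utilities keyed by (column, row); (1, 1) is unvisited.
U = {(0, 0): 0.1, (1, 0): 0.2, (0, 1): 0.3}
print(utilities_to_grid(2, 2, U))  # [[0.3, 0], [0.1, 0.2]]
```
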
+ +_canvas = """ + +
    + +
    + + +""" # noqa + + +class Canvas: + """Inherit from this class to manage the HTML canvas element in jupyter notebooks. + To create an object of this class any_name_xyz = Canvas("any_name_xyz") + The first argument given must be the name of the object being created. + IPython must be able to reference the variable name that is being passed.""" + + def __init__(self, varname, width=800, height=600, cid=None): + self.name = varname + self.cid = cid or varname + self.width = width + self.height = height + self.html = _canvas.format(self.cid, self.width, self.height, self.name) + self.exec_list = [] + display_html(self.html) + + def mouse_click(self, x, y): + """Override this method to handle mouse click at position (x, y)""" + raise NotImplementedError + + def mouse_move(self, x, y): + raise NotImplementedError + + def execute(self, exec_str): + """Stores the command to be executed to a list which is used later during update()""" + if not isinstance(exec_str, str): + print("Invalid execution argument:", exec_str) + self.alert("Received invalid execution command format") + prefix = "{0}_canvas_object.".format(self.cid) + self.exec_list.append(prefix + exec_str + ';') + + def fill(self, r, g, b): + """Changes the fill color to a color in rgb format""" + self.execute("fill({0}, {1}, {2})".format(r, g, b)) + + def stroke(self, r, g, b): + """Changes the colors of line/strokes to rgb""" + self.execute("stroke({0}, {1}, {2})".format(r, g, b)) + + def strokeWidth(self, w): + """Changes the width of lines/strokes to 'w' pixels""" + self.execute("strokeWidth({0})".format(w)) + + def rect(self, x, y, w, h): + """Draw a rectangle with 'w' width, 'h' height and (x, y) as the top-left corner""" + self.execute("rect({0}, {1}, {2}, {3})".format(x, y, w, h)) + + def rect_n(self, xn, yn, wn, hn): + """Similar to rect(), but the dimensions are normalized to fall between 0 and 1""" + x = round(xn * self.width) + y = round(yn * self.height) + w = round(wn * self.width) + h = round(hn 
* self.height) + self.rect(x, y, w, h) + + def line(self, x1, y1, x2, y2): + """Draw a line from (x1, y1) to (x2, y2)""" + self.execute("line({0}, {1}, {2}, {3})".format(x1, y1, x2, y2)) + + def line_n(self, x1n, y1n, x2n, y2n): + """Similar to line(), but the dimensions are normalized to fall between 0 and 1""" + x1 = round(x1n * self.width) + y1 = round(y1n * self.height) + x2 = round(x2n * self.width) + y2 = round(y2n * self.height) + self.line(x1, y1, x2, y2) + + def arc(self, x, y, r, start, stop): + """Draw an arc with (x, y) as centre, 'r' as radius from angles 'start' to 'stop'""" + self.execute("arc({0}, {1}, {2}, {3}, {4})".format(x, y, r, start, stop)) + + def arc_n(self, xn, yn, rn, start, stop): + """Similar to arc(), but the dimensions are normalized to fall between 0 and 1 + The normalizing factor for radius is selected between width and height by + seeing which is smaller.""" + x = round(xn * self.width) + y = round(yn * self.height) + r = round(rn * min(self.width, self.height)) + self.arc(x, y, r, start, stop) + + def clear(self): + """Clear the HTML canvas""" + self.execute("clear()") + + def font(self, font): + """Changes the font of text""" + self.execute('font("{0}")'.format(font)) + + def text(self, txt, x, y, fill=True): + """Display a text at (x, y)""" + if fill: + self.execute('fill_text("{0}", {1}, {2})'.format(txt, x, y)) + else: + self.execute('stroke_text("{0}", {1}, {2})'.format(txt, x, y)) + + def text_n(self, txt, xn, yn, fill=True): + """Similar to text(), but with normalized coordinates""" + x = round(xn * self.width) + y = round(yn * self.height) + self.text(txt, x, y, fill) + + def alert(self, message): + """Immediately display an alert""" + display_html(''.format(message)) + + def update(self): + """Execute the JS code to execute the commands queued by execute()""" + exec_code = "" + self.exec_list = [] + display_html(exec_code) + + +def display_html(html_string): + display(HTML(html_string)) + + 
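The `Canvas.execute` batching can also be checked headlessly. The sketch below is a hypothetical stand-in (the class name `CommandQueue` is invented) that reproduces only the command-queueing behaviour of `execute`, `fill` and `rect`, without the HTML/IPython rendering:

```python
class CommandQueue:
    """Minimal stand-in for Canvas's command batching: each drawing
    method queues a JS call string; update() would flush the queue."""

    def __init__(self, cid):
        self.cid = cid
        self.exec_list = []

    def execute(self, exec_str):
        # Mirrors Canvas.execute: prefix each command with the canvas object name.
        self.exec_list.append("{0}_canvas_object.{1};".format(self.cid, exec_str))

    def fill(self, r, g, b):
        self.execute("fill({0}, {1}, {2})".format(r, g, b))

    def rect(self, x, y, w, h):
        self.execute("rect({0}, {1}, {2}, {3})".format(x, y, w, h))


q = CommandQueue("demo")
q.fill(255, 0, 0)
q.rect(10, 10, 50, 30)
print(q.exec_list)
# ['demo_canvas_object.fill(255, 0, 0);', 'demo_canvas_object.rect(10, 10, 50, 30);']
```
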
+################################################################################ + + +class Canvas_TicTacToe(Canvas): + """Play a 3x3 TicTacToe game on HTML canvas""" + + def __init__(self, varname, player_1='human', player_2='random', + width=300, height=350, cid=None): + valid_players = ('human', 'random', 'alpha_beta') + if player_1 not in valid_players or player_2 not in valid_players: + raise TypeError("Players must be one of {}".format(valid_players)) + super().__init__(varname, width, height, cid) + self.ttt = TicTacToe() + self.state = self.ttt.initial + self.turn = 0 + self.strokeWidth(5) + self.players = (player_1, player_2) + self.font("20px Arial") + self.draw_board() + + def mouse_click(self, x, y): + player = self.players[self.turn] + if self.ttt.terminal_test(self.state): + if 0.55 <= x / self.width <= 0.95 and 6 / 7 <= y / self.height <= 6 / 7 + 1 / 8: + self.state = self.ttt.initial + self.turn = 0 + self.draw_board() + return + + if player == 'human': + x, y = int(3 * x / self.width) + 1, int(3 * y / (self.height * 6 / 7)) + 1 + if (x, y) not in self.ttt.actions(self.state): + # Invalid move + return + move = (x, y) + elif player == 'alpha_beta': + move = alpha_beta_player(self.ttt, self.state) + else: + move = random_player(self.ttt, self.state) + self.state = self.ttt.result(self.state, move) + self.turn ^= 1 + self.draw_board() + + def draw_board(self): + self.clear() + self.stroke(0, 0, 0) + offset = 1 / 20 + self.line_n(0 + offset, (1 / 3) * 6 / 7, 1 - offset, (1 / 3) * 6 / 7) + self.line_n(0 + offset, (2 / 3) * 6 / 7, 1 - offset, (2 / 3) * 6 / 7) + self.line_n(1 / 3, (0 + offset) * 6 / 7, 1 / 3, (1 - offset) * 6 / 7) + self.line_n(2 / 3, (0 + offset) * 6 / 7, 2 / 3, (1 - offset) * 6 / 7) + + board = self.state.board + for mark in board: + if board[mark] == 'X': + self.draw_x(mark) + elif board[mark] == 'O': + self.draw_o(mark) + if self.ttt.terminal_test(self.state): + # End game message + utility = self.ttt.utility(self.state, 
self.ttt.to_move(self.ttt.initial)) + if utility == 0: + self.text_n('Game Draw!', offset, 6 / 7 + offset) + else: + self.text_n('Player {} wins!'.format("XO"[utility < 0]), offset, 6 / 7 + offset) + # Find the 3 and draw a line + self.stroke([255, 0][self.turn], [0, 255][self.turn], 0) + for i in range(3): + if all([(i + 1, j + 1) in self.state.board for j in range(3)]) and \ + len({self.state.board[(i + 1, j + 1)] for j in range(3)}) == 1: + self.line_n(i / 3 + 1 / 6, offset * 6 / 7, i / 3 + 1 / 6, (1 - offset) * 6 / 7) + if all([(j + 1, i + 1) in self.state.board for j in range(3)]) and \ + len({self.state.board[(j + 1, i + 1)] for j in range(3)}) == 1: + self.line_n(offset, (i / 3 + 1 / 6) * 6 / 7, 1 - offset, (i / 3 + 1 / 6) * 6 / 7) + if all([(i + 1, i + 1) in self.state.board for i in range(3)]) and \ + len({self.state.board[(i + 1, i + 1)] for i in range(3)}) == 1: + self.line_n(offset, offset * 6 / 7, 1 - offset, (1 - offset) * 6 / 7) + if all([(i + 1, 3 - i) in self.state.board for i in range(3)]) and \ + len({self.state.board[(i + 1, 3 - i)] for i in range(3)}) == 1: + self.line_n(offset, (1 - offset) * 6 / 7, 1 - offset, offset * 6 / 7) + # restart button + self.fill(0, 0, 255) + self.rect_n(0.5 + offset, 6 / 7, 0.4, 1 / 8) + self.fill(0, 0, 0) + self.text_n('Restart', 0.5 + 2 * offset, 13 / 14) + else: # Print which player's turn it is + self.text_n("Player {}'s move({})".format("XO"[self.turn], self.players[self.turn]), + offset, 6 / 7 + offset) + + self.update() + + def draw_x(self, position): + self.stroke(0, 255, 0) + x, y = [i - 1 for i in position] + offset = 1 / 15 + self.line_n(x / 3 + offset, (y / 3 + offset) * 6 / 7, x / 3 + 1 / 3 - offset, (y / 3 + 1 / 3 - offset) * 6 / 7) + self.line_n(x / 3 + 1 / 3 - offset, (y / 3 + offset) * 6 / 7, x / 3 + offset, (y / 3 + 1 / 3 - offset) * 6 / 7) + + def draw_o(self, position): + self.stroke(255, 0, 0) + x, y = [i - 1 for i in position] + self.arc_n(x / 3 + 1 / 6, (y / 3 + 1 / 6) * 6 / 7, 1 / 9, 0, 360) 
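The click-to-cell arithmetic in `Canvas_TicTacToe.mouse_click` is easy to isolate. A minimal sketch (the helper name `click_to_cell` is hypothetical; the defaults mirror the 300x350 canvas, where the board fills the top 6/7 of the height and the bottom strip is the status bar):

```python
def click_to_cell(x, y, width=300, height=350):
    # Map a pixel click to a 1-indexed (column, row) board cell,
    # scaling y by 6/7 because only that fraction holds the board.
    return (int(3 * x / width) + 1, int(3 * y / (height * 6 / 7)) + 1)


print(click_to_cell(10, 10))    # (1, 1): top-left cell
print(click_to_cell(290, 290))  # (3, 3): bottom-right cell
```
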
+ + +class Canvas_min_max(Canvas): + """MinMax for Fig52Extended on HTML canvas""" + + def __init__(self, varname, util_list, width=800, height=600, cid=None): + super().__init__(varname, width, height, cid) + self.utils = {node: util for node, util in zip(range(13, 40), util_list)} + self.game = Fig52Extended() + self.game.utils = self.utils + self.nodes = list(range(40)) + self.l = 1 / 40 + self.node_pos = {} + for i in range(4): + base = len(self.node_pos) + row_size = 3 ** i + for node in [base + j for j in range(row_size)]: + self.node_pos[node] = ((node - base) / row_size + 1 / (2 * row_size) - self.l / 2, + self.l / 2 + (self.l + (1 - 5 * self.l) / 3) * i) + self.font("12px Arial") + self.node_stack = [] + self.explored = {node for node in self.utils} + self.thick_lines = set() + self.change_list = [] + self.draw_graph() + self.stack_manager = self.stack_manager_gen() + + def min_max(self, node): + game = self.game + player = game.to_move(node) + + def max_value(node): + if game.terminal_test(node): + return game.utility(node, player) + self.change_list.append(('a', node)) + self.change_list.append(('h',)) + max_a = max(game.actions(node), key=lambda x: min_value(game.result(node, x))) + max_node = game.result(node, max_a) + self.utils[node] = self.utils[max_node] + x1, y1 = self.node_pos[node] + x2, y2 = self.node_pos[max_node] + self.change_list.append(('l', (node, max_node - 3 * node - 1))) + self.change_list.append(('e', node)) + self.change_list.append(('p',)) + self.change_list.append(('h',)) + return self.utils[node] + + def min_value(node): + if game.terminal_test(node): + return game.utility(node, player) + self.change_list.append(('a', node)) + self.change_list.append(('h',)) + min_a = min(game.actions(node), key=lambda x: max_value(game.result(node, x))) + min_node = game.result(node, min_a) + self.utils[node] = self.utils[min_node] + x1, y1 = self.node_pos[node] + x2, y2 = self.node_pos[min_node] + self.change_list.append(('l', (node, min_node - 
3 * node - 1))) + self.change_list.append(('e', node)) + self.change_list.append(('p',)) + self.change_list.append(('h',)) + return self.utils[node] + + return max_value(node) + + def stack_manager_gen(self): + self.min_max(0) + for change in self.change_list: + if change[0] == 'a': + self.node_stack.append(change[1]) + elif change[0] == 'e': + self.explored.add(change[1]) + elif change[0] == 'h': + yield + elif change[0] == 'l': + self.thick_lines.add(change[1]) + elif change[0] == 'p': + self.node_stack.pop() + + def mouse_click(self, x, y): + try: + self.stack_manager.send(None) + except StopIteration: + pass + self.draw_graph() + + def draw_graph(self): + self.clear() + # draw nodes + self.stroke(0, 0, 0) + self.strokeWidth(1) + # highlight for nodes in stack + for node in self.node_stack: + x, y = self.node_pos[node] + self.fill(200, 200, 0) + self.rect_n(x - self.l / 5, y - self.l / 5, self.l * 7 / 5, self.l * 7 / 5) + for node in self.nodes: + x, y = self.node_pos[node] + if node in self.explored: + self.fill(255, 255, 255) + else: + self.fill(200, 200, 200) + self.rect_n(x, y, self.l, self.l) + self.line_n(x, y, x + self.l, y) + self.line_n(x, y, x, y + self.l) + self.line_n(x + self.l, y + self.l, x + self.l, y) + self.line_n(x + self.l, y + self.l, x, y + self.l) + self.fill(0, 0, 0) + if node in self.explored: + self.text_n(self.utils[node], x + self.l / 10, y + self.l * 9 / 10) + # draw edges + for i in range(13): + x1, y1 = self.node_pos[i][0] + self.l / 2, self.node_pos[i][1] + self.l + for j in range(3): + x2, y2 = self.node_pos[i * 3 + j + 1][0] + self.l / 2, self.node_pos[i * 3 + j + 1][1] + if i in [1, 2, 3]: + self.stroke(200, 0, 0) + else: + self.stroke(0, 200, 0) + if (i, j) in self.thick_lines: + self.strokeWidth(3) + else: + self.strokeWidth(1) + self.line_n(x1, y1, x2, y2) + self.update() + + +class Canvas_alpha_beta(Canvas): + """Alpha-beta pruning for Fig52Extended on HTML canvas""" + + def __init__(self, varname, util_list, width=800, 
height=600, cid=None): + super().__init__(varname, width, height, cid) + self.utils = {node: util for node, util in zip(range(13, 40), util_list)} + self.game = Fig52Extended() + self.game.utils = self.utils + self.nodes = list(range(40)) + self.l = 1 / 40 + self.node_pos = {} + for i in range(4): + base = len(self.node_pos) + row_size = 3 ** i + for node in [base + j for j in range(row_size)]: + self.node_pos[node] = ((node - base) / row_size + 1 / (2 * row_size) - self.l / 2, + 3 * self.l / 2 + (self.l + (1 - 6 * self.l) / 3) * i) + self.font("12px Arial") + self.node_stack = [] + self.explored = {node for node in self.utils} + self.pruned = set() + self.ab = {} + self.thick_lines = set() + self.change_list = [] + self.draw_graph() + self.stack_manager = self.stack_manager_gen() + + def alpha_beta_search(self, node): + game = self.game + player = game.to_move(node) + + # Functions used by alpha_beta + def max_value(node, alpha, beta): + if game.terminal_test(node): + self.change_list.append(('a', node)) + self.change_list.append(('h',)) + self.change_list.append(('p',)) + return game.utility(node, player) + v = -np.inf + self.change_list.append(('a', node)) + self.change_list.append(('ab', node, v, beta)) + self.change_list.append(('h',)) + for a in game.actions(node): + min_val = min_value(game.result(node, a), alpha, beta) + if v < min_val: + v = min_val + max_node = game.result(node, a) + self.change_list.append(('ab', node, v, beta)) + if v >= beta: + self.change_list.append(('h',)) + self.pruned.add(node) + break + alpha = max(alpha, v) + self.utils[node] = v + if node not in self.pruned: + self.change_list.append(('l', (node, max_node - 3 * node - 1))) + self.change_list.append(('e', node)) + self.change_list.append(('p',)) + self.change_list.append(('h',)) + return v + + def min_value(node, alpha, beta): + if game.terminal_test(node): + self.change_list.append(('a', node)) + self.change_list.append(('h',)) + self.change_list.append(('p',)) + return 
game.utility(node, player) + v = np.inf + self.change_list.append(('a', node)) + self.change_list.append(('ab', node, alpha, v)) + self.change_list.append(('h',)) + for a in game.actions(node): + max_val = max_value(game.result(node, a), alpha, beta) + if v > max_val: + v = max_val + min_node = game.result(node, a) + self.change_list.append(('ab', node, alpha, v)) + if v <= alpha: + self.change_list.append(('h',)) + self.pruned.add(node) + break + beta = min(beta, v) + self.utils[node] = v + if node not in self.pruned: + self.change_list.append(('l', (node, min_node - 3 * node - 1))) + self.change_list.append(('e', node)) + self.change_list.append(('p',)) + self.change_list.append(('h',)) + return v + + return max_value(node, -np.inf, np.inf) + + def stack_manager_gen(self): + self.alpha_beta_search(0) + for change in self.change_list: + if change[0] == 'a': + self.node_stack.append(change[1]) + elif change[0] == 'ab': + self.ab[change[1]] = change[2:] + elif change[0] == 'e': + self.explored.add(change[1]) + elif change[0] == 'h': + yield + elif change[0] == 'l': + self.thick_lines.add(change[1]) + elif change[0] == 'p': + self.node_stack.pop() + + def mouse_click(self, x, y): + try: + self.stack_manager.send(None) + except StopIteration: + pass + self.draw_graph() + + def draw_graph(self): + self.clear() + # draw nodes + self.stroke(0, 0, 0) + self.strokeWidth(1) + # highlight for nodes in stack + for node in self.node_stack: + x, y = self.node_pos[node] + # alpha > beta + if node not in self.explored and self.ab[node][0] > self.ab[node][1]: + self.fill(200, 100, 100) + else: + self.fill(200, 200, 0) + self.rect_n(x - self.l / 5, y - self.l / 5, self.l * 7 / 5, self.l * 7 / 5) + for node in self.nodes: + x, y = self.node_pos[node] + if node in self.explored: + if node in self.pruned: + self.fill(50, 50, 50) + else: + self.fill(255, 255, 255) + else: + self.fill(200, 200, 200) + self.rect_n(x, y, self.l, self.l) + self.line_n(x, y, x + self.l, y) + self.line_n(x, 
y, x, y + self.l) + self.line_n(x + self.l, y + self.l, x + self.l, y) + self.line_n(x + self.l, y + self.l, x, y + self.l) + self.fill(0, 0, 0) + if node in self.explored and node not in self.pruned: + self.text_n(self.utils[node], x + self.l / 10, y + self.l * 9 / 10) + # draw edges + for i in range(13): + x1, y1 = self.node_pos[i][0] + self.l / 2, self.node_pos[i][1] + self.l + for j in range(3): + x2, y2 = self.node_pos[i * 3 + j + 1][0] + self.l / 2, self.node_pos[i * 3 + j + 1][1] + if i in [1, 2, 3]: + self.stroke(200, 0, 0) + else: + self.stroke(0, 200, 0) + if (i, j) in self.thick_lines: + self.strokeWidth(3) + else: + self.strokeWidth(1) + self.line_n(x1, y1, x2, y2) + # display alpha and beta + for node in self.node_stack: + if node not in self.explored: + x, y = self.node_pos[node] + alpha, beta = self.ab[node] + self.text_n(alpha, x - self.l / 2, y - self.l / 10) + self.text_n(beta, x + self.l, y - self.l / 10) + self.update() + + +class Canvas_fol_bc_ask(Canvas): + """fol_bc_ask() on HTML canvas""" + + def __init__(self, varname, kb, query, width=800, height=600, cid=None): + super().__init__(varname, width, height, cid) + self.kb = kb + self.query = query + self.l = 1 / 20 + self.b = 3 * self.l + bc_out = list(self.fol_bc_ask()) + if len(bc_out) == 0: + self.valid = False + else: + self.valid = True + graph = bc_out[0][0][0] + s = bc_out[0][1] + while True: + new_graph = subst(s, graph) + if graph == new_graph: + break + graph = new_graph + self.make_table(graph) + self.context = None + self.draw_table() + + def fol_bc_ask(self): + KB = self.kb + query = self.query + + def fol_bc_or(KB, goal, theta): + for rule in KB.fetch_rules_for_goal(goal): + lhs, rhs = parse_definite_clause(standardize_variables(rule)) + for theta1 in fol_bc_and(KB, lhs, unify_mm(rhs, goal, theta)): + yield ([(goal, theta1[0])], theta1[1]) + + def fol_bc_and(KB, goals, theta): + if theta is None: + pass + elif not goals: + yield ([], theta) + else: + first, rest = goals[0], 
goals[1:] + for theta1 in fol_bc_or(KB, subst(theta, first), theta): + for theta2 in fol_bc_and(KB, rest, theta1[1]): + yield (theta1[0] + theta2[0], theta2[1]) + + return fol_bc_or(KB, query, {}) + + def make_table(self, graph): + table = [] + pos = {} + links = set() + edges = set() + + def dfs(node, depth): + if len(table) <= depth: + table.append([]) + pos = len(table[depth]) + table[depth].append(node[0]) + for child in node[1]: + child_id = dfs(child, depth + 1) + links.add(((depth, pos), child_id)) + return (depth, pos) + + dfs(graph, 0) + y_off = 0.85 / len(table) + for i, row in enumerate(table): + x_off = 0.95 / len(row) + for j, node in enumerate(row): + pos[(i, j)] = (0.025 + j * x_off + (x_off - self.b) / 2, 0.025 + i * y_off + (y_off - self.l) / 2) + for p, c in links: + x1, y1 = pos[p] + x2, y2 = pos[c] + edges.add((x1 + self.b / 2, y1 + self.l, x2 + self.b / 2, y2)) + + self.table = table + self.pos = pos + self.edges = edges + + def mouse_click(self, x, y): + x, y = x / self.width, y / self.height + for node in self.pos: + xs, ys = self.pos[node] + xe, ye = xs + self.b, ys + self.l + if xs <= x <= xe and ys <= y <= ye: + self.context = node + break + self.draw_table() + + def draw_table(self): + self.clear() + self.strokeWidth(3) + self.stroke(0, 0, 0) + self.font("12px Arial") + if self.valid: + # draw nodes + for i, j in self.pos: + x, y = self.pos[(i, j)] + self.fill(200, 200, 200) + self.rect_n(x, y, self.b, self.l) + self.line_n(x, y, x + self.b, y) + self.line_n(x, y, x, y + self.l) + self.line_n(x + self.b, y, x + self.b, y + self.l) + self.line_n(x, y + self.l, x + self.b, y + self.l) + self.fill(0, 0, 0) + self.text_n(self.table[i][j], x + 0.01, y + self.l - 0.01) + # draw edges + for x1, y1, x2, y2 in self.edges: + self.line_n(x1, y1, x2, y2) + else: + self.fill(255, 0, 0) + self.rect_n(0, 0, 1, 1) + # text area + self.fill(255, 255, 255) + self.rect_n(0, 0.9, 1, 0.1) + self.strokeWidth(5) + self.stroke(0, 0, 0) + self.line_n(0, 0.9, 1, 
0.9) + self.font("22px Arial") + self.fill(0, 0, 0) + self.text_n(self.table[self.context[0]][self.context[1]] if self.context else "Click for text", 0.025, 0.975) + self.update() + + +############################################################################################################ + +##################### Functions to assist plotting in search.ipynb #################### + +############################################################################################################ + + +def show_map(graph_data, node_colors=None): + G = nx.Graph(graph_data['graph_dict']) + node_colors = node_colors or graph_data['node_colors'] + node_positions = graph_data['node_positions'] + node_label_pos = graph_data['node_label_positions'] + edge_weights = graph_data['edge_weights'] + + # set the size of the plot + plt.figure(figsize=(18, 13)) + # draw the graph (both nodes and edges) with locations from romania_locations + nx.draw(G, pos={k: node_positions[k] for k in G.nodes()}, + node_color=[node_colors[node] for node in G.nodes()], linewidths=0.3, edgecolors='k') + + # draw labels for nodes + node_label_handles = nx.draw_networkx_labels(G, pos=node_label_pos, font_size=14) + + # add a white bounding box behind the node labels + [label.set_bbox(dict(facecolor='white', edgecolor='none')) for label in node_label_handles.values()] + + # add edge labels to the graph + nx.draw_networkx_edge_labels(G, pos=node_positions, edge_labels=edge_weights, font_size=14) + + # add a legend + white_circle = lines.Line2D([], [], color="white", marker='o', markersize=15, markerfacecolor="white") + orange_circle = lines.Line2D([], [], color="orange", marker='o', markersize=15, markerfacecolor="orange") + red_circle = lines.Line2D([], [], color="red", marker='o', markersize=15, markerfacecolor="red") + gray_circle = lines.Line2D([], [], color="gray", marker='o', markersize=15, markerfacecolor="gray") + green_circle = lines.Line2D([], [], color="green", marker='o', markersize=15, 
markerfacecolor="green") + plt.legend((white_circle, orange_circle, red_circle, gray_circle, green_circle), + ('Un-explored', 'Frontier', 'Currently Exploring', 'Explored', 'Final Solution'), + numpoints=1, prop={'size': 16}, loc=(.8, .75)) + + # show the plot. No need to use in notebooks. nx.draw will show the graph itself. + plt.show() + + +# helper functions for visualisations + +def final_path_colors(initial_node_colors, problem, solution): + """Return a node_colors dict of the final path provided the problem and solution.""" + + # get initial node colors + final_colors = dict(initial_node_colors) + # color all the nodes in solution and starting node to green + final_colors[problem.initial] = "green" + for node in solution: + final_colors[node] = "green" + return final_colors + + +def display_visual(graph_data, user_input, algorithm=None, problem=None): + initial_node_colors = graph_data['node_colors'] + if user_input is False: + def slider_callback(iteration): + # don't show graph for the first time running the cell calling this function + try: + show_map(graph_data, node_colors=all_node_colors[iteration]) + except: + pass + + def visualize_callback(visualize): + if visualize is True: + button.value = False + + global all_node_colors + + iterations, all_node_colors, node = algorithm(problem) + solution = node.solution() + all_node_colors.append(final_path_colors(all_node_colors[0], problem, solution)) + + slider.max = len(all_node_colors) - 1 + + for i in range(slider.max + 1): + slider.value = i + # time.sleep(.5) + + slider = widgets.IntSlider(min=0, max=1, step=1, value=0) + slider_visual = widgets.interactive(slider_callback, iteration=slider) + display(slider_visual) + + button = widgets.ToggleButton(value=False) + button_visual = widgets.interactive(visualize_callback, visualize=button) + display(button_visual) + + if user_input is True: + node_colors = dict(initial_node_colors) + if isinstance(algorithm, dict): + assert 
set(algorithm.keys()).issubset({"Breadth First Tree Search", + "Depth First Tree Search", + "Breadth First Search", + "Depth First Graph Search", + "Best First Graph Search", + "Uniform Cost Search", + "Depth Limited Search", + "Iterative Deepening Search", + "Greedy Best First Search", + "A-star Search", + "Recursive Best First Search"}) + + algo_dropdown = widgets.Dropdown(description="Search algorithm: ", + options=sorted(list(algorithm.keys())), + value="Breadth First Tree Search") + display(algo_dropdown) + elif algorithm is None: + print("No algorithm to run.") + return 0 + + def slider_callback(iteration): + # don't show graph for the first time running the cell calling this function + try: + show_map(graph_data, node_colors=all_node_colors[iteration]) + except: + pass + + def visualize_callback(visualize): + if visualize is True: + button.value = False + + problem = GraphProblem(start_dropdown.value, end_dropdown.value, romania_map) + global all_node_colors + + user_algorithm = algorithm[algo_dropdown.value] + + iterations, all_node_colors, node = user_algorithm(problem) + solution = node.solution() + all_node_colors.append(final_path_colors(all_node_colors[0], problem, solution)) + + slider.max = len(all_node_colors) - 1 + + for i in range(slider.max + 1): + slider.value = i + # time.sleep(.5) + + start_dropdown = widgets.Dropdown(description="Start city: ", + options=sorted(list(node_colors.keys())), value="Arad") + display(start_dropdown) + + end_dropdown = widgets.Dropdown(description="Goal city: ", + options=sorted(list(node_colors.keys())), value="Fagaras") + display(end_dropdown) + + button = widgets.ToggleButton(value=False) + button_visual = widgets.interactive(visualize_callback, visualize=button) + display(button_visual) + + slider = widgets.IntSlider(min=0, max=1, step=1, value=0) + slider_visual = widgets.interactive(slider_callback, iteration=slider) + display(slider_visual) + + +# Function to plot NQueensCSP in csp.py and NQueensProblem in 
search.py +def plot_NQueens(solution): + n = len(solution) + board = np.array([2 * int((i + j) % 2) for j in range(n) for i in range(n)]).reshape((n, n)) + im = Image.open('images/queen_s.png') + height = im.size[1] + im = np.array(im).astype(np.float) / 255 + fig = plt.figure(figsize=(7, 7)) + ax = fig.add_subplot(111) + ax.set_title('{} Queens'.format(n)) + plt.imshow(board, cmap='binary', interpolation='nearest') + # NQueensCSP gives a solution as a dictionary + if isinstance(solution, dict): + for (k, v) in solution.items(): + newax = fig.add_axes([0.064 + (k * 0.112), 0.062 + ((7 - v) * 0.112), 0.1, 0.1], zorder=1) + newax.imshow(im) + newax.axis('off') + # NQueensProblem gives a solution as a list + elif isinstance(solution, list): + for (k, v) in enumerate(solution): + newax = fig.add_axes([0.064 + (k * 0.112), 0.062 + ((7 - v) * 0.112), 0.1, 0.1], zorder=1) + newax.imshow(im) + newax.axis('off') + fig.tight_layout() + plt.show() + + +# Function to plot a heatmap, given a grid +def heatmap(grid, cmap='binary', interpolation='nearest'): + fig = plt.figure(figsize=(7, 7)) + ax = fig.add_subplot(111) + ax.set_title('Heatmap') + plt.imshow(grid, cmap=cmap, interpolation=interpolation) + fig.tight_layout() + plt.show() + + +# Generates a gaussian kernel +def gaussian_kernel(l=5, sig=1.0): + ax = np.arange(-l // 2 + 1., l // 2 + 1.) + xx, yy = np.meshgrid(ax, ax) + kernel = np.exp(-(xx ** 2 + yy ** 2) / (2. 
* sig ** 2)) + return kernel + + +# Plots utility function for a POMDP +def plot_pomdp_utility(utility): + save = utility['0'][0] + delete = utility['1'][0] + ask_save = utility['2'][0] + ask_delete = utility['2'][-1] + left = (save[0] - ask_save[0]) / (save[0] - ask_save[0] + ask_save[1] - save[1]) + right = (delete[0] - ask_delete[0]) / (delete[0] - ask_delete[0] + ask_delete[1] - delete[1]) + + colors = ['g', 'b', 'k'] + for action in utility: + for value in utility[action]: + plt.plot(value, color=colors[int(action)]) + plt.vlines([left, right], -20, 10, linestyles='dashed', colors='c') + plt.ylim(-20, 13) + plt.xlim(0, 1) + plt.text(left / 2 - 0.05, 10, 'Save') + plt.text((right + left) / 2 - 0.02, 10, 'Ask') + plt.text((right + 1) / 2 - 0.07, 10, 'Delete') + plt.show() diff --git a/notebooks/chapter19/Learners.ipynb b/notebooks/chapter19/Learners.ipynb new file mode 100644 index 000000000..c6f3d1e4f --- /dev/null +++ b/notebooks/chapter19/Learners.ipynb @@ -0,0 +1,508 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Learners\n", + "\n", + "In this section, we will introduce several pre-defined learners to learning the datasets by updating their weights to minimize the loss function. when using a learner to deal with machine learning problems, there are several standard steps:\n", + "\n", + "- **Learner initialization**: Before training the network, it usually should be initialized first. There are several choices when initializing the weights: random initialization, initializing weights are zeros or use Gaussian distribution to init the weights.\n", + "\n", + "- **Optimizer specification**: Which means specifying the updating rules of learnable parameters of the network. Usually, we can choose Adam optimizer as default.\n", + "\n", + "- **Applying back-propagation**: In neural networks, we commonly use back-propagation to pass and calculate gradient information of each layer. 
Back-propagation needs to be integrated with the chosen optimizer in order to update the weights of the network properly in each epoch.\n", + "\n", + "- **Iterations**: Iterating over the forward and back-propagation process for a given number of epochs. Sometimes the iteration has to be stopped early (early stopping) to prevent overfitting.\n", + "\n", + "We will introduce several learners with different structures. We will import all necessary packages before that:" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": {}, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + "Using TensorFlow backend.\n" + ] + } + ], + "source": [ + "import os, sys\n", + "sys.path = [os.path.abspath(\"../../\")] + sys.path\n", + "from deep_learning4e import *\n", + "from notebook4e import *" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Perceptron Learner\n", + "\n", + "### Overview\n", + "\n", + "The Perceptron is a linear classifier. It works the same way as a neural network with no hidden layers (just input and output). First, it trains its weights given a dataset and then it can classify a new item by running it through the network.\n", + "\n", + "Its input layer consists of the item features, while the output layer consists of nodes (also called neurons). Each node in the output layer has *n* synapses (one for every item feature), each with its own weight. Then, each node computes the dot product of the item features and the synapse weights. These values then pass through an activation function (usually a sigmoid). Finally, we pick the largest of the values and return its index.\n", + "\n", + "Note that in classification problems each node represents a class. The final classification is the class/node with the max output value.\n", + "\n", + "Below you can see a single node/neuron in the outer layer.
With *f* we denote the item features, with *w* the synapse weights, then inside the node we have the dot product and the activation function, *g*." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "![perceptron](images/perceptron.png)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Implementation\n", + "\n", + "The perceptron learner is actually a neural network learner with no hidden layers, whose structure is pre-defined in `perceptron_learner`:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "raw_net = [InputLayer(input_size), DenseLayer(input_size, output_size)]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here `input_size` and `output_size` are calculated from the dataset examples. In the perceptron learner, the gradient descent optimizer is used to update the weights of the network. We return a function `predict` which we can later use to classify a new item. The function computes the (algebraic) dot product of the item with the calculated weights for each node in the outer layer. Then it picks the greatest value and classifies the item in the corresponding class."
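The forward pass described above (dot products, activation, argmax) can be sketched in plain NumPy. This is an illustrative stand-alone snippet with made-up weights, not the repository's actual `perceptron_learner`:

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def perceptron_predict(weights, item):
    """weights has shape (n_classes, n_features); item has shape (n_features,).
    Each output node takes the dot product of the item with its weights,
    applies the activation, and the node with the largest activation wins."""
    activations = sigmoid(weights @ item)
    return int(np.argmax(activations))

# toy example with 3 classes and 4 features (hypothetical weights)
w = np.array([[0.9, 0.1, -0.5, 0.0],
              [0.1, 0.8, 0.3, 0.2],
              [-0.4, 0.2, 0.9, 0.7]])
print(perceptron_predict(w, np.array([5.0, 3.1, 0.9, 0.1])))
```

Since the sigmoid is monotonic, the argmax is decided by the dot products alone; the activation matters during training, not for this final pick.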
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Example\n", + "\n", + "Let's try the perceptron learner on the `iris` dataset. First, let's convert the dataset classes to numbers:" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [], + "source": [ + "iris = DataSet(name=\"iris\")\n", + "classes = [\"setosa\", \"versicolor\", \"virginica\"]\n", + "iris.classes_to_numbers(classes)" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "epoch:50, total_loss:14.089098023560856\n", + "epoch:100, total_loss:12.439240091345326\n", + "epoch:150, total_loss:11.848151059704785\n", + "epoch:200, total_loss:11.283665595671044\n", + "epoch:250, total_loss:11.153290841913241\n", + "epoch:300, total_loss:11.00747536734494\n", + "epoch:350, total_loss:10.871093050365419\n", + "epoch:400, total_loss:10.838400319844233\n", + "epoch:450, total_loss:10.687417928867456\n", + "epoch:500, total_loss:10.650371951865573\n" + ] + } + ], + "source": [ + "pl = perceptron_learner(iris, epochs=500, learning_rate=0.01, verbose=50)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can see from the printed lines that the total loss converges to around 10.65.
If we check the error ratio of the perceptron learner on the dataset after training, we will see it is much lower than random guessing:" + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "0.046666666666666634\n" + ] + } + ], + "source": [ + "print(err_ratio(pl, iris))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's test the trained learner on some test cases:" + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "1\n" + ] + } + ], + "source": [ + "tests = [([5.0, 3.1, 0.9, 0.1], 0),\n", + " ([5.1, 3.5, 1.0, 0.0], 0),\n", + " ([4.9, 3.3, 1.1, 0.1], 0),\n", + " ([6.0, 3.0, 4.0, 1.1], 1),\n", + " ([6.1, 2.2, 3.5, 1.0], 1),\n", + " ([5.9, 2.5, 3.3, 1.1], 1),\n", + " ([7.5, 4.1, 6.2, 2.3], 2),\n", + " ([7.3, 4.0, 6.1, 2.4], 2),\n", + " ([7.0, 3.3, 6.1, 2.5], 2)]\n", + "print(grade_learner(pl, tests))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "It seems the learner is correct on all the test examples.\n", + "\n", + "Now let's try the perceptron learner on a more complicated dataset, the MNIST dataset, to see what the result will be.
First, we import the dataset to make the examples a `Dataset` object:" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "length of training dataset: 60000\n", + "length of test dataset: 10000\n" + ] + } + ], + "source": [ + "train_img, train_lbl, test_img, test_lbl = load_MNIST(path=\"../../aima-data/MNIST/Digits\")\n", + "import numpy as np\n", + "import matplotlib.pyplot as plt\n", + "train_examples = [np.append(train_img[i], train_lbl[i]) for i in range(len(train_img))]\n", + "test_examples = [np.append(test_img[i], test_lbl[i]) for i in range(len(test_img))]\n", + "print(\"length of training dataset:\", len(train_examples))\n", + "print(\"length of test dataset:\", len(test_examples))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now let's train the perceptron learner on the first 1000 examples of the dataset:" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "epoch:1, total_loss:423.8627535296463\n", + "epoch:2, total_loss:341.31697581698995\n", + "epoch:3, total_loss:328.98647291325443\n", + "epoch:4, total_loss:327.8999700915627\n", + "epoch:5, total_loss:310.081065570072\n", + "epoch:6, total_loss:268.5474616202945\n", + "epoch:7, total_loss:259.0999998773958\n", + "epoch:8, total_loss:259.09999987481393\n", + "epoch:9, total_loss:259.09999987211944\n", + "epoch:10, total_loss:259.0999998693056\n" + ] + } + ], + "source": [ + "mnist = DataSet(examples=train_examples[:1000])\n", + "pl = perceptron_learner(mnist, epochs=10, verbose=1)" + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "0.893\n" + ] + } + ], + "source": [ + "print(err_ratio(pl, mnist))" + ] + }, + { + "cell_type": "markdown", + "metadata": 
{}, + "source": [ + "It looks like we have a near 90% error ratio on the training data after the network is trained on it. Then we can investigate the model's performance on the test dataset, which it has never seen before:" + ] + }, + { + "cell_type": "code", + "execution_count": 33, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "0.92\n" + ] + } + ], + "source": [ + "test_mnist = DataSet(examples=test_examples[:100])\n", + "print(err_ratio(pl, test_mnist))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "It seems a single-layer perceptron learner cannot capture the structure of the MNIST dataset. To improve accuracy, we may not only increase the number of training epochs but also consider changing to a more complicated network structure." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Neural Network Learner\n", + "\n", + "Although there are many different types of neural networks, the dense neural network we implemented can be treated as a stacked perceptron learner. Adding more layers to the perceptron network adds non-linearity to the network, so the model is more flexible when fitting complex data-target relations.
However, it also adds to the risk of overfitting as a side effect of this flexibility.\n", + "\n", + "By default we use dense networks with two hidden layers, which have the following architecture:\n", + "\n", + "\n", + "\n", + "In our code, we implemented it as:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# initialize the network\n", + "raw_net = [InputLayer(input_size)]\n", + "# add hidden layers\n", + "hidden_input_size = input_size\n", + "for h_size in hidden_layer_sizes:\n", + " raw_net.append(DenseLayer(hidden_input_size, h_size))\n", + " hidden_input_size = h_size\n", + "raw_net.append(DenseLayer(hidden_input_size, output_size))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here `hidden_layer_sizes` is a list of the sizes of the hidden layers, which can be specified by the user. The neural network learner uses gradient descent as the default optimizer, but the user can specify any optimizer when calling `neural_net_learner`. Another attribute that can be changed in `neural_net_learner` is `batch_size`, which controls the number of examples used in each round of updates.
`neural_net_learner` also returns a `predict` function which calculates predictions by multiplying the weights with the inputs and applying the activation functions.\n", + "\n", + "### Example\n", + "\n", + "Let's also try `neural_net_learner` on the `iris` dataset:" + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "epoch:10, total_loss:15.931817841643683\n", + "epoch:20, total_loss:8.248422285412149\n", + "epoch:30, total_loss:6.102968668275\n", + "epoch:40, total_loss:5.463915043272969\n", + "epoch:50, total_loss:5.298986288420822\n", + "epoch:60, total_loss:4.032928400456889\n", + "epoch:70, total_loss:3.2628899927346855\n", + "epoch:80, total_loss:6.01336701367312\n", + "epoch:90, total_loss:5.412020420311795\n", + "epoch:100, total_loss:3.1044027319850773\n" + ] + } + ], + "source": [ + "nn = neural_net_learner(iris, epochs=100, learning_rate=0.15, optimizer=gradient_descent, verbose=10)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Similarly, we check the model's accuracy on both the training and test datasets:" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "error ratio on training set: 0.033333333333333326\n" + ] + } + ], + "source": [ + "print(\"error ratio on training set:\",err_ratio(nn, iris))" + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "accuracy on test set: 1\n" + ] + } + ], + "source": [ + "tests = [([5.0, 3.1, 0.9, 0.1], 0),\n", + " ([5.1, 3.5, 1.0, 0.0], 0),\n", + " ([4.9, 3.3, 1.1, 0.1], 0),\n", + " ([6.0, 3.0, 4.0, 1.1], 1),\n", + " ([6.1, 2.2, 3.5, 1.0], 1),\n", + " ([5.9, 2.5, 3.3, 1.1], 1),\n", + " ([7.5, 4.1, 6.2, 2.3], 2),\n", + " ([7.3, 4.0, 6.1, 2.4], 2),\n", + " ([7.0, 3.3, 6.1, 2.5], 2)]\n", +
"print(\"accuracy on test set:\",grade_learner(nn, tests))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can see that the error ratio on the training set is smaller than that of the perceptron learner. As the error ratio is relatively small, let's try the model on the MNIST dataset to see whether there will be a larger difference. " + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "epoch:10, total_loss:89.0002153455983\n", + "epoch:20, total_loss:87.29675663038348\n", + "epoch:30, total_loss:86.29591779319225\n", + "epoch:40, total_loss:83.78091780128402\n", + "epoch:50, total_loss:82.17091581738829\n", + "epoch:60, total_loss:83.8434277386084\n", + "epoch:70, total_loss:83.55209905561495\n", + "epoch:80, total_loss:83.106898191118\n", + "epoch:90, total_loss:83.37041170165992\n", + "epoch:100, total_loss:82.57013813500876\n" + ] + } + ], + "source": [ + "nn = neural_net_learner(mnist, epochs=100, verbose=10)" + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "0.784\n" + ] + } + ], + "source": [ + "print(err_ratio(nn, mnist))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Even after the model converges, its error ratio on the training set is still high. We will introduce the convolutional network in the following chapters to see how it helps improve accuracy on learning this dataset."
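The stacked dense architecture used in this section can be sketched as a plain-NumPy forward pass. This is an illustrative stand-alone snippet with made-up layer sizes and random weights, not the repository's `neural_net_learner`:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def init_net(input_size, hidden_layer_sizes, output_size):
    """Build one weight matrix per dense layer, mirroring the raw_net loop above."""
    sizes = [input_size] + list(hidden_layer_sizes) + [output_size]
    return [rng.normal(scale=0.1, size=(n_out, n_in))
            for n_in, n_out in zip(sizes[:-1], sizes[1:])]

def forward(net, x):
    """Pass x through every dense layer with a sigmoid activation."""
    for w in net:
        x = sigmoid(w @ x)
    return x

net = init_net(4, [4, 4], 3)   # two hidden layers, as in the default setup
out = forward(net, np.array([5.0, 3.1, 0.9, 0.1]))
print(out.shape)               # one activation per output class
```

Training would then adjust each weight matrix by back-propagating the loss gradient, which is what the optimizer section of this PR implements.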
+ ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.9" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/notebooks/chapter19/Loss Functions and Layers.ipynb b/notebooks/chapter19/Loss Functions and Layers.ipynb new file mode 100644 index 000000000..25676e899 --- /dev/null +++ b/notebooks/chapter19/Loss Functions and Layers.ipynb @@ -0,0 +1,398 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Loss Function\n", + "\n", + "Loss functions evaluate how well a specific algorithm models the given data. Commonly, loss functions are used to compare the targets with the model's predictions. If the predictions deviate too much from the actual targets, the loss function outputs a large value. Usually, loss functions guide optimization algorithms in improving the accuracy of the model.\n", + "\n", + "However, there’s no one-size-fits-all loss function for all algorithms in machine learning. For each algorithm and machine learning project, choosing an appropriate loss function can help the user get better model performance. Here we will demonstrate two loss functions: `mse_loss` and `cross_entropy_loss`." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Mean Squared Error\n", + "\n", + "Mean squared error (MSE) is the most commonly used loss function in machine learning. The intuition behind MSE is straightforward: the distance between two points represents the difference between them.
" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "$$MSE = \\frac{1}{n}\\sum_i{(y_i-t_i)^2}$$" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here $y_i$ is the prediction for the $i$-th example, $t_i$ is the target of the $i$-th example, and $n$ is the total number of examples.\n", + "\n", + "Below is a plot of an MSE function where the true target value is 100, and the predicted values range from -10,000 to 10,000. The MSE loss (Y-axis) reaches its minimum value at prediction (X-axis) = 100." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Cross-Entropy\n", + "\n", + "For most deep learning applications, we can get away with just one loss function: the cross-entropy loss function. We can think of most deep learning algorithms as learning probability distributions: what we are learning is a distribution of predictions $P(y|x)$ given a series of inputs. \n", + "\n", + "To associate input examples x with output examples y, the parameters that maximize the likelihood of the training set are:" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "$$\\theta^* = \\arg\\max_\\theta \\prod_{i=0}^n p(y^{(i)}|x^{(i)})$$" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Maximizing the above expression is equivalent to minimizing its negative log form:" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "$$\\theta^* = \\arg\\min_\\theta -\\sum_{i=0}^n \\log p(y^{(i)}|x^{(i)})$$" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "When the model's output distribution is Gaussian, minimizing this negative log-likelihood can be shown to be equivalent to minimizing the MSE loss.\n", + "\n", + "The majority of deep learning algorithms use cross-entropy in some way. Classifiers that use deep learning calculate the cross-entropy between categorical distributions over the output classes.
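As a quick concrete check, both loss functions above can be sketched in a few lines of NumPy. These are illustrative stand-alone helpers, not the repository's `mse_loss` and `cross_entropy_loss`:

```python
import numpy as np

def mse(y, t):
    """Mean squared error between predictions y and targets t."""
    y, t = np.asarray(y), np.asarray(t)
    return np.mean((y - t) ** 2)

def cross_entropy(p, t):
    """Cross-entropy between predicted class probabilities p and one-hot
    targets t, i.e. the negative log-likelihood of the true class."""
    p, t = np.asarray(p), np.asarray(t)
    return -np.sum(t * np.log(p))

print(mse([1.0, 2.0, 3.0], [1.0, 2.0, 5.0]))      # (0 + 0 + 4) / 3
print(cross_entropy([0.1, 0.7, 0.2], [0, 1, 0]))  # -log(0.7)
```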
For a given class, its contribution to the loss depends on its probability in the following trend:" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Examples\n", + "\n", + "First, let's import the necessary packages." + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": {}, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + "Using TensorFlow backend.\n" + ] + } + ], + "source": [ + "import os, sys\n", + "sys.path = [os.path.abspath(\"../../\")] + sys.path\n", + "from deep_learning4e import *\n", + "from notebook4e import *" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Neural Network Layers\n", + "\n", + "Neural networks may be conveniently described using the data structure of computational graphs. A computational graph is a directed graph describing how a set of variables is computed, with each variable computed by applying a specific operation to a set of other variables. \n", + "\n", + "In our code, we provide the class `NNUnit` as the basic structure of a neural network. The structure of `NNUnit` is simple: it stores only the following information:\n", + "\n", + "- **val**: the value of the current node.\n", + "- **parent**: parents of the current node.\n", + "- **weights**: weights between the parent nodes and the current node. It has the same size as `parent`.\n", + "\n", + "There is another class `Layer` inheriting from `NNUnit`. A `Layer` object holds a list of nodes representing all the nodes in a layer. It also has a method `forward` to pass a value through the current layer. Here we will demonstrate several pre-defined types of layers in a neural network."
For many problems, we need to model discrete variables that have k distinct values instead of just binary variables. For example, models of natural language may predict a single word from a vocabulary of tens of thousands of words or more. To represent these distributions, we use a softmax layer:" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "$$P(y=i|x)=softmax(h(x)^TW+b)_i$$" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "where $W$ is a matrix of learned output-layer weights, $b$ is a vector of learned biases, and the softmax function is:\n", + "\n", + "$$softmax(z)_i=\\exp(z_i)/\\sum_j \\exp(z_j)$$" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "It is simple to create an output layer and feed an example into it:" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[0.03205860328008499, 0.08714431874203257, 0.23688281808991013, 0.6439142598879722]\n" + ] + } + ], + "source": [ + "layer = OutputLayer(size=4)\n", + "example = [1,2,3,4]\n", + "print(layer.forward(example))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The output can be treated as a normalized probability distribution over the output classes." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Input Layers\n", + "\n", + "An input layer can be treated as a mapping layer that maps each element of the input vector to an input-layer node. The input layer acts as storage for the input vector, which is used during forward propagation.\n", + "\n", + "In our implementation of input layers, the sizes of the input vector and the input layer must match."
+ ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[1, 2, 3]\n" + ] + } + ], + "source": [ + "layer = InputLayer(size=3)\n", + "example = [1,2,3]\n", + "print(layer.forward(example))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Hidden Layers\n", + "\n", + "While processing an input vector x, the neural network performs several intermediate computations before producing the output y. We can think of these intermediate computations as the state of memory during the execution of a multi-step program. We call the intermediate computations hidden because the data does not specify the values of these variables.\n", + "\n", + "Most neural network hidden layers are based on a linear transformation followed by the application of an elementwise nonlinear function called the activation function g:\n", + "\n", + "$$h=g(Wx+b)$$\n", + "\n", + "where W is a learned matrix of weights and b is a learned set of bias parameters.\n", + "\n", + "We pre-defined several activation functions in `utils.py`: `sigmoid`, `relu`, `elu`, `tanh` and `leaky_relu`. They all inherit from the `Activation` class.
You can get the value of the function or its derivative at a certain point x:" + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Sigmoid at 0: 0.5\n", + "Derivative of sigmoid at 0: 0\n" + ] + } + ], + "source": [ + "s = sigmoid()\n", + "print(\"Sigmoid at 0:\", s.f(0))\n", + "print(\"Derivative of sigmoid at 0:\", s.derivative(0))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To create a hidden layer object, several attributes need to be specified:\n", + "\n", + "- **in_size**: the input vector size of each hidden layer node.\n", + "- **out_size**: the size of the output vector of the hidden layer. Each node holds a weight vector of size `in_size`. The weights are initialized randomly.\n", + "- **activation**: the activation function used for this layer.\n", + "\n", + "Now let's briefly demonstrate how a dense hidden layer works:" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[0.21990266877137224, 0.2038864498984756, 0.5543443697256466]\n" + ] + } + ], + "source": [ + "layer = DenseLayer(in_size=4, out_size=3, activation=sigmoid())\n", + "example = [1,2,3,4]\n", + "print(layer.forward(example))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This layer maps an input of size 4 to an output of size 3. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Convolutional Layers\n", + "\n", + "The convolutional layer is similar to the hidden layer except that it uses a different forward strategy. The convolutional layer takes an input of multiple channels and convolves each channel with a pre-defined kernel function. Thus the output of the convolutional layer has the same number of channels as the input.
If we imagine each input as an image, then the channels represent its color model, such as RGB. The output will still have the same color model as the input.\n", + "\n", + "Now let's try the one-dimensional convolution layer:" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[array([3.9894228, 3.9894228, 3.9894228]), array([3.9894228, 3.9894228, 3.9894228]), array([3.9894228, 3.9894228, 3.9894228])]\n" + ] + } + ], + "source": [ + "layer = ConvLayer1D(size=3, kernel_size=3)\n", + "example = [[1]*3 for _ in range(3)]\n", + "print(layer.forward(example))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The input here can be viewed as a one-dimensional image with three channels." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Pooling Layers\n", + "\n", + "Pooling layers can be treated as a special kind of convolutional layer whose kernel extracts a certain value from each kernel region. Here we use max-pooling to report the maximum value in each group." + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[[3, 4], [4, 4], [4, 4]]\n" + ] + } + ], + "source": [ + "layer = MaxPoolingLayer1D(size=3, kernel_size=3)\n", + "example = [[1,2,3,4], [2,3,4,1],[3,4,1,2]]\n", + "print(layer.forward(example))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can see that the kernel picks the maximum value in each of its regions."
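The sliding-window behavior of these two layers can be made concrete with a minimal plain-NumPy sketch. These are hypothetical single-channel helpers, not the repository's `ConvLayer1D` and `MaxPoolingLayer1D`:

```python
import numpy as np

def conv1d(channel, kernel):
    """Valid 1-D convolution of one channel with a kernel: slide the kernel
    along the signal and take the dot product at each position."""
    k = len(kernel)
    return np.array([np.dot(channel[i:i + k], kernel)
                     for i in range(len(channel) - k + 1)])

def max_pool1d(channel, k):
    """Non-overlapping max pooling: report the maximum of each window of size k."""
    return [max(channel[i:i + k]) for i in range(0, len(channel) - k + 1, k)]

signal = [1, 2, 3, 4, 2, 1]
print(conv1d(signal, np.array([1, 0, -1])))  # a simple edge-detecting kernel
print(max_pool1d(signal, 2))
```

A multi-channel layer simply applies the same operation to each channel independently, which is why the output keeps the channel count of the input.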
+ ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.9" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/notebooks/chapter19/Optimizer and Backpropagation.ipynb b/notebooks/chapter19/Optimizer and Backpropagation.ipynb new file mode 100644 index 000000000..5194adc7a --- /dev/null +++ b/notebooks/chapter19/Optimizer and Backpropagation.ipynb @@ -0,0 +1,311 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Optimization Algorithms\n", + "\n", + "Training a neural network consists of modifying the network’s parameters to minimize the cost function on the training set. In principle, any kind of optimization algorithm could be used. In practice, modern neural networks are almost always trained with some variant of stochastic gradient descent (SGD). Here we will provide two optimization algorithms: SGD and the Adam optimizer.\n", + "\n", + "## Stochastic Gradient Descent\n", + "\n", + "The goal of an optimization algorithm is to find parameter values that make the loss function very low.
For some types of models, an optimization algorithm might find the global minimum of the loss function, but for neural networks the most practical way to drive the loss function to a local minimum is to minimize it over small batches of examples.\n", + "\n", + "Gradient descent uses the following update rule to minimize the loss function:" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "$$\\theta^{(t+1)} = \\theta^{(t)}-\\alpha\\nabla_\\theta L(\\theta^{(t)})$$" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "where t is the time step of the algorithm and $\\alpha$ is the learning rate. This rule can be very costly when $L(\\theta)$ is defined as a sum across the entire training set. Using SGD accelerates learning, as we use only a batch of examples to update the parameters. \n", + "\n", + "We implemented the gradient descent algorithm, which can be viewed with the following code:" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": {}, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + "Using TensorFlow backend.\n" + ] + } + ], + "source": [ + "import os, sys\n", + "sys.path = [os.path.abspath(\"../../\")] + sys.path\n", + "from deep_learning4e import *\n", + "from notebook4e import *" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def gradient_descent(dataset, net, loss, epochs=1000, l_rate=0.01,  batch_size=1):\n",
    +       "    """\n",
    +       "    gradient descent algorithm to update the learnable parameters of a network.\n",
    +       "    :return: the updated network.\n",
    +       "    """\n",
    +       "    # init data\n",
    +       "    examples = dataset.examples\n",
    +       "\n",
    +       "    for e in range(epochs):\n",
    +       "        total_loss = 0\n",
    +       "        random.shuffle(examples)\n",
    +       "        weights = [[node.weights for node in layer.nodes] for layer in net]\n",
    +       "\n",
    +       "        for batch in get_batch(examples, batch_size):\n",
    +       "\n",
    +       "            inputs, targets = init_examples(batch, dataset.inputs, dataset.target, len(net[-1].nodes))\n",
    +       "            # compute gradients of weights\n",
    +       "            gs, batch_loss = BackPropagation(inputs, targets, weights, net, loss)\n",
    +       "            # update weights with gradient descent\n",
    +       "            weights = vector_add(weights, scalar_vector_product(-l_rate, gs))\n",
    +       "            total_loss += batch_loss\n",
    +       "            # update the weights of network each batch\n",
    +       "            for i in range(len(net)):\n",
    +       "                if weights[i]:\n",
    +       "                    for j in range(len(weights[i])):\n",
    +       "                        net[i].nodes[j].weights = weights[i][j]\n",
    +       "\n",
    +       "        if (e+1) % 10 == 0:\n",
    +       "            print("epoch:{}, total_loss:{}".format(e+1,total_loss))\n",
    +       "    return net\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(gradient_descent)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "There are several key elements to specify when using a `gradient_descent` optimizer:\n", + "\n", + "- **dataset**: A dataset object we used in the previous chapter, such as `iris` and `orings`.\n", + "- **net**: A neural network object which we will cover in the next chapter.\n", + "- **loss**: The loss function used to measure accuracy.\n", + "- **epochs**: How many rounds the training set is used.\n", + "- **l_rate**: The learning rate.\n", + "- **batch_size**: The number of examples used in each update. With a batch size of 1, gradient descent reduces to SGD." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Adam Optimizer\n", + "\n", + "To mitigate some of the problems caused by the fact that the gradient ignores the second derivatives, some optimization algorithms incorporate the idea of momentum, which keeps a running average of the gradients of past mini-batches. The Adam optimizer therefore maintains running averages of previous gradients.\n", + "\n", + "To view the pseudocode and the implementation, you can use the following code:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "pseudocode(adam_optimizer)\n", + "psource(adam_optimizer)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The Adam optimizer takes two attributes that gradient descent does not: rho and delta. These parameters determine how much of the previous gradient information is retained. 
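The running-average idea behind Adam can be sketched in plain Python. This is a hypothetical toy example illustrating the standard Adam update, not the repository's `adam_optimizer` implementation; parameter names like `beta1`/`beta2` follow the original paper's notation rather than the `rho`/`delta` names used above.

```python
import math

def adam_step(theta, grad, m, v, t, l_rate=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    # running averages of the gradient (m) and the squared gradient (v)
    m = [beta1 * mi + (1 - beta1) * g for mi, g in zip(m, grad)]
    v = [beta2 * vi + (1 - beta2) * g * g for vi, g in zip(v, grad)]
    # bias-corrected estimates (t counts steps starting from 1)
    m_hat = [mi / (1 - beta1 ** t) for mi in m]
    v_hat = [vi / (1 - beta2 ** t) for vi in v]
    # parameter update scaled by the square root of the second moment
    theta = [p - l_rate * mh / (math.sqrt(vh) + eps)
             for p, mh, vh in zip(theta, m_hat, v_hat)]
    return theta, m, v

theta, m, v = adam_step([1.0], [1.0], [0.0], [0.0], t=1)
```

Because of the bias correction, the very first step moves the parameter by almost exactly the learning rate regardless of the gradient's scale.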
For more details on how this algorithm works, please refer to the original paper [here](https://arxiv.org/abs/1412.6980).\n", + "\n", + "In the Stanford course on deep learning for computer vision, the Adam algorithm is suggested as the default optimization method for deep learning applications: \n", + ">In practice Adam is currently recommended as the default algorithm to use, and often works slightly better than RMSProp. However, it is often also worth trying SGD+Nesterov Momentum as an alternative." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Backpropagation\n", + "\n", + "The above algorithms are optimization algorithms: they update parameters like $\\theta$ to get smaller loss values. Back-propagation is the method used to calculate the gradient for each layer. For complicated models like deep neural networks, the gradients cannot be calculated directly because of the enormous number of array-valued variables.\n", + "\n", + "Fortunately, back-propagation computes these gradients efficiently: gradients are calculated from the last layer back to the first, the reverse of the forward pass. The derivative of the loss function is passed back to earlier layers so that their weights change in the direction that minimizes the loss." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Applying optimizers and the back-propagation algorithm together, we can update the weights of a neural network to minimize the loss function by alternating forward and backward passes. 
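One full training step therefore pairs a backward pass (to obtain gradients) with the SGD update rule $\theta \leftarrow \theta - \alpha\nabla_\theta L$. A minimal framework-free sketch on a toy quadratic loss (a hypothetical example, not the repository's implementation, with the gradient written by hand instead of computed by back-propagation):

```python
def sgd_step(theta, grad, l_rate=0.1):
    # theta <- theta - alpha * grad, elementwise
    return [t - l_rate * g for t, g in zip(theta, grad)]

# toy loss L(theta) = sum(t ** 2); its gradient is 2 * theta,
# so repeated steps drive theta toward the minimum at 0
theta = [4.0, -2.0]
for _ in range(100):
    theta = sgd_step(theta, [2 * t for t in theta])
```

Each step shrinks every parameter by a factor of $1 - 2\alpha$, so the iterates converge geometrically to the minimizer.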
Here is a figure from [here](https://medium.com/datathings/neural-networks-and-backpropagation-explained-in-a-simple-way-f540a3611f5e) describing how a neural network updates its weights:\n", + "\n", + "" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In our implementation, all the steps are integrated into the optimizer objects. The forward-backward process of passing information through the whole neural network is put into the method `BackPropagation`. You can view the code with:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "psource(BackPropagation)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Optimizers and the back-propagation algorithm will be demonstrated together with the neural network learners." + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.9" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/notebooks/chapter19/RNN.ipynb b/notebooks/chapter19/RNN.ipynb new file mode 100644 index 000000000..b6971b36a --- /dev/null +++ b/notebooks/chapter19/RNN.ipynb @@ -0,0 +1,487 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# RNN\n", + "\n", + "## Overview\n", + "\n", + "When humans think, they build on their understanding of previous time steps rather than starting from scratch. Traditional neural networks can’t do this, and it seems like a major shortcoming. For example, imagine you want to do sentiment analysis of some texts. 
A traditional network that cannot take the surrounding phrases and sentences into account will struggle with such a task.\n", + "\n", + "Recurrent neural networks address this issue. They are networks with loops in them, allowing information to persist.\n", + "\n", + "" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "A recurrent neural network can be thought of as multiple copies of the same network, each passing a message to a successor. Consider what happens if we unroll the above loop:\n", + " \n", + "" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As demonstrated in the book, recurrent neural networks may be connected in many different ways: sequences in the input, the output, or in the most general case both.\n", + "\n", + "" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Implementation\n", + "\n", + "In our case, we implement RNNs with modules offered by the `keras` package. To use `keras` and our module, you must have both `tensorflow` and `keras` installed as a prerequisite. `keras` offers a well-defined, high-level neural network API which allows for easy and fast prototyping. `keras` supports many different types of networks such as convolutional and recurrent neural networks as well as user-defined networks. 
To get started with `keras`, please read the [tutorial](https://keras.io/).\n", + "\n", + "To view our implementation of a simple RNN, please use the following code:" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": {}, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + "Using TensorFlow backend.\n" + ] + } + ], + "source": [ + "import warnings\n", + "warnings.filterwarnings(\"ignore\", category=FutureWarning)\n", + "import os, sys\n", + "sys.path = [os.path.abspath(\"../../\")] + sys.path\n", + "from deep_learning4e import *\n", + "from notebook4e import *" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def SimpleRNNLearner(train_data, val_data, epochs=2):\n",
    +       "    """\n",
    +       "    RNN example for text sentimental analysis.\n",
    +       "    :param train_data: a tuple of (training data, targets)\n",
    +       "            Training data: ndarray taking training examples, while each example is coded by embedding\n",
    +       "            Targets: ndarray taking targets of each example. Each target is mapped to an integer.\n",
    +       "    :param val_data: a tuple of (validation data, targets)\n",
    +       "    :param epochs: number of epochs\n",
    +       "    :return: a keras model\n",
    +       "    """\n",
    +       "\n",
    +       "    total_inputs = 5000\n",
    +       "    input_length = 500\n",
    +       "\n",
    +       "    # init data\n",
    +       "    X_train, y_train = train_data\n",
    +       "    X_val, y_val = val_data\n",
    +       "\n",
    +       "    # init a the sequential network (embedding layer, rnn layer, dense layer)\n",
    +       "    model = Sequential()\n",
    +       "    model.add(Embedding(total_inputs, 32, input_length=input_length))\n",
    +       "    model.add(SimpleRNN(units=128))\n",
    +       "    model.add(Dense(1, activation='sigmoid'))\n",
    +       "    model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])\n",
    +       "\n",
    +       "    # train the model\n",
    +       "    model.fit(X_train, y_train, validation_data=(X_val, y_val), epochs=epochs, batch_size=128, verbose=2)\n",
    +       "\n",
    +       "    return model\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(SimpleRNNLearner)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "`train_data` and `val_data` are needed when creating a simple RNN learner. Each is a tuple of examples and their targets. Please note that we build the network by adding layers to a `Sequential()` model, which means the layers are stacked and applied one after another. The `SimpleRNN` layer is the key layer, playing the recurrent role. The `Embedding` and `Dense` layers before and after it map the inputs and outputs to the forms the RNN layer expects. The optimizer used in this case is the Adam optimizer." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Example\n", + "\n", + "Here is an example of how we train the RNN network made with `keras`. In this case, we use the IMDB dataset, which can be viewed [here](https://keras.io/datasets/#imdb-movie-reviews-sentiment-classification) in detail. In short, the dataset consists of movie reviews in text and their sentiment labels (positive/negative). After loading the dataset we use `keras_dataset_loader` to split it into training, validation and test datasets." 
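The fixed `input_length` above means variable-length reviews must be padded or truncated to one common length before they reach the embedding layer. A minimal sketch of that preprocessing for a single sequence (a hypothetical helper; in practice `keras.preprocessing.sequence.pad_sequences` does this for whole batches):

```python
def pad_sequence(seq, maxlen, value=0):
    # keep the last `maxlen` tokens; left-pad shorter sequences with `value`
    seq = list(seq)[-maxlen:]
    return [value] * (maxlen - len(seq)) + seq

padded = pad_sequence([1, 2, 3], maxlen=5)
```

Left-padding keeps the most recent tokens adjacent to the end of the sequence, which is where a recurrent layer's final state is read.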
+ ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [], + "source": [ + "from keras.datasets import imdb\n", + "data = imdb.load_data(num_words=5000)\n", + "train, val, test = keras_dataset_loader(data)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Then we build and train the rnn model for 10 epochs:" + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + "WARNING: Logging before flag parsing goes to stderr.\n", + "W1018 22:51:23.614058 140557804885824 deprecation.py:323] From /usr/local/lib/python3.6/dist-packages/tensorflow/python/ops/nn_impl.py:180: add_dispatch_support..wrapper (from tensorflow.python.ops.array_ops) is deprecated and will be removed in a future version.\n", + "Instructions for updating:\n", + "Use tf.where in 2.0, which has the same broadcast rule as np.where\n", + "W1018 22:51:24.267649 140557804885824 deprecation_wrapper.py:119] From /usr/local/lib/python3.6/dist-packages/keras/backend/tensorflow_backend.py:422: The name tf.global_variables is deprecated. 
Please use tf.compat.v1.global_variables instead.\n", + "\n" + ] + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Train on 24990 samples, validate on 25000 samples\n", + "Epoch 1/10\n", + " - 59s - loss: 0.6540 - accuracy: 0.5959 - val_loss: 0.6234 - val_accuracy: 0.6488\n", + "Epoch 2/10\n", + " - 61s - loss: 0.5977 - accuracy: 0.6766 - val_loss: 0.6202 - val_accuracy: 0.6326\n", + "Epoch 3/10\n", + " - 61s - loss: 0.5269 - accuracy: 0.7356 - val_loss: 0.4803 - val_accuracy: 0.7789\n", + "Epoch 4/10\n", + " - 61s - loss: 0.4159 - accuracy: 0.8130 - val_loss: 0.5640 - val_accuracy: 0.7046\n", + "Epoch 5/10\n", + " - 61s - loss: 0.3931 - accuracy: 0.8294 - val_loss: 0.4707 - val_accuracy: 0.8090\n", + "Epoch 6/10\n", + " - 61s - loss: 0.3357 - accuracy: 0.8637 - val_loss: 0.4177 - val_accuracy: 0.8122\n", + "Epoch 7/10\n", + " - 61s - loss: 0.3552 - accuracy: 0.8594 - val_loss: 0.4652 - val_accuracy: 0.7889\n", + "Epoch 8/10\n", + " - 61s - loss: 0.3286 - accuracy: 0.8686 - val_loss: 0.4708 - val_accuracy: 0.7785\n", + "Epoch 9/10\n", + " - 61s - loss: 0.3428 - accuracy: 0.8635 - val_loss: 0.4332 - val_accuracy: 0.8137\n", + "Epoch 10/10\n", + " - 61s - loss: 0.3650 - accuracy: 0.8471 - val_loss: 0.4673 - val_accuracy: 0.7914\n" + ] + } + ], + "source": [ + "model = SimpleRNNLearner(train, val, epochs=10)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The training accuracy reaches about 85% and the validation accuracy about 80%, which is very promising. Now let's try on some random examples in the test set:" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Autoencoder\n", + "\n", + "Autoencoders are an unsupervised learning technique in which we leverage neural networks for the task of representation learning. They work by compressing the input into a latent-space representation and then reconstructing the input from that compressed code. 
    \n", + "\n", + "" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Autoencoders are learned automatically from data examples. This means that it is easy to train specialized instances of the algorithm that will perform well on a specific type of input and that it does not require any new engineering, only the appropriate training data.\n", + "\n", + "Autoencoders have different architectures for different kinds of data. Here we only provide a simple example of a vanilla autoencoder, which means there is only one hidden layer in the network:\n", + "\n", + "\n", + "\n", + "You can view the source code by:" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def AutoencoderLearner(inputs, encoding_size, epochs=200):\n",
    +       "    """\n",
    +       "    Simple example of linear auto encoder learning producing the input itself.\n",
    +       "    :param inputs: a batch of input data in np.ndarray type\n",
    +       "    :param encoding_size: int, the size of encoding layer\n",
    +       "    :param epochs: number of epochs\n",
    +       "    :return: a keras model\n",
    +       "    """\n",
    +       "\n",
    +       "    # init data\n",
    +       "    input_size = len(inputs[0])\n",
    +       "\n",
    +       "    # init model\n",
    +       "    model = Sequential()\n",
    +       "    model.add(Dense(encoding_size, input_dim=input_size, activation='relu', kernel_initializer='random_uniform',\n",
    +       "                    bias_initializer='ones'))\n",
    +       "    model.add(Dense(input_size, activation='relu', kernel_initializer='random_uniform', bias_initializer='ones'))\n",
    +       "\n",
    +       "    # update model with sgd\n",
    +       "    sgd = optimizers.SGD(lr=0.01)\n",
    +       "    model.compile(loss='mean_squared_error', optimizer=sgd, metrics=['accuracy'])\n",
    +       "\n",
    +       "    # train the model\n",
    +       "    model.fit(inputs, inputs, epochs=epochs, batch_size=10, verbose=2)\n",
    +       "\n",
    +       "    return model\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(AutoencoderLearner)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "It shows we added two dense layers to the network structures." + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.9" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/notebooks/chapter19/images/autoencoder.png b/notebooks/chapter19/images/autoencoder.png new file mode 100644 index 000000000..cd216e9f7 Binary files /dev/null and b/notebooks/chapter19/images/autoencoder.png differ diff --git a/notebooks/chapter19/images/backprop.png b/notebooks/chapter19/images/backprop.png new file mode 100644 index 000000000..8d53530e6 Binary files /dev/null and b/notebooks/chapter19/images/backprop.png differ diff --git a/notebooks/chapter19/images/corss_entropy_plot.png b/notebooks/chapter19/images/corss_entropy_plot.png new file mode 100644 index 000000000..8212405e7 Binary files /dev/null and b/notebooks/chapter19/images/corss_entropy_plot.png differ diff --git a/notebooks/chapter19/images/mse_plot.png b/notebooks/chapter19/images/mse_plot.png new file mode 100644 index 000000000..fd58f9db9 Binary files /dev/null and b/notebooks/chapter19/images/mse_plot.png differ diff --git a/notebooks/chapter19/images/nn.png b/notebooks/chapter19/images/nn.png new file mode 100644 index 000000000..673b9338b Binary files /dev/null and b/notebooks/chapter19/images/nn.png differ diff --git a/notebooks/chapter19/images/nn_steps.png b/notebooks/chapter19/images/nn_steps.png new file mode 100644 index 000000000..4a596133b Binary 
files /dev/null and b/notebooks/chapter19/images/nn_steps.png differ diff --git a/notebooks/chapter19/images/perceptron.png b/notebooks/chapter19/images/perceptron.png new file mode 100644 index 000000000..68d2a258a Binary files /dev/null and b/notebooks/chapter19/images/perceptron.png differ diff --git a/notebooks/chapter19/images/rnn_connections.png b/notebooks/chapter19/images/rnn_connections.png new file mode 100644 index 000000000..c72d459b8 Binary files /dev/null and b/notebooks/chapter19/images/rnn_connections.png differ diff --git a/notebooks/chapter19/images/rnn_unit.png b/notebooks/chapter19/images/rnn_unit.png new file mode 100644 index 000000000..e4ebabf2b Binary files /dev/null and b/notebooks/chapter19/images/rnn_unit.png differ diff --git a/notebooks/chapter19/images/rnn_units.png b/notebooks/chapter19/images/rnn_units.png new file mode 100644 index 000000000..5724f5d46 Binary files /dev/null and b/notebooks/chapter19/images/rnn_units.png differ diff --git a/notebooks/chapter19/images/vanilla.png b/notebooks/chapter19/images/vanilla.png new file mode 100644 index 000000000..db7a45f9a Binary files /dev/null and b/notebooks/chapter19/images/vanilla.png differ diff --git a/notebooks/chapter21/Active Reinforcement Learning.ipynb b/notebooks/chapter21/Active Reinforcement Learning.ipynb new file mode 100644 index 000000000..1ce3c79e0 --- /dev/null +++ b/notebooks/chapter21/Active Reinforcement Learning.ipynb @@ -0,0 +1,212 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# ACTIVE REINFORCEMENT LEARNING\n", + "\n", + "This notebook mainly focuses on active reinforcement learning algorithms. 
For a general introduction to reinforcement learning and passive algorithms, please refer to the **[Passive Reinforcement Learning](./Passive%20Reinforcement%20Learning.ipynb)** notebook.\n", + "\n", + "Unlike in passive reinforcement learning, in active reinforcement learning we are not bound by a fixed policy $\\pi$ and must select our own actions. In other words, the agent needs to learn an optimal policy. The fundamental tradeoff the agent faces is that of exploration vs. exploitation. \n", + "\n", + "## QLearning Agent\n", + "\n", + "The QLearningAgent class in the rl module implements the Agent Program described in **Fig 21.8** of the AIMA Book. In Q-Learning the agent learns an action-value function Q which gives the utility of taking a given action in a particular state. Q-Learning does not require a transition model and hence is a model-free method. Let us look into the source before we see some usage examples." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%psource QLearningAgent" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The Agent Program can be obtained by creating an instance of the class and passing the appropriate parameters. Because of the `__call__` method, the object that is created behaves like a callable and returns an appropriate action, as most Agent Programs do. To instantiate the object we need an `mdp` object, similar to the `PassiveTDAgent`.\n", + "\n", + " Let us use the same `GridMDP` object we used above. **Figure 17.1 (sequential_decision_environment)** is similar to **Figure 21.1** but has a discounting parameter of **gamma = 0.9**. The environment also implements an exploration function **f** which returns a fixed **Rplus** until the agent has visited the state-action pair **Ne** times. The method **actions_in_state** returns the actions possible in a given state. It is useful when applying the max and argmax operations." 
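The core of Q-learning is a single tabular update per observed transition. A minimal sketch of that update (a hypothetical toy example, not the repository's `QLearningAgent`, and without the exploration function):

```python
from collections import defaultdict

def q_update(Q, s, a, r, s_next, actions, alpha=0.5, gamma=0.9):
    # Q(s, a) <- Q(s, a) + alpha * (r + gamma * max_a' Q(s', a') - Q(s, a))
    best_next = max(Q[(s_next, a2)] for a2 in actions)
    Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])

# one step in a grid world: moving east from (0, 0) costs -0.04
Q = defaultdict(float)
q_update(Q, s=(0, 0), a=(1, 0), r=-0.04, s_next=(1, 0),
         actions=[(0, 1), (0, -1), (-1, 0), (1, 0)])
```

Note that no transition model appears anywhere in the update, which is exactly why Q-learning is model-free.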
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let us create our object now. We also use the **same alpha** as given in the footnote of the book on **page 769**: $\alpha(n)=60/(59+n)$. We use **Rplus = 2** and **Ne = 5** as defined in the book. The pseudocode can be found in **Fig 21.7** in the book." + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": {}, + "outputs": [], + "source": [ + "import os, sys\n", + "sys.path = [os.path.abspath(\"../../\")] + sys.path\n", + "from rl4e import *\n", + "from mdp import sequential_decision_environment, value_iteration" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [], + "source": [ + "q_agent = QLearningAgent(sequential_decision_environment, Ne=5, Rplus=2, \n", + " alpha=lambda n: 60./(59+n))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now to try out the q_agent we make use of the **run_single_trial** function in rl.py (which was also used above). Let us use **200** iterations." + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": {}, + "outputs": [], + "source": [ + "for i in range(200):\n", + " run_single_trial(q_agent,sequential_decision_environment)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now let us see the Q Values. The keys are state-action pairs. 
The actions correspond to the following directions:\n", + "\n", + "north = (0, 1) \n", + "south = (0,-1) \n", + "west = (-1, 0) \n", + "east = (1, 0)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "q_agent.Q" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The Utility U of each state is related to Q by the following equation.\n", + "\n", + "$$U(s) = \max_a Q(s, a)$$\n", + "\n", + "Let us convert the Q Values above into U estimates.\n", + "\n" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": {}, + "outputs": [], + "source": [ + "U = defaultdict(lambda: -1000.) # very large negative value for the comparison below\n", + "for state_action, value in q_agent.Q.items():\n", + " state, action = state_action\n", + " if U[state] < value:\n", + " U[state] = value" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now we can output the estimated utility values at each state:" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "defaultdict(()>,\n", + " {(0, 0): -0.0036556430391564178,\n", + " (1, 0): -0.04862675963288682,\n", + " (2, 0): 0.03384490363100474,\n", + " (3, 0): -0.16618771401113092,\n", + " (3, 1): -0.6015323978614368,\n", + " (0, 1): 0.09161077177913537,\n", + " (0, 2): 0.1834607974581678,\n", + " (1, 2): 0.26393277962204903,\n", + " (2, 2): 0.32369726495311274,\n", + " (3, 2): 0.38898341569576245,\n", + " (2, 1): -0.044858154562400485})" + ] + }, + "execution_count": 10, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "U" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let us finally compare these estimates to value_iteration results." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "{(0, 1): 0.3984432178350045, (1, 2): 0.649585681261095, (3, 2): 1.0, (0, 0): 0.2962883154554812, (3, 0): 0.12987274656746342, (3, 1): -1.0, (2, 1): 0.48644001739269643, (2, 0): 0.3447542300124158, (2, 2): 0.7953620878466678, (1, 0): 0.25386699846479516, (0, 2): 0.5093943765842497}\n" + ] + } + ], + "source": [ + "print(value_iteration(sequential_decision_environment))" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.7.2" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/notebooks/chapter21/Passive Reinforcement Learning.ipynb b/notebooks/chapter21/Passive Reinforcement Learning.ipynb new file mode 100644 index 000000000..cbb5ae9e3 --- /dev/null +++ b/notebooks/chapter21/Passive Reinforcement Learning.ipynb @@ -0,0 +1,424 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Introduction to Reinforcement Learning\n", + "\n", + "This Jupyter notebook and the others in the same folder act as supporting materials for **Chapter 21 Reinforcement Learning** of the book *Artificial Intelligence: A Modern Approach*. The notebooks make use of the implementations in the `rl.py` module. We also make use of the implementation of MDPs in the `mdp.py` module to test our agents. It might be helpful if you have already gone through the Jupyter notebook dealing with the Markov decision process. Let us import everything from the `rl` module. It might be helpful to view the source of some of our implementations." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [], + "source": [ + "import os, sys\n", + "sys.path = [os.path.abspath(\"../../\")] + sys.path\n", + "from rl4e import *" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Before we start playing with the actual implementations let us review a couple of things about RL.\n", + "\n", + "1. Reinforcement Learning is concerned with how software agents ought to take actions in an environment so as to maximize some notion of cumulative reward. \n", + "\n", + "2. Reinforcement learning differs from standard supervised learning in that correct input/output pairs are never presented, nor sub-optimal actions explicitly corrected. Further, there is a focus on on-line performance, which involves finding a balance between exploration (of uncharted territory) and exploitation (of current knowledge).\n", + "\n", + "-- Source: [Wikipedia](https://en.wikipedia.org/wiki/Reinforcement_learning)\n", + "\n", + "In summary, we have a sequence of state-action transitions with rewards associated with some states. Our goal is to find the optimal policy $\pi$ which tells us what action to take in each state." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Passive Reinforcement Learning\n", + "\n", + "In passive reinforcement learning the agent follows a fixed policy $\pi$. Passive learning attempts to evaluate the given policy $\pi$ - without any knowledge of the Reward function $R(s)$ and the Transition model $P(s'\ |\ s, a)$.\n", + "\n", + "This is usually done by some method of **utility estimation**. The agent attempts to directly learn the utility of each state that would result from following the policy. Note that at each step, it has to *perceive* the reward and the state - it has no global knowledge of these. 
Thus, if the available actions offer only a very low probability of reaching some state $s_+$, the agent may never perceive the reward $R(s_+)$.\n", + "\n", + "Consider a situation where an agent is given the policy to follow. Thus, at any point, it knows only its current state and current reward, and the action it must take next. This action may lead it to more than one state, with different probabilities.\n", + "\n", + "For a series of actions given by $\pi$, the estimated utility $U$ is:\n", + "$$U^{\pi}(s) = E\left[\sum_{t=0}^{\infty} \gamma^t R(s_t)\right]$$\n", + "Or the expected value of summed discounted rewards until termination.\n", + "\n", + "Based on this concept, we discuss three methods of estimating utility: direct utility estimation, adaptive dynamic programming, and temporal-difference learning.\n", + "\n", + "### Implementation\n", + "\n", + "Passive agents are implemented in `rl4e.py` as various `Agent-Class`es.\n", + "\n", + "To demonstrate these agents, we make use of the `GridMDP` object from the `MDP` module. `sequential_decision_environment` is similar to that used for the `MDP` notebook but has discounting with $\gamma = 0.9$.\n", + "\n", + "The `Agent-Program` can be obtained by creating an instance of the relevant `Agent-Class`. The `__call__` method allows the `Agent-Class` to be called as a function. The class needs to be instantiated with a policy ($\pi$) and an `MDP` whose utility of states will be estimated.\n" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [], + "source": [ + "from mdp import sequential_decision_environment" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The `sequential_decision_environment` is a GridMDP object as shown below. The rewards are **+1** and **-1** in the terminal states, and **-0.04** in the rest. Now we define actions and a policy similar to **Fig 21.1** in the book." 
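The estimated utility above is just the expectation of the discounted reward sum. For a single recorded trial it can be computed backwards in one pass, which is the core of direct utility estimation (a hypothetical toy sketch, not the repository's agent code):

```python
def discounted_returns(rewards, gamma=0.9):
    # G_t = r_t + gamma * G_{t+1}, computed backwards from the end of the trial
    G, out = 0.0, []
    for r in reversed(rewards):
        G = r + gamma * G
        out.append(G)
    return out[::-1]

# rewards observed along one short trial ending in the +1 terminal state
returns = discounted_returns([-0.04, -0.04, 1.0])
```

Averaging these per-trial returns over many trials gives the utility estimate for each visited state.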
+ ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": {}, + "outputs": [], + "source": [ + "# Action Directions\n", + "north = (0, 1)\n", + "south = (0,-1)\n", + "west = (-1, 0)\n", + "east = (1, 0)\n", + "\n", + "policy = {\n", + " (0, 2): east, (1, 2): east, (2, 2): east, (3, 2): None,\n", + " (0, 1): north, (2, 1): north, (3, 1): None,\n", + " (0, 0): north, (1, 0): west, (2, 0): west, (3, 0): west, \n", + "}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This environment will be used extensively in the following demonstrations." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Direct Utility Estimation (DUE)\n", + " \n", + " The first, most naive method of estimating utility comes from the simplest interpretation of the above definition. We construct an agent that follows the policy until it reaches the terminal state. At each step, it logs its current state and reward. Once it reaches the terminal state, it can estimate the utility for each state for *that* iteration, by simply summing the discounted rewards from that state to the terminal one.\n", + "\n", + " It can now run this 'simulation' $n$ times and calculate the average utility of each state. If a state occurs more than once in a simulation, each of its utility values is counted separately.\n", + " \n", + " Note that this method may be prohibitively slow for very large state-spaces. Besides, **it pays no attention to the transition probability $P(s'\\ |\\ s, a)$.** It misses out on information that it is capable of collecting (say, by recording the number of times an action from one state led to another state). The next method addresses this issue.\n", + " \n", + "### Examples\n", + "\n", + "The `PassiveDUEAgent` class in the `rl` module implements the Agent Program described in **Fig 21.2** of the AIMA Book. `PassiveDUEAgent` sums over rewards to find the estimated utility for each state. 
It thus requires running several iterations." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%psource PassiveDUEAgent" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now let's try the `PassiveDUEAgent` on the newly defined `sequential_decision_environment`:" + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": {}, + "outputs": [], + "source": [ + "DUEagent = PassiveDUEAgent(policy, sequential_decision_environment)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can run the agent through the Markov model for 200 trials in order to get converged utility values:" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": {}, + "outputs": [], + "source": [ + "for i in range(200):\n", + " run_single_trial(DUEagent, sequential_decision_environment)\n", + " DUEagent.estimate_U()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now let's print our estimated utility for each position:" + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "(0, 1):0.7956939931414414\n", + "(1, 2):0.9162054322837863\n", + "(3, 2):1.0\n", + "(0, 0):0.734717308253083\n", + "(2, 2):0.9595117143816332\n", + "(0, 2):0.8481387156375687\n", + "(1, 0):0.4355860415209706\n", + "(2, 1):-0.550079982553143\n", + "(3, 1):-1.0\n" + ] + } + ], + "source": [ + "print('\\n'.join([str(k)+':'+str(v) for k, v in DUEagent.U.items()]))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Adaptive Dynamic Programming (ADP)\n", + " \n", + " This method makes use of knowledge of the past state $s$, the action $a$, and the new perceived state $s'$ to estimate the transition probability $P(s'\\ |\\ s,a)$. 
It does this by simply counting the new states that result from previous states and actions.
    \n", + " The program runs through the policy a number of times, keeping track of:\n", + " - each occurrence of state $s$ and the policy-recommended action $a$ in $N_{sa}$\n", + " - each occurrence of $s'$ resulting from $a$ on $s$ in $N_{s'|sa}$.\n", + " \n", + " It can thus estimate $P(s'\\ |\\ s,a)$ as $N_{s'|sa}/N_{sa}$, which in the limit of infinite trials, will converge to the true value.
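The counting scheme just described can be sketched in a few lines of stand-alone Python (an illustrative snippet, not the module's `PassiveADPAgent`):

```python
from collections import defaultdict

class TransitionEstimator:
    """Estimate P(s' | s, a) as N_{s'|sa} / N_{sa} from observed transitions."""
    def __init__(self):
        self.N_sa = defaultdict(int)      # times action a was taken in state s
        self.N_s1_sa = defaultdict(int)   # times that (s, a) led to s'

    def observe(self, s, a, s1):
        self.N_sa[(s, a)] += 1
        self.N_s1_sa[(s1, s, a)] += 1

    def P(self, s1, s, a):
        n = self.N_sa[(s, a)]
        return self.N_s1_sa[(s1, s, a)] / n if n else 0.0
```

In the limit of infinitely many observed trials these relative frequencies converge to the true transition probabilities.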
    \n", + " Using the transition probabilities thus estimated, it can apply `POLICY-EVALUATION` to estimate the utilities $U(s)$ using properties of convergence of the Bellman equations.\n", + " \n", + "### Examples\n", + "\n", + "The `PassiveADPAgent` class in the `rl` module implements the Agent Program described in **Fig 21.2** of the AIMA Book. `PassiveADPAgent` uses state transition and occurrence counts to estimate $P$, and then $U$. Go through the source below to understand the agent." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%psource PassiveADPAgent" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We instantiate a `PassiveADPAgent` below with the `GridMDP` shown and train it for 200 trials. The `rl` module has a simple implementation to simulate a single trial. The function is called `run_single_trial`." + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Warning: Transition table is empty.\n" + ] + } + ], + "source": [ + "ADPagent = PassiveADPAgent(policy, sequential_decision_environment)\n", + "for i in range(200):\n", + " run_single_trial(ADPagent, sequential_decision_environment)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The utilities are calculated as:" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "(0, 0):0.3014408531958584\n", + "(0, 1):0.40583863351329275\n", + "(1, 2):0.6581480346627065\n", + "(3, 2):1.0\n", + "(3, 0):0.0\n", + "(3, 1):-1.0\n", + "(2, 1):0.5341859348580892\n", + "(2, 0):0.0\n", + "(2, 2):0.810403779650285\n", + "(1, 0):0.23129676787627254\n", + "(0, 2):0.5214746706094832\n" + ] + } + ], + "source": [ + "print('\\n'.join([str(k)+':'+str(v) for k, v in ADPagent.U.items()]))" + ] + }, 
+ { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Comparing with the results of `PassiveDUEAgent`, both agents have -1.0 for the utility at (3,1) and 1.0 at (3,2). Another point to notice is that, apart from the terminal states, the spot with the highest utility for both agents is (2,2), which is easy to understand when referring to the map." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Temporal-difference learning (TD)\n", + " \n", + " Instead of explicitly building the transition model $P$, the temporal-difference model makes use of the expected closeness between the utilities of two consecutive states $s$ and $s'$.\n", + " For the transition $s$ to $s'$, the update is written as:\n", + "$$U^{\\pi}(s) \\leftarrow U^{\\pi}(s) + \\alpha \\left( R(s) + \\gamma U^{\\pi}(s') - U^{\\pi}(s) \\right)$$\n", + " This model implicitly incorporates the transition probabilities by being weighted, for each state, by the number of times that state is reached from the current state. Thus, over a number of iterations, it converges similarly to the Bellman equations.\n", + " The advantage of the TD learning model is its relatively simple computation at each step, rather than having to keep track of various counts.\n", + " For $n_s$ states and $n_a$ actions the ADP model would have $n_s \\times n_a$ numbers $N_{sa}$ and $n_s^2 \\times n_a$ numbers $N_{s'|sa}$ to keep track of. The TD model must only keep track of a utility $U(s)$ for each state.\n", + " \n", + "### Examples\n", + "\n", + "`PassiveTDAgent` uses temporal differences to learn utility estimates. We learn the difference between the states and back up the values to previous states. Let us look into the source before we see some usage examples." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%psource PassiveTDAgent" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In creating the `TDAgent`, we use the **same learning rate** $\\alpha$ as given in the footnote of the book: $\\alpha(n)=60/(59+n)$" + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": {}, + "outputs": [], + "source": [ + "TDagent = PassiveTDAgent(policy, sequential_decision_environment, alpha = lambda n: 60./(59+n))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now we run **200 trials** for the agent to estimate utilities." + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": {}, + "outputs": [], + "source": [ + "for i in range(200):\n", + " run_single_trial(TDagent,sequential_decision_environment)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The calculated utilities are:" + ] + }, + { + "cell_type": "code", + "execution_count": 18, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "(0, 1):0.36652562797696076\n", + "(1, 2):0.6584162739552614\n", + "(3, 2):1\n", + "(0, 0):0.27775491505339645\n", + "(3, 0):0.0\n", + "(3, 1):-1\n", + "(2, 1):0.6097040420148784\n", + "(2, 0):0.0\n", + "(2, 2):0.7936759402770092\n", + "(1, 0):0.19085842384266813\n", + "(0, 2):0.5258782999305713\n" + ] + } + ], + "source": [ + "print('\\n'.join([str(k)+':'+str(v) for k, v in TDagent.U.items()]))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Compared with the previous agents, the results of `PassiveTDAgent` are closer to those of `PassiveADPAgent`." 
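For reference, the TD update from the previous section is essentially a one-liner; this stand-alone sketch (hypothetical states and rewards, not the module's `PassiveTDAgent`) shows the backup for a single observed transition:

```python
def td_update(U, s, s1, r, alpha=0.1, gamma=0.9):
    """One TD backup for an observed transition s -> s' with reward R(s) = r."""
    U[s] = U.get(s, 0.0) + alpha * (r + gamma * U.get(s1, 0.0) - U.get(s, 0.0))
    return U

# Hypothetical trajectory: back up the grid's step reward along A -> B -> C.
U = {'C': 1.0}  # assume C's utility is already known
for s, s1 in [('B', 'C'), ('A', 'B')]:
    td_update(U, s, s1, -0.04, alpha=0.5)
```

Note that the update touches only the predecessor state of each transition, which is why TD needs no transition counts at all.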
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.7.2" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/notebooks/chapter21/images/mdp.png b/notebooks/chapter21/images/mdp.png new file mode 100644 index 000000000..e874130ee Binary files /dev/null and b/notebooks/chapter21/images/mdp.png differ diff --git a/notebooks/chapter22/Grammar.ipynb b/notebooks/chapter22/Grammar.ipynb new file mode 100644 index 000000000..3c1a2a005 --- /dev/null +++ b/notebooks/chapter22/Grammar.ipynb @@ -0,0 +1,526 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Grammar\n", + "\n", + "Languages can be represented by a set of grammar rules over a lexicon of words. Different languages can be represented by different types of grammar, but in Natural Language Processing we are mainly interested in context-free grammars.\n", + "\n", + "## Context-Free Grammar\n", + "\n", + "Many natural and programming languages can be represented by a **Context-Free Grammar (CFG)**. A CFG is a grammar in which every rule has a single non-terminal symbol on its left-hand side. That means a non-terminal can be replaced by the right-hand side of the rule regardless of context. An example of a CFG:\n", + "\n", + "```\n", + "S -> aSb | ε\n", + "```\n", + "\n", + "That means `S` can be replaced by either `aSb` or `ε` (with `ε` we denote the empty string). The lexicon of the language is comprised of the terminals `a` and `b`, while with `S` we denote the non-terminal symbol. 
In general, non-terminals are capitalized while terminals are not, and we usually name the starting non-terminal `S`. The language generated by the above grammar is $a^nb^n$, for $n \\geq 0$." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Probabilistic Context-Free Grammar\n", + "\n", + "While a simple CFG can be very useful, we might want to know the chance of each rule occurring. Above, we do not know if `S` is more likely to be replaced by `aSb` or `ε`. **Probabilistic Context-Free Grammars (PCFG)** are built to fill exactly that need. Each rule has a probability, given in brackets, and the probabilities for each rule sum up to 1:\n", + "\n", + "```\n", + "S -> aSb [0.7] | ε [0.3]\n", + "```\n", + "\n", + "Now we know it is more likely for `S` to be replaced by `aSb` than by `ε`.\n", + "\n", + "An issue with *PCFGs* is how we will assign the various probabilities to the rules. We could use our knowledge as humans to assign the probabilities, but that is a laborious and error-prone task. Instead, we can *learn* the probabilities from data. Data is categorized as labeled (with correctly parsed sentences, usually called a **treebank**) or unlabeled (given only lexical and syntactic category names).\n", + "\n", + "With labeled data, we can simply count the occurrences. For the above grammar, if we have 100 `S` rules and 30 of them are of the form `S -> ε`, we assign a probability of 0.3 to the transformation.\n", + "\n", + "With unlabeled data, we have to learn both the grammar rules and the probability of each rule. We can go with many approaches, one of them being the **inside-outside** algorithm. It uses a dynamic programming approach that first finds the probability of a substring being generated by each rule and then estimates the probability of each rule." 
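The labeled-data case above is just relative-frequency counting; here is a tiny stand-alone sketch (illustrative only — `rule_probabilities` and the counts are made up, not part of the module):

```python
from collections import Counter

def rule_probabilities(observed_rules):
    """observed_rules: list of (lhs, rhs) pairs read off a treebank's parse trees.

    Returns P(rhs | lhs) as count(lhs -> rhs) / count(lhs).
    """
    lhs_totals = Counter(lhs for lhs, _ in observed_rules)
    rule_counts = Counter(observed_rules)
    return {rule: count / lhs_totals[rule[0]] for rule, count in rule_counts.items()}

# Made-up treebank counts: 70 observed uses of S -> aSb and 30 of S -> ε.
data = [('S', 'aSb')] * 70 + [('S', 'ε')] * 30
probs = rule_probabilities(data)
```

With these counts the learned grammar would be exactly `S -> aSb [0.7] | ε [0.3]`.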
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Chomsky Normal Form\n", + "\n", + "A grammar is in Chomsky Normal Form (or **CNF**, not to be confused with *Conjunctive Normal Form*) if each of its rules takes one of these three forms:\n", + "\n", + "* `X -> Y Z`\n", + "* `A -> a`\n", + "* `S -> ε`\n", + "\n", + "Here *X*, *Y*, *Z*, *A* are non-terminals, *a* is a terminal, *ε* is the empty string and *S* is the start symbol (the start symbol should not appear on the right-hand side of rules). Note that there can be multiple rules for each left-hand side non-terminal, as long as they follow the above. For example, a rule for *X* might be: `X -> Y Z | A B | a | b`.\n", + "\n", + "Of course, we can also have a *CNF* with probabilities.\n", + "\n", + "This type of grammar may seem restrictive, but it can be proven that any context-free grammar can be converted to CNF." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Lexicon\n", + "\n", + "The lexicon of a language is defined as a list of allowable words. These words are grouped into the usual classes: `verbs`, `nouns`, `adjectives`, `adverbs`, `pronouns`, `names`, `articles`, `prepositions` and `conjunctions`. For the first five classes, it is impossible to list all words, since words are continuously being added to the classes. Recently \"google\" was added to the list of verbs, and words like that will continue to pop up and get added to the lists. For that reason, these first five categories are called **open classes**. The rest of the categories have far fewer words and change far more slowly. Words like \"thou\" were commonly used in the past but have declined almost completely in usage; such changes take many decades or centuries to manifest, so we can safely assume the categories will remain static for the foreseeable future. 
Thus, these categories are called **closed classes**.\n", + "\n", + "An example lexicon for a PCFG (note that other classes can also be used according to the language, like `digits`, or `RelPro` for relative pronoun):\n", + "\n", + "```\n", + "Verb -> is [0.3] | say [0.1] | are [0.1] | ...\n", + "Noun -> robot [0.1] | sheep [0.05] | fence [0.05] | ...\n", + "Adjective -> good [0.1] | new [0.1] | sad [0.05] | ...\n", + "Adverb -> here [0.1] | lightly [0.05] | now [0.05] | ...\n", + "Pronoun -> me [0.1] | you [0.1] | he [0.05] | ...\n", + "RelPro -> that [0.4] | who [0.2] | which [0.2] | ...\n", + "Name -> john [0.05] | mary [0.05] | peter [0.01] | ...\n", + "Article -> the [0.35] | a [0.25] | an [0.025] | ...\n", + "Preposition -> to [0.25] | in [0.2] | at [0.1] | ...\n", + "Conjunction -> and [0.5] | or [0.2] | but [0.2] | ...\n", + "Digit -> 1 [0.3] | 2 [0.2] | 0 [0.2] | ...\n", + "```" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Grammar Rules\n", + "\n", + "With grammars we combine words from the lexicon into valid phrases. A grammar is comprised of **grammar rules**. Each rule transforms the left-hand side of the rule into the right-hand side. For example, `A -> B` means that `A` transforms into `B`. Let's build a grammar for the language we started building with the lexicon. 
We will use a PCFG.\n", + "\n", + "```\n", + "S -> NP VP [0.9] | S Conjunction S [0.1]\n", + "\n", + "NP -> Pronoun [0.3] | Name [0.1] | Noun [0.1] | Article Noun [0.25] |\n", + " Article Adjs Noun [0.05] | Digit [0.05] | NP PP [0.1] |\n", + " NP RelClause [0.05]\n", + "\n", + "VP -> Verb [0.4] | VP NP [0.35] | VP Adjective [0.05] | VP PP [0.1] |\n", + " VP Adverb [0.1]\n", + "\n", + "Adjs -> Adjective [0.8] | Adjective Adjs [0.2]\n", + "\n", + "PP -> Preposition NP [1.0]\n", + "\n", + "RelClause -> RelPro VP [1.0]\n", + "```\n", + "\n", + "Some valid phrases the grammar produces: \"`mary is sad`\", \"`you are a robot`\" and \"`she likes mary and a good fence`\".\n", + "\n", + "What if we wanted to check if the phrase \"`mary is sad`\" is actually a valid sentence? We can use a **parse tree** to constructively prove that a string of words is a valid phrase in the given language and even calculate the probability of the generation of the sentence.\n", + "\n", + "![parse_tree](images/parse_tree.png)\n", + "\n", + "The probability of the whole tree can be calculated by multiplying the probabilities of each individual rule transformation: `0.9 * 0.1 * 0.05 * 0.05 * 0.4 * 0.05 * 0.3 = 0.00000135`.\n", + "\n", + "To conserve space, we can also write the tree in linear form:\n", + "\n", + "[S [NP [Name **mary**]] [VP [VP [Verb **is**]] [Adjective **sad**]]]\n", + "\n", + "Unfortunately, the current grammar **overgenerates**, that is, it creates sentences that are not grammatically correct (according to the English language), like \"`the fence are john which say`\". It also **undergenerates**, which means there are valid sentences it does not generate, like \"`he believes mary is sad`\"." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Implementation\n", + "\n", + "In the module, we have implemented both probabilistic and non-probabilistic grammars. Both of these implementations follow the same format. 
There are functions for the lexicon and the rules which can be combined to create a grammar object.\n", + "\n", + "### Non-Probabilistic\n", + "\n", + "Execute the cell below to view the implementations:" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [], + "source": [ + "import os, sys\n", + "sys.path = [os.path.abspath(\"../../\")] + sys.path\n", + "from nlp4e import *\n", + "from notebook4e import psource" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "psource(Lexicon, Rules, Grammar)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's build a lexicon and a grammar for the above language:" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Lexicon {'Verb': ['is', 'say', 'are'], 'Noun': ['robot', 'sheep', 'fence'], 'Adjective': ['good', 'new', 'sad'], 'Adverb': ['here', 'lightly', 'now'], 'Pronoun': ['me', 'you', 'he'], 'RelPro': ['that', 'who', 'which'], 'Name': ['john', 'mary', 'peter'], 'Article': ['the', 'a', 'an'], 'Preposition': ['to', 'in', 'at'], 'Conjunction': ['and', 'or', 'but'], 'Digit': ['1', '2', '0']}\n", + "\n", + "Rules: {'S': [['NP', 'VP'], ['S', 'Conjunction', 'S']], 'NP': [['Pronoun'], ['Name'], ['Noun'], ['Article', 'Noun'], ['Article', 'Adjs', 'Noun'], ['Digit'], ['NP', 'PP'], ['NP', 'RelClause']], 'VP': [['Verb'], ['VP', 'NP'], ['VP', 'Adjective'], ['VP', 'PP'], ['VP', 'Adverb']], 'Adjs': [['Adjective'], ['Adjective', 'Adjs']], 'PP': [['Preposition', 'NP']], 'RelClause': [['RelPro', 'VP']]}\n" + ] + } + ], + "source": [ + "lexicon = Lexicon(\n", + " Verb = \"is | say | are\",\n", + " Noun = \"robot | sheep | fence\",\n", + " Adjective = \"good | new | sad\",\n", + " Adverb = \"here | lightly | now\",\n", + " Pronoun = \"me | you | he\",\n", + " RelPro = \"that | who | which\",\n", + " Name = 
\"john | mary | peter\",\n", + " Article = \"the | a | an\",\n", + " Preposition = \"to | in | at\",\n", + " Conjunction = \"and | or | but\",\n", + " Digit = \"1 | 2 | 0\"\n", + ")\n", + "\n", + "print(\"Lexicon\", lexicon)\n", + "\n", + "rules = Rules(\n", + " S = \"NP VP | S Conjunction S\",\n", + " NP = \"Pronoun | Name | Noun | Article Noun \\\n", + " | Article Adjs Noun | Digit | NP PP | NP RelClause\",\n", + " VP = \"Verb | VP NP | VP Adjective | VP PP | VP Adverb\",\n", + " Adjs = \"Adjective | Adjective Adjs\",\n", + " PP = \"Preposition NP\",\n", + " RelClause = \"RelPro VP\"\n", + ")\n", + "\n", + "print(\"\\nRules:\", rules)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Both the functions return a dictionary with keys to the left-hand side of the rules. For the lexicon, the values are the terminals for each left-hand side non-terminal, while for the rules the values are the right-hand sides as lists.\n", + "\n", + "We can now use the variables `lexicon` and `rules` to build a grammar. After we've done so, we can find the transformations of a non-terminal (the `Noun`, `Verb` and the other basic classes do **not** count as proper non-terminals in the implementation). We can also check if a word is in a particular class." + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "How can we rewrite 'VP'? [['Verb'], ['VP', 'NP'], ['VP', 'Adjective'], ['VP', 'PP'], ['VP', 'Adverb']]\n", + "Is 'the' an article? True\n", + "Is 'here' a noun? 
False\n" + ] + } + ], + "source": [ + "grammar = Grammar(\"A Simple Grammar\", rules, lexicon)\n", + "\n", + "print(\"How can we rewrite 'VP'?\", grammar.rewrites_for('VP'))\n", + "print(\"Is 'the' an article?\", grammar.isa('the', 'Article'))\n", + "print(\"Is 'here' a noun?\", grammar.isa('here', 'Noun'))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Chomsky Normal Form\n", + "\n", + "If the grammar is in **Chomsky Normal Form**, we can call the `cnf_rules` method to get all the rules in the form of `(X, Y, Z)` for each `X -> Y Z` rule. Since the above grammar is not in *CNF* though, we have to create a new one." + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": {}, + "outputs": [], + "source": [ + "E_Chomsky = Grammar(\"E_Prob_Chomsky\", # A Grammar in Chomsky Normal Form\n", + " Rules(\n", + " S = \"NP VP\",\n", + " NP = \"Article Noun | Adjective Noun\",\n", + " VP = \"Verb NP | Verb Adjective\",\n", + " ),\n", + " Lexicon(\n", + " Article = \"the | a | an\",\n", + " Noun = \"robot | sheep | fence\",\n", + " Adjective = \"good | new | sad\",\n", + " Verb = \"is | say | are\"\n", + " ))" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[('S', 'NP', 'VP'), ('NP', 'Article', 'Noun'), ('NP', 'Adjective', 'Noun'), ('VP', 'Verb', 'NP'), ('VP', 'Verb', 'Adjective')]\n" + ] + } + ], + "source": [ + "print(E_Chomsky.cnf_rules())" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Finally, we can generate random phrases using our grammar. Most of them will be complete gibberish, falling under the overgenerated phrases of the grammar. That goes to show that the grammar's valid phrases are far outnumbered by the overgenerated ones." 
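Random generation boils down to recursively expanding non-terminals, picking one right-hand side at a time; a simplified stand-alone version (uniform choices over a hypothetical mini-grammar, unlike the module's weighted `generate_random`) could look like this:

```python
import random

def generate(symbol, rules, lexicon):
    """Recursively expand symbol; terminals are drawn from the lexicon."""
    if symbol in lexicon:                       # terminal class: pick a word
        return random.choice(lexicon[symbol])
    if symbol in rules:                         # non-terminal: pick a right-hand side
        rhs = random.choice(rules[symbol])
        return ' '.join(generate(s, rules, lexicon) for s in rhs)
    return symbol                               # literal symbol, emit as-is

# Hypothetical mini-grammar, far smaller than the one built above.
mini_rules = {'S': [['NP', 'VP']], 'NP': [['Article', 'Noun']], 'VP': [['Verb']]}
mini_lexicon = {'Article': ['the', 'a'], 'Noun': ['robot', 'sheep'], 'Verb': ['is', 'say']}
print(generate('S', mini_rules, mini_lexicon))
```

Because every expansion is chosen independently of its context, the output is grammatical with respect to the rules but usually meaningless, which is exactly the overgeneration discussed above.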
+ ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "'a fence is 2 at 0 at he at john the fence at a good new sheep in the new sad robot which is who is a good robot which are good sad new now lightly sad at 2 and me are'" + ] + }, + "execution_count": 8, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "grammar.generate_random('S')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Probabilistic\n", + "\n", + "The probabilistic grammars follow the same approach. They take strings as input, are assembled from rules and a lexicon, and can generate random sentences (together with the probability of each sentence). The main difference is that in the lexicon we have tuples (terminal, probability) instead of strings and for the rules, we have a list of tuples (list of non-terminals, probability) instead of the list of lists of non-terminals.\n", + "\n", + "Execute the cells to read the code:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "psource(ProbLexicon, ProbRules, ProbGrammar)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's build a lexicon and rules for the probabilistic grammar:" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Lexicon {'Verb': [('is', 0.5), ('say', 0.3), ('are', 0.2)], 'Noun': [('robot', 0.4), ('sheep', 0.4), ('fence', 0.2)], 'Adjective': [('good', 0.5), ('new', 0.2), ('sad', 0.3)], 'Adverb': [('here', 0.6), ('lightly', 0.1), ('now', 0.3)], 'Pronoun': [('me', 0.3), ('you', 0.4), ('he', 0.3)], 'RelPro': [('that', 0.5), ('who', 0.3), ('which', 0.2)], 'Name': [('john', 0.4), ('mary', 0.4), ('peter', 0.2)], 'Article': [('the', 0.5), ('a', 0.25), ('an', 0.25)], 'Preposition': [('to', 0.4), ('in', 0.3), ('at', 0.3)], 
'Conjunction': [('and', 0.5), ('or', 0.2), ('but', 0.3)], 'Digit': [('0', 0.35), ('1', 0.35), ('2', 0.3)]}\n", + "\n", + "Rules: {'S': [(['NP', 'VP'], 0.6), (['S', 'Conjunction', 'S'], 0.4)], 'NP': [(['Pronoun'], 0.2), (['Name'], 0.05), (['Noun'], 0.2), (['Article', 'Noun'], 0.15), (['Article', 'Adjs', 'Noun'], 0.1), (['Digit'], 0.05), (['NP', 'PP'], 0.15), (['NP', 'RelClause'], 0.1)], 'VP': [(['Verb'], 0.3), (['VP', 'NP'], 0.2), (['VP', 'Adjective'], 0.25), (['VP', 'PP'], 0.15), (['VP', 'Adverb'], 0.1)], 'Adjs': [(['Adjective'], 0.5), (['Adjective', 'Adjs'], 0.5)], 'PP': [(['Preposition', 'NP'], 1.0)], 'RelClause': [(['RelPro', 'VP'], 1.0)]}\n" + ] + } + ], + "source": [ + "lexicon = ProbLexicon(\n", + " Verb = \"is [0.5] | say [0.3] | are [0.2]\",\n", + " Noun = \"robot [0.4] | sheep [0.4] | fence [0.2]\",\n", + " Adjective = \"good [0.5] | new [0.2] | sad [0.3]\",\n", + " Adverb = \"here [0.6] | lightly [0.1] | now [0.3]\",\n", + " Pronoun = \"me [0.3] | you [0.4] | he [0.3]\",\n", + " RelPro = \"that [0.5] | who [0.3] | which [0.2]\",\n", + " Name = \"john [0.4] | mary [0.4] | peter [0.2]\",\n", + " Article = \"the [0.5] | a [0.25] | an [0.25]\",\n", + " Preposition = \"to [0.4] | in [0.3] | at [0.3]\",\n", + " Conjunction = \"and [0.5] | or [0.2] | but [0.3]\",\n", + " Digit = \"0 [0.35] | 1 [0.35] | 2 [0.3]\"\n", + ")\n", + "\n", + "print(\"Lexicon\", lexicon)\n", + "\n", + "rules = ProbRules(\n", + " S = \"NP VP [0.6] | S Conjunction S [0.4]\",\n", + " NP = \"Pronoun [0.2] | Name [0.05] | Noun [0.2] | Article Noun [0.15] \\\n", + " | Article Adjs Noun [0.1] | Digit [0.05] | NP PP [0.15] | NP RelClause [0.1]\",\n", + " VP = \"Verb [0.3] | VP NP [0.2] | VP Adjective [0.25] | VP PP [0.15] | VP Adverb [0.1]\",\n", + " Adjs = \"Adjective [0.5] | Adjective Adjs [0.5]\",\n", + " PP = \"Preposition NP [1]\",\n", + " RelClause = \"RelPro VP [1]\"\n", + ")\n", + "\n", + "print(\"\\nRules:\", rules)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": 
[ + "Let's use the above to assemble our probabilistic grammar and run some simple queries:" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "How can we rewrite 'VP'? [(['Verb'], 0.3), (['VP', 'NP'], 0.2), (['VP', 'Adjective'], 0.25), (['VP', 'PP'], 0.15), (['VP', 'Adverb'], 0.1)]\n", + "Is 'the' an article? True\n", + "Is 'here' a noun? False\n" + ] + } + ], + "source": [ + "grammar = ProbGrammar(\"A Simple Probabilistic Grammar\", rules, lexicon)\n", + "\n", + "print(\"How can we rewrite 'VP'?\", grammar.rewrites_for('VP'))\n", + "print(\"Is 'the' an article?\", grammar.isa('the', 'Article'))\n", + "print(\"Is 'here' a noun?\", grammar.isa('here', 'Noun'))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "If we have a grammar in *CNF*, we can get a list of all the rules. Let's create a grammar in the form and print the *CNF* rules:" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": {}, + "outputs": [], + "source": [ + "E_Prob_Chomsky = ProbGrammar(\"E_Prob_Chomsky\", # A Probabilistic Grammar in CNF\n", + " ProbRules(\n", + " S = \"NP VP [1]\",\n", + " NP = \"Article Noun [0.6] | Adjective Noun [0.4]\",\n", + " VP = \"Verb NP [0.5] | Verb Adjective [0.5]\",\n", + " ),\n", + " ProbLexicon(\n", + " Article = \"the [0.5] | a [0.25] | an [0.25]\",\n", + " Noun = \"robot [0.4] | sheep [0.4] | fence [0.2]\",\n", + " Adjective = \"good [0.5] | new [0.2] | sad [0.3]\",\n", + " Verb = \"is [0.5] | say [0.3] | are [0.2]\"\n", + " ))" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[('S', 'NP', 'VP', 1.0), ('NP', 'Article', 'Noun', 0.6), ('NP', 'Adjective', 'Noun', 0.4), ('VP', 'Verb', 'NP', 0.5), ('VP', 'Verb', 'Adjective', 0.5)]\n" + ] + } + ], + "source": [ + "print(E_Prob_Chomsky.cnf_rules())" + ] + 
}, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Lastly, we can generate random sentences from this grammar. The function `generate_random` returns a tuple (sentence, probability)." + ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "a good good new good sheep that say a good good robot the sad robot to 1 to me you to sheep are\n", + "5.511240000000004e-26\n" + ] + } + ], + "source": [ + "sentence, prob = grammar.generate_random('S')\n", + "print(sentence)\n", + "print(prob)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As with the non-probabilistic grammars, this one mostly overgenerates. You can also see that the probability is very, very low, which means there is a huge number of possible sentences (in this case infinite, since we have recursion; notice how `VP` can produce another `VP`, for example)." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.7.2" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/notebooks/chapter22/Introduction.ipynb b/notebooks/chapter22/Introduction.ipynb new file mode 100644 index 000000000..0905b91a9 --- /dev/null +++ b/notebooks/chapter22/Introduction.ipynb @@ -0,0 +1,92 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# NATURAL LANGUAGE PROCESSING\n", + "\n", + "The notebooks in this folder cover chapter 23 of the book *Artificial Intelligence: A Modern Approach*, 4th Edition. 
The implementations of the algorithms can be found in [nlp4e.py](https://github.com/aimacode/aima-python/blob/master/nlp4e.py).\n", + "\n", + "Run the cell below to import the code from the module and get started!" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [], + "source": [ + "import os, sys\n", + "sys.path = [os.path.abspath(\"../../\")] + sys.path\n", + "from nlp4e import *\n", + "from notebook4e import psource" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## OVERVIEW\n", + "\n", + "**Natural Language Processing (NLP)** is a field of AI concerned with understanding, analyzing and using natural languages. It is considered a difficult yet intriguing field of study, since it is connected to how humans and their languages work.\n", + "\n", + "Applications of the field include translation, speech recognition, topic segmentation, information extraction and retrieval, and a lot more.\n", + "\n", + "Below we take a look at some algorithms in the field. Before we get right into it though, we will take a look at a very useful class of languages, **context-free** languages. Even though they are a bit restrictive, they have been used a lot in natural language processing research.\n", + "\n", + "Below is a summary of the demonstration files in this chapter." 
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## CONTENTS\n", + "\n", + "- Introduction: Introduction to the field of NLP and the table of contents.\n", + "- Grammars: Introduction to grammar rules and the lexicon of words of a language.\n", + " - Context-free Grammar\n", + " - Probabilistic Context-Free Grammar\n", + " - Chomsky Normal Form\n", + " - Lexicon\n", + " - Grammar Rules\n", + " - Implementation of Different Grammars\n", + "- Parsing: Algorithms that parse sentences according to a given grammar.\n", + " - Chart Parsing\n", + " - CYK Parsing\n", + " - A-star Parsing\n", + " - Beam Search Parsing\n", + " " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.7.2" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/notebooks/chapter22/Parsing.ipynb b/notebooks/chapter22/Parsing.ipynb new file mode 100644 index 000000000..50a4264fb --- /dev/null +++ b/notebooks/chapter22/Parsing.ipynb @@ -0,0 +1,522 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Parsing\n", + "\n", + "## Overview\n", + "\n", + "Syntactic analysis (or **parsing**) of a sentence is the process of uncovering the phrase structure of the sentence according to the rules of grammar. \n", + "\n", + "There are two main approaches to parsing: *top-down*, where we start with the starting symbol and build a parse tree with the given words as its leaves, and *bottom-up*, where we start from the given words and build a tree that has the starting symbol as its root. 
Both approaches involve \"guessing\" ahead, so parsing a sentence may take a while (a wrong guess means a lot of backtracking). Thankfully, much of that effort would go into re-analyzing already analyzed substrings, so we can follow a dynamic programming approach and store these partial parses for reuse instead of recomputing them. \n", + "\n", + "In dynamic programming, we use a data structure known as a chart, and thus algorithms that parse with a chart are called **chart parsing** algorithms. We will cover several different chart parsing algorithms." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Chart Parsing\n", + "\n", + "### Overview\n", + "\n", + "The chart parsing algorithm is a general form of the following algorithms. Given a non-probabilistic grammar and a sentence, this algorithm builds a parse tree in a top-down manner, with the words of the sentence as the leaves. It works with a dynamic programming approach, building a chart to store parses for substrings so that it doesn't have to analyze them again (just like the CYK algorithm). Each non-terminal, starting from S, gets replaced by its right-hand side rules in the chart until we end up with the correct parses.\n", + "\n", + "### Implementation\n", + "\n", + "A parse is in the form `[start, end, non-terminal, sub-tree, expected-transformation]`, where `sub-tree` is a tree with the corresponding `non-terminal` as its root and `expected-transformation` is a right-hand side rule of the `non-terminal`.\n", + "\n", + "Chart parsing is implemented in a class, `Chart`. It is initialized with a grammar and can return the list of all the parses of a sentence with the `parses` function.\n", + "\n", + "The chart is a list of lists. The lists correspond to the lengths of substrings (including the empty string), from start to finish. 
When we say 'a point in the chart', we refer to a list of a certain length.\n", + "\n", + "A quick rundown of the class functions:" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "* `parses`: Returns a list of parses for a given sentence. If the sentence can't be parsed, it will return an empty list. Initializes the process by calling `parse` from the starting symbol.\n", + "\n", + "\n", + "* `parse`: Parses the list of words and builds the chart.\n", + "\n", + "\n", + "* `add_edge`: Adds another edge to the chart at a given point. Also, examines whether the edge extends or predicts another edge. If the edge itself is not expecting a transformation, it will extend other edges and it will predict edges otherwise.\n", + "\n", + "\n", + "* `scanner`: Given a word and a point in the chart, it extends edges that were expecting a transformation that can result in the given word. For example, if the word 'the' is an 'Article' and we are examining two edges at a chart's point, with one expecting an 'Article' and the other a 'Verb', the first one will be extended while the second one will not.\n", + "\n", + "\n", + "* `predictor`: If an edge can't extend other edges (because it is expecting a transformation itself), we will add to the chart rules/transformations that can help extend the edge. The new edges come from the right-hand side of the expected transformation's rules. For example, if an edge is expecting the transformation 'Adjective Noun', we will add to the chart an edge for each right-hand side rule of the non-terminal 'Adjective'.\n", + "\n", + "\n", + "* `extender`: Extends edges given an edge (called `E`). If `E`'s non-terminal is the same as the expected transformation of another edge (let's call it `A`), add to the chart a new edge with the non-terminal of `A` and the transformations of `A` minus the non-terminal that matched with `E`'s non-terminal. 
For example, if an edge `E` has 'Article' as its non-terminal and is expecting no transformation, we need to see what edges it can extend. Let's examine the edge `N`. This expects a transformation of 'Noun Verb'. 'Noun' does not match with 'Article', so we move on. Another edge, `A`, expects a transformation of 'Article Noun' and has a non-terminal of 'NP'. We have a match! A new edge will be added with 'NP' as its non-terminal (the non-terminal of `A`) and 'Noun' as the expected transformation (the rest of the expected transformation of `A`).\n", + "\n", + "You can view the source code by running the cell below:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "psource(Chart)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Example\n", + "\n", + "We will use the grammar `E0` to parse the sentence \"the stench is in 2 2\".\n", + "\n", + "First, we need to build a `Chart` object:" + ] + }, + { + "cell_type": "code", + "execution_count": 43, + "metadata": {}, + "outputs": [], + "source": [ + "chart = Chart(E0)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "And then we simply call the `parses` function:" + ] + }, + { + "cell_type": "code", + "execution_count": 44, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[[0, 6, 'S', [[0, 2, 'NP', [('Article', 'the'), ('Noun', 'stench')], []], [2, 6, 'VP', [[2, 3, 'VP', [('Verb', 'is')], []], [3, 6, 'PP', [('Preposition', 'in'), [4, 6, 'NP', [('Digit', '2'), ('Digit', '2')], []]], []]], []]], []]]\n" + ] + } + ], + "source": [ + "print(chart.parses('the stench is in 2 2'))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can see which edges get added by setting the optional initialization argument `trace` to `True`." 
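The extender step described above can also be sketched on its own, outside the `Chart` class. The five-element edge layout and the function name below are a simplification of ours, not the aima-python implementation:

```python
# A simplified sketch of the extender step (our own edge layout, not the
# aima-python Chart class). An edge is (start, end, lhs, found, expects).
def extend(incomplete, complete):
    """Return the incomplete edge advanced over the complete one, or None."""
    i_start, i_end, i_lhs, i_found, i_expects = incomplete
    c_start, c_end, c_lhs, c_found, c_expects = complete
    if c_expects or not i_expects or i_end != c_start:
        return None                     # need a complete, adjacent edge
    if i_expects[0] != c_lhs:
        return None                     # the supplied category must match
    return (i_start, c_end, i_lhs, i_found + [c_lhs], i_expects[1:])

np_edge = (0, 0, 'NP', [], ['Article', 'Noun'])   # NP expecting 'Article Noun'
art_edge = (0, 1, 'Article', ['the'], [])         # a complete 'Article' edge
print(extend(np_edge, art_edge))
# (0, 1, 'NP', ['Article'], ['Noun'])
```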
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "chart_trace = Chart(E0, trace=True)\n", + "chart_trace.parses('the stench is in 2 2')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's try and parse a sentence that is not recognized by the grammar:" + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[]\n" + ] + } + ], + "source": [ + "print(chart.parses('the stench 2 2'))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "An empty list was returned." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## CYK Parse\n", + "\n", + "The *CYK Parsing Algorithm* (named after its inventors, Cocke, Younger, and Kasami) utilizes dynamic programming to parse sentences of a grammar in *Chomsky Normal Form*.\n", + "\n", + "The CYK algorithm returns an *M x N x N* array (named *P*), where *N* is the number of words in the sentence and *M* the number of non-terminal symbols in the grammar. Each element in this array shows the probability of a substring being derived from a particular non-terminal. To find the most probable parse of the sentence, a search in the resulting array is required. Search heuristic algorithms work well in this space, and we can derive the heuristics from the properties of the grammar.\n", + "\n", + "In short, the algorithm works like this: there is an outer loop that determines the length of the substring. Then the algorithm loops through the words in the sentence. For each word, it again loops through all the words to its right up to the first-loop length. The substring it will work on in this iteration consists of the words starting at the second-loop word, spanning the first-loop length. 
Finally, it loops through all the rules in the grammar and updates the substring's probability for each right-hand side non-terminal." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Implementation\n", + "\n", + "The implementation takes as input a list of words and a probabilistic grammar (from the `ProbGrammar` class detailed above) in CNF and returns the table/dictionary *P*. An item's key in *P* is a tuple in the form `(Non-terminal, the start of a substring, length of substring)`, and the value is a `Tree` object. The `Tree` data structure has two attributes: `root` and `leaves`. `root` stores the value of the current tree node and `leaves` is a list of child nodes, which may be terminal states (words in the sentence) or subtrees.\n", + "\n", + "For example, for the sentence \"the monkey is dancing\" and the substring \"the monkey\", an item can be `('NP', 0, 2): `, which means the first two words (the substring starting at index 0 with length 2) can be parsed as an `NP`, with the detailed operations recorded by a `Tree` object.\n", + "\n", + "Before we continue, you can take a look at the source code by running the cell below:" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [], + "source": [ + "import os, sys\n", + "sys.path = [os.path.abspath(\"../../\")] + sys.path\n", + "from nlp4e import *\n", + "from notebook4e import psource" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "psource(CYK_parse)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "When updating the probability of a substring, we pick the max of its current one and the probability of the substring broken into two parts: one from the second-loop word with third-loop length, and the other from the first part's end to the remainder of the first-loop length.\n", + "\n", + "### Example\n", + "\n", + "Let's build a probabilistic grammar in CNF:" 
+ ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [], + "source": [ + "E_Prob_Chomsky = ProbGrammar(\"E_Prob_Chomsky\", # A Probabilistic Grammar in CNF\n", + " ProbRules(\n", + " S = \"NP VP [1]\",\n", + " NP = \"Article Noun [0.6] | Adjective Noun [0.4]\",\n", + " VP = \"Verb NP [0.5] | Verb Adjective [0.5]\",\n", + " ),\n", + " ProbLexicon(\n", + " Article = \"the [0.5] | a [0.25] | an [0.25]\",\n", + " Noun = \"robot [0.4] | sheep [0.4] | fence [0.2]\",\n", + " Adjective = \"good [0.5] | new [0.2] | sad [0.3]\",\n", + " Verb = \"is [0.5] | say [0.3] | are [0.2]\"\n", + " ))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now let's see the probability table for the sentence \"the robot is good\":" + ] + }, + { + "cell_type": "code", + "execution_count": 38, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "defaultdict(, {('Article', 0, 0): , ('Noun', 1, 1): , ('Verb', 2, 2): , ('Adjective', 3, 3): , ('VP', 2, 3): })\n" + ] + } + ], + "source": [ + "words = ['the', 'robot', 'is', 'good']\n", + "grammar = E_Prob_Chomsky\n", + "\n", + "P = CYK_parse(words, grammar)\n", + "print(P)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "A `defaultdict` object is returned (`defaultdict` is basically a dictionary but with a default value/type). Keys are tuples in the form mentioned above and the values are the corresponding parse trees, which demonstrate how the sentence will be parsed. 
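To make the loop structure described earlier concrete, here is a toy, self-contained CYK sketch over a hand-written CNF grammar. The data layout (plain dicts of probabilities) is our own simplification; the real `CYK_parse` works on `ProbGrammar` objects and records `Tree`s:

```python
from collections import defaultdict

# A toy CYK sketch; probabilities mirror E_Prob_Chomsky above, but this is
# not the nlp4e implementation.
lexicon = {'the': [('Article', 0.5)], 'robot': [('Noun', 0.4)],
           'is': [('Verb', 0.5)], 'good': [('Adjective', 0.5)]}
rules = {('Article', 'Noun'): [('NP', 0.6)],
         ('Verb', 'Adjective'): [('VP', 0.5)],
         ('NP', 'VP'): [('S', 1.0)]}

def cyk(sentence):
    P = defaultdict(float)                # (category, start, length) -> probability
    for i, word in enumerate(sentence):   # length-1 spans come from the lexicon
        for category, p in lexicon.get(word, []):
            P[category, i, 1] = p
    n = len(sentence)
    for length in range(2, n + 1):        # longer spans, shortest first
        for start in range(n - length + 1):
            for mid in range(1, length):  # split point inside the span
                for (B, C), expansions in rules.items():
                    left = P[B, start, mid]
                    right = P[C, start + mid, length - mid]
                    for A, p in expansions:
                        P[A, start, length] = max(P[A, start, length],
                                                  p * left * right)
    return P

P = cyk(['the', 'robot', 'is', 'good'])
print(P['S', 0, 4])   # 1.0 * (0.6*0.5*0.4) * (0.5*0.5*0.5), roughly 0.015
```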
Let's check the details of each parsing:" + ] + }, + { + "cell_type": "code", + "execution_count": 40, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "{('Article', 0, 0): ['the'], ('Noun', 1, 1): ['robot'], ('Verb', 2, 2): ['is'], ('Adjective', 3, 3): ['good'], ('VP', 2, 3): [, ]}\n" + ] + } + ], + "source": [ + "parses = {k: p.leaves for k, p in P.items()}\n", + "\n", + "print(parses)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Please note that each item in the returned dict represents a parsing strategy. For instance, `('Article', 0, 0): ['the']` means parsing the article at position 0 from the word `the`. The key `'VP', 2, 3` is mapped to another `Tree`, which means this is a nested parsing step. If we print this item in detail: " + ] + }, + { + "cell_type": "code", + "execution_count": 41, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "['is']\n", + "['good']\n" + ] + } + ], + "source": [ + "for subtree in P['VP', 2, 3].leaves:\n", + " print(subtree.leaves)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "So we can interpret this step as parsing the words at indices 2 and 3 together ('is' and 'good') as a verb phrase." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## A-star Parsing\n", + "\n", + "The CYK algorithm uses $O(n^2m)$ space for the *P* and *T* tables, where *n* is the number of words in the sentence and *m* is the number of nonterminal symbols in the grammar, and takes $O(n^3m)$ time. This is the best we can do if we want to find the best parse and must handle all possible context-free grammars. But actually, we only want to parse natural languages, not all possible grammars, which allows us to apply more efficient algorithms.\n", + "\n", + "By applying A-star search, we treat parsing as state-space search and can achieve $O(n)$ running time. 
In this situation, each state is a list of items (words or categories), the start state is the list of words, and a goal state is the single item S. \n", + "\n", + "In our code, we implemented a demonstration of `astar_search_parsing`, which deals with the text parsing problem. By specifying different `words` and `grammar` arguments, we can use this search strategy for different text parsing problems. The algorithm returns the start symbol `'S'` if the input words form a sentence under the given grammar, and `False` otherwise.\n", + "\n", + "For the detailed implementation, please execute the following block:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "psource(astar_search_parsing)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Example\n", + "\n", + "Now let's try the \"the wumpus is dead\" example. First we need to define the grammar and the words in the sentence." + ] + }, + { + "cell_type": "code", + "execution_count": 68, + "metadata": {}, + "outputs": [], + "source": [ + "grammar = E0\n", + "words = ['the', 'wumpus', 'is', 'dead']" + ] + }, + { + "cell_type": "code", + "execution_count": 66, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "'S'" + ] + }, + "execution_count": 66, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "astar_search_parsing(words, grammar)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The algorithm returns `'S'`, which means it recognizes the input as a sentence. 
If we change the order of the words to make the sentence unreadable:" + ] + }, + { + "cell_type": "code", + "execution_count": 69, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "False" + ] + }, + "execution_count": 69, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "words_swaped = [\"the\", \"is\", \"wumpus\", \"dead\"]\n", + "astar_search_parsing(words_swaped, grammar)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The algorithm then asserts that our words cannot form a sentence." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Beam Search Parsing\n", + "\n", + "In beam search parsing, we still treat text parsing as a state-space search problem. When using beam search, we consider only the *b* most probable alternative parses. This means we are not guaranteed to find the parse with the highest probability, but (with a careful implementation) the parser can operate in $O(n)$ time and still find the best parse most of the time. A beam search parser with *b* = 1 is called a **deterministic parser**.\n", + "\n", + "### Implementation\n", + "\n", + "During the search, we maintain `frontier`, a priority queue that keeps track of the current search frontier. 
In each step, we explore all the states in `frontier` and save the best *b* resulting states as the frontier for the next step.\n", + "\n", + "For the detailed implementation, please run the following code:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "psource(beam_search_parsing)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Example\n", + "\n", + "Let's try both the positive and the negative wumpus example on this algorithm:" + ] + }, + { + "cell_type": "code", + "execution_count": 70, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "'S'" + ] + }, + "execution_count": 70, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "beam_search_parsing(words, grammar)" + ] + }, + { + "cell_type": "code", + "execution_count": 71, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "False" + ] + }, + "execution_count": 71, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "beam_search_parsing(words_swaped, grammar)" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.7.2" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/notebooks/chapter22/images/parse_tree.png b/notebooks/chapter22/images/parse_tree.png new file mode 100644 index 000000000..f6ca87b2f Binary files /dev/null and b/notebooks/chapter22/images/parse_tree.png differ diff --git a/notebooks/chapter22/nlp_apps.ipynb b/notebooks/chapter22/nlp_apps.ipynb new file mode 100644 index 000000000..bd38efadf --- /dev/null +++ b/notebooks/chapter22/nlp_apps.ipynb @@ -0,0 +1,1038 @@ +{ + "cells": [ + 
{ + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# NATURAL LANGUAGE PROCESSING APPLICATIONS\n", + "\n", + "In this notebook we will take a look at some indicative applications of natural language processing. We will cover content from [`nlp.py`](https://github.com/aimacode/aima-python/blob/master/nlp.py) and [`text.py`](https://github.com/aimacode/aima-python/blob/master/text.py), for chapters 22 and 23 of Stuart Russell's and Peter Norvig's book [*Artificial Intelligence: A Modern Approach*](http://aima.cs.berkeley.edu/)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## CONTENTS\n", + "\n", + "* Language Recognition\n", + "* Author Recognition\n", + "* The Federalist Papers\n", + "* Text Classification" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# LANGUAGE RECOGNITION\n", + "\n", + "A very useful application of text models (you can read more on them on the [`text notebook`](https://github.com/aimacode/aima-python/blob/master/text.ipynb)) is categorizing text into a language. In fact, with enough data we can correctly categorize almost any text. That is because different languages have certain characteristics that set them apart. For example, in German it is very common for 'c' to be followed by 'h', while in English we see 't' followed by 'h' a lot.\n", + "\n", + "Here we will build an application to categorize sentences as either English or German.\n", + "\n", + "First we need to build our dataset. We will take as input text in English and in German and we will extract n-gram character models (in this case, *bigrams* for n=2). For English, we will use *Flatland* by Edwin Abbott and for German *Faust* by Goethe.\n", + "\n", + "Let's build our text models for each language, which will hold the probability of each bigram occurring in the text." 
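As a rough illustration of what such a model counts (a simplified stand-in; `NgramCharModel`'s exact padding and counting conventions may differ), character bigrams over the words of a text can be tallied like this:

```python
from collections import Counter

# Simplified bigram counting sketch (assumed conventions: lowercase the text,
# pad each word with one leading space, count adjacent character pairs).
def char_bigrams(text):
    counts = Counter()
    for word in text.lower().split():
        padded = ' ' + word
        counts.update(zip(padded, padded[1:]))
    return counts

print(char_bigrams("the thin cat")[('t', 'h')])   # 2
```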
+ ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": {}, + "outputs": [], + "source": [ + "from utils import open_data\n", + "from text import *\n", + "\n", + "flatland = open_data(\"EN-text/flatland.txt\").read()\n", + "wordseq = words(flatland)\n", + "\n", + "P_flatland = NgramCharModel(2, wordseq)\n", + "\n", + "faust = open_data(\"GE-text/faust.txt\").read()\n", + "wordseq = words(faust)\n", + "\n", + "P_faust = NgramCharModel(2, wordseq)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can use this information to build a *Naive Bayes Classifier* that will be used to categorize sentences (you can read more on Naive Bayes on the [`learning notebook`](https://github.com/aimacode/aima-python/blob/master/learning.ipynb)). The classifier will take as input the probability distribution of bigrams and, given a list of bigrams (extracted from the sentence to be classified), it will calculate the probability of the example/sentence coming from each language and pick the maximum.\n", + "\n", + "Let's build our classifier, with the assumption that English is as probable as German (the input is a dictionary whose keys are `(language, probability)` tuples and whose values are the corresponding text models):" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [], + "source": [ + "from learning import NaiveBayesLearner\n", + "\n", + "dist = {('English', 1): P_flatland, ('German', 1): P_faust}\n", + "\n", + "nBS = NaiveBayesLearner(dist, simple=True)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now we need to write a function that takes as input a sentence, breaks it into a list of bigrams and classifies it with the Naive Bayes classifier from above.\n", + "\n", + "Once we get the text model for the sentence, we need to unravel it. The text models show the probability of each bigram, but the classifier can't handle that extra data. It requires a simple *list* of bigrams. 
So, if the text model shows that a bigram appears three times, we need to add it three times in the list. Since the text model stores the n-gram information in a dictionary (with the key being the n-gram and the value the number of times the n-gram appears) we need to iterate through the items of the dictionary and manually add them to the list of n-grams." + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [], + "source": [ + "def recognize(sentence, nBS, n):\n", + " sentence = sentence.lower()\n", + " wordseq = words(sentence)\n", + " \n", + " P_sentence = NgramCharModel(n, wordseq)\n", + " \n", + " ngrams = []\n", + " for b, p in P_sentence.dictionary.items():\n", + " ngrams += [b]*p\n", + " \n", + " print(ngrams)\n", + " \n", + " return nBS(ngrams)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now we can start categorizing sentences." + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[(' ', 'i'), ('i', 'c'), ('c', 'h'), (' ', 'b'), ('b', 'i'), ('i', 'n'), ('i', 'n'), (' ', 'e'), ('e', 'i'), (' ', 'p'), ('p', 'l'), ('l', 'a'), ('a', 't'), ('t', 'z')]\n" + ] + }, + { + "data": { + "text/plain": [ + "'German'" + ] + }, + "execution_count": 4, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "recognize(\"Ich bin ein platz\", nBS, 2)" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[(' ', 't'), ('t', 'u'), ('u', 'r'), ('r', 't'), ('t', 'l'), ('l', 'e'), ('e', 's'), (' ', 'f'), ('f', 'l'), ('l', 'y'), (' ', 'h'), ('h', 'i'), ('i', 'g'), ('g', 'h')]\n" + ] + }, + { + "data": { + "text/plain": [ + "'English'" + ] + }, + "execution_count": 5, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "recognize(\"Turtles fly high\", nBS, 
2)" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[(' ', 'd'), ('d', 'e'), ('e', 'r'), ('e', 'r'), (' ', 'p'), ('p', 'e'), ('e', 'l'), ('l', 'i'), ('i', 'k'), ('k', 'a'), ('a', 'n'), (' ', 'i'), ('i', 's'), ('s', 't'), (' ', 'h'), ('h', 'i'), ('i', 'e')]\n" + ] + }, + { + "data": { + "text/plain": [ + "'German'" + ] + }, + "execution_count": 6, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "recognize(\"Der pelikan ist hier\", nBS, 2)" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[(' ', 'a'), ('a', 'n'), ('n', 'd'), (' ', 't'), (' ', 't'), ('t', 'h'), ('t', 'h'), ('h', 'u'), ('u', 's'), ('h', 'e'), (' ', 'w'), ('w', 'i'), ('i', 'z'), ('z', 'a'), ('a', 'r'), ('r', 'd'), (' ', 's'), ('s', 'p'), ('p', 'o'), ('o', 'k'), ('k', 'e')]\n" + ] + }, + { + "data": { + "text/plain": [ + "'English'" + ] + }, + "execution_count": 7, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "recognize(\"And thus the wizard spoke\", nBS, 2)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can add more languages if you want, the algorithm works for as many as you like! Also, you can play around with *n*. Here we used 2, but other numbers work too (even though 2 suffices). The algorithm is not perfect, but it has high accuracy even for small samples like the ones we used. That is because English and German are very different languages. The closer together languages are (for example, Norwegian and Swedish share a lot of common ground) the lower the accuracy of the classifier." 
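The classification step itself reduces to comparing sums of log-probabilities across languages. Here is a minimal sketch with add-one smoothing (this is an assumed, simplified interface; the repo's `NaiveBayesLearner` works differently):

```python
import math
from collections import Counter

# Minimal Naive Bayes sketch over bigram counts (illustrative only).
def classify(models, ngrams):
    best_label, best_score = None, -math.inf
    for label, counts in models.items():
        total = sum(counts.values()) or 1
        vocabulary = len(counts) or 1
        # Add-one smoothing keeps unseen bigrams from zeroing the product.
        score = sum(math.log((counts[g] + 1) / (total + vocabulary))
                    for g in ngrams)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

english = Counter({('t', 'h'): 9, ('h', 'e'): 7})
german = Counter({('c', 'h'): 9, ('e', 'i'): 7})
print(classify({'English': english, 'German': german},
               [('t', 'h'), ('h', 'e')]))   # English
```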
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## AUTHOR RECOGNITION\n", + "\n", + "An application similar to language recognition is recognizing who is more likely to have written a sentence, given text written by them. Here we will try to predict text from Edwin Abbott and Jane Austen. They wrote *Flatland* and *Pride and Prejudice* respectively.\n", + "\n", + "We are optimistic we can determine who wrote what based on the fact that Abbott wrote his novella at a much later date than Austen, which means there will be linguistic differences between the two works. Indeed, *Flatland* uses more modern and direct language while *Pride and Prejudice* is written in a more archaic tone containing more sophisticated wording.\n", + "\n", + "As with language recognition, we will first import the two datasets. This time though we are not looking for connections between characters, since that wouldn't give very good results. Why? Because both authors use English, and English follows a set of patterns, as we showed earlier. Trying to determine authorship based on these patterns would not be very effective.\n", + "\n", + "Instead, we will abstract our querying to a higher level. We will use words instead of characters. 
That way we can more accurately pick out the differences between their writing styles and thus have a better chance at guessing the correct author.\n", + "\n", + "Let's go right ahead and import our data:" + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": {}, + "outputs": [], + "source": [ + "from utils import open_data\n", + "from text import *\n", + "\n", + "flatland = open_data(\"EN-text/flatland.txt\").read()\n", + "wordseq = words(flatland)\n", + "\n", + "P_Abbott = UnigramWordModel(wordseq, 5)\n", + "\n", + "pride = open_data(\"EN-text/pride.txt\").read()\n", + "wordseq = words(pride)\n", + "\n", + "P_Austen = UnigramWordModel(wordseq, 5)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This time we set the `default` parameter of the model to 5, instead of 0. If we leave it at 0, then when we get a sentence containing a word we have not seen from that particular author, the chance of that sentence coming from that author is exactly 0 (since to get the probability, we multiply all the separate probabilities; if one is 0 then the result is also 0). To avoid that, we tell the model to add 5 to the count of all the words that appear.\n", + "\n", + "Next we will build the Naive Bayes Classifier:" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": {}, + "outputs": [], + "source": [ + "from learning import NaiveBayesLearner\n", + "\n", + "dist = {('Abbott', 1): P_Abbott, ('Austen', 1): P_Austen}\n", + "\n", + "nBS = NaiveBayesLearner(dist, simple=True)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now that we have built our classifier, we will start classifying. First, we need to convert the given sentence to the format the classifier needs. That is, a list of words." 
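The effect of a nonzero `default` can be seen in a toy version of the scoring (a simplified, unnormalized sketch of ours, not `UnigramWordModel`'s actual probability computation):

```python
from collections import Counter

# Toy illustration of why default=0 is a problem: a single unseen word
# makes the whole product of per-word scores collapse to zero.
def sentence_score(counts, sentence_words, default):
    total = sum(counts.values())
    score = 1.0
    for w in sentence_words:
        score *= (counts.get(w, 0) + default) / total
    return score

abbott = Counter({'square': 10, 'line': 8})
print(sentence_score(abbott, ['square', 'circle'], default=0))       # 0.0
print(sentence_score(abbott, ['square', 'circle'], default=5) > 0)   # True
```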
+ ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": {}, + "outputs": [], + "source": [ + "def recognize(sentence, nBS):\n", + " sentence = sentence.lower()\n", + " sentence_words = words(sentence)\n", + " \n", + " return nBS(sentence_words)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "First we will input a sentence that is something Abbott would write. Note the use of 'square' and the simpler language." + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "'Abbott'" + ] + }, + "execution_count": 11, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "recognize(\"the square is mad\", nBS)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The classifier correctly guessed Abbott.\n", + "\n", + "Next we will input a more sophisticated sentence, similar to the style of Austen." + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "'Austen'" + ] + }, + "execution_count": 12, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "recognize(\"a most peculiar acquaintance\", nBS)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The classifier guessed correctly again.\n", + "\n", + "You can try more sentences on your own. Unfortunately though, since the datasets are pretty small, chances are the guesses will not always be correct." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## THE FEDERALIST PAPERS\n", + "\n", + "Let's now take a look at a harder problem, classifying the authors of the [Federalist Papers](https://en.wikipedia.org/wiki/The_Federalist_Papers). 
The *Federalist Papers* are a series of papers written by Alexander Hamilton, James Madison and John Jay in support of ratifying the United States Constitution.\n", + "\n", + "What is interesting about these papers is that they were all written under a pseudonym, \"Publius\", to keep the identity of the authors a secret. Only after Hamilton's death, when a list was found written by him detailing the authorship of the papers, did the rest of the world learn what papers each of the authors wrote. After the list was published, Madison chimed in to make a couple of corrections: Hamilton, Madison said, hastily wrote down the list and assigned some papers to the wrong author!\n", + "\n", + "Here we will try to find out who really wrote these mysterious papers.\n", + "\n", + "To solve this we will learn from the undisputed papers to predict the disputed ones. First, let's read the texts from the file:" + ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": {}, + "outputs": [], + "source": [ + "from utils import open_data\n", + "from text import *\n", + "\n", + "federalist = open_data(\"EN-text/federalist.txt\").read()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's see how the text looks. We will print the first 500 characters:" + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "'The Project Gutenberg EBook of The Federalist Papers, by \\nAlexander Hamilton and John Jay and James Madison\\n\\nThis eBook is for the use of anyone anywhere at no cost and with\\nalmost no restrictions whatsoever. 
You may copy it, give it away or\\nre-use it under the terms of the Project Gutenberg License included\\nwith this eBook or online at www.gutenberg.net\\n\\n\\nTitle: The Federalist Papers\\n\\nAuthor: Alexander Hamilton\\n John Jay\\n James Madison\\n\\nPosting Date: December 12, 2011 [EBook #18]'" + ] + }, + "execution_count": 14, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "federalist[:500]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "It seems that the text file opens with a license agreement, hardly useful in our case. In fact, the license spans 113 words, while there is also a licensing agreement at the end of the file, which spans 3098 words. We need to remove them. To do so, we will first convert the text into words, to make our lives easier." + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": {}, + "outputs": [], + "source": [ + "wordseq = words(federalist)\n", + "wordseq = wordseq[114:-3098]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's now take a look at the first 100 words:" + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "'federalist no 1 general introduction for the independent journal hamilton to the people of the state of new york after an unequivocal experience of the inefficacy of the subsisting federal government you are called upon to deliberate on a new constitution for the united states of america the subject speaks its own importance comprehending in its consequences nothing less than the existence of the union the safety and welfare of the parts of which it is composed the fate of an empire in many respects the most interesting in the world it has been frequently remarked that it seems to'" + ] + }, + "execution_count": 16, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "' '.join(wordseq[:100])" + ] + }, + { + 
"cell_type": "markdown", + "metadata": {}, + "source": [ + "Much better.\n", + "\n", + "As with any Natural Language Processing problem, it is prudent to do some text pre-processing and clean our data before we start building our model. Remember that all the papers are signed as 'Publius', so we can safely remove that word, since it doesn't give us any information as to the real author.\n", + "\n", + "NOTE: Since we are only removing a single word from each paper, this step can be skipped. We add it here to show that processing the data in our hands is something we should always be considering. Oftentimes pre-processing the data in just the right way is the difference between a robust model and a flimsy one." + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": {}, + "outputs": [], + "source": [ + "wordseq = [w for w in wordseq if w != 'publius']" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now we have to separate the block of words back into individual papers and assign them to their authors. We can see that each paper starts with the word 'federalist', so we will split the text on that word.\n", + "\n", + "The disputed papers are papers 49 to 58, 18 to 20, and 64. We want to leave these papers unassigned. Also, note that there are two versions of paper 70, both by Hamilton.\n", + "\n", + "Finally, to keep the implementation intuitive, we add a `None` object at the start of the `papers` list so that the list index matches the paper numbering (for example, `papers[5]` now corresponds to paper no. 5, rather than to paper no. 6 as it would with Python's 0-indexing)."
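The splitting-and-padding trick described above can be illustrated on a made-up three-paper string (toy data, not the real Federalist text):

```python
import re

# Toy illustration: split on the word 'federalist', drop the empty leading
# piece, then pad with None so that index == paper number.
text = "federalist no 1 aaa federalist no 2 bbb federalist no 3 ccc"

papers = re.split(r'federalist\s', text)
papers = [p for p in papers if p not in ['', ' ']]  # drop the empty leading split
papers = [None] + papers                            # pad so papers[1] is paper no. 1

print(papers[1])   # 'no 1 aaa '
print(len(papers)) # 4: the None plus three papers
```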
+ ] + }, + { + "cell_type": "code", + "execution_count": 18, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "(4, 16, 52)" + ] + }, + "execution_count": 18, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "import re\n", + "\n", + "papers = re.split(r'federalist\\s', ' '.join(wordseq))\n", + "papers = [p for p in papers if p not in ['', ' ']]\n", + "papers = [None] + papers\n", + "\n", + "disputed = list(range(49, 58+1)) + [18, 19, 20, 64]\n", + "jay, madison, hamilton = [], [], []\n", + "for i, p in enumerate(papers):\n", + " if i in disputed or i == 0:\n", + " continue\n", + " \n", + " if 'jay' in p:\n", + " jay.append(p)\n", + " elif 'madison' in p:\n", + " madison.append(p)\n", + " else:\n", + " hamilton.append(p)\n", + "\n", + "len(jay), len(madison), len(hamilton)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As we can see, of the undisputed papers Jay wrote 4, Madison 16 and Hamilton 51 (plus one duplicate, for 52 entries). Let's now build our word models. The Unigram Word Model again will come in handy." + ] + }, + { + "cell_type": "code", + "execution_count": 19, + "metadata": {}, + "outputs": [], + "source": [ + "hamilton = ''.join(hamilton)\n", + "hamilton_words = words(hamilton)\n", + "P_hamilton = UnigramWordModel(hamilton_words, default=1)\n", + "\n", + "madison = ''.join(madison)\n", + "madison_words = words(madison)\n", + "P_madison = UnigramWordModel(madison_words, default=1)\n", + "\n", + "jay = ''.join(jay)\n", + "jay_words = words(jay)\n", + "P_jay = UnigramWordModel(jay_words, default=1)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now it is time to build our new Naive Bayes Learner. It is very similar to the one found in `learning.py`, but with an important difference: it doesn't classify an example, but instead returns the probability of the example belonging to each class. 
This will let us see not only to whom a paper belongs, but also the probability of that authorship. \n", + "We will build two versions of the Learner: one will multiply the probabilities as-is, the other will add their logarithms.\n", + "\n", + "Finally, since we are dealing with long texts and the string of probability multiplications is long, we will end up with the results being rounded to 0 due to floating-point underflow. To work around this problem we will use the built-in Python library `decimal`, which allows us to set the decimal precision much higher than normal.\n", + "\n", + "Note that the logarithmic learner will compute a negative likelihood, since the logarithm of a value less than 1 is negative.\n", + "Thus, the author whose normalized value has the smallest magnitude is the most likely to have written that paper.\n", + "\n" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": {}, + "outputs": [], + "source": [ + "import random\n", + "import decimal\n", + "import math\n", + "from decimal import Decimal\n", + "\n", + "decimal.getcontext().prec = 100\n", + "\n", + "def precise_product(numbers):\n", + " result = 1\n", + " for x in numbers:\n", + " result *= Decimal(x)\n", + " return result\n", + "\n", + "def log_product(numbers):\n", + " result = 0.0\n", + " for x in numbers:\n", + " result += math.log(x)\n", + " return result\n", + "\n", + "def NaiveBayesLearner(dist):\n", + " \"\"\"A simple naive bayes classifier that takes as input a dictionary of\n", + " Counter distributions and can then be used to find the probability\n", + " of a given item belonging to each class.\n", + " The input dictionary is in the following form:\n", + " ClassName: Counter\"\"\"\n", + " attr_dist = {c_name: count_prob for c_name, count_prob in dist.items()}\n", + "\n", + " def predict(example):\n", + " \"\"\"Predict the probabilities for each class.\"\"\"\n", + " def class_prob(target, e):\n", + " attr = attr_dist[target]\n", + " return 
precise_product([attr[a] for a in e])\n", + "\n", + " pred = {t: class_prob(t, example) for t in dist.keys()}\n", + "\n", + " total = sum(pred.values())\n", + " for k, v in pred.items():\n", + " pred[k] = v / total\n", + "\n", + " return pred\n", + "\n", + " return predict\n", + "\n", + "def NaiveBayesLearnerLog(dist):\n", + " \"\"\"A simple naive bayes classifier that takes as input a dictionary of\n", + " Counter distributions and can then be used to find the probability\n", + " of a given item belonging to each class. It will compute the likelihood by adding the logarithms of probabilities.\n", + " The input dictionary is in the following form:\n", + " ClassName: Counter\"\"\"\n", + " attr_dist = {c_name: count_prob for c_name, count_prob in dist.items()}\n", + "\n", + " def predict(example):\n", + " \"\"\"Predict the probabilities for each class.\"\"\"\n", + " def class_prob(target, e):\n", + " attr = attr_dist[target]\n", + " return log_product([attr[a] for a in e])\n", + "\n", + " pred = {t: class_prob(t, example) for t in dist.keys()}\n", + "\n", + " total = -sum(pred.values())\n", + " for k, v in pred.items():\n", + " pred[k] = v/total\n", + "\n", + " return pred\n", + "\n", + " return predict\n", + "\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next we will build our Learner. Note that even though Hamilton wrote the most papers, that doesn't make it more probable that he wrote the rest, so all the class probabilities will be equal. We can change them if we have some external knowledge, which for this tutorial we do not have." 
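The underflow problem described above is easy to demonstrate in isolation (the probabilities here are made-up numbers for illustration, not values from the papers):

```python
import math
from decimal import Decimal, getcontext

# Multiplying thousands of small word probabilities drives a float to
# exactly 0.0, while summing logarithms or using Decimal does not.
probs = [1e-4] * 2000     # pretend each of 2000 words has probability 1/10000

naive = 1.0
for p in probs:
    naive *= p            # underflows: the true value 1e-8000 is below float range

log_score = sum(math.log(p) for p in probs)   # a finite negative number

getcontext().prec = 100   # the high-precision route the notebook takes
precise = Decimal(1)
for p in probs:
    precise *= Decimal(p) # Decimal keeps a tiny but non-zero value

print(naive)        # 0.0
print(log_score)
print(precise > 0)  # True
```

This is exactly why `precise_product` and `log_product` exist in the cell above.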
+ ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": {}, + "outputs": [], + "source": [ + "dist = {('Madison', 1): P_madison, ('Hamilton', 1): P_hamilton, ('Jay', 1): P_jay}\n", + "nBS = NaiveBayesLearner(dist)\n", + "nBSL = NaiveBayesLearnerLog(dist)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As usual, the `recognize` function will take as input a string and after removing capitalization and splitting it into words, will feed it into the Naive Bayes Classifier." + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": {}, + "outputs": [], + "source": [ + "def recognize(sentence, nBS):\n", + " return nBS(words(sentence.lower()))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now we can start predicting the disputed papers:" + ] + }, + { + "cell_type": "code", + "execution_count": 23, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "Straightforward Naive Bayes Learner\n", + "\n", + "Paper No. 49: Hamilton: 0.0000 Madison: 1.0000 Jay: 0.0000\n", + "Paper No. 50: Hamilton: 0.0000 Madison: 0.0000 Jay: 1.0000\n", + "Paper No. 51: Hamilton: 0.0000 Madison: 1.0000 Jay: 0.0000\n", + "Paper No. 52: Hamilton: 0.0000 Madison: 1.0000 Jay: 0.0000\n", + "Paper No. 53: Hamilton: 0.0000 Madison: 1.0000 Jay: 0.0000\n", + "Paper No. 54: Hamilton: 0.0000 Madison: 1.0000 Jay: 0.0000\n", + "Paper No. 55: Hamilton: 0.0000 Madison: 1.0000 Jay: 0.0000\n", + "Paper No. 56: Hamilton: 0.0000 Madison: 1.0000 Jay: 0.0000\n", + "Paper No. 57: Hamilton: 0.0000 Madison: 1.0000 Jay: 0.0000\n", + "Paper No. 58: Hamilton: 0.0000 Madison: 1.0000 Jay: 0.0000\n", + "Paper No. 18: Hamilton: 0.0000 Madison: 0.0000 Jay: 1.0000\n", + "Paper No. 19: Hamilton: 0.0000 Madison: 0.0000 Jay: 1.0000\n", + "Paper No. 20: Hamilton: 0.0000 Madison: 1.0000 Jay: 0.0000\n", + "Paper No. 
64: Hamilton: 1.0000 Madison: 0.0000 Jay: 0.0000\n", + "\n", + "Logarithmic Naive Bayes Learner\n", + "\n", + "Paper No. 49: Hamilton: -0.330591 Madison: -0.327717 Jay: -0.341692\n", + "Paper No. 50: Hamilton: -0.333119 Madison: -0.328454 Jay: -0.338427\n", + "Paper No. 51: Hamilton: -0.330246 Madison: -0.325758 Jay: -0.343996\n", + "Paper No. 52: Hamilton: -0.331094 Madison: -0.327491 Jay: -0.341415\n", + "Paper No. 53: Hamilton: -0.330942 Madison: -0.328364 Jay: -0.340693\n", + "Paper No. 54: Hamilton: -0.329566 Madison: -0.327157 Jay: -0.343277\n", + "Paper No. 55: Hamilton: -0.330821 Madison: -0.328143 Jay: -0.341036\n", + "Paper No. 56: Hamilton: -0.330333 Madison: -0.327496 Jay: -0.342171\n", + "Paper No. 57: Hamilton: -0.330625 Madison: -0.328602 Jay: -0.340772\n", + "Paper No. 58: Hamilton: -0.330271 Madison: -0.327215 Jay: -0.342515\n", + "Paper No. 18: Hamilton: -0.337781 Madison: -0.330932 Jay: -0.331287\n", + "Paper No. 19: Hamilton: -0.335635 Madison: -0.331774 Jay: -0.332590\n", + "Paper No. 20: Hamilton: -0.334911 Madison: -0.331866 Jay: -0.333223\n", + "Paper No. 64: Hamilton: -0.331004 Madison: -0.332968 Jay: -0.336028\n" + ] + } + ], + "source": [ + "print('\\nStraightforward Naive Bayes Learner\\n')\n", + "for d in disputed:\n", + " probs = recognize(papers[d], nBS)\n", + " results = ['{}: {:.4f}'.format(name, probs[(name, 1)]) for name in 'Hamilton Madison Jay'.split()]\n", + " print('Paper No. {}: {}'.format(d, ' '.join(results)))\n", + "\n", + "print('\\nLogarithmic Naive Bayes Learner\\n')\n", + "for d in disputed:\n", + " probs = recognize(papers[d], nBSL)\n", + " results = ['{}: {:.6f}'.format(name, probs[(name, 1)]) for name in 'Hamilton Madison Jay'.split()]\n", + " print('Paper No. {}: {}'.format(d, ' '.join(results)))\n", + "\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can see that the two learners agree on most of the papers, though not on all of them. 
Because of underflow in the straightforward learner, only one author remains with a non-zero value. The log learner avoids this underflow and preserves small but meaningful differences between all the authors. \n", + "\n", + "This is a simple approach to the problem; fortunately, researchers are fairly certain that papers 49-58 were all written by Madison, while 18-20 were written in collaboration between Hamilton and Madison, with Madison being credited for most of the work. Our classifier is not that far off. It correctly identifies the papers written by Madison, even the ones in collaboration with Hamilton.\n", + "\n", + "Unfortunately, it misses paper 64. The consensus is that the paper was written by John Jay, while our classifier believes it was written by Hamilton. The classifier is wrong there because it does not have much information on Jay's writing; only 4 papers. This is one of the problems of unbalanced datasets such as this one, where information on some classes is sparser than information on the rest. To avoid this, we could add more writings by Jay and Madison to end up with an equal amount of data for each author." + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "## Text Classification" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Text Classification** is the task of assigning a category to a document based on its content. It is one of the most popular and fundamental tasks of Natural Language Processing, and it can be applied to a variety of texts, like *Short Documents* (like tweets, customer reviews, etc.) 
and *Long Documents* (like emails, media articles, etc.).\n", + "\n", + "We have already seen examples of Text Classification in the tasks above: Language Identification, Author Recognition and Federalist Paper identification.\n", + "\n", + "### Applications\n", + "Some of the broad applications of Text Classification are:\n", + "- Language Identification\n", + "- Author Recognition\n", + "- Sentiment Analysis\n", + "- Spam Mail Detection\n", + "- Topic Labelling \n", + "- Word Sense Disambiguation\n", + "\n", + "### Use Cases\n", + "Some of the use cases of Text Classification are:\n", + "- Social Media Monitoring\n", + "- Brand Monitoring\n", + "- Auto-tagging of user queries\n", + "\n", + "For Text Classification, we will be using the Naive Bayes Classifier. The reasons for using the Naive Bayes Classifier are:\n", + "- It is a probabilistic classifier, so it calculates the probability of each category\n", + "- It is fast, reliable and accurate \n", + "- Naive Bayes Classifiers have already been used to solve many Natural Language Processing (NLP) applications.\n", + "\n", + "Here we will cover an example of **Word Sense Disambiguation** as an application of Text Classification: resolving the ambiguity of a word that has more than one meaning.\n", + "\n", + "We will be determining whether the word *apple* in a sentence refers to the `fruit` or to the `company`." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Step 1: Defining the dataset** \n", + "\n", + "The dataset is defined inline so that everything is visible and can easily be tested with other data as well." + ] + }, + { + "cell_type": "code", + "execution_count": 24, + "metadata": {}, + "outputs": [], + "source": [ + "train_data = [\n", + " \"Apple targets big business with new iOS 7 features. Finally... 
A corp iTunes account!\",\n", + " \"apple inc is searching for people to help and try out all their upcoming tablet within our own net page No.\",\n", + " \"Microsoft to bring Xbox and PC games to Apple, Android phones: Report: Microsoft Corp\",\n", + " \"When did green skittles change from lime to green apple?\",\n", + " \"Myra Oltman is the best. I told her I wanted to learn how to make apple pie, so she made me a kit!\",\n", + " \"Surreal Sat in a sewing room, surrounded by crap, listening to beautiful music eating apple pie.\"\n", + "]\n", + "\n", + "train_target = [\n", + " \"company\",\n", + " \"company\",\n", + " \"company\",\n", + " \"fruit\",\n", + " \"fruit\",\n", + " \"fruit\",\n", + "]\n", + "\n", + "class_0 = \"company\"\n", + "class_1 = \"fruit\"\n", + "\n", + "test_data = [\n", + " \"Apple Inc. supplier Foxconn demos its own iPhone-compatible smartwatch\",\n", + " \"I now know how to make a delicious apple pie thanks to the best teachers ever\"\n", + "]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Step 2: Preprocessing the dataset**\n", + "\n", + "In this step, we do some preprocessing on the dataset, like breaking the sentences into words and converting them to lower case.\n", + "\n", + "We already have a `words(sent)` function defined in `text.py` which does the task of splitting a sentence into words." + ] + }, + { + "cell_type": "code", + "execution_count": 25, + "metadata": {}, + "outputs": [], + "source": [ + "train_data_processed = [words(i) for i in train_data]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Step 3: Feature Extraction from the text**\n", + "\n", + "Now we extract features from the text: the set of words used in each of the two categories, `company` and `fruit`.\n", + "\n", + "The frequency of a word helps in calculating the probability of that word belonging to a particular class. 
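The counting step described above can be sketched with `collections.Counter` (two toy sentences, not the notebook's dataset):

```python
from collections import Counter

# Pool the words of each class and count their frequencies; P(word | class)
# is then count / class total.
train = [(["apple", "inc", "ios"], "company"),
         (["apple", "pie", "pie"], "fruit")]

counts = {"company": Counter(), "fruit": Counter()}
for sentence_words, tag in train:
    counts[tag].update(sentence_words)

total_fruit = sum(counts["fruit"].values())
print(counts["fruit"]["pie"] / total_fruit)  # 2/3: 'pie' is common in the fruit class
print(counts["company"]["pie"])              # 0: never seen in the company class
```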
" + ] + }, + { + "cell_type": "code", + "execution_count": 26, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Number of words in `company` class: 49\n", + "Number of words in `fruit` class: 49\n" + ] + } + ], + "source": [ + "words_0 = []\n", + "words_1 = []\n", + "\n", + "for sent, tag in zip(train_data_processed, train_target):\n", + " if(tag == class_0):\n", + " words_0 += sent\n", + " elif(tag == class_1):\n", + " words_1 += sent\n", + " \n", + "print(\"Number of words in `{}` class: {}\".format(class_0, len(words_0)))\n", + "print(\"Number of words in `{}` class: {}\".format(class_1, len(words_1)))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As you might have observed, our dataset is balanced: we have an equal number of words in both classes." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Step 4: Building the Naive Bayes Model**\n", + "\n", + "Using the Naive Bayes classifier, we can calculate the probability of each word under the `company` and `fruit` classes and then multiply these probabilities to get the probability of the sentence belonging to each class. But if a word is not in our dictionary, its probability under that class becomes zero. For example, the word *Foxconn* is not in the dictionary of either class, so its probability under both classes is zero; and since all the probabilities are multiplied, the probability of the whole sentence then becomes zero as well. \n", + "\n", + "To solve this problem we use **smoothing**, i.e. providing a minimum non-zero threshold probability for every word that we come across.\n", + "\n", + "The `UnigramWordModel` class implements smoothing by taking an additional argument from the user, i.e. 
the minimum frequency assigned to every word, even one that is new to the dictionary." + ] + }, + { + "cell_type": "code", + "execution_count": 27, + "metadata": {}, + "outputs": [], + "source": [ + "model_words_0 = UnigramWordModel(words_0, 1)\n", + "model_words_1 = UnigramWordModel(words_1, 1)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now we will build the Naive Bayes model. For that, we construct `dist` as we did earlier in the Author Recognition task." + ] + }, + { + "cell_type": "code", + "execution_count": 28, + "metadata": {}, + "outputs": [], + "source": [ + "from learning import NaiveBayesLearner\n", + "\n", + "dist = {('company', 1): model_words_0, ('fruit', 1): model_words_1}\n", + "\n", + "nBS = NaiveBayesLearner(dist, simple=True)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Step 5: Predict the class of a sentence**\n", + "\n", + "Now we will write a function that preprocesses the test sentences and then predicts the class of each one." + ] + }, + { + "cell_type": "code", + "execution_count": 29, + "metadata": {}, + "outputs": [], + "source": [ + "def recognize(sentence, nBS):\n", + " sentence_words = words(sentence)\n", + " return nBS(sentence_words)" + ] + }, + { + "cell_type": "code", + "execution_count": 30, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Apple Inc. 
supplier Foxconn demos its own iPhone-compatible smartwatch\t-company\n", + "I now know how to make a delicious apple pie thanks to the best teachers ever\t-fruit\n" + ] + } + ], + "source": [ + "# predicting the class of sentences in the test set\n", + "for i in test_data:\n", + " print(i + \"\\t-\" + recognize(i, nBS))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The predictions made by the model are correct, and we are able to differentiate between sentences of the two classes. You can try more sentences on your own. Unfortunately though, since the datasets are pretty small, chances are the guesses will not always be correct.\n", + "\n", + "The above method is very similar to the one used for Author Recognition, which is also a type of Text Classification. Most Text Classification tasks share the same underlying structure and follow a similar procedure." + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.7.2" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/notebooks/chapter24/Image Edge Detection.ipynb b/notebooks/chapter24/Image Edge Detection.ipynb new file mode 100644 index 000000000..6429943a1 --- /dev/null +++ b/notebooks/chapter24/Image Edge Detection.ipynb @@ -0,0 +1,408 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Edge Detection\n", + "\n", + "Edge detection is one of the earliest and most popular image processing tasks. Edges are straight lines or curves in the image plane across which there is a “significant” change in image brightness. 
The goal of edge detection is to abstract away from the messy, multi-megabyte image towards a more compact, abstract representation.\n", + "\n", + "There are multiple ways to detect an edge in an image, but most may be grouped into two categories: gradient and Laplacian. Here we will introduce some algorithms from each category and the intuitions behind them. First, let's import the necessary packages.\n" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": {}, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + "Using TensorFlow backend.\n" + ] + } + ], + "source": [ + "import os, sys\n", + "sys.path = [os.path.abspath(\"../../\")] + sys.path\n", + "from perception4e import *\n", + "from notebook4e import *" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Gradient Edge Detection\n", + "\n", + "Because edges correspond to locations in images where the brightness undergoes a sharp change, a naive idea would be to differentiate the image and look for places where the magnitude of the derivative is large. For many simple cases with regular geometry, this simple method can work. \n", + "\n", + "Here we introduce a 2D function $f(x,y)$ to represent the pixel values on the 2D image plane. This method then follows the mathematical intuition below:\n", + "\n", + "$$\\frac{\\partial f(x,y)}{\\partial x} = \\lim_{\\epsilon \\rightarrow 0} \\frac{f(x+\\epsilon,y)-f(x,y)}{\\epsilon}$$" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This is exactly the definition of the partial derivative. In real images, $\\epsilon$ cannot go to 0: the smallest step is one pixel, so we can only use the pixels in the neighborhood of the current one to approximate the derivative. 
Thus the previous formula becomes" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "$$\\frac{\\partial f(x,y)}{\\partial x} \\approx \\frac{f(x+1,y)-f(x,y)}{1} = f(x+1,y)-f(x,y)$$" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To implement the above formula, we can simply apply a filter $[1,-1]$ to extract the differentiated image. For the derivative in the y-direction, we can transpose the above filter and apply it to the original image. The relation between the partial derivatives and the direction of edges is summarized in the following picture:" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Implementation\n", + "\n", + "We implemented an edge detector using a gradient method as `gradient_edge_detector` in `perception4e.py`. There are two filters, defined as $[[1, -1]], [[1], [-1]]$, to extract edges in the x and y directions respectively. The filters are applied to an image using the `convolve2d` method from the `scipy.signal` package. The image passed into the function needs to be a `numpy.ndarray` or an iterable object that can be transformed into an `ndarray`.\n", + "\n", + "To view the detailed implementation, please execute the following block:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "psource(gradient_edge_detector)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Example\n", + "\n", + "Now let's try the detector on a real picture. 
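Before moving to a photograph, the finite-difference filter can be sanity-checked on a tiny synthetic image. This sketch calls `scipy.signal.convolve2d` directly with `mode='full'`; it is an illustration of the idea, not the library's `gradient_edge_detector`:

```python
import numpy as np
from scipy.signal import convolve2d

# A 4x4 image that is dark on the left and bright on the right, so the
# x-direction filter should respond only at the vertical boundary.
im = np.zeros((4, 4))
im[:, 2:] = 1.0           # vertical edge between columns 1 and 2

edges_x = convolve2d(im, [[1, -1]], mode='full')
print(edges_x)
# Each row is [0, 0, 1, 0, -1]: the 1 marks the dark-to-bright step, and the
# trailing -1 is a zero-padding artifact at the right image border.
```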
First, we will show the original picture before edge detection:" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "\n", + "We will use `matplotlib` to read the image as a numpy ndarray:" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "image height: 590\n", + "image width: 787\n" + ] + } + ], + "source": [ + "import matplotlib.pyplot as plt\n", + "import numpy as np\n", + "import matplotlib.image as mpimg\n", + "\n", + "im = mpimg.imread('images/stapler.png')\n", + "print(\"image height:\", len(im))\n", + "print(\"image width:\", len(im[0]))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The code shows we get an image with a size of $787*590$. `gradient_edge_detector` extracts edges in both the x and y directions and then puts them together in an ndarray:" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "image height: 590\n", + "image width: 787\n" + ] + } + ], + "source": [ + "edges = gradient_edge_detector(im)\n", + "print(\"image height:\", len(edges))\n", + "print(\"image width:\", len(edges[0]))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The edges array has the same shape as the original image. 
Now let's print out the edges; we implemented a `show_edges` function to do this:" + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "
7whz80gbF3kS0G1E8kQDmMAlJp+MwLbqu6U/gZnRzl7TDg8py20jcrCcbgpQXUQ9/Me1R1lyEkLwSCHY/HT84jt5OlTbQn0Q919ykIjgmk7M9zgsFoF76RtpMa0WZk4XQJI3AbnGpzL/1Dn3iWxyQdkbkW2ml058g/Ho87KSTXeMkHi7i9tCLXQDr9dEaSgcv8qQ0VOxmPx212FPLeM43OGPjMn/MMnD5tsM7jkA0uaAPjgWPGESGnvmzFz0BeRsKegMCWcY7myrwY+UNLcnY6MBwEA4VnNCpBqY0U3BFzOhRv9ubzk5OTms/n9W//9m/17bfftgPYIFg9WEQvIpxPUGAgM6JzMJsFn2mg0aKjGsWpnJ1dokv+znOpHNkSOboYgSFfQ3Tuz7dLV3WNluvM/3kpRZKzfp5LpkCWj8lb2m6eCUXEEDBQc1lcw/1eAc8GfHNA6CD9oh700+mxv0vU6KN5uMZcH7K1k7AxcpiBKRP304ia+91uZJDOLNF30hAgfqf6DlBG/w4Ath87B/TLz0+HS5sZF+9ltC3Qz+QEh8Ptm8mxDcvGky+ACfPhz5WdDgzegBkHKjVxzcOcIgCrcyuG4Sh5MgPzl7/8pfb39+sf//Efm+cm4hnu+zVmKFzyOYbXjuZJlCaHxK547nuOVLaBe9CsTI5ITdhyQjmLg2PFsNxOp55WvuT/LHuUyMjSwcgBhrbwk/2gYMAZEY2AmT22TJk1pQ8oM5HeqCVJY3M/ONdMneys7HwoOY7oLZwXesazGAeW0ngMc0IlJwYYH56LPNDLqu1xROiSU+VEb+i901Y4KvOqfmdFVXcvow88dH9MBYA+c8IJWzLy9+fppJGhwQG8Lm3LpTVOO42+PsR/VX3AgSF8vLONwp7VcDQRCQNZVZ39iVzPG4JOTk6acmIk1GukAOzE2SQ5aT4JJ2WHg1Ml9bOBMvNWVW2dDoNjwjXJ5UynnosYhsN20o6yJs0pKJmVmh/uTz4MwpX22vBJu0ER9M0kMdfZ8M1V5LOQk9uHk/TyBHQJBEmEzZlBFxuJg4KXDFQ9nUlOJGaSn+eAdr1Jm3ZBn2Co6FUaljMFOzT6RdpFXVVbgh95oEMO2PTJDs5O27pkWWGnjDvpJ5/5TLk8np0glsfb4Ig8juboqJ+6klNErsjR40IdyNlLiey8+8pOB3Z1ddU5IsdpkQ0OPsKCQsj8dkNAIFdXV/W3v/2t/vCHP9Tl5WWLyAyiIz6K4KgNP2ClsgP1rAk/8DJc53SYKGxFMGzGcTsNw7k493fqnOlYOiYjVyMLBtLK/fDw8GShqnksE7NWCPrqae5Eqm6rx5fr7JSNMJAVnzvg5Vq2qm4QI0D5jUep3K7fER+ZWwf53IHHxYQ8442OobfmibyI0pwO3znAsozABT0BgZr3hdN1ACJlNvdmboxJg+Fw2GYqk1flHj/fdog+mH6wU/Vz3Q9P7KBPnkyj0DefpsFY5RY/p/WbzaZDF9mm+oJmG9Nnv6mqr7/+umazWVuMB/SzAQA/q7abXemISWki8Xq9rrOzs3r9+nWNx+P65ptvWlp4e3vbGo5CGHaidAwq0cTkpmeDzNFRTAKbMzO359LHByV34u/sGJ1WPlefDTejD87bUcpGWNVdNEt0yxXYyAZDyvaY+OU5VliTuO4Dz3NUdpDCEJ2Wp4wTPZizYezMPWXQQReo2wdlut7/r70725Eruc42HDWITbLVkjXBMmDAPvXd+OoNwzpow4AB2QIkWU0Wh67hPyDezCcXI7ParZPfAAMoVFXm3jGsWMO3vogde7YlSri/Pz2R14DluOrX999/v+7u7g4Ui9zpnMP0zG0gId+cvHKuLnmwfud4Hh8/vcBZPZmpc/VN/mz3uJ4yc1U0ZD71fepDwEI57xYOtI+Ayg7VSn1Me5nlWQQWYunkhlDN3MPiM4Mq6VdffXVwPt9///365ptv1r/8y7+s+/v7w7OMefJy+aJVEDglMq8XfU1jlixMcA
l0ClZFb2NmdQmxbdu2/NxJlreyaOzdE8FuRJ0LEfJVGrlbFeqvyEnFCsnKc9XH3f8ig9r22jnmeItdFG/syVzi32dp5cSaA+s36Kx1yuPIH9qvuTdKHVprHV4nlgPM4UxSPV2OF/Z9i/W7TENSf63jCS7JpR/5HgNvTjteSJSnvuhwHaOZRAg3tCv48GmK6igIyrPplJorx5oTza7u7u4O2Y+cbnNdPyZA0dntqJFZLjqwn//85+vt27cnHECCNA1wYMLVJq2l9Zubm/W73/1u/cM//MPhFenCRfkRByZ0Vwlm1DEtMgJ1n2eEN44Ub3r6HQqoPg1zpoqT/7Ou+ti4ur6+abgZaYsd9XGOWfJ7rU883uTLehTIlKW+yz8YMCZaEmmLGrq2uZvp9ST9M4ScYU7D7/2tTEXkRnZlNUtpl2nmHGsOJLokh9Bc60z9PCSSM2psyaZ07+np6XCsc30wvW6Ocp7pmwgtOUUj+ELimZK7QKOTzI60i9BVAEPH0RjTPbMZ+Tcd3/QVM/C/fv365F51I6B0c3Nz8jzlJTL/WQTWSzTn84RN2FxxmJP09PTpDUH/8z//sz5+/Lj+8R//8fCizbU+P7E0RyeR2edyLh4KJwJqMlxpskjyF53qu+eQNyYfzpUYtz4nPCWakSNllYtqLI1NRTONKArvkE0GIOmp4q91+g7DiSJDLkbltU73QNXHqWw6yhxj94j8TG+tvx/3c3lqgYbpeA1QjqXfjqUnPkTWfZ/uqEtrHfcr+TiMTjQnVn19Vt8zurKUjx8/Hl4Wm741nzpsF6JyRF6jDpa5dK+66LWeeefiT/NWxjNPljD9tQgsmpuZestRyx2KvlskmEhu+pLJoc7y7HlgEfR1PqMWuSTwnEJe/+XLl+ubb75Z//3f/72++eabQ/rTpERgq/QZck4uYYsGREHzrC5T3Lk62HXC64mo5t9B4Ev5uFGpCZrpn5GqezLGFL5Jlxeb6VZ1KOucR+lOK2qSsslVzs9Fkvpfu5NfcYWt1EkuKn2xHlFsiDdZmJIqP2Xn/fJSom/nb/J+NzfHU1VFXTrz9EX0bpruPOrs1JPQqVsFDFAvXrw40BNlI81f+xNrU6JfPUw+2ZZHqte3UsTmVzQ20/H0uY256kJzPMeobc9gONH6dDrt26weeS/tNB1ID/4qDsyJzuCurq4Ox+tMb1wnnp4+neL65z//+bAxNWVJsKaFDSiPG5wXeczzjuqTKV4EqQI1r55CnemYBtMkKWih9+4+jU9eYHJHOayp7F0vKqv+nELEsQhJrmKuhIWS23flezmn4U9nJ//T36781YeMx7F5XQaSceog1zruU6quGdQmP9M1/dYx+1Ppfon2UrwcnHqjrjZv0hqV9K75Clm9ePHiME/pT+mpAaonW0JmEtciu+Q7tyGJXpJ9+jOfF/ZQ0mTr4oh6oM4JBPpMuVfPzumajZmhhLDqk1TEHM9fxYF1DMgk1HQc04t++PBh/c3f/M16//79+u6779Y//dM/Hd6GncDn2VIO1OsUbnyC3ltEpVHqiIT7Mx+fKY6rNSmtKzCT9PdzI1CfPT4+Ho6j8fPpqLo/BdUxZGizzRz9VCZTBx9z8YzzomyIKrmI7qaim/q2ipaDSiYerWJa7SbWtY7Kbdqbk5tE/ORtciRzIUIn5XzskExoMNogx9E1/e0pFx3qea7sHK/kuwFjopCrq6v19u3bgwx3KLQxNmYd5DT05kpEmzzmkdumn2udpoaPj48HCsXUvvkr41Bmyr7PClrNnTTOzF4mx1j958rF88C+++679Ytf/OKE+5qEakaYsfzsZz9b33777frlL3+5fvvb3x6gsxOYUnX8tNHNlDQhTeLZenQ43e//tbsj10sNZ0pZmYJrvPbDFR+LDsUyuZO5jUD+0DHqDBpT3+UATD/kakzBu+bx8fGE2Pdzjd5o2O/r6+vDqp3p+dwa0KqdYzQVnRxaBpXcq8+50AnsaAHnX13x/2SnTl
bXbv7nql7fzbmVn7Lk3OxjCMlgNus6l+EkI1HsLAYor9llEF7n766Xb/T+ApkAJLnOYK/tNL5JNSTTgmP686M5sHLkYJ9vS9kR2j//+c/XX/7yl/Wzn/3sMGiXV40KCbnBO+iiSmhBMlFo3/Up57mBGs2mUC06SSfEOvrfiKMj7RodqLl8KMJVJpViGuI5518fJ+FrP2Z0TZbyV66G9YiRiMm3Ue3S9wzNgKasdFL1QXnM7+z3ROHO7yTx+z7Es9Y6Iaado6lLs6508ebmeNx0OtZ1bgRtLkTDBiN1Vhn4DsUQiW+vUhaVHFe2IVLf6X/yO0cx1LYBqnZcHKo4jvms7+4N7OlHMu36DoBIPzyBwvsuoa+1nnFgnbwaBC6Nk8O5v78/nN31u9/97sC15Fhawm/ydwLMGc5BlgpNIaQwXTcdRk6jCfVxCQU+Uz+5tOqtTolJf5eKzFMeqq+VrJzW69evD8ee+FSA3I4pdOOvmBK4gtp9BRn7KXpxhbMxyZ108oGKL+Jb6/ToZtN1g1VjCeXV965tT9Pk0CKQZzv1xXlK5r4cwl3j9S1COj3o99PT6QLV5GJ8RVp1TzI8uebkQvU+JhXiqk/qbSvfV1efzrUv1fcVZLOkj+q5QX4XHAIiczGovuSszLRM352jyuQZW/EUfRUAkulc6c9WTNfngsGlctGB/fu///t6/fr1ZydQlj5EUH/33Xfr/v7TG7HXWiePhtze3p4QpN47BTkjsBxTzkBl8pjjrps8QxxSQp1chg+Hi+bkQeSIZr7faQk6Qo/B1RnriHPuplM5gdqKg6h/ySTnM5FZ/QlR2O8MUidUv5Jn/RUtd7+8ylTc6ik91MBML0xP5iNUj4+Ph5XnFNx9VJWbm5uDk24FL9k1zklH5ISSoYGjlVMNa4f4c0JyV2YC6l2/rWOH8poTg/fcQKrMZ0AycE8+S1nf3Nwc0vGZCk5nVZ3qbSv9ZgtzAcUgqdOLWsp5Zhs6NdPiifwb57lykQP77W9/e1CI9oNFat/d3a1f/epX6+7ubn333Xfr17/+9aGzOhR5iniXJktHlhDa32L6MfNwI6D8QSsmppazGBnkKnJcRuHycElYHYBpocbob8ld0YRcTelZk+8zeDmTFF3OprmRB0z55vYSX80mem1+5LJM7bsmQ5gkrI/FJDfh/26u04WKz8DphDLAZDB5qMlv1i/RaP2ZD0TLhTp/OsLa6u/0v2vev39/+KxrTR2rL8PUFhxHaNTsRnSabkx7eXh4OElDLa5aNza5UBFvc61+G5hqWx2eepZz7zN52/ow6QDnLV8hBzzBxK48eyKrjwK9efPmkDL97d/+7fr222/Xn/70p/XrX//6M4JbnmStY47vgXcaSDuLc14Zm8u2Oozqa6+aZLdKInKxP3d3dydGGpxNCUNmObm5ymI6aiTy2opp1Ty3f63j8TIZcYikOluaD7VdX1+fbP6c3J7KMPmqc31MbpLoGo2BSSfhufzTyctpaNQ5pubWc/Obk5k+uTKXzOqLeiR6kI8T+db/+iUhL+coQqifIjLvrw3bypgr3jsfvSttVm45l7lwEqLKqSiLify9rmv6PLu+vr4+BDhTzMnbZuOOYYKJdHwHHpKtCznd0zxKKTRPlxzYRQT24sWL9e7du8N2gLzsy5cv17fffnt4T+Ps7O6zJtpjO/zthFXH5MRER3ln4fskbOXNIqjz/J7kKp/Q/3MM9d/PUvy5QjbH3W9J9toTmquQLsNr/DteTAfVNY13Kpgks87eeWheRMGixOqIVyo9cF7milL9mgsLKug5hDZX0vzecccbzbE4X7Zt0fh389hYzAKmDtcHg62Oea3T48Z1otPBh9p12pM0T87SEPJf1dN8O4dytjkUUaLt2Nb8vBLyrI6JYtO1HKX3zYWcZJMuXSoXEVgQfK11eJHs/f39+sMf/rB+9atffbb6Yc7f/92fkExjimoKfA5orU+OVI5CRyYXsVNQjWytdXjno6
Sh11ZqZxLmftY9RUTL7NPcj7PWafrbNd0jkSkaSd5GaTlDFVflr21TvCJdixHKI7RXvaKWjHn2e85/DrN5N2XXccmXiuBsd3JI6kGlelL8UiaRRLKbn691Su7bho656w0Gpsp9pnw8QNJX+jkvfR4iMigaQHYOZKasyii5iNQKpgZVA6qydC52n810vj70qFBBUN2Ygb8ij+cTD5ec2LMO7N27d+vrr79ev/nNb9a333677u+PLx2YHclz+wIA30BTXtv//RZ5ROTW6Y4PKbJJDCcor005Jw9XHzsIUbi/1ufnIhUxamc66XntRIlOuBNvVIpDqh8phIbSffbXh9Krez7LNoOA3NxEhSFjSf/Sets2cLx///6z/hqcvK65nRRDv3PGorLKDEYimB3yr5/1y/4r84kWGrOkc/KavJhHRtVOiH5mAckiCmBuNWilU2eZbPteY/ZluJPWUAbusRLdy8f95Cc/ObyIZ2YcM5WUa0vXTLFLiQ06Dw8PB9SYXHvkqu8Lwtqb6PW5cjGFvLr69OaXt2/frj/84Q/r7//+70/2EKUc5fo+2K0xNcAeHcj7m8rlaX0HZcJqIpy0STbmgHypan3TIIqSGkP37lJfU9uEvktLKpNQzqnOKGgkzXHLf8mZidJSgpREx66RJWP5qsYgKkx29lteJ/TZhuScZn1SSePq4k6dJ8euQ5lpbN/LjyiDEErXy9vopEWBGreIyejum4Ls40y5J4orGKr/of7+7jtTMHm2mZ7N9GuX1qUHcaei5SkndS8uTd5SPew+A+Pd3d2J49tROv0fmowqsn7rNu3d7XVrl4FHIe3KRQf26tWr9ebNm/X09LR+85vffKYolTkQU5QclYZu2lNd07BdRfHxh12U9r4mW6d4LqLLZ8zvKnP1UAQpZ1epvSanFbq5RcQVtlDoTPkyDBHAfHrANCwFmatXO/nbF9OGKYMdMrR+l9bXOh6J7H05qbiy6p0o0XlLPvGvtuFYditwfi9yEE3rvKQhZsBThiJGg2P/64w06p2Drm77O/ukXGZprueufw9ZNOMxyMyAbtDzwIW11snD51M2k/+T+vDadFtKofbO8cZlK9N+Z7mYQv7xj39cL168WN98881BaFauwekpbbitEw2wjhtF1zoaj0odchH5FHm8d/ci1tlmRe5B/k6UsuNbdKymh/XRFFMn0p6lHU/kcd0T8nuuu+2r5KYr9qV6puI7XvsjwmyzZp+7SKJhWsdcZJDjNCWbc9T9zrn6MLmreU06t3NiyWPnCM6NR75G7jODlD+cPFB2If9rMJ3o3tTVel69enXilLQ5KYh0UK52fnZ9fdxvJ63j/IkIp6zVcZGd+md//J2emsbvFgF6f2T1GZzr4yUO7CIC01n0f56/HdumY3XCiVM4KnECEtGYDigsCcb7+/sTbqb09ebm+NjTWqf7X2rbYtq1i5AuPkhWNhGuElbf/N56KhrRJM37fKLNZKXTMbqaJjq+adgqUYo+UUT1ORc7xGz0nYiooGOavtbplgPrnP2cemDKYdSefM+5vs7PLepADiqZTPTUmE0/SyOlNiY3rAxM3Saf2Zt6OsOs9FM98P9zjru0bDogEf2Um3pqgDHoze+d+5lqe51lBsBseDcWg9i58ux5YObiEvJ1MIKvYuTXk+uAditlIrUZFeQwaq/o9fOf//zAA6x1FOokWkNw5uWSkEXEndD9TOflGCsRnPP00ooTqALIL3WfCEteqD7PVLB6+i5HMgOJAcD27ZfOZZfiTkThXM9V0elo/H9GcfVg1tf183Vc/a0hXFL+iQLmG6oMIPKjjvnp6enA0ezojIl+5ZCaa51in/l0ydPT0+HpCfnAHf87Uy1lIzpqdbD57GeWyb3V39n2HOO0feufWc319fUhQxHlrXX6NM+lcvGKN2/eHCqOi5pCeXh4OEE+5/iJBuaxHJMQLwqa5rnqlAA8D/zt27frzZs3n62uuQF1viLKFZMUa/ZlGpufG9
2m8uYYHOdM7SrTYWXA3ZNy10e3mUyElbOYiGgS8msdV2tVmAnbpxxmGjdlMtPC/p4cYu3bbrLaLS7MhQ9RrUgi2dj2JI3rvyuBGnPy2unkw8PDyWpg8pv62ne+jFe6ZZLS2lHXllWIxutDnxuY5FwzcEYAACAASURBVN9mKqecRE7Jsp/dHJn+Tqc836k6U3zb6jMdoMFCWUY/tPtg1j3LRQcWAVr0nw5pPqipUUjYTkMVfT09HV+saj0KQS+egXa9E6NiT2/ug7yS9kYfU90fIjyRm5855tvb2xNDdoWnPtT3xr1rezqV2W5KGHLLeEVqjU2+pnq8xofRNRRRoWnoHNfO6Yu6NBgXQvqsbTSTt9FAk3F17NoWESivuVAklbFDU2YY9WM6ua61jUlu79LVFnl0dA8PDyeOzleydc3koSe/tnsNWqhnrXUyppubm8+eaGneI/99h0F9mrZRn0yn1/qco56627UF8JkVXCLxL3JgdtxIWMl763zkPSQ9J0zNsK6urj5bSanu+pCwhK9zRcQiJ3Rzc3wId63TEylmGuQpCDNqV3QktT+JUfuvkfj9DAgZRRPY7z7LEU4e0bpqI9nqNEUYjXv2v3t08Mmzvu94t5nmTQcx08sMqXsa20z7JgLr/x1a2M1Fc9Q8ipSUzUzBlW888DkOTYQb2jdd9BpXKSdpfi6dN8U27TSwzHvMJiaHVzH1rb4pw8YjF65d1ofuDTnO1UPHbXCeKe9c1JkLI7vy7Imsu1RJ2OzSaw32HKEK6llidXa+AktEJVnqG7gnT7GDpcHwPqu/cQoq3IT/osbGsFPy6aj8zt/yV441WYTUWnVMKUSDO+MxmPhqtZyTfFf9FOlI7BoQRLvOvc68/nb0i/Mq+s1JzJRgpiwqtfvmKpMTTS7KqD6LLEojm38Rjn20jr4zpek6DUoUJpc6jU3HIMr2x9XIrnORJNlMDq7/60tzPIN9/ZhZiXa8O99s9kU5FRBEWz0X6ziT/3wms36nNzt+9LkVyLV+AAfmSoRGmMCurz+9JbgIY94u1DW/rYNd028fP1hrnazQdM0kWCd09noFXpRxQoX5EzHNdHKma7PNynTKKfXkDCryN3Jec5z1q7YnWZ3SiYJFPI1ZFCe570KHjs40VKfuK8Dsl0o9U8yKS+cWUYoObi4C1J7ZQboYWlrrk5G6wCNHqHOafKyIojmUi7TU377XuHUoax15pSnn5qBxybs5d9Y1F5bq5/zOlXw/L/jsitdN+8hmZmaVE/cag2M/OvDqvcSbTq53losOzNXDSsJwhbJJE+3c3Hz+VLqdK0VyY6aEevdO3qN2XI2rD0UZ0wUV077OVbXGKWq0GKn7fvbNSGK0d8Kmght5Jgmt/KcTSBlEU1MhVQLH2qM0cV0zlZ3O+uHheHZZ16x1POBvtpnjnGggGc000OA4ebsZnXXmIs/6qhz6W+er/KrzHMJrPB7IudPFZLHWMTWb6XfB13mY6Ezd1KHJO4qSJ/qvXed2Iqf+76Fqg6bj0SkaINXTMhqvt83kZ9Euq9uMZq3TRSFteVcucmCWmVpJWBox+tzvD40BIeucT9gb9TxVc3JV1TWFIoSf11dSgmns83qh+twaYD2T0/K3crGouD4KZARtUg0Moicjv46wIp+1cxbniGtTDY3EHe3Oo47VxRpTL+tXNqby1XcOFey+271Sz5L86oPzOHkwU0YdkO0atGZKJrm+49HUJz+b+xWrt7E1/zMo74rB5zn9lsM6l82ItKZdzZT23Lx5Vlz3xZWZxc3Alo1MOc/y7Jn4VarXtTMJVodVhJ9potEjpZpQdMJvJ9doUJs6lF1qZ0paCb1NvmqOy+i/W45f66jUrqBZX9zfXGlLphnNXAAxgvvmahGNCLU65mJBgaB+75xGRWQzOZScZvP7+Ph4eGyoMcz0tr9V7mQlzWD7u8WNyR9W0qe5c93o3blXtavsdg5tOq+JSEWzIiW/n8R5uiw9EHJJf1zlm+
lsv135m4jKa+WpdD6WEJ9vl7K+9puJ/rS/q6urwzOPU07a5U6WX3/99Wf98VnOgshfzYF55vg0gP5uILe3t4fnsCZXMPPnh4eHz96Ic2mlQbLUU18znt0kmbvrxa+urg57dGZ+P8eV8fl/dUwkZpozHent7e2JE1rrlOCWp5rk6Awg0+mIcDp80pJT26VRc9Gke3vVmOmYqLVNhhmdBL2lduWbZmowKQZl6fX+bZquY21MIsXpPHUI8WXNgRytbe6cVO3vXmRRMcDMtNzAUF+qc5cyFbB3trRr1342dn93bzLZLZIJLtJJbW0n38nD9b87DibAmVyfOr+bE8uzHFgpnhFWZPX09HRyvM6Mvg0qhRVh5BhNi3IWCWymqvPFHJUm3snXyVSExlNRXCRonLtIMvkqZeL/yU8UYrqSjGr7/v7+cFihfU+OOoFQpIhr5zhFdvJq3WMQmGWiyfog0poLHDogHYn1R+I3BymqAWonSxV+p9hzJcsDDuOy5KGURU4sfdxRFPOz+jupArlh73EXfW3OolPfoe0QW/OW3k/K5txiwxyTczDttDLrNqsxCH748OHAvZlqy1v20h9tsrGIuLKBSUHMctGBeYzKWqcpRt7aCOVEp2DB1BxT54jf3h6PbXZin56eTo77EKIGievXWkfvbU6dkia8eR66SqGBmpOvdSRwcwBOXs7DlND0ts98w0+fpXiumPW5m3oncpvoReRaPaGw+BW5Sd/h14pvz9/lQKq/uubblppvV08tRvEchAsOax1TeJ1uXN1Ml5Op6KxAuVudzFnJAZriPz4+HvTtxYsXJ6R79ySTEKaLESJyA6oLAZ46Og9JtKhX1e3Bhs55f8/XvOk4227U/CrD5kDbcM6qP0ekg2qsAomCwdTJ9Ky2RPHX19eH16lNrlh5N0+XHFflogN79+7dyQpFA9Kj6niMHBp/Ci/Br0CrN+EI0TMy8/uMTaWSd1BZVDijTgKdUVOeqz6LKoXdog3htimJkz8dmYGh/6+vrw/jDbm5ClRbjUEH+fT0tN6+fXtwHFPRRVAdsKdcdgsmzrcvtLi/P758ZCqxjsZl/F0aXKCT80hf5JDSAxcNGs/OOa21DoFSfUnOcWfJtu0WIr24wAzVN2q5Sij34747aY5+ZlpXIM6ZGSjV+fpvgJgLA83XjsIRtRfQknWyrS0XR+qf16aLHiaqLvm+hsq5R4PU+5ni78Y3y0UHNldPJJvlgybZPvP0Otp9KVQeu6g7oert7elpGA0q8njnoZtgo0qTpTHXJ9OOuTJnhJCEb+xO0kyr+rv25n0+//f4+HgS7afjlBeYRQOO31P2yclUtO+9X3Q00+GUU4OyDhF4Tiz0Y4pSEaHO1Fd0N+U7U9OdTGor5z/R0VpHI89JFqD7/5whnVtMqIhW4lnr+y51zGmdQ2eWaWPWVT1+P9Neg67/ZxcCiElF2PbkwGamJGem3tjutL9kL02SDKYznuVZDizIasVGqZub49HDKuqE9b3jz1zX9HMX9bt3F4kSRgNd64jYipIJy35NUlaeZubrfe+BilMOIqMdLzP3SmlMItvdxkLJ6oopqk5B5yhx3v86F2Vkn6dj8nVfa62T44x3fOFcMNEp6oBEG/XHdqajnI5DVDWL8u6aiTanQ1FupUfdL+ro944Drb+iIx2fXOeOdqlM3Z6r6z2xYVB2TvrteyBevnx5chCDMuj3uYfMu37qhsi4fmmPU5fV2eahLRY5su6Zdn0JgT17pHSPjNS43zlh8ghrfX4ShTxEn8VjOGnxNHpxvfmMjk6G0UIi1zpndBFh+l3tTCdlMU9XSZNVqEcHdu49fjocCcz6EqoREagks31XR7tHVBna9Q1Hzp/Bpvrqk8Y8OTxlKPkuag8dTaTXnIpm/W2gsqgTU8cm/THToK4/54BNp+b3s32pA+fEeawu9X7yXP6ee93SWU/qnSml3yevmSHNVFUHo3ynne8cd59PPrkxS2Hk3LTp7FYqxf
Z/dArZ4WqleCKNySs1iNlgkdqXTuQUQm5CUl9yYc5e+zMNnATkhJtXV1fr7u7upD8Zlf30vuvr6xMy2RRSEl9eSSRnKRWaxlg7E9GpWCISU+/d/XOlTq6wdpWZc2FK56qWzsf+tLKavHN8Mw3LOLpmpqnTkU+nYsrn2Ezn5fuUi/3MgF340XHnvHaGIpmunNJzU7fQW2PvLeOV6lA2OZN0Shplku21sdb5Y3m8rjrnG7OSqci78czFg9BcRXufDtnFBPXQQOmug0nFqK/ya5fKsynk4+PjCdFYx4vENl6UTtlzPu/fvz+8fWXHbQiXPdTQlYzKjCYJaEJxFac3r0z+zlVAI7MIIYF7RHUIbyKBxmN9/b3jYWon+c1Jm0ZVPaKePpP49ycjmlyDXIXznUHWP+c+PjJCdvIlte/KnpyJDnnnvJPtfMOS47m+vj5ZVfa7OffN76tXr9Za60BjOOc6zFaAp2GLwnWWrYZLT3hYQK/ws6SD6kmf9WYo0zTf+h0V4/+iy1mm/ShTF8YEAZXJ06nrBTjrjgKR3+3+HH5ZRHXNI3oKMs1Zcp+2bbnowH76058enE6DnpOYADSqfuYKY3yaQumAtjrphDm5lSbZaOqqh5xXv4XMORadmMJ2goTkIq1+JOhDCabFOYlWw5TFXLAoQmUwTbppnNyMfQ7hzlUbOcB4joy6Oma6NtGkiwI68Zn6G5j6vEjsXHW/jrv7XWRwzirWa6o2FzxmeisNUTvNSXMrApkI0AUotzlMhJB+W9yG1BwazJKjdEmfzflzLP3d/3K+zZ3XKn+Lz0zu6qmu6tFGZirab1eR5TtFd09Px0NHAwn5g1Z/s4cfnUL2OqUic84mCC5CmMStqU3l/v7+s3OF2ruSo5qcxfT+XWskcqXFF9aqZPPN3Gt9Ijc9+C2FauIV/nTMk78TIYi+hOWW+lrffTpAJzJXiJr4lDpnkNJMB5nMG4cP4Zsat7teo82J184MVLtFj7XWCeJpjvztvLVIUj0ilh2qMFg05mRn4LD+r7/+er148eLwjkrl01MHtplD0bgbY/u+JPRFfOri5KJqt82cbkNpDiPDfZnIDKQ6JgO0gcGgsSP662NjaUNuNu7+tXjkm5ubg32VKlf3T37yk5OzwLQzUbxO2cMEQuqNpf7sFsYsz6aQOigdQgPXKdQpjVruyNUx0xWJd51TcNSIU9sNVGchYpirJimm0U3klnJMtJBiTXI4xTGKCNVnyphz63eIqf77Snr5pxlJQ4+mKi0MPD4+niDY+hPXOJ2rvFmKbtqcfHUa1W2Ri1rrlANJLrVR0djcuHzunYUZtfWYhoeukmmGdn19vf785z+vP/3pT+unP/3pSWA1ta8N21bXk/3kXE1xdGa7hZrms5M9Zpai/kp5zHZFayIzDX0XCOTU3Gya43Lc2lyo0cWgAsBcGIsCmKug1dvLgPQNkzOvzdq/VC46MJ1JnZvEpw3XYHDbo3CNivIia31+8oSku9eafho9hNF3d3cn3IxePwWznkl+a5xyfY7bFHUS5/W5x2XkSzL0jKxxZPAZb/U6yaXv1TF3LWfcRjZfgOv/yeX+/v7kOdeM6/r6+LKFjhqeK4i7qOi8J6u1jsZU/3IwyUwuzoAj8tThVZdviM6ZXV1dHdLDAtXV1acd7MnZdNU6TUO7vzrNOJqr+tZTDSE869UhXlp99cUiOtbZx/n/TCmTademX9bdd/6vA5w8pI5chF+/d5lPtubnfdd87NJO/Uc/cyHDctGB9ahFTsKKyvdT/Enoeo+rN3Vo8iF69IQmknHiMsra8H4VX35IQTrpprwapXxUk7lDEqIqI+oknq17OmBX+1Is29A5aawiqRCIaWApikrT36LR6u45xMfHx8MjSb1NvaJDr62+Twkz1lJkz4zSQTcXolcXcSJ0uz70OVNqjS80m3zUAee7eRCdNMciuuYup9hY0w/5zRYJSqfUY3V758Amqq3fLho1nkm+73grsxoX3x
qrv/vbwO5LPbThybFpc7e3tyeOaTpCaRrtrgWM7CC7kBY4Vy46sL/85S+frZisdfS6VqxzUbgpglCzwc63vEx+a6Kv6p9p31dfffXZAW5GClODHexd6/QYj5CTixdF4BRDtCQy6neKnzFXl3ILmahIU+mc9LWOjlHkqQPoO9FDDjF5GlFVUtMO4b/z2LhCr24c1TFJzicXI7xFKkKdagOqejA5HLnCqWONMSQgZyfq1PAzJkt986j0/q/+h4eHk82czfs5NKasRGe1V72S67tS/53XybnpUHb2IwcWmura7DIg06KElJJBMjtpfuqDaae0jMFH9G1wuoTALm5k1RnIa/S234TStaYWRos4G8nwrvFvuSXTPduZijoHvtbpyqKkrM7Y8UkWCl9d9p9vpjGS7wQ8EcB08LuUqHHlTEMxOUxX2JLnJJJFl6LS7ptcibI8t8Ug+Xld/VFmfadcko3yqr6irY5NhKfyRhyn+M5Xvz19IsOd+pPBRTiXVqdzBrvamVxr8+pJLela91Vfbab7c6FkrXXyKJ3yUUdEYvaxOT+nh9MB6EyqMw7MPhusG3MyEN05d+/fvz/U07XxXnd3dwd05uEBzafy0Ef8VST+zc2nLQCtRlp5T+xXhNMqS0S5D9bWMdMBI5npZ+mH5+UnoBQ3YVZev359+LvUUUPQce2E4wpXRacmmqh/pnH9NBZfxCrRbNt97u8mX76usYQSciYquvyjjiJEZhpgwFGBdD7JSj5NZ6rcNP7msWNWkmeGX4o4FTan64KQfbUeHetME5OJNMDj4+PJo0UFY/ckGWSsT0dt/51Dx57sG69Ox5Rvt+FV2eeks7lzxPZOlzsZQ2eyy0AECNmIqC7ZdO38+/7+/vBSD6//8OHDSfAJLDg/Ux4ucj1XftBxOj4P2SRN6J1yTC/eMnUTngIFmU0BJPOtdw6kCJoRJRjTBceQ8e3SR4/20AA6nWAaqJtf1zo98bT0s2tzwBpCCEHiWm5kh2BcRNBxJkPhfvf78lq3XUyUMJGf6CYkaLuOY/IulfpR/aHq7otXM02f/UoHcsTygvJV6Vj9VQbOp5uEp9Opr9VR8JrOun4ml9l/nwnO+UodeAyUiyE5z12qONHSrpg6znQsFD5pIO81yBqkdGLWYT1zDtb6ZFMhXDMpeUFlkCxrt6A80fOu/KATWVMWU5sEq/D0rDkNCWUjzuy0ApL4E5EVNbtWnqCBWmd9Kcefu5r7W+egcKuzSdbJzYlITtMg5O66tnpU7sbaNSpIshZB9YCu5LcBoXs+fvx4QBkT/icfuaAM9+HhdPl8OuZdMT1xjDpoU2uVdyp5RmUwcq7td7zMTMudH+chvdEZ+WBxbeSANP7Qktf1nUhfVNy81Y/GXR3v378/OILkITJ2n5u8lTJXl8uc0ou4OTkwnYOLbSJ5A/+7d+8ObajX876bm5uTM+3sV+2qDwWB7NuVe5H5uXLRga31KR3L+6fIEnJGJxXj9vZ4FE6TI2qrczvYG2dmvf5dpHPVRIfn/wm5a1XylEln2wRNwaV0KXl1yQ1OhU1eM30w4iUHHamnX8xn3mqnSTc1ru7GJioTpYYypsOcPKNF7kaH1v0qnkS5gUnn1RMVoaOQWfeXdhgolcnV1fF44uqegWCmlhpJK7Iugsw03dQ9vYoKkcvRWL3fIDH5XukIOUr5th3XWzBKL+Y8Nc6JwMyKduh/IkznoWDW/fFdcqP21WzMfqy1DhlBbalftXt3d3eY+92KreXZjaxv3749UfqMs87MDgnnFcpa6+Q87LU+OcfdW7mvrq7W69evD2mlfEh5dX1oQqeTM8rUVyerqDTPMfK+lDNhZoB9rpwapxPW567uZQxTNl7vsco5a2WzS9tmGuEqabJpeV/yWMXx+TgdViWHU/31xxM25PAcl6nqzc3Nevv27cmYHh8/bd3wZRc5tEoBsjn7+uuvT9DguWgdreAWjWRWX+V9crrNRzvs5UYzavngbCTUbn
o5s5XpeLIr7SEHMAl7nfluv9Zapxu5+5mLMSInrysw5/xMKysBG9vS4eXgXKCwveiY9CnbyqFr4z+axO/5vAYgwepWAT16AyiFyJkldIURvzbhZhM1rxWhNDBX6pqAhDkXGeQIdpFLPq7/4+4q1ZkDlBfM+V1fXx8chcpj2jDRie3rtES4RqPJaRg5J/qr7pyaxLnc3URLykk47yqhpK8p7iT856rS7uW2GphGnvMNpagTBibpDYNMfXIxRe5Q/aivZRrNk0g5Z+XKev3NLtJjF5p2qd7kGLMvnZ+6airY/+m1pHvzWH0Gvh3Xppyau2Si81X/svWJeKUkpG2yj8fHxwONdHV1dcJzem99ulQuOrCf/exnB8U3osp1yInsnMMlGJjCxU+l3BqGBhpRGsFXlDZXdyHAFZI+s1+mYPJfFSNyY4hwlVeqf8pgpi8S8k2kaVvyWOv0BR1GcudAp9Mc9FnOsvFlVK0EunBR0Qh1HBOR9lsjnal1shTNJAMfo5m64SqmjildSEecR1Nv53Ot0xViHUaymCnvWsfFjzkHk5tKF/tuOhSdSg7esTYeOcv6Khqv7NLhyVmJzJKfNpl8pvym3Trn2sBMLWdqOudA52Zw9QkKF1qmH/H1fufKsw9zv379+qD4RqMdDE0gooMmRJ4l4X748OFkR7+5vfXZTnA+x5UQhdo5mSD9dDLznXwJVodU/6tnGnvX2l/J/hTbz/09l87X+vxRkNrqPpU6Q5ltiyJSiBxG8zBTUB1k81Dgurm5OTg85WFU1qE1/66yNeY4nMZiqpl+OR/qwOPj8YUfOv/qMJo3FklikZuBYKZp9bM+1f/mSz7IYFERac25nenfLrCb8Uy9cK5DgBp99lMb6cLUFecpG5pyaO7mm5T6bq11wkHOur3H9rPb5GyG0Pgl89XjXbnowF6+fLnevXt34nndhGY+LWLSQFSkJjyhttRaJx8eHg5K2u83b96stY5H6Ta5kp4qZ5OiIskTKOApZNtpvKa3Xdcyb/VVVGyJzOoWeZRCzXRjrdOHodc6TqoGP9FS9+/4MZ2qxqmjdaUsJ5eTmajANLTvpA50SqIvuQ6dt8hTJFL9EvWNs/mfafxMKZW96aHOJI5ThDZlIgrXeYuAXAyYhqeOuqdRB5huS37PosPpHm3KOtR55306s35bR2Pqc1FZ8prHpdcfuTPT1z4LYGQnBkLRV/Pxozmw9+/fr5/85Ccny+k6D1dLEsIkLDOeSbBOdGC6d3Nzc+C82hwX2sl4clJzmX2ig+rq+7liMqObRPNMI6ojI9BhyK9YNJjqr56MYh5kuLteZ6qcMxwPizRomAaaYpdGVTy+pTkRYWjg9bu6q0uFn0UuVAqivtRe10ruplteI09TmZF/l45N45efnM6hOiaCaV6tu9/Zinag3cyNpPKD8ok76iUdeHh4OByWWFtPT0+fnbw6edI+25H3875dJtLczHtyepMuMEiUPdW2Y579zE7bT3apXHRgGUYvdxDm3d8fl7mFnyKwHdKZPM5ax5QvUraJ7z55hwTR6mSObJKPMx00glxdXZ2c02QE65o56dU30WXXCYsjKdf6/Fy0DL77W4Bwp3oOwzSriY7DEgnWr9rXqcq3Tbif0k8C1XnKkc+d0Y3dCL1LOZS5xHAotHm3zmQtEa3DF42a2qaDptw629JAFyZyTqbH9VMkJXrcoVz7Im+UgzK4No+iL1GSSFYbmMauzWWTkwPrmt3znY43mUwHO23QOWmRzvnvJ73UrvIhUlLWL2rt+ea5cXyWiw7MaKpxhcxMSXIgU3h9NyOsiwApRntsJFibnKKLfNf9/f3J7t5zzsdXXN3f3x8ewo2LMtVMAXWa3XcO1q+1Tu4pBalfOw6t72vbQwe9VvK4FL4FBB1VRuqTD7Wf4+oRGpXx48ePJw7EucyImss+qy0J++SnstY3T6Oo3R6QV8bJsWvm0cRzDgyqkvPqRz+R89NwNA45TZ3adFoFqbaPGJ
CUuwhQ6kNnUXra/GszymM6zXRCRL5bSAh16/h2nFrjjqMyNXbc6qLHaU8eWqemHOub78NoDrN/D0ZMrufKxYe5dwPQoFReicCEMh92NWUwcjowHZ85tg8zGyFqy/RQI1jrE6/11VdfnTzv1hvCjURNkCjJNHlOfH2XOK8UYVNkjW2t02cFG/9uoUNCM4XUmRkBq1cZi6RUytCZe9REvrXlfqAUUr6psTdHIaScXUhLhKfSu4BgKpnD0iDUxRCqVET9DC0a3bu2z6yruUpX5aukDFwQuL6+PslMkpW63HhF+LWto1K+ORuph5nWptsS8c1x9esA5XUrs91zjlKn2BjSt/r69HQ8qUPf0JwZMM0IJjhyH2K2eYn/WusHPMydojRYeYYmzsghVJ0EpcR1hlkaZGopN6FQ7+/vP1ullLAvrdQp1bbpRdHKNlTY7tER7pxX350I9Pr0GTodm8qgQhipWpXNCCciqy+mCypWjtE0dvJ69aHnPUtl5cxEOyl5DknnPOXcb9uZJHjXz4UQnZHUgE7ZdiSrr6+PL/uYjs5iXc1xaU312KeJJqUS+hFN979/1568j3pqEDR4Ordd53cziE9nlvMqHSuT0SlMR++4dWrqtPZU6XrHojNf6zQzkqpIP0xlpwzOlWc3strpPjOvlQexs0aH6om7qcMJIgcparN4JveEwwqjCclZTkg+YXLXWxTWOac1eS1Lxmckl/9a65iSzCjU4y0i2Rx6xt7/IjJl4Oc+kC6aq70WSrreXeDt0hd5zu00zeHcujCdhKg2JyjfY9pmEb1OB3Dud7SAvOAMTqLd+jUPQZzkv8b4+Hg86DEDLAXse4NfdaozM0WbSHIubFSXMhJVW2fXpiv+nb26OTv7DZlPCkiH1+duaZnzWQpYX1ts8B4pE7MK+UYXks6VZ0+jqCPlrSml0bCHaSUQVXqNSnJa8j2Ho3ORELU/tfPy5ctDnxTY/G2krs/97AjZ7vHaSgrqNXOVq+t0XvM8sXlf450krA6n/ppihpR2PIEGv+PgMlyd2OSYvCfFcnGhfuskJcn7viKqNi1MzhLvylfD3a3O9btV84xWi5SRVAAAIABJREFUxJRDLBiKwGp7Oq7qVV9De1PupdDVIxrWuYq+01sDmhzqRNKVmfpJr4giZ0BuLGZKBlYXWqSFqts+5tSnXuasAjkF36en061TrponX4N66PtHI7A3b96ccBaeqW4auSNYp4FpHAksgtlXtwlFNQh5sdoupfz48ePhtff9SBAbeZqcXVpoWmTpTKPalCcyXer7tU73PRVx5GLiT5Kh8qjfGohk6Jx858h+iPx0jCmGnE59dBOwZRLWa53ucE+Zpwzra2WnjKJ1x6Zxhv5EYnIr3aMOumv9/v7+QJpPpy8yNPA0Z/XF7ED9LgCb8ubopuwz/LXWyeJShLh8ovLRiU75z7RXdFk/ldVap5u5nQPL7n/ft5BcG2NjmauT6Ygbbxur96n76eVzPNhFB/aLX/zi0Emf7VOQrrAoDGFlEz3z77xtRlNkqO6cUM+cff/994cVxLU+PwRtrj5l9N1vJMnhWeTOfDvR7e3tyZEmOotpkPM7lWo6jJzqXJHMeCby07BsL9kYlXU4L1++PKADebLuT1bym8lupvhyFpPTapyzeG3lHIqqjsmd+GbnHJl8nG0VHHIYE9HZnroi6ljrSCSbhjWH8YbprimaMiirEF26cFH9PhtqNtE4pjOxnbmoUrvKw7FPHnjOYU5vykkqo+9Mz0VRBqC1jsFEvnutY9rvfDr/c1P3LBcd2Nu3bw9LmyKYhDa9fIPqM7kyO7tbQaoOl9w7WVV47TYBlW3m1L6TcL6fsEmSwOyz2jZqTmEbVTNwlU+jyugnX9GkTQ4jY5M3kqh2HHI5IsEQY8YehzgVp7Y9SdPfk98y2k+jr5QiVNIHV5QnWvQFEgYkUwj3oeVATM+q10ChA6x+nXD9FBUp1zkv9dVUyvRwps6uWNZOemGaGY
I0kBj4znGxU3d3mUCgwHRPp2OmUz2Cj+oITblZtvtnXwQPytVFFp2tKL2/Zxp5rjx7nE6IZ8LnJqcOlQJMUt98ujo1+CZ4Rpmiro6xFEvuSgEVIVKuhKDjrO4JoR2HdRqBd8IXPZkS5nByQqYcTYibXVPanI98RGWS4zmFZNhqmrKpLk9ldd50rPKb00k6Ry5QOCYXMNY6HuqnwTcOZdTP5Dgdw45bEyntFlZccHBVt2vmW+Ez5kn6Z9CSzD3TaarjVgKDj0S91Is6MPXY4vgs6obUS2OrPDw8HLIQOUfR0kzT6mtbMPreM9uaLxGdTjJ78r50dqa+Bp61Tncw/GgObK2jcXX0TRXWYdOu0E4w071XM80QGhZhjcCR+iIChZIx7RyOR/dahLm7RxRctdHYdyS07RlFHaOpWtcow84r11m3wU8Ep4PLKI36rR6tdTxNwXvjtnL+a51uvG2e5ziSrylbRquRq4wzbdDJ7pCZMmvOXHEVfYq0RDM50MnDzvoNRH1v0FGmE0X7v1xgOplckpX8p4sKpqG3t8fz8+rLNG51Sf2bKZ5o03HapsWFMf+fXOl8s/isZx4W2f8iM8n7eMVkICXioosZxY/mwOSijOJrne7+rjw9PZ28Xt2XgZrfZnQSpwnp3bt3J0StAtXoZiSxf99///3J0cQ62tLRkI6kfPvIRAwTNk/i8Ry8nwpon/u+saTkKn2G2PEz9VUDjesx8hZVMwaVR7I8Jer/SUC3nF7quYPxGotpskEoGYm4qqv/5VAdS85AZVam6ZTUgbKv/uZ4FnmW5sH+plciAw0zZ9O1bYNxT11ZQ/NRgMp4DXLN7+Q4u1an5ftau85UTznVj+q/ubk5CRYCBx8FSzdEbnP+07ns01XtUFp9a9zutH98fDw5bl47aB4ucWAXd+I3Of3doFLwSkY+lSQFaYAZh6mER/sa/eS5dqmobdvWWkfiMoNyVcQU1LRGpyG3U/vWteOtTFEbm/1T2VLAHJdONKRR1I6MV77Vk+wqpkwRx/PxKCH8VH63AMwVUr9LRn3vfCmDjltJiT3FVW5wysjVUx2retl4dK739/cnJx2oq6a8c9w6kFmms3H+RKbJrLlqA2mGaNDzWmWo05O/dB6mvs50t/7VJx/ZqWhHLuJYpAfm4oc23WfZbXan05po7+7u7qD781ip/i4bmSh6lh+UQqa4EtdF5hTOhhWCRtdg+mzyajlMV2EUQMKTb0t407GZhzsJ87P64l6VHYnf9VPJu+5cnq4xVkwlVczJDfp9TlVFs84MormYRrlbbbJ/54rG69g1yGmk861Q9cn0qHodg1yeZcq8+de5rnXc6lGf5AcNwHNOJlGuDu/Qp6jEa+W9cgw5JhFH+iUfOudnZit+t0P9orMdpzXt0wxiOihlNduuCAji2N69e3eQy66P2bW6r8437nQ9Wf1oDqyla1dg5IXM1VPS6RjODUShNLGetCrHYKqSIxVFyYXZt7WOEUayUchcmSuUOiZfItvPbhHB/yehH4qT9JVL6l45v+4THU4k0vL7Ln3q/why0yWN1986UFGk8pHjUfkkfa0rfZjOZBrvWqfnQF1SXOuv7MYy2+6z7s2JzCBZMR1XnsnUh8S9xsCbUU/UZZky2aEObdFMJqeZ4cvjzvsM5DvdFaT027TZ1U0XNSoz8E9y3yyhuc4mQm35gLkiuSsXHZgIJ35Iw9VJyRfU8TqlIGd+7sQblYTGveBjrU8vE5DDSlgTOc0xODESr0abCbWre24NKPKYtk5OrElJueT5RCKOvf06RrEdWtRw+87TDIT5KmJOM3nIr/QT0qoPOn35KMeZYrahuCIn6dyn/HJgztVMW6ZRJG/l6RycG0sGkWxy/AYZU+nSGOe576tbZ15fnZ84zPo/na78nv2uXEKjycW2TYvTaeU2aZKKXGj9enp6Onn8R6fZPFVElbYVOgtRJY8ZSEVccr1yxbvy7EbWtlFMIwwNCXONdpXp5UUyCn
9OmoK+u7s7GJLbH4wmsxgBzME1ksZgSttY4srcf6QDn9B2hzS9Zipb5xxF2LpCKNk6Edda6zOHM+Wqo5skdqtKbSicBpNRejTxWscFgxRNRNO4vvrqq8PCzVrHbS0GvfTFvWqVUM2M6qbZyci9Y3JrXi93Oh2HNISOP3k0Rh3VnINdmee4mybXH1FIOiKFkNEaNLQj0WNFqqPfBujKDLRmLZZ0PbAwU33/ru/q2e3tcfO3uxEKCqLgrpmLGqbc58rVJe/2z//8z09fffXV+o//+I/PuCije5M0SdIJxxVin3uvxmCUa8BGFutyNWRX5qTtihD2hxQj3DkHmiE+lwpZdnzNJRJzyls0MPuzQ8JTnumDfdZ465NOxvbm3Pb9lMNON+TGpmPbfT6/n3VP7nYGyRyFqEX5KQsdVuMxwEz00f2O2f8v6cW5eVSez5WZRu445R9y/3P6vSvTV+QgDXjpz7mxvHr1av3rv/7rYfvW73//+21jFxFYe1SC9MFpXym21jGNqLNt8uu6mUrqpT1QUEeWwH1vo9DSiLNLbediwkQUIhV5Ju+rHiOU6epa62SLxUSbcyLll/p7clfB68bf+Gaa0d9TSSea0ggLOjoYV8RaGZroqvkXfU86QDpBx6YzcJHB6yZfKhqpHn9XREnqlosIu5StuZBgzylblym0lEaowFVJUdpsxzS2Mvc91vdQq46xupLnDv1Nu1CWM9Vba53svdrVY+YgjVAbOzvqeq+9uro6eRdp/XKMk7et7BZQZrnowO7u7k54JVdQNJb+V8El4xTiVOR56qb1eH3tTMexi+o74/KRqIkyrGP394483a3u2eZcpVJeKv8OPU4nJSc0nVSfda0OsZ3mKrV15/xEI7tnz/pOp6uhZ/wz1VH+19fXh1XCGWQmsdwYn1NeH0+azq/P1JnmyNXC9EXawnn0RJbHx0/bftoWoh7pXAxcBYaJKCevWVZT32aqPLnjxjNXFR3rWqdHt/u5KZ2fe8Jxpa0oM6Pob+tPVn7mvriZgmvfO0Q+aZRZLjowVwJEIZ63pJCmYmvUkzyf3te8v7pMDZzUuf9Jx1e/5vaM+tAWkAl/Jb1doNhdt+Me5uqmzvwzocN19L8cmGOeq51GbOdFRciQXCnqmhzEXOyorxpm7c7UWm5iprxyIgUwHZt/TwV1FcoyEWH3miY6F82fiDP9m8hcdFGf+zv+x/lNh9yW0BHrNzfHzdTv378/GZfFhY+ZpqX39UH+bs6LgXTq6URMft79a51/s3f6Egm/c4ITPc4Fs5Bp6LTvJq87M6Zk1PeXUt5nVyElGlNIjwiu4068TqxB6FTMxxtc9ekAvE64LnqZxOHDw8PhuOg+u78/vlrex3dm9Go88nkJV8Nz4jJQI6r37EpBIQdUPUYnUdo5wri+d33nzOfgVaRO5VSGtmEf5pJ7L1GR/yoqJ1u5DfufIWsQ8k0PD8cnHSbqyphFOFMfHh4eDpRFY3P+0hNRmQ6hx85qIz1Z6xOFEo1iMNbZPDwcj1bu713gMZD3v/X4+JTEtaDAN9Jf4qesP1v1c3VHBxgQ0AbWOn2usrmXilAm2d/8Ph3Qec/+ivJ1XDrWWZ5FYHVkGrkkoTuOu0bomnObz0WamhZ5Vbbum29KcTe9z2qlBCKo/s/A1zqekV8bprsiBpeBLxWNsLpS6EvCt0wU1r05aZ32jJRyCD4HGuzX0Zw7niTl0cmEXDIix+jbYuZ8pXyPj48nJ7vWhsioNtdahzZEQNNAHWt1uRA0VwwnJ1PRQAy+GtI07vqafohwJfZF6Mnel9JU0tsQjvd2Rt7kIuvPWp+jp8lnrfWJBjp3QohZwqRE6qfB0LR0d8JL9nt7e3sCICrNr/qsE1bG6e1fhcA8omZGjgav4wp91YGMwfy+Qelk5AmEmCrlXFXa8RWVmZOvdcoxGJmNfLVjpM1B19Y5VDW5sMlJnCvyLxq59+7I4drSiZgS6YT7LoPWmA0mPfojgjD9Xu
vzR1okmKejzSlrKKIJKYpK8jcNtoRO/NzVcPsWpyf30nWzXlPrnE51yNvO4KTDu7q6Ojj26dx3PE6yk4qRa3XM0wEbYBtzezUtPgfbeHY2MznedMxtPn3eOP2+urWtfht0JkXj32Zz19fXh6d9frQDa7LkCVR4YXJC0bimQIO0Rmjh5PTm1SMX5gtUZ+5uRPXvnJN5vxA5RTJ9EumY9jhOI2YrtFMelRkdU1wdomndLprW3jSGJtwxTHSjIdYfHWNz4yNi1d3Yk2F9r36Vci70XF1dnaSu9qk+VFftTS6wudhxSY5BJGwqKa820aeOuXu9Zp7X7/xPvtOAPOcvJ+T4RNlTH3YrsJPymMFUTkyUaJ9dOLDs6tYhmSJ6nRRKp4/IV6tz7iebXLhtFcB2jnaWiw9zz9xZuKxDmClmHfRFBxr7jjScKdS8NqfTRO7y78rs2/x+ktlOyuTTzvEH08DndpDJT+yQ285J5XgmEVp/i+6S6JPjqcyXi07npVHMvtR/EY99mESsvNNMdeaCikhzbpFw7vq+1cud4akPppDTyft5js9swX56/ySqC467VVIRcHM3sxOdcfqsjUyezf7pRGfQFlxU1FHvq5/q53TQEvc7BKSsSil3+lz7zeOswzHPlVqzinPlIgJLWRO0XlPDSBhNkkZoyjG5JSNG/0+DTFEmyadw1to/puGkCp13xjYdmbxX9ZxbpekarxXJSfArA++ZSGw64JlmOd5QZoaZgxBNzUin49DIRANyGucUbcq7YqpbH3fKmAxDZqJn5dC4vS8Z+zKNxrbj+WbaKXJ0QWiHCtY65c3mAkuy2p0a4vXpgqc22L/unfsW6599Sc7eP+fKAGXKOxcZrCfdya6VTfIqaE07qK3mI90rtXbxqrprr3uzv+fSx7V+wEZWo2mOp+cRPYFSj2xnJmrJIBNiAjeqeT64QlVhdVzVrUDqj/vFJvdj/zwDP0XQGU0y0bP7bWeefySJrJxc8dtxXdNZ6LCc/KK6Cy7KeK2jM6kPcxtH/fAdBGsdH+Z3kaX2+5lpqAFHg1OP2sRoIPS7eZ/OKQczdS6nZUC16KiTm2munJ/RX8SZIXtyScip8/Ez1l1pLPGq1TWDrDqn45r8pVxc36UDO1rF316zQzg52nRL0r46XKHNjs0ezBbSL1Pp6hPIVKSWfjQC+/Of/3xikGsd8/urq+P59CqGKKyO9N3T03Elsk6L5DxeZ66GyOuEFJ6ejk8FVKZj8Axzn4V0MnQuO9K/a43IGXlKkwOc/bcop36/f//+hDtxzNPJNPkZwVyhajw6vX5n1Mm6uaov33///cmr3ZVpjka0nHwcp+m+ZRLabXvIGZnOz7l/eno6KL5UhahAGSSbHKHjmA8mz5Rl8k+VZJvuvnr16jD+fiScnRfLzc3xjCupjeQ4V9tzaL2Axjp1Wt2v09ahVSTnRXpzcUsgoKyvrq4OWySenp4OutupHL5drC0pOTTJ+eo0iNnv+nQuEFiefRby+vp6/ed//udJR3Q6eXINXn7lHBEn11SZ/IppZIP3YdkdvAy2ioIm+moid32Tp/khJGKl6+d9/5v6dtzN7jtX0XZ8xuR33GbgNWutZ697rr/V0W8dg7ycFEN6JHLy95TVOZlOeYma+v8cJaFTnTI4Nwc7DrDPRLvWP+sqEOz4UVMzi5zZlMG5ck5m874fop/ec67eS/ec69Ouz2t9cm4vX75c//Zv/3bYGPxf//Vf//tnISPhQy4zndRL63GbtCZbIrq/Q0gp945fUdlEW0U9lSihuXBQX4u4O1hev6Zjtr+TyO23TlHObEaU+jFTuknsa3jzOyPVLvWyZIi1lWFX96zTYDSjuIi1ceuwRBv9L4LMYfm9r4Drc7+fQTU9qrhaOFNYkXGIwfHPtFfUKzeoM9Xpqgs5RykG06UdOv3w4cPJSunkhAuCIVT5pXTx5ubmM5Q79WkGAGXk5+r77vrukaoRjbfq6L3JW8S1C0rp1qROyu
7kR8+Vi6uQbQ1olaEGJp9T2UXUKZSHh+PO5XPIob9T9rkQMI0qotD0QcHZT9PA/rZPE0nN++VNqkNoXn+6v3FK5F9Cfmud3zNk9Pca07HaT17nIt68dq3TE3INNClvbRvJ2x+01uevVAv5TDJ7pjwT9eyciGM3tfE69aI+ekprnytPdWDKZ6ZoyUSnbBrdWKejU87KU+TmNc6fqbo2N23nErKZiG/eMxHYDmHNldLsy/mfgciFpdm/h4eHk6cEtJG1Tt9idA4Rr/UDHiWKD5m7hWelRrY8r52fENjOeriZxphwjLQ5DxHY9fX1evXq1WcpTNdPHkA01vjKueV5GvcsOkFR0DklmG9qafx+phP0usm5eWJp3+fAd8XI7JwUzUU9OoD5hnDl2tw9Pj6evDHbMfeZr45zfCLY2p+Bz7RWWUsc6/ziY6pfuUyi2TpzfrPorKwnpzgNfTqu3XyutU4cZ3unlItjtJ9zsUHnN9uYqGvqW38LLGy/72cQ738DxlwgqGi3jdeFO+c6EFGdjf05SuPZjaxv3rxZj4+PhxXJILAE8FxJk3xz4C7x7tIsBd5vFWWXXniPUSKhiR5n/ZMwffXq1UlfvK5rJtRvTI1xEsdrre2zYVM+poQW04enp0+bg90QuDO8Wb8RMsWpr+fQssGqKOiCxySP+z4HEUqUgJ5ckavKBiTHrswMDo1D5yO62aVKUiA5pebOFVbTo3SkR2p8VEuH0v8GohnMplNpDlqJDCScG4MroPbZuZZHMtDO9uczkum2qWlOpb5MLq6/k1eLLc6DY/GzeSLr7e3t4WUfM8P60Q6sXPb6+tMLbl1ClVfq0DGv7/5Kg8qRuSIo7J7pXo6zfSQZg6toreapKBlWXt0fJ6giD1EdOQz7XKmNflqhSj7TQRs5va7/Ne7p6BqLqVqy0mAyvrlgsdaRrDfS7qKxUc/o3xykiDrcUood6ulRotBM9faz43N0LDvuTp7l+vr6YGS7FHoiQ/++vb09OZm2e5wbA4VpaTLoYfX+99Ep21nr9Awsx6Tjm3roXBo8GovptG9YbxydfOsD52sdkbz11kf1I9Cy1ilFUOYiB+aTHAWIgoAO2EUV03MD5lpHfu0caFnrB+zENxoryAxFwRkZ1zpVNHkbHVkCTcBzI+DHjx8/O62gOp3QhJ7Sud+lyTQ6hQB0AkaTacTm+n7ffc/tXH6uTGQ6nZPOpeIYa7dxFhAeHh4OJyyk8PXda/pu9l2iujloPk3fdinsrq75nXzKWvuVuMmB5SQndzb5xXOrmsp5UhMW9VVUJfo0DU82kycS+Wmk1pUz1UEpJ5H0LoOwHWUx9Wl+P+Xt3+lFOtUOADOQ6J4oAO2ndqc85iqu8p+UxaVV4bWeQWA5ExXcMlfhVIoi3+QKigrdP/P+hDDhrSmIaMKoPtPNwyBR/L73pSCVooD8TJ9NJz7Hf44HOMdNzJUfJ77J1CDP9dU6RS3TCQjLp5HUH52fq7JrnR5103dTsYqWppJ9LqFbHRNxpg9zFdaA5CpncyIfaGpaOxMRpUs6fB10bUeFZEwuJvnbMRUQXGVX7pPUzmALkNIxfW//pC7mnCfDykyP/X7qqd8bTMsestlQbbqh41N2Zks7BDWD1tTv5k20tisXHdjvf//79eLFi8MGwGlUwcXSy5yCXnOinHk6o5MeCjNfdoJ6kDtFLZWaAtqtak0+KyElwBxHfd2lcSpj/W9rh+jNuoXHlRTkkrNTkWtL5dPoZ13yJDsiuHpCAd7b/xrgLCq07YjOpjyMtNfX14dNs31WZE8nDETp0Fw88m03ax0Rky/5EHkXmNwYW/u+QLbr3Gm+MyL1rnkuBSuV9MXFE5UULHQY2sdap0+IOJ+7VHPqz0RdOaSpE7v76ufkkdMLX8qSoxK01GfnVLRrMGizvCXKZi7KzXLRgf3d3/3devv27WFvVUqmV00Z44s0kCJvDjBFTQj93SRqFKYYkoVGyhTrHPSvxFkk4F5lnh
HJS01OqT5MYrbrMnYRTtxA/zuuHGVvbLEdHapKNIuKlUw1erdDNLZ2S+s8e3t39zrOc23vSvM4kZFzlUKG3CZ6zfGV7hbBnefpRDo8UKSXwRhARQ/xoqLNtY679A2icwFkBkqdRzSKKVAynKhKPiq5udDUok/9ukRJpAsFm5lymqLqUMx6qkN6IVvO7tW5tdb6+uuvT57SeXx8PFlxVjYGSVF9/WyedMLK7EcjsD/96U/rm2++OUnr8ohN8FrHZ+bmA9BCyj5rb5dIZxLas4gCJEmnZ7+9PW6cc4LkT+pvdZ1L41KoIv+MdO/fvz85akVHNlFExjGj5FrHqJgyzAne7ZlTKZODiFUDSi5zkURit+AzkaQOdZYcsKnEdJprndIJ9sf0IJmEsHskpXmbL4CQT5lL8fXBrSEFqqkblnQi1NR16bs6MNMfZVXx/16M3PwkCwOYDkjj9XnjKVvTfFN/dcEAZ306N9vOkalTyiW9/vDhwwG1yn1NzlEnWD3KLsrGk3C1nUsE/lo/4GHujHTmsxrJXIHzO4VXZPTUzR3H5j1zqXsKQ+OQ5J/GnaAymCY+dCb/1gQ0uRmAxjr70rjP/Z0B6PQ+fPhwwgeaxoYSVIRkMJ2BvFMp7Q497dLF6lZBTZN1nLbZPfVXOSTPDL92Uvh2xxsM58JM85l+yUv6nfXmoGdKGaLJSaUnIc7Zf1GBf/fdHPPMJpKPvGhotzpc/NHx7gw2/ZkpuRmDyNK5cTzagbSE+jMdR+NpN8DLly9PHHryrt3r6+sThz2Div3f8bST1rmEvtZ6xoF5TEkGPdOMGpaf8bpzRO8kH/u7dnQ4CnUikQzAtHWS+Sprf3sSa44qJVCoRrbJgVl2PFVjmumSf5tO6Kib+Jm6zPRERdBBhahmP6dDnPM4of7uuo6mnnOevOU4TCcKYCp37eZUlE9z55zPVUfHUv+n0utIvK8+TELcotE71omI/K1jnES6lEhO3PRJjrf7al9ubFIHGfzUP+dRdKeOp6P1TXRWW6JyAcLcKtVvAUh1zaDeuER/1pOvuITCnl2FTLESnNBeIRS956GEOipJ1oQiATg5IRHOTOWqs/sT0PT2rpwVOeRNJG3X2j/eoeJUv6uufj4dVuOYJKXjkn/ouxysZ6mLUEs7d5FbxyN3mHzmgsKuX93vfiGNpzHpFAoos4jeZ+pn++qW98p1Wk//zzI/k49zbBLpIpbpsOaCRfJ0y0864mEDIZqJnERgydM0fKInN7imn7vAdHNz3FM3OdtdCqcsZv/kq/1OGWtHyUz0W3+qR/AS+kxu0ij1NXByiQN89kDDGQE6CmWHsExvJsJq0kVV7W7++PHjye79JtDOTyLUlM9oP43Cvq21Di+ZKH3p+klm7tJB+akgtX26uTm+Nr3/J1oyfTX65IwnWpAYntG+yGa/krEOsQ2IboA15VReGo5zZX0q9UxVJ+cpyet7IUXDXdvYdryWhtE1fq5c/OycExGJaFwGor6Pw6yfX3311QmP2+el4cl4OgFRsplGfbq5uTlwi6auE8V1/6Qw7GPXhcSds8Zp2z51UGCK8sgxWc+0u+kY3YLlizwMCC0UxCmvdeQLS1svrUCu9YwDi0xtopo0vXswv53yrfBpCAlaxKRR9ln1NokTvq91JO6noc8J3K0arXVMf+ZrrHI8CW8+fjSNdxLdjfnu7u7knhTZNK/PTUWa7PYDTW7gnLI2TxO9mhrJy9hvjal7fbHpDjnOsWUMGfxE3aIR35ko/yIPZv0Gzvrk/+fQQ/2SE6wu05k+LzDrLNY6Hpw5g8ckvCvJS2epzHVMlqenpwMX1vYEgcNMDU1R7bOntVTPdLJ9bz/v749PskiX1J5z05MH6WMZ2tPT02fHqrtfs7ZzUFEnLYZNZNp3f5UDCyHNKC9J57VNbh7WZXMnQYMx320g07C8toP3GuTuNWHuMZlKJsyeBvr4+HjitGcRodUfnxUrgtZf+z6dz3SQlmkUGYER23
QyeeRMZmpzdXV1OHxuvmhll3o5Mj7dAAAELUlEQVS2Gjh5kl06MR8VaW9ZJWMzEr98+fJANTQfBUJTinOUwVzdrvS6PMcYejKyu51ERyzarb2pB85/sqr4CjX77YLMRK89W5w+2HfRUO2EVJJ9Y3jx4sXJiSqvX78+XKP8nHP1SkBQX0wTk7sApSwmfZinFE89aXwuhKSfPj5U/wJEP3ofWM84zry8CZJvKorO975JVkrUTiXtf/fsOIntY1prHQjkBr9bKJhRMGGKKN1d7emRbnwVYfW/xl1/U94dST75ku6bqGDep+ObvNW8zjTDKCvKEPl1nwacce+c92wnXkanpfF7re1KC7gaKFepHLuuOucWHhG6CKw2Q1EajuNRttNpmmLtOCQRlePLyZjK72gJt+qo7wZ4EWV9VE593kkq3ZOTr81zNEBjlM/KCXqOfaW5ag5KvXeLac5dY2zc6VvXF2B9FrP6fvQ+sF/+8pfr3bt3h/9dls7j1pEcQx1UyYwaCaEJqiT4BOtu8r6fPNjMu9c6EsbxGkX53fWTPzFlmasrKmjtNFY5ld2YgujTqZWKnCumOipi8un+maYqdw3COtc6oifnqzFomPGeM7WQL5zp3lxUub29PXlL9xz3bqOq17jNQlLduiatYKrks6rTSShT09X0MSc9+SidtXMvgppZy6QcPGzA/ldn1zRWg5rtFtT6maln13ddTi6H1NMJ2U5ofT4JYDpZOzNF9bOZqZXeVvw+FFYqGsi4ZCMXHdgf//jHk0c+8sbuVl7rFL3IXWXYM11ygnWAT0/HJ/1nGjKFsEMsE5kY2et/pbYsjUlkVrqTgQm/p8NobEa+169fn6RZte3vnGcKqjHMdGSOfzp1ZaKTST72003Bfid6Sy71pWunrOXpRH2eYGBgmCmgXMqUR33wO0v6NN/hqMPwyYf+njxjv51P+bo2W9bePEUj5DL71/86PJ2JtMPNzc0JjzqdoT+h3v5Wr/xflOVpEdfXn07yKC0sczHjmkFBRDZ1r2C3Q2ei2dLi+ikYurr6tDfTx8Em/WK56MC++eabA7TL84ZqgueTG+k691tJOq51XC5vMKUtL168OJw+sSOPVSzThUrEX4hAVJXCTlQ20ZbOV6JUfsg+7NCd0crXY03D2qE55ZnRiIymcZhKzdUqyfSZDqWkM7obeObcyhX5MHz6EcoR5U3eRxnteCxT94lqRVCNybbmQoULIlNPmgvnqvrloWZA8hwrubLukwMWXU2Oa1IUjfHDhw/r1atXJ7qxy1hMxXTyyeTjx48ni0aVNhE3jnbUByIK6tlgn08klh75GFfOMbRcQJqLM/VfpCiVYR3PlYsv9fhSvpQv5Uv5/7lc3qf/pXwpX8qX8v9x+eLAvpQv5Uv5P1u+OLAv5Uv5Uv7Pli8O7Ev5Ur6U/7PliwP7Ur6UL+X/bPniwL6UL+VL+T9b/h95WY89BzyAGgAAAABJRU5ErkJggg==\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + } + ], + "source": [ + "show_edges(edges)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can see that the edges are extracted well. We can use the result of this simple algorithm as a baseline against which to compare other algorithms." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Derivative of Gaussian\n", + "\n", + "When an image contains strong noise, the fluctuations of the noise induce spurious peaks in the gradient profile. To make edge detection more robust to noise, we can smooth the image with a Gaussian filter before applying the gradient filter. By the associativity of convolution, applying a gradient filter to a Gaussian-smoothed image is equivalent to convolving the image directly with a derivative-of-Gaussian filter.\n", + "\n", + "Here is how this intuition is expressed mathematically:" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "$$(I \otimes g) \otimes h = I \otimes (g \otimes h)$$" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "where $I$ is the image, $g$ is the gradient filter, $h$ is the Gaussian filter and $\otimes$ denotes convolution. A two-dimensional derivative-of-Gaussian kernel is depicted in the following figure:" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Implementation\n", + "\n", + "In our implementation, we initialize the Gaussian filter by evaluating the 2D Gaussian function on a grid of the same size as the kernel. 
The x- and y-direction filters are then computed by convolving the Gaussian filter with the corresponding gradient filter:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "x_filter = scipy.signal.convolve2d(gaussian_filter, np.asarray([[1, -1]]), 'same')\n", + "y_filter = scipy.signal.convolve2d(gaussian_filter, np.asarray([[1], [-1]]), 'same')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Both filters are then applied to the input image to extract the x- and y-direction edges. For the detailed implementation, run:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "psource(gaussian_derivative_edge_detector)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Example\n", + "\n", + "Now let's try again on the stapler image and plot the extracted edges:" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAATAAAADnCAYAAACZtwrQAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8QZhcZAAAgAElEQVR4nOyd224kyXW1d1YVz2w2+zTTI2E0smDA8Av5cf0EvjYsyxYkSJbm0D3dzVMVWaf/gviCXy4GqwXr4oeBDoAgWZUZGbEPa6+9IzJz2G639aV9aV/al/Z/sU3+fw/gS/vSvrQv7X/bvgDYl/alfWn/Z9sXAPvSvrQv7f9s+wJgX9qX9qX9n21fAOxL+9K+tP+zbbbry3/5l3/Znp+f1x/+8IeazWY1mUxqvV7XwcFB3d7e1na7rf39/dput7W3t1er1ao2m00dHR3Ver2uxWJRz549q5ubmxqGofb392uz2dR0Oq3pdFq3t7ftvOl0WlVV6/W6ZrNZrVarms3uh7darerg4KBWq1Wt1+va29ur7Xbbfvb29mqz2dQwDLXdbmsymdR0Oq3lctmOr6rabDbt+lXV5jMMQ/uhz6qqYRhqvV63459aseXa7nM6nY6utdlsuudut9vRuKuqptNprVar9t1qtaphGEbj4Of9+/f18uXLOj4+rtVqVZPJpHutyWTS+qmq+v777+s///M/69tvv629vb1aLBb19u3bevPmTd3d3dVms2nzvb29rYODg9putyN5eQ6eB3PYbDY1m80efY+OU57IezabtTlwLvNCNsMw1GQyeTTfHIc/22w2NZlMRv3xHfPx9axzzudv902fq9WqfT6bzWoYhloulzWbzdqcfG3mg26Q193dXbNh5MGc7u7u2v/YBn3TD2OfTCZ1d3fX/MVyZly2d+yU8U0mkzZXbNry6vnKdrsd2T76tkw3m03t7e3Ver0e2SU4MAxDnZ6e1r/+67/WTz/9VJPJpBaLxYPQbde9D2lXV1c1nU7r5uamlstlU+B8Pq+9vb0mgNlsVvP5vGazWe3v79f19XUbJMaPwiaTSS2Xy1osFlVVtVwua39/v00S0MJhENBqtarDw8P2/f7+flMSfVVVE8Bisajlctmcfrlc1mQyGRnDer0eGeJqtWrAYaUxNpSHwuiHa6JElIShrNfrESBst9sR6HkcNgSOBfA5FmPgfMbtuSAbAyhjRCc2HPqh8TdGxhzsHJaJgYGAgR64Nk4KCBkQkBX9MI9hGJrO+Y28DULIw985qEyn09rb22ug5+/44X/rwr8dQKx7BxQHR65pG9psNo/kbFvAbwBM7BVZMQcasvT/XAdZ83cGfdvG/v5+A1PGzv/0a9khYwc05utAgQ+hO/sz+pvNZiP95Dh2bfXaCWAnJye1Wq3q6OioKYWB7+/v12w2a863t7dXd3d3I0CaTqe1v79fwzC0vtzPwcHByBAQ1mq1qtvb2xHjw7FRhvtKJYHk7t9Rz8p0VLDS7PyOMjgI56AkK5Y+DD42Ovp3v3ZGMwzLq8cW+J7fNojNZjMCR+Z4eHhYd3d3dXBwUFUPwYBm0PC48zODQNUD68KBsIuqGsko+0CvZl6O8Hy2XC5HoGkGZdklKKAjZGHHQi7L5XI0TjuagxRsKuXO/DmGPtGJMwGuaxn6GtYt87fsbDcEfQdmwMB67AVe5IHebHOQE+SCbMyMexmGfY3rWRduDvb8bT+17p9qOwHs/fv3zQEZGIK6urqqYRhaCsh3AA4CWK1WtVgsarFYtNQQ9Ofvu7u7qnpgEkZrGzVMy8rC8Jk8qe16vW4gZzrbJi5QMDByPOfaCFGoGRXysdPwPf1zjo3HinGEXS6Xj9gcqTPOmnOz4mkJzk4xbDhmRT4v2ZnTx6p6NJ5M9ZyW+4exYi/oOFmT2S7NOjQAmhHMZrNmX5naG5js2A5WZnAwFc/XqaLBk76Qq9kY1yM1dFBNlpdyxnbIgAzs1gV
pv+0CEE3AS4ZLqmn7xw5hj5a3/Yg5YgfWc0/uyMLM0nI0c/wc+6r6DICdn583FoRRL5fLBlAIdLFYjIwSByF/J+ozQRuBc2sml7UCjJz0ZL1e1+3tbRMQgvbEDw8PmzFm3cQOhmO7TmDnT1aWaUkaAr99PjJIA3Af9Etqzmd2MObo83xspmeOpuiPYx140INTOddfPP6qB1Zix7A8aTZQ5MS1nLaSZmbdC/316mFmo4wHlmWmhQ5sU5xvQMAG6A+bN9CiM7NrzkV+ZltZckjAMoOzn1nnZDro2UyJwD8MQ8t0Us/JXu23LlVkWm8ZYwvOJDITMHPNtNw+A4kxCFsHZo+ex1NtJ4DN5/NWpKcGRQGfKDefz0fp4f7+fkPzw8PD9hlO4RyZiOHUBwETsThmsVi0wj+RMalqGjLnGpBc0zHYIMy7u7umeOgzfSTwudaA4TqKca4VnWmgDZ/zzJjcp5kTzcECowZskC3M1nUJIqwNnjGahSDjTLW5ZoJLb65Z06uq5uD7+/vN+WwXmVagY6cs9G8GjpM5WBpos4Rg4EROaVtmXHl8D6Bt1wYFjjGbMwt2EDXLB1AzLXamYtDlfGcIlFMYF5kFtpCpHw37TaboudhWnM3YNjiP62cKb3u+u7sbEY5dLGwngAEiCKqXAx8dHTUw48JpEEZkGyr/54qUDXFvb+/RggGG6XEBsK7ZcH5GWrPATDOPj49HrM5Ak+yC87gO13RzkTeZg50sz03jMEuw3HpplqMov5mPr+Fg4RUmjnFkzboE7ABjtCxwCOTeK347HUuwTBZr2/DfDlB2fKfo/DaDZC7YYrL+dFJAwONCf54/bHt/f78ODw+bnu2omV4D3g5iMBX0jn5sSwTXXprMGJA1Qdm6t207hU9QYRxeUaThWw7CjBkdGNgcbLh2LhTYNpPR99pnAQwDOTk5qeVy2WpUTj9ub2+bMiaTSSsKY+QwtuVyWbe3t6OIOZvNarFYNAAAANfrdVsMsCHhPDYgxpqrfQYJxmNjT0YI+jvaUbdglcaG7ojr6zBet6wdZDrhdMRzNFD6x5Gc/qw31xVsmMzb47cT7mIIfG9AzbTHDIa+uV7qgbkzP4IJx6cMkk2ZFWZKaHkzfo4zGJAeuTnImLXBcviclXJSchgwtp61smTe/GSth7Fjh84oDHxmfslUPCf6J1iYHeOL9OOSiQMNJMHBGAywnrA3bBq5IWenmNb5MAwjJmiWv6vt3AdGSgfaIzSofxac7XwYNwO8vb0dpTZWlJkJq5gInOPSwblmRgbn1DacpPo2fo/DKSnRr9d/pjemyTaaBLKn6LDTNANbOjpzTibjKGdDpW87r+WRKSTf2eGSQZppmb1gM+7D3yFfM89kXcjILKOX2ni8lpXHjh566Zfn5mv0or5B0TpMZkjtq1fMTvtxMEwWbeaN/rIho96Kpo9P+3RAs+3TTEyyP1/LqaIZNscnk3Ogc7B04E2byPn12k4GxsohBfi9vb1GTZ3D2imcbvAd7Mroa8bkwr+pLQpwfcffcR6ACd11f1UP4GnFpcBt0E55cYB0ZloaoNmLQRS5eI9P0v8EFYMpyrcTGbTS2a34HAfAbBlwHXTAdwCSGabH6jmYOfBdpnecbz07WPTAqsdmLZcMPACEZcGcXMYweFlGBgenqS5Z2OYNAq5JeqUWO6uqxupprrEmM8/A6yzDzNnzSPn6ewcRrsl3Ls94vBlEHQBovfIK55Faey7WuwET2wd7evofzenJb6pGkfLo6KhthyA62kA8qbu7u2YALP86QlOIR+AUnrOm5Z25CAKlYiDug+twHkKB0hsk6SPrPTir04Ze/p/Nykx2xFjp29HYaSDjSDZmMPB4DVR2Ngw7nRq9IV/YWjqTC8DIiWuaFSewwsoMLICJdYhMkgV7/D1wMijZDpJNJWOy7DwfBxaag1nWIJPt5uKHUx8Dnut8wzC0Egx/u17ma9APY/WKtsGKcfc2AufYE3ATHA1sjMn
bUlwPY/yWretptivGVvVQV3N5wW0ymdTh4WGXIWbbCWAvX75sdSscGWMkzRuGYcRwNpvNaMf80dHRSAjr9bpOTk5aP6yQmIJTkAd4QGXAkXNdxHZNwAyMMbLCQ57tqIzQXHRMoWGQGSWTSfg2j56SHOU5BwNM9sDxTgHzPDcb9q6Uy/TeUa5Xr3JJAAbpeojrclU10qXH7/SW5hQKmRjg3C9BygsGjNGA66BhmXqfHraYzoEzWgZmbpny9JgbY8JWPAbOTTZ4e3vb5ulgb3tkDhlgLAtnNS7yW94AsxedHCDJEHxdAgv9ekMxWzmQjQMu9XKu5+ysZxNmg57nrrYTwD5+/FgnJyftliAiDxM1InszK7+Xy+Wjovh8Ph8VMCeTSbsNaT6f13K5rL29vbq6uqqqcZqQ6WEWsXHIxWIxoqcsFU8mk9HKSVWNdmCn0TE/rzbZOVCcKbkZlh2Q62DMNtjsy7VFO0QaG/NmPLnPK6Nuj7liQOjLjoZMnOojU3bxcz1YBccxLp/rORkkDMxZEzHz8H40g4FTNTNpA2SuBNop7aiwSLMXdOC9XcPwsF8ui/DILsGf8ZCC2lbQP9ciozBztkywKcvRzNIAnWzZZZFk1AZN+wKlIPaAunzhQJJZkLfvpIwchPkuS0nuv9d2AtizZ8/q6uqqttttY0QocbVatXpY1UPEQZBJZ5nY8fHxqH5mug5bY/US4RtkiDxZXHcqS4SdTqej9BTj3N/fb/Nw/QwFomz681hcS/P8iGjMic+h3t7RbZBgXD7XNbBMkZyyeRc9c8iUvmrs7L4fENBh7mYwjN8rTTg97MvRFsM3++Ma9Js1FQMMc3df6DuvlyCFzLJOxzV69RkvOjFPM3hkn8CTfTE3gN5gacfzjnhk4OszB+ZpEEGOTr2YY+5R7DWzXjIZrosfGGQoMxg4uQbZGOfYD13isU0hb/sqLf2AHQzGiF1tJ4Dd3Ny0AWJIZjleJkbYRPGq8b1hHrjzaf73KkXevoNQnPaZCfJ/KotzAEgMzcDH+PnMqRsA19uMSP/+2wqxY7o5WveiEi1Tjh7r9F4fX9OO1gsUgJ9TIssNoLejePw4eM7TAczXzBTK10+ZZUqXjNX9OqjY5pAxn2eNxd/5xnOPIdmv58VxXom3nWJHZnYO9paXZUq/qTeDv5mlATvtyPPwthZvm+A4fA1Z+treymT59gDXtogN2qd61/VxPs+481Q5pOozAHZ8fDwSooGAWlXVw2M7UDTRgtt9vL8HRmSQQsjU0kwjzQoccfmevr03zcyolzIx5p6DpKPzOzfg2fmsIMbuFR0DQc8Ze0ZU9VCjMrsxC3SKwbE5NgMe/QDsBiEfS2riFJPzM5iY6XCOU9DVajV6kkmmBJ4X87DOneI5YFpmGH2mre4rm0E/WZB1ZTlhB07/bWtcv3edXqM/Mhb68FMZvEjjsfknZZqpY4/FOnBlQM55uObpzw1c3mmQQcgLRWZVBndfP1cs/9cp5Hw+b7vbGbipuj87ODgYDY4o54n0Il5GUJQKE4PyOg1JI3HBMBlR7jJm7CgVA0zB2jAc9WwUHJsRyuDisTDOHghxjNmTUww7stmiQZm5UeezA3Mdr2J5Xh47f2PMrkHxGSvJHAvAGFSdyjMHAAZgyBIAnydLR5fM18BO/+jRunTtKFtPp4yJIrLZpPXglMvf4Yypk7xGBljOQY5O4xw0HZCtP+vJDJMfkwH7GkV7jsn6H3rknkzGwW+uf3t7O5qrdYKNMN9kzLnhlz7/bgA7ODhoE8IpMEoKes6NF4tFu1fSAEKB2BM/PDxskZ5CP8+oonFbjydpCktfBiAziaT1vv8LQTqycA4G5J3Hdsxe2mRQQwGZxhFd/F3WVczcemluT9Gu5xiYq8YbEx2Jmb+B2mwDfXG85VE1LmjP5/MRezQop3G6tojzuPjLsSwg9Awa/ZmJeY7YgR3JTsw4uH724wK4g4htgU28vqYZO2MA0MySXcvlOC8
g8Zn1y+8ERsvEK4TJmlxbxC5sh4wfvfoWIsDUwIlteA6QE/pg/PSB/J1iuznwcOcL43qq7QQw5/cMHMF564QLwTAMJuJVDhRhJuInVtrYN5tN20+GEMxGnG44sqQBcwyfOdoZfCxA03j6yRw/f9NnRjiPN5kZ83JqgKGl8yVgJyA8peg0Eo8JWdhJbIjM34ZvtohOuYXMY6T1NjB6HvRjoHDNtFf/yDk50PR+qsY1GYOQ5WKGyJiSjVSNGWemcAa7ZJjM0U9bQMbWhefpwNmTFde2fvIzp9EpY9tMruZyvOtmPj51Zr9MpuqyAr6cpCZLMdbdU+2zDAyGZENar9eNQTF4Iip7uJggT2N1rQTh4iywtun04a55JuUagI2J761QG7wVkTTaoMSYLExfwwzASnGUpQ+ajc4Rvqq6Rp/G6d8JigYvB4MExtQXfTm19PgdQPy4YcveK1YGPJqB23KzUyGD3vEGx95xlpNl42BlOZjt2fESiJ7SKc3f+TOz5dShQSNTTo/TZZgcp8flQIe+eiw0yw62W8smgyv2b7tKfViHll9mDMMwjGTDZ+m/XL/qgeyYiTKeXauRn2Vgm83DY3GM7CBpsrPJZDJ6QCHpJGDlpXHS0uvr6/a3H0joCef1rHSPwcZjZ7OQU5BQfQSGgzu9wYBMx6v6t1kkyDq9yaKwlWsnzLTJRuQ5VY1XlgAaU/RMr/hsl8PmvGDajDFZie3DINtLsWm5k91zy3m50b9ZS9aVnE4/Fcn9vR2Ghh7cB5/b0R0wbCMOQgn01r0L7AYy25CZIWNwn8zFnydrtmw9XsDY4Jf67QVS26jBK7/rgabT9aqHnfq5+t1j7247b+Z2Cgfi8zuL5k41XEcBja+ururZs2cj1OZ7lph5/phT0owyKJx+MbLZ7OH5YbAJanSTyWT0FAwbq4VkxmeFAIim2F4QsGHaKOnbdSEDcjIQA61B2IZkMOfc5XLZnhZSNX5Olft2FHb65BTXCzRERdcMPV6nBwYNG3sv4ia4pmzNajNYGNx9voNoBgb0ju1m0d+AZBn12KU3aOZCDHq3jAzoBhfrw1sGMig4WFi+2JUBxM8T8zFe8IEouH7FOWZbfp+E5+MV6iQK+AWpc6aYrmNbBtY3W1N6jLTXdjKw09PTkSFst9v26BuiBhf043BobEydTCb17Nmzlnoa6GBsTq3YNNuL9jyuJLdQIFBviEMAXPfg4KBFG54xZnruiGmgRcn+DDDnOyvCxxronDKgyKT7VY+jXzIixs54kb036HIs4yFNpi8vJiST9QKHHdkOttlsGtM2qCT79I5yp03IxP04YK3X67bXMBdUiNzWG2NC7r4jBHZCkduOSp/exkBfBiBkgA3lijZ9OX1KxuH+sGHLi7G7pOH+nSojL4/TKSJ/uyadWQf9JWNmfMwz2TPH+j5nxsRGZ9u4wct2aR9PZlj18DjtXe2zT6NAYavV/cs9eL7X8fFxE/T+/n59+vSpTdpgBpAwGK9YQRkxBvp0zQ2hg9qkoUQurmOluRaAgNfrdXuaBQ6Uhd2qx9sYrDAzJy/3c60sxDpqojgzBs8ro00aVNYUemzEwOAXdRik7SyOsE4RDQLIlWMMaN4Z3mMrnJMszHO0cyN77/Zm7gYkxsB1WOLHWWAGzhCcqt3e3tZ8Pm8voWFV3UwV2Tv1ytU1ZGO24NqibYInUDhg5aqe60EJMJY7/RiMnS7yO4Ojx2oWZf15O5TPR1/oAv1jS6ljMzGDEnaGHpgvzfdwGvSfajtTyOPj4/Z6MrY42NG8E9k3XvPbObejS1J5HI5I7OVtKz2jnp3b9JyIu8t5bBj87wjm61iAKD7TZUd/xsDfXqHx5x6XHdLjszHxf9J/j4PmiEd/7t+1PAMF35lpMT6DJ306wic78G/PO1fx0CsOzfV8fs7FvzMAJROkP/dpXTvdJ0D2mvs2o+6xlKxReRwp0x74JKu0bCwf24fHYLDytQDSlBfNd9K4+Zg8xxk
GNpoppAmFAyHkxtkWx9ufnmp/061ETof427R/s9mMbu7lczsXEyL1QgDe6+PCfjK0qvF9XF7lwVkxPCvb4zZrqRrXfuxgBhPfuFr1+OkTdkrmkEaT4JHKdgqHQh19bQhZdyOiue+qx3UNg76Pca0xAdfXsEx8fA/UaHyWaZvtgf/v7u5aKQEwy9QG0E2wsAN5tc7OSF85H659e3tbt7e37R2oBG76z7cC5eJAysggh2Oa9fIdP5a5mZlZjnWcgdUsKetqyKIX7L1PkH4MrAn8HoOzHV/L4IVf0K/7cyZFWYdzfI0Mzm47AcxMyPtnEASAksjvG1tRHK9Pg2kBTp44fWZxkOZl7xSkv8/akql8AkOCh43B5/dYUwrWffv7NBLkyd8JFkQwp78GjwQNpzT528aSsvOc/T+LKJaxj3Uhm/PsVHbiHvBnaoAevHnSunXEtu6sdzuPZekxcbyd3amft9g4VXKKmXp2CaFnE7bh1LOL6y5p8J39zT+ue6VdmBlyzQwqtqGnUr9k9bmVwd/nmBiP02n7JHqy/xG4bDOWx1NtJ4AxINiVEXQ6nY523Bt4zHiS7mdx2ALNKOPr8T+CSlCxcgFRb7vopXw9YTqae3wumrulQbh+wfFOfVGGC5iWDwzVDmuwM3MYhqGx1pRzKj2dyOM26FFHyvcJZv/0lX2wQm255aZJsy+DCKzn7u6uPTzz5uamrq+vGwPy+N2n++F/O3Ev8AA+2B02Q6MgfXNzM1pQoJbam3um2rYj7DMZTjJFmu0wdZsA5jQ2QSTLG2kXBuLUjQHRPseY/ex/PyAgQZkGm82sh+9o9tveuNv4n/ymqhmyX0oLm2Flz06N89n4fTyO7HTOtZxkFEzcj/RAmMkk7Ez8dvTyUyucZmStwNHc5+d+oarH+1ro30BqcE5K7z64hmsFmWoBWAbZvIMh6zcJUDZOjNvRGACyXpPhIUszlGQkCc5PFXLT4bEHg4sZmY+1syeTwHkdCBIQEtjMeq0bGFjKNcfhwPIUO/GxfEfdzY9rpw9sIJl96jrnkpmLx1L1+Pl0zMOgMQxDW1DLQr2ZauqMsTMGj8UlKc+J822jri8+1XYC2O3tbXu6qot/+Swts68sHrMA4KgAcvMo3aqHVzSZViZb4AWoZiUGtAQvFwxRkGmthW4jtyJwoFwtZE7JrJL5GIwYrw0ABboG4LnDrjJVMdBmFPU4ONcOx/mkit7EaKBxBHa09EoRgG3AMIO0A2ZKkkwa8ExnZAV7sVjU7e1tXV9f183NTYvmTj8s9wSXrJHaln1eApfrU67H5vYOs8DUr4M2srUM7Dt+/A3n+H2leb4bx7hkYobo8dg26Td9wYBsG0pbcW3LQOssAqzwmDabzQgzfHO5a6FPtZ0A5khp5aKkXMbt1ahgZKA4A+JG8adSGjsejmBnTppsttADiaSijmIowyzLCwN2KubGuwFs0PSbMnRa0ktdzfDSIA1odqKntmykAXuOBiP0wfcGQa6RkdmO3XNMfjjHAQld8dt6ppkh+XOXEKbT6aMnI2Q66vFm/4zZ9+LaQbwY5eK9WUGvlOCA6dSV5mt43nZyZIW9eX5sR7IN2E7TXyw7Wm8RweUTxmN7NKCkjmxT/G/f9bUyi/B4XW/NBanUYbadAIaiqSnRqUHLF2bvUQKHBZSrI5vNZvTYHBSAQCwEO4cnxTh9nhEcYTlFyKLiZDJprKpnDKbrpMqM30JOA7IyDKLIxQrLVVaO8++k8mZhZmhJzc1qDfB+6oTn7OfPI0OzXubo1eiM1DmWBFunqskisw7D3wZkHuGyWCxqPp/XfD4f1dGoD/JQPvZ9ZU2R5k3Zh4eHdXx83MASZmm7ZKxOpf1OAOZpUPULMhycnXnYxvEJUrnMTlIu6RsOzJznOiVj8TkOZMgtMwMIRdqu7d12kKup+BDpc/pMLwD12k4Aq3rY5pCT86uh+O3H1SDIFGCme1yDSTsSuAaAsjP1wPncp4HQgGGDcVT
l+1z54/OMILkiY7DIVCnHz/EJaBTPiUhOfdKJvbWklyIYAPjcoNUzrAwqnmeCuVlDPsguU1vPwf9jvHZWM2Cn3J6L5WHG70UWs7W9vb26vr6u29vb9oiWHmPnehyzXq/bRkvm53sNPS6XEQw0jNfjtpzp13LNYEjzYoj1mNmIwZI+vOHYNWAznCQlvXQ/s55svTkyp9zqlAyS67GDIUsAT7WdG1mpUSUi+34lWARRbjp9ePZPsikoO5teeVIFBoMh+J4pKymdgAlCr71QYOWQxqbDoNC8A97Xgd310p2MXHaKXmTzd+6PvhyBzcQyLfUbXtJozN6Yd9V41zvzctoCECGDvJE6jYkn61p/HmvWQtlG4+tnMPn2229rf3+/bm5u6ubmZsTmqh7uzzOIGeicMWCTf/rTn+ri4qLOz89bf7zbwfLAFpbLZT179mwEqsjz8PBwBLBHR0ftntuqB/Db29trt9GZvX78+LGur6/buEnP8KWsVRqkDJLWuedgXdAgG2aRtg3s/ykGh48yHo4hkOYz1czUM3txOSUXm5hXZhz5d7adAPbhw4f69ttvR9Ge90NikAglHxVNgRijNvoDgvyPYXv/mA2VSTg1BACdezsVQhimvgYV7zHJ1NS/mSPGxTyTRRmsMEqc1ODMXPI6NDOZZJoGZQzg7u6uzs7O2nVxzKoapeXJ7Dx3rsvfPN6b6/QeF4wxWn8OMhg61/TN43nfrI15sVhU1f1dIFdXV6NVUZ47lozWDspnfpY7AWo+n7d5oEevMGLXnz59ansWOX82m9XNzc2I6QNe5+fn7ZFQ9L1YLBrzOz09bYD3zTff1I8//ljD8LDC59u+DFjI6qnAbAbNOLOUgv56YMcxmU5WjVM9lxeyJaPCNpxZmCTYfowT9t9kuL2STJvzk9/U/c3Yl5eXzQB5UsQw3N9wfX19PRK8H2BISuTGwLwszFMiZrP7F4VQc7BgzPYAJfqmyOqVG7MZBMGenixSZ8rLeS40oiinZlmvoLmeAYglqHoFyPUQs5GnQAGjROnoJNMWp5UYX6ZnXpzIVV07iFNWp2+cy9xw6KyPIO8em92VavEYJ4ya+xczzQK82JuEUzDOw8PDEVtk1QZIOCsAACAASURBVJJx2Xb4jHsCM5Bhc9PptO1T495g2wD9PHv2rE5PT2uxWNQf//jHev78ef3www+1v79f//zP/9zmdX193V456PtemZMzCezCAcmgjq171z12gC3yeT7i3XZA3/v7+yMCQfNeRmzBPuSgaHaVcrJNZmbRS1XddgIYBulIihEsFouRUfOZ0zXTzaS5Rtqk6gYFRwanfPTt2lHm/tm350Dzo259XdN1MwQbDQKnmalxLIr28b5GppL8cFymlY6ymSrk+PncBpH1McvbIJtMwHO0EZtFcnzOyUvoNAexnD868hK6V8WclmXKk0HHgcINObh+ljVB92tb8Dai6XTamKGd0Cn1xcVFffz4sU5PT+vs7KwFDhgbm2QZR6/1nD9rZ/YBvmM++XYh+0nvGgarTOXRj/WF7DOg9/q2rJ6yGY7LxYFsOwHMVI4VHRogNAxDi0CkSmY1rLrwghAEYIBJp+PapCdEdKd/CS4GK85Pp8yaVwrYAG2W5GsZJK0AK8VzSQVZuRg5feNkOQeDvSMrzMlpsMdmJmLZ0Y+jL+w5AaCqRk5tQOEzgwNzNJNwqskxmb7SB/LH4YZhaIExgYLVa1gK+k1WlyybJ86SPvHcu2EY6vLysumMjdoEB2pdDsIA7Wq1Gr3o5Lvvvmv6mE6n9fXXX9fZ2Vn99NNPdX19XbPZ/YucKcUcHh6OHgHFtX7++eeWVkMkshziQGDGw1hdGkhfcGBzP7Yp94eP2Ab9Xa85aGem4RTSY3fp4O8CMJSEQOnc6SFFeYNMOuzBwcGovuVCK8boR+WQOg7Dw6Nh8lVu7q/qfuMtWyHs6Bxj1oKR8j1CxnFcEO0p1umEI0cyQC/ZJ+NwxOVavmOAeWR
0d/QyODEGzjezoHlxJRmH2Z73F/UKrpyT16RfA7MjqYu1ZnNZ3/E1Dw8P26qgU3hWMuljsViMnIXarIGdrAIdWzdO/WFGMHdsh3kwB+xqOp3W0dFRu9YPP/xQV1dXdXx8XK9fv67Dw8P69OlTHR0d1T/+4z+2wE6pABkxrvl83lLQ58+f19XVVWNqdmjXsghEDurWF3+bfCTo8Bn6h/EDntafGaDPRS70vYs99hgYzfXcp9pOAOMCRnPyZiI4wONCIBNZLBZ1fHw8AhMizmw2a899qnpMSzEoPyyR5+ZnPu5H+vhzM7XN5uH5SVYQ4GdnI6ryeVJoxsuYHSX43M5tkE0qDSMwi3GqlBHIqUbS/F6Nw8ZhluexbLfbVrT3rUoOYG4GU+ZoALTxOg11pPVxdkrm5OgLSABIGf1hZjwogEyAv81syRhydQ17gS2aSXIeNk8fyBn7AxiZB8X7v/zlL3V8fFxHR0d1dHTUxoaM7u7u6uLioqrunwAzDEN98803o1LN8+fP68WLF411vn///lHQ8RzzrUkGCKeYWULxgkmWZXo2YJBBZjBvs2HrluOMLW6MD59+itlV/Q2PlMaAbm9v6/z8fFT7MhWn9uVCIw8/5HgKlhSeWXZ2VHNkREAYiCdsgzfTQnneuWy094KBGSDNtD9TQoOG+zXoZsPoM331fB0JzWCZfxacncoa+D0eF4LNMCwHj5fnvXkuZjLIxiUCF4bRL79xbIOB93khBz8D3foz0JJ6Yhs4f26fwUa8wIOu0ZcDGrLAaQFby8IBjSe6unZGult1zwAZA2wMVr3ZbOr9+/e12Wzqq6++aosLOOkvf/nLury8rJOTk+a0q9Wq3r17V8+ePRsF8+l0Wm/fvq1hGOrDhw/NT+3oDiLI1Wze99Bid7lSbrt3YHZQghU7LSS7guz4abdm49ZJBkmz9dyS4/ZZBuZn1OM0GJJ3ozNY0k2ilPNZHnnMAKvG+4Wy6OooyDI1q0NO/7Ke5gctonjXdegTI0OgMJuMWB4rArcSnRIhIy8c4KgYTa+gjJzW6/Wo/sNmSq6HQ/A3BuV6AmPKCGrGasaGvgzeyA8nSCbDylTOw4aeWwD47bqc65H5Gdf3HjU7BLUkv5/UwM6+LYMScvIWDqedfI/deAwG5M1m09gSDKaq2s5/p1JV9+API7u8vKxhGOrdu3e1XC7r1atXzbb39/fr4OCgjo+Pa7lc1tHRUZ2entaPP/5Ynz59qoODg7q4uGhpKds4ZrNZSzP9OOosmwBsXiBBDta37Z80FzvLYGDZmYVZvujPmUr6AZ/hLw6oT7XPbqO4uLioo6OjZtQM4vDwsBUtnSeT12+32/aIaNNqp3eueVU95NuODtyhDzCaVXE90h4iWjoEwjFT8RNkze44Pumuo9l2u22pjMHBzuclbLMrMyMzMCvaWyiyoG5ANigwbvYymT3glIyBMRlMXMNkTPTJeLJew3iqHr+ooapGoE0/rkWalSJDflu/NIIM84eNoScHIDsNxXjXw5yiw5KTkbgvGJeBAXvE/vARbBuwMPDm+ewvY7vS6elpnZyctD1rk8mkAdfbt29rtVrV2dlZK82wOXZ/f79OTk7q+Pi4gdzV1VVdX1+PanWwWfpmjGZeZkS+A2C73T5izLZ9znXtmPNMRhywDXo+z0FuV9sJYFdXV/UP//APLdIdHx+3aOQagCO5gYbVE0coJnFyclLX19cjtAYE/axs00/vITIjcGRxsdVFRxpGRj8GLCvQ9aSsl9mxnFbSn/N7FO6UDZBwAZr+vCXEaSbzc2rrAiggCZuyQTEPpz++KwGdmLWiV9f47JCurxn4MkLDZrxJmbH61jMbu42eBoAfHR21uTFGakoHBwcNOBgLTjubzdp+MN+g7eCCrj1Ws28vslhvyJAAhd2TUuamWp+DbCeTSdv8enFx0VYlT09Pm+xdGvnhhx/aOyo2m039+OOP9erVqwZoe3t79Ytf/KI+fvxY8/m8Pn7
82IKJ972ZgQIc+GSv9uvA6e/Tvw08BqZeuok8sRXOcbB/qu0EsLOzs7q5uXn05mX+dormHJ3Cb7IaK5HobIpLnxgaRmNWYzqcubWdgdeMZb7PHFC8HcWFYzsw83D/jj5mDK73MDb6Npuwo6fCPQY7t2WdBVI7Ps2prdnnarWq+XzeAgf1GBu1U1YDCfP1TvVkbQkgzJ/xeYUUufF4JesEmRnMr6+v27gODw9bUONFy1dXV3V0dDR6DA6rmNhCProYMM97TGlZq7E8HLCcxnuzrIPiZrMZZRV2/JOTk3bOer2uT58+1dXVVZ2entbx8fHodqw3b960LSSHh4f1m9/8pj0hhfTr48eP7ZYsxrG3t1eLxeLR28Hc7JNOBV1X47fLKlwjg7t91pkEzf26+YECT7WdAEY9BsUzANgXEWm1WrXVlaTM1NAANaM7BmomAUglwMHwkolZmE7rKC7a6Lhm0liamUQvtWGMBrMERv62A8NSkI2VbFaEAfTqWTYCF0ETQNlTlICIoRmADZ5ebTRbY96ZUuXiR9YzDGLJGJ2K5XfJ/FyDw25gOF7c4djT09PGuhaLRZ2eno507LphgqzZE5/5YZ4OHPTheiJ2z5jNYqjrVD2sFGKDvkWKhZq7u7tWM+M5aK4rAmAGm++//7751DAMo9qmg7RrdC6RbLfbR3rFLjjOoGf7NntifmZqtnXvLOBz+xCZixftnmo7AQzFoYScgGnt7e1tGxQC4jjf9+i6FxTRiwSkkKbdAKXpKRGXyNa7WTujJd8jeM738ZxvAbtWRX+9p3FkDYw+AVQAyY5gWVY93KfX23Fv6u0aG30mS3VaZOBz7cf1DRd4TfUNJq5XmP5nAPKYXOfwuOnbgMb5jviWE3rM53hhr3zOlgsclgBsNoLtOT2GiWJn2KGL+Ni4a3ouOzjtyRLGer1uK/n2E8bv+4uxadJL2vX1dWPNV1dXNQxDe9DjwcFBq61hs17UMljZ5mkAtu3KunKzbbhea7k6A+L6vrc2a8UGMca1C8Q+eyuRC6M0O6DrII6g6/X9ex57Gwl7Cvb+JTt/1jQwMpRg9OYzRyGzRMbBdXA0mBfnJ1U2MPGZ01aub3lwHM5oJ2bezI8GQPZSTR+DHBPc0gAMoBxjZ5tMJm2D8unp6ei+SoAgQcwvZOnNx6Bm5zAL5n+a5eSW/7sUsd1u28ZVv4EdUGBM2+223a94dnbW9OsaFsd5QQVg5x5H33NrVuLyhDMDBwfbjOugvPWLHfmulwGk9ImNzmazOj8/r9VqVVdXV00mrv9VVSMBNJ6c4ccFMWbGYx155dElDNsl+ve8TDKG4eEOD9uAgd51PeyN/nspe7bPbqNw6lL1YJiHh4etSMnAKbxhAN4X42Vr+sMQLRi+M3MhCqWQMjLjBCwm+DqMn1uTZrNZMxbXyugH40nWCEAnqBr8AFUvLgCYzCNTWAMwyua301Yr2ADFMWaKPpaxoYvJZFK/+tWv6uXLl218BwcHdXV11V5xZr15E2o25siKmufcW+ww8PMdc/ViBee6+O95uaxgQ7escBQWjTabTT1//nwEMPSfhehkpGy7AUyoOTF2s0/qc16hd8pf9XB3is+Ddd3c3FTVPdPjurYH21+PqZrlbDabBpLYNduReDFPvi/SaaYDie+ocfBGL7Y5rm9987fl4NVfB0LfyP5U++ytRDaOpKGuvfC3ayeudbULxsqgWYSdjyiG0vjMqZUF4H1qNhYzPIzUQES/FDdd65tMHpbbMRpapnXMzWwU5VY9fvVX0mPP0SkW5zo6wpQ8PrMhswmcx3ufvArnms9isWjghXPDhh0IvIpGS+Zr/VgWXuVzTdFOmYsn6BXgIM0irTs7O2tPZ3VKY0eGnQ3DUB8/fqyjo6NRMDFbzBU41wSdBjkYcH3qV2xncTptndsuHDxJB52WU3P2vaD04aBicLctcB2eXLu3t9d2EphY2L5dOsIv1ut1XV5eVlXV8+fP27alTBfN2O3
b9hczuyxZMG4v4DzVPptCYgB3d3ejGtJ2u211AO8cdiRysdI1BxzQxfIERFa5fEtEbki1AaJIfnuFygrKFJXfXvFywdgKtsNaqJkGMwYUaKUmi/DxbmaVCQQJ4GZejBP5OwV0X0RSpzzMhVW7TAGQfRoqf/s2MR+PfRCAevpzemHDxwaxFRu4WZMdiOZb3qx39ideXV21epFBM9Nxsy7mZfYDkDFOAAK5ARbICPB33RKA5BjbDVtEzIZIMx1IzIwAOAOnt3CQfuKj+PJyuRzdjpWB9ujoqOHB2dlZXV1djVaUuab9lPPtP7lKb+bmsscu8Kr6DICdnJy0aDeZPNyL5tUkFOwVRQaCIfaOrarR8rYZDiyjqlp9A+MHnBA0Bs9k7XR2cm/uzDTWmzmTPQAGGHCmPhiuFzF6YOSUESW5FkFzam3GyXiyuI8jEBC43QXnxjidrjKvTGNxRBu6n4NlVpgG53l+7m+uVfV4g67naSd1NAbwXf9E78yNvjl+Nps1HRkoq6ptBKUe2GPHvTGbTWBDw/CwyZtAzbnYEPbthQjbG77GNUjnAMHlclnHx8f16dOn5nMwTM5BVhlo+Q57YaHDfkudjLa/v99ux7q+vm7bnKhBMj8/tJK+bM8GdTNIl4ws65R/r+0EsO+//77evHnTnPv09LSqHvY0sbKGsiys7Xb7iK6u1+vRSxJgb0wkC48WtFka1Jec3lE5ncasA+PxKpady6mRQcQ3idvgYDkGJafPXNPG6VQyI7znQP8ev+cDqPHoYgyMVMNGVPXgdI7YZmzJ5ojcTlHNcAx6lr9Xlawbmtkn51sGWQ9Dlqkr25prdrAZF+cZr5+wipyXy2ULktyLaF2aKWRpwgtSBl6yExwYwGGMaau9bAKbdzoMC6Pwb5BKX8gyDjLF7j9+/NjqYIeHh83+DDjL5bLevXtXt7e3dXJyUsMw1KtXr2o2m7W7A/JuGuzb2Q8Bw3sBrVPbFw286NmQ204Ae/PmTauRnJ6eNmpp2r9arUYbT512sYLDsnGmAoARrMJMAeMiGniyXuYFRABFpyN2AuoIRA4Q3quVAAyRBCaSt+44V7dynFrY8M18cAzYDTLJ9LPXsn6wWq1arcXsFyNxVHN90SySMTJOy4yW6bb79Vj5HsfKupGvb6flXObo9NCyM7DgeDTX5rhe1pw8B4ISjsuGUFgOqSaOhkw9T2yEMTv44MSTyaSOj4/bvY5mzHb+lIMZv4OxbQRGw4/3V2G7DphXV1c1n8/r4OCgnj171uyG+azX67q6umpBkA3B3GAOO3P24OcAVj2khvZ1Bznm2JsLcjEL3QVeVX/DrUS/+c1varVajZ4hXlU1n8/b0izH5uOEnUPbgZKJZCERhmR2ZgBhoplmYIiO0hiI06JkR3YS0kX647amnmNjKIwP4/OmPsDBgIZBulbkWkFGUoM6RkG9ItN2Agjz5d63dEQDBnLxahZj4Xg/+SGDBNc28Di1ckGY79OouY7H52BAKpzjRX4Jyg4wLlG4hpuBKsdPcAAwE8joxw6HA/o63gOIw2KzXnEkEGLz7t92QmmBMVDvBfS4HrYzm93f6D2ZTOr169fNJnkNHT43nU7rxYsXI/tjTs+ePWtyxZ8yuFSNbw+rGr//wAtiBGDbke+19dgzNXX77FuJQGQuArCQurgQicBhT46WueLEpMyETHFdTIWdkQK4DjeZjG9Q9ebYXj4N2/GtJhnhHDkcxcxkcA7XOBiPU72sD5nBslDh9M51LWRkA+FpA7y3EIA1czMrsKE7LbWBIGPvSaNP5gaDzXQpa3j0zXnehmFGgVxh1NiCU0/rDRnaVvJ9hU5BAT7qpb6vFpChDmXnZ45szubm7PV6XS9evBiBrhkZjBpQNeukT1gYsuF80jNqgOyddMD03A3wZrXMnUxmGO4ft3N0dNQeqEhN7ezsrLFM+yV9mBSgN1Lh1WrVbjF0EPZDCGwPZsS5rSTl7ubyzFPt6W+q2ltYLCA
UYGQ03Z9MJm0ZGNBBMM6VneIZvYnaGCHCRBiO5ozJhUyDAH35M8bumoZZFA5FfzghhoxhAJT0x2cGTTc/VdardgAAn6Uxut4C60I+Lu4boK0T5OjVK+RFsOBaflhg9olBZz3Oesi0z9tnDOTog7EAXtSNSFW4jtmbzyWAUBPFSQABxt6L7tS8HFi4np8+nDW+ZIlmY/7fdUSA3gyyVwvE7ghOs9ms1ZDRkeWXDNDBydd++fJlO//Vq1cNtOjH10/bQy+M2aDNM/Vgq7Yx692lGT++241+0JcXBRPY3D77OB2UxSQ98clk/MAyp0YI1I/c8VLwMAztyQJ2gru7u5ZX85YiG62fBmBHw4AcyTkvU1evrDn/p7FsTXTGQG3spIlmDzi0jQyj9THZH9c0AHgVdbVaNSbsTbqAr2+poiVzcYBhk7GPz9uwnM44fadl9DfbdmBzwdy3ySB3p0w0O5a35tiJzPy9agbTmUzu97VdXV2NUsWqavbl24YMNMgeIOMmazbCbjabOjk5afMw+yXYAHyuBSJTb5PwPEkHXScDHDJbgCQ4UzH7QZ4soE0mkzo9Pa2jo6O6urpq5RnGYIDMmqV92DXEy8vL9vSLZKYG/txGQ7/pe1laQK4Jdm6f3Ynv1INO7TR+LRqD8U3R7Dfix4KA1fihiQDien3/wEFABLA0KLi2RCNyp0OQTrh4iFGZwnq+rjVkbcrOZGCyAszSckxZ3ESheXP33d1dffr0qT1WxWluj8VmKst1zcxcU0i25TF6C0cvapqdug+nyujHmx3Nfp89e9YWIy4uLkZMLvcdegneKQkrVjc3N6MUnRqTswDXvnKnt+unaVfeX7bdbtt2gqyfMT+zI2zEvw3+yZ6ZK1kMQWC73bY7HSwbN5dlbB/DMDT2z90LWV6w/fL/s2fPWjmINgxDe8BoVTXmnI8pAlgBW+zZxOIpGaMPlyF6bSeAffr0qV3EgjDLQeBmNdB5106YOFEkGQ97zKrGaRbXzpdt0I/fWef9ZVnv8SZLol/V+DYS03+ie0ZTfp4SKv35cT1mjsir1wyObIu4u7trb4pGhgm0djrXTsx2YUYcC8N0ncnpkJ0bxsYGUK5F0GHeTgezJXNaLBbtnr7pdNpefmGb+umnn5peN5v7bQRkBVXVtvXwyGVSSQCKgGlgQLa91gtifM4Y+I4nfrCd4ejoaGSzZg+2+wQX24TtDV34/8lkUmdnZ6Ogjw5Ikb2Axrj9FnUH4awzbjabOj4+blnKdnt/3/GvfvWrVg8007dfZNDkeweIZFMORHxP8KPOe3d314Cy13YCGFHfkQxFsnPX6ZkNFOFUPdScvFGRBiuyAFz3MEtCKEzIr2mremBfXN/Cqxo/Vsbf29j8lnDO8fkujjvquk7jdMlposGEc2im21XVdjj3CrkoH2B0TYjoZzZXNWZAMGiPDTn6NiWnLRg1x1l2Pp9rpNHaHugfpkGK5JrLbDart2/fVlXVxcVFu81puVy2RyjzzDfmYsZMgOFzs2TAx8HUTmimy7wc9OiX7OL29vbRhl/bi3WYrMlyMpv3MSlP7Mf2jrNn+lb1sErrVWvAkuwJ9juZTFpKCGvzwxCsZ48bGXtrkO3LrJR5gxnYpJm+meeuthPA6DjrEQgnaSAKszNxHIKlNuF8lwhB6omzZX2FCWM4LsxyXRAcQ8w+cqk7o6Gf9e40tbfiZgUY3FyrMBAY0BxtnDJ9+vSprq+vW/DYbDajyJdGQb+WddY1MBhvaWBsLAr0UnJ041Q4mV8PhB3EcNh8AYdTKvROkZdnzRsAeDop9rNcLuvDhw+tD4M9fWJ37BhHNpa9deNFImRDc42TOWHHBGE/GywfGOjgkjUn+s/g6M8Zb5Yc/HQJZw22PY5nX+Zkcn9j+ocPH0Y1RmRxcnLSbGmz2bS3hrO6yebftCPmjlwT8FyD7dVJuaZr2Jz/VNsJYMfHx/Xx48cR2oL2fv4WF+U4BrTZPDwOx6kAxoxzABr
UvnxLDsbOVgqAyuzMTuoHIRqkTFMdlR0BMHRTecAui/AoJCOjvzcTMgA4zTBr+emnn0abEZlrGgSNiOc0OlkRzfKnIWsDOWNkHsjSzuEU1XP2/AyyVY/rTVW1k+3SvxmyX3BiYKiqthoJeLgMQb8UrtmQ3dsr5zkmg8WmPE6AC/tnbDzx1rLCLpwmZxrl+fp6GaiwSYLVdrttwMO+MH54o9h8Pq+ff/65rq+vq+p+kY7Ntdg6jNJ2ykIA13dmhQzwJZcwHLB9143rv16sMo64frir7dxG8f79+5YTOzIjMP4m2lFwB9SIaAzE6ZtfUupm2o7SuDb9u/5W9ZAeUaB0+oBzEU0QmlMEjHpvb68plHOsKIzJzMlGBgC7ltQEPRlvSnRhcxiG+vTpU00mk3r27Nlob06OyUAGs3EkdNQyUPo5Wa5tGdCqHnaAI/cEL+SC7JKVmr1a7rlSyPeLxaJubm7q6uqq2ZD7dd3RbNm6oE8zu0+fPjW74DhAbRiGtsuc8oiDHk8yJfhyrhk+4zPrBjBYTLi9vR1tL3gquNjRXSvMAMwckYOZFzLzEyZYZXXhntrSN998M7qB++rqqo0fMNlut+1uGNcDzR7zMTx8n7Uu5O5tULY3xmnmjR1m2cltJwP7+uuvW0emmC44ujietSuci4FTgL29vW2Fdr9k1EohQnpVxStvXMPpYjK8qvEGShtHvoSUYwG3qvHO9oyk/j5rXfmZa2t8xxL9fD4fsQGYqKk18k7Hyccd04dv4aHZGFx8R79+Nprn52LwU5+jt6rHgJZFYvQL89ls7gvh19fXTfbs+vaN/DSDsJuDCc+h415cxn58fFzDMLSVT2TgzarOIkipncoSsAmOZqDoz4X0YRhaScAszPNxiuUCvtkec7O8sw7FPaHc/rNYLNqr2tDZ2dlZe3mI33zk0o3rebPZ/ctQuF3Q76A0sHl/on3V9sln9O10PIEappf2lG0ngLGR1U7L4OxMTiWJdigfw/I9f9QMiH5EO4yCibu+xaTMWvidk+Z419xc3CUFNsg5/cT4mJtBLOVho8PIs+blVA2DIVImmJox0J/l0hSnPTnclGsgTrBwHcW/XYesekg/PE8zMKcb1oNlVlWP2AGf2Y4IirPZrL37cDabtYcq4vTffPPNCBgSvFwewKYoOWBnwzDUzc3NyFaQWwYzxp3bB5AX8uEz7/BH/k7teZoDsj45OXnkvIyb8SJz5Gq/Ys7oA7n827/9W/3TP/1TA4nDw8O2SuqXtuAT+HKyZmzQhXXS06oalWmwS78NCxtJv80FCj5HhsjX51i+vbYTwPb39+v6+nqUe/u2DDMdWMLd3V2dnJyMHNSDdjHcKxVMNiOua09M0A+LM1IbaJzmcG2YDhEXg3B+jgGnoaDgTF0BboONU7eqxy/JgD2RrthJmL9l588xWG8ehgUzV9dr3JI9M1+Dpu/9pD/6RK7YgAEr62C9yNk7PlcmOR8g4/P9/f168eLFiJE6rWSM9FX18EZvR3wa7A+Gwooecs3n1bsOmgEhU3HYtlNu+t5sNnV0dDTa2gCbym0wXNd2Y//IVMt74th862Bj1n15edmCB3NwiouuATlAnnsq0f98Pm/jN2C5D9ufa2XMwXOyHHfVdGk7Aezy8rLevn1bJycnLSIwcLMoBLler1u0qarR27Sz1gKK44xMzKkXE7YzskseRyZPz7qOC7agOUbCd7AuCwgDclQ3gGdERvCu7xBBHV1JB9nV7F3gnJ+ggczNopiDr5spIaDOeJJJ2aHSWTw3GtcC2M1S8lgfjyzNQNOg0el0Oq2XL1+OygUXFxc1nU7r5uam7u7u6uLiom5vb+vFixdVVfXVV181XTvo0LiWX9eGDLmXl20YyPTs7Gxk48iXoGO5Y1/efO0MxYHVQHtxcVFVD2/xJsA5C7FN+IZofATHtt+cn5/Xdnu/J+38/Lx+/vnnka8ul8vG/vAZmBM2BCDO5/NRWcbsj89dj0s2if7
xkUyPe/bplHMXaLl9dhWSPSI83DBzcddDyMO93EokYsIAXNYJXB8iYtC3wx/sPgAAIABJREFUi34ImKjhp2SgSK7ZA0AXaxEk486UizFgPPyfTMxprBXn+lLVfUrOErSBg3nxO+sgXMPXxii8H80sxikp83a0Zi6kW2a9Xvm1o2RKaBnwvfdK8XeuVrvG6LqIAebg4KABGq/su7m5qb29vXYPHmzgF7/4RavJkDo6PbYNcZ1kjDjjYrFoL4E9OztrNusaG0GPVU/XAHtObb35uuwfI/Dn00WcVtoG6Nulgqqq169f11dffVXT6bQBkPWHHQ/D0Jj26elps12yAvu2bSjLRrbdDHI0ZJ7g5Tk5czG79lyfajsBzHm6FYGwUT5AYhrKAJ2ukL75LSTOlxGw9x25CMj17FCepA2U8QKkBjLAyNTVTIfWAyunFBauWYdZJDm8nx4L+JjBuf/PpV92xty3hQFknz7OYOtalp0ha25PNcbhVNclgl1zs2xtpKRartdVPTCWDCY8wwtATqDP4rkBnObVXRYRtttte1orYOBg6vqoAcoO7Xnxues8h4eHdXJyUpeXl49KMsjVaZmb7Ri9OzU0yDF+MgBeblv1cHuPSwUGSViqSw3ol/SScXhuzMUBy7Zi/zGzNBb0mLXbZ98LCTA5MsCsXOvxs7syb4d1QbfNNnyzKT8AGgpyWgEIGJQcCbN+RXMqYHAxy7DwGJ/pNkpwSyCzsWAwPESOcdgJ3I/BxzKyLPNvgxO3B2XfTwERQYDrOkgAYMkEc8y0jNbu/6lImgyO47wvzKUC30Y0mUzaC18vLy9rtbp/EzXX29/fr9evX7cyRQYa/nZt0gGNtI5jPE5W5ViUqnr8VFj6o09AwnUibBi9LRaLdouVA7xtvxc0XRohJeRzg7gZKSklm4JhZLe3t3V8fNzsireD84SMtJlk67aLp4J7b+WWz13M90rrU+2zbyVyhPELark4QuJx0CgFQfE9kdMbYK0YszCzFfp0aufI4l3HRnwr0TeL079TX9e5nJr53XgYOWDJ9anlJViy/2c2m7UaYtX4wYhmIwa97Aulul5ApMvietWYxRLFbDwOQGl0rhFm3c9BizE7VeZY5EFfrj851QQgsg7naGzWY+Dn2icnJ+3FHAAaJYs///nPNZlMmuOdnZ2NQBI7SdafjC1ZNY69Xq8bqHKnAOBwfX09qgnlPYowNZ5qQTD3M8o2m01jSVlWsM0aABhn1mAJbNj0zz//XJPJ/b5D7jfcbO5fdPLhw4c6Pz9vvvDmzZsR68xm+8RffIcHY/bnztSsc2clPRab7bPbKH75y1+2wbFTHeNksEQ95+wYCUDBwDDWrBu5FkIzMHkflNPDXNkgssF4XLdBkQcHB60gm07qAjfzw5gYk8dMgdiG7mf2M1c7f6bXNObRAwm+x9DNCjIt8x6mBGbAA8PJGhhj7rEmxuVxZ23Jhul9fJmiGBSsc4OHdZNpdxbGCXY8JZj0j5dWzOfzev/+fS2Xy/r222/ba9XMmHoprceG3M1UYVbe/DoMQz1//rzJAaaDvAGYk5OT2t/fr6urq3KDNVY9vCnbLAsde6UwmVkGxLTB1er+Ju+PHz+O7gAhKJyfn9fl5WUjALngg/xZVPPnpPJOJTnXsvYeTa/ouwa2C7yq/oYnshq0GLDvqWIiCBwm5giRq5Wca0PBIZi4GZb3xyCIdAbadvuwwY4xAAoIhzECxFXjfUuOAgkOBgDXZ5DPxcVF7e/vjyKVjd4tr80409ByHNB9A5sBg3E6lc/CucfgucI2DaLImAjaq73YWayn1I9XQb2wwNh6tRD68/9mF4yd6yN7xsuGTa797t27qqpWf4KZ0SxX686LFAZegpa3spydnbX/WQyDDXsLhQEbOVo/7JZn/6LlRGbiDMAB0AHdtuC5X15e1mQyqefPn7fzDg4OGjDzFFoHT2dWtjOvpFpH/HZ5x/qzfg20ucu/13YC2MePH+vXv/5
129zI6o53lbuOhTH70bq+fw2hVj0sCcMAAA2noAAmDkoxlVTUk2M8Zh5WqPNuDNCrp+7PtZCqGjEWPybHwufxN0TiXG17quW4mAcGj8NkmurG58nczL5gkl4QMcvJAIMsk21ZJh4f4O/UzPMxS+b8bNSUcsOq9Wxwo2/GznUtdz+zCwC5vLxsY/n48WO9e/eujo6O6vT0tM7Pz0cZBeyuZ0OWg8F7tVrV5eVlC5S+R3EY7l+s6zKK2Sg+BCAMw1Cnp6eNxeUKneXIeM1QnbJantgTK+KwUZc43BcN5k86atDNe5g9Lgc1jnEQ4H/Ih/1vVx1sJ4Cdn5/Xcrmsm5ub9vRUF48RNsYFHbdiXNDMepELmQjHf7OaUlUj5LfSkwlZYVZC1fgFHDZSjkNYKTDA1g7uOfZqOVzfaQnXMeNIUPU1uU5vVbIHNslYMpp7ngZv5u3ree9Rz4FzVdNzzHTF9TuO8/luZq4GBeZkhtFL67N+BUvxSqEBjfsduQZ2x+1u1kmuiGXg4frMgYB/eXnZ9kSakZDGuaEzQNFZAUSBVC9tcpejex6wG4LuMAxtqwrAyr26uQveYLter1s2lj5sJu2x9eyN4/BPszWPu9d2ApiXTjOiGjxQHpM324IRoQADEDeWshfm9PS0sSsmeXBw0GoEZmCmrozNqRjnIxDqF67Z9Bwgd1XzWGvGDVObTCbNABK0nQI7LfJYMSSu7xSW1gM4H+valltGTbME953HMm5qZU4t/UwojA0H9LYYF9rt7AZZxs21ub7vW6QBNLY7O4XZcNZ9sA+YBeNxwd4lhsViUT/88ENVPTywkTRqb2+vXr58OUpVASGDuwOEa5X0ZwLARtlcQAAcLOfJZNI2cBPUWd3uyTL9FZmTQTE3jj89Pa2bm5t207cDBqDpYOIaMfNDtmlXCazWTQY3ZJm18qfaTgCjBoBge3uyuECvHoSwTHMBIADNO+X95EUmyWde2YLyOu/HaBCmazDU0hgzqY9ZUkYz+nBUsIP4WWRuWb+xPJ6KJsko8nNa1rMYp+fi6/mY/Nws1udlfYP5eLUzZUxD1pnuJigbuHusM2WQRsxnCWgGBx+T9pEO47SI883+ARoveLgm5/E5fTKwmUUhKzNL+1naKmNHH1ljMgO276Xd2Y69VaXq4UWy+CZBKc/ls2RQDuI0B4dkYAlcybaRRfaZbSeAed+W2YMNhlSF2lfVAxMjyqB057k+FgGC6BgPxkXx0imoDdjCsmKz/uKVN88h01rn4ln3GIah3SBrg8400ds7HAW5JqzOlDkN0EHD6RNj6tV8kElvuwIyM/vimjZ4O5xBJpleL13POgefZYrnvw2Wnof7thG7RuqAmKlr2qod26tiDkS5YsotTJvNpt6/f992/282m/rVr35VVdVeNWZ/sD2YnfHbqSOMazK5f9Agj6c2owVcvKiFH3HPsu8RZiGJ8znv97//fZ2dndWbN29GQYzH6vgJE1X3LzRxHZF3WF5fX7cVVtsGAOzSUbKx9AeaiRG29bn22X1gRncP1rUKT9ibBmFQftV91fjhcAiXqOctC0Q9hOItDdQBXMsxqJlJeNUOI8u6FoZusE3mtdls2qY+A0xGCSvHjpkgZjabdNtMMGXcc+pevcCg5Ov7N3J2H4AzASdB3P/nQoWZUrICy9vXT5bluWfK6AjNb4NOr0TQk4evBQha3tgWKRsZwsuXL5veb25u2tu8/Y5EMxeDN+PrpZz8tr4B2kzL3X9mNMx7Pp/XarWqk5OTBtCsZPJYIet0Nps1P7WcsVM/624YHl4Q8v3337d7U92X9W39ucRjxmWfs1/tSh+r/gYAA0RQ5Hr9cGe6nR+jhzERLUj3sugLa4MlICiE41Ujb/QjuqBcogBC8yooxVhv8LPwcFwrxkvlNCLU3d1di0Y29HSIFLojkp3ThmjlZ8qTcuuBQB5DH960amfBOHxrDgbn+p3ZAeP3NgjPA3309rN57MlW+W0mZRlShsg
jqKG2Xo6hkM7v2Mg4x/zxuhkzUrNy6nOOz/2rIVNYyfIzrfSZfhhILUPrsce9F6JXJhhnV0jXWqf+2PEiX/rs/k59Ri4GU7Ntxo1MsBHPWZ0cYNGcVl9XZMBDz52W8H3OvXGcFBP98d7Jkpb29z96G8X9/f1GIc2ySnMdIpnh2Jv4fjrrhYLD4XD2W3AOObr9wWBR8JrCPoczje2dzDezIBfl+j3xXNs3WTR87UR0pWYqBV7GZYWz7Av8ToY2HzO1gXHvGa0Bwxs2HU5Pr8nxKpsByQ7By/zNhU0hTJmvQx7GwJ9zXU0NWBe4nzKFap4rh0PuX7cqTKEyYA/r6Vs0LHPX0UiC/hDSAqSTHXj+HSY3jEYm08PlE9u2HU12x7VgA2+AMSP15mKz4Ia/LRcB7Lvvvlv39/cbAa61/WFZdwTBmTWs9SsQOjSYkvFMJOBmim/6XA9WwTbPZIOx0nN/QaIgQB2d5Bprww/61K0DJ8En/HGxXKYcBCy34SAKhFGxIXLP+M0efdyyxbt3tdRj8T4nb3loKIKSGoSnv8rIwOnx0UaBxvMHcE3PCboO53UtA0AdZmSHcSmUpJ2G+QYQGCf923vtkhmZ9dX24XbrhP3n/KmT9XbU6Lt/E6P6zxhqoz3X3JtTRx4r43SYWUAcZX3p5P/8z/9sNgFSoR+CXuv1J5is2Cgj1NYC9oT4OHU35JyEUy9vukup12r41TDOXovPZhdNvk4Mir4xtuvr67PEfBmXcwIeE+1PYaUVoOc7Xoq/V85TQthysbzor4tDRerjz4C81jaHyXcDJox9b/VtLy9ZNtQH961/zSk6rVGmbMdgeU/6M+kFdlFj5LhTIsfjcd3d3Z2NDZn5j/Yq1z3W5pCMKIpHBR0alv24jmlczEmZZCMerkNGPsdiA9HKpVX9jQx3z6y1Pn/+fGJbBgwQHq8yLZPTOZDdRg0LQ2jQYbxRH58oPTWjqqfxO4nsTeizBdhJKItDeA3LfG0/Mz766ZCY482zIAMnpxlPx9hiJ1AmUjbj/35Wj3psgO4L/bc3bmIbhmUAN5iVZeEUyixpr2zIfXE/zTZshIyR7/b67lvnnHrM5jxfZlKVAeyT4w4RzVA9rq5kPz4+ruvr682Yp5AefXEUYec6bWhGHz1XhJ5mXfTTqQ+zv0Yi/DexMStrusNzbXa4t+J5qVxdQrf38l7ey3v5/1ze3Af2Xt7Le3kv/1/LO4C9l/fyXv5ryzuAvZf38l7+a8s7gL2X9/Je/mvLO4C9l/fyXv5ryzuAvZf38l7+a8v/As0toE27V8j6AAAAAElFTkSuQmCC\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + } + ], + "source": [ + "e = gaussian_derivative_edge_detector(im)\n", + "show_edges(e)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can see that the extracted edges are more similar to the edges in the original image. The resulting edges depend on the size of the initial Gaussian kernel and how it is initialized." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Laplacian Edge Detector\n", + "\n", + "The Laplacian is somewhat different from the methods we have discussed so far. Unlike the kernels above, which use only the first-order derivatives of the original image, the Laplacian edge detector uses its second-order derivatives. Using second derivatives also makes the detector very sensitive to noise, so the image is often Gaussian-smoothed before the Laplacian filter is applied.\n", + "\n", + "Here is what the Laplacian detector looks like:" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Implementation\n", + "\n", + "There are two commonly used small Laplacian kernels:" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In our implementation, we use the first one as the default kernel and convolve it with the original image using functions provided by `scipy`."
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Example\n", + "\n", + "Now let's use the Laplacian edge detector to extract edges of the staple example:" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "<base64 PNG data omitted: matplotlib output of the Laplacian edge-detector cell>
IGa+b5ErRASKwp98a+IiIFdsbOvZ3FOO2mzcP6TFUyX1OcV552I08COY7e+sS/I4oHg3IdSnXh8Jwt5cg/v446sDdv3sRvfvOb1NmLsM3/sFA+MM6keN6xvlwuU9UDR8N3MR7uR4WLvWOr1SptuwCZIQg+b2IUPsZ7+PKKjCM3iI+Ig3KQekTsN/6ywRXHnfM6RCg4MJTHShcR6R6gFxTT6bX7lnIkReS1Apl/4sXC/j3K7uoewYb0AW7skCLaIaKsrLWVzRyl03qnvqx5nv5iLEZ0DpDm+XBk+TNzrjLnJo3m0FnOdWdcs9ks/vjHP6Yeqbu7u6RDv/3tb9M7J51uORixrk6F+DcoiHVEd16+fJkaSwm4pmlAw/T9GcWwzjg/9GQwGMQXX3wRg8EgFotF4RVxBGBsYLVaxeXlZXKAzWYzneLrvbIEPusWVAi8uVN5924yNxfO+Bw2cYz7Sjp27JfPnz9PaYIdF7wGxDKVR3iUiP32EvgP4KCPbTECitifPolxwcP4eBPGg+CN2FBK7gvcpqPXXeGka9zXCA+FgAuL2Kd6OE3mwSKiDAjeaTTwGbRkhGYS11EbGeX9RDkf5XUBhXGxPi5IVCqVGI/HhX4bywXZ5309OBgXb7icTnExTlfmmKc/gxyMoLhwvO7ZYi1MBtsJ2VDcaGvnh5wh2o0ieQ76gY6wVYi3EUHMl8vlePLkSWy321QQwPmwBiBO/9vz4GKNSflYIxqhaWY1zUCWYroB1N/v91MzKgCE4OWX3HJRsbR9oAc4Lz6Hfd3d3cWLFy8KhS9sgwDEGrMm9JGyLqAudAS+2YH9Y9dRBzYajeJXv/pV6pdhIAgITgPh0XTK+feUkev1elpcR1kTtzgalItSOAKFuDVJmvMcGCDGZiP0a7dQFrgxBIYBgMDMSbXb7bSFxA7FFTlHXSMDUJp5N6dWdkiu6vhkDRcs8gXl+66CYoigF8N4qq15lZLvwAH53Z3c1zLEII26KaWzlq6M5QUO0w7u2bMcIiLtezWy8lE1NnwjXrhbUnSeWa1W075RI2fWDc4FZMvLVwnc8/k8IZ7NZhM3NzeJj2o0GvH8+fMYDAapP819dtAS6Gh+bNJyuYzhcJjGwwt1d7td9Pv9FNj8smQ7c7KW4XAY19fXqWfs8vIyZTmVSiVt9iZo0J8GYl+tVslpE/QI1gThJ0+exHfffZf0w3tnvZbMdb1ep0MgrMeuqAJWOLYod/L5dbQP7J//+Z93y+Uyvv7663j69GlCICaiXVkzOmBw/J/+MfcR8TepBK0IPqqHCXhyefXKqMECMwHIQvN894J5nO5bi4i022AymRTISMbv8fAceIOI+MCB2Vm5ggvK4HPcFxn7uzkKysfu59pRIiPPkTHkRL05i7wtIqJY4bVz8r+Rjx1gfg/zVyC0vDrtOWJwXv+PodNDTvPQZV7S3fggBQcZ9BEDL5fLaRsWRYVarRYXFxdRrT6+7ANnZp1ATy0b5mC95vO0Y6AHFL6oXKM3Lj5RyMrTvOVymU6/dZDzDg1sw1xypVJJQQE06HcqOKgzJ2dE6LLTxnwrn+WwWq3i22+/jUqlEm/evPmf94GRGxOZGbRJdkM8FItesBypmNMBTkbsjwGml8yLDJw1YQo6MYLD4ODQrMCr1Sr1riEsxk2UcSmbMTcajbi+vk5Vq4gotAPYUbOQeaWPNIax20BNmPvzXnTPy04HA+CZtFeYz2BO+SGOKAgpnreLsI52Fu4FswOy43DRg6oayklPEIoP2iLQUKjBQLjQOcZvzsaB91B6aj2ws+ce9AvSA2dkDSLkPszVbSe0IVBwYA242JeJblxcXMSrV69iPB7H/f19IsWtqyBIIw/bAfsSOVEV51kqlRJicXDy71lr+GxXbNkUT7XdPZKVSiW9R4BMymvUarWi2Wymo48IwqxHu91OSBWUx4Xjw3Hl6M10yMeuow7M79gzf2MjN8fhwZlvcApkb48yL5fLgpE4ghvZWZ
m85YWIGREJvTkl5DIScoXUXcI4zJOTk/QOPY8JBMQ4IvYENgvImBinkQCfsQN3BMsJUVduzd1g/N6Ia8UwJ8cY+AwFBsuCNc6LEcyP8RhJGqk5mjJHNz6iJ+ZZmIvpCa81+sT4vKZu1+H/9DnhHI2Sc4dEgGX+Rm6MJaL4Dk+qtKAyno3uQ5/QMtHpdKJer8erV68SLdHr9aJer8fd3V2SjZ0YOsD4nbqXSqV0MgbVTLIi7s8mcE4+5ufo+ng8Luw4cDbk9hR4yYh9s7XRPJQMJ7w6E2PsDrp2iOa/HNywd+yQ1oxj11EHhvEbhm82m0LKgyBM4jFBFhQlgnPKS/IgE6cGTJRnUwVBoESJQ7CVZ5hjMVIk4mKsRCNHjLu7u0LjLZGJQgLVSVfeIFo5bsikc17a96kRNN+SfviARuTLz+3IkJ0bET2n7XZ/IKKDjquijMVkuZGTN4ZHRGHcKB3zY6y5TuTpNHtF+RnOjxTO8/Pl+0A18H/vD835NNIh5uZmTusRc6Pb3UgbhI2TNFIC9bDf8vT0NM2lVCrF7e1tbDabdDBjuVyOu7u7+NnPfhaXl5dRqezfLLRerwtv2jIXaceOLroIgUyXy2Wcn59Hu92OdrtdOK2Xfi4cB/dFB0ejUaqiQtrDUTtjMhcKuEHHDUAc1OHY8At2uk7RkW1EFALVoesoB/YP//APuxcvXsS///u/R6vVSpHQBsnVbDYLXeJOF0njaIEgogDVIfh5fZMhLGiGaqQ5pYgPkZ4rhR5fzi3kQkGYQGY7JTsfOz9+h6BJvRgXjugQZ2fOiDnmMs3HaAdoBeHf/jvnpHJuyjJzOpkTpoe+x3dzPsqRNn9GRKSGRwh5c0A+wSIngbmfe65M7jot9fz5v/+dFy28O6DRaKT2GPdOISN3oec8FZ9ztoETYqyj0ajwchyOqLq8vEzBZjAYpDO34LhAUi4WeR0YG4jSSJLUMKJ43Lqr9qyjU82I4rsrHZhy1MbxUZY/c8xtje84ELL+PJ/PPDw8xLfffhuNRiNev379P38r0dOnT+Pdu3epJAv/BSLhMDSgqRcUY2QRyZ8RBI6O7zlNgB/A85N2WDAQ+wiAJkjux2KjsAjY3Nlq9bi/jDldXV2lqMNCUA3jO7SLcNHvwjhYZCItlb78915o95nlDgNl4/M2LCOIiOIOf8+B6MvPcCCgXhwAMmXdQHcEBozF7wV0as3a+Go0GomfcTrrC94uovhaMAKYT6rAeTmtNVrLZR2xRwjr9fqDky94BuOv1+sxnU4LTsFO0fMF0fB95MHpqP5/tVrvRPBoAAAgAElEQVSNp0+fRqfTiUajEd1uN0ajUQwGg/jP//zP+Oqrr+KPf/xjfP/993Fzc5MQ0atXrwpVWuTvrT0UyJCPdWu1WsVwOIy3b98mSohewuVymQAFKCiiaIMOkrkTozPADdXWAWcGpMe0mDhboHDCz6j0UnDyfPLrKAL7+7//+92rV6/iz3/+c5TL5cJxxU7zMFg3kHpRnXoxYHfXeyuFy7ou8xuyOt+m8uLobWFzT2Bxzr9EPL68BGI1v9xbZY7I9/fnjBQcjQ2lDyHAfJGcophwBarnVUi+g5zylBUn4ec69eP7vheyQ96sPfP3fPLnRETa7Fwul9OxK0ZTEcXXhuXyMIdlZ5lXNDE4CiOmN3B26KR1Alnmuxmcqhplcn/km1fXcjSObdFK5P2ndsAcggCF0O/3E5fbarXSW7QXi0Xq54qI1LKyWCwSnXFIN4wkXcFGNpZljlhNu1iu6Ah7pHmjVF45PoTIsG3bkVsr8DWj0Si+/vrrqNfr8cMPP/zPq5AvX75MR48AfR1pMES3VxiFuYuc/BzPTveySXQW342eefXRx+igRBy8CP8Euc0GWAQUUXzX4cnJSdzf38f19fUHgmeu5oWc0uYNivACTmmdbtoR2pmASpgXP4dbwkjseEA/PNOtCkYa8I8R+6NVPJb1el0wNkd49xlZEZkXYzYCIt
C4oABCzNsw3GBrZ+n58n92RyBfEAiOzHoIme/eIgzL/Vbok2WJ3JH1oeCFLJGzix7MxWmcv+fnuQJnQr5er8fNzU1yfnd3d3FychK3t7fR6/Xi888/T2+sQt7OEEy2H1o3ZwugQ9bD9hWxR8MREWdnZ0mHarVa3NzcxHq9jtvb28QpGulz4STzYlZEFDoDkCPywXbyAxfy62gK+fXXX8fZ2VlUq/t9ajgwHgpnhCPyYPzSAH7ulggmbW/c7XYLWyBQQKodLtEzjlarFd1uNyk6OwPYp0XJmef5cDscHgpmI65UKoW9hBybbAU1l2NyE+V0ShZR3IBuxQPe04fjjfGMi/vAFbBXjMuoD8Xg2GO2X3l8vP6OdWQNIP1dDUPOvrxh3FVlnpEXWhhPxOESuauMRkHIBuSQO6mI4lvh3buEHDxWj4OUlN+bH/M+VBue50ibilNefu7KL2tuOSAXAuRms0l63uv1CocNcnLqN998E3/4wx+iUqnEz3/+84S68oIbaAsnhH461d9ut4WtQVASoLJWqxX9fj851sViEdfX1/H69esYDAYFrtlpP7qIs+NneTaC88z1m/X3CTcfu46mkP/yL/+yu7m5ie+//z6Vgpko21HcbMmgvPeOBXe3OxfRGrQE0vKeKVcoybnzEjqKjCK4WdOIECHf3NzE6elpQaCG3nYgFrCNylFus9kUnANzy9stcA4YgBUKFOt03EjA5DHRFD4r30JkhIVDYLsJ88NJe8z5z/P7gDpZZ3roKM6AmniWEVdOePvi/hiQlZw1QB4Yi1tf3CXPDhGjh8lkEmdnZyn9cqWSz7gFKOLjzbs5YjRyN40AZ8i9nZqB0vL3IoB43bxdLpcL7wjNnSkvHeHoK95pabRjJ27d8lz4XLfbTRlSo9GIv/zlLwmI0L3PeqIDvnIZeccOl6vzDiAuEuIwv/3229jtdh9NIX+0Cvmzn/0s/uu//isuLi7Slgx3qmPYwMjpdJqUJZ8Ihm/uxEQ0nAwNshYKf+NcXIb35SZJ7oMT5OhdFgQHSvqUL6h3D3BZAfMufu6Z9zUduuykrWyuqPnezJ3P59VePouc7Ehy8ty8oru4CQ55Vc/7CCOKp+zaoHIEytzd4uDKHCjHvJ3XnPtZBl4HuC8bVb5O/M7pML83P5Y/02tEYORQRDuInBPyOlg2eXHGKR72Y53xesEbsaYcDVWtVlMRqVKpxG9/+9t0vNNut0vvvEQfvLWNijvB//T0NDqdTtzd3UWpVEqHOhLIfEYYMu/1eikbsqNnLRxATRfxfX+GC3/U6/ViOBzGn//856hUfmInPohgPp+nvWP5YqzX63SWULn8+H67wWBQUAoqFY5EEfuKD4oLaU965AiIMhIdjFDMq/io4G63W9jDNRgMUiMgz/H3I4rGWa1WE0GaH7rGq7MMg2307sgHLWEMvo8dn/k/HEtuAHZeGLGdr4lYlNeIxKkTY0Qp3eeTp0BGjhgZqZz5PnOEoEo6v73LgO+BrkFPlomLQSAQjBvny2d3u10yFLcwIEvGXiqVCkETxx0Rhf4q1itvT0BGRnDMk0IVvGlu1CBKuvapYDJWZxak8cyvVCqllgWqmNgnsvnXf/3XNA9SW7r/5/N5PHv2LNEGNzc3UalU0okYdNKzlhQdWq1W2jvrTfO0PvF/xoj8qVQbdaPDbk2yHbhQxdxcdDh0/eiLba+urmK9XqdUxdEBTsj9PBHFA+4YsI1hMpkUNtzSRkBEg/NwXwr3tnE5ArKQJgXZ9kATIuX8iP0Jq0Qmxm1UFREHnRdvYgFxMgejBL+VxQ4HOeCMbLD0umGIyBZ5M3bPOSe9zWU5tYbz8rlrOGnkaMVlzjzLXB4RHcNzusXPKMC42OC00vdn/nkzLM/mPu4jRLYm65EDDo5obrRp9I6jRwfszHiuq4sRxWNvcP55ddJIMc9YuBeycHUP/o5jqSDWQS3m6uBicSy1Wi0dW+22FV72u1wu4+rqKqrVx5anfr
+fqA/0g7mxA4Vxudo/n88LlXEQmAMGyA6/4UDi7/JZ/Ad66AILAe5j11GG7I9//GP0+/3UZkC3trkH99UYEVGaLpVKySFxIiVd7Jx9/v9I+7ceu5Ijv/9O1oFk8dAkW2q1NJbGsA3DV77whTEYwHPhAXxhwG/BL9qABXtgW+PWwfKoW+qmWMVTnf4XfD5rf3dws/pxTwJEFXfttVZmZGTEL34RmasT683VRWw8UXmRbvjtW4b6/GfPnu15Dd+h+D1GZq21ZxgdFT0XGoXHs1UhS17fv39/exGKcLpvITJBHRNCHWlq8RdpzBdhzN36JUT7b621vX+zqKDXlHfs3zREs4W71tqoA/LTWsDb7TLGWrSsJklDKkM8LWOoUtdI894Mpzku6jk6Ovpo/DgjiNW9DiHMmXVsyImD80yGGl1BXuU86xAYAiee0Nn+7idjw0CjTB4+/HByhmOoIP6HDx+uH//4x5vjVDgLjJydna3nz5+vtdbeC1GMVSTA2PqbdUwfSgGphQMUoFvypTdNzOD4Li4utvIbyPpT7U4D9q/+1b/alMbplASmHR8f750ywepb3LIqrLmOQRuMEasMtvsc2qlXhkh4KcrfCt8f/ehH6+XLl+vNmzfb1hCe6PT0dDsux6J9+vTpBmMvLy/X69evN8WXyWMAPa9/t2h6iikoT1llbWYWdsrVApqLWGamBLGFcnt7u/e6Mn11zxK6FswsEK7TOTo6+mjDOkW1HUU46DuT67IAGs4yWBwIQ4EUpvj6a4FZtM2EMjhz+5GFYnGYYwR5WxdkE0NFiX2G+9B3zqalFRpnROcaUpb28LtEFgfse/SHvtIbztfrDlUEcDLdHlbEfX19ve3TJH9Z/bnhmpytt/KR5pje0YfHjx9vRwJBmnTg5ORkkxdkx5iRt/Vtbn4wAvv666+37NrLly/3BF0ICEJ28y1l6ndxSrJBFj5ysgRmF7r7TeUgCPzJ06dP15dffrn+9Kc/bSEcD2krFGVGdLoXNPf555/vLcCGSuXutB6a5/41LmRUI3B8fLyNycKvQZvkJqWzINwbGrG4GdamrtsP4zlUid5MbEMgRnqtD29iJk+LrYikBqPHpPDOe4o3khJC/u4smLV5t7e32zEx7rvWBxQFJVkQFmfR+dxfenR0tPFQr1+/3jN8/k6ORZ9QHGfX+5bnND/VlzomKKZGjf44hRgKkWwoF2mhP378eFvsa+0yqp5vLNUDxrSo8vj4eK8ivw2POHnnoi8hprdCcaCoJ8bSeWp06fXr13uRkPHZmH6XAfveM/G9Xdrhb8fHx3vp6JLCjuOFwBoWMSJrrb2tAs3Ute5mZm0YFI3yW0hOAHj37t3eQsNzKHZ1PwLt89da29iEsCaP4ashYmzds8anylKSvSStvjEojGER2DZRuQ6aKYKrk0CaV9HW2lWIN5xrfycxrk2jUmfCUcxFAg2XAJ+y59G1SXo3JJ/z0JDxyZMn2wIQcsiY1zGUMysNwjj0hS6eKyJoQqkhaTOV+lSk0nKTtXa7H6bxns0Z+WTWjDqniofGl/ZdCJMzrg5B8j4rf0VWU/dmZtpY9E9ipGvWHLf0yXcdhAnBCrdLDX2f8VrrexCYKtvyFJeXl3unm3aPo6xiyfXGuLgwoUs3hddDEurMQiiWFIpCXp9//vn6F//iX2wGp0S29HMX9L1797asqdjehOj3hNENBed3D/Vf86aXTvRau7f0aBOVFT30nYMMAq83DZj6nfJ5ntfvtd9k8OjRo70+8coMnPk3xxCEe/nZM+TojdC74e2s6etzy2n6DNqxyBiRjolBkvXE0ay1T/RblPrBEKA8hGDl0nyPEZ9hJoRcpIzANxed37Ozs+3N7q2NAhgmEqphpLMQtbpMkUlrqnxWbvfx48fbi3rq1MimyLwcrTVF1r43gYCz0sxHW9dWd4BM4+v6HxxCOsL27du3e2/OoaAU5NmzZ3ulFsK6o6Oj7QhZHVprP7NHCLwnLyit31
olfAgh3Lv34c0xv/nNb9b/+B//Y621y4A0TS6D6jMozcQ1jc2TUta1duFGa5x8bmFWabugcH1CFen2KidOyD3dg0drZpQBxudwMNCtTcRFhxTMwjPOQnahfjNm9YRClW6D2ZQoaLZckmubXaSsWhMGs0Ec5fHevXu3bcBn5LrQen99oWvl+ziM9okcEOl9E5D/I+Xt63Vd10ar2ckFyiATnGh3ADAU/t7wvPVw1a/yd8h8rWU4HCzqpPs0i5rJDQ/NyTDmdRzVITpL3j1LkPzYDc65yY4ias1YfjCJ/8c//nGdnZ2tzz//fM+L1avz3iy5B1t06qW2B6bzvILspL9beK3DITzGdK0PVv7Xv/71un///vr5z3++N/girpubD++9UwPUEKFnGbmesWr21CRLQrj/LJPoYvV70adFeHR0tE3oWvuV6v5GMcmw6KPJhLV2W2NKzLt3+YW5G4LSU8pyFxyB+eume2FRDZljaBoy0guZKX1oKCJRVMPueo5TvxDTx8fH2ykPa328lcjYoJW5jQf32vmGMFAZjx492itcVaXOIPm8svYd4+KEGk0gvmeoTv7Hx8db9T0E7hQUTncCALJVf1d5io4axpHRzIYaB8fecFVSzbw3aQa1di78bFKBo4d2q1/ty8y+f6rd+ddf/OIX6+XLl+sPf/jD3qbWtT4oJoWVJbK4zs7ONiK35x+ZVB1zfHShaBVprf2aMkL5yU9+sl68eLH+8Ic/rBcvXmxvLPb8HgBXY+SZNRad5HorBqQcB6/pO2utPRSH6KUAJgl/N/kx4VX5PIrcMFwhYQ1PjZJrkKRk7/pyNPXqrqvnngak/YHE3r59ux4/frwng6Ojow0VQx91KIjo/v2QY6inFibjgywcaKIZvI6rIYgxF5lNJDgzsb7THQJv3rzZQxXVTToL3R/iC8nN75Nzqs53POYUKpxZOf0spdCXmEDwjRLcU1+urq72nGH7N/mzFvJWl52e2sQSSqSRCRlB9Naf9VEj79k/OIR0gx/96EebsGolGYO+BslE4XuagSopN6F+Pb/n9O8s+C9+8Yv11Vdfrd/85jfr2bNnmzC61QGpT8BQ48z0eZ5QhQK4zlaiOcFVYvdF+JJT66Rc08Xh2Yc2+FI2RqdKX6QxSW/IdL5o1L/uPlCf1qY2b1OOcC1k3O82za2vZNKkS3mSzn/3DV5fX+9lsfFr5ZIa4jTkbihZLpIBWGufsyx353meI+SnNxyWk3qdhb/WzvH2ZJWW3Mi+6U8XIud1fn6+d4ZZax8ZpmYc19rftO9erUWjuzKKDflqKDkMWUyfqX80pzX09Bqy7JhaFMuwQoKojhZeG5v1Rcfv37+/dzjqXe1OA/brX/96vXjxYr18+XK9fft2e5ecB/udgaDkIK7OlNyuAdEa/xP+DFdM4C9/+cv1+eefr5/+9Kd7fX3/fncGOY7EAmHQ/NSEIUIl/I6jeJWG4JrWWhsJ6pkN08oPFv77/8ykTq+31k6p1tonnfV3rY9LUyxUi6nlB57T71Gulju4r+vxOjUa7Zt5JDt9t/CL6JrVqnGZIWiNIJlULvTE4plckUVQI2Zeyvkg+H1uAeF0D2Xu1lrbWCGxm5ubvVftrbUrIeJwheLNkuLTlN10DbkXoyqU0pfSM3SkRoqsTk9P15MnTzaZlq6ovNfaGSZjMnfWUPlYYXt3yXRdK6YtYEBruE6VAJmWp6yxn4b7ULvTgP3zf/7P1x/+8Ietg1988cVa64OlbSp/rR0/wmCYZMfZEKyfBgAtgI8WPQX/7rvv1meffbYeP368fv/7369/8k/+yVpr7WU+1/p4/57JbpwtO0NpC81b//T27YeXeJYngOyEEp5dbqYKb/Kc/21SLJpJsttcWzKzNUaUpRtjeXuKd3p6uleD18wXZa2R1YeGKmutrTbHIYSHPD/qoNyIeWs6v4uy894tOfpqcUyFpWscwDSQRXxFiRIInBRjUBR6ff2hMPfevXtb7ZVFZQ
58HzXBCJFfIwjzqR+ymkVJa63trdr0rWU11kWNd+dYUz/JIdV51UA3oulRVdYL2dXI1di1H4wxWTaspluMPIPlvpzl69evt8ipDkIzThX8d/Fg33se2Oeff749xKui1IYd2rBpG1C5L8dRK0J05M5aa/MSJmStXSr8/fv368svv1z/8A//sL766qv1s5/9bMsMzT1wJo2gfaccmglca5ehoThNOuC0KtByFoxGXzTSZ1hsrYGzkD3bdwq3uz2GtysKa/kE79QMZdPbwlPefK1dSt+ug1lnBXlReN4daukJIMfHx5uBE/55wUS3ihWl6ptrZMMsohLrXczlDNtXaNgWKdeRoYUgCSIi6DFLJycn26Gd9rW2lRfl9CanwyjUYPg7WUw0Zf6g3CaFJJu6E6LhcB1KSXlHvK+1v/ugPyXO1LbJjsvyCyFL+7TmkyzRRjWY5W4/++yzPZ65ZRo18OQ60WHlfle78zid//Af/sPt/+9A/b3snQng1XAzPGDJ2ZK1freYKWwLEQkNfP0//+f/rOfPn6/Xr1/vQeK52djP7uOCZuYWj55P1LICi1qI0UxIiV9GWEjK2DCGPX7G4mz6vB7VZPX/7Zc+tP+f+v4Mz8i7xYGzgNg8ddfDPAds1tG138bYxsjrd1FV5WixqgecbZbbNCs15VyZzfPRuuD1q/PvPg2j5gJqISkHNsfVvvnM8/vdRirN2EJo5UD9dN8mVCZ6MYddGwppW3NZx1DjYQyTI57r0s/ajvLLRaMihm5Ud+8i/xYYm5fLy8v193//9+vk5OST54HdicDOz8+3424nhwFtNBz67LPPNq6nStK0rgkDd8/OzjY0VuLzxz/+8fruu++28gcKaXLB0daYdQErIJ2V7T3CpLUuPZGCIrXkoOQnL+L782C3Lp7yBSatymXcPJ9ryLdHEDVkr5GiBC36paz1xMfHx3vZw/IdJWP7shae1jWzeJGTgFq7UDkuBl4/KfhMl8+F1HozZRg1FD1Nd60dArf9DWLT34aDE5W2HouR6iLu6Sk1LGRVZAy5kk+TJ3ahkIs1dHl5uSHWx48fb7xcnw8hFR1pUFhrGk9OTrZxVTdcC1W1uNo9GdieXgK5+V4dEDSsv9bZBA/00rPtwWy/oFJz/YPrwF68eLGdfS0bR2HL/4DR3U/mzScmjWcv7HS/etEvvvhi/eEPf1j/83/+z7XW+mhwJebXWh+l8y02SkwIILN3K7b+ZG4faSglfq9SrrW2UJYRbJxuwsmimc3WabXosN/vRM+9ckKuk5OT9fTp071F/+7duz0erWEqmXWfXAnWwvqS5baQeb656ni7cFoTCLXLrjFglJuTK7ldnkw/3bP9omM+g5iLnLszg9HRNyi+G4jpp7HIQK61tt/71qurq6ttVwJnOWVjnsu7MX76UXTUuZGUOjs723gj48IXNltpHOrXOEmGBx+41s7hM5qzcTA1hvQRb2xuzaMQvC/CrbH3Xkx6zBm0vMQ8l076wRzYt99+u372s5+t29vbjcxv1fZMV+uArKHwQ6vyi8WbPXn+/Pn6+7//+/Xll1+uL774YtufNrfonJycbAQyL3GIM+nAnUjRvx2qnK8XMWEWVb1Ji22LroQox8fH21uaNUYKPOZZi8rIpN6y4yiXN4tGPaP3k1mDQKCYLpjyPodI3MqjnEWfI6EwQzx9cJ0izdbMNXGx1j6BLSvcViOw1u7kVEmMomXPmAmCIhyEsZ0ePrdIG6rhfTlE/WgZAI6NQS5ynsa5W2msn4ZT9vc2qlHqwXkyNGutbaM02ZQsL+9pLoV4PmspBhlOvmoeRd69zRzMPFjg6Ohoe08FxzHXa9chu1Bq5FC704AZ3Onp6ZahIRSCZmWbBrW4Fbc1PLi5udkObHv79sM77c7OztY333yzzs/P109+8pO11n5ZQuP+VpyXS9Aar3cRQlld8H3RA6OgNqqclWdMfsGEd7Genp5u5HQ3AbdvjFshuQmVwamiGx85FI
nMGqOpbPfu3dvbxkLxW3ndKvzJUdVQVVb9f8PWWXDao470E79ZBZ6hdY1GS1PqRMopkVMXgQXQQtM6MCU+To29urraKxgVWrVBI6UFWlIwT6MoaierGl/lKozG5IfMRyMRxrW1WzhL9++cQoieNwn10hHl5UoFMLbTwNLvlvYwZrPA1vyoDMAfM3KtQWzZy13t7jr9tdvEXIPAu/AKFUjLGkrw46W6OZzgGToks1DQMyChVvVWGBMdlF+YFc74tqaTKYF33K31weDZvsHQWATlnlrfUk9GHh8JPN6mCk5BLIYaZTyDZ1Iwxqxopg167CZqxqOJF166By+SOThfRe8Y6ySEhZo5Rj+QCcM7ZWCum7Toc8pVlc/rPEK5+C/y7JlcPi/HaRFxrKenp+v8/HyvJmytfbRqbuh9w/uG9lNuvt+QnZNpwqt/a7lJDdxa+7V1DHadDyAAsUHiRcezJrC0ylo7rrkke20CPsuato4b6ZBr+WrAgENumU1171Pte0n8KbR2dmYPoAf1KQ0ZG9r4/vn5+To/P1+np6fryy+/XGvtCiTLi6k3aXaRIjAyRTldVK0ONlG+x8gSXreBFFlAmC1vKMIxqSWNa7g1fWjTnxaENhz2O4TnFI21dkXEUJ/vV9ksFmixfaXcrrWgelZaZdRWw1jEVCUU+nRLTve2utb1RTdTRuVT5tzw9sYoBJ+7OcjEnDXxMmvjGNRZh0QO5fG8N0G/m6V0TSMGz4ZORCXQSI32zPZBrn/+85+3PjbJwjBPKkFkxPl1b64+lDhvnd7jx4/3Eib0SakUg4lCgcAk2yrXJt3U31nPdu7Qbett6t6eHn7yL2utf/kv/+WGPIpOLAyT1ZqsGq2SnWut9fnnn6/z8/P17NmzrSj2yZMnG09EEdZaWxjFI9/e3m58QBWBZ5lhqt+LHqqATafXg030U0Pq7xcXF3thnZ81vDME1aps5UbW2lXYUwTf12/wfq0dCcsYlKBu9TIj4/qZFKBYrqeIDe8OVfLrg79bTGvt71tlmI+Pj/cyaa6f4T9UxvAxRp1fPBRn0d0D9o2WoyHPzgOjbwE5NcW/yc/QY/d+9erV3pg5r+5cKCKDMuhkeU6GhK6Tw+3t7d47EhoJqGZHV0wivtwc9M5wmveChZaQGCfSXdSiza1uZCVJ1qQdHTFOOq7uzub86kL7d6i8pu3OOrC//uu/vv2n//Sfrv/6X//rdvyMzjp8TBjVhUHZDE5R47t379aLFy/W7373u3V0dLSdfrrWfhGoRsFM4NyfNY1SFa6I0e8Up9fXMPDauJbyJg39GA6Q3eS3yHEarSIkk88zq27mPacS+ryp9jYwvXvPJgowFw1T+7yGG7P/+gKRdI6LphpGt2+994MHD7bwQqPYNfYl3Ofva+2QnvsaK97RXDS75dnmuH2SZezWmfKZjTQmDzTHWuJ56kTHN//fYuZeZwylRhgH4+gzO3cTaDRT+ubNm72ayM5H10tl7ffK7xBKmocgzKhhorMiLn19+/bt+uqrr9bJyQ+sA3M8s3CwNSGtdRJmFI3YviHOfvXq1frFL36xvv322/XkyZP15ZdfbpxY+Q+TVaK28byJLClZBXKNSS5vNav2ZeiEGp1I17Uo9+3b3RtZ2jfPLvKoIS4JbQyF/FLK+tGwobxP69Xabm5utlDHYZOebxE3a+v5DaGur6/3kKwMcJ/RGqC1djVu5QCLXBhnxrTGQLiCk2MIEcWd02bRtKJ995566HoIk6Oa82MsXmhBV3B1DH0RW9GV5whnfd+cXl1dba8hc61r/M7JG6f7NwNvrTBeJf79XpTaZt5lSUUykjqMtuu7JivT0glNmnDy0BkEPeUsGQidl8czvpbglKY61O5EYP/23/7b25///Ofr17/+9WbAyid5sHKBhj31mmvtQqCeG1TlaTxPIIcWK0G1apmlJ4xD22xcO5GGzJz7UkpoowimfEOf5xmfQoX6OInsepw52RDNXDj9e/tAJp
SySRb9rzdufU3lMcdS5LbWLjyc2TJOh1ExNuPUyJHyCwfLudQQTjlb1HVGdIHR8RMy7RHbHSODNtGEAu4uQnOuxsnC6q6GLv6GU2ofIfuiWLsNqiNTr2Yj185nxzCR9NHR0YZwiyYb0lUv/N513LAWiiv66q6J9sO13c1RfSxCLvo6Pf3wjs1/VCX+F198sS4vd2/o8eBHjx5tFr/ZHCRkrf+XX365vv766z0UQJBi3xLG3eA6J7Zx5AwAACAASURBVK2hqUlo6Ooe3QrUcEHrwmvBXcNCE7HWjkfzzFlcV0PQ++sXJWlMX5QhPOYM6vFaDAtNKEhsGMK7qouDBPSrxs/RNMZRrpFSVvkorgUvjIUYORr1Wp4rLKakjo6hS94ZWONFNshgrcRwM2Dlobq3s3yT7/fv5hNqVR/39u3brQi6YdxaH1CNMptmy6CihnqVI+TMeEF7Jycn2za6+/fvb0aJcZ3Io84Cb9WdAfTD/92LzOaa6oGE1pT7eOO3xtnQEcivRcI14I1mimjds2sBci7Fw7h2nRxqdxqwV69ebWd2l1P505/+tFdl30P5bm5uNtLv6Oho/fKXv1x/+Zd/+dFZTjxwLX2teENU8brnlQugMAQ2t07I7hCkVt7EM6rgNVIETUkPFY/O+5e8bW2W0KDZl8pWHwqfmwnDH3V/G2Vtur2GTWHxWvtbsfAmZFayt7+vtUs+mDvyauZQyOXfRA89mBF6XmtXZFp04L7GLdNoriZ6nP/ayLYoiKPt6QlCQMatBcdql4peioSKhlrC0cr3Ih1yf/PmzVZAy6Ca05YS0Qe67z7Vb2E+Y6WPCHn30I+iuPavjl242tC7un+ofMcYej8AociQ8WzW2PP079D995511x9/8YtfrO+++24rddBm2hpkfvTo0Xr9+vX64osv1o9//ON1cXGxfvrTn26LrfCfcs7QqDxYLbBtM4wHZSzRXa+71uFNtJNH4R2ExxCC4kdClMGrh+0CpKjQpO9aJJSnyqA0QyammbNm7njDLk7It16act67d297VyDE1pM6mtI/Pj7eO85ord2bnrQmUciVvMyfejRz2oXSBdakjTF3e5PvHdIH82IcDVUsbP8Y35amFKn1oMIWfZKHa8l31pxNbqf1iebH2HCtNc7lytZa2xl2ypPKVWp1eo1iyhm7fxMNRUvV1TodjrnOCxUjBG1pirXV/lnbjXoa3pa0lyCpAb293X8nBr79B3Ng//pf/+vbf/Nv/s36u7/7u21x44Z4UNkdaMeCBJunJ19rx3NcXV19dGDZDNXqmYuaLIYWZRLEWh9nJicP1v/7fiek6Gzei1ErBO5Cash6SOHJg1LMDGzHcohDkGlD/Epy1ChBEOVbhJfGfyjrqfavXh+f454zwzn5n87DlIFFamwNi9oqy87VlO/knZo9PsQRWlwQaMsgWurgOQzS5NBmlrqc6USAkDTk5VkWcLPgmpooci+HSb6HqJH+rCyLxubn5tt663yVkytCZpCca2Z8QEHpjUZZ5sqWrbXW9qIbf8eBXV5err/7u7/74RzYz3/+8/WnP/1p7zwlXv3o6GjjPB4+fLi++eab9eLFi/XTn/50e1GlhVml5hlMHgMKzlpUJXJngSmDgVM4RHRS9k5wkwL6NUlpSEJoTElnNTEjTRnnHkMLg1fp4pvZvHpaPye0LxdVQ+X/5TsY+yofpbi62r2dvNty1lp7J9l2QTF2lV0XU+WqL/o6DVPnVV99p6E5D1wDVARGZ/QPEnn16tV69erVxm3VCBYZ4EfNy8XFxXa/i4uLzfH2hNVuSVIHxjDa1N0w1XOPj483brNGX8jEaXfbGb0WVh7aO9iiV7I+tBbW2tXO1WhX12aG1tz00IZyZGvtXuICTfWt6V2vBRh0jPw4SPwae2A8E+nOdudfZVxAYTyLSaTgFxcX6/nz5+u3v/3t+uabbzZjd3Nzs1fc19Cu9S6e1Ti5cN/EMYizXIGV91lj/Mba5VEoPW
6IYpu0hmh4JMIujG/SgUx6vA4jCw4j2I+OdqUm5VDmGHoAo7E3VPJ9MvCcGsjKbK39F0n0eba3dH56n3J4/VdZk+NcAJ13z4ZI6tjKG5VHY4CLGj2zr79zpHHHx5AxFg1RSmwzaM6/f/v27Z6DFPaVg9U35LnTeluicnR0tJ1eAnm5FolPT+o824pEGQQGvpk788tIW0c9IbVOpDpSA8UBKeydhLq1rU/6z9kaXzdv39zcbIXrM9zHLzNi08h+qt1pwH75y1+up0+frqdPn27KpKM2oF5eXq4vvvhivX//4Ux6ghPi3Lu3e+W9UKipW8pHqQi4Ropna7jZs+lNZhe1vlJqoRMvQcDdQlJD6Jkmr7yb+9dDzxqzEq0zLHaPTjiFgaJ8No+q9uyixxoCUBxStNjIqE6gsL7ndVWxZip9ekSoqaivW5naTwvDwnKNJJBTKIRN5N70PudwdXW1JQV6f4uqRZ2vXr3aDBW9gDS9Bk0o1ORC0cjJycmm96435z05t6i64VhpE4sfkd8z6Mi39ETnndygpupYnVTnEoLjcIvayrtNh1Yd13zeGr9Jk1xdXW36p4SFUeco8b7Vl0Zn8+W3n2p3cmD/8T/+x9vj4+P1v//3/17Pnj3brCer/OzZs7XWh6OmvQS3YdvkleYgTXDj71kBPbmlTqY4vJti11oHDU2fz2tDVYX7BNYtQQjfyfWY2ENIsKUYxtViw8l3USZyqZdca+0R8vpcDsZY19qvgu48tJV/mp9PPqbzxGDUyJYXo9juLzSpISo/Qj79/5RJedTJU87+VxZ1Pne1zl/He3t7+1H20331qxuXEeWcFGQxTwhp6cNaa9ONzlGRkrVQB1DDWAc8yzva9M28Q251EH3e5FPpmznwwubKmu5VH6eO6iM06rntC8P6q1/9ap2env4wDuzy8nJ99tlna60PYaKNxG/evFl/+Zd/uS4vL9fXX3+9nj9/vnlMi4+XKuyFDDqgepSrq483cfMgRVbudXp6umWHSoQ2/p6Lw08nLzSMsChbR1ZPsNYu5V+vaXwgMKWqkRV6CDHW2hmLk5MP9UCtg/F3Dfq8uLjYtr4wJkVvNzc3Gz/XOq/JJ3TBus7/6/XNGcUuahOG4w2baSP7lqMUMTREtSuiNWMNd3w+5VrOqcavnKtxQqAW0NzBIXSjh1BlOVD3rVEsd0QP/by8vPzoLDP0ATBAZxw6YOyeoV/6osBWOF2HQB5NEMg6lj89OjraajmNyWnK+n56errnmITSPb31yZMnezoqMdKMdxsZQmIc3KHymUZTPzgL+Z/+03+6XWut//W//tf62c9+tiGRZ8+erTdv3mxvFCpRSnEOoQzbXOpJi6JY/pYstO6m9zuU2SgRz5j5TjNQlK0IqoiqC9t92yZ/YJEfyqjyJG1FPhMJzedRBinwTxG19ex3NYu8YXwN0uyjBTRrdKp0TVhAoTVQGm4DF+U5Fh8agcFYa5/0rxM6hL54/1bQk40xfEo+RRZF/d2fCPXqd7mcOmqGltG1cFsjttbaCzv77KKoylk0UFl1/uqkXOv/rT9shFGKZtZMWkPWJf571jbWmBtr951Wt4rE6OJcN/oEgd2/f3/97ne/+39HYP/wD/+wzs7O1meffbZ+85vfrJ/+9Kfryy+/XP/3//7fDS3pRFHSDEEI0ELuNVW0FurhwWTL8B0VGMHPwlLPKB/UhTGTChBVwz79gLjsIsC7QU4Qziy87aLTvwcPHmyQW+xfo9OwAgJsjRZuSJ/rpSba8VkVQj8awljs5+fney+6vbraHe7X8MLfZllBnw2FMn6UE+KYfSz61sciseqMRT75OGjQc2oIOs7WMrWkAm8jtHJvOtSsuho7qLSV9kJ3SKUV6JzAWrtjnvSvyZqi+NIO5YWLSrquWg9ZB9JwdK3d28T18fXr15uOkzuH5FlFcfqDW4a+6DXj5Tn6MndvlDLo7/3sB3Ngf/M3f3P79OnT9fvf/359+eWX66uvvt
ogbLMEk59qPUg/o9Syb5TN4oe+6iHKcRXlGdzksBTCVbm7aMpzTQ82Y/V65UPeYnI202Bo7XO9e/tWZV5rfRROal3wZGCxkt2hLNO8N0PRcdbQ9V6T7+sYex1dKgIuUp6G2nX4yDm/s99F7BPdMWCHrp/0wRynzxm4SVrPcHGir7X2nXbn36J/9OjRVt3ee1r4a62PkFojjTneidzRHWTQ0Nn3qgsy7TWOk//s/Xv9rB+kq5zu7G8N5XRMHFHXFN14//79+u1vf7uOj49/+FuJHj9+vF68eLF++9vfrp/97Gfb4WayVmutzVOXO9A5/549e7YXKjZsq9JRaB7R9pgS1v1H2fx/clbbQIPaWpNTIU94S8l75r5+Tx5vKowJafVx0Zh7TOOKq2kWTf/JsidkqFXrW3XW2oUMMsKtEyrHRSbtN46Ct0VUF2lP42URdN8nHkt/5kKixBcXF9u9e6Kv+8xyh+rXXGD9d3OzK3OBvsmfzOb3Jo1Qwyf8FYJzlk46Lb+LKmGMutuAY6C/rdGbYSY9hcboqkJQL83tS1MsfugFh6jPEBMZkH+TA+bGu0/La6k+MJ4afOPofOhT9aLzMFH5Wrv3y94FsNb6HgT2n//zf769uLhY/+W//Jf1i1/8YiuJ8GACVpFbfmB6oZLBFL8Zq3IABOiEi55N7jsQXQdOyLIhnYhDXrS8xfQaRQtVYp93Mvx/rR3SrNLNSmvGhDK5r6r6QwiuC7+oZn63JRUTFRvrId6QjCh3jYP+ty/1wH2JxURRHTeUZvHiY8qb9Rr3syiKcKD4iUQPUQkdXx1njRbj4fnVAYaBHA5luefP6txaO5TimhqkOpTytoeim0P3L3dmXfk/w/n06dOtzqxJjKJw+l/H6tnk1fKXynQ6/3m6Rr83w8a2SaP86le/Wg8fPvxhHNh/+2//bd2/f3/9xV/8xbq4uNg2vEIIhHB8fLxt4Pb3GWocMmhVbIawoZW3mLhvBVZe6fb2dp2fn++dM/748eO9M6c0Mbjr53lglJoBbDaowvV5CexDMT4FwmsYK26PHPzkPTmDps/JhRwsgG4yp1xeUuEaC6PFofXQzcLh3SibBWGu3ctP7VBYY2ytr6ox7TNc05KCm5ubvfOr1lp7NYju1+ynBQm9+7/FZWE9fPjwo3Pt2icnRtANaKkFnHRKX429n9V4lZejA/TtwYMH28mwniFbjoOD8qbxpj+tBri9vd043IuLi63P9+7d28qhIH46xHj5na4zuJzvNErTecjSNztOV2ssHQnf66eu/2AO7N//+39/e3Z2tr799tsN3kuVHx8f720xmts0evJlBdHB40gmh2OiqyTd80dwkwCfHq+ebXqrcmXlGtwXypwLQ2vY1cV9dbUrpbDoymN8Cp2YsBpb32HEWq9TQ+H/nlekYkH02ZeXu9er1XtORZpK5e/QQY2uMRvrrGGaaKb9nKiiMur/6drJyclHxyDNRIM+T7Q6ERnDVJlVvg0Jm7mrLCCTGveZJadLZNcjiBhiyHc6/Pa3Y2sf11p7KLFocXJ97tlop+toytH1h0K9+fzagYk6q0vk/PDhwy37OdfrP/pEVidAXF1dbSQd5WTQjo6Otu/xNlMxIQ9CcE+Leq1dzZfvmWRepAv39PR0+8zvU7DN8kxDOj0INAl9FJIfWszlCmqALAYLbBojW0607usrr9TPLH6LRH9rjCgEI6KP5DI99pMnT/bCBPPgHlUw81XuomFCCd0ar85rES/UrI/mu0iv8+nZrr93797ecU2dg4kcGio1s2jcrsMBTQTpGRbiDB1PTk42J14k2kQKOVYOnW99kB1vlAIZ1eA5/qfkt37hO8l1ht7v37/fSqEO6VvHXmPt8Ab9te60rjOgoM7feuoRP9AvWone+S70+33tTgT2V3/1V7cOJDRoE1RSlufQ2QkFO5GsfRW9fFW9QS365GFKHK+1j2YmSc2wHCLq28+1dpNojFWCKdDyOQT/+PHjjWeAfA
5xQ2vtsjlHRx9S832JL4Ws4en/J6cwxzUXUT2l8Gl68f597mH0bP+f3NlEJfri/+0Pg1C96BwUdVbf3E8Rp0XS0LmITBhZfZvconloCQCDOUsYypPSsU9xlhC+BJQC074/onyUeWuo2/sVoU5DSyY9AXYiTd/tzoDq/Kzxk5DBb1tbxtJ6sBr5ysJn5mNeU93yOcdKT/7+7/9+HR0drd///vf/7wjs+fPnm9W2KCzmKkoVsfH7bJSkROX19fV6/vz5dnDiWrvsUonGTnK9RDNLk1PQh4cPH+69D/IQXKYYmjHyOF1MUBBP2BeTNrtVg1yU1EXh86dPn+4ZQyEhuelTq6mbMYPE9Hse7SKsqdcltyLNIqTKybMZ5KI3Y70rDGyb58MxGkWVlRnPbO5bmU8P9BMaaijGQJljhqoRQDlO6AxXxGAWDVuMNdztE77YfEBTRbAPHz7c6huNWx+bvdYn97FuGEifc350suNhGBmvGbU4q77RCCK+NV1kXZ6ULHqsPKdxc3OzbWQ/Pj7eeyuVz1xjnou26e6n2p0GDDy+vd293mmikMk92Djts6nsvKVO37t3b52fn+8ZFvCZsjQMMMl9YYfJd+3R0dHe+yIvLi72yP3p5dzHpDRkImTXWBxS31X2tXY8Bm+EG+kCPZSNoYAgflFmvSPCvaESchlhi8i1EHhlcuTdzJ0arKJaygeNeX7R8vS4Dc2gS0ru99vb3ZvbGVz98pyWYhjzRJoNxRDQ7d9a+yFma8fmUeZ1ChOdlA6wsCfC7/O6LW1uV+ox3/rb+Z7hZR0JfW0yoKiyDtz66ckpNRrv37/fnKx2c7PLBDOSULojtjseutOC67U+PtiRTD2rR29D1jPULC0xE0UfyfyTf1kfzrOf1ctXV1fb21sKiSGAy8vLraK7YSOBlVBtlmGmjlU34xkYKILpizxq7XmEGjTKdH29e8ddQ+ca5iK3tXbkK+/LKBXtNDSxANyfskzOo97VxFog+l1Z1OjNUBsJWgKX4VJ4rO/u3RDOC0eL0GYIZ261t2/frj//+c97+lID2Fo/zqbPL6qFZKCU29vbzXHWyJNJD0SEQiz+vnfBc9X9cQTqtsikoVYzpXSyKKdb0nyP0aaHk5sjl/KAjTZOTk72SoUa0XCuwkS6o7/mQqusjH2t/Qwr42hXiHtwtL0PZ6cvfQ5nY80xYo8ePdrsxdu3bzfHSufIvTrQ9VNj+4+qA/vrv/7r2xcvXqw//vGPG79jcfR44XIejF3htHZ9vTsrqwQko2grh7Cm/ALlK3Lpgmlt1+QGwO2pANq8jnfAnR1K+5cnsFDB7hoBHm2t9RFiadaS8rcuysszJqc0+85g4BAaGmvlo/Ay3RVRb9x784bz2cZKZhSvi2n2uxzZIY5urbWhx24MJ7PJo3aeG250rBxhn0UGzcyVEpick8Xp+1MO+kg3auT839x2TeDK2o9D8+u7h+an2faiqKurD2VIfXdCG73C1TVryWi21tL4etKJ9TETVpMT7HjNS0l+cmzfrflf//rXP5wDcxCcwbYGRlZyKimLX9RVI+m19QZycXGxLeSiDSc2Ekh5IBZcX+r11tr3Wj3Z0v1qiPS92bfWa/l+s43gOUUoPzNDrRLWlM0it7exaKyLm4KTQUs66kRqhCiecbVejyI6Rsii1NdZg2XcrRnqQponntYIQl76wRgy+LNsYa3doYWM10QBM6xq6Kivwubqm/msHray3Pc7Fy3WnBX30GX730MTy3NpM6T07PK5xlGjO9FhQ2m62jkh16dPn26nxLq+Drzn/vsblGl9FGmutb+tj+O8uLjY+26pCHOED2yIXkqq+gapNuv5qZB9rbXuzFO+efNmff7559vvNjzXW7W+pVzDJHYPdeTmZndCo0lWqgGGvnr16qPtRmvtQjUCa6sC1euvtf+arUOkc4VvK8P0gsLaWe/TZ661C9XmIq1x9DeTTRn0sbIpilAmUkWuXDtPzb4WgTQcrrKQTR
Fl72/c3d5FDs1Wdnx0oGOgwM2s4gJL7NYAUXwbp2fIP8slWnIyEV1RYMP89rt6UrTYmjeIVptIqoagCJ2D7byJGBiYnrNVpFXd9RnE1PGQe59xe3u7d3hjD45sqFr5Tae21gcaxysXe0xUdag7Htzz+vp6vXjxYiu2Nhc11sCNOf1UuxOBvXjxYp2fn297rnilTrKw8O3bt9tpp1VOHSqRWytOcDV2JteroHy3ZRd+Tq5gG1iQQbN9a63tunrITnA9uOfXSPVUiI7Z3/osp4xCUF3QFLkGvwgFOSq5MPtZ4zf7TVZ2JFCC7h31PePqvOpDFVc4/ebNm72w59Ci50V59c6NsQpP6+xkJbtIkdXuX7TJyM5xa7Jx7Z/vtUyixmkiG1lI90OWi0KKlNzH/JSPs6BnqF2j0c9OTnZ1iRBcxzANZbnAzkt3JDC+zoyT/RY+0oW524axIQuRxKNHj/aywRB7x9HyiAcPHqzPPvtsj7sUFZQXVHv2fe1OA/b1119vD3RgXTNyBHxxcbEePny4/vznP28CqpL5iXdhECG2Q0Vrc3IatvJOPdP+kAJ0Yk18Mya+U46m+wCVRFAGC60nUvA8PB/CUnMGfl+NttbaXrfl+SarxaBkOLmI1kTVWbQId60dkbvWx+hjrbXnSOoFHQ0+5XhxcbEV407jqwnnm9JvcqLohUGcyKLPriybpeu8NUtaY9x5n/domF4Z6ENpBlHG5ASvrq7WkydP1lofZ+M1Mm3E4Hs1nocOIZh9M39r7ZcD1WB3Phry15n3/61Ra6Kl4WbLk9ru37+/GWYOr/WdpXvoS59xdLQ7ZrpJOX2kNz8YgSmJADObGiXUy8vL9fnnn29EfIU0M32qjddaewezEfYUFHRTb9gjfjtJTQDMZ/MwVZqZ7XBtjVtDoVmmQeD6Jtwsr3B8fLzV+eDioCETWu7v7du3m+dpGURr0ZDPeAWes8XFJr73rwHRGj5UqdwHvzFDJvKBMKCp2Shw0+gMIEPCm3s2x9R556lbcGps/b2Gmcw7ZkaNUW2YCvWZ9775fKIqzgTlsdYuQSNEOznZnXpibmp0qkt1OmRrIRfNNJnw+PHjdXZ2tl6/fr0XsjXErcOcut5yDDrUOfR/eiBkJw/66LP+jdw8q0YdWJH15tzL2zYROMn+2b6XxG8Y56eyhpLR7QBDQQCFv1CcI3OEn50kijDDmtZrSeF7VtFHDaHfm50qQqm3nyEoQ9Kx9zBEHsT/CbpGGIlcXgFJ62/Co25eJzdoqEXEDS2hNEYO4dwQs2R4kRtFKi9GgW5uPmSdzYV7QyStKSM/rYjWM5oQcSpCjaZ6qBZY1pO3Nqi0g+cX8RV1dQuOe9Il3t3Y6Sh0zBF4q9Tz58+3e5Aljsd4W6oh4lhrh4D10xirKw4CbH+L3CQ5ZAovLy+3Gi3zLMLpAYlFevPVZ+az89fSn7V2JUtKI25ubvbus9baxnpzc7O9cAeyKwf38uXL7bV3DLvi9xq6XnNXu9OAvXr1ag8y88CyXhZfM0qNW8X7JWV5azxSi1513O9OivDmlq3TR0frs88+24O9tfgd/Hy9lZ8mtAt68kr6U4LRJPHkRRNrrY9CBTLqGPXB/1teMavM8X0tCi13YOItRg7nU9zU5LuEcDi0orY6IH2RIZyV/tP4NwxnAJu4qMwmei8KY7hxLjVqWrOWPYL76dOnm974WSTJMK71Ae0XVci4M0gPHjxY33333YZyaiSh2Ovr670DPxm4IuKizbXWtpiPjo6201TMceVZ51EHbN40b/sxdw2T1WiW47q8vNycntIdReCuIeP5ex1QebSXL1/uncJCPxpK+6z8d8dtPlv0fqh972butfZfDVYyWlpcRmOt/QXfg9BaS3V7e7tBycvLy41HaB3OWrv0eZV4rV318iR3K6iGFCab8AmrnMuEvvra9PwkuT3Hd5pe7/UTJbhXM3Ldb9aSjrX2ObzWXpnwGZ7N8K4h4tym01CK/BoSWj
SV5aHTWcmxWUB9Mtd1YmRaHs/v5qaIUxa8PKBF1OfVmEMiNbLCk6K3WeRpoTZcmoZ6cksQknEpq5jZtV7DaNUBk+c0dJ23hsTq/8pn6reQUz8YmDqU8rz37t3baIz2x/5J/TT/UyeAjUOcZ8dW2XcsHIrv15h9qt1pwN68ebOePHmyx2F0wROmTMbJyclm/T0cWnvy5Mm2MCifNG4hOK6sFrl1Xq1p0uoBxfeMiX2KFvY8gqfhlXuV4/A3Fd6b4I72RVf01MnviRBrHTZia30gyLtb/xCKpNiMRMNN11HUclvl/4QhXtDQvpGNaviimbV27+LsXE/k2/op98fX1dPimKacNQultWkMmtM6y5H6WQRs3tUm1qO3BIKxI6+GsS0UbT/LXR3qvxo1z6nROjraPxO+P30HyoeYuj7W2kdEDef1p1vwyl9Vv/qvp0L4u/uXAhGBlIdea+2dPzez7QUM5k5/OdLj4+OP6BhjvQuB3VmJ/7d/+7e3Jycn6+uvv14nJyfr6dOne/F2j7mpkEt8dgANTXimGsObm5u9e7Z5ngEWQa21e5lsiWbtEBKrABk9NWht5ZRc07Ea+ySVZz1UZUQmhzKDDcfVerXWrminnnpWoE9jd4hTbL9wVTKDLW8g86bijUHrQm0mqtyVZ3k+B9WQzn272Pvsyo4eFcnSgeoQNCJUV1fV8VlADP9ETeXMWv5RiqD3cO1EiL03NDj1ot+FarQmXmbYPpv7Novd+ZsZalz1fF8qzm3qUWVqXbQgl21R+A0R1pFAgtUB/Xjz5s361a9+tY6OfmAl/ps3b9bz58+38+xL1h8yXjyTzwjl6OjDe+imh+1RJ67vJtOrq6vNsvPwjIkJ0codUHzGcW5vqQctKvTz+Ph4ex/mrCvSd3u81tovnAWvm2qnRIcUrh5PSG48ri0BXAPESJr0ohl/ayX/NLLti35Q6h7SV55iZrfcuyHn/L1h37179zaurcWwxlelFsoIg8ydzGydr/HQq+Pj4w154cjqfJsEIOvqlH6X84ReW/pRnVcrRxe6U6BjbKRAt0UMDFRR7KHWnQS+12vKJ7YQ3NzN2jbrBD3TsM9prnUqnfO26+tddT6dJgu62tqvJv86jz0a6K72vVnI169fr2+++WYLOyhOjZcHNxxrDdPbt2/Xd999t5HAXczCAouncXP9JwAAIABJREFUZCWSvM8pYbk3kBiZm5uPic2SjoyCvsiMdVHbhtExlYS1B6wGuAu7nldGqsaghHYntai1nplxN6mUnoE/VO1MSSzqtfbroHokimeaO55S7VeznSWPi+Q8+/z8fI/TqgGYHIdnMwqMRJFbn0n2a60NKaAiXG87DB3wctkaX/rg37179/Yya7NkiI4KYcnIYqUfynn0E49chM3hTZ5TzaBwthSHdWEeGfUZRtawdM1O/q3vgOzxT56L92ufXMuId7y2m631gTvv9jQURstbRHQt18DBHR0drWfPnn1yT3XbnQbM26IpU1O7c5+URtGk3eu9CRTS4B002S3Cb9apSnFysjvj++rqas/7UKYiskLbcgmNu016UQ0jxBAbXxMSjNyhze2eAX1WecHvetCixumRyMoz9bdH0EyURx7GgKCdc2UsjDEkdHZ2tkfY19iVp2xhrT7TFwR8jdAMd2vs2rciEfJroqTHYvte+093yLahkjmpzJrR7nUMML6WUy5qM9edu7V2WcY5DqjcOijnw2AWWR/aqmSd+Tt5CsEmzVH91ya6Zbi7XnBgXYe93ne0nuCK0y4Y4AwnsqPjylbW2j8v7FC704AVDay1tleJy1xUMNrkYYpq1IqYQKEY5a0HIJxyDGvtKpAtrAcPHmzfmV6tC4JA+mwL1ATOmhnKiRdyvxlGMErTOLZ8ZPJAiFMhHsNco9l71ZAqQel3J4fSe5+fn29Gjjcln5Z4NDQw3mlUmp5vWFny3vYSzoqhmYaZg2NMyHpybOa14TI5mI8a9YaHFpdseZGEPvRZFv
Hp6eneW6OM2wLjsFpiVJ6WvNyj/8ijvB+DRU8fPXq0t+ODHPzU3zrorp3u8ihHVYfWjdvWFcPV/bPKmBhF/++9Hz58uJ3v3/KUm5ubdX5+vierGtmuW98/PT3dIot/VBlFvRGh1MhQSp3QqWYPes4QpZhGAmRmiHpOUV+CSxlNNMLRtScnu3OVTNI81bTI0T5FY72+vl6vXr3aM4SQi0Xv2RS66MLi0kcKarGVXFcAeXFxse12IJNuZ5ohAFgNalsQDe/WWttRKjc3H7YGQRqMNHmR8fSE5pYM5nYdMvMTh7bW2ja6WzT6Xw7PIrm9vd08MSRR423ePZd8mlnWR2FtS1/o4Pn5+Xrx4sWevuKc+gwLjezwdJzRoajDNcJXMvR514oiZp81Iyd8Jhvh6exf0WEdcmkddW30w3Xm+vHjx3tFtEXaDUMvLy/3jroie88vshJhPHjwYDsTkHEzr3am9DktvSE7TmpSFLPdacDevXu3vQDCzXE1wqDp+WdYZQFeXe1eHd8TDhCfhHV8fLzBbp918TbEE6Z24TmGhzGYxlZGhHEo+a2Qr+ipntmiL9leQ9BK7sLn6+vrDb2Sx7t37zYlqqKutQuNJE7Ioa2GgOIYC4NWRawna7mEtxaRkUWAt8MLWczlliZS4hzwPmQvIdKsaw12s2rm7sGDBxsZT851qCrS1Zh1K8vcqsTwvXr1ai9DaFzu30avyrHNZJPrObQiPEbY/fGFE1H5HsTkOy0zmte0RIIuVH9mJrD86HwBckP/kv4FAmq78NI1wnPngXUI8UJ2paGOjj4UorfMo8mLrv3vI/K/dy/ky5cvN8PlphY/5FBexyQ0XKjhY3QsLjD59evXH3FCU7hIW0pvQroVQSWy58qw8AAWYFHdWvsV2pSiY/GsyZvUY/TUDvf2t6KV3o8MGKw3b95s/Eo5sWb2ZrU2R9CQkie05cT36nR4xiIbyJIH50kPwfjW+0A/UCnlfvDgwbq4uNjjEMnXM3t/Sv3mzZvtOBXjp2fm4Pb2w8kYNQynp6dbEkHjqMy9edaHop8aEM+uUdZX8+HN2J6z1i6rxwD2OZ/KatchVRbmuhzUPLjSjoHJQ+GqOX764f/0SX+sndPT0w0IFEBIshnrzc0uOaTWzPP0c+45di0uETDggCTXXDsLrWe704D96U9/2tCAyfdwgtcYsk+FF+C1392HUB89erTtQUNWtxFK62agAvsC2whaWtj9cE3gtVZFs1jKFxgP78c7WEytdZF5KUotLCaXp0+ffmTYKEIRQnkDfS4fVFlVCSyWhrhVJt69sllrv1gZAhMSGxflKnfIKBSZtSix3FRLVtbaz442fBYuMxqSFaUhyO/+/fubk2khr9emdVEYM8RNdkVzdVLd6wfZQMhQxqH5qGNBgZQ+0ffyQJOHKjKezXh6Yqzwt/fs3OOppl7T1devX2/rRF/ovQNIi5Z7HwT8IT7OfFv3bErr/OiQOac/n2p3HmjYt+1U2UpUNgshNKToTfeDmiUim2GoF7i52Z0YWc/kOpPqXgTo/ofO85oJgnqeiRjbH7/j4hhNqMd4jbNhUiFxjdhaHxSvh8D1ma2f8bzyQA2RzcdMqjR0Lo+IFzKvxj0XFSWfIZw+VMZdiA3d5pimXHtPz2qJjs98r9tgijTdC6IVkhStkhmnwtnMAzgPtRpk/2Zf9aHzVbQ09aMy6RYpetL56rNLfHeeOw8Nw3y3xqtozPfnOXENn805GeGdm2Q4OjrajtVqWI1vU5bEcTFULWMqd16Z/mAE9u7du/Xo0aO9bQItdCwnAQaWO2C4hI1d7MIP3+dpK+xOIKvOCzoJElIqD9U3EtWTNsyZyqExTnOCeSxjmNxFkVrrpnxORj6Hfibxy8CUkIds26/WLNXrrbW2WqWinXpJSuEn48UD+tkFN1FeF7FFW2RbjoXzKvq4vr7ezo8zXobp+Ph4yxAfSut7Zo/SljkkR5/ra7O8xloqgn7SI32d4U9Pt5
CsqI5wbHSSEYMmqmc+76GIdaxduNaImq2J0P1efWbwjIeMhMptTW65f8td8MOa3837+/fvtxdc0xd1pFO//X0ayTo/Cbp/VCHrj370o3V+fr6RpTIIJd7rMUDt7k1kTEyUrAal7eDm6aaua4P0Hj16tKXoC4fxAfrm7xRqxv49kE3af61dOOX5ru3OAN8r/J1GU1+aMrdQeLCe7lGe5FAq3wJs+YnvtpwAX4En4u0a4pm38jXuN9FCEV/74zNjK69RT0tvhBBF2Wvtv01In/GDc89ldzsIxT2bDJoRxrFqFiudgd7meV3TeFUvlGaYD+FW/7/WflW82kbyYgydh1V07Rnmif6Wc9RKQ9CdeVxREdEhZ4SDpp9+WhulTWqMi7Ks5YuLi/XkyZNtA/7V1dVG8zQZZp2UehCZ2Xbk759qdxqwb7/9diPm3r9/v16/fr23qVmas2GjRUDhhTZFYI7paZapxGRLLeq1bm5uNmKTAjZlb7LqTTy/2zraatRMZscwhWcS+n3JhRqbGZa1tqre130on+u6S2Ct3VE2nt26HF7eC1PWWlsB6bt37zZvOTNXJeFPTk4+OourSKXIZCIEi2vSBOrnKKdnlsD33Pfv32+H9Al7Efnmoouf83nw4MF69erVJuOWP2i4W/2X0DEmhtJcuNda+yePXF5ebkkWKNt3uhg5a/LQGjY1YqA/HEr5XI6Ow53Z8Y4NH3Z8fLz3ouhZEF7Kx//JTtKnCPThw4dbeVG3IJnncoqePddNv8so+n/BB96137mr3bmZ+9/9u393+xd/8Rfr17/+9fawtXa7xSkr62whmoCGa53IIpS1Puab1E2plTLphK7GqVD50P0OhYjzOyarPEjrvMpNyIhQAGFyjXdDABPSwkfPLrFLPjUm/l6jNb/fBTPHXYdQZSiSmuUsrSfjJetpyzmVgyp/1f51vt2Dx24yqI3Hhkw9X7+hBQW5zRo35CGLGpq5IbvhS0N+aIksy8OUpGfk19pxcJ1H925pEXlNvWxSpv3v7w1/9f1QmDvXw/xsfn/yf/1/11BDXC+9blg+G0dsTiFkcjBfHT85Pn36dL169Wp99dVX6/T0dP3ud7/7f9/MXU+ihmMOVCbm6Oho4y0ak9fbzKyX2qhJWh8fH2/xNMTjZAYDNAmySH0mb9mXhZqEho5r7TyCfjJUhNr6qoaiFsUk9CGneun5NnATWGNi8TBaTVMbtzHObUMUpRwaUrTPazEq2RrXWruyDuFhQ40an4auHBnZzAVVpXd95U+Ga+3eSrXWh5CyR1rf3t6u8/PzbeGstTZPjz8rX7PWrg6Kd4cezNFaa89AcmRNLhTpQRozg9qdKY0YoKrez1x3Hnx3rV3WXL/Nq7610n1GK5V9ERqqpnPUvpSfqo5O9N2I4dmzZ3uhZu9XGme+FlCtnHloZrfrsI7zLpB1ZxayyGOt3cIp/O2m0G6YXmtXZ9QCyoZFPCmB18PMheDs77nob25uNoPWibEdobUxM6zr52utPaNAmBRqQnbGvPCZwTF2hqVEqjEW7RUFIODrpcu1UIQaHWEG5MjLFVVI+3csM9tTlNVQxWKarUgashHCz7R/qYaZQKgzKA9UJNrQxeI1N+X1itQmaijCZbCKmMoz1cBAgp0D/Z0lGVBEjX15QPSBqGUio4ZY5r4GZWaPK+deV93p3/u9iZqm8TP+ltCU2nHQoWv0r5zWHMfJyckWhjfJwXFbQ+7xjyLxf/KTn6z379+v8/PzPcvMu8+K4CoBw1diuYugKWebjKeHr0E8Pj7ekgkW/cnJyXr58uUeydnTLCzm8kjliKo4PFUNqL81LIQEkffG1SORm8FrYsCibYg3YfRau1KHGQq3hkzfus/QpAuViup8t9xH0/HQivsUcVV59ePkZJfBIzPGFJ0wEU836NdAH5oTMoLQr66uNo5vZj79n46SY6vA3R+iqDEtgYx4VmLRkA+irxGq3mt1aIcMpyQL1FNOqRniGjf60M
LQItnylV1jU39xgGTeJFiNfrcfWcPQoPXFgXe9WEP607lpGdbR0dEeqV8DTW4Kou9qdxqwb775Zr148WI9f/58G0AJb62Gqe/hm6c3ts7Dwm/WCMQ0If1pwdnWgFR0YmyLCRGZNWRr7b+Qg5CKLlvz0nDWeMufUEQv5tBH32lldGG6Ce82IZPdDNY0bhbcRE+tiSpC6zMbRkF4NdDmrydVyDRZJEVhFmCNkIwwWZH98fHxll1+8+bNtvWnMm4Yql9Q5MOHD7fs99OnT9e9e/e2/8tods6cZ2++Ss6Ti3H4vZ9VZ9SKQUsW8HwxRou9J9qrQe7WGDKqgSTb8sszsjDnqB1G3lx5j6v+zKiIbPFS1lOTG0dHu8RUk1+ds+vr63V+fr53fNBauyytz7v29bnhNj2jz5wKubILn2p3GrBnz56t7777btuuQVg8ueyRhzsQjSL34SaupGo9ncl++vTppohrrfXdd99t925TI8Qg1DNNaN4Qw2SVAykUbwlGUc1a+2d36TOU0c219+59OAe+L7mdkFoWca39dDneo+Ufhe360JKM3oNxgA5n1qh1X35n0HxGkci2h/TpZ8l7CwGKa2u9GyPIIJAF5DXDTv3lJHx/bsAuHdC/9b76Tz4WpVKV8oNFdmRQZGM9NJHT5zdcbKP7xqFvddbdT9sxVvbVm6JA5U0cNRRcTsl9PLeHcs5tO2o7y1G658OHD9eTJ08+iUJFKOTKwbln5eH3Jj1alHwXB3anAXv58uU6OzvbXkJbgUq79mwwkLCevCii4YJQYCIm1+KvPvvss00ZGZsuHjzIWmtLAfNIjEbT7xR5LpZmszpZ0/vNFLwFAXGpvaryd3O1PrVSfJ6YUTm0f0pIPNMYhFdgOIPR0P34+PgjhW+yohkli68hvf41/Cp532OJqied88pQaFkD2jC1mT6yrD55Xs9h615azxXW1qFxNsbbGrZGEJ5dakATMuuPvkOH5TEruyJ9rWit/S2vVwNs/EJy89G5neS8vjBk1lL3yhr3dFYdB8RUR1802XCxe3Bvb2+3WrMiQvraOXc/6PCudqcBe/DgwfrjH/+4FbN2MlqwenV1tVeZb8FOklAni1TW2oVHLcWYmaV6LwOrMIo09LG/Nxxca0ey6zPBzYQExbdwW+zXui5jo/QND+tpGAbP84xu96lxrEExViijfFcVTt/61uMqauVTDqLcy+THis58VkpBP2eleDkpTf8ZIZu2m2jQJ4jfQm0NYMPnjrfEf7PI5nTyNkJ6DpXxq0MpCpmvAiSPhoaNIqoPE2l2rnuSRRc2HaxTaORQzrOclnm5vb396Gyxoh33rCOaznsamYlY9UVmUQhJRyEvBdZkOo3ZfHYrDg61Ow3Y9fX13stWLQ5v6uUxeB4PFwI1dq6gyguwzhRjGopmXXhcAlOta8GVmK1Q6pltHsWZQUIVdjNw2uQShDf62DCTN7LI6oko+iFvDv73OeXEKG4TCJyJZxaRaJSp9TiU/OpqV6XesLQZoBbMmjPfabNghR0UlLcun1jPzohyFJ1P97TAGYIiRH1sn8jUHM9N9r7jJx1gGMioXCmZdj71zYLs6RrGT15aryt32bmfHHAjAjoNvdZQ1zi655w/rWjXHHEmXRfGQ/cYeXrVzDcZkzN5apB3eW99dbBl5/wQB9h2pwFzWkLrvxqDF3n4WRLZxFDMub3BgNyPYPqc7oZ33YSVBFXCsQiCwjnf2/X+Tyk80+IhPM+sJ/a3EuJFHkV7DFbraGr0anQsuoYAjFOzqhAr5FCuB3qYBqaEa+fVK+8qtyJo8isy6O/k19e/FU26D8Wtk+oCmckQfaZX3WFhkZQ0N/8We+Xd+rYuiC62GtWJkiuPogOOSt9cp8915MbZZEJLZG5ubrbz+62Z6sFsRZYTXTdRovaqIXGTJEVT5c+KylA2khprfajDM36y757VtXbJmIbDs5n3Ho3UhNBd7c46MGGimqr79++vs7OzLb1JEL5779
7uxapNUVMkh+dNq4o/sgh7TeNrELVbLRqvT+G1JqeL0HaoZiO1GfrW61kAPOshj9aJmiQv3qgnrl5dXe2dcqlMpJ6qCt4QfPJCvuOaGgyLRV3a9fX19nbzclEWXuE9JcRFWYAWY4noQ2HAWvsvnCUr47JtqDKZqLZbr9x/OsoWUZPLbP7eZ621W7juRabTQTUSKFqC2KtLoovyQ/reOrgaKkhxJsCmblXvJs/c7zqeqki9/S5CagjaNVZkr0+9bqJFme4+CxJmrBj8OqfOUZ3YD85C8n4eeHJysveSgsbdFlw7YFGWmK/wqiyutYCbgQNdGUithZnTUqucbraQl5ivbtsTSBSw9+znk9+poOutjWH+ffJ3/uazchhkZRwloOtAGJYWy5b36CLR7CNlgJo4kP3Rdyi0RqPKOPnM1o71iHCL0Xiurq72soxTN4q0yLooxjyVkOd4a1R7f8aZ4Z7lFZVh65Jcd3Nzs7c5fK1dQXHnrOuEkWm00fEwcKIU+tNQvn0s5aEZTyMK4290g1Os7Kw1OgC56WvHwii6V/VzrbX33gX/FyJeX19vIIjeyE4y9PScvt6VhbwTgfHUDJHjpWvhj4+PN5KfEmk3N/sZOANpaMZgmXSv8aIUzVBCEATeDc0z1jaBLQisp6wnbSv0bWu/24o+/F7laajZNyBR4NYBUeQmSSxWMhd2kd/kl2afa7gYgaLMhvz61WwgLqhjqWctcmaAycBYq8zGXWRjjEVoLblp6QEZ1xjP5Mda+85N5u/169d7CMVCn0i9C7oylGVu1o7M8F+uFV6Wc1tr7R3B03KYhpJzDotIioobzhk/edUwae7juiJCsvK9tdYevTKRPF0kC1y0/9uvPBHto0ePtoLkyq1z0tCV8/lUuxOBOUO8k1KLCAZTsHIs5b1svxEqdVJwV+7ZbQSu92zWugbCDnpJhKI2Fr2K36zZNEb+X/5krX1vgEtwj7mYJsrhdVv42TBym4ij3am3NQYlZSlHwzz3E9oyiu/evdtC5RrfenVGBNIynpYGdF4oovH57tHR0UHOwr07hoYDDf3pRdP9NWjd0VAjyYCUq6KbXYBC5/J1natZx8c5VH5HR0d7ZTv+/+DBg624FV80DbD5KeF+dna2ya2GrYbAc2u8psFpCGfuZjTUM/fX2u1S6Ekl1f8a+K4F31f3Z23MQ04nart3b/eaPc/zs06/z/7/B4HdacC++OKL7UYtOyhpX6PRFyt0EizYKuFauyLHWT1tAN2awoh5g43JqLGgbMIDBgGv0O87p2xPGEEdDXdbfHdoC0nv0zoiE9Q4v8aVl9ZKYlvAzdzx6H0xAw5pcnI84MOHH153Ve8+620auvm8WThzRZE+xRN+6v+Qn9+70LT+vWGd56kLbEgONRVFFCEVgUPe+lbutk6YLG9vd697q1zU+dEzYZGEyOvXr/dCb/1oSN0sejPwLW0oEvXsNske89f7rrWP6KB58sVRlTaYdZGauSLTWZ/Y+k062uytvpOj9yPUGJNBQ0jfsVY+1b73PLBHjx5taIC11G5ubjbUBVL6O8JuhitzsbhPSUyDf/78+SYUn/X43SouYycml81pGrkG4dCEeUYriKcA8QwlgQ8hP5PQEGFyLkVvNdiTzK7HnQuhxZrau3fv9hQNQd6wlpE/Pj7eDBwZQGrmicxaU9Xwr9635HZD0VkdXkcmi9Zwn6FiePq255avlMtScA2VrbXjHX3H2P1DWJdU7rxNntdBi3WiZMpBlWtr63X6xbm13KYotxFL7znDLQ7b/5sR1i9/b8HpIQSK51prl8BpmN7wuqGfNg/b1HoYZI1bHSqdNtYZgs9253lgf/u3f3u71gdD9uTJk+0h80FdPMKfcgsEbkAlmycP4rszg+Zn0eCM2U9PT/cOUGwRHKNjUhvLV0CynJ3g+Z05rrYu3oZ1DJfdCya4W6BAf+OetU6T1G1oDYaTc8c6ZVhZM9LmZGZ8zHPnqTVb1Ylm1tZaG9
cynVXlUmNeHWmbWdup2GTVULT9b3hfQ16ZVSZzzFq5sGajZ1jc8dUQVgYannf2q3KofpTjajuEUg7p9aFsZPV7rikyI1+60lCysmxyos7Z9YyjdVX06/y/Gshf/epX6+Tk5IedB/bq1avtDcGIehbcthbelCI8evRo8ySPHj3aTm+ooIRoXUStcF9r7b1pqLE0QTVM6UTjY/quxWmEILnZSsI3W9SJLiHdyTau6TllF9dae4W3t7e7F7MKc/XLIoUGeGT8SfmCen1hCYUrqoWMQPyJFGS9unDrHXlNY5le+NCis3G7zm6tHcJoqEEn1to/faNOrw5zhoXCOCelIMihzGkkyLDOxRzq36ec+xzLWjvqwPowh6ICz6sRPT093YBB71laYvJS5qgIrGuhxpu+om9KV7Twt2urzz45OdnLKHJSc46rMxNl6a/QvpvvhdA+KxduvUzdme17N3O/e/dufffdd1utzlr7r4riEbTyOs79MSBCYMFnKLDWh7dls8g8bSeykzP/zxvgfj7VKLtxCC9Mqnt1Itfar2xuKNL7HurfrDejMCanYe7V1dXGX3lmq+aLWkv06vdc6MYFxTRMmUWW6sCMrQZ9Gra19mkCC5KM6gSqhDWczZr5f40hVAjZVZFrdFwzs26cmUTP1JfyfOWtyNg9fe47+qt/mjmEdjyjB0UWqXhWx0cfGaYZbTCwHJW/tWC0Dt440D0cZd9AT66+4zPzBLm1XV9ff8R7dd30zUX0jx431L+5udl7CU8Lxju/n2p3GjC1Gi0itOAoUGF1lWNCVEphEnlIi8aEP3nyZK+MYsb89RJzsspP9U0vDTcsDBwZJNSXSqhc7iLsT0inY6R0a+0fAVSv63km+Pp6d4yz/iI6TSq5Nhs8F7p/5Mao2Hfms9vb221rWA3HLAC2EOactuSkDqWhfps9ceSoT0WRlJ8u1EBLHHiuRVJj0LDSnFQvmyFnXLVmBGe5x6eMDP3qrpKrq6vNSHEUL1++3OZ78mRd9HX+9+7d24wFnXnz5s1ets+ctLQE0qF/6gwZQHNU7qs62jIZTclI1+1cD70OV14ZOT7H+O/fv78ePny4Efnmex5IenPzoeyovOmn2p0G7MmTJ9vxrzc3u9dY8TAgOiF0awhh83w8DONFCEUTk6tox01UN3zjWAp7OwEUufc2cfZzep53NDKms5JYSGo8JrAvVNDPhniTi+I9+94ABqBIpIvVJHbBNGS1CFsjxUApF2jmuB63oaP+l9sqB9JMZHk6/ZpFjS2H8Z2O5/j4eM9ouG+97tHR7mC7GqyGgPrRN77PUMi1Laep4dO/WW5Tfo9Ra1aP7N3Pz9Y3qm73LIagCGhyf+YARXNIDybSqjEju7V2Z3FVvn2Bi/DSOD27hdgQn8+sPfd9/fr1BnhcT6cbqnqe8JbOGH+zj2zL5GXb7jRgf/rTn9bjx4+3VyZ5QwxYPGGqkLGTeagGyD1Ya4rRavx+v0peYpgA2wdK0spe8LV9cU49hSV0nq3n6V9eXu59H4lfg02htHq8Tl6NCYNeZMrYtvL+7Ozsozoez20dTwsnIQYKSUkqr2kUKVENxAznGjZNgtwcnpyc7CGdtXbhdV8hxxh6HhRdhNm5nOMvMV49EYa1n9XBEvYcdO/ZLGTDMvNqoZ2cfCibcI3FNpFrHZM5aYlK55NjmsmIzrfQuCiqzh1wKPpu8WkdVNdfkZsIZO5kacG4NeD9Dz4TxeD/jo6O9rYL1tC3XAZPZm3r1w82YCU5z87O9rZnFApTEPVSUAyE0kVe/qNe3z0p31q7Km4G9Obmw4GH0u71RDzDTNErMWj2pt69b+/xt5l1q6dByncyZ0jruvJJRUz1UIX/PVPMdcLN7vMrurAA/YQOZIssFCF/n1sFp+ST/2n4W7kyWP0uspjiU+KSwK27Mg7XNyw3Pn1ba3+zf/vg3vrpGv3r7oUaJ4a2iL/cn4Xk/vrgp1CniFZNozGWmO615rEcY2vaZq
LF94WqdR76hu4pH2XufW8i8uotZ0E2+ClNeOu57t8z6qBOpyS/e/duq+Gr7L2puw6i3F+3zP1gEv/nP//5Oj8//4jbWGtXnV6inWA/5VkIpgiOsE0gJawntr2UaDJ9AAAgAElEQVSoZRJzC0Wf1ed5z6AFS0iMT5EfwyH0Oj093cui1uvN5EWVZK39UgvX1tsUencLDmPEg07+gVczXvIv/zBT9+UxhK7dGVAjoH89dVR/eU1zMKvjhU2da4roH6dYB6Df9+7d2+M9OJyiQwulVfINw/WlIRdZQSZ+kkEztEVYa+2Okamu1aiutX/eXPdINrSvQa5j6Ib8GnWL3bhqnH2/i3/SC+Q+F/+kWzjQ1pDVKckcStLQBfrUqKZ2gjH3prKGwtXhtXZHXDNypQo6l4fanXVgf/M3f3N7dna2Xr58uVlaiiCj4PNDB85NHqtektIV+QhRKFNruqCLuYfLQCGl/q01KZ5/qJlAKEe9y6wJ6318p4u1z2DAmm2ZZ32ttUsi9FRWiiwMwp+0oJfy9gC7bg3qAihvORFi54ATaW0OZSry5IRqvLSGxEV9LceAHlxnjBb2LDae92xYXurAPSqfmUBhLJvl6h7djt21Dc3nWCujUgPkUsOlf0Ui1eOpS1O2dKOoqtfVMHH284DD2e+5TttfBv5QXWfXVvdkWov+Zp0IDYsyJzLVj+vr6/Xo0aP17t279d//+39fZ2dn67e//e3/ex3Y06dPN2EfHx9vx+XKIvAm0IJwAVJ4/fr1XjbPAmZlCyE7EJ5e1gJcV/4AQa212zNX48X7EKr71RNOTqxK7Bklbik0j89bQXjd4KsxQELVNpNloiyCtXZvEbcAz87ONu/VeizyqyK3NMAYa3whZ3NaL25vnvlpogFq0qfj4+M9T1zerGR/ZbfWjhYolWAeoNKGU80gux4/VALc9y8vL7cawxpDfTEXFpgdEr0/eZojRqPzO9GNvljYntN71HjhWBuilverfPwjyxoqfWkSwjwX9U4U3PdPdkx99wQ9u7ra7UTQui2uRrXn7TUK0jd1aXSwtoD8UR9dn59qd55G8e23364XL15sXkjWwMBN4oMHDzYyU8PFIOZm2QHF4SE0fy+vMGEtOFsFnXVmnRQK07CxGUQe1rPL79Uj6YeTJcob1PhSFIrcEK7jrLE1eUVNmgmfz6nxqow0NVBkAfWST1GP67thv5yZn0080IVmmrSpfPPvh8bQ+dU3RqdIXbhcA14DWTnN6vwagLmhfqL2ojMyfPz48bboGT4hbQ0Qx1tH5LknJyd7We/OX/WuIS4ZaRO5zXWAj2yJQsGI7WMz1H78+PGWafW8s7Oz9erVq02uRWGHSnCguu5JNe4i4iZv5r09u5UKh9qdCOz58+cb2moHeV4PA1VL2Lfux2Qj+Zs+L0paa2e43KvZGkJRs6XNbCckWOH4XSg1YfMkTRvWTo8HgZV785boQ2FODZr+TQQqNNNXSMNmYeRwQyTzgTMzHtuVKEQXermTytu4yi0yErM+y+8NrcjMfOA5JofTgtHKAP/FMFah67yKoA/RFOX5ytm1JKDzyiGZMwbPta9fv976dHR0tGXaa7x6GCMZytjXMHtGw05jMc6WN5Qf01q7N8PwtsePH299bIKrRPzJyckWQWilhuiUU2kawupDkZ817vw31IgoiY2gQ+USZ6KmJ1f84Czkq1evtvKBFqj1PXj1WgbWUyjW2iltiX9NR+sxS+BNiwzRQQktWNWUPdQjdqtOoatWHqjPK1ehTYVfa3+XfhMTXdw1kjUIxoWj8azyDr0XPmqt/SppxuzJkydbiGDyuzWrJRYzI9Z5a6mBhEKNaCmCGpImB2QdZ4jTZ5uT6hM5z1ILTrGouaUBDeebBCDPSQyfnJxsBdPmho49efJkOyZnhm/k5fnG2JNda6QmAuzcNznVl7nSUX2jU4xyUf1EKSKgFpwWxWp98/yMEBi6uavFuA8hwMvLy/Xq1auP0Oft7e
26uLjY47usw6kXU59+cBby+fPn2wDViHShetBM8RMAoZhghq/eq4ZwcmLCS9+rB+NFKVi9kVBstnI3DICFCS10/yXD19Mr24rumqHsFhqtyjxDqbXWHiqlqIpHmyWivMbajdgWlcyxhW7ReulpeQbPmuEA+RXy9x2BvGq9aENMc9nXrUFm7ltnVk/cMKgGowulWeAmRyZZfXR0tM3x7IPv1PhZmE4tLRpiEA/NpcXYI6U4FDxm+bcai5YM6D/HbF6bqdVX66kUwWwXFxcfnZCKoNf/hsOyqAxxuWVr8e3bt+uzzz7b/l45inBEDnXIa60tdC1FZD47Z8Yzs/2z3WnAfve7321v5Z7nHE3rX+82i1R1kNAeP368KbvrWkpQT97SjGbGTPqh+Lh97O523tbkXF5eblxE0+oT6dSotYShiwu6bHj1KQ/Zz423YQVlUZzKWPtOU9BNvVN0hlgGqImELtgSzM28Qll1JjUMJc0pWMsOmq2EbswFlHJ6erolJ9zPfJe3I18yZazLnc2QvtlRTou+kWsN3uT7Li93x6DXURwf77bX+Dd3eZyfn2/z4LOiQjLr+MwPg7/WrqTHWPWvIeTcY/vw4cP14MGDPV5ZDaWo4fp693Z76xNFcXm5Oxi0BxBAU/Tx/v372+m26rkYQM+xroo4cXLVrUYPngdRP3r0aG9uDrU7DdgXX3yxXr58ua6urrYMZLOMQhEKbHEITxgq7fT0dG9iWufiegu/Rw53wt2zpyLIbNRQEYyJmW81Kt9RRZoCna01QUdHu2LBmdyYB78VXfk3M5P1SgxhwzEFqhNdNmy5d2/3YhUK5PvdUCv0a2shpHC6yFbFNUNYQ8EQCRvna/XMeVFGCyHJv98vioFqGMOGGnWEsmUTlVXHPKuonhFgJFsiYN45Fn2kh/pTRNuwVR+mE5u60bXSUhqNwxF+FdnoC6csYqrh7JqtfpvHSde0oJysPLfyL3Fv3dY5ciIXFxd75ULkSc/olAgD/XBXu9OA/fnPf/5oT6CaL8rqX8lfg+7LPHljpF7DpBLTsiZTaSl534ikdeMnwXeCJQ+01rbUExEoYVbB+6/Iz0S1wLbhtN+bJeX1q9RFmpOPIgfoaPIwDXGFneTPK/uu5+pbF83c3VADL2QkP8rMWWnQTRXVGegN/7X+3nolzm2tte1NPcRdyrY1NCWDot/yMT2U8BDqZyCgDuOY3G55OeEeFMlY1yi7hiw9c75Nqc3irlObc9UIonN66CQYAGTyau7jGRcXFwezn5rici/P8QrGGmz9UEZhN4/mO7O2Dfqi6z+YA3ODvqnEDQtvCdrPQl5WvOUFthjg1Eq0Wqi+SzkYpSqM71OKCsFk1YP2bzM0mkfLNDlgsXZLTE9aqOGDkGqUKY3v9gQK1xYmVwmnJ6+sGvboS5XDM7tHrjJrPVX/3zBT+AHR+F4XfMPuetMiw6KvQ6F3k0T0RbhZeoJMje/29nbjaZvJqpGDFvp+04az5TqrQxwsudd5dA6MFbda+mQW+3YnQJGMMfZsrBaI0v/yrMJ1+gmBet7x8fFe9Tz91U86MbPJa32oAeXERDgNv4WDQlJzMMNqYzZ3OEHPppuAElvz5z//eY8r+1S704D13B4LEYF3fX2999IPyEkadSp7vUhflInkbSshPluJQsJaa6e0BK3Gx/1xLPVgBH5zc7Nevny554GULRR613sUFdzc3GzcgQMgKYrsH+VqdbMM4oTuNVI+L6GrXxCJEJZhFwKReevRalQod+v19NtPcitH1xC2pPgkrlvQ6jML3aK1kNpHDvD29nbbaiXkrvHyPQuh+2DJpUi399dfBmVu5/KTIWBc6sRub3fbjN6/f78hjdZeFZUwOGgM95zoCyclBCTv8sjV4+vr623PIkDQdWeOGdomSYqm6bqIgkGhw6WK1lobrSRB13Pg/Ox89ngsKJdOVVe6FsqvHmp3GrCvv/56Iy6b7gTBnz17timmgROe1ord6+v9Cu
S+eso9OkH1yFU+hsOEtbnftOglSGcI1nu2Cr/kZPclUhwCvrq62g6I6zswjYtM/K2GoqllC6snYXQiy1VAqL5DlhRD6+IomV1+SF8gLfs/8UEtXq2ym5MaiyY9GKaGVLztNIyer0+uWWt39FF3NiDaeXb3aDrffTxXJhDCNBZj5WhKAzCOJycf3mDO2HE+9HuendXnrLULIdVITaPVkIvsRAGeVaPa0iXGoTpxqLynTms6q667AgHfgar8/v79++3FN48ePVoXFxebznuWMHRy3M24F1UaK8dZvvlT7XvPxL+9vd24MJXdFXSNSjmjLhTw1LNYZgqPO/Dd1iY1kwElWHStjyo57L6e0exlDRllU4oAXejTFGArtyd3U68zD2gzoRR8osvG/pWB8ddwUljyMuHdBM1wNpwrX8IoldtkrLpH1fctlo6pxrMhkJR9kzCdP4c1MrjmpuPv86Hh8jmlD2oMqg+4uTpKC77zWMNXB1F+r/NDn5rdLVpuYqvJpxo2+uaen2oN02dFuiik+4PbZv2XOeEE6AADX32H9kuPzP62b4wrWujo6Gidn5/vHYRwiBYh5yb01lp7+22/+uqrdXR09MPOxP/222+33eTn5+d7e/Z69Gw5hk7S5ILAYh3kDRra1Xi5f2HmIf6qC1SoRmCIRtlLk16P0n2QDRdAfWNqHRSl8B2hQMfc2jd9c68ap8pI6+Kc3J5nzIwP2SiLaPhLyfRrhgOI7y4oxohB972SvtOgHx8fb4rLqPZNUhS9RLnrGO9ZI9R6rOvr670yEnNm3DNz5k3yDZt6PVm3GLb9IS+0CBla0E1ukM/9+/fXxcXFXiLBHOGfKruGmW3lXY21pDlnVeM4IwR/M34hJL21dmbNW+9XAFAHeXS02wvcc9U4nfZ7AgxgSClNHfT9+/fXs2fPtvFPiqntexHYycmH43EL70FjN693r4fuYuEpSuIyVo8ePdo7VsaEm4gS3DPk0ywKtT6M0vTmWsfNC2k4pZub/aOjFSpaeDVAUNBETBSp0J3xaAZ0ZvOKKqvoHQ8595rva0VMlenbt2+34sOJ/iC+osh68In0yLaorWOprFoz2HDZz3JXjNs09tULujHr0zjJ8mTGouK/4279UXWS41KuMREGpNM1cWiu3HfqYZFt57WOit5XXw4hdeNuGA3x9Dtrra1wd61dhDQ5QmtsolgGjY4U2UG3pTY6j3hEv3dM79+/X7///e/X7e3tD0NgFxcX68WLF5vnMFBkvkmYsNLbaHAQJfYoSKHsmzdv9rgG93J9kYTnNUxa68OuAcLXL5M5a7B43Hv37m0nbhwfH++9zPby8nLzolUMC4En9qwarypdEUlDkSJNSHEqbw0W1IofmFnL6cWLyCp7fYdGNYuvi5aRbRmLe1M05G0RCgRYDtEzbm5utu1prSubHn6Oba1dNlGTlq+DefLkyRb2tNxnrbXtVy2CFC00077W2vSy/WZonz59upeh1qdZouC6GhXjMZ8I8nKEHGcTDsZPb2cxrPtrE0w0otAnOtzwr06jVEa5tGkkOQz6ZPz0g74bbyOOhuHe/Vm5TgAz250G7PPPP1+/+93vtt31R0dH6+XLl5uBmm/uMQndz1WE0a0PFkTDmLkoixbqqWS4CLPX1TMgGrtQy5fwMGt9MEzexOy587VuNTpaXw/vmV20NzcfMpwNbQr9Pd8G5sqzXpXB7mvX6limd3eNUL73sjB6PE5bw6TyeXP7FUdEjpIPveeTJ0+2sMW8UXBZrDdv3uzxM4eQ5HyW+jJhRjkuhskcdyN2naH7+U51i6HVnNLw8OHDLSytrPSRAa8DORSh0C2GVgFw0V9DQvxvX348w3drhDOTkJi6wYF3jR1KIFn3zTSXb6MPNryfn59v97dmIXpOiUMhe+uCXDmk7jf+wQZM1hA6YWVbBiAt/ujRo72KaoapxKiaD167BbBVzoYvtkL0Ba4l/XhZwmsoA/1QyhoXnM8MjVrSwGBALY
6zNoGqnt++fbspWHf7UxByqrJ5TvnDkq79bmE4xdE/E07xyxs0SVIkAiVQtC6IIooin/fv36+Li4tNduXUzFnrfVzf19fpLwNhzvEo9AbyrmEt91LOqjpJ5uYVUmkdXEsScHyTpK7jtBChMY7N83Cf5NnMqOshlHK6E1np66zKbyjYUJzRIVu61+jg6dOnW9aQ4zMvrdVimNfavWi3xu3i4mLb3M6RM+jsA3vAAEJdHLPx+1fDPkl9DrKntH6q3XkemMVDuPUK79+/32Dl48eP9wxILW7rs3Ta/xm1mcKu0uIbTL57t+hyeqxZrV1v1Xt7KWczbw0leg0k1/7VoOLw6jGm55vPr+dW82MMJfxN5CE+rgbDcyUqKLti1FZrHx8fbwkM/ZnZJ/fs/tGGxv5PWVvMXA7JYu8R1BAe4110MrmdhszCmUNeuQanzrMIWt/ItmHkpzKOzZDVwNC/Gf7OjF3De+UgEIbfFYZ27G0twi36K43Sou179+7tZVftc4Tg8JDWoetaEWCd6yukVRLf39+8ebPOzs42Q69/dKd6CvSU0unco5QOcb6zfW8lfssRTIRFeigcab0KwwANrfWBsPeig+kVTXC9Uzm0KuBa+1X/NRaNwfUdOuiue96yXAXFbcUw71K43XKOJihaM1cDdej3FkU2VGMUypl0xwIuAZLqZJuDmZpuoWersDVo2JxyOM0sFZmVg4EEfLfyopQztOYAy6cYS41TkSR5F020HR8fb1k6iMU8Vi/pyL179/Zqw8i/hp8MGf9mY7WeRjyJ/UmzuL8Gzamh6vFIa+3e6eB4KIZAyMuBFjDQi74TkiMij6I1TrmlF2Tv757rXrjL29vbLbNaDnuOuWF1eXP6M7ONtTl3AYE7DdhXX321Hj58uJ1C2Th2rd15TTgMC83kNdYWVuEhGjr6jlNTG5YwLNBOicfugetich2lLXLDa1QZupfQQm+IdCidbZE29COXbqL9VKNE/f9HkxMPZVF5tj2lvCsov9baC2v0r4Wh2ty/qpFrZVQZ86Cz1dj1EEFzVwTeHQrTYXEOENHp6emmNzVudSJr7bY1+Yz+NYPu56H3ZZYLurq62o5qIqsmABrmM3ZQbUNLaLjjLJdaPYEuyzO9evVqG6t+kS/ZTPRKh3tsFKfuOJsaJOMpx8opGpt1Qw4SeYeykfpSIw/59b50WalWI5DpbD/V7jRg/+yf/bONgMZZ6UQRi2MvSmgSyuvXr7drdbwnN7TjhZ9NwfPChwj/eiyLtAKcz6No9+/f3yalZx/VCMxzwLpYKEZr20rMu5eJLkHvurV2IXcTCkU7rejvhnYIFp/UTfIWXjNDLUJuGOF5rcXTrzoDn2nlCt2/Bt+ROQ3JGMbJ/5TXRCPw9L1f+9frJnruM+7du7fxt3SH8YSarq6u9k4RRVv0jP5puMwnowpJ2dRM7nShiLRo8vT0dG/3wlofsv+e8/jx4z3HM9dYObfq5tu3bzdjhaMyX+oZGS1Gp8ABipMx9hzh3bNnz7aDAtb6sDboaus7yY7utqqggMff9PHBgwdbP38wAvvmm2/W559/vil7QzmxPA5DzF1isPxNs2iEQcEPbfGZJB9BNPykIA09QNVa/8vLy726Hd5rnjtVEv358+d7mbOpgC9fvlyvX7/ejM48f4shgzbWWnsGo89Ffh7KGOEtZmuxYeWx1m4vHhlyEhYl5RFGIHH1C9qc5Sf9vcWYFJ8+1OuXB1prZ1QtHE4H0QzRm2dOp+OjDzWiSOImjeiB7LJMeIlx15MZY+Zv+tTTNXB+7Uv5p/7fZzM5VKM0SeqW5Lx8+XKveLfGWfRiDoTp7s8hNCyE6HpdnVWpAPwfIGDerq+vtzny3W4NamPQCgIKMBqtSACY7373U+1OA/ajH/1oy/JZwA3xeEBWWhhZ6Co97O8lhNfahW2TY5nbPdbaWeaJFkrQCl0bY5eMbSi61u7U1z6XUH2XcbKIjcEG1xpaCjFLKSzQGrHXr1
/vpdCNCQIw8c0QMljlg8iI520msgiqmSp9WGu32M/OzvZI26LZyhu6pGQMl6Zv+tmEwtHR0fYCGEkU6Mw1EAmyvoilaBu3wnBxtDUY19fXWxZNWNvxqGucYV3Ddk6WoWOk3INu0X+ois6Yl/Ky+j1PPSk6LcFdB2FezT2AgO5hiDof/glzOWeymJyUnxyfM/afPXu2RSfuVb6tIazxtphb3wGOrj2h9tu3b7f3kt5VQrHW91Ti/9Vf/dXt559/vr799tttV3o9aif46GhXu9F9YpMrWGttIaVJ7ELDVcmOzGyX32e4Y8FNEpJC8vQN13jSSXivtTN6nqe/NXgl3MkEMuhzKRSjPvtuwt2D0vN2vSeFaLU0+deL9u+HnukaY+hnPe1jZoE4ln6/WefO+bzvDAGnty5f2b4K9eZ3O67pqT8VdkBlUz+F6JUlg8jxNOt7SJ79XVlNEUa5xNm/SWx3TDi1WSx9SO79rM8qumy5C13pnsoZ/Qj9qyvTQZY+WGuHKumL+W6ENiv+G0LiPX/729+utdYPq8R//vz5XgU+D8NYWLCMVy1z0dpa+3H/zABCFSwzYyJMJZBJjHtu0/UlcGtMZZT0uWHAXEi8irHypicnH3bY8zwlwY1F/8nI+Jo5q/L6rEoOkZEJFFLiHPlZctf1zZTxaA0xagwZCL/rt3s4IrlOivL3TeZFMWvtG5TOsWdZaC0CZrAtVNf67kTWE2H2sy4GfZ3yIYciSH+HVtQhNhqAkNt3ffMWr7V2vGNlU8NNv+oEO8f6BjnTK4t/opOjow/8Z5HORHQtEzLXa+3zT+0D5Fnjg0M2T3UmDJH1UD7WXLofOuP4+EO9pgL5ho9z18lsdxqwP/7xj9smaDwJpe2GXJ3volhrf1f7mzdv9lK3a+1exlAuqTCboAhgGgzCrtdpuNWJKelOmBMtgdnKKIp43F8Y/PTp03VxcfER4ugrwXiStXZZGK18EwM5DYt/lT8Falq/ix167dhdh7xWkLrWbsG4x6wpmhufez8LqkZBOz4+3sJBYy2qM6YaYIaupQd1SuVEGrqSURHlIV6Q3E9PT7dxFT1cXv5/7d1bbhs5EIVhZgDlwVEuC8j+N5Yl2EDg2AnmYfC1flUUzWsMsIAgsmyp2WRdTx2yX66OsPFeMTtwirJ0ZlJt6mhKCbLNkHq/t+bW+panN0vSmX3PexV0Z3aqHO31y8BX0tMlgaVYHN1tGbrWxTGy9VsHBNC5QjN8gGDx/v37gzh7T+4SWSmPCRFdgMGzjJRqilIMrcCj7tJal6fOMPLWz67f0sgiFvDlBFsK9/VMXUs9KH7y6dOn4/l+6vNpADVm3ZgC/HCtlop/evyV1LzgeXGeifGJ5uawkVx54zNVbNeiRJOw2e5eywmGPcfR/wsDFDPyORiH7+LEkR77HVVyc6+0aPZOz2YJ47rdjqQEBCRXT9roKP9pUkE4sYnbzq7orDRKrG7553XXiKEXWpklbvV5QgikSUAPSqzuteFV4mpLZI7WZzsOHfDSo9pUcF8lNtNBVVQ7ro+Pj1c+AVVFMLlVblfuZmB1PD19VDbVL9aGbslSIK9YWbtqOj6UrYowjafcFpNk0ijMxMy6EL7DolvIlji3ukjmAlPafQKbLaR7LkZXY690Efu7mTlwmpzwxElankrh63g40wKpzWQ7TqWj02UnhjWPselOBOvXErNYUukN5RqVflFH6G85s2ajNRTjV9aL6uYJVUbjoIC6OZtgdhsqUwfpXOekOo6T18+oXDi+4lXtVNOf6m35UGutI+s3d+ZAF5tefvny5bc165jMT3cWmGP8zpaVxlRox+4O4ylvbq1L2djrnU6ng9Ffx27OYMwqkgbmW3LXgYk8ohFFEY1EKBfF3Xh9fT2ejDIjPQUuZiDFplxTaXSselKjyawy+F9WUGyje//mViOKwRBbVswsseXXWut4vJRxz6hnUaTOFH9un/H3kyxZbliFgvt9FeRY3H8uHLPpeH
2ngGOcLX163WYkrssJdMNvQdkaNoJmx8oZTFwVQdI93GJxtwRTKchO/N3r6+txmqsdIBXzXJih1ICW8w1Eri87m9kzqoK/t47deQE68Hu24nvQkjzhuxia+Wwi0GeF6sR6v2stEJav1Yqnc9yDBa2byquMAtl2pT9bO+vjdJY6p8fHx6NTbF4bvP4kdx3Y169fj6cvW7SfPy/PlTufz1el4vPz87Gx+XS6nEw6Faf1usylXJO11pUB+FlG1zT/VtuZABpFD4aCXzRb6i2HvJbRzQd+yjREnVuM4ToIDxj1fjtexu5fDWhiDBShStHsdirgFMqx1uU4ms+fP19lmGtdIvOPHz8OHlWNoDwr+IimAydJ2UV1D2oo49q/W91mv1vr+uRS9wUzEa0LbPvshw8fjuZSsTvS7mY77cbZUk22KGNyb5NsKavgpIxZ6Vs9QpxlU+6z5E/jlpF67T45Jfrfknhua2NvBdEFVutaUi9MsHuPbZcri6D67poyqep4qRddV0nKXJPy0m7JXQf27du39fDwsM7n85FhUc46GHvATqfTkX6SpoETMKQka13jHiazxluQeCpL32+DgAF1LI3oLc8YooXivIppMNouyFrXWRBFkll8/Pjxt9RaZlgntNY1A7ugaTOyGvVa66qEmRmDwNKNsy2lv3//fnS5ZplWjIfzZrxVym79KYF1rUvTgMOxZWRmVda75b6svtiJzLkNgkodZiEHY+HsbzkynzVfXZMaXrdszSdZtXSuUddOXNdr9tHxmG/jnlibNTLnhVTa2Zx0Jusxn+LknDSZM5IqG7Fesmx8Os6+2XG5kYJlEyCwSwN+78c9dy7vUb3+l8jqkD9ZCQXWPaSQf3IkQGzYRd+D3yhbHh4ejpttK306CotRRfD+09PTVaQooCtFLjdrRm6L1izMvdjA2jFRhCoykWY/Pz9f7cG0xaN438vLy/EUdMfsKiPm9TrP82EOa13O6xKJi7ettQ6nVWOGSRBOquuw1mWfJYPvxnqf5zjMi5KBg5DFtQtWPpPrNNtop7mvZ0NC4AJPwPVkEsUkhCUAAACcSURBVNayTmHumGBgpXS4N8bbkpqT4PS7D7I0iH737OS2IYMKMaXZlLUrp8sc/Pr1XzOpeB2HVmjn3bt3h7NCYtYcO5/PR0LSTeASlzbvev+lQ1m/VhtK3Y5pYpFz29Y9uUtk3bJly5a/We5mYFu2bNnyN8t2YFu2bHmzsh3Yli1b3qxsB7Zly5Y3K9uBbdmy5c3KdmBbtmx5s/IvSjj13Upl9AQAAAAASUVORK5CYII=\n", + "text/plain": [ + "
    "
+ ]
+ },
+ "metadata": {
+ "needs_background": "light"
+ },
+ "output_type": "display_data"
+ }
+ ],
+ "source": [
+ "e = laplacian_edge_detector(im)\n",
+ "show_edges(e)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "The edges are more subtle and show small zigzag structures that may be caused by noise. However, the overall performance of the edge extraction is still promising."
+ ]
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": "Python 3",
+ "language": "python",
+ "name": "python3"
+ },
+ "language_info": {
+ "codemirror_mode": {
+ "name": "ipython",
+ "version": 3
+ },
+ "file_extension": ".py",
+ "mimetype": "text/x-python",
+ "name": "python",
+ "nbconvert_exporter": "python",
+ "pygments_lexer": "ipython3",
+ "version": "3.6.9"
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+}
diff --git a/notebooks/chapter24/Image Segmentation.ipynb b/notebooks/chapter24/Image Segmentation.ipynb
new file mode 100644
index 000000000..d0a8b36af
--- /dev/null
+++ b/notebooks/chapter24/Image Segmentation.ipynb
@@ -0,0 +1,480 @@
+{
+ "cells": [
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Segmentation\n",
+ "\n",
+ "Image segmentation is another early and important image-processing task. Segmentation is the process of breaking an image into groups based on similarities of the pixels. Pixels can be similar to each other in multiple ways, such as brightness, color, or texture. Segmentation algorithms aim to find a partition of the image into sets of similar pixels, which usually indicate objects or certain scenes in the image.\n",
+ "\n",
+ "The segmentation methods in this chapter fall into two complementary categories: one focuses on detecting the boundaries of these groups, and the other on detecting the groups themselves, typically called regions. In this notebook we introduce the principles behind some of these algorithms to present the basic ideas of segmentation."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Probability Boundary Detection\n",
+ "\n",
+ "A boundary curve passing through a pixel $(x,y)$ in an image will have an orientation $\\theta$, so we can formulate the boundary detection problem as a classification problem. Based on features from a local neighborhood, we want to compute the probability $P_b(x,y,\\theta)$ that there is indeed a boundary curve at that pixel along that orientation.\n",
+ "\n",
+ "One way to estimate $P_b(x,y,\\theta)$ is to sample a circular disc centered at the pixel, sub-divided into two half discs by a diameter oriented at $\\theta$. If there is a boundary at $(x, y, \\theta)$, the two half discs can be expected to differ significantly in their brightness, color, and texture. For a detailed derivation of this algorithm, please refer to this [article](https://people.eecs.berkeley.edu/~malik/papers/MFM-boundaries.pdf)."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Implementation\n",
+ "\n",
+ "We implemented a simple demonstration of the probability boundary detector as `probability_contour_detection` in `perception4e.py`. This method takes three inputs:\n",
+ "\n",
+ "- image: the input image, already converted to a numpy ndarray.\n",
+ "- discs: a list of sub-divided discs.\n",
+ "- threshold: the criterion for deciding whether the intensity difference between the two half discs implies a boundary passing through the current pixel.\n",
+ "\n",
+ "We also provide a helper function `gen_discs` to generate a list of discs. It takes `scales` as the number of disc sizes to generate, which defaults to 1. Please note that for each scale size, 8 sub discs are generated, oriented along the horizontal, vertical, and two diagonal directions. Another parameter, `init_scale`, indicates the starting scale size. For instance, if we use an `init_scale` of 10 and `scales` of 2, discs of sizes 10 and 20 will be generated, so we will have 16 sub-divided discs."
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Example\n", + "\n", + "Now let's demonstrate the inner mechanism with our navie implementation of the algorithm. First, let's generate some very simple test images. We already generated a grayscale image with only three steps of gray scales in `perceptron.py`:" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + "Using TensorFlow backend.\n" + ] + } + ], + "source": [ + "import os, sys\n", + "sys.path = [os.path.abspath(\"../../\")] + sys.path\n", + "from perception4e import *\n", + "from notebook4e import *\n", + "import matplotlib.pyplot as plt" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's take a look at it:" + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAOcAAADnCAYAAADl9EEgAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8QZhcZAAAC7UlEQVR4nO3YMYrEMBAAwdViPVsvd6DLD68vOdYdVIWaZDA0Ax577xfQ8356AeCaOCFKnBAlTogSJ0Qdd8Mxhl+5N9ZaT6+Q5xv9bc45rt5dTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBB13A3XWt/aA/jF5YQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixt774/A8z89D4F/MOcfVu8sJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVH
ihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosbe++kdgAsuJ0SJE6LECVHihChxQpQ4IeoHudIUPgvLLmwAAAAASUVORK5CYII=\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + } + ], + "source": [ + "plt.imshow(gray_scale_image, cmap='gray', vmin=0, vmax=255)\n", + "plt.axis('off')\n", + "plt.show()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can also generate your own grayscale images by calling `gen_gray_scale_picture` and pass the image size and grayscale levels needed:" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAOcAAADnCAYAAADl9EEgAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8QZhcZAAADB0lEQVR4nO3YMYrEMBAAQesw+Ff6/5c20uWL11xweNtQFWqSSZoBjbXWBvT8fHsB4Jw4IUqcECVOiBInRO1XwzHGo75y55zfXuHPnrTrtj1r3yftum3bdhzHOHt3OSFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6I2q+Gc8679gDeuJwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiNqvhnPOu/YA3ricECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0TtV8M55117AG9cTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEjbXWx+Hr9fo8BP7FcRzj7N3lhChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFqrLW+vQNwwuWEKHFClDghSpwQJU6IEidE/QKgERT5nIvuXQAAAABJRU5ErkJggg==\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + } + ], + "source": [ + "gray_img = gen_gray_scale_picture(100, 5)\n", + "plt.imshow(gray_img, cmap='gray', vmin=0, vmax=255)\n", + "plt.axis('off')\n", + "plt.show()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now let's generate the discs we are going to use as sampling masks to tell the intensity difference between two half of the care area of an image. We can generate the discs of size 100 pixels and show them:" + ] + }, + { + "cell_type": "code", + "execution_count": 29, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAjwAAABJCAYAAAA5f/zBAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8QZhcZAAAGTElEQVR4nO3c3XLbOgyF0ehM3v+V3YtTTz0JKVsiSOwNfuu2nRjCD4m6do7H4/EFAABQ2X/ZAQAAAMzGwgMAAMpj4QEAAOWx8AAAgPJYeAAAQHnfZ394HMdWX+F6PB7Hp3+X3JwjP33kpo/c9JGbc+Snj9z873ThwXqjvybgOC6dEaV8krvK+dn9+XEPZ8458lMHC4+AyN+F9PqzGLTfquWH36OFOzhzzpGfmlh4Es2+rJ4/nyFrc84Piw7u4Mw5R35qY+FJsPqyYsjOOeWHRQd3cOacIz97YOFZKPuyYsjOKecnu3fgKbtvlGfq64v87IavpS+SPVivlGJRpJYftXjgQalvlGJ5UopJKZbKWHgWUGxmxZiUqORHJQ54UewbpZiUYnlSjKkaFp7JlJtYOTYF2fnJfn14Uu4bhdgUYuhRjq0CFp6JHJrXIcZMWfmhLn3kps8hN5kxkh9vo7lh4ZnEqWmdYs2Q9Q0O/PbMDTlqc/nwa0b9XHrGpYarRcw+C88ELoP1yjHmlVblhzr0/cwNuWpzuTBX1s+lV1xqt1rU7LPwBHMZrBbn2FdY9UvJ8FsvN+SszeXiXFE/lx5xqdlqkbPPwhPIZbDOVHiGmWblh7z3vcsNuWtzuUBn1s+lN1xqtVr07LPwALDncrGtxkWqjxq1zZhpFp4glQ7cSs8yQ3R+yHffldyQxzaHC3VG7Rz6waE2GWbNPQsPgDIcLrkMXKx6qEnbzBlm4QlQ8ZCt+EyRovJDnvvu5oactqlfsJF1U+8B9VpkmT3zLDwAylG/8LJw0eajBm0rZpaFZ1Dlg7Xys0UYzQ/57YvIDfltU75wq9ddOfeZVtWdhQdAWcqXXyYu3vXIedvKGWXhAVAaS08bF/A65Lpt9Wyy8AzY4SDd4RlH8MHaeLt+RTmD4kU8UivFOivmWEHGnLPwANiC4mWogAt5HnLbljWLLDwAtsHS08bFHI+ctmXOIAsPgK2w9LRxQcchl23Zs8fCA2A72QevKi7qceSwTWHmWHgAbEnhAFbEhX0fuWtTmTUWHgDbUjmI1XBxX0fO2pRm7PvdX1AKdiaaFdjT4/Fg/huO49jm/B9F/7Sp9Q/v8ADYntrBrIKL/D1y1KY4U2/f4UEfjQ7U
wTs9bbzT00e/tKn2C+/wAMBfqgd1Ni7238hJm/IMsfAAwAvlAzsTF/w/5KJNfXZYeADgB/WDOwsXPTnocZgZFh4AaHA4wDPsfOHv/OxnXGaFhQcAOlwO8tV2vPh3fOZPOM0ICw8AnHA60FfaaQHY6VmvcJuNtwvPDoXe4Rkxx93eoef6FHPjdrCvMqNWIz9TLZ7KFGfiXa14hwcAPqB4wCuovBBUfrYRrrPAwgMAH3I96GeruBhUfKYIzjPw0cJTufCVnw1zjfYOvdennBvnA3+miJpV+hkVKff+JzXjHR4AuEj54M9UYVGo8AwzVOj5jxeeik1Q8ZmwRlTv0IN96rmpcAHMoPBBfoUYKlHv9U/rxjs8AHCT+kWQxXFxcIx5hUo9fmnhqdQQlZ4Fa0X3Dr3Y55CbShdCpCu1y/46uUOfZXDo7Su14x0eABjkcDFkcFgkHGLMULGnLy88FZqjwjMgx6zeoSf7XHJT8YKI8K5+M+ub+drOXHr5av1uvcPj3CTOsSPX7N6hN/tccuNyUazWq9+Kuma+tiOXHr5Tv9v/peXYLI4xQ8Oq3qFH+1xy43JhrPazfivrmfnaTlx69279hj7D49Q0TrFCy+reoVf7XHLjcnGs9qxfRh0zX9uBS8+O1G/4Q8sOzeMQIzRl9Q492+eQG4cYs2Tmhrr0OeRmNMaQb2kpJ0o5NmjL7p3s11emnBvl2IAzyr0bEVvY19IVE6UYEzyo9I5KHIoUc6MYE3CFYg9HxRT6e3iUEqUUC7yo9Y5aPEqUcqMUCzBCqZcjY/kO+0l/PYPL+gCUUqHgRbl3sudKWXZulPsGuKviXIUvPE+rk8Whg7uceif7EFLGmQPEqzRX0xaep9nJ4tDBXc69w+LTx5kDxKswV9MXnqfXhxlNGAcO7qrWO5FzVQ1nDhDPea6WLTyvODwww+59tfvznyE3QDy3uTr4VyEAAKgu9GvpAAAAilh4AABAeSw8AACgPBYeAABQHgsPAAAoj4UHAACU9wfXBOrVVvZtlAAAAABJRU5ErkJggg==\n", + "text/plain": [ + "
    "
+ ]
+ },
+ "metadata": {
+ "needs_background": "light"
+ },
+ "output_type": "display_data"
+ }
+ ],
+ "source": [
+ "discs = gen_discs(100, 1)\n",
+ "fig=plt.figure(figsize=(10, 10))\n",
+ "for i in range(8):\n",
+ " img = discs[0][i]\n",
+ " fig.add_subplot(1, 8, i+1)\n",
+ " plt.axis('off')\n",
+ " plt.imshow(img, cmap='gray', vmin=0, vmax=255)\n",
+ "plt.show()"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "The white parts of the disc images have value 1 while the dark parts have value 0. Thus convolving a half-disc image with the corresponding area of an image will yield only half of its content. Of course, discs of size 100 are too large for an image of the same size. We will use discs of size 10 and pass them to the detector."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 44,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "discs = gen_discs(10, 1)\n",
+ "contours = probability_contour_detection(gray_img, discs[0])"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 45,
+ "metadata": {},
+ "outputs": [
+ {
+ "data": {
+ "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAOcAAADnCAYAAADl9EEgAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8QZhcZAAADOElEQVR4nO3dwUrDUBRF0V7x/3/5ORaKKL2SHV1rqBDewN1TJG3mnPMAet6uPgDwnDghSpwQJU6IEidEvX/1y5nxr1z4ZeecefZzywlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihKgvb98reuXD4TNP75L6VXc676sfvL/Tea/4W/gpywlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlRa9/4/uq3hQOfWU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVFrN76XzczVR4Afs5wQtbac1gl2WU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlRaw8yOudsXQp4WE7IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVFrN76zZ2auPgIBlhOi1pbTqz3sspwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidErT0r5ZyzdSngYTkhS5wQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0St3fjO/zQzVx/hz7KcECVOiFp7W+vtDeyynBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0StPSvlnLN1KeBhOSFLnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRK3d+A53MDNXH+HbLCdErS3nnV6R4A4sJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTouacc/UZgCcsJ0SJE6LECVHihChxQpQ4IeoDHL0j76PZjhkAAAAASUVORK5CYII=\n", + "text/plain": [ + "
    "
+ ]
+ },
+ "metadata": {
+ "needs_background": "light"
+ },
+ "output_type": "display_data"
+ }
+ ],
+ "source": [
+ "show_edges(contours)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "As we are using discs of size 10 and some boundary conditions are not handled by our naive algorithm, the extracted contour has thick edges, with pixels missing near the image border. But the main contour structures are extracted correctly, which shows the potential of this algorithm."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Group Contour Detection\n",
+ "\n",
+ "An alternative approach tries to “cluster” the pixels into regions based on their brightness, color, and texture properties. There are multiple grouping algorithms, and the simplest and most popular one is k-means clustering. Basically, the k-means algorithm starts with k randomly selected centroids, which are used as the starting points for the clusters, and then performs iterative calculations to optimize the positions of the centroids. For a detailed description, please refer to the chapter on unsupervised learning."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Implementation\n",
+ "\n",
+ "Here we will use the `cv2` module to perform k-means clustering and show the image. To use it you need to have `opencv-python` installed. Using `cv2.kmeans` is quite simple: you only need to specify the input image and how the clusters are initialized. Here we use the flags provided by `cv2` to initialize the clusters: `cv2.KMEANS_RANDOM_CENTERS` randomly generates the initial cluster centers, and the number of clusters is defined by the user.\n",
+ "\n",
+ "The `kmeans` method returns the centers and labels of the clusters, which can be used to classify the pixels of an image. 
Let's try this algorithm again on the small grayscale image we imported:" + ] + }, + { + "cell_type": "code", + "execution_count": 53, + "metadata": {}, + "outputs": [], + "source": [ + "contours = group_contour_detection(gray_scale_image, 3)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now let's show the extracted contours:" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAOcAAADnCAYAAADl9EEgAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8QZhcZAAAC7UlEQVR4nO3YMYrEMBAAwdViPVsvd6DLD68vOdYdVIWaZDA0Ax577xfQ8356AeCaOCFKnBAlTogSJ0Qdd8Mxhl+5N9ZaT6+Q5xv9bc45rt5dTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBB13A3XWt/aA/jF5YQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixt774/A8z89D4F/MOcfVu8sJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosbe++kdgAsuJ0SJE6LECVHihChxQpQ4IeoHudIUPgvLLmwAAAAASUVORK5CYII=\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + } + ], + "source": [ + "show_edges(contours)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "It is not obvious as our generated image already has very clear boundaries. Let's apply the algorithm on the stapler example to see whether it will be more obvious:" + ] + }, + { + "cell_type": "code", + "execution_count": 39, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import matplotlib.image as mpimg\n", + "\n", + "stapler_img = mpimg.imread('images/stapler.png', format=\"gray\")" + ] + }, + { + "cell_type": "code", + "execution_count": 50, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 50, + "metadata": {}, + "output_type": "execute_result" + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAATAAAADnCAYAAACZtwrQAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8QZhcZAAAKD0lEQVR4nO3dzY7jxBoG4M/OT0ehadEtMSA23AA3wJ7bYM91cFlsEVtYwwKxGjFCYgNNupPYPotR+VQ8TrpnOEec7+h5pNE4dlXZLjuvWnLK1QzDEAAZtf/0AQC8KwEGpCXAgLQEGJCWAAPSWl7a+Pnnnw9fffVVvHjxIlarVWw2m4iIWK/X0TRNrFaraJomFotFtG07LjdNM/4rnyMiFotFRMRYNiKiaZpo23ZcrtcXpV5Rl6mfotZ1Lq17rrkntMMwnLQ5/fy2bZdzOPc0uO/7N+r1fT/21S+//BK73S5evHgRt7e3sdlsYrlcjteo9G3XdXE8Hsf6P//8c3z77bfxww8/RNu28eeff8bNzU18+eWX8emnn8ZmsxmvU9d18fDwMLYVESfL5VhKX9T9Ua+f1qn7oOu6eHx8HLd3XRd935/0S9/30XXd2F9d10Xbtid9VO+jOBwOF67Em+dSznt6HtPl+l5v2/akXr2tvn/Lvuo+Kev6vh/Pp/6OnDO9d0p/1f/6vj/px3P3WdlXqVP3SX1c0zam16c+l6n6+Obq1nVub2/ju+++i6+//jq+//77GIZhtjP8BQakJcCAtAQYkJYAA9ISYEBaAgxIS4ABaQkwIC0BBqQlwIC0BBiQlgAD0hJgQFoCDEhLgAFpCTAgLQEGpCXAgLQEGJCWAAPSEmBAWgIMSEuAAWkJMCAtAQakJcCAtAQYkJYAA9ISYEBaAgxIS4ABaQkwIC0BBqQlwIC0BBiQlgAD0hJgQFoCDEhLgAFpCTAgLQEGpCXAgLQEGJCWAAPSEmBAWgIMSGt5ceNyGXd3d/Hxxx/HarWKq6uraJomFotFtG0bi8UiIiLa9nUONk0zbq8Nw3Cy/TmGYYhhGKJpmjgej2P9uTbrz3P6vo++799ou17X9/2
4vuu6N8rO7aO0UR/LtO5c+9N26zbK+vo46rL1uoiI/X4fx+Mx/vjjj/G6nOvjuu7hcIiPPvoovvjii3H/TdPEy5cv49WrV2M7TdNE27bR9/14nacuXdO5a1ZbLBaz16OuV9835d7qui72+300TRPr9TqWy+V4rFNzfV/W1+au83S51KmvU7lWZT/1/TbdR8Tp92WxWMRisRjPr23bcXv5v2wv5cvytN8fHh6i67pYLBaxWq1isVhE3/exWq3i/fffj4eHh9hut7HdbuNwOMT19XUsl8uT4yz7LN+53W4X9/f38fj4GPf39zEMQyyXy/G69X0fx+NxLD/Xh03TnP0Oneujm5ubePny5ey2k768uBXgf5gAA9ISYEBaAgxIS4ABaQkwIC0BBqQlwIC0BBiQlgAD0hJgQFoCDEhLgAFpCTAgLQEGpCXAgLQEGJCWAAPSEmBAWgIMSEuAAWkJMCAtAQak9eS8kGWuwdVqNc5ZV8/7WLaV+ehqZT64sr7MoVfmnuv7fnZex+mcj6Wdeq7IS/NB1nMAlv2U9WVuxDJ3X2m767qT+QnrOf7qef8Oh0P0fT+2czgcTuYLLO1M56Ks+2t6bufOY3pOdfl6fdu2MQzDODffuXJ1P5R5HpfL5Wy/lnJN05zMCTk3B+Rz5/qcO6bj8Xgyb+N0PsXp3It1ufK57vO55bLves7Mslzu53p7mSuxzNlY1tffgdJ35f5frVax2WzGuRtL2Xqe1HI8x+NxnJvzvffei+12e3JPlfOur9V0XsVy3qVO13Wx2+0i4vU9eTwex7kyy7q7u7v49ddf48cff4yffvop1ut1XF1dRd/3cXV1FcvlMtbrdVxfX8eHH344HvdqtYrlchm3t7cn16Rpmnj16lXs9/uTeTHr4yzXspQ/t32q7/v466+/nvxe+AsMSEuAAWkJMCAtAQakJcCAtAQYkJYAA9ISYEBaAgxIS4ABaV0cSnQ8Hk+G0JQhKxFxMhzocDhExL+HntRDhyJiHO5QhqrMDYuZW54OO5gbQjQ3FGc6JKYcZxkmUobf1EMYyrnVw41Ku3PDH8rnevhE0zSzw0HK/qdtleXpEIzpcdRl63UREfv9Po7H4xvDXubUdQ+Hw1i3HuqxWq1O+rH0TRlO9NTQjqnp0KSpeihTfT2m90P5XA/L2e/30TRNrNfrcdhMuS9rc31f1tfmrvN0uQwRK/uvr1U9/Gw6JKpWD8sq163u67K9/F+2l/JleXqdHx4exiF95Tr2fT8Odfr999/jgw8+iE8++SQ+++yzuL6+HodNleOsv9cREbvdLu7v7+Px8THu7+9jGIZxiOEwDLHdbmO9Xo/D2Ob68NwQovr6TN3c3MR2u31ymJq/wIC0BBiQlgAD0hJgQFoCDEhLgAFpPflG1vI2xvpR9dT0Meh+vz/ZVh6xTt9yWuqVN1TOvXmz/hwRJ4/cp2+nnL6JdfpWy7p+WX7uG0an6+qfFlyqd2n925ap1W9QLY/M534CUj/CLusiXj+a32w24/rpY/Tp8ZTH7X/n2OfeQjt9Y2p9nPVPR+Z+ZlKO+fHx8WQ/l34GUcw9up97xP9UnXdRt1Nfr3Nl5s55emwRpz9XKP1Z3/fl+3vuOzTXZt12+b/8nGO5XI5vpZ3+fGV639Xmrkl9nBGv78PdbueNrMD/LwEGpCXAgLQEGJCWAAPSEmBAWgIMSEuAAWkJMCAtAQakJcCAtAQYkJYAA9ISYEBaAgxIS4ABaQkwIC0BBqQlwIC0BBiQlgAD0hJgQFoCDEhLgAFpCTAgLQEGpCXAgLQEGJCWAAPSEmBAWstLG3/77bf45ptv4u7uLtbrdazX64iIWCwW0TRNtG0bTdOMn5umiYg42Vavb9vTvKzL1Nvrdub+n7YzV2ZuW73fqbl1T7V3rs65en+n3LvUu9Qnz3XpHN/2eP6bhmF4q/J93/9H23tunafKPKeNc8f+3P03TXO27KU2St25ctNjqsvW5af/l3rT9YfDIbq
ue/J8/AUGpCXAgLQEGJCWAAPSEmBAWgIMSEuAAWkJMCAtAQakJcCAtAQYkJYAA9ISYEBaAgxIS4ABaQkwIC0BBqQlwIC0BBiQlgAD0hJgQFoCDEhLgAFpCTAgLQEGpCXAgLQEGJCWAAPSEmBAWgIMSEuAAWkJMCAtAQakJcCAtAQYkJYAA9ISYEBaAgxIS4ABaQkwIC0BBqQlwIC0BBiQlgAD0hJgQFoCDEhLgAFpCTAgLQEGpCXAgLQEGJCWAAPSEmBAWgIMSEuAAWkJMCAtAQakJcCAtAQYkJYAA9ISYEBaAgxIS4ABaQkwIC0BBqQlwIC0BBiQlgAD0hJgQFoCDEhLgAFpCTAgLQEGpCXAgLQEGJCWAAPSEmBAWgIMSEuAAWkJMCAtAQakJcCAtAQYkJYAA9ISYEBaAgxIS4ABaQkwIC0BBqQlwIC0BBiQlgAD0hJgQFoCDEhLgAFpCTAgLQEGpNUMw/BPHwPAO/EXGJCWAAPSEmBAWgIMSEuAAWkJMCCtfwGmiMjvMosFAAAAAABJRU5ErkJggg==\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + } + ], + "source": [ + "contours = group_contour_detection(stapler_img, 5)\n", + "plt.axis('off')\n", + "plt.imshow(contours, cmap=\"gray\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The segmentation is very rough when using only 5 clusters. Adding to the cluster number will increase the degree of subtle of each group thus the whole picture will be more alike the original one:" + ] + }, + { + "cell_type": "code", + "execution_count": 51, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 51, + "metadata": {}, + "output_type": "execute_result" + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAATAAAADnCAYAAACZtwrQAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8QZhcZAAAS7UlEQVR4nO3d247jVLPA8fI56QOd6R4BI7gYiSfgSbjg3XgqDlfDJRIaNAyoxTTNdBInsb0v2GVVqpfdGb79iV3S/ye1OvFh2V5xSnF5reVsGAYBgIjyf3sHAOCfIoABCIsABiAsAhiAsAhgAMIq52Z++eWXw9dffy03NzdSFIUsl0vJskzKspQsy6QoCsnzXIqikCzLJM//jodVVYmIjNN0ep7nkmXZ0Z/y83R9O28YhqN1dJp97/myLF9eav7cOlPrz5U5VbY/Dvveb7Pv+3G5169fy+FwkOvra/noo4+kaRrJ81yqqjqq+/1+L33fS9/3MgyD/Pzzz/Ldd9/Jjz/+KHmey3q9lsvLS/nqq6/kxYsXslgsxnrvuk52u91Ylj3GVP1O1UnqM9R5fd/LZrNJ1snUep7u39S+2PL89lPH5T/nuXMtda74c3nqM5+aN3XO2u2n3s+dW6lpdjv23Ertoy9Ll58q35/Htgw7T89LnfbRRx/J999/L9988428evVKhmFIfvj8AgMQFgEMQFgEMABhEcAAhEUAAxAWAQxAWAQwAGERwACERQADEBYBDEBYBDAAYRHAAIRFAAMQFgEMQFgEMABhEcAAhEUAAxAWAQxAWAQwAGERwACERQADEBYBDEBYBDAAYRHAAIRFAAMQFgEMQFgEMABhEcAAhEUAAxAWAQxAWAQwAGERwACEVc7NzPNcVquVrFYrKctSmqYZp+d5LmVZSpZlUhTF0fRhGCTLsvHPzhMRybJMhmEY39vXOt++HoZhLNNOs8tN0eX6vh/X82Xb5ex8/1q3Z1+nyk1to+/7o7rRZfzx6XQtN7Uftn77vpe2bWWz2UjTNJJlmZRl+ajOD4eD9H0vfd+PZfiy+76XruvG/3Zf+r4fX2vZ9r0qiuLR8el/+7ml6lL3+3A4yN3dndzd3ckwDHI4HGQYBtnv9+P+Hw6Ho/qw28jzXLqukyzLpG1bub+/H8vX+snzXIqikLIspaoqqetaiqKQpmlksVhInudS17VUVTUea57nR+fc2dmZF
EUhy+VSzs7O5OLiQoqikKIopK5rybJMuq6TpmlkvV7LdruVoijk4eFB2raVuq5luVxK13Wy3+/Hcne73fgZdF03fnb2fPFsPfh69cv7744917quO1rWfkftZ63bs9/b1LZ02txy+pnZcjWm+P3x+AUGICwCGICwCGAAwiKAAQhrNomvNDFrE/KaCLXJejtfRB4lcjVZmOf5mBjWZKq+1uU04a1l2CRgKgHtt6V84t+uo4lqPQ5N/PptpJKhPunpE/SppL8moL3UfJu8TK3j61WT7/ra3jzQZXwC39500WmHw2FMHiubULbrpm6sKHsO+HrwNxHsTR29KfTxxx/L8+fPj5bdbDby+++/y3a7Pdqu3U9N8Nu6qOt6TJTbetC68fVry9TzM89zqapqTPyXZSmLxWJM2Ot7XU6X0RsEVVXJcrmUuq5lsViMNwuqqpKzs7OjRLYm/quqkr7vZbfbSVEUstlspO972e/3MgyDlGU5HlPXdUfzd7vd+Bna+vfHqvVRVdV4vPa7aL+rU+eflmNv1PjlfJ3r/uh5pPNtGRpn5vALDEBYBDAAYRHAAIRFAAMQ1pNJfE26aQtjERkT+CKPE54ij5OidlnbKl7Lty2+bctxTWymEvCp1uyppGJqPZtwTLUanyojlaDUean5NmHpW9bb5L9d1rcsT5VtE+maYNU/baGuyVHfklrLsy277bZsS3x7A2Cz2Rwta5P4epPH150v354H9iaB76GRqjMRkaZp5PPPP5ftdivv3r2TN2/eyPn5+ZjIVpo8L8tSVquVvHjxYmzJr8dub0xst9uxxfvhcJDdbjfWhU4XEWnb9umW4f+b9Ndkv7bur6pq/L9cLseW/rqsXadpmjHZX1WVXFxcjNM0+Z9lmdR1Pd602O/3cnNzc3Rs2+1W/vzzT7m7u5P379+P54fevLHfAT1G/73032l/08ef/6nzNNVjZmoZW5btSTJZ37NzAeD/MQIYgLAIYADCenI0Cm28p435/Hw7AoCdLvJ3LkJzCZobSOWKbM92e+2t+QpdvigK6bruaMQDmydKNay01/RTjU39dix7bW8b2/l999fwvtGeXd7unzYOtev7ffE5B7tfNv+lORubP7D5RR3xQKedn5/Lp59+KsMwyHq9luVyKW3byh9//DHmOXV/drvduD3df9tw2TditvkVne/PEZ8/tftr56caJ+d5Lre3t5Jlmdzc3Bzl77S+1uv1eO6JyHguawNUuz1bzzafaPOB79+/P5qmuTKbW9P3dnSJ1GcvIkd5L82Rae5LG7/qaBdaP7ZRrX6/7H7b89XSc0rzZpq71PPir7/+elTHuo+6vI7MYRsc6zZ9HlTPa22Ias/TVKN0v92575DFLzAAYRHAAIR1Ul9I/1Nef0baywX7s1CbW9gmGPa9bcZgt+F/ztu+k/Yycu6nsr/U0tf+kizVhGKqv5e9LLGXHfYnrh3oz/a38/vmt5HqMyjyuC+klqd93HRa27bS973c39/Ler1OXnb6bevl1nK5lC+++GK8JBIReXh4kIeHh6MmM3affZOH1GW3zvfs5Zz/7Kf+7LI2ZdF1nXz22WdjMwJtomAve/2x6zFut9tHl8Kp81Ev2eq6FhGRq6uro/1Qtg+mvZT1l6Laz1SbLNhLz67rxqYO9nyz56f93umlpH5Oeilq52szjuVyKWVZyv39vdR1ffTd9XVl0ytlWcqvv/4qv/32mzRNM9aFDnAqIvLy5ctHl4X2c/LNM0TkaJpPn9jUyin4BQYgLAIYgLAIYADCIoABCIsABiAsAhiAsAhgAMKabQem7Z50bGr/AFvb3kP5ITB8txi7vG/L5dsU2THCbXce325natgO26bFH1eqC4ntdjO1vH9t2+zYYWLscDm2bY+fr226bDcYbRdku7XYNly+PkVk7Lriu+D47lR2v225+tlqGfY5CNrmJ9W2a2p6qs51G7YOm6YZu9/sdrtH5fh9KYpi7GJzfX09TrPPaPDnn/+cf
Fur1Bj59lzQ19p16Kkx9W2d2PrTYXD0vX0Iri6batdo2xra88h/jre3t7Jer6UoCjk7O5M8z2Wz2chms5EXL15I0zSyWq3k5uZG9vu9fPLJJ+N4/LbNpbXb7aRtW7m7u5NffvlF9vu9ZFkmi8VC2raVV69eHZ2Pvv3dKedFqv3l+fm5vH79Otkly+IXGICwCGAAwiKAAQiLAAYgLAIYgLAIYADCIoABCIsABiAsAhiAsAhgAMIigAEIiwAGIKzZztz2uX6247Z2yrWdfXVg/1RHY9tJNfVcONuZVKfpevpcOv+AA7u+f/1UB27bkVdp52rf8deWqQ9t0H3VZ+rZB0bYZWxnaV+vnn+oh11WO/2mlk11mPXb0M/EPpDEdty1DyTR/1MP1vBlpjrnTx2j3X+l9ec7A6c+B9up2XaA1//60Iyph2Ho67Isj87jpmmOHoShD76o63r8y/Nclsvl+DCNqQfb2A7X9rzQZ2tqx3U9dy4vL+Xq6mp82Id9EIivh9Tnbev/+vr60TTddlVV8vz5c3n37p389NNP8sMPP8hisZDz8/Pxc9NjWS6Xslqtxmdf6jmhdSfy93leVZUsl8vxO+0fxDH13bTTUt9TERmfjTl3HonwCwxAYAQwAGERwACERQADEBYBDEBYBDAAYc02oxCR5Djjdlz21Ljrerte17PL6O1Y32xgalz8w+Ewjolub0/7phT2Vr4u45sJ+Nva9ta9bfLgx1E/HA5Hf13XjWODp8bo1/2wfFMG22RhalzwKalmC36b/nOx9ZB67oBl6zfVfMKv65tT2O2fsv96a74sy0dNIGwzCv/Z+WYtqe2m9sGPh9+27dEY89qEYb/fj2P16/Z0G3meS9M0kmWZLJdLqapKrq6upCxLubi4kLOzM1ksFtI0jdR1PY5Tb+tAz1H9X9e1lGU5ju9vm7P4seb1PGzbVvq+l/V6LXd3dzIMg7x//17atpWmaaSqqrGJyLNnz+Tt27fy5s0bOT8/l6IoxmNaLpdSlqVcXl7K+fm5rFYrOT8/l4uLC1ksFlJVlYjIuN3tdiuHw0Fev34tm81G2rZ9VLe2/lPNJVLfYZ1mn+0wh19gAMIigAEIiwAGIKwnc2BK8zZT3Ux8twA7X3MXU/kUm0fy3YU0R+BzbpoT0PX0Wl7ZfNd+vx9zHLqcdj/R9bXbxOFwOHoWnXb78Dkln8uy9TRVf/a/z+fM5ZpS023OwNaZ/Sx8GTZnZbdv5/nc11SuQted6tKk7/V8mepS5bsiTeVMtCybu0zlX1P16PfT5mDtf7tOqpuLzZHa/dVz7fb29ihnquecyPEzIDWfVFXVmAMsy1KqqpK6rsd8lObELi4ujvLQ9hj1GZlXV1fy7NmzZD1q96e6ruXly5fjPqae4WiPXbs/bbfbR/lnXeby8lKyLJP1ej12J/Kfoa1f/1mkcpZT9Z/CLzAAYRHAAIRFAAMQFgEMQFgEMABhnXwXMtXa3LZ0t3fAUncKbTk6veu6o7tU/g6kvWOhr/1Ab7otvato73bYafau23a7PbrraO8q+Ttu9s6d3yd7rKk7RKn3tmx7x0/vjKXubOoyuo6/e+TvBqfuIPq7hXN3Ue1AfbYHRepukd1uap6/I5aqj6k7TqneDEVRHA0a6O8uTpXh98/++X2auiNpe6XM3Tn257HdBz0X7Tmvd/l0UMP9fj+2sNdpuv26rqVpGrm6uhrvWjZNI4vFQlar1aPW97q/2+12HJQxVQ9d10nXdbLZbGS320nbtnI4HOTh4UE2m40cDgdp21YeHh7G/b2+vpabmxsZhmHsTZGqD/ve98xJ3QmduxPt8QsMQFgEMABhEcAAhDWbA9NraJvf0ryVyHEr7q7rxock2F77Nhem7LW3lmOn2Ycb2DyWbfVs81o63edGbJk2P6br6D7q+6kRGOw0+9+3yp9b76lpOj2Vx/HrPbW9U/I6Nm9l5/nt2/e27FR+46nRA07JNT3F5uhSeTlfbur93LJ+m
s+z6b7OHWuqbmy92ZEWpsqa2i+fT9PRMu7v7+Xt27fJnh36vdCHk9iRV1J1Zb/zdp8136Z1b3Nsc9+dqenaS8Dug+aYbR52Dr/AAIRFAAMQFgEMQFgEMABhEcAAhEUAAxAWAQxAWAQwAGERwACERQADEBYBDEBYBDAAYRHAAIRFAAMQFgEMQFgEMABhEcAAhDU7ImvXdUdPSPEjgeqojjpKqh0pVUeG9COo2hFS7dOJRB6PkqrsSKT63z/pyK/ny/AjP6aWOXXeU8uf+mSif7rNqdFYfTn+SVFTZdn1pp42NDfS5lP7k9q31PbnRlJ9alTVuRFdp55MNLWtU6b7UYY/ZL7d36mRcT9kP/36H3IeTZk6h/0IrPo0Io0P9vWpdWT33z756ZTj4BcYgLAIYADCIoABCIsABiAsAhiAsAhgAMIigAEIiwAGICwCGICwCGAAwiKAAQiLAAYgLAIYgLAIYADCIoABCIsABiCs2QEN7+/v5dtvv5XVaiVlWUpd1+OgY/qng44VRTEOoOf/RB4PsuenaRmeDpI2N2Bgqqy5eXPLnDr91Pn/dNkPWd4OKJgapO6pcuYGJHyq3u30Dz2+U9eZOzY1NyDhKeufMv8/WfeUsv9pGU8d+4cc19w6/jyxy9gBQ/U7m2IHOZwaYFIHPD1lv/kFBiAsAhiAsAhgAMIigAEIiwAGICwCGICwCGAAwiKAAQiLAAYgLAIYgLAIYADCIoABCIsABiAsAhiAsAhgAMIigAEIiwAGICwCGICwCGAAwiKAAQiLAAYgLAIYgLAIYADCIoABCIsABiAsAhiAsAhgAMIigAEIiwAGICwCGICwCGAAwiKAAQiLAAYgLAIYgLAIYADCIoABCIsABiAsAhiAsAhgAMIigAEIiwAGICwCGICwCGAAwiKAAQiLAAYgLAIYgLAIYADCIoABCIsABiAsAhiAsAhgAMIigAEIiwAGICwCGICwCGAAwiKAAQiLAAYgLAIYgLAIYADCIoABCIsABiAsAhiAsAhgAMIigAEIiwAGIKxybuZ2u5Xb21vZ7XZSFIU0TSPDMEhRFONflmWSZZkURSF5nkuWZZLnf8fFPM/Haf5PRMb/RVGM7/18/zrLMhmGYZyu7Htfvu7PKeyyfht2P1L7MLeOemrf/xv+Sfkfuo4uP1Uv/6ZhGGbn6ef5f1Xmf2tdW8enTJ/blp+u7/30vu8n98fOs+ulXtv/U699mdvt9sm64hcYgLAIYADCIoABCIsABiAsAhiAsAhgAMIigAEIiwAGICwCGICwCGAAwiKAAQiLAAYgLAIYgLAIYADCIoABCIsABiAsAhiAsAhgAMIigAEIiwAGICwCGICwCGAAwiKAAQiLAAYgrOw/eUAnAPyb+AUGICwCGICwCGAAwiKAAQiLAAYgLAIYgLD+Bz86e+p63iwyAAAAAElFTkSuQmCC\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + } + ], + "source": [ + "contours = group_contour_detection(stapler_img, 15)\n", + "plt.axis('off')\n", + "plt.imshow(contours, cmap=\"gray\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Minimum Cut Segmentation\n", + "\n", + "Another way to do clustering is by applying the minimum cut algorithm in graph theory. Roughly speaking, the criterion for partitioning the graph is to minimize the sum of weights of connections across the groups and maximize the sum of weights of connections within the groups.\n", + "\n", + "### Implementation\n", + "\n", + "There are several kinds of representations of a graph such as a matrix or an adjacent list. Here we are using a util function `image_to_graph` to convert an image in ndarray type to an adjacent list. It is integrated into the class of `Graph`. `Graph` takes an image as input and offer the following implementations of some graph theory algorithms:" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "- bfs: performing bread searches from a source vertex to a terminal vertex. Return `True` if there is a path between the two nodes else return `False`.\n", + "\n", + "- min_cut: performing minimum cut on a graph from a source vertex to sink vertex. 
The method will return the edges to be cut.\n", + "\n", + "Now let's try the minimum cut method on a simple generated grayscale image of size 10:" + ] + }, + { + "cell_type": "code", + "execution_count": 67, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAOcAAADnCAYAAADl9EEgAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8QZhcZAAAC3ElEQVR4nO3YMQrDMBAAwSjY//+vGqUPwpAmXsNMqWuuWQ401lovoOd99wLAnjghSpwQJU6IEidEHVfDMYav3IeZc969Aj86z3Ps3l1OiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlRx9VwzvmvPYAvLidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihChxQpQ4IUqcECVOiBInRIkTosQJUeKEKHFClDghSpwQJU6IEidEiROixAlR4oQocUKUOCFKnBAlTogSJ0SJE6LECVHihKix1rp7B2DD5YQocUKUOCFKnBAlTogSJ0R9AF+CDrluZqs6AAAAAElFTkSuQmCC\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + } + ], + "source": [ + "image = gen_gray_scale_picture(size=10, level=2)\n", + "show_edges(image)" + ] + }, + { + "cell_type": "code", + "execution_count": 66, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "[((0, 4), (0, 5)),\n", + " ((1, 4), (1, 5)),\n", + " ((2, 4), (2, 5)),\n", + " ((3, 4), (3, 5)),\n", + " ((4, 0), (5, 0)),\n", + " ((4, 1), (5, 1)),\n", + " ((4, 2), (5, 2)),\n", + " ((4, 3), (5, 3)),\n", + " ((4, 4), (5, 4)),\n", + " ((4, 4), (4, 5))]" + ] + }, + "execution_count": 66, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "graph = Graph(image)\n", + "graph.min_cut((0,0), (9,9))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "There are ten edges to be cut. By cutting the ten edges, we can separate the pictures into two parts by the pixel intensities." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.7.2" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/notebooks/chapter24/Objects in Images.ipynb b/notebooks/chapter24/Objects in Images.ipynb new file mode 100644 index 000000000..03fc92235 --- /dev/null +++ b/notebooks/chapter24/Objects in Images.ipynb @@ -0,0 +1,454 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Objects in Images\n", + "\n", + "There are two key problems shaping all thinking about objects in images: image classification and object detection. 
They are considerably harder than problems like boundary detection, so more sophisticated models are needed to handle problems that can be challenging even to human eyes. For the image classification problem, we use a convolutional neural network to extract patterns from an image. For object detection, we use a region-based CNN (R-CNN), which helps find the locations of objects from a set of classes in the image. These two models will be introduced in detail in the following sections." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Image Classification\n", + "\n", + "Image classification is a task where we decide what class an image of a fixed size belongs to. Traditional approaches convert grayscale or RGB images into a list of numbers representing pixel intensities and then perform classification on top of this representation. Currently, one of the most popular techniques for improving on traditional image classification methods is the Convolutional Neural Network (CNN), which is closer to the way humans perceive images.\n", + "\n", + "A CNN is different from other neural networks in that it has a convolution layer at the beginning. Instead of converting the whole image to a flat array of numbers, the image is broken up into sections by the convolutional kernel, and the machine then tries to predict what each section is. Finally, the computer predicts what’s in the picture based on the votes of all sections. \n", + "\n", + "A classic CNN has the following architecture:\n", + "\n", + "$$Input -> Convolution -> ReLU -> Convolution -> ReLU -> Pooling -> ... -> Fully Connected$$" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "CNNs have an input layer, an output layer, as well as hidden layers. The hidden layers usually consist of convolutional layers, ReLU layers, pooling layers, and fully connected layers. 
Their functionality can be briefly described as follows:\n", + "\n", + "- Convolutional layers apply a convolution operation to the input. This layer extracts the features of an image that are used for further processing or classification.\n", + "- Pooling layers combine the outputs of clusters of neurons into a single neuron in the next layer.\n", + "- Fully connected layers connect every neuron in one layer to every neuron in the next layer.\n", + "- ReLU layers apply an elementwise activation function, such as max(0, x) thresholding at zero.\n", + "\n", + "For more detailed guidance, please refer to the Stanford [course notes](http://cs231n.github.io/convolutional-networks/)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Implementation\n", + "\n", + "We implemented a simple CNN with Keras, a high-level API on top of TensorFlow. For a more detailed guide, please refer to our previous notebooks or the [official guide](https://keras.io/). The source code can be viewed by importing the necessary packages and executing the following block:" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": {}, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + "Using TensorFlow backend.\n" + ] + } + ], + "source": [ + "import os, sys\n", + "sys.path = [os.path.abspath(\"../../\")] + sys.path\n", + "from perception4e import *\n", + "from notebook4e import *" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "psource(simple_convnet)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The `simple_convnet` function takes two inputs and returns a Keras `Sequential` model. Its input attributes are the number of hidden layers and the number of output classes. 
One hidden layer is defined as a convolutional layer paired with a max-pooling layer:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "model.add(Conv2D(32, (2, 2), padding='same', kernel_initializer='random_uniform'))\n", + "model.add(MaxPooling2D(padding='same'))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The convolution kernel we use is of size 2x2 and is initialized from a random uniform distribution. We also implemented a helper demonstration function `train_model` to show how the convolutional net performs on a given dataset. This function takes a CNN model as input and feeds the MNIST dataset into it. The MNIST dataset is split into a training set, validation set and test set of 1000, 100 and 100 examples respectively." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Example\n", + "\n", + "Now let's try the simple CNN on the MNIST dataset. MNIST has 10 classes in total: the digits 0-9. Thus we will build a CNN with 10 prediction classes:" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + "WARNING: Logging before flag parsing goes to stderr.\n", + "W0820 17:50:16.660604 4604204480 deprecation_wrapper.py:119] From /Users/tianqiyang/anaconda3/envs/3point6/lib/python3.7/site-packages/keras/backend/tensorflow_backend.py:74: The name tf.get_default_graph is deprecated. Please use tf.compat.v1.get_default_graph instead.\n", + "\n", + "W0820 17:50:16.847119 4604204480 deprecation_wrapper.py:119] From /Users/tianqiyang/anaconda3/envs/3point6/lib/python3.7/site-packages/keras/backend/tensorflow_backend.py:517: The name tf.placeholder is deprecated. 
Please use tf.compat.v1.placeholder instead.\n", + "\n", + "W0820 17:50:16.932054 4604204480 deprecation_wrapper.py:119] From /Users/tianqiyang/anaconda3/envs/3point6/lib/python3.7/site-packages/keras/backend/tensorflow_backend.py:4138: The name tf.random_uniform is deprecated. Please use tf.random.uniform instead.\n", + "\n", + "W0820 17:50:17.006165 4604204480 deprecation_wrapper.py:119] From /Users/tianqiyang/anaconda3/envs/3point6/lib/python3.7/site-packages/keras/backend/tensorflow_backend.py:3976: The name tf.nn.max_pool is deprecated. Please use tf.nn.max_pool2d instead.\n", + "\n", + "W0820 17:50:17.120162 4604204480 deprecation_wrapper.py:119] From /Users/tianqiyang/anaconda3/envs/3point6/lib/python3.7/site-packages/keras/optimizers.py:790: The name tf.train.Optimizer is deprecated. Please use tf.compat.v1.train.Optimizer instead.\n", + "\n", + "W0820 17:50:17.130156 4604204480 deprecation_wrapper.py:119] From /Users/tianqiyang/anaconda3/envs/3point6/lib/python3.7/site-packages/keras/backend/tensorflow_backend.py:3295: The name tf.log is deprecated. 
Please use tf.math.log instead.\n", + "\n" + ] + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "_________________________________________________________________\n", + "Layer (type) Output Shape Param # \n", + "=================================================================\n", + "conv2d_1 (Conv2D) (None, 1, 28, 32) 3616 \n", + "_________________________________________________________________\n", + "max_pooling2d_1 (MaxPooling2 (None, 1, 14, 32) 0 \n", + "_________________________________________________________________\n", + "conv2d_2 (Conv2D) (None, 1, 14, 32) 4128 \n", + "_________________________________________________________________\n", + "max_pooling2d_2 (MaxPooling2 (None, 1, 7, 32) 0 \n", + "_________________________________________________________________\n", + "conv2d_3 (Conv2D) (None, 1, 7, 32) 4128 \n", + "_________________________________________________________________\n", + "max_pooling2d_3 (MaxPooling2 (None, 1, 4, 32) 0 \n", + "_________________________________________________________________\n", + "flatten_1 (Flatten) (None, 128) 0 \n", + "_________________________________________________________________\n", + "dense_1 (Dense) (None, 10) 1290 \n", + "_________________________________________________________________\n", + "activation_1 (Activation) (None, 10) 0 \n", + "=================================================================\n", + "Total params: 13,162\n", + "Trainable params: 13,162\n", + "Non-trainable params: 0\n", + "_________________________________________________________________\n", + "None\n" + ] + } + ], + "source": [ + "cnn_model = simple_convnet(size=3, num_classes=10)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "A brief description of the CNN architecture is printed above. Please note that each layer has a number of parameters that need to be trained; more parameters mean longer training times on a dataset. 
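As a sanity check on the `Param #` column in the summary above, the counts can be reproduced by hand (the helper functions below are illustrative, not part of the repository). Note that with the default channels-last convention, the trailing axis of the input is treated as the channel axis, which is consistent with `conv2d_1` seeing 28 input channels here:

```python
def conv2d_params(kernel_h, kernel_w, in_channels, out_channels):
    # One kernel_h x kernel_w filter weight per (input channel, output
    # channel) pair, plus one bias per output channel.
    return (kernel_h * kernel_w * in_channels + 1) * out_channels

def dense_params(in_features, out_features):
    # One weight per input/output pair, plus one bias per output.
    return in_features * out_features + out_features

# Reproduce the counts reported by model.summary() above.
print(conv2d_params(2, 2, 28, 32))   # conv2d_1: 3616
print(conv2d_params(2, 2, 32, 32))   # conv2d_2 and conv2d_3: 4128
print(dense_params(128, 10))         # dense_1: 1290
```

The three conv layers plus the dense layer add up to the 13,162 trainable parameters in the summary.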
We have 3 convolutional layers and 3 max-pooling layers in total and more than 10000 parameters to train.\n", + "\n", + "Now let's train the model for 5 epochs with the pre-defined training parameters: `epochs=5` and `batch_size=32`." + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Train on 1000 samples, validate on 100 samples\n", + "Epoch 1/5\n", + " - 0s - loss: 1.9887 - acc: 0.3560 - val_loss: 1.9666 - val_acc: 0.3900\n", + "Epoch 2/5\n", + " - 0s - loss: 1.9144 - acc: 0.3670 - val_loss: 1.8953 - val_acc: 0.4200\n", + "Epoch 3/5\n", + " - 0s - loss: 1.8376 - acc: 0.3920 - val_loss: 1.8257 - val_acc: 0.4200\n", + "Epoch 4/5\n", + " - 0s - loss: 1.7612 - acc: 0.4000 - val_loss: 1.7614 - val_acc: 0.4400\n", + "Epoch 5/5\n", + " - 0s - loss: 1.6921 - acc: 0.4220 - val_loss: 1.7038 - val_acc: 0.4600\n", + "100/100 [==============================] - 0s 36us/step\n", + "[8.314567489624023, 0.47]\n" + ] + }, + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 5, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "train_model(cnn_model)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Within 5 epochs of training, the model's accuracy on the training set improves from 35% to 42%, while validation accuracy improves to 46%. This is still relatively low but much higher than the 10% accuracy of a random guess. To improve the accuracy further, you can try both adding more training examples to the dataset (say, 20000) and training for more epochs." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Object Detection\n", + "\n", + "An object detection program must mark the locations of each object from a known set of classes in test images. 
Object detection is hard in many respects: objects come in various shapes and may sometimes be deformed or blurred. Objects can appear anywhere in an image, and they are often mixed up with noisy objects or scenes.\n", + "\n", + "Many object detectors are built out of image classifiers. On top of the classifier, there is an additional task needed for detecting an object: selecting windows around candidate objects to be classified and reporting their precise locations. We usually call these windows bounding boxes, and there are multiple ways to build them. The very simplest procedure for choosing windows is to use all windows on some grid. Here we will introduce two main procedures for finding a bounding box." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Selective Search\n", + "\n", + "The simplest procedure for building boxes is to slide a window over the image. It produces a large number of boxes, and the boxes themselves ignore important image evidence, but it is designed to be fast. \n", + "\n", + "Selective Search starts by over-segmenting the image based on pixel intensity using a graph-based segmentation method. The Selective Search algorithm takes these segments as initial input and adds the bounding boxes corresponding to the segmented parts to the list of region proposals. The algorithm then groups adjacent segments based on similarity and repeats the previous steps.\n", + "\n", + "\n", + "#### Implementation\n", + "\n", + "Here we use the selective search method provided by the `opencv-python` package. To use it, please make sure the additional `opencv-contrib-python` package is also installed. 
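The naive grid-of-windows procedure mentioned above can be sketched in a few lines (the helper `grid_windows` is hypothetical, assuming fixed-size windows laid out with a constant stride):

```python
def grid_windows(image_w, image_h, win_w, win_h, stride):
    """Return all (x, y, w, h) boxes of a fixed size laid out on a grid."""
    boxes = []
    for y in range(0, image_h - win_h + 1, stride):
        for x in range(0, image_w - win_w + 1, stride):
            boxes.append((x, y, win_w, win_h))
    return boxes

# A 100x100 image tiled with 40x40 windows every 20 pixels gives 16 boxes,
# each of which would then be scored by the image classifier.
boxes = grid_windows(100, 100, 40, 40, stride=20)
```

Selective search replaces this exhaustive, fixed-size grid with data-driven region proposals.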
You can create a selective search with the following line of code:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "ss = cv2.ximgproc.segmentation.createSelectiveSearchSegmentation()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Then all you need to do is set the input image and the selective search mode, after which the search is ready to run:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "ss.setBaseImage(im)\n", + "ss.switchToSelectiveSearchQuality()\n", + "rects = ss.process()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The returned `rects` will be the coordinates of the bounding box corners.\n", + "\n", + "#### Example\n", + "\n", + "Here we provide the `selective_search` method to demonstrate the result of a selective search. The method takes a path to an image as input. To execute the demo, please use the following lines of code:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "image_path = \"./images/stapler.png\"\n", + "selective_search(image_path)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The bounding boxes are drawn on the original picture shown below:\n", + "\n", + "" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Some of the bounding boxes do have the stapler or at least most of it in the box, which can assist the classification process." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### R-CNN and Faster R-CNN\n", + "\n", + "[Ross Girshick et al.](https://arxiv.org/pdf/1311.2524.pdf) proposed a method where they use selective search to extract just 2000 regions from the image. Then the regions in the bounding boxes are fed into a convolutional neural network to perform classification. 
The brief architecture can be shown as:\n", + "\n", + "![R-CNN architecture](images/RCNN.png)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The problem with R-CNN is that each box must be passed independently through an image classifier, so training the network takes a huge amount of time. Meanwhile, selective search is not that stable and sometimes generates bad examples.\n", + "\n", + "Faster R-CNN addresses the drawbacks of R-CNN with a faster object detection algorithm. Instead of feeding the region proposals to the CNN, we feed the input image to the CNN to generate a convolutional feature map. Then we identify the regions of interest on the feature map and reshape them into a fixed size with an ROI pooling layer, so they can be put into another classifier. \n", + "\n", + "This algorithm is faster than R-CNN because each image is fed into the CNN only once to extract its feature map." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Implementation\n", + "\n", + "We implemented a simple demo of an ROI pooling layer as `pool_rois`. We can fake a simple feature map with `numpy`:" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "\n", + "feature_maps_shape = (200, 100, 1)\n", + "feature_map = np.ones(feature_maps_shape, dtype='float32')\n", + "feature_map[200 - 1, 100 - 3, 0] = 50" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Note that the fake feature map is all 1 except for one spot with a value of 50. Now let's generate some regions of interest:" + ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": {}, + "outputs": [], + "source": [ + "roiss = np.asarray([[0.5, 0.2, 0.7, 0.4], [0.0, 0.0, 1.0, 1.0]])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here we only made up two regions of interest.
The first crops only a part of the image where all pixels are '1', ranging over 0.5-0.7 of the length of the horizontal edge and 0.2-0.4 of the vertical edge. The range of the second region is the whole image. Now let's pool a 3x7 area out of each region of interest." + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "[array([[1., 1., 1., 1., 1., 1., 1.],\n", + " [1., 1., 1., 1., 1., 1., 1.],\n", + " [1., 1., 1., 1., 1., 1., 1.]], dtype=float32),\n", + " array([[ 1., 1., 1., 1., 1., 1., 1.],\n", + " [ 1., 1., 1., 1., 1., 1., 1.],\n", + " [ 1., 1., 1., 1., 1., 1., 50.]], dtype=float32)]" + ] + }, + "execution_count": 14, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "pool_rois(feature_map, roiss, 3, 7)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We expect the second pooled region to differ from the first, since there is an artificial feature (the '50') in its input. The printed result is exactly what we expected." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To try the full Faster R-CNN algorithm, you can refer to [this GitHub repository](https://github.com/endernewton/tf-faster-rcnn) for more detailed guidance."
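The `pool_rois` helper used above is defined in the chapter's accompanying code; for readers who want a self-contained picture, here is a minimal max-pooling sketch that reproduces the behavior shown above. The function name `pool_rois_sketch` and the fractional `(y0, x0, y1, x1)` coordinate convention are our own assumptions:

```python
import numpy as np

def pool_rois_sketch(feature_map, rois, pooled_h, pooled_w):
    """Max-pool each region of interest down to a fixed (pooled_h, pooled_w) grid.

    `rois` holds fractional (y0, x0, y1, x1) coordinates relative to the
    feature map's height and width -- an assumed convention for this sketch.
    """
    h, w, _ = feature_map.shape
    pooled = []
    for y0, x0, y1, x1 in rois:
        region = feature_map[int(y0 * h):int(y1 * h), int(x0 * w):int(x1 * w), 0]
        # Split the region's rows and columns into pooled_h x pooled_w bins
        # and keep the maximum of each bin.
        rows = np.array_split(np.arange(region.shape[0]), pooled_h)
        cols = np.array_split(np.arange(region.shape[1]), pooled_w)
        pooled.append(np.array([[region[np.ix_(r, c)].max() for c in cols]
                                for r in rows], dtype='float32'))
    return pooled

# The same fake feature map as above: all ones, with a single 50 near one corner.
feature_map = np.ones((200, 100, 1), dtype='float32')
feature_map[200 - 1, 100 - 3, 0] = 50
result = pool_rois_sketch(feature_map, [(0.5, 0.2, 0.7, 0.4), (0.0, 0.0, 1.0, 1.0)], 3, 7)
```

Each region, whatever its size, comes out as a fixed 3x7 grid; that fixed output size is the property that lets Faster R-CNN feed arbitrary proposals into a fixed-size classifier.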
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.9" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/notebooks/chapter24/images/RCNN.png b/notebooks/chapter24/images/RCNN.png new file mode 100644 index 000000000..273021fbe Binary files /dev/null and b/notebooks/chapter24/images/RCNN.png differ diff --git a/notebooks/chapter24/images/derivative_of_gaussian.png b/notebooks/chapter24/images/derivative_of_gaussian.png new file mode 100644 index 000000000..0be575529 Binary files /dev/null and b/notebooks/chapter24/images/derivative_of_gaussian.png differ diff --git a/notebooks/chapter24/images/gradients.png b/notebooks/chapter24/images/gradients.png new file mode 100644 index 000000000..ae57bdf3b Binary files /dev/null and b/notebooks/chapter24/images/gradients.png differ diff --git a/notebooks/chapter24/images/laplacian.png b/notebooks/chapter24/images/laplacian.png new file mode 100644 index 000000000..6d7e6916a Binary files /dev/null and b/notebooks/chapter24/images/laplacian.png differ diff --git a/notebooks/chapter24/images/laplacian_kernels.png b/notebooks/chapter24/images/laplacian_kernels.png new file mode 100644 index 000000000..faca3321c Binary files /dev/null and b/notebooks/chapter24/images/laplacian_kernels.png differ diff --git a/notebooks/chapter24/images/stapler.png b/notebooks/chapter24/images/stapler.png new file mode 100644 index 000000000..e550d83f9 Binary files /dev/null and b/notebooks/chapter24/images/stapler.png differ diff --git a/notebooks/chapter24/images/stapler_bbox.png 
b/notebooks/chapter24/images/stapler_bbox.png new file mode 100644 index 000000000..c5a7c7af0 Binary files /dev/null and b/notebooks/chapter24/images/stapler_bbox.png differ diff --git a/obsolete_search4e.ipynb b/obsolete_search4e.ipynb new file mode 100644 index 000000000..72981d49b --- /dev/null +++ b/obsolete_search4e.ipynb @@ -0,0 +1,3835 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "source": [ + "*Note: This is not yet ready, but shows the direction I'm leaning in for Fourth Edition Search.*\n", + "\n", + "# State-Space Search\n", + "\n", + "This notebook describes several state-space search algorithms, and how they can be used to solve a variety of problems. We start with a simple algorithm and a simple domain: finding a route from city to city. Later we will explore other algorithms and domains.\n", + "\n", + "## The Route-Finding Domain\n", + "\n", + "Like all state-space search problems, in a route-finding problem you will be given:\n", + "- A start state (for example, `'A'` for the city Arad).\n", + "- A goal state (for example, `'B'` for the city Bucharest).\n", + "- Actions that can change state (for example, driving from `'A'` to `'S'`).\n", + "\n", + "You will be asked to find:\n", + "- A path from the start state, through intermediate states, to the goal state.\n", + "\n", + "We'll use this map:\n", + "\n", + "\n", + "\n", + "A state-space search problem can be represented by a *graph*, where the vertices of the graph are the states of the problem (in this case, cities) and the edges of the graph are the actions (in this case, driving along a road).\n", + "\n", + "We'll represent a city by its single initial letter. \n", + "We'll represent the graph of connections as a `dict` that maps each city to a list of the neighboring cities (connected by a road). 
For now we don't explicitly represent the actions, nor the distances\n", + "between cities." + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "button": false, + "collapsed": true, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [], + "source": [ + "romania = {\n", + " 'A': ['Z', 'T', 'S'],\n", + " 'B': ['F', 'P', 'G', 'U'],\n", + " 'C': ['D', 'R', 'P'],\n", + " 'D': ['M', 'C'],\n", + " 'E': ['H'],\n", + " 'F': ['S', 'B'],\n", + " 'G': ['B'],\n", + " 'H': ['U', 'E'],\n", + " 'I': ['N', 'V'],\n", + " 'L': ['T', 'M'],\n", + " 'M': ['L', 'D'],\n", + " 'N': ['I'],\n", + " 'O': ['Z', 'S'],\n", + " 'P': ['R', 'C', 'B'],\n", + " 'R': ['S', 'C', 'P'],\n", + " 'S': ['A', 'O', 'F', 'R'],\n", + " 'T': ['A', 'L'],\n", + " 'U': ['B', 'V', 'H'],\n", + " 'V': ['U', 'I'],\n", + " 'Z': ['O', 'A']}" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "source": [ + "Suppose we want to get from `A` to `B`. Where can we go from the start state, `A`?" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "['Z', 'T', 'S']" + ] + }, + "execution_count": 2, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "romania['A']" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "source": [ + "We see that from `A` we can get to any of the three cities `['Z', 'T', 'S']`. Which should we choose? *We don't know.* That's the whole point of *search*: we don't know which immediate action is best, so we'll have to explore, until we find a *path* that leads to the goal. \n", + "\n", + "How do we explore? 
We'll start with a simple algorithm that will get us from `A` to `B`. We'll keep a *frontier*—a collection of not-yet-explored states—and expand the frontier outward until it reaches the goal. To be more precise:\n", + "\n", + "- Initially, the only state in the frontier is the start state, `'A'`.\n", + "- Until we reach the goal, or run out of states in the frontier to explore, do the following:\n", + " - Remove the first state from the frontier. Call it `s`.\n", + " - If `s` is the goal, we're done. Return the path to `s`.\n", + " - Otherwise, consider all the neighboring states of `s`. For each one:\n", + " - If we have not previously explored the state, add it to the end of the frontier.\n", + " - Also keep track of the previous state that led to this new neighboring state; we'll need this to reconstruct the path to the goal, and to keep us from re-visiting previously explored states.\n", + " \n", + "# A Simple Search Algorithm: `breadth_first`\n", + " \n", + "The function `breadth_first` implements this strategy:" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": { + "button": false, + "collapsed": true, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [], + "source": [ + "from collections import deque # Doubly-ended queue: pop from left, append to right.\n", + "\n", + "def breadth_first(start, goal, neighbors):\n", + " \"Find a shortest sequence of states from start to the goal.\"\n", + " frontier = deque([start]) # A queue of states\n", + " previous = {start: None} # start has no previous state; other states will\n", + " while frontier:\n", + " s = frontier.popleft()\n", + " if s == goal:\n", + " return path(previous, s)\n", + " for s2 in neighbors[s]:\n", + " if s2 not in previous:\n", + " frontier.append(s2)\n", + " previous[s2] = s\n", + " \n", + "def path(previous, s): \n", + " \"Return a list of states that lead to state s, according to the previous dict.\"\n", + " return [] if (s is None) else 
path(previous, previous[s]) + [s]" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "source": [ + "A couple of things to note: \n", + "\n", + "1. We always add new states to the end of the frontier queue. That means that all the states that are adjacent to the start state will come first in the queue, then all the states that are two steps away, then three steps, etc.\n", + "That's what we mean by *breadth-first* search.\n", + "2. We recover the path to an `end` state by following the trail of `previous[end]` pointers, all the way back to `start`.\n", + "The dict `previous` is a map of `{state: previous_state}`. \n", + "3. When we finally get an `s` that is the goal state, we know we have found a shortest path, because any other state in the queue must correspond to a path that is as long or longer.\n", + "4. Note that `previous` contains all the states that are currently in `frontier` as well as all the states that were in `frontier` in the past.\n", + "5. If no path to the goal is found, then `breadth_first` returns `None`.
If a path is found, it returns the sequence of states on the path.\n", + "\n", + "Some examples:" + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "['A', 'S', 'F', 'B']" + ] + }, + "execution_count": 4, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "breadth_first('A', 'B', romania)" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "['L', 'T', 'A', 'S', 'F', 'B', 'U', 'V', 'I', 'N']" + ] + }, + "execution_count": 5, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "breadth_first('L', 'N', romania)" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "['N', 'I', 'V', 'U', 'B', 'F', 'S', 'A', 'T', 'L']" + ] + }, + "execution_count": 6, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "breadth_first('N', 'L', romania)" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "['E']" + ] + }, + "execution_count": 7, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "breadth_first('E', 'E', romania)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "source": [ + "Now let's try a different kind of problem that can be solved with the same search function.\n", + "\n", + "## Word Ladders Problem\n", + "\n", + "A *word ladder* problem is this: given a start word and a goal word, find the shortest way to 
transform the start word into the goal word by changing one letter at a time, such that each change results in a word. For example starting with `green` we can reach `grass` in 7 steps:\n", + "\n", + "`green` → `greed` → `treed` → `trees` → `tress` → `cress` → `crass` → `grass`\n", + "\n", + "We will need a dictionary of words. We'll use 5-letter words from the [Stanford GraphBase](http://www-cs-faculty.stanford.edu/~uno/sgb.html) project for this purpose. Let's get that file from aimadata." + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": { + "button": false, + "collapsed": true, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [], + "source": [ + "from search import *\n", + "sgb_words = open_data(\"EN-text/sgb-words.txt\")" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "source": [ + "We can assign `WORDS` to be the set of all the words in this file:" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "5757" + ] + }, + "execution_count": 9, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "WORDS = set(sgb_words.read().split())\n", + "len(WORDS)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "source": [ + "And define `neighboring_words` to return the set of all words that are a one-letter change away from a given `word`:" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": { + "button": false, + "collapsed": true, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [], + "source": [ + "def neighboring_words(word):\n", + " \"All words that are one letter away 
from this word.\"\n", + " neighbors = {word[:i] + c + word[i+1:]\n", + " for i in range(len(word))\n", + " for c in 'abcdefghijklmnopqrstuvwxyz'\n", + " if c != word[i]}\n", + " return neighbors & WORDS" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "For example:" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "{'cello', 'hallo', 'hells', 'hullo', 'jello'}" + ] + }, + "execution_count": 11, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "neighboring_words('hello')" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{'would'}" + ] + }, + "execution_count": 12, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "neighboring_words('world')" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "source": [ + "Now we can create `word_neighbors` as a dict of `{word: {neighboring_word, ...}}`: " + ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": { + "button": false, + "collapsed": true, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [], + "source": [ + "word_neighbors = {word: neighboring_words(word)\n", + " for word in WORDS}" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "source": [ + "Now the `breadth_first` function can be used to solve a word ladder problem:" + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "['green', 
'greed', 'treed', 'trees', 'treys', 'trays', 'grays', 'grass']" + ] + }, + "execution_count": 14, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "breadth_first('green', 'grass', word_neighbors)" + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "['smart',\n", + " 'start',\n", + " 'stars',\n", + " 'sears',\n", + " 'bears',\n", + " 'beans',\n", + " 'brans',\n", + " 'brand',\n", + " 'braid',\n", + " 'brain']" + ] + }, + "execution_count": 15, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "breadth_first('smart', 'brain', word_neighbors)" + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "['frown',\n", + " 'flown',\n", + " 'flows',\n", + " 'slows',\n", + " 'slots',\n", + " 'slits',\n", + " 'spits',\n", + " 'spite',\n", + " 'smite',\n", + " 'smile']" + ] + }, + "execution_count": 16, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "breadth_first('frown', 'smile', word_neighbors)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "source": [ + "# More General Search Algorithms\n", + "\n", + "Now we'll embellish the `breadth_first` algorithm to make a family of search algorithms with more capabilities:\n", + "\n", + "1. We distinguish between an *action* and the *result* of an action.\n", + "2. We allow different measures of the cost of a solution (not just the number of steps in the sequence).\n", + "3. 
We search through the state space in an order that is more likely to lead to an optimal solution quickly.\n", + "\n", + "Here's how we do these things:\n", + "\n", + "1. Instead of having a graph of neighboring states, we instead have an object of type *Problem*. A Problem\n", + "has one method, `Problem.actions(state)` to return a collection of the actions that are allowed in a state,\n", + "and another method, `Problem.result(state, action)` that says what happens when you take an action.\n", + "2. We keep a set, `explored` of states that have already been explored. We also have a class, `Frontier`, that makes it efficient to ask if a state is on the frontier.\n", + "3. Each action has a cost associated with it (in fact, the cost can vary with both the state and the action).\n", + "4. The `Frontier` class acts as a priority queue, allowing the \"best\" state to be explored next.\n", + "We represent a sequence of actions and resulting states as a linked list of `Node` objects.\n", + "\n", + "The algorithm `breadth_first_search` is basically the same as `breadth_first`, but using our new conventions:" + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": { + "button": false, + "collapsed": true, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [], + "source": [ + "def breadth_first_search(problem):\n", + " \"Search for goal; paths with least number of steps first.\"\n", + " if problem.is_goal(problem.initial): \n", + " return Node(problem.initial)\n", + " frontier = FrontierQ(Node(problem.initial), LIFO=False)\n", + " explored = set()\n", + " while frontier:\n", + " node = frontier.pop()\n", + " explored.add(node.state)\n", + " for action in problem.actions(node.state):\n", + " child = node.child(problem, action)\n", + " if child.state not in explored and child.state not in frontier:\n", + " if problem.is_goal(child.state):\n", + " return child\n", + " frontier.add(child)" + ] + }, + { + "cell_type": 
"markdown", + "metadata": {}, + "source": [ + "Next is `uniform_cost_search`, in which each step can have a different cost, and we still consider first one os the states with minimum cost so far." + ] + }, + { + "cell_type": "code", + "execution_count": 18, + "metadata": { + "button": false, + "collapsed": true, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [], + "source": [ + "def uniform_cost_search(problem, costfn=lambda node: node.path_cost):\n", + " frontier = FrontierPQ(Node(problem.initial), costfn)\n", + " explored = set()\n", + " while frontier:\n", + " node = frontier.pop()\n", + " if problem.is_goal(node.state):\n", + " return node\n", + " explored.add(node.state)\n", + " for action in problem.actions(node.state):\n", + " child = node.child(problem, action)\n", + " if child.state not in explored and child not in frontier:\n", + " frontier.add(child)\n", + " elif child in frontier and frontier.cost[child] < child.path_cost:\n", + " frontier.replace(child)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Finally, `astar_search` in which the cost includes an estimate of the distance to the goal as well as the distance travelled so far." + ] + }, + { + "cell_type": "code", + "execution_count": 19, + "metadata": { + "button": false, + "collapsed": true, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [], + "source": [ + "def astar_search(problem, heuristic):\n", + " costfn = lambda node: node.path_cost + heuristic(node.state)\n", + " return uniform_cost_search(problem, costfn)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "source": [ + "# Search Tree Nodes\n", + "\n", + "The solution to a search problem is now a linked list of `Node`s, where each `Node`\n", + "includes a `state` and the `path_cost` of getting to the state. 
In addition, for every `Node` except for the first (root) `Node`, there is a previous `Node` (indicating the state that led to this `Node`) and an `action` (indicating the action taken to get here)." + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": { + "button": false, + "collapsed": true, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [], + "source": [ + "class Node(object):\n", + " \"\"\"A node in a search tree. A search tree is a spanning tree over states.\n", + " A Node contains a state, the previous node in the tree, the action that\n", + " takes us from the previous state to this state, and the path cost to get to \n", + " this state. If a state is arrived at by two paths, then there are two nodes \n", + " with the same state.\"\"\"\n", + "\n", + " def __init__(self, state, previous=None, action=None, step_cost=1):\n", + " \"Create a search tree Node, derived from a previous Node by an action.\"\n", + " self.state = state\n", + " self.previous = previous\n", + " self.action = action\n", + " self.path_cost = 0 if previous is None else (previous.path_cost + step_cost)\n", + "\n", + " def __repr__(self): return \"<Node {}: {}>\".format(self.state, self.path_cost)\n", + " \n", + " def __lt__(self, other): return self.path_cost < other.path_cost\n", + " \n", + " def child(self, problem, action):\n", + " \"The Node you get by taking an action from this Node.\"\n", + " result = problem.result(self.state, action)\n", + " return Node(result, self, action, \n", + " problem.step_cost(self.state, action, result)) " + ] + }, + { + "cell_type": "markdown", + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "source": [ + "# Frontiers\n", + "\n", + "A frontier is a collection of Nodes that acts like both a Queue and a Set.
A frontier, `f`, supports these operations:\n", + "\n", + "* `f.add(node)`: Add a node to the Frontier.\n", + "\n", + "* `f.pop()`: Remove and return the \"best\" node from the frontier.\n", + "\n", + "* `f.replace(node)`: add this node and remove a previous node with the same state.\n", + "\n", + "* `state in f`: Test if some node in the frontier has arrived at state.\n", + "\n", + "* `f[state]`: returns the node corresponding to this state in frontier.\n", + "\n", + "* `len(f)`: The number of Nodes in the frontier. When the frontier is empty, `f` is *false*.\n", + "\n", + "We provide two kinds of frontiers: One for \"regular\" queues, either first-in-first-out (for breadth-first search) or last-in-first-out (for depth-first search), and one for priority queues, where you can specify what cost function on nodes you are trying to minimize." + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": { + "button": false, + "collapsed": true, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [], + "source": [ + "from collections import OrderedDict\n", + "import heapq\n", + "\n", + "class FrontierQ(OrderedDict):\n", + " \"A Frontier that supports FIFO or LIFO Queue ordering.\"\n", + " \n", + " def __init__(self, initial, LIFO=False):\n", + " \"\"\"Initialize Frontier with an initial Node.\n", + " If LIFO is True, pop from the end first; otherwise from front first.\"\"\"\n", + " super(FrontierQ, self).__init__()\n", + " self.LIFO = LIFO\n", + " self.add(initial)\n", + " \n", + " def add(self, node):\n", + " \"Add a node to the frontier.\"\n", + " self[node.state] = node\n", + " \n", + " def pop(self):\n", + " \"Remove and return the next Node in the frontier.\"\n", + " (state, node) = self.popitem(self.LIFO)\n", + " return node\n", + " \n", + " def replace(self, node):\n", + " \"Make this node replace the old node with the same state.\"\n", + " del self[node.state]\n", + " self.add(node)" + ] + }, + { + "cell_type": 
"code", + "execution_count": 22, + "metadata": { + "button": false, + "collapsed": true, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [], + "source": [ + "class FrontierPQ:\n", + " \"A Frontier ordered by a cost function; a Priority Queue.\"\n", + " \n", + " def __init__(self, initial, costfn=lambda node: node.path_cost):\n", + " \"Initialize Frontier with an initial Node, and specify a cost function.\"\n", + " self.heap = []\n", + " self.states = {}\n", + " self.costfn = costfn\n", + " self.add(initial)\n", + " \n", + " def add(self, node):\n", + " \"Add node to the frontier.\"\n", + " cost = self.costfn(node)\n", + " heapq.heappush(self.heap, (cost, node))\n", + " self.states[node.state] = node\n", + " \n", + " def pop(self):\n", + " \"Remove and return the Node with minimum cost.\"\n", + " (cost, node) = heapq.heappop(self.heap)\n", + " self.states.pop(node.state, None) # remove state\n", + " return node\n", + " \n", + " def replace(self, node):\n", + " \"Make this node replace a previous node with the same state.\"\n", + " if node.state not in self:\n", + " raise ValueError('{} not there to replace'.format(node.state))\n", + " for (i, (cost, old_node)) in enumerate(self.heap):\n", + " if old_node.state == node.state:\n", + " self.heap[i] = (self.costfn(node), node)\n", + " heapq._siftdown(self.heap, 0, i)\n", + " return\n", + "\n", + " def __contains__(self, state): return state in self.states\n", + " \n", + " def __len__(self): return len(self.heap)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "source": [ + "# Search Problems\n", + "\n", + "`Problem` is the abstract class for all search problems. You can define your own class of problems as a subclass of `Problem`. You will need to override the `actions` and `result` method to describe how your problem works. 
You will also have to either override `is_goal` or pass a collection of goal states to the initialization method. If actions have different costs, you should override the `step_cost` method. " + ] + }, + { + "cell_type": "code", + "execution_count": 23, + "metadata": { + "button": false, + "collapsed": true, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [], + "source": [ + "class Problem(object):\n", + " \"\"\"The abstract class for a search problem.\"\"\"\n", + "\n", + " def __init__(self, initial=None, goals=(), **additional_keywords):\n", + " \"\"\"Provide an initial state and optional goal states.\n", + " A subclass can have additional keyword arguments.\"\"\"\n", + " self.initial = initial # The initial state of the problem.\n", + " self.goals = goals # A collection of possible goal states.\n", + " self.__dict__.update(**additional_keywords)\n", + "\n", + " def actions(self, state):\n", + " \"Return a list of actions executable in this state.\"\n", + " raise NotImplementedError # Override this!\n", + "\n", + " def result(self, state, action):\n", + " \"The state that results from executing this action in this state.\"\n", + " raise NotImplementedError # Override this!\n", + "\n", + " def is_goal(self, state):\n", + " \"True if the state is a goal.\" \n", + " return state in self.goals # Optionally override this!\n", + "\n", + " def step_cost(self, state, action, result=None):\n", + " \"The cost of taking this action from this state.\"\n", + " return 1 # Override this if actions have different costs " + ] + }, + { + "cell_type": "code", + "execution_count": 24, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def action_sequence(node):\n", + " \"The sequence of actions to get to this node.\"\n", + " actions = []\n", + " while node.previous:\n", + " actions.append(node.action)\n", + " node = node.previous\n", + " return actions[::-1]\n", + "\n", + "def state_sequence(node):\n", + " \"The sequence of 
states to get to this node.\"\n", + " states = [node.state]\n", + " while node.previous:\n", + " node = node.previous\n", + " states.append(node.state)\n", + " return states[::-1]" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "source": [ + "# Two Location Vacuum World" + ] + }, + { + "cell_type": "code", + "execution_count": 25, + "metadata": { + "button": false, + "collapsed": true, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [], + "source": [ + "dirt = '*'\n", + "clean = ' '\n", + "\n", + "class TwoLocationVacuumProblem(Problem):\n", + " \"\"\"A Vacuum in a world with two locations, and dirt.\n", + " Each state is a tuple of (location, dirt_in_W, dirt_in_E).\"\"\"\n", + "\n", + " def actions(self, state): return ('W', 'E', 'Suck')\n", + " \n", + " def is_goal(self, state): return dirt not in state\n", + " \n", + " def result(self, state, action):\n", + " \"The state that results from executing this action in this state.\" \n", + " (loc, dirtW, dirtE) = state\n", + " if action == 'W': return ('W', dirtW, dirtE)\n", + " elif action == 'E': return ('E', dirtW, dirtE)\n", + " elif action == 'Suck' and loc == 'W': return (loc, clean, dirtE)\n", + " elif action == 'Suck' and loc == 'E': return (loc, dirtW, clean) \n", + " else: raise ValueError('unknown action: ' + action)" + ] + }, + { + "cell_type": "code", + "execution_count": 26, + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "<Node ('E', ' ', ' '): 3>" + ] + }, + "execution_count": 26, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "problem = TwoLocationVacuumProblem(initial=('W', dirt, dirt))\n", + "result = uniform_cost_search(problem)\n", + "result" + ] + }, + { + "cell_type": "code", + "execution_count": 27, + "metadata": {}, + "outputs": [ + { + "data": { 
+ "text/plain": [ + "['Suck', 'E', 'Suck']" + ] + }, + "execution_count": 27, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "action_sequence(result)" + ] + }, + { + "cell_type": "code", + "execution_count": 28, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "[('W', '*', '*'), ('W', ' ', '*'), ('E', ' ', '*'), ('E', ' ', ' ')]" + ] + }, + "execution_count": 28, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "state_sequence(result)" + ] + }, + { + "cell_type": "code", + "execution_count": 29, + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "['Suck']" + ] + }, + "execution_count": 29, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "problem = TwoLocationVacuumProblem(initial=('E', clean, dirt))\n", + "result = uniform_cost_search(problem)\n", + "action_sequence(result)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "source": [ + "# Water Pouring Problem\n", + "\n", + "Here is another problem domain, to show you how to define one. The idea is that we have a number of water jugs and a water tap and the goal is to measure out a specific amount of water (in, say, ounces or liters). You can completely fill or empty a jug, but because the jugs don't have markings on them, you can't partially fill them with a specific amount. You can, however, pour one jug into another, stopping when the second is full or the first is empty."
+ ] + }, + { + "cell_type": "code", + "execution_count": 30, + "metadata": { + "button": false, + "collapsed": true, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [], + "source": [ + "class PourProblem(Problem):\n", + " \"\"\"Problem about pouring water between jugs to achieve some water level.\n", + " Each state is a tuple of levels. In the initialization, provide a tuple of \n", + " capacities, e.g. PourProblem(capacities=(8, 16, 32), initial=(2, 4, 3), goals={7}), \n", + " which means three jugs of capacity 8, 16, 32, currently filled with 2, 4, 3 units of \n", + " water, respectively, and the goal is to get a level of 7 in any one of the jugs.\"\"\"\n", + " \n", + " def actions(self, state):\n", + " \"\"\"The actions executable in this state.\"\"\"\n", + " jugs = range(len(state))\n", + " return ([('Fill', i) for i in jugs if state[i] != self.capacities[i]] +\n", + " [('Dump', i) for i in jugs if state[i] != 0] +\n", + " [('Pour', i, j) for i in jugs for j in jugs if i != j])\n", + "\n", + " def result(self, state, action):\n", + " \"\"\"The state that results from executing this action in this state.\"\"\"\n", + " result = list(state)\n", + " act, i, j = action[0], action[1], action[-1]\n", + " if act == 'Fill': # Fill i to capacity\n", + " result[i] = self.capacities[i]\n", + " elif act == 'Dump': # Empty i\n", + " result[i] = 0\n", + " elif act == 'Pour':\n", + " a, b = state[i], state[j]\n", + " result[i], result[j] = ((0, a + b) \n", + " if (a + b <= self.capacities[j]) else\n", + " (a + b - self.capacities[j], self.capacities[j]))\n", + " else:\n", + " raise ValueError('unknown action', action)\n", + " return tuple(result)\n", + "\n", + " def is_goal(self, state):\n", + " \"\"\"True if any of the jugs has a level equal to one of the goal levels.\"\"\"\n", + " return any(level in self.goals for level in state)" + ] + }, + { + "cell_type": "code", + "execution_count": 31, + "metadata": { + "button": false, + 
"new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "(2, 13)" + ] + }, + "execution_count": 31, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "p7 = PourProblem(initial=(2, 0), capacities=(5, 13), goals={7})\n", + "p7.result((2, 0), ('Fill', 1))" + ] + }, + { + "cell_type": "code", + "execution_count": 32, + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('Pour', 0, 1), ('Fill', 0), ('Pour', 0, 1)]" + ] + }, + "execution_count": 32, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "result = uniform_cost_search(p7)\n", + "action_sequence(result)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "source": [ + "# Visualization Output" + ] + }, + { + "cell_type": "code", + "execution_count": 33, + "metadata": { + "button": false, + "collapsed": true, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [], + "source": [ + "def showpath(searcher, problem):\n", + " \"Show what happens when searcher solves problem.\"\n", + " problem = Instrumented(problem)\n", + " print('\\n{}:'.format(searcher.__name__))\n", + " result = searcher(problem)\n", + " if result:\n", + " actions = action_sequence(result)\n", + " state = problem.initial\n", + " path_cost = 0\n", + " for steps, action in enumerate(actions, 1):\n", + " path_cost += problem.step_cost(state, action, 0)\n", + " result = problem.result(state, action)\n", + " print(' {} =={}==> {}; cost {} after {} steps'\n", + " .format(state, action, result, path_cost, steps,\n", + " '; GOAL!' 
if problem.is_goal(result) else ''))\n", + " state = result\n", + " msg = 'GOAL FOUND' if result else 'no solution'\n", + " print('{} after {} results and {} goal checks'\n", + " .format(msg, problem._counter['result'], problem._counter['is_goal']))\n", + " \n", + "from collections import Counter\n", + "\n", + "class Instrumented:\n", + " \"Instrument an object to count all the attribute accesses in _counter.\"\n", + " def __init__(self, obj):\n", + " self._object = obj\n", + " self._counter = Counter()\n", + " def __getattr__(self, attr):\n", + " self._counter[attr] += 1\n", + " return getattr(self._object, attr) " + ] + }, + { + "cell_type": "code", + "execution_count": 34, + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "uniform_cost_search:\n", + " (2, 0) ==('Pour', 0, 1)==> (0, 2); cost 1 after 1 steps\n", + " (0, 2) ==('Fill', 0)==> (5, 2); cost 2 after 2 steps\n", + " (5, 2) ==('Pour', 0, 1)==> (0, 7); cost 3 after 3 steps\n", + "GOAL FOUND after 83 results and 22 goal checks\n" + ] + } + ], + "source": [ + "showpath(uniform_cost_search, p7)" + ] + }, + { + "cell_type": "code", + "execution_count": 35, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "uniform_cost_search:\n", + " (0, 0) ==('Fill', 0)==> (7, 0); cost 1 after 1 steps\n", + " (7, 0) ==('Pour', 0, 1)==> (0, 7); cost 2 after 2 steps\n", + " (0, 7) ==('Fill', 0)==> (7, 7); cost 3 after 3 steps\n", + " (7, 7) ==('Pour', 0, 1)==> (1, 13); cost 4 after 4 steps\n", + " (1, 13) ==('Dump', 1)==> (1, 0); cost 5 after 5 steps\n", + " (1, 0) ==('Pour', 0, 1)==> (0, 1); cost 6 after 6 steps\n", + " (0, 1) ==('Fill', 0)==> (7, 1); cost 7 after 7 steps\n", + " (7, 1) ==('Pour', 0, 1)==> (0, 8); cost 8 after 8 steps\n", + " (0, 8) ==('Fill', 0)==> (7, 8); cost 9 after 9 steps\n", + " (7, 8) ==('Pour', 0, 1)==> 
(2, 13); cost 10 after 10 steps\n", + "GOAL FOUND after 110 results and 32 goal checks\n" + ] + } + ], + "source": [ + "p = PourProblem(initial=(0, 0), capacities=(7, 13), goals={2})\n", + "showpath(uniform_cost_search, p)" + ] + }, + { + "cell_type": "code", + "execution_count": 36, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "class GreenPourProblem(PourProblem): \n", + " def step_cost(self, state, action, result=None):\n", + " \"The cost is the amount of water used in a fill.\"\n", + " if action[0] == 'Fill':\n", + " i = action[1]\n", + " return self.capacities[i] - state[i]\n", + " return 0" + ] + }, + { + "cell_type": "code", + "execution_count": 37, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "uniform_cost_search:\n", + " (0, 0) ==('Fill', 0)==> (7, 0); cost 7 after 1 steps\n", + " (7, 0) ==('Pour', 0, 1)==> (0, 7); cost 7 after 2 steps\n", + " (0, 7) ==('Fill', 0)==> (7, 7); cost 14 after 3 steps\n", + " (7, 7) ==('Pour', 0, 1)==> (1, 13); cost 14 after 4 steps\n", + " (1, 13) ==('Dump', 1)==> (1, 0); cost 14 after 5 steps\n", + " (1, 0) ==('Pour', 0, 1)==> (0, 1); cost 14 after 6 steps\n", + " (0, 1) ==('Fill', 0)==> (7, 1); cost 21 after 7 steps\n", + " (7, 1) ==('Pour', 0, 1)==> (0, 8); cost 21 after 8 steps\n", + " (0, 8) ==('Fill', 0)==> (7, 8); cost 28 after 9 steps\n", + " (7, 8) ==('Pour', 0, 1)==> (2, 13); cost 28 after 10 steps\n", + "GOAL FOUND after 184 results and 48 goal checks\n" + ] + } + ], + "source": [ + "p = GreenPourProblem(initial=(0, 0), capacities=(7, 13), goals={2})\n", + "showpath(uniform_cost_search, p)" + ] + }, + { + "cell_type": "code", + "execution_count": 38, + "metadata": { + "button": false, + "collapsed": true, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [], + "source": [ + "def compare_searchers(problem, searchers=None):\n", + " \"Apply each of the search algorithms to the problem, and show 
results\"\n", + " if searchers is None: \n", + " searchers = (breadth_first_search, uniform_cost_search)\n", + " for searcher in searchers:\n", + " showpath(searcher, problem)" + ] + }, + { + "cell_type": "code", + "execution_count": 39, + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "breadth_first_search:\n", + " (0, 0) ==('Fill', 0)==> (7, 0); cost 7 after 1 steps\n", + " (7, 0) ==('Pour', 0, 1)==> (0, 7); cost 7 after 2 steps\n", + " (0, 7) ==('Fill', 0)==> (7, 7); cost 14 after 3 steps\n", + " (7, 7) ==('Pour', 0, 1)==> (1, 13); cost 14 after 4 steps\n", + " (1, 13) ==('Dump', 1)==> (1, 0); cost 14 after 5 steps\n", + " (1, 0) ==('Pour', 0, 1)==> (0, 1); cost 14 after 6 steps\n", + " (0, 1) ==('Fill', 0)==> (7, 1); cost 21 after 7 steps\n", + " (7, 1) ==('Pour', 0, 1)==> (0, 8); cost 21 after 8 steps\n", + " (0, 8) ==('Fill', 0)==> (7, 8); cost 28 after 9 steps\n", + " (7, 8) ==('Pour', 0, 1)==> (2, 13); cost 28 after 10 steps\n", + "GOAL FOUND after 100 results and 31 goal checks\n", + "\n", + "uniform_cost_search:\n", + " (0, 0) ==('Fill', 0)==> (7, 0); cost 7 after 1 steps\n", + " (7, 0) ==('Pour', 0, 1)==> (0, 7); cost 7 after 2 steps\n", + " (0, 7) ==('Fill', 0)==> (7, 7); cost 14 after 3 steps\n", + " (7, 7) ==('Pour', 0, 1)==> (1, 13); cost 14 after 4 steps\n", + " (1, 13) ==('Dump', 1)==> (1, 0); cost 14 after 5 steps\n", + " (1, 0) ==('Pour', 0, 1)==> (0, 1); cost 14 after 6 steps\n", + " (0, 1) ==('Fill', 0)==> (7, 1); cost 21 after 7 steps\n", + " (7, 1) ==('Pour', 0, 1)==> (0, 8); cost 21 after 8 steps\n", + " (0, 8) ==('Fill', 0)==> (7, 8); cost 28 after 9 steps\n", + " (7, 8) ==('Pour', 0, 1)==> (2, 13); cost 28 after 10 steps\n", + "GOAL FOUND after 184 results and 48 goal checks\n" + ] + } + ], + "source": [ + "compare_searchers(p)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": 
[ + "# Random Grid\n", + "\n", + "An environment where you can move in any of 4 directions, unless there is an obstacle there.\n", + "\n", + "\n", + "\n" + ] + }, + { + "cell_type": "code", + "execution_count": 40, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{(0, 0): [(0, 1), (1, 0)],\n", + " (0, 1): [(0, 2), (0, 0), (1, 1)],\n", + " (0, 2): [(0, 3), (0, 1), (1, 2)],\n", + " (0, 3): [(0, 4), (0, 2), (1, 3)],\n", + " (0, 4): [(0, 3), (1, 4)],\n", + " (1, 0): [(1, 1), (2, 0), (0, 0)],\n", + " (1, 1): [(1, 2), (1, 0), (2, 1), (0, 1)],\n", + " (1, 2): [(1, 3), (1, 1), (2, 2), (0, 2)],\n", + " (1, 3): [(1, 4), (1, 2), (2, 3), (0, 3)],\n", + " (1, 4): [(1, 3), (2, 4), (0, 4)],\n", + " (2, 0): [(2, 1), (3, 0), (1, 0)],\n", + " (2, 1): [(2, 2), (2, 0), (3, 1), (1, 1)],\n", + " (2, 2): [(2, 3), (2, 1), (1, 2)],\n", + " (2, 3): [(2, 4), (2, 2), (3, 3), (1, 3)],\n", + " (2, 4): [(2, 3), (1, 4)],\n", + " (3, 0): [(3, 1), (4, 0), (2, 0)],\n", + " (3, 1): [(3, 0), (4, 1), (2, 1)],\n", + " (3, 2): [(3, 3), (3, 1), (4, 2), (2, 2)],\n", + " (3, 3): [(4, 3), (2, 3)],\n", + " (3, 4): [(3, 3), (4, 4), (2, 4)],\n", + " (4, 0): [(4, 1), (3, 0)],\n", + " (4, 1): [(4, 2), (4, 0), (3, 1)],\n", + " (4, 2): [(4, 3), (4, 1)],\n", + " (4, 3): [(4, 4), (4, 2), (3, 3)],\n", + " (4, 4): [(4, 3)]}" + ] + }, + "execution_count": 40, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "import random\n", + "\n", + "N, S, E, W = DIRECTIONS = [(0, 1), (0, -1), (1, 0), (-1, 0)]\n", + "\n", + "def Grid(width, height, obstacles=0.1):\n", + " \"\"\"A 2-D grid, width x height, with obstacles that are either a collection of points,\n", + " or a fraction between 0 and 1 indicating the density of obstacles, chosen at random.\"\"\"\n", + " grid = {(x, y) for x in range(width) for y in range(height)}\n", + " if isinstance(obstacles, (float, int)):\n", + " obstacles = random.sample(grid, int(width * height * obstacles))\n", + " def neighbors(x, y):\n", + " for 
(dx, dy) in DIRECTIONS:\n", + " (nx, ny) = (x + dx, y + dy)\n", + " if (nx, ny) not in obstacles and 0 <= nx < width and 0 <= ny < height:\n", + " yield (nx, ny)\n", + " return {(x, y): list(neighbors(x, y))\n", + " for x in range(width) for y in range(height)}\n", + "\n", + "Grid(5, 5)" + ] + }, + { + "cell_type": "code", + "execution_count": 41, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "class GridProblem(Problem):\n", + " \"Create with a call like GridProblem(grid=Grid(10, 10), initial=(0, 0), goal=(9, 9))\"\n", + " def actions(self, state): return DIRECTIONS\n", + " def result(self, state, action):\n", + " #print('ask for result of', state, action)\n", + " (x, y) = state\n", + " (dx, dy) = action\n", + " r = (x + dx, y + dy)\n", + " return r if r in self.grid[state] else state" + ] + }, + { + "cell_type": "code", + "execution_count": 42, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "uniform_cost_search:\n", + " (0, 0) ==(0, 1)==> (0, 1); cost 1 after 1 steps\n", + " (0, 1) ==(0, 1)==> (0, 2); cost 2 after 2 steps\n", + " (0, 2) ==(0, 1)==> (0, 3); cost 3 after 3 steps\n", + " (0, 3) ==(1, 0)==> (1, 3); cost 4 after 4 steps\n", + " (1, 3) ==(1, 0)==> (2, 3); cost 5 after 5 steps\n", + " (2, 3) ==(0, 1)==> (2, 4); cost 6 after 6 steps\n", + " (2, 4) ==(1, 0)==> (3, 4); cost 7 after 7 steps\n", + " (3, 4) ==(1, 0)==> (4, 4); cost 8 after 8 steps\n", + "GOAL FOUND after 248 results and 69 goal checks\n" + ] + } + ], + "source": [ + "gp = GridProblem(grid=Grid(5, 5, 0.3), initial=(0, 0), goals={(4, 4)})\n", + "showpath(uniform_cost_search, gp)\n" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "source": [ + "# Finding a hard PourProblem\n", + "\n", + "What solvable two-jug PourProblem requires the most steps? 
We can define the hardness as the number of steps, and then iterate over all PourProblems with capacities up to size M, keeping the hardest one." + ] + }, + { + "cell_type": "code", + "execution_count": 43, + "metadata": { + "button": false, + "collapsed": true, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [], + "source": [ + "def hardness(problem):\n", + " L = breadth_first_search(problem)\n", + " #print('hardness', problem.initial, problem.capacities, problem.goals, L)\n", + " return len(action_sequence(L)) if (L is not None) else 0" + ] + }, + { + "cell_type": "code", + "execution_count": 44, + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "3" + ] + }, + "execution_count": 44, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "hardness(p7)" + ] + }, + { + "cell_type": "code", + "execution_count": 45, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "[('Pour', 0, 1), ('Fill', 0), ('Pour', 0, 1)]" + ] + }, + "execution_count": 45, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "action_sequence(breadth_first_search(p7))" + ] + }, + { + "cell_type": "code", + "execution_count": 46, + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "((0, 0), (7, 9), {8})" + ] + }, + "execution_count": 46, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "C = 9 # Maximum capacity to consider\n", + "\n", + "phard = max((PourProblem(initial=(a, b), capacities=(A, B), goals={goal})\n", + " for A in range(C+1) for B in range(C+1)\n", + " for a in range(A) for b in range(B)\n", + " for goal in range(max(A, B))),\n", + " key=hardness)\n", + "\n", + "phard.initial, phard.capacities, phard.goals" + ] + }, + { + "cell_type": "code", + 
"execution_count": 47, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "breadth_first_search:\n", + " (0, 0) ==('Fill', 1)==> (0, 9); cost 1 after 1 steps\n", + " (0, 9) ==('Pour', 1, 0)==> (7, 2); cost 2 after 2 steps\n", + " (7, 2) ==('Dump', 0)==> (0, 2); cost 3 after 3 steps\n", + " (0, 2) ==('Pour', 1, 0)==> (2, 0); cost 4 after 4 steps\n", + " (2, 0) ==('Fill', 1)==> (2, 9); cost 5 after 5 steps\n", + " (2, 9) ==('Pour', 1, 0)==> (7, 4); cost 6 after 6 steps\n", + " (7, 4) ==('Dump', 0)==> (0, 4); cost 7 after 7 steps\n", + " (0, 4) ==('Pour', 1, 0)==> (4, 0); cost 8 after 8 steps\n", + " (4, 0) ==('Fill', 1)==> (4, 9); cost 9 after 9 steps\n", + " (4, 9) ==('Pour', 1, 0)==> (7, 6); cost 10 after 10 steps\n", + " (7, 6) ==('Dump', 0)==> (0, 6); cost 11 after 11 steps\n", + " (0, 6) ==('Pour', 1, 0)==> (6, 0); cost 12 after 12 steps\n", + " (6, 0) ==('Fill', 1)==> (6, 9); cost 13 after 13 steps\n", + " (6, 9) ==('Pour', 1, 0)==> (7, 8); cost 14 after 14 steps\n", + "GOAL FOUND after 150 results and 44 goal checks\n" + ] + } + ], + "source": [ + "showpath(breadth_first_search, PourProblem(initial=(0, 0), capacities=(7, 9), goals={8}))" + ] + }, + { + "cell_type": "code", + "execution_count": 48, + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "uniform_cost_search:\n", + " (0, 0) ==('Fill', 1)==> (0, 9); cost 1 after 1 steps\n", + " (0, 9) ==('Pour', 1, 0)==> (7, 2); cost 2 after 2 steps\n", + " (7, 2) ==('Dump', 0)==> (0, 2); cost 3 after 3 steps\n", + " (0, 2) ==('Pour', 1, 0)==> (2, 0); cost 4 after 4 steps\n", + " (2, 0) ==('Fill', 1)==> (2, 9); cost 5 after 5 steps\n", + " (2, 9) ==('Pour', 1, 0)==> (7, 4); cost 6 after 6 steps\n", + " (7, 4) ==('Dump', 0)==> (0, 4); cost 7 after 7 steps\n", + " (0, 4) ==('Pour', 1, 0)==> (4, 0); cost 8 after 8 steps\n", + 
" (4, 0) ==('Fill', 1)==> (4, 9); cost 9 after 9 steps\n", + " (4, 9) ==('Pour', 1, 0)==> (7, 6); cost 10 after 10 steps\n", + " (7, 6) ==('Dump', 0)==> (0, 6); cost 11 after 11 steps\n", + " (0, 6) ==('Pour', 1, 0)==> (6, 0); cost 12 after 12 steps\n", + " (6, 0) ==('Fill', 1)==> (6, 9); cost 13 after 13 steps\n", + " (6, 9) ==('Pour', 1, 0)==> (7, 8); cost 14 after 14 steps\n", + "GOAL FOUND after 159 results and 45 goal checks\n" + ] + } + ], + "source": [ + "showpath(uniform_cost_search, phard)" + ] + }, + { + "cell_type": "code", + "execution_count": 49, + "metadata": { + "button": false, + "collapsed": true, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [], + "source": [ + "class GridProblem(Problem):\n", + " \"\"\"A Grid.\"\"\"\n", + "\n", + " def actions(self, state): return ['N', 'S', 'E', 'W'] \n", + " \n", + " def result(self, state, action):\n", + " \"\"\"The state that results from executing this action in this state.\"\"\" \n", + " (W, H) = self.size\n", + " if action == 'N' and state >= W: return state - W\n", + " if action == 'S' and state + W < W * H: return state + W\n", + " if action == 'E' and (state + 1) % W != 0: return state + 1\n", + " if action == 'W' and state % W != 0: return state - 1\n", + " return state" + ] + }, + { + "cell_type": "code", + "execution_count": 50, + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "breadth_first_search:\n", + " 0 ==S==> 10; cost 1 after 1 steps\n", + " 10 ==S==> 20; cost 2 after 2 steps\n", + " 20 ==S==> 30; cost 3 after 3 steps\n", + " 30 ==S==> 40; cost 4 after 4 steps\n", + " 40 ==E==> 41; cost 5 after 5 steps\n", + " 41 ==E==> 42; cost 6 after 6 steps\n", + " 42 ==E==> 43; cost 7 after 7 steps\n", + " 43 ==E==> 44; cost 8 after 8 steps\n", + "GOAL FOUND after 135 results and 49 goal checks\n", + "\n", + 
"uniform_cost_search:\n", + " 0 ==S==> 10; cost 1 after 1 steps\n", + " 10 ==S==> 20; cost 2 after 2 steps\n", + " 20 ==E==> 21; cost 3 after 3 steps\n", + " 21 ==E==> 22; cost 4 after 4 steps\n", + " 22 ==E==> 23; cost 5 after 5 steps\n", + " 23 ==S==> 33; cost 6 after 6 steps\n", + " 33 ==S==> 43; cost 7 after 7 steps\n", + " 43 ==E==> 44; cost 8 after 8 steps\n", + "GOAL FOUND after 1036 results and 266 goal checks\n" + ] + } + ], + "source": [ + "compare_searchers(GridProblem(initial=0, goals={44}, size=(10, 10)))" + ] + }, + { + "cell_type": "code", + "execution_count": 51, + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'test_frontier ok'" + ] + }, + "execution_count": 51, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "def test_frontier():\n", + " \n", + " #### Breadth-first search with FIFO Q\n", + " f = FrontierQ(Node(1), LIFO=False)\n", + " assert 1 in f and len(f) == 1\n", + " f.add(Node(2))\n", + " f.add(Node(3))\n", + " assert 1 in f and 2 in f and 3 in f and len(f) == 3\n", + " assert f.pop().state == 1\n", + " assert 1 not in f and 2 in f and 3 in f and len(f) == 2\n", + " assert f\n", + " assert f.pop().state == 2\n", + " assert f.pop().state == 3\n", + " assert not f\n", + " \n", + " #### Depth-first search with LIFO Q\n", + " f = FrontierQ(Node('a'), LIFO=True)\n", + " for s in 'bcdef': f.add(Node(s))\n", + " assert len(f) == 6 and 'a' in f and 'c' in f and 'f' in f\n", + " for s in 'fedcba': assert f.pop().state == s\n", + " assert not f\n", + "\n", + " #### Best-first search with Priority Q\n", + " f = FrontierPQ(Node(''), lambda node: len(node.state))\n", + " assert '' in f and len(f) == 1 and f\n", + " for s in ['book', 'boo', 'bookie', 'bookies', 'cook', 'look', 'b']:\n", + " assert s not in f\n", + " f.add(Node(s))\n", + " assert s in f\n", + " assert f.pop().state == ''\n", + " assert f.pop().state == 
'b'\n", + " assert f.pop().state == 'boo'\n", + " assert {f.pop().state for _ in '123'} == {'book', 'cook', 'look'}\n", + " assert f.pop().state == 'bookie'\n", + " \n", + " #### Romania: Two paths to Bucharest; cheapest one found first\n", + " S = Node('S')\n", + " SF = Node('F', S, 'S->F', 99)\n", + " SFB = Node('B', SF, 'F->B', 211)\n", + " SR = Node('R', S, 'S->R', 80)\n", + " SRP = Node('P', SR, 'R->P', 97)\n", + " SRPB = Node('B', SRP, 'P->B', 101)\n", + " f = FrontierPQ(S)\n", + " f.add(SF); f.add(SR), f.add(SRP), f.add(SRPB); f.add(SFB)\n", + " def cs(n): return (n.path_cost, n.state) # cs: cost and state\n", + " assert cs(f.pop()) == (0, 'S')\n", + " assert cs(f.pop()) == (80, 'R')\n", + " assert cs(f.pop()) == (99, 'F')\n", + " assert cs(f.pop()) == (177, 'P')\n", + " assert cs(f.pop()) == (278, 'B')\n", + " return 'test_frontier ok'\n", + "\n", + "test_frontier()" + ] + }, + { + "cell_type": "code", + "execution_count": 52, + "metadata": { + "button": false, + "collapsed": true, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [], + "source": [ + "# %matplotlib inline\n", + "import matplotlib.pyplot as plt\n", + "\n", + "p = plt.plot([i**2 for i in range(10)])\n", + "plt.savefig('destination_path.eps', format='eps', dpi=1200)" + ] + }, + { + "cell_type": "code", + "execution_count": 53, + "metadata": { + "button": false, + "new_sheet": false, + "run_control": { + "read_only": false + } + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAXQAAAD8CAYAAABn919SAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4wLCBo\ndHRwOi8vbWF0cGxvdGxpYi5vcmcvpW3flQAAIABJREFUeJzt3Xl8VOW9x/HPj5AAYUkghC0QArLL\nkkAExaUVsFevC7i1oiIqGtvrde2tou29tr22pdZra691QVHZBC2CUrVerTtakYQgYd/NQoAASQhk\nT577R8aKNshkmZyZyff9evmamZMzzNch+XLyzDnPY845REQk9LXxOoCIiDQPFbqISJhQoYuIhAkV\nuohImFChi4iECRW6iEiYUKGLiIQJFbqISJhQoYuIhIm2Lfli3bt3d0lJSS35kiIiIS8jI+Ogcy7+\nZPu1aKEnJSWRnp7eki8pIhLyzOwLf/bTkIuISJhQoYuIhAkVuohImFChi4iECRW6iEiYUKGLiIQJ\nFbqISJjwq9DN7C4z22hmG8xsiZm1N7MBZrbazLab2YtmFhXosCIioebQ0Qp++ZdNlFXWBPy1Tlro\nZpYA3A6kOudGAhHAVcBvgd875wYDhcCsQAYVEQk1ldW1/GjxWhav/oLdB48F/PX8HXJpC3Qws7ZA\nNJAPTAKW+b4+H5jW/PFERELXL1/byGe7D/PQFaMZ0adLwF/vpIXunMsDHgayqSvyYiADKHLOVft2\nywUSAhVSRCTULF79BYs+zeaW7wxkanLL1KM/Qy5dganAAKAP0BG4oJ5d3Qmen2Zm6WaWXlBQ0JSs\nIiIhYfWuQzzw6kbOHRrPPf8yrMVe158hlynAbudcgXOuClgOTARifUMwAH2BvfU92Tk31zmX6pxL\njY8/6WRhIiIhLbewlH9bvJbEuGgenZ5CRBtrsdf2p9CzgdPNLNrMDJgMbALeA67w7TMTeDUwEUVE\nQkNpZTVpCzKorKnl6etS6dI+skVf358x9NXUffi5FsjyPWcucC9wt5ntAOKAeQHMKSIS1Jxz/OTP\n69m87wh/nJ7CKfGdWjyDX/OhO+ceAB74xuZdwPhmTyQiEoIef38nr2flc98Fwzh3aA9PMuhKURGR\nJvrbpv08/NZWpiX3Ie2cgZ7lUKGLiDTB9v0l3PniOkYlxDDn8tHUfdToDRW6iEgjFZVWctOCdNpH\nRvDUjHG0j4zwNI8KXUSkEaprarltSSb5ReU8NWMsvWM6eB2pZReJFhEJF7/56xY+2n6Qhy4fzbj+\n3byOA+gIXUSkwZZl5DJv1W6un5jE90/r53Wcf1Chi4g0QGZ2Ifcvz2LiKXH89MLhXsf5GhW6iIif\n9h8p55aFGfSKac+frh5LZERwVWhwpRERCVLlVTWkLczgaEU1T1+XSteOwbemjz4UFRE5Cecc9y/P\n4vOcIp68dhxDe3X2OlK9dIQuInIS81btZnlmHndNGcL5I3t5HeeEVOgiIt/ig20F/PqNzVwwshe3\nTRrkdZxvpUIXETmB3QePcdsLaxnSszMPXzmGNi04t3ljqNBFROpRUl7FzQvSiWhjPH1dKh3bBf9H\njsGfUESkhdXUOu5cuo7dB4+xaNYE+nWL9jqSX/xZU3Soma077r8jZnanmXUzs7fNbLvvtmtLBBYR\nCbRH3t7KO1sO8MDFIzjjlDiv4/jNnxWLtjrnkp1zycA4oBRYAcwG3nHODQbe8T0WEQlpf/l8L396\nbyfTx/djxun9vY7TIA0dQ58M7HTOfQFMBeb7ts8HpjVnMBGRlrYhr5ifLPuc1P5d+cUlIz2d27wx\nGlroVwFLfPd7OufyAXy33qy5JCLSDA4erSBtQTrdoqN44tpxRLUNvXNG/E5sZlHAJcCfG/ICZpZm\nZulmll5QUNDQfCIiAVdZXcuPFmVwuLSSudelEt+5ndeRGqUh/
wRdAKx1zu33Pd5vZr0BfLcH6nuS\nc26ucy7VOZcaHx/ftLQiIs3MOccDKzeyZk8hD10xhpEJMV5HarSGFPp0vhpuAVgJzPTdnwm82lyh\nRERayqLV2Sz5LJsfffcULhnTx+s4TeJXoZtZNHAesPy4zXOA88xsu+9rc5o/nohI4Hy66xC/WLmR\nScN68B/fG+p1nCbz68Ii51wpEPeNbYeoO+tFRCTk5Bwu5d8Wr6V/XDR/uCqZiCC/rN8fofcxrohI\nE5VWVnPzgnSqamp5+rpUurSP9DpSs1Chi0ir4pzjP/78Odv2l/DY1WMZGN/J60jNRoUuIq3K/767\ngzey9nHfBcP5zpDwOvNOhS4ircZbG/fxyNvbuDQlgZvOHuB1nGanQheRVmHrvhLuenEdY/rG8JvL\nRoXcZf3+UKGLSNgrPFbJzQvSiW7XlqdmpNI+MsLrSAGhQheRsFZdU8u/L1nLvuJynpoxjl4x7b2O\nFDBa4EJEwtqv3tjMxzsO8bsrRjM2MbyXbdARuoiErZfSc3ju4z3ceOYArkzt53WcgFOhi0hY+mTn\nQX62YgNnDerO/f86zOs4LUKFLiJhZ31uETfPT6d/XDSPXZ1C24jWUXWt4/9SRFqNHQeOcv1za+ja\nMYqFsyYQGx3ldaQWo0IXkbCxt6iM6+atpo3BwlkTwvqMlvqo0EUkLBw+VsmMeaspKa/m+RvGM6B7\nR68jtTidtigiIe9oRTXXP/cZuYVlLLhxfEivOtQUKnQRCWnlVTWkLUhn494jPHXtOCYMjDv5k8KU\nvysWxZrZMjPbYmabzewMM+tmZm+b2XbfbXifsS8iQae6ppY7lmbyyc66C4emjOjpdSRP+TuG/ijw\npnNuGDAG2AzMBt5xzg0G3vE9FhFpEc45frpiA/+3cT//ddEILhvb1+tInjtpoZtZF+AcYB6Ac67S\nOVcETAXm+3abD0wLVEgRkW+a8+YWXkzP4fZJg7jxrPCbCrcx/DlCHwgUAM+ZWaaZPWNmHYGezrl8\nAN9tj/qebGZpZpZuZukFBQXNFlxEWq8nP9jJUx/sYsbp/bnrvCFexwka/hR6W2As8IRzLgU4RgOG\nV5xzc51zqc651Pj48FodRERa3tLPspnz1y1cPKYPv7jk1LCc17yx/Cn0XCDXObfa93gZdQW/38x6\nA/huDwQmoohInTc35HP/iiy+MySe/7lyDG3aqMyPd9JCd87tA3LMbKhv02RgE7ASmOnbNhN4NSAJ\nRUSAj3cc5PYl60hJ7MoT144lqq2ui/wmf89Dvw1YbGZRwC7gBur+MXjJzGYB2cCVgYkoIq3d5zlF\npC1IZ0D3jjw78zSio3QJTX38elecc+uA1Hq+NLl544iIfN2OAyVc/9xndOsUxcJZ44mJjvQ6UtDS\n7ywiErTyisqYMe8zItq0YdGsCfTo0rom22ooFbqIBKVDRyuY8cxqjlZUs3DWePrHtb7JthpKhS4i\nQaekvIqZz33G3uIynr3+NIb37uJ1pJCgQheRoFJeVcPNC9LZkl/CE9eM47Skbl5HChn6qFhEgkZ1\nTS23Lcnk012HefSqZM4dVu8F6HICOkIXkaDgnGP28ize3rSfX1xyKlOTE7yOFHJU6CLiOeccv35j\nM8sycrlzymBmTkzyOlJIUqGLiOee+GAnT3+0m5ln9OeOyYO9jhOyVOgi4qkXVmfz0JtbmZrchwcu\n1mRbTaFCFxHPvJGVz09fyeLcofE8rMm2mkyFLiKe+Gh7AXcszWRcYlcev2YckRGqo6bSOygiLS4z\nu5BbFmZwSnwn5l1/Gh2iIryOFBZU6CLSorbtL+GG59cQ37kdC2aNJ6aDJttqLip0EWkxOYdLmTFv\nNVERbVh44wR6dNZkW81JhS4iLaKgpILrnv2MssoaFswaT2JctNeRwo5fl/6b2R6gBKgBqp1zqWbW\nDXgRSAL2AN93zhUGJqaIh
LIj5VVc/9xn5BeXsfimCQzrpcm2AqEhR+jnOueSnXNfLnQxG3jHOTcY\neIcGLBwtIq1HeVUNN81PZ+u+Ep68dhzj+muyrUBpypDLVGC+7/58YFrT44hIOKmuqeXfX1jLmj2H\neeQHyXx3qCbbCiR/C90Bb5lZhpml+bb1dM7lA/hu9TclIv9QW+u45+X1/G3zAX45dSSXjOnjdaSw\n5+/0uWc65/aaWQ/gbTPb4u8L+P4BSANITExsREQRCTXOOR58fTPL1+Zx93lDmHF6f68jtQp+HaE7\n5/b6bg8AK4DxwH4z6w3guz1wgufOdc6lOudS4+Pjmye1iAS1P723g2c/3s0NZyZx26RBXsdpNU5a\n6GbW0cw6f3kf+B6wAVgJzPTtNhN4NVAhRSQ0OOd45O1tPPzWNi5LSeA/LxyhybZakD9DLj2BFb6/\nlLbAC865N81sDfCSmc0CsoErAxdTRIJdba3jl69t4vlP9vD91L785rLRmmyrhZ200J1zu4Ax9Ww/\nBEwORCgRCS3VNbXMXp7FsoxcZp01gJ9dOFxH5h7QmqIi0iQV1TXcsWQdb27cx93nDeG2SYNU5h5R\noYtIo5VWVnPLwgw+2n6Q/7poBDeeNcDrSK2aCl1EGqW4rIobn19DZnYhv7tiNFem9vM6UqunQheR\nBvtyoq0dB0p4/JqxnD+yt9eRBBW6iDRQXlEZ1z6zmn3F5cybeRrnDNH1JcFChS4ifttZcJQZz6ym\npKKaRTeN10RbQUaFLiJ+2bi3mOvmfYYZLE07nVP7xHgdSb5BhS4iJ5W+5zA3PL+Gzu3asuimCQyM\n7+R1JKmHCl1EvtWH2wq4ZWEGvWPas/CmCSTEdvA6kpyACl1ETuivWfncvjSTQT06s+DG8cR3bud1\nJPkWKnQRqddL6TnMfnk9KYldefb604jpEOl1JDkJFbqI/JNnV+3ml69t4uzB3Xlqxjiio1QVoUB/\nSyLyD845/vjODn7/t22cf2ovHp2eTLu2EV7HEj+p0EUE+GqVoXmrdnPFuL7MuWwUbSOasuywtDQV\nuohQU+u4b/l6XkrP5fqJSfzXRSM0l3kIUqGLtHIV1TXc9eI63sjaxx2TB3PnlMGa/jZE+f37lJlF\nmFmmmb3mezzAzFab2XYze9HMogIXU0QCobSympsXZPBG1j5+duFw7jpviMo8hDVkgOwOYPNxj38L\n/N45NxgoBGY1ZzARCazisiqum/cZq7YX8NDlo7np7IFeR5Im8qvQzawvcCHwjO+xAZOAZb5d5gPT\nAhFQRJrfwaMVTJ/7KZ/nFvHY1WP5/mmayzwc+DuG/gfgHqCz73EcUOScq/Y9zgUS6nuimaUBaQCJ\niYmNTyoizWKvb/rbvcVlPDPzNL6j6W/DxkmP0M3sIuCAcy7j+M317Orqe75zbq5zLtU5lxofr28c\nES/tKjjKlU/+nYKSChbOmqAyDzP+HKGfCVxiZv8KtAe6UHfEHmtmbX1H6X2BvYGLKSJNtWnvEa57\ndjXOwZK00xmZoOlvw81Jj9Cdc/c55/o655KAq4B3nXPXAO8BV/h2mwm8GrCUItIkGV8c5qq5fycy\nog0v3nKGyjxMNeUysHuBu81sB3Vj6vOaJ5KINKePthdw7TOf0a1jFH/+4RkM6qG5zMNVgy4scs69\nD7zvu78LGN/8kUSkuby5YR+3L8lkYHxHFswaT4/O7b2OJAGkK0VFwtTLGbnc8/J6RveN4fnrxxMT\nrelvw50KXSQMPf/xbn7+l02cOSiOuTNS6dhOP+qtgf6WRcKIc47H3t3B/7y9je+N6Mkfp6fQPlLT\n37YWKnSRMFFdU8uv3tjMcx/v4bKxCTx0+WhNf9vKqNBFwsDhY5XctmQtH+84xI1nDuBnFw7X9Let\nkApdJMRtyCvmloUZFByt4HdXjObKVM3L0lqp0EVC2MsZudy/Iou4jlEs++EZjO4b63Uk8ZA
KXSQE\nVdXU8uBrm5j/9y84Y2Acj12dQlyndl7HEo+p0EVCzIGScm5dvJY1ewq5+ewB3Hv+MH34KYAKXSSk\nrM0u5EeLMiguq+LRq5KZmlzvrNXSSqnQRULEC6uzeWDlBnrHdGDFv41neO8uXkeSIKNCFwlyFdU1\nPPDqRpauyeGcIfH88apkYqO1hK/8MxW6SBDLLy7jh4vW8nlOEbeeewp3nzeUCJ1fLiegQhcJUqt3\nHeLWF9ZSVlnDk9eO4/yRvbyOJEFOhS4SZJxzPP/JHn71+mYS46JZmnY6g3p0PvkTpdU7aaGbWXvg\nQ6Cdb/9lzrkHzGwAsBToBqwFZjjnKgMZViTclVXWcP+KLFZk5jFleE8e+cEYurTXtLfiH39OXq0A\nJjnnxgDJwPlmdjrwW+D3zrnBQCEwK3AxRcJfzuFSLn/iE15Zl8fd5w1h7oxxKnNpEH/WFHXOuaO+\nh5G+/xwwCVjm2z4fmBaQhCKtwEfbC7j4sVXkFJYyb2Yqt08erMm1pMH8GkM3swggAxgE/AnYCRQ5\n56p9u+QCusJBpIGcczz14S4eenMLg3p04qkZqQzo3tHrWBKi/Cp051wNkGxmscAKYHh9u9X3XDNL\nA9IAEhMTGxlTJPwcq6jmnmXreT0rnwtH9eahK0ZrZSFpkoYuEl1kZu8DpwOxZtbWd5TeF9h7gufM\nBeYCpKam1lv6Iq3NnoPHSFuYzo4DR7nvgmGknTMQMw2xSNOcdAzdzOJ9R+aYWQdgCrAZeA+4wrfb\nTODVQIUUCSfvbtnPxY+t4kBJBQtunMAt3zlFZS7Nwp8j9N7AfN84ehvgJefca2a2CVhqZg8CmcC8\nAOYUCXm1tY7/fXcHf3hnGyN6d+HJa8fRr1u017EkjJy00J1z64GUerbvAsYHIpRIuDlSXsXdL37O\n3zbv57KUBH592Sgt3izNTp/AiATYjgMlpC3IIPtwKT+/eAQzJyZpiEUCQoUuEkBvbsjnxy99Toeo\nCBbfNIEJA+O8jiRhTIUuEgA1tY7/eWsrj7+/k+R+sTxx7Vh6x3TwOpaEORW6SDMrKq3k9qXr+HBb\nAdPH9+Pnl5xKu7YaL5fAU6GLNKNNe49wy6J09hdX8JvLRjF9vC6mk5ajQhdpJq+uy+Pel9cT0yGS\npbecztjErl5HklZGhS7SREcrqpnz180s+jSb8UndeOyaFHp0bu91LGmFVOgiTfDe1gP8dHkW+UfK\nuemsAdx7wTAiI/yZlVqk+anQRRqh8Fgl//3aJpZn5jGoRyeW/XAi4/priEW8pUIXaQDnHG9k7eOB\nlRsoKq3i9kmDuHXSIJ3FIkFBhS7ipwNHyvnZKxt4a9N+RiXEsODGCYzo08XrWCL/oEIXOQnnHH9O\nz+W/X99EZXUt910wjFlnDaCtxsolyKjQRb5F9qFS7luxno93HGL8gG789vLRWlFIgpYKXaQeNbWO\n5z/Zw8P/t5WINsaD00Zy9fhErfMpQU2FLvIN2/eXcM/L68nMLuLcofH86tJR9InVPCwS/E5a6GbW\nD1gA9AJqgbnOuUfNrBvwIpAE7AG+75wrDFxUkcCqrK7lyQ928ti7O+jYLoI//CCZqcl9NNWthAx/\njtCrgR8759aaWWcgw8zeBq4H3nHOzTGz2cBs4N7ARRUJnPW5RdyzbD1b9pVw8Zg+PHDxCLp3aud1\nLJEG8WfFonwg33e/xMw2AwnAVOC7vt3mA++jQpcQU1ZZwx/+to2nP9pFfOd2PH1dKueN6Ol1LJFG\nadAYupklUbcc3Wqgp6/scc7lm1mPZk8nEkCf7jrE7JfXs+dQKdPH92P2BcOJ6RDpdSyRRvO70M2s\nE/AycKdz7oi/44pmlgakASQmaipR8V5JeRVz/rqFxauzSewWzQs3TWDioO5exxJpMr8K3cwiqSvz\nxc655b7N+82st+/ovDdwoL7nOufmAnMBUlNTXTNkFmm
0d7fs56crNrDfN5nWj783lA5RumxfwoM/\nZ7kYMA/Y7Jx75LgvrQRmAnN8t68GJKFIMzh8rJJf/mUjr6zby5CenXj8momkaL5yCTP+HKGfCcwA\nssxsnW/b/dQV+UtmNgvIBq4MTESRxnPO8Zf1+fx85UZKyqu4Y/Jgbj13EFFtddm+hB9/znJZBZxo\nwHxy88YRaT77iusm0/rb5v2M6RvDb6+YwLBemkxLwpeuFJWw45xj6Zocfv36Zqpqa/nZhcO54cwB\nROiyfQlzKnQJK18cOsbsl7P4+65DnDEwjjmXj6J/nCbTktZBhS5hoabW8dzHu3n4ra1EtmnDby4b\nxVWn9dNl+9KqqNAl5G3dVzeZ1uc5RUwZ3oMHp42iV4wWaZbWR4UuIetASTlPvL+TRZ9+Qef2kfxx\negoXj+6to3JptVToEnIKSip46oOdLPz0C6prHVeM7cu9FwyjW8cor6OJeEqFLiHj0NEKnvpwFwv+\nvofK6louTenLbZMGkaQVhEQAFbqEgMPHKpnrK/LyqhqmJSdw2+TBWgpO5BtU6BK0Co9V8vRHu5j/\nyR5Kq2q4ZEwfbp88mFPiO3kdTSQoqdAl6BSXVvHMql089/EejlVWc+Go3twxeTCDe3b2OppIUFOh\nS9AoLqvi2VW7eXbVbkoqqvnXUb24Y/IQhvZSkYv4Q4UunjtSXsVzq/Ywb9UujpRXc/6pvbhjymCG\n99a8KyINoUIXzxytqOb5j3fz9Ee7KS6r4rwRPblzymBO7RPjdTSRkKRClxZ3rKKa5z/Zw9Mf7aKo\ntIopw3tw55QhjExQkYs0hQpdWkxpZTUL/v4Fcz/cxeFjlZw7NJ47pwxhTL9Yr6OJhAUVugRcWWUN\niz79gic/2MmhY5WcMySeu6YM1opBIs3MnyXongUuAg4450b6tnUDXgSSgD3A951zhYGLKaGovOrL\nIt/FwaMVnD24O3dOGcK4/ipykUDw5wj9eeAxYMFx22YD7zjn5pjZbN/je5s/noSi8qoalnyWzePv\n76SgpIKJp8TxxLVjOS2pm9fRRMKaP0vQfWhmSd/YPBX4ru/+fOB9VOitXkV1DS+uyeFP7+1g/5EK\nJgzoxmPTU5gwMM7raCKtQmPH0Hs65/IBnHP5ZtajGTNJiKmoruGl9Fwef28H+cXljE/qxu9/kMzE\nU7p7HU2kVQn4h6JmlgakASQmJgb65aQFVVbXsiwjl8fe3c7e4nLG9e/K764Yw5mD4jQnuYgHGlvo\n+82st+/ovDdw4EQ7OufmAnMBUlNTXSNfT4JI9qFSXlmXx4trcsgrKiMlMZY5l4/m7MHdVeQiHmps\noa8EZgJzfLevNlsiCUqFxyp5PSufVzLzSP+i7oSmCQO68eClI/nukHgVuUgQ8Oe0xSXUfQDa3cxy\ngQeoK/KXzGwWkA1cGciQ4o3yqhre23KAFZl5vLf1AFU1jkE9OvGTfxnKtJQEEmI7eB1RRI7jz1ku\n00/wpcnNnEWCQG2tY82ew7yyLo/X1+dzpLya7p3acd0ZSVyaksCpfbroaFwkSOlKUQFgx4ESVmTm\n8UrmXvKKyugQGcH5I3txaUoCE0+Jo21EG68jishJqNBbsYKSClZ+vpdXMvPIyiumjcFZg+P5yb8M\n5bwRPenYTt8eIqFEP7GtTGllNW9t3M+KzDxW7ThITa1jZEIX/vOiEVw8pjc9Orf3OqKINJIKvRWo\nqXV8vOMgr2Tm8ebGfZRW1pAQ24Effmcg05ITtLSbSJhQoYcp5xyb8o+wYm0eKz/fy4GSCjq3b8vU\n5D5MS07gtKRutGmjDzdFwokKPczsLSrjlXV5vJKZx7b9R4mMML47tAeXpSRw7rAetI+M8DqiiASI\nCj0MHCmv4s2sfSzPzGX17sM4B+P6d+XBaSO5cFRvunaM8jqiiLQAFXqIqqyu5cNtBazIzOPtzfup\nrK5lQPeO3DVlCNO
SE0iMi/Y6ooi0MBV6iHDOsedQKetyClmzp5C/ZuVTWFpFXMcorh6fyLSUBMb0\njdFFPyKtmAo9SBWXVrEut4jM7ELW5RTxeU4RhaVVAHSMimDS8J5cmtKHswfHE6mLfkQEFXpQqKqp\nZeu+EjJzvirwXQXHADCDIT06870RvUhJjCU5MZbBPToToTNUROQbVOgtzDlHfnE5644r76y8Ysqr\nagHo3qkdyf1iuXxsX1L6xTKqbwyd20d6nFpEQoEKPcCOVVSTlVdMZnYR63IKycwu4kBJBQBRbdsw\nsk8XrpnQn+R+sST3i6Vv1w4aBxeRRlGhN6PaWsfOgqNkZhf9Y/hk2/4San3LeiTFRTPxlDhSEruS\n3C+W4b27ENVW498i0jxU6E1w8GgF67KL6oZPcgpZn1NMSUU1AF3at2VMv1i+d2ovUvrFMqZfLN10\nPriIBFCTCt3MzgceBSKAZ5xzc5olVRApq6yhqKySwmNVFJVWssX34eW6nEJyDpcBENHGGNarM1NT\n+pDcryspibEMiOuoS+tFpEU1utDNLAL4E3AekAusMbOVzrlNzRWuOVVU11BcWkVhaV0xF5ZWUVxW\n6Xtct62otIrC0kqKy+pui0qrqKiu/ac/q3dMe1ISY5lxen9SErsysk8MHaJ0Sb2IeKspR+jjgR3O\nuV0AZrYUmAoEtNCramopLju+gL+6X+Qr6OLSrwq5qLSSorIqSitrTvhnRkYYsdFRdI2OJLZDFInd\nohndN4au0VHEREfSNTqK2A6RxERHMrB7J3rFaIpZEQk+TSn0BCDnuMe5wISmxanf/Suy+HBbAUWl\nVRz1jVHXJ6KNEdshktjoSGKjo+gT257hvbvUFbVvW6yvoGM6RNK1Y11RR0dF6MwSEQl5TSn0+hrQ\n/dNOZmlAGkBiYmKjXightgPjk7p9dbT8ZTn7yvvLI+nO7dqqmEWk1WpKoecC/Y573BfY+82dnHNz\ngbkAqamp/1T4/rj13EGNeZqISKvSlJOg1wCDzWyAmUUBVwErmyeWiIg0VKOP0J1z1Wb278D/UXfa\n4rPOuY3NlkxERBqkSeehO+feAN5opiwiItIEuu5cRCRMqNBFRMKECl1EJEyo0EVEwoQKXUQkTJhz\njbrWp3EvZlYAfNHIp3cHDjZjnFCn9+Mrei++Tu/H14XD+9HfORd/sp1atNCbwszSnXOpXucIFno/\nvqL34uv0fnxda3o/NOQiIhImVOgiImEilAp9rtcBgozej6/ovfg6vR9f12rej5AZQxcRkW8XSkfo\nIiLyLUKi0M3sfDPbamY7zGy213m8Ymb9zOw9M9tsZhvN7A6vMwUDM4sws0wze83rLF4zs1gzW2Zm\nW3zfJ2d4nckrZnaX7+dkg5mwtvk3AAACDklEQVQtMbOwXzsy6Av9uMWoLwBGANPNbIS3qTxTDfzY\nOTccOB24tRW/F8e7A9jsdYgg8SjwpnNuGDCGVvq+mFkCcDuQ6pwbSd0U31d5myrwgr7QOW4xaudc\nJfDlYtStjnMu3zm31ne/hLof1gRvU3nLzPoCFwLPeJ3Fa2bWBTgHmAfgnKt0zhV5m8pTbYEOZtYW\niKaeFdXCTSgUen2LUbfqEgMwsyQgBVjtbRLP/QG4B6j1OkgQGAgUAM/5hqCeMbOOXofygnMuD3gY\nyAbygWLn3Fvepgq8UCh0vxajbk3MrBPwMnCnc+6I13m8YmYXAQeccxleZwkSbYGxwBPOuRTgGNAq\nP3Mys67U/SY/AOgDdDSza71NFXihUOh+LUbdWphZJHVlvtg5t9zrPB47E7jEzPZQNxQ3ycwWeRvJ\nU7lArnPuy9/allFX8K3RFGC3c67AOVcFLAcmepwp4EKh0LUYtY+ZGXXjo5udc494ncdrzrn7nHN9\nnXNJ1H1fvOucC/ujsBNxzu0DcsxsqG/TZGCTh5G8lA2cbmbRvp+bybSCD4ibtKZoS
9Bi1F9zJjAD\nyDKzdb5t9/vWdhUBuA1Y7Dv42QXc4HEeTzjnVpvZMmAtdWeHZdIKrhjVlaIiImEiFIZcRETEDyp0\nEZEwoUIXEQkTKnQRkTChQhcRCRMqdBGRMKFCFxEJEyp0EZEw8f/pavD4X6i2SQAAAABJRU5ErkJg\ngg==\n", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAeYAAAHSCAYAAAA5eGh0AAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4wLCBo\ndHRwOi8vbWF0cGxvdGxpYi5vcmcvpW3flQAAIABJREFUeJzt219Mk/f///9HaxOnNULI1iIWxrSi\nk8URHfFPYjK7g2YaiOhUkkUPdrIlLNmOfX8mduLigYdkMYsHYKIBBDM2o43GGNEjdIZEl2iE6JQ/\nFlw0bI1b4aLfA3/29674nuCAvvrifjuif676fPC6ruvRq0VXMpkUAAAwgzvTAwAAgP8fxQwAgEEo\nZgAADEIxAwBgEIoZAACDUMwAABjEk+kBXseBAwcejo2N+TM9x1Rzu91jY2Nj1r1ZSiaTYy6Xy7pc\nkr1r5vF4xkZHR63LJdm7P86ZM2fMcRzrctl6jEmS2+2OffPNN/kv3p+VxTw2Nubfvn17pseYcm1t\nbW5bc8VisUyPMS38fr+1a1ZbW5vpMaZFJBKxcn/0+/1WrlkkErHyGJOktra2l15gWvkuBACAbEUx\nAwBgEIoZAACDUMwAABiEYgYAwCAUMwAABqGYAQAwCMUMAIBBKGYAAAxCMQMAYBCKGQAAg1DMAAAY\nhGIGAMAgFDMAAAahmAEAMAjFDACAQShmAAAMQjEDAGAQihkAAINQzAAAGGTWFvOVK1dUUVGhzZs3\n6+jRo+Mev3btmnbu3KmysjKdO3cudf+tW7f06aefauvWrdq2bZui0ehMjj0htma7f/++Tpw4oePH\nj+v69evjHu/v79fJkyd15MgR9fT0pO5/9OiRTp06paamJjU3N6u7u3smx34lW9dLkqLRqJYvX65g\nMKhDhw6Ne7yjo0OrV6+Wx+NRa2tr6v6uri6tX79epaWlWrVqlZqbm2dy7FeydV+U7F2zbDrOPNP+\nLxjIcRwdPHhQP/zwg/Lz81VdXa1NmzZp6dKlqecsWrRIBw4cUGNjY9q2b7zxhr777ju9/fbbGhwc\n1K5du7RhwwYtXLhwpmO8lK3ZxsbGdPnyZVVUVMjr9aqtrU3FxcXKy8tLPWfBggUKhULq6upK29bj\n8SgUCik3N1fxeFytra0qLCzU3LlzZzrGOLaul/QsW01Njc6fP69AIKDy8nJVVlZq5cqVqecUFRWp\noaFBhw8fTtt2/vz5OnbsmJYtW6b+/n6tWbNG4XBYubm5Mx1jHFv3RcneNcu242xWFvONGzdUVFSk\nwsJCSdLHH3+sixcvpi3S4sWLJUkulytt2+Li4tTPPp9PeXl5evz4sTEnQ1uzDQ4OKicnJzVLMBjU\nvXv30k6Gzx97Mdd/nxi8Xq/mzZunp0+fGnEytHW9JKmzs1PBYFBLliyRJFVXV6u9vT3tJP88g9ud\n/uFdSUlJ6ueCggL5fD4NDQ0ZcZK3dV+U7F2zbDvOZuVH2YODg8rPz0/d9vv9isVik36dGzduaGRk\nJLXYJrA1Wzwel9frTd32er2Kx+OTfp1YLCbHcZSTkzOV4702W9dLkvr6+tLmCQQC6uvrm/TrdHZ2\nKpFIpJ1EM8nWfVGyd82y7TiblVfMyWRy3H0vvkt6laGhIe3du1d1dXXj3jlmks3Z/
q14PK4LFy4o\nFApN+ncyXWxer6nINjAwoN27d6uxsdGobP+WifuiZO+aZdtxZsZvbYb5/X49fPgwdTsWi8nn8014\n+z///FM1NTX68ssv9f7770/HiK/N1mwvXpW8eNXyKolEQmfOnNHatWvT3jlnmq3rJT272nrw4EHq\ndm9vrwoKCia8/fDwsLZs2aK6ujqtW7duOkZ8Lbbui5K9a5Ztx9msLOb33ntPv/32m3p7ezUyMqKz\nZ8/qww8/nNC2IyMj+vrrr1VRUaFwODy9g74GW7P5fD49efJEw8PDchxH3d3dad/9/BPHcRSNRlVS\nUmLMR2vP2bpeklReXq47d+7o7t27SiQSampqUmVl5YS2TSQSqqqq0p49e7Rjx45pnnRybN0XJXvX\nLNuOs1n5UbbH49HevXv1xRdfyHEcVVVVKRgMqr6+XqWlpdq0aZNu3rypr776Sn/88YcuXbqk77//\nXj/++KOi0ah++eUXPXnyRO3t7ZKkuro6rVixIsOpnrE1m9vt1saNG3X69Gklk0mtWLFCeXl56uzs\n1FtvvaV33nlHg4ODikaj+vvvv3Xv3j1dvXpV1dXV6unp0cDAgP766y/dvn1bkhQKhfTmm29mOJW9\n6yU9y1ZfX69wOCzHcfTZZ5+ptLRU+/bt0wcffKDKykpdvXpVVVVVevz4sX7++WfV1tbq119/VUtL\nizo6OvT777+roaFBktTQ0KCysrLMhpK9+6Jk75pl23Hmetln76aLRCLJ7du3Z3qMKdfW1iZbc73O\nH1pkA7/fb+2a1dbWZnqMaRGJRKzcH/1+v5VrFolErDzGpNRxNu7L7ln5UTYAAKaimAEAMAjFDACA\nQShmAAAMQjEDAGAQihkAAINQzAAAGIRiBgDAIBQzAAAGoZgBADAIxQwAgEEoZgAADEIxAwBgEIoZ\nAACDUMwAABiEYgYAwCAUMwAABqGYAQAwCMUMAIBBKGYAAAxCMQMAYBCKGQAAg1DMAAAYxJVMJjM9\nw6QdPHjQGR0dte5Nhcfj0ejoaKbHmHLJZFIulyvTY0yLOXPmyHGcTI8x5WzdFyV7s7ndbo2NjWV6\njCln63pJksfjGfvPf/4zZ9z9mRjm3xodHXXX1tZmeowpF4lEZGuuWCyW6TGmhd/vt3bNbMwl2Zst\nEolo+/btmR5jyrW1tVm5XpIUiUReeoFp3VUnAADZjGIGAMAgFDMAAAahmAEAMAjFDACAQShmAAAM\nQjEDAGAQihkAAINQzAAAGIRiBgDAIBQzAAAGoZgBADAIxQwAgEEoZgAADEIxAwBgEIoZAACDUMwA\nABiEYgYAwCAUMwAABqGYAQAwCMUMAIBBZm0xR6NRLV++XMFgUIcOHRr3eEdHh1avXi2Px6PW1tbU\n/V1dXVq/fr1KS0u1atUqNTc3z+TYE2Jrtvv37+vEiRM6fvy4rl+/Pu7x/v5+nTx5UkeOHFFPT0/q\n/kePHunUqVNqampSc3Ozuru7Z3LsV7J1vSR7s9maS5KuXLmiiooKbd68WUePHh33+LVr17Rz506V\nlZXp3Llzqftv3bqlTz/9VFu3btW2bdsUjUZncuxXyqY180z7v2Agx3FUU1Oj8+fPKxAIqLy8XJWV\nlVq5cmXqOUVFRWpoaNDhw4fTtp0/f76OHTumZcuWqb+/X2vWrFE4HFZubu5Mx3gpW7ONjY3p8uXL\nqqiokNfrVVtbm4qLi5WXl5d6zoIFCxQKhdTV1ZW2rcfjUSgUUm5uruLxuFpbW1VYWKi5c+fOdIxx\nbF0vyd5stuaSnmU7ePCgfvjhB+Xn56u6ulqbNm3S0qVLU89ZtGiRDhw4oMbGxrRt33jjDX333Xd6\n++23NTg4qF27dmnDhg1auHDhTMcYJ9vWbFYWc2dnp4LBoJYsWSJJqq6uVnt7e9oiFRcXS5Lc7vQP\nFUpKSlI/FxQUyOfzaWhoyJgDy9Zsg4ODysnJS
R3kwWBQ9+7dSyvm54+5XK60bf97fq/Xq3nz5unp\n06dGFLOt6yXZm83WXJJ048YNFRUVqbCwUJL08ccf6+LFi2nFvHjxYknjj7PnmSXJ5/MpLy9Pjx8/\nNqKYs23NZuVH2X19fakdT5ICgYD6+vom/TqdnZ1KJBJpO22m2ZotHo/L6/Wmbnu9XsXj8Um/TiwW\nk+M4ysnJmcrxXput6yXZm83WXNKzN8D5+fmp236/X7FYbNKvc+PGDY2MjKT9njIp29ZsVl4xJ5PJ\ncfe9+O7vVQYGBrR79241NjaOe4eVSTZn+7fi8bguXLigUCg06d/JdLF5vWzNZmsuaWqyDQ0Nae/e\nvaqrqzMmW7atmRm/tRkWCAT04MGD1O3e3l4VFBRMePvh4WFt2bJFdXV1Wrdu3XSM+NpszfbiFfKL\nV9CvkkgkdObMGa1duzbtiiDTbF0vyd5stuaSnl0hP3z4MHU7FovJ5/NNePs///xTNTU1+vLLL/X+\n++9Px4ivJdvWbFYWc3l5ue7cuaO7d+8qkUioqalJlZWVE9o2kUioqqpKe/bs0Y4dO6Z50smzNZvP\n59OTJ080PDwsx3HU3d2d9p3WP3EcR9FoVCUlJUZ9bCjZu16SvdlszSVJ7733nn777Tf19vZqZGRE\nZ8+e1YcffjihbUdGRvT111+roqJC4XB4egedpGxbs1lZzB6PR/X19QqHw3r33Xe1c+dOlZaWat++\nffrpp58kSVevXlUgENDJkyf1+eefq7S0VJLU0tKijo4ONTQ0qKysTGVlZeP+CjiTbM3mdru1ceNG\nnT59Wk1NTVq6dKny8vLU2dmpu3fvSnr2/dixY8fU09OjS5cuqampSZLU09OjgYEB3b59Wy0tLWpp\nadGjR48yGSfF1vWS7M1may7pWba9e/fqiy++UGVlpcLhsILBoOrr63Xx4kVJ0s2bN/XRRx/p/Pnz\n+vbbb7V161ZJz/470i+//KL29nZ98skn+uSTT3Tr1q1MxknJtjVzveyzd9NFIpFkbW1tpseYcpFI\nRLbmep0/IMkGfr/f2jWzMZdkb7ZIJKLt27dneowp19bWZuV6Sal9cdyX3bPyihkAAFNRzAAAGIRi\nBgDAIBQzAAAGoZgBADAIxQwAgEEoZgAADEIxAwBgEIoZAACDUMwAABiEYgYAwCAUMwAABqGYAQAw\nCMUMAIBBKGYAAAxCMQMAYBCKGQAAg1DMAAAYhGIGAMAgFDMAAAahmAEAMAjFDACAQShmAAAM4kom\nk5meYdIOHjzojI6OWvemwuPxaHR0NNNjTDm3262xsbFMjzEtbM2WTCblcrkyPca0sDWbrblsPcYk\nye12j33zzTdzXrzfk4lh/q3R0VF3bW1tpseYcpFIRLbm2r59e6bHmBZtbW1WZmtra1MsFsv0GNPC\n7/dbmc3mXDYeY5LU1tb20gtM6646AQDIZhQzAAAGoZgBADAIxQwAgEEoZgAADEIxAwBgEIoZAACD\nUMwAABiEYgYAwCAUMwAABqGYAQAwCMUMAIBBKGYAAAxCMQMAYBCKGQAAg1DMAAAYhGIGAMAgFDMA\nAAahmAEAMAjFDACAQWZtMUejUS1fvlzBYFCHDh0a93hHR4dWr14tj8ej1tbW1P1dXV1av369SktL\ntWrVKjU3N8/k2BNia7YrV66ooqJCmzdv1tGjR8c9fu3aNe3cuVNlZWU6d+5c6v5bt27p008/1dat\nW7Vt2zZFo9GZHPuVbM0lSffv39eJEyd0/PhxXb9+fdzj/f39OnnypI4cOaKenp7U/Y8ePdKpU6fU\n1NSk5uZmdXd3z+TYr2RrLsnebNl0nHmm/V8wkOM4qqmp0fnz5xUIBFReXq7KykqtXLky9ZyioiI1\nNDTo8OHDadvOnz9fx44d07Jly9Tf3681a9YoHA4rNzd3pmO8lK3ZHMfRwYMH9cMPPyg/P1/V1dXa\ntGmTli5dm
nrOokWLdODAATU2NqZt+8Ybb+i7777T22+/rcHBQe3atUsbNmzQwoULZzrGOLbmkqSx\nsTFdvnxZFRUV8nq9amtrU3FxsfLy8lLPWbBggUKhkLq6utK29Xg8CoVCys3NVTweV2trqwoLCzV3\n7tyZjjGOrbkke7Nl23E2K4u5s7NTwWBQS5YskSRVV1ervb09rbyKi4slSW53+ocKJSUlqZ8LCgrk\n8/k0NDRkRHlJ9ma7ceOGioqKVFhYKEn6+OOPdfHixbQDa/HixZIkl8uVtu3zvJLk8/mUl5enx48f\nG1FgtuaSpMHBQeXk5KTmCQaDunfvXtpJ/vljL2b7733O6/Vq3rx5evr0qREneVtzSfZmy7bjbFZ+\nlN3X15daIEkKBALq6+ub9Ot0dnYqkUikLW6m2ZptcHBQ+fn5qdt+v1+xWGzSr3Pjxg2NjIyk/Y4y\nydZckhSPx+X1elO3vV6v4vH4pF8nFovJcRzl5ORM5XivzdZckr3Zsu04m5VXzMlkctx9L75LepWB\ngQHt3r1bjY2N4648M8nWbFORa2hoSHv37lVdXR25skQ8HteFCxcUCoUm/Xsxma25JDOzZdtxZtdR\nPEGBQEAPHjxI3e7t7VVBQcGEtx8eHtaWLVtUV1endevWTceIr83WbH6/Xw8fPkzdjsVi8vl8E97+\nzz//VE1Njb788ku9//770zHia7E1lzT+auvFq7FXSSQSOnPmjNauXZt2tZNptuaS7M2WbcfZrCzm\n8vJy3blzR3fv3lUikVBTU5MqKysntG0ikVBVVZX27NmjHTt2TPOkk2drtvfee0+//fabent7NTIy\norNnz+rDDz+c0LYjIyP6+uuvVVFRoXA4PL2DTpKtuaRn38c9efJEw8PDchxH3d3dad/X/RPHcRSN\nRlVSUmLM1ynP2ZpLsjdbth1ns/KjbI/Ho/r6eoXDYTmOo88++0ylpaXat2+fPvjgA1VWVurq1auq\nqqrS48eP9fPPP6u2tla//vqrWlpa1NHRod9//10NDQ2SpIaGBpWVlWU21P/H1mwej0d79+7VF198\nIcdxVFVVpWAwqPr6epWWlmrTpk26efOmvvrqK/3xxx+6dOmSvv/+e/3444+KRqP65Zdf9OTJE7W3\nt0uS6urqtGLFigynsjeX9OyPCzdu3KjTp08rmUxqxYoVysvLU2dnp9566y298847GhwcVDQa1d9/\n/6179+7p6tWrqq6uVk9PjwYGBvTXX3/p9u3bkqRQKKQ333wzw6nszSXZmy3bjjPXyz57N10kEknW\n1tZmeowpF4lEZGuu7du3Z3qMadHW1mZltra2ttf645hs8Lp/+GM6m3PZeIxJz46z2tracV92z8qP\nsgEAMBXFDACAQShmAAAMQjEDAGAQihkAAINQzAAAGIRiBgDAIBQzAAAGoZgBADAIxQwAgEEoZgAA\nDEIxAwBgEIoZAACDUMwAABiEYgYAwCAUMwAABqGYAQAwCMUMAIBBKGYAAAxCMQMAYBCKGQAAg1DM\nAAAYhGIGAMAgrmQymekZJm3//v2Oy+Wy7k3FnDlz5DhOpseYch6PR6Ojo5keY1rYmi2ZTMrlcmV6\njGlhazZbzx+2rpckJZPJsf3798958X5PJob5t1wulzsWi2V6jCnn9/tVW1ub6TGmXCQSsTKXZG+2\nSCQiG48x6dlxZmM2m88fNq6XJPn9/pdeYFp31QkAQDajmAEAMAjFDACAQShmAAAMQjEDAGAQihkA\nAINQzAAAGIRiBgDAIBQzAAAGoZgBADAIxQwAgEEoZgAADEIxAwBgEIoZAACDUMwAABiEYgYAwCAU\nMwAABqGYAQAwCMUMAIBBKGYAAAwya4v5/v37OnHihI4fP67r16+Pe7y/v18nT57UkSNH1NPTk7r/\n0aNHOnXqlJqamtTc3Kzu7u6ZHHtCotGoli9frmAwqEOHDo17vKOjQ6tXr5b
H41Fra2vq/q6uLq1f\nv16lpaVatWqVmpubZ3LsVyJXduWS7D3ObM0l2bs/ZtOaeab9XzDQ2NiYLl++rIqKCnm9XrW1tam4\nuFh5eXmp5yxYsEChUEhdXV1p23o8HoVCIeXm5ioej6u1tVWFhYWaO3fuTMd4KcdxVFNTo/PnzysQ\nCKi8vFyVlZVauXJl6jlFRUVqaGjQ4cOH07adP3++jh07pmXLlqm/v19r1qxROBxWbm7uTMcYh1zZ\nlUuy9zizNZdk7/6YbWs2K4t5cHBQOTk5WrhwoSQpGAzq3r17aYv0/DGXy5W27X/vZF6vV/PmzdPT\np0+NObA6OzsVDAa1ZMkSSVJ1dbXa29vTDqzi4mJJktud/oFJSUlJ6ueCggL5fD4NDQ0ZcWCRK7ty\nSfYeZ7bmkuzdH7NtzWblR9nxeFxerzd12+v1Kh6PT/p1YrGYHMdRTk7OVI73r/T19amwsDB1OxAI\nqK+vb9Kv09nZqUQioaVLl07leK+NXP/MtFySvceZrbkke/fHbFuzWXnFPBXi8bguXLigUCg07h1W\nJiWTyXH3TXa+gYEB7d69W42NjePeFWcKuf43E3NNFVOPs3/L1Fzsj//bTK6ZPb+1SXjx3dKL76Ze\nJZFI6MyZM1q7dq3y8/OnY8TXFggE9ODBg9Tt3t5eFRQUTHj74eFhbdmyRXV1dVq3bt10jPhayPVy\npuaS7D3ObM0l2bs/Ztuazcpi9vl8evLkiYaHh+U4jrq7u1Pfm7yK4ziKRqMqKSkx5mOa/1ZeXq47\nd+7o7t27SiQSampqUmVl5YS2TSQSqqqq0p49e7Rjx45pnnRyyDWeybkke48zW3NJ9u6P2bZms/Kj\nbLfbrY0bN+r06dNKJpNasWKF8vLy1NnZqbfeekvvvPOOBgcHFY1G9ffff+vevXu6evWqqqur1dPT\no4GBAf3111+6ffu2JCkUCunNN9/McKpnPB6P6uvrFQ6H5TiOPvvsM5WWlmrfvn364IMPVFlZqatX\nr6qqqkqPHz/Wzz//rNraWv36669qaWlRR0eHfv/9dzU0NEiSGhoaVFZWltlQIle25ZLsPc5szSXZ\nuz9m25q5XvadgukikUgyFotleowp5/f7VVtbm+kxplwkErEyl2RvtkgkIhuPMenZcWZjNpvPHzau\nl5Ras3FfWM/Kj7IBADAVxQwAgEEoZgAADEIxAwBgEIoZAACDUMwAABiEYgYAwCAUMwAABqGYAQAw\nCMUMAIBBKGYAAAxCMQMAYBCKGQAAg1DMAAAYhGIGAMAgFDMAAAahmAEAMAjFDACAQShmAAAMQjED\nAGAQihkAAINQzAAAGIRiBgDAIK5kMpnpGSatrq7OcRzHujcVHo9Ho6OjmR5jyrndbo2NjWV6jGlh\n65olk0m5XK5MjzEt5syZI8dxMj3GlLN1X7T5/OF2u8e++eabOS/e78nEMP+W4zju2traTI8x5SKR\niGzNtX379kyPMS3a2tqsXbNYLJbpMaaF3++3ds1szWXx+eOlF5jWXXUCAJDNKGYAAAxCMQMAYBCK\nGQAAg1DMAAAYhGIGAMAgFDMAAAahmAEAMAjFDACAQShmAAAMQjEDAGAQihkAAINQzAAAGIRiBgDA\nIBQzAAAGoZgBADAIxQwAgEEoZgAADEIxAwBgEIoZAACDUMwAABhk1hZzNBrV8uXLFQwGdejQoXGP\nd3R0aPXq1fJ4PGptbU3d39XVpfXr16u0tFSrVq1Sc3PzTI49IbZmu3LliioqKrR582YdPXp03OPX\nrl3Tzp07VVZWpnPnzqXuv3Xrlj799FNt3bpV27ZtUzQancmxX8nW9ZKk+/fv68SJEzp+/LiuX78+\n7vH+/n6dPHlSR44cUU9PT+r+R48e6dSpU2pqalJzc7O6u7tncuxXsnnNbM2WTecPz7T/CwZyHEc1\nNTU6f/68AoGAysvLVVlZqZUrV6aeU1R
UpIaGBh0+fDht2/nz5+vYsWNatmyZ+vv7tWbNGoXDYeXm\n5s50jJeyNZvjODp48KB++OEH5efnq7q6Wps2bdLSpUtTz1m0aJEOHDigxsbGtG3feOMNfffdd3r7\n7bc1ODioXbt2acOGDVq4cOFMxxjH1vWSpLGxMV2+fFkVFRXyer1qa2tTcXGx8vLyUs9ZsGCBQqGQ\nurq60rb1eDwKhULKzc1VPB5Xa2urCgsLNXfu3JmOMY7Na2Zrtmw7f8zKYu7s7FQwGNSSJUskSdXV\n1Wpvb0/b+YqLiyVJbnf6hwolJSWpnwsKCuTz+TQ0NGTEzifZm+3GjRsqKipSYWGhJOnjjz/WxYsX\n0w6sxYsXS5JcLlfats/zSpLP51NeXp4eP35sRDHbul6SNDg4qJycnNTvORgM6t69e2nF/PyxF9fs\nvzN4vV7NmzdPT58+NaKYbV4zW7Nl2/ljVn6U3dfXl1ogSQoEAurr65v063R2diqRSKQtbqbZmm1w\ncFD5+fmp236/X7FYbNKvc+PGDY2MjKT9jjLJ1vWSpHg8Lq/Xm7rt9XoVj8cn/TqxWEyO4ygnJ2cq\nx3ttNq+Zrdmy7fwxK6+Yk8nkuPtefJf0KgMDA9q9e7caGxvHvXPMJFuzTUWuoaEh7d27V3V1dVbl\nMnG9pko8HteFCxcUCoUm/XuZLjavma3Zsu38YcZvbYYFAgE9ePAgdbu3t1cFBQUT3n54eFhbtmxR\nXV2d1q1bNx0jvjZbs/n9fj18+DB1OxaLyefzTXj7P//8UzU1Nfryyy/1/vvvT8eIr8XW9ZLGXyG/\neAX9KolEQmfOnNHatWvTrnYyzeY1szVbtp0/ZmUxl5eX686dO7p7964SiYSamppUWVk5oW0TiYSq\nqqq0Z88e7dixY5onnTxbs7333nv67bff1Nvbq5GREZ09e1YffvjhhLYdGRnR119/rYqKCoXD4ekd\ndJJsXS/p2fdxT5480fDwsBzHUXd3d9r3df/EcRxFo1GVlJQY83Hoczavma3Zsu38MSuL2ePxqL6+\nXuFwWO+++6527typ0tJS7du3Tz/99JMk6erVqwoEAjp58qQ+//xzlZaWSpJaWlrU0dGhhoYGlZWV\nqaysbNxflGaSrdk8Ho/27t2rL774QpWVlQqHwwoGg6qvr9fFixclSTdv3tRHH32k8+fP69tvv9XW\nrVslPfvvH7/88ova29v1ySef6JNPPtGtW7cyGSfF1vWSnv1x0MaNG3X69Gk1NTVp6dKlysvLU2dn\np+7evSvp2Xd/x44dU09Pjy5duqSmpiZJUk9PjwYGBnT79m21tLSopaVFjx49ymScFJvXzNZs2Xb+\ncL3ss3fTRSKRZG1tbabHmHKRSES25tq+fXumx5gWbW1t1q7Z6/xxTDbw+/3WrpmtuSw/f4z7sntW\nXjEDAGAqihkAAINQzAAAGIRiBgDAIBQzAAAGoZgBADAIxQwAgEEoZgAADEIxAwBgEIoZAACDUMwA\nABiEYgYAwCAUMwAABqGYAQAwCMUMAIBBKGYAAAxCMQMAYBCKGQAAg1DMAAAYhGIGAMAgFDMAAAah\nmAEAMAjFDACAQVzJZDLTM0zagQMHnLGxMeveVHg8Ho2OjmZ6jClnay5JcrvdGhsby/QYU87WXJK9\n2Ww9zmxdL0lyu91j33zzzZy3SXG9AAARWUlEQVQX7/dkYph/a2xszL19+/ZMjzHl2traVFtbm+kx\nplwkErEyl/Qsm637oo25JHuz2Xz+sHG9JKmtre2lF5jWXXUCAJDNKGYAAAxCMQMAYBCKGQAAg1DM\nAAAYhGIGAMAgFDMAAAahmAEAMAjFDACAQShmAAAMQjEDAGAQihkAAINQzAAAGIRiBgDAIBQzAAAG\noZgBADAIxQwAgEEoZgAADEIxAwBgEIoZAACDzNpivnLliioqKrR582YdPXp03OPXrl3Tzp07VVZW\npnP
nzqXuv3Xrlj799FNt3bpV27ZtUzQancmxJyQajWr58uUKBoM6dOjQuMc7Ojq0evVqeTwetba2\npu7v6urS+vXrVVpaqlWrVqm5uXkmx34lW3PZvC/ams3WXBLHmQlr5pn2f8FAjuPo4MGD+uGHH5Sf\nn6/q6mpt2rRJS5cuTT1n0aJFOnDggBobG9O2feONN/Tdd9/p7bff1uDgoHbt2qUNGzZo4cKFMx3j\npRzHUU1Njc6fP69AIKDy8nJVVlZq5cqVqecUFRWpoaFBhw8fTtt2/vz5OnbsmJYtW6b+/n6tWbNG\n4XBYubm5Mx1jHJtz2bwv2pjN1lwSx5kpazYri/nGjRsqKipSYWGhJOnjjz/WxYsX0xZp8eLFkiSX\ny5W2bXFxcepnn8+nvLw8PX782JgDq7OzU8FgUEuWLJEkVVdXq729Pe3Aep7B7U7/wKSkpCT1c0FB\ngXw+n4aGhow4sGzNZfO+aGs2W3NJHGeSGWs2Kz/KHhwcVH5+fuq23+9XLBab9OvcuHFDIyMjqcU2\nQV9fX9o8gUBAfX19k36dzs5OJRKJtB03k2zNZfO+aGs2W3NJHGevMlNrNiuvmJPJ5Lj7XnyX9CpD\nQ0Pau3ev6urqxr1zzKSpyDYwMKDdu3ersbHRmGzk+t9s3hdNzGZrLonj7J/M5JqZ8VubYX6/Xw8f\nPkzdjsVi8vl8E97+zz//VE1Njb788ku9//770zHiawsEAnrw4EHqdm9vrwoKCia8/fDwsLZs2aK6\nujqtW7duOkZ8LbbmsnlftDWbrbkkjrP/ZabXbFYW83vvvafffvtNvb29GhkZ0dmzZ/Xhhx9OaNuR\nkRF9/fXXqqioUDgcnt5BX0N5ebnu3Lmju3fvKpFIqKmpSZWVlRPaNpFIqKqqSnv27NGOHTumedLJ\nsTWXzfuirdlszSVxnL1MJtZsVhazx+PR3r179cUXX6iyslLhcFjBYFD19fW6ePGiJOnmzZv66KOP\ndP78eX377bfaunWrpGf/leCXX35Re3u7PvnkE33yySe6detWJuOk8Xg8qq+vVzgc1rvvvqudO3eq\ntLRU+/bt008//SRJunr1qgKBgE6ePKnPP/9cpaWlkqSWlhZ1dHSooaFBZWVlKisrU1dXVybjpNic\ny+Z90cZstuaSOM5MWTPXyz57N10kEklu374902NMuba2NtXW1mZ6jCkXiUSszCU9y2brvmhjLsne\nbDafP2xcLym1ZuO+7J6VV8wAAJiKYgYAwCAUMwAABqGYAQAwCMUMAIBBKGYAAAxCMQMAYBCKGQAA\ng1DMAAAYhGIGAMAgFDMAAAahmAEAMAjFDACAQShmAAAMQjEDAGAQihkAAINQzAAAGIRiBgDAIBQz\nAAAGoZgBADAIxQwAgEEoZgAADEIxAwBgEFcymcz0DJO2f/9+x+VyWfemYs6cOXIcJ9NjTDmPx6PR\n0dFMjzEtbM1may7J3mzJZFIulyvTY0w5W3NJUjKZHNu/f/+cF+/3ZGKYf8vlcrljsVimx5hyfr9f\ntbW1mR5jykUiEStzSfZmszWXZG+2SCQiW8+LNuaSJL/f/9ILTOuuOgEAyGYUMwAABqGYAQAwCMUM\nAIBBKGYAAAxCMQMAYBCKGQAAg1DMAAAYhGIGAMAgFDMAAAahmAEAMAjFDACAQShmAAAMQjEDAGAQ\nihkAAINQzAAAGIRiBgDAIBQzAAAGoZgBADAIxQwAgEEoZgAADDJri/n+/fs6ceKEjh8/ruvXr497\nvL+/XydPntSRI0fU09OTuv/Ro0c6deqUmpqa1NzcrO7u7pkce0Ki0aiWL1+uYDCoQ4cOjXu8o6ND\nq1evlsfjUWtra+r+rq4urV+/XqWlpVq1apWam5tncuxXIld25ZLszWZrLsnec2M25fJM+79goLGx\nMV2+fFkVFRXyer1qa2tTcXGx8vLyUs9ZsGCBQqGQurq60rb1eDwKh
ULKzc1VPB5Xa2urCgsLNXfu\n3JmO8VKO46impkbnz59XIBBQeXm5KisrtXLlytRzioqK1NDQoMOHD6dtO3/+fB07dkzLli1Tf3+/\n1qxZo3A4rNzc3JmOMQ65siuXZG82W3NJ9p4bsy3XrCzmwcFB5eTkaOHChZKkYDCoe/fupS3S88dc\nLlfatv99AHm9Xs2bN09Pnz41YueTpM7OTgWDQS1ZskSSVF1drfb29rSTRnFxsSTJ7U7/wKSkpCT1\nc0FBgXw+n4aGhow4aZAru3JJ9mazNZdk77kx23LNyo+y4/G4vF5v6rbX61U8Hp/068RiMTmOo5yc\nnKkc71/p6+tTYWFh6nYgEFBfX9+kX6ezs1OJREJLly6dyvFeG7n+mWm5JHuz2ZpLsvfcmG25ZuUV\n81SIx+O6cOGCQqHQuHdYmZRMJsfdN9n5BgYGtHv3bjU2No57x58p5PrfTMwl2ZvN1lxTxdRz4781\nk7ns2iMm6MV3Sy++m3qVRCKhM2fOaO3atcrPz5+OEV9bIBDQgwcPUrd7e3tVUFAw4e2Hh4e1ZcsW\n1dXVad26ddMx4msh18uZmkuyN5utuSR7z43ZlmtWFrPP59OTJ080PDwsx3HU3d2d+k7oVRzHUTQa\nVUlJiVEfQT1XXl6uO3fu6O7du0okEmpqalJlZeWEtk0kEqqqqtKePXu0Y8eOaZ50csg1nsm5JHuz\n2ZpLsvfcmG25ZmUxu91ubdy4UadPn1ZTU5OWLl2qvLw8dXZ26u7du5Ke/bHAsWPH1NPTo0uXLqmp\nqUmS1NPTo4GBAd2+fVstLS1qaWnRo0ePMhknjcfjUX19vcLhsN59913t3LlTpaWl2rdvn3766SdJ\n0tWrVxUIBHTy5El9/vnnKi0tlSS1tLSoo6NDDQ0NKisrU1lZ2bi/UMwUcmVXLsnebLbmkuw9N2Zb\nLtfLvi8xXSQSScZisUyPMeX8fr9qa2szPcaUi0QiVuaS7M1may7J3myRSES2nhdtzCWlzvnjvrCe\nlVfMAACYimIGAMAgFDMAAAahmAEAMAjFDACAQShmAAAMQjEDAGAQihkAAINQzAAAGIRiBgDAIBQz\nAAAGoZgBADAIxQwAgEEoZgAADEIxAwBgEIoZAACDUMwAABiEYgYAwCAUMwAABqGYAQAwCMUMAIBB\nKGYAAAziSiaTmZ5h0vbv3++4XC7r3lQkk0m5XK5MjzHl5syZI8dxMj3GtLB1zdxut8bGxjI9xrTw\neDwaHR3N9BhTztZ90ebzx5w5c8b+7//+b86L93syMcy/5XK53LFYLNNjTDm/3y9bc9XW1mZ6jGkR\niUSsXbPt27dneoxp0dbWZuX+aPO+aON6SVIkEnnpBaZ1V50AAGQzihkAAINQzAAAGIRiBgDAIBQz\nAAAGoZgBADAIxQwAgEEoZgAADEIxAwBgEIoZAACDUMwAABiEYgYAwCAUMwAABqGYAQAwCMUMAIBB\nKGYAAAxCMQMAYBCKGQAAg1DMAAAYhGIGAMAgFDMAAAaZtcV8//59nThxQsePH9f169fHPd7f36+T\nJ0/qyJEj6unpSd3/6NEjnTp1Sk1NTWpublZ3d/dMjj0htmaLRqNavny5gsGgDh06NO7xjo4OrV69\nWh6PR62tran7u7q6tH79epWWlmrVqlVqbm6eybFfydb1kqQrV66ooqJCmzdv1tGjR8c9fu3aNe3c\nuVNlZWU6d+5c6v5bt27p008/1datW7Vt2zZFo9GZHPuVbN0XJXv3x2xaM8+0/wsGGhsb0+XLl1VR\nUSGv16u2tjYVFxcrLy8v9ZwFCxYoFAqpq6srbVuPx6NQKKTc3FzF43G1traqsLBQc+fOnekYL2Vr\nNsdxVFNTo/PnzysQCKi8vFyVlZVauXJl6jlFRUVqaGjQ4cOH07adP3++jh07pmXLlqm/v19r1qxR\nOBxWbm7uTMcYx9b1kp6t2cGDB
/XDDz8oPz9f1dXV2rRpk5YuXZp6zqJFi3TgwAE1NjambfvGG2/o\nu+++09tvv63BwUHt2rVLGzZs0MKFC2c6xji27ouSvftjtq3ZrCzmwcFB5eTkpA7yYDCoe/fupe18\nzx9zuVxp2/73Yni9Xs2bN09Pnz41YueT7M3W2dmpYDCoJUuWSJKqq6vV3t6edmAVFxdLktzu9A+C\nSkpKUj8XFBTI5/NpaGjIiJOhreslSTdu3FBRUZEKCwslSR9//LEuXryYVsyLFy+WND7b87WUJJ/P\np7y8PD1+/NiIYrZ1X5Ts3R+zbc1m5UfZ8XhcXq83ddvr9Soej0/6dWKxmBzHUU5OzlSO96/Ymq2v\nry91gpekQCCgvr6+Sb9OZ2enEolEWjlkkq3rJT07yefn56du+/1+xWKxSb/OjRs3NDIykrb+mWTr\nvijZuz9m25rNyivmqRCPx3XhwgWFQqFx7xyznYnZksnkuPsmO9vAwIB2796txsbGce+Ks5mJ6yVN\nzZoNDQ1p7969qqurM2bN2Bf/mYn7Y7atmV17xAS9+C7wxXeJr5JIJHTmzBmtXbs27YrABLZmCwQC\nevDgQep2b2+vCgoKJrz98PCwtmzZorq6Oq1bt246Rnwttq6X9OwK+eHDh6nbsVhMPp9vwtv/+eef\nqqmp0Zdffqn3339/OkZ8Lbbui5K9+2O2rdmsLGafz6cnT55oeHhYjuOou7s77Tutf+I4jqLRqEpK\nSoz6COo5W7OVl5frzp07unv3rhKJhJqamlRZWTmhbROJhKqqqrRnzx7t2LFjmiedHFvXS5Lee+89\n/fbbb+rt7dXIyIjOnj2rDz/8cELbjoyM6Ouvv1ZFRYXC4fD0DjpJtu6Lkr37Y7at2az8KNvtdmvj\nxo06ffq0ksmkVqxYoby8PHV2duqtt97SO++8o8HBQUWjUf3999+6d++erl69qurqavX09GhgYEB/\n/fWXbt++LUkKhUJ68803M5zqGVuzeTwe1dfXKxwOy3EcffbZZyotLdW+ffv0wQcfqLKyUlevXlVV\nVZUeP36sn3/+WbW1tfr111/V0tKijo4O/f7772poaJAkNTQ0qKysLLOhZO96Sc/WbO/evfriiy/k\nOI6qqqoUDAZVX1+v0tJSbdq0STdv3tRXX32lP/74Q5cuXdL333+vH3/8UdFoVL/88ouePHmi9vZ2\nSVJdXZ1WrFiR4VT27ouSvftjtq2Z62WfvZsuEokkX+ePSEz3un8cYzq/36/a2tpMjzEtIpGItWu2\nffv2TI8xLdra2qzcH23eF21cL+nZmtXW1o77sntWfpQNAICpKGYAAAxCMQMAYBCKGQAAg1DMAAAY\nhGIGAMAgFDMAAAahmAEAMAjFDACAQShmAAAMQjEDAGAQihkAAINQzAAAGIRiBgDAIBQzAAAGoZgB\nADAIxQwAgEEoZgAADEIxAwBgEIoZAACDUMwAABiEYgYAwCAUMwAABnElk8lMzzBp+/fvf+hyufyZ\nnmOqJZPJMZfLZd2bpTlz5ow5jmNdLsneNXO73WNjY2PW5ZIkj8czNjo6al02W/dFm88fHo8n9p//\n/Cf/xfuzspgBALCVle9CAADIVhQzAAAGoZgBADAIxQwAgEEoZgAADEIxAwBgEIoZAACDUMwAABiE\nYgYAwCAUMwAABqGYAQAwCMUMAIBBKGYAAAxCMQMAYBCKGQAAg1DMAAAYhGIGAMAgFDMAAAahmAEA\nMAjFDACAQShmAAAMQjEDAGCQ/wdJuZEoaHGMKwAAAABJRU5ErkJggg==\n", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "import itertools\n", + "import random\n", + "# 
http://stackoverflow.com/questions/10194482/custom-matplotlib-plot-chess-board-like-table-with-colored-cells\n", + "\n", + "from matplotlib.table import Table\n", + "\n", + "def main():\n", + " grid_table(8, 8)\n", + " plt.axis('scaled')\n", + " plt.show()\n", + "\n", + "def grid_table(nrows, ncols):\n", + " fig, ax = plt.subplots()\n", + " ax.set_axis_off()\n", + " colors = ['white', 'lightgrey', 'dimgrey']\n", + " tb = Table(ax, bbox=[0,0,2,2])\n", + " for i, j in itertools.product(range(ncols), range(nrows)):\n", + " tb.add_cell(i, j, 2./ncols, 2./nrows, text='{:0.2f}'.format(0.1234), \n", + " loc='center', facecolor=random.choice(colors), edgecolor='grey') # facecolors=\n", + " ax.add_table(tb)\n", + " #ax.plot([0, .3], [.2, .2])\n", + " #ax.add_line(plt.Line2D([0.3, 0.5], [0.7, 0.7], linewidth=2, color='blue'))\n", + " return fig\n", + "\n", + "main()" + ] + }, + { + "cell_type": "code", + "execution_count": 54, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "import collections\n", + "class defaultkeydict(collections.defaultdict):\n", + " \"\"\"Like defaultdict, but the default_factory is a function of the key.\n", + " >>> d = defaultkeydict(abs); d[-42]\n", + " 42\n", + " \"\"\"\n", + " def __missing__(self, key):\n", + " self[key] = self.default_factory(key)\n", + " return self[key]" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "# Simulated Annealing visualisation using TSP\n", + "\n", + "Applying simulated annealing to the traveling salesman problem to find the shortest tour visiting all the cities in Romania. The distance between two cities is taken as the Euclidean distance."
+ ] + }, + { + "cell_type": "code", + "execution_count": 60, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "class TSP_problem(Problem):\n", + "\n", + " '''\n", + " Subclass of Problem that defines the functions needed for the traveling salesman problem\n", + " '''\n", + "\n", + " def two_opt(self, state):\n", + " '''\n", + " Neighbour-generating function for the traveling salesman problem\n", + " '''\n", + " state2 = state[:]\n", + " l = random.randint(0, len(state2) - 1)\n", + " r = random.randint(0, len(state2) - 1)\n", + " if l > r:\n", + " l, r = r, l\n", + " state2[l : r + 1] = reversed(state2[l : r + 1])\n", + " return state2\n", + "\n", + " def actions(self, state):\n", + " '''\n", + " actions that can be executed in the given state\n", + " '''\n", + " return [self.two_opt]\n", + " \n", + " def result(self, state, action):\n", + " '''\n", + " result of applying the given action to the given state\n", + " '''\n", + " return action(state)\n", + "\n", + " def path_cost(self, c, state1, action, state2):\n", + " '''\n", + " total distance of the tour the Traveling Salesman covers in state2\n", + " '''\n", + " cost = 0\n", + " for i in range(len(state2) - 1):\n", + " cost += distances[state2[i]][state2[i + 1]]\n", + " cost += distances[state2[0]][state2[-1]]\n", + " return cost\n", + " \n", + " def value(self, state):\n", + " '''\n", + " negation of the path cost for the given state, since simulated annealing maximises value\n", + " '''\n", + " return -1 * self.path_cost(None, None, None, state)\n" + ] + }, + { + "cell_type": "code", + "execution_count": 61, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def init():\n", + " '''\n", + " Initialisation function for the matplotlib animation\n", + " '''\n", + " line.set_data([], [])\n", + " for name, coordinates in romania_map.locations.items():\n", + " ax.annotate(\n", + " name,\n", + " xy=coordinates, xytext=(-10, 5), textcoords='offset points', size=10)\n", + " text.set_text(\"Cost = 0 i = 0\")\n", + "\n", + " return line, \n", + "\n", +
"def animate(i):\n", + " '''\n", + " Animation function to set next path and print its cost.\n", + " '''\n", + " x, y = [], []\n", + " for name in states[i]:\n", + " x.append(romania_map.locations[name][0])\n", + " y.append(romania_map.locations[name][1])\n", + " x.append(romania_map.locations[states[i][0]][0])\n", + " y.append(romania_map.locations[states[i][0]][1])\n", + " line.set_data(x,y) \n", + " text.set_text(\"Cost = \" + str('{:.2f}'.format(TSP_problem.path_cost(None, None, None, None, states[i]))))\n", + " return line," + ] + }, + { + "cell_type": "code", + "execution_count": 62, + "metadata": {}, + "outputs": [ + { + "data": { + "application/javascript": [ + "/* Put everything inside the global mpl namespace */\n", + "window.mpl = {};\n", + "\n", + "\n", + "mpl.get_websocket_type = function() {\n", + " if (typeof(WebSocket) !== 'undefined') {\n", + " return WebSocket;\n", + " } else if (typeof(MozWebSocket) !== 'undefined') {\n", + " return MozWebSocket;\n", + " } else {\n", + " alert('Your browser does not have WebSocket support.' +\n", + " 'Please try Chrome, Safari or Firefox ≥ 6. ' +\n", + " 'Firefox 4 and 5 are also supported but you ' +\n", + " 'have to enable WebSockets in about:config.');\n", + " };\n", + "}\n", + "\n", + "mpl.figure = function(figure_id, websocket, ondownload, parent_element) {\n", + " this.id = figure_id;\n", + "\n", + " this.ws = websocket;\n", + "\n", + " this.supports_binary = (this.ws.binaryType != undefined);\n", + "\n", + " if (!this.supports_binary) {\n", + " var warnings = document.getElementById(\"mpl-warnings\");\n", + " if (warnings) {\n", + " warnings.style.display = 'block';\n", + " warnings.textContent = (\n", + " \"This browser does not support binary websocket messages. 
\" +\n", + " \"Performance may be slow.\");\n", + " }\n", + " }\n", + "\n", + " this.imageObj = new Image();\n", + "\n", + " this.context = undefined;\n", + " this.message = undefined;\n", + " this.canvas = undefined;\n", + " this.rubberband_canvas = undefined;\n", + " this.rubberband_context = undefined;\n", + " this.format_dropdown = undefined;\n", + "\n", + " this.image_mode = 'full';\n", + "\n", + " this.root = $('
    ');\n", + " this._root_extra_style(this.root)\n", + " this.root.attr('style', 'display: inline-block');\n", + "\n", + " $(parent_element).append(this.root);\n", + "\n", + " this._init_header(this);\n", + " this._init_canvas(this);\n", + " this._init_toolbar(this);\n", + "\n", + " var fig = this;\n", + "\n", + " this.waiting = false;\n", + "\n", + " this.ws.onopen = function () {\n", + " fig.send_message(\"supports_binary\", {value: fig.supports_binary});\n", + " fig.send_message(\"send_image_mode\", {});\n", + " if (mpl.ratio != 1) {\n", + " fig.send_message(\"set_dpi_ratio\", {'dpi_ratio': mpl.ratio});\n", + " }\n", + " fig.send_message(\"refresh\", {});\n", + " }\n", + "\n", + " this.imageObj.onload = function() {\n", + " if (fig.image_mode == 'full') {\n", + " // Full images could contain transparency (where diff images\n", + " // almost always do), so we need to clear the canvas so that\n", + " // there is no ghosting.\n", + " fig.context.clearRect(0, 0, fig.canvas.width, fig.canvas.height);\n", + " }\n", + " fig.context.drawImage(fig.imageObj, 0, 0);\n", + " };\n", + "\n", + " this.imageObj.onunload = function() {\n", + " fig.ws.close();\n", + " }\n", + "\n", + " this.ws.onmessage = this._make_on_message_function(this);\n", + "\n", + " this.ondownload = ondownload;\n", + "}\n", + "\n", + "mpl.figure.prototype._init_header = function() {\n", + " var titlebar = $(\n", + " '
    ');\n", + " var titletext = $(\n", + " '
    ');\n", + " titlebar.append(titletext)\n", + " this.root.append(titlebar);\n", + " this.header = titletext[0];\n", + "}\n", + "\n", + "\n", + "\n", + "mpl.figure.prototype._canvas_extra_style = function(canvas_div) {\n", + "\n", + "}\n", + "\n", + "\n", + "mpl.figure.prototype._root_extra_style = function(canvas_div) {\n", + "\n", + "}\n", + "\n", + "mpl.figure.prototype._init_canvas = function() {\n", + " var fig = this;\n", + "\n", + " var canvas_div = $('
    ');\n", + "\n", + " canvas_div.attr('style', 'position: relative; clear: both; outline: 0');\n", + "\n", + " function canvas_keyboard_event(event) {\n", + " return fig.key_event(event, event['data']);\n", + " }\n", + "\n", + " canvas_div.keydown('key_press', canvas_keyboard_event);\n", + " canvas_div.keyup('key_release', canvas_keyboard_event);\n", + " this.canvas_div = canvas_div\n", + " this._canvas_extra_style(canvas_div)\n", + " this.root.append(canvas_div);\n", + "\n", + " var canvas = $('');\n", + " canvas.addClass('mpl-canvas');\n", + " canvas.attr('style', \"left: 0; top: 0; z-index: 0; outline: 0\")\n", + "\n", + " this.canvas = canvas[0];\n", + " this.context = canvas[0].getContext(\"2d\");\n", + "\n", + " var backingStore = this.context.backingStorePixelRatio ||\n", + "\tthis.context.webkitBackingStorePixelRatio ||\n", + "\tthis.context.mozBackingStorePixelRatio ||\n", + "\tthis.context.msBackingStorePixelRatio ||\n", + "\tthis.context.oBackingStorePixelRatio ||\n", + "\tthis.context.backingStorePixelRatio || 1;\n", + "\n", + " mpl.ratio = (window.devicePixelRatio || 1) / backingStore;\n", + "\n", + " var rubberband = $('');\n", + " rubberband.attr('style', \"position: absolute; left: 0; top: 0; z-index: 1;\")\n", + "\n", + " var pass_mouse_events = true;\n", + "\n", + " canvas_div.resizable({\n", + " start: function(event, ui) {\n", + " pass_mouse_events = false;\n", + " },\n", + " resize: function(event, ui) {\n", + " fig.request_resize(ui.size.width, ui.size.height);\n", + " },\n", + " stop: function(event, ui) {\n", + " pass_mouse_events = true;\n", + " fig.request_resize(ui.size.width, ui.size.height);\n", + " },\n", + " });\n", + "\n", + " function mouse_event_fn(event) {\n", + " if (pass_mouse_events)\n", + " return fig.mouse_event(event, event['data']);\n", + " }\n", + "\n", + " rubberband.mousedown('button_press', mouse_event_fn);\n", + " rubberband.mouseup('button_release', mouse_event_fn);\n", + " // Throttle sequential mouse events to 1 
every 20ms.\n", + " rubberband.mousemove('motion_notify', mouse_event_fn);\n", + "\n", + " rubberband.mouseenter('figure_enter', mouse_event_fn);\n", + " rubberband.mouseleave('figure_leave', mouse_event_fn);\n", + "\n", + " canvas_div.on(\"wheel\", function (event) {\n", + " event = event.originalEvent;\n", + " event['data'] = 'scroll'\n", + " if (event.deltaY < 0) {\n", + " event.step = 1;\n", + " } else {\n", + " event.step = -1;\n", + " }\n", + " mouse_event_fn(event);\n", + " });\n", + "\n", + " canvas_div.append(canvas);\n", + " canvas_div.append(rubberband);\n", + "\n", + " this.rubberband = rubberband;\n", + " this.rubberband_canvas = rubberband[0];\n", + " this.rubberband_context = rubberband[0].getContext(\"2d\");\n", + " this.rubberband_context.strokeStyle = \"#000000\";\n", + "\n", + " this._resize_canvas = function(width, height) {\n", + " // Keep the size of the canvas, canvas container, and rubber band\n", + " // canvas in synch.\n", + " canvas_div.css('width', width)\n", + " canvas_div.css('height', height)\n", + "\n", + " canvas.attr('width', width * mpl.ratio);\n", + " canvas.attr('height', height * mpl.ratio);\n", + " canvas.attr('style', 'width: ' + width + 'px; height: ' + height + 'px;');\n", + "\n", + " rubberband.attr('width', width);\n", + " rubberband.attr('height', height);\n", + " }\n", + "\n", + " // Set the figure to an initial 600x600px, this will subsequently be updated\n", + " // upon first draw.\n", + " this._resize_canvas(600, 600);\n", + "\n", + " // Disable right mouse context menu.\n", + " $(this.rubberband_canvas).bind(\"contextmenu\",function(e){\n", + " return false;\n", + " });\n", + "\n", + " function set_focus () {\n", + " canvas.focus();\n", + " canvas_div.focus();\n", + " }\n", + "\n", + " window.setTimeout(set_focus, 100);\n", + "}\n", + "\n", + "mpl.figure.prototype._init_toolbar = function() {\n", + " var fig = this;\n", + "\n", + " var nav_element = $('
    ')\n", + " nav_element.attr('style', 'width: 100%');\n", + " this.root.append(nav_element);\n", + "\n", + " // Define a callback function for later on.\n", + " function toolbar_event(event) {\n", + " return fig.toolbar_button_onclick(event['data']);\n", + " }\n", + " function toolbar_mouse_event(event) {\n", + " return fig.toolbar_button_onmouseover(event['data']);\n", + " }\n", + "\n", + " for(var toolbar_ind in mpl.toolbar_items) {\n", + " var name = mpl.toolbar_items[toolbar_ind][0];\n", + " var tooltip = mpl.toolbar_items[toolbar_ind][1];\n", + " var image = mpl.toolbar_items[toolbar_ind][2];\n", + " var method_name = mpl.toolbar_items[toolbar_ind][3];\n", + "\n", + " if (!name) {\n", + " // put a spacer in here.\n", + " continue;\n", + " }\n", + " var button = $('');\n", + " button.click(method_name, toolbar_event);\n", + " button.mouseover(tooltip, toolbar_mouse_event);\n", + " nav_element.append(button);\n", + " }\n", + "\n", + " // Add the status bar.\n", + " var status_bar = $('');\n", + " nav_element.append(status_bar);\n", + " this.message = status_bar[0];\n", + "\n", + " // Add the close button to the window.\n", + " var buttongrp = $('
    ');\n", + " var button = $('');\n", + " button.click(function (evt) { fig.handle_close(fig, {}); } );\n", + " button.mouseover('Stop Interaction', toolbar_mouse_event);\n", + " buttongrp.append(button);\n", + " var titlebar = this.root.find($('.ui-dialog-titlebar'));\n", + " titlebar.prepend(buttongrp);\n", + "}\n", + "\n", + "mpl.figure.prototype._root_extra_style = function(el){\n", + " var fig = this\n", + " el.on(\"remove\", function(){\n", + "\tfig.close_ws(fig, {});\n", + " });\n", + "}\n", + "\n", + "mpl.figure.prototype._canvas_extra_style = function(el){\n", + " // this is important to make the div 'focusable\n", + " el.attr('tabindex', 0)\n", + " // reach out to IPython and tell the keyboard manager to turn it's self\n", + " // off when our div gets focus\n", + "\n", + " // location in version 3\n", + " if (IPython.notebook.keyboard_manager) {\n", + " IPython.notebook.keyboard_manager.register_events(el);\n", + " }\n", + " else {\n", + " // location in version 2\n", + " IPython.keyboard_manager.register_events(el);\n", + " }\n", + "\n", + "}\n", + "\n", + "mpl.figure.prototype._key_event_extra = function(event, name) {\n", + " var manager = IPython.notebook.keyboard_manager;\n", + " if (!manager)\n", + " manager = IPython.keyboard_manager;\n", + "\n", + " // Check for shift+enter\n", + " if (event.shiftKey && event.which == 13) {\n", + " this.canvas_div.blur();\n", + " event.shiftKey = false;\n", + " // Send a \"J\" for go to next cell\n", + " event.which = 74;\n", + " event.keyCode = 74;\n", + " manager.command_mode();\n", + " manager.handle_keydown(event);\n", + " }\n", + "}\n", + "\n", + "mpl.figure.prototype.handle_save = function(fig, msg) {\n", + " fig.ondownload(fig, null);\n", + "}\n", + "\n", + "\n", + "mpl.find_output_cell = function(html_output) {\n", + " // Return the cell and output element which can be found *uniquely* in the notebook.\n", + " // Note - this is a bit hacky, but it is done because the \"notebook_saving.Notebook\"\n", + " 
// IPython event is triggered only after the cells have been serialised, which for\n", + " // our purposes (turning an active figure into a static one), is too late.\n", + " var cells = IPython.notebook.get_cells();\n", + " var ncells = cells.length;\n", + " for (var i=0; i= 3 moved mimebundle to data attribute of output\n", + " data = data.data;\n", + " }\n", + " if (data['text/html'] == html_output) {\n", + " return [cell, data, j];\n", + " }\n", + " }\n", + " }\n", + " }\n", + "}\n", + "\n", + "// Register the function which deals with the matplotlib target/channel.\n", + "// The kernel may be null if the page has been refreshed.\n", + "if (IPython.notebook.kernel != null) {\n", + " IPython.notebook.kernel.comm_manager.register_target('matplotlib', mpl.mpl_figure_comm);\n", + "}\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "data": { + "text/html": [ + "" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "next_state = cities\n", + "states = []\n", + "\n", + "# creating plotting area\n", + "fig = plt.figure(figsize = (8,6))\n", + "ax = plt.axes(xlim=(60, 600), ylim=(245, 600))\n", + "line, = ax.plot([], [], c=\"b\",linewidth = 1.5, marker = 'o', markerfacecolor = 'r', markeredgecolor = 'r',markersize = 10)\n", + "text = ax.text(450, 565, \"\", fontdict = font)\n", + "\n", + "# to plot only the final states of every simulated annealing iteration\n", + "for iterations in range(100):\n", + " tsp_problem = TSP_problem(next_state) \n", + " states.append(simulated_annealing(tsp_problem))\n", + " next_state = states[-1]\n", + " \n", + "anim = animation.FuncAnimation(fig, animate, init_func=init,\n", + " frames=len(states),interval=len(states), blit=True, repeat = False)\n", + "plt.show()" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [] + } + ], + "metadata": { + 
"kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.1" + }, + "widgets": { + "state": {}, + "version": "1.1.1" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/perception4e.py b/perception4e.py new file mode 100644 index 000000000..edd556607 --- /dev/null +++ b/perception4e.py @@ -0,0 +1,467 @@ +"""Perception (Chapter 24)""" + +import cv2 +import keras +import matplotlib.pyplot as plt +import numpy as np +import scipy.signal +from keras.datasets import mnist +from keras.layers import Dense, Activation, Flatten, InputLayer, Conv2D, MaxPooling2D +from keras.models import Sequential + +from utils4e import gaussian_kernel_2D + + +# ____________________________________________________ +# 24.3 Early Image Processing Operators +# 24.3.1 Edge Detection + + +def array_normalization(array, range_min, range_max): + """Normalize an array in the range of (range_min, range_max)""" + if not isinstance(array, np.ndarray): + array = np.asarray(array) + array = array - np.min(array) + array = array * (range_max - range_min) / np.max(array) + range_min + return array + + +def gradient_edge_detector(image): + """ + Image edge detection by calculating gradients in the image + :param image: numpy ndarray or an iterable object + :return: numpy ndarray, representing a gray scale image + """ + if not isinstance(image, np.ndarray): + image = np.asarray(image) + # gradient filters of x and y direction edges + x_filter, y_filter = np.array([[1, -1]]), np.array([[1], [-1]]) + # convolution between filter and image to get edges + y_edges = scipy.signal.convolve2d(image, x_filter, 'same') + x_edges = scipy.signal.convolve2d(image, y_filter, 'same') + edges = array_normalization(x_edges + y_edges, 
0, 255) + return edges + + +def gaussian_derivative_edge_detector(image): + """Image edge detector using derivative of gaussian kernels""" + if not isinstance(image, np.ndarray): + image = np.asarray(image) + gaussian_filter = gaussian_kernel_2D() + # init derivative of gaussian filters + x_filter = scipy.signal.convolve2d(gaussian_filter, np.asarray([[1, -1]]), 'same') + y_filter = scipy.signal.convolve2d(gaussian_filter, np.asarray([[1], [-1]]), 'same') + # extract edges using convolution + y_edges = scipy.signal.convolve2d(image, x_filter, 'same') + x_edges = scipy.signal.convolve2d(image, y_filter, 'same') + edges = array_normalization(x_edges + y_edges, 0, 255) + return edges + + +def laplacian_edge_detector(image): + """Extract image edge with laplacian filter""" + if not isinstance(image, np.ndarray): + image = np.asarray(image) + # init laplacian filter + laplacian_kernel = np.asarray([[0, -1, 0], [-1, 4, -1], [0, -1, 0]]) + # extract edges with convolution + edges = scipy.signal.convolve2d(image, laplacian_kernel, 'same') + edges = array_normalization(edges, 0, 255) + return edges + + +def show_edges(edges): + """ helper function to show edges picture""" + plt.imshow(edges, cmap='gray', vmin=0, vmax=255) + plt.axis('off') + plt.show() + + +# __________________________________________________ +# 24.3.3 Optical flow + + +def sum_squared_difference(pic1, pic2): + """SSD of two frames""" + pic1 = np.asarray(pic1) + pic2 = np.asarray(pic2) + assert pic1.shape == pic2.shape + min_ssd = np.inf + min_dxy = (np.inf, np.inf) + + # consider picture shift from -30 to 30 + for Dx in range(-30, 31): + for Dy in range(-30, 31): + # shift the image + shifted_pic = np.roll(pic2, Dx, axis=0) + shifted_pic = np.roll(shifted_pic, Dy, axis=1) + # calculate the difference + diff = np.sum((pic1 - shifted_pic) ** 2) + if diff < min_ssd: + min_dxy = (Dx, Dy) + min_ssd = diff + return min_dxy, min_ssd + + +# ____________________________________________________ +# segmentation + 
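Before moving on to segmentation, the brute-force SSD shift search above can be sanity-checked on synthetic frames. The sketch below is a self-contained NumPy re-implementation of the same idea (the helper name `ssd_shift` and the smaller ±5 search window are illustrative choices, not part of the module, which scans ±30):

```python
import numpy as np

def ssd_shift(pic1, pic2, max_shift=5):
    """Brute-force search over (dx, dy) shifts of pic2, returning the
    shift that minimizes the sum of squared differences against pic1."""
    min_ssd, min_dxy = np.inf, (np.inf, np.inf)
    for dx in range(-max_shift, max_shift + 1):
        for dy in range(-max_shift, max_shift + 1):
            # circularly shift the second frame, as np.roll does in the module
            shifted = np.roll(np.roll(pic2, dx, axis=0), dy, axis=1)
            diff = np.sum((pic1 - shifted) ** 2)
            if diff < min_ssd:
                min_ssd, min_dxy = diff, (dx, dy)
    return min_dxy, min_ssd

rng = np.random.default_rng(0)
frame1 = rng.random((16, 16))
# simulate camera motion: frame2 is frame1 shifted by (-2, +1)
frame2 = np.roll(np.roll(frame1, -2, axis=0), 1, axis=1)
(dx, dy), ssd = ssd_shift(frame1, frame2)
```

Because `np.roll` wraps around, the shifted frame matches `pic1` exactly at the inverse shift (+2, -1), so the minimum SSD there is zero; on real (non-wrapping) frames the minimum is merely small rather than exactly zero.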
+def gen_gray_scale_picture(size, level=3): + """ + Generate a picture with different gray scale levels + :param size: size of generated picture + :param level: the number of level of gray scales in the picture, + range (0, 255) are equally divided by number of levels + :return image in numpy ndarray type + """ + assert level > 0 + # init an empty image + image = np.zeros((size, size)) + if level == 1: + return image + # draw a square on the left upper corner of the image + for x in range(size): + for y in range(size): + image[x, y] += (250 // (level - 1)) * (max(x, y) * level // size) + return image + + +gray_scale_image = gen_gray_scale_picture(3) + + +def probability_contour_detection(image, discs, threshold=0): + """ + Detect edges/contours by applying a set of discs to an image + :param image: an image in type of numpy ndarray + :param discs: a set of discs/filters to apply to pixels of image + :param threshold: threshold to tell whether the pixel at (x, y) is on an edge + :return image showing edges in numpy ndarray type + """ + # init an empty output image + res = np.zeros(image.shape) + step = discs[0].shape[0] + for x_i in range(0, image.shape[0] - step + 1, 1): + for y_i in range(0, image.shape[1] - step + 1, 1): + diff = [] + # apply each pair of discs and calculate the difference + for d in range(0, len(discs), 2): + disc1, disc2 = discs[d], discs[d + 1] + # crop the region of interest + region = image[x_i: x_i + step, y_i: y_i + step] + diff.append(np.sum(np.multiply(region, disc1)) - np.sum(np.multiply(region, disc2))) + if max(diff) > threshold: + # change color of the center of region + res[x_i + step // 2, y_i + step // 2] = 255 + return res + + +def group_contour_detection(image, cluster_num=2): + """ + Detecting contours in an image with k-means clustering + :param image: an image in numpy ndarray type + :param cluster_num: number of clusters in k-means + """ + img = image + Z = np.float32(img) + criteria = (cv2.TERM_CRITERIA_EPS + 
cv2.TERM_CRITERIA_MAX_ITER, 10, 1.0) + K = cluster_num + # use kmeans in opencv-python + ret, label, center = cv2.kmeans(Z, K, None, criteria, 10, cv2.KMEANS_RANDOM_CENTERS) + center = np.uint8(center) + res = center[label.flatten()] + res2 = res.reshape(img.shape) + # show the image + # cv2.imshow('res2', res2) + # cv2.waitKey(0) + # cv2.destroyAllWindows() + return res2 + + + def image_to_graph(image): + """ + Convert an image to a graph in adjacency-list form + """ + graph_dict = {} + for x in range(image.shape[0]): + for y in range(image.shape[1]): + graph_dict[(x, y)] = [(x + 1, y) if x + 1 < image.shape[0] else None, + (x, y + 1) if y + 1 < image.shape[1] else None] + return graph_dict + + + def generate_edge_weight(image, v1, v2): + """ + Find the edge weight between two vertices in an image + :param image: image in numpy ndarray type + :param v1, v2: vertices in the image, each in the form (x index, y index) + """ + diff = abs(image[v1[0], v1[1]] - image[v2[0], v2[1]]) + return 255 - diff + + + class Graph: + """Graph in adjacency-list form to represent an image""" + + def __init__(self, image): + """image: ndarray""" + self.graph = image_to_graph(image) + # number of columns and rows + self.ROW = len(self.graph) + self.COL = 2 + self.image = image + # dictionary to save the maximum flow of each edge + self.flow = {} + # initialize the flow + for s in self.graph: + self.flow[s] = {} + for t in self.graph[s]: + if t: + self.flow[s][t] = generate_edge_weight(image, s, t) + + def bfs(self, s, t, parent): + """Breadth-first search to tell whether there is an augmenting path between source and sink + parent: a list to save the path between s and t""" + # queue to save the current searching frontier + queue = [s] + visited = [] + + while queue: + u = queue.pop(0) + for node in self.graph[u]: + # only select edges with positive flow + if node not in visited and node and self.flow[u][node] > 0: + queue.append(node) + visited.append(node) + parent.append((u, node)) + return True if t in
visited else False + + def min_cut(self, source, sink): + """Find the minimum cut of the graph between source and sink""" + parent = [] + max_flow = 0 + + while self.bfs(source, sink, parent): + path_flow = np.inf + # find the minimum flow of the s-t path + for s, t in parent: + path_flow = min(path_flow, self.flow[s][t]) + + max_flow += path_flow + + # update all edges between source and sink + for s in self.flow: + for t in self.flow[s]: + if t[0] <= sink[0] and t[1] <= sink[1]: + self.flow[s][t] -= path_flow + parent = [] + res = [] + for i in self.flow: + for j in self.flow[i]: + if self.flow[i][j] == 0 and generate_edge_weight(self.image, i, j) > 0: + res.append((i, j)) + return res + + + def gen_discs(init_scale, scales=1): + """ + Generate a collection of disc pairs by splitting round discs at different angles + :param init_scale: the initial size of each half disc + :param scales: the number of scales for each type of half disc; the disc size grows with each scale + :return: the collection of generated discs: [discs of scale1, discs of scale2...]
+ """ + discs = [] + for m in range(scales): + scale = init_scale * (m + 1) + disc = [] + # make the full round disc + white = np.zeros((scale, scale)) + center = (scale - 1) / 2 + for i in range(scale): + for j in range(scale): + if (i - center) ** 2 + (j - center) ** 2 <= (center ** 2): + white[i, j] = 255 + # generate lower half and upper half + lower_half = np.copy(white) + lower_half[:(scale - 1) // 2, :] = 0 + upper_half = lower_half[::-1, ::-1] + # generate left half and right half + disc += [lower_half, upper_half, np.transpose(lower_half), np.transpose(upper_half)] + # generate upper-left, lower-right, upper-right, lower-left half discs + disc += [np.tril(white, 0), np.triu(white, 0), np.flip(np.tril(white, 0), axis=0), + np.flip(np.triu(white, 0), axis=0)] + discs.append(disc) + return discs + + + # __________________________________________________ + # 24.4 Classifying Images + + + def load_MINST(train_size, val_size, test_size): + """Load the MNIST dataset from keras""" + (x_train, y_train), (x_test, y_test) = mnist.load_data() + total_size = len(x_train) + if train_size + val_size > total_size: + train_size = total_size - val_size + x_train = x_train.reshape(x_train.shape[0], 1, 28, 28) + x_test = x_test.reshape(x_test.shape[0], 1, 28, 28) + x_train = x_train.astype('float32') + x_train /= 255 + # scale the test set in place so the returned slices are normalized too + x_test = x_test.astype('float32') + x_test /= 255 + y_train = keras.utils.to_categorical(y_train, 10) + y_test = keras.utils.to_categorical(y_test, 10) + return ((x_train[:train_size], y_train[:train_size]), + (x_train[train_size:train_size + val_size], y_train[train_size:train_size + val_size]), + (x_test[:test_size], y_test[:test_size])) + + + def simple_convnet(size=3, num_classes=10): + """ + Simple convolutional network for digit recognition + :param size: number of convolution layers + :param num_classes: number of output classes + :return a convolution network in keras model type + """ + model = Sequential() + # add input layer for images of size (28, 28) + 
model.add(InputLayer(input_shape=(1, 28, 28))) + # add convolution layers and max pooling layers + for _ in range(size): + model.add(Conv2D(32, (2, 2), padding='same', kernel_initializer='random_uniform')) + model.add(MaxPooling2D(padding='same')) + + # add flatten layer and output layers + model.add(Flatten()) + model.add(Dense(num_classes)) + model.add(Activation('softmax')) + + # compile model (standalone Keras requires an optimizer; adam is a common choice) + model.compile(optimizer='adam', loss='categorical_crossentropy', + metrics=['accuracy']) + print(model.summary()) + return model + + + def train_model(model): + """Train the simple convolution network""" + # load dataset + (train_x, train_y), (val_x, val_y), (test_x, test_y) = load_MINST(1000, 100, 100) + model.fit(train_x, train_y, validation_data=(val_x, val_y), epochs=5, verbose=2, batch_size=32) + scores = model.evaluate(test_x, test_y, verbose=1) + print(scores) + return model + + + # _____________________________________________________ + # 24.5 DETECTING OBJECTS + + + def selective_search(image): + """ + Selective search for object detection + :param image: str, the path of an image, or a single-channel image in ndarray type + :return list of bounding boxes, each element in the form (x, y, w, h) + """ + if image is None: + im = cv2.imread("./images/stapler1-test.png") + elif isinstance(image, str): + im = cv2.imread(image) + else: + im = np.stack([image] * 3, axis=-1) + + # use opencv python to extract bounding box with selective search + ss = cv2.ximgproc.segmentation.createSelectiveSearchSegmentation() + ss.setBaseImage(im) + ss.switchToSelectiveSearchQuality() + rects = ss.process() + + # show bounding boxes with the input image + image_out = im.copy() + for rect in rects[:100]: + print(rect) + x, y, w, h = rect + cv2.rectangle(image_out, (x, y), (x + w, y + h), (0, 255, 0), 1, cv2.LINE_AA) + cv2.imshow("Output", image_out) + cv2.waitKey(0) + return rects + + + # faster RCNN + def pool_rois(feature_map, rois, pooled_height, pooled_width): + """ + Applies ROI pooling for a
single image and various ROIs + :param feature_map: ndarray, in shape of (width, height, channel) + :param rois: list of roi + :param pooled_height: height of pooled area + :param pooled_width: width of pooled area + :return list of pooled features + """ + + def curried_pool_roi(roi): + return pool_roi(feature_map, roi, pooled_height, pooled_width) + + pooled_areas = list(map(curried_pool_roi, rois)) + return pooled_areas + + +def pool_roi(feature_map, roi, pooled_height, pooled_width): + """ + Applies a single ROI pooling to a single image + :param feature_map: ndarray, in shape of (width, height, channel) + :param roi: region of interest, in form of [x_min_ratio, y_min_ratio, x_max_ratio, y_max_ratio] + :return feature of pooling output, in shape of (pooled_width, pooled_height) + """ + + # Compute the region of interest + feature_map_height = int(feature_map.shape[0]) + feature_map_width = int(feature_map.shape[1]) + + h_start = int(feature_map_height * roi[0]) + w_start = int(feature_map_width * roi[1]) + h_end = int(feature_map_height * roi[2]) + w_end = int(feature_map_width * roi[3]) + + region = feature_map[h_start:h_end, w_start:w_end, :] + + # Divide the region into non overlapping areas + region_height = h_end - h_start + region_width = w_end - w_start + h_step = region_height // pooled_height + w_step = region_width // pooled_width + + areas = [[( + i * h_step, + j * w_step, + (i + 1) * h_step if i + 1 < pooled_height else region_height, + (j + 1) * w_step if j + 1 < pooled_width else region_width) + for j in range(pooled_width)] + for i in range(pooled_height)] + + # take the maximum of each area and stack the result + def pool_area(x): + return np.max(region[x[0]:x[2], x[1]:x[3], :]) + + pooled_features = np.stack([[pool_area(x) for x in row] for row in areas]) + return pooled_features + +# faster rcnn demo can be installed and shown in jupyter notebook +# def faster_rcnn_demo(directory): +# """ +# show the demo of rcnn, the model is from +# 
@inproceedings{renNIPS15fasterrcnn, +# Author = {Shaoqing Ren and Kaiming He and Ross Girshick and Jian Sun}, +# Title = {Faster {R-CNN}: Towards Real-Time Object Detection +# with Region Proposal Networks}, +# Booktitle = {Advances in Neural Information Processing Systems ({NIPS})}, +# Year = {2015}} +# :param directory: the directory where the faster rcnn model is installed +# """ +# os.chdir(directory + '/lib') +# # make file +# os.system("make clean") +# os.system("make") +# # run demo +# os.chdir(directory) +# os.system("./tools/demo.py") +# return 0 diff --git a/planning.ipynb b/planning.ipynb index 37461ee9b..7b05b3c20 100644 --- a/planning.ipynb +++ b/planning.ipynb @@ -6,96 +6,490 @@ "collapsed": true }, "source": [ - "# Planning: planning.py; chapters 10-11" + "# Planning\n", + "#### Chapters 10-11\n", + "----" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "This notebook describes the [planning.py](https://github.com/aimacode/aima-python/blob/master/planning.py) module, which covers Chapters 10 (Classical Planning) and 11 (Planning and Acting in the Real World) of *[Artificial Intelligence: A Modern Approach](http://aima.cs.berkeley.edu)*. See the [intro notebook](https://github.com/aimacode/aima-python/blob/master/intro.ipynb) for instructions.\n", + "This notebook serves as supporting material for topics covered in **Chapter 10 - Classical Planning** and **Chapter 11 - Planning and Acting in the Real World** from the book *[Artificial Intelligence: A Modern Approach](http://aima.cs.berkeley.edu)*. \n", + "This notebook uses implementations from the [planning.py](https://github.com/aimacode/aima-python/blob/master/planning.py) module. \n", + "See the [intro notebook](https://github.com/aimacode/aima-python/blob/master/intro.ipynb) for instructions.\n", "\n", - "We'll start by looking at `PDDL` and `Action` data types for defining problems and actions. 
Then, we will see how to use them by trying to plan a trip from *Sibiu* to *Bucharest* across the familiar map of Romania, from [search.ipynb](https://github.com/aimacode/aima-python/blob/master/search.ipynb). Finally, we will look at the implementation of the GraphPlan algorithm.\n", + "We'll start by looking at `PlanningProblem` and `Action` data types for defining problems and actions. \n", + "Then, we will see how to use them by trying to plan a trip from *Sibiu* to *Bucharest* across the familiar map of Romania, from [search.ipynb](https://github.com/aimacode/aima-python/blob/master/search.ipynb) \n", + "followed by some common planning problems and methods of solving them.\n", "\n", - "The first step is to load the code:" + "Let's start by importing everything from the planning module." ] }, { "cell_type": "code", - "execution_count": 1, - "metadata": { - "collapsed": false - }, + "execution_count": 79, + "metadata": {}, "outputs": [], "source": [ - "from planning import *" + "from planning import *\n", + "from notebook import psource" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "To be able to model a planning problem properly, it is essential to be able to represent an Action. Each action we model requires at least three things:\n", - "* preconditions that the action must meet\n", - "* the effects of executing the action\n", - "* some expression that represents the action" + "## CONTENTS\n", + "\n", + "**Classical Planning**\n", + "- PlanningProblem\n", + "- Action\n", + "- Planning Problems\n", + " * Air cargo problem\n", + " * Spare tire problem\n", + " * Three block tower problem\n", + " * Shopping Problem\n", + " * Socks and shoes problem\n", + " * Cake problem\n", + "- Solving Planning Problems\n", + " * GraphPlan\n", + " * Linearize\n", + " * PartialOrderPlanner\n", + "
    \n", + "\n", + "**Planning in the real world**\n", + "- Problem\n", + "- HLA\n", + "- Planning Problems\n", + " * Job shop problem\n", + " * Double tennis problem\n", + "- Solving Planning Problems\n", + " * Hierarchical Search\n", + " * Angelic Search" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Planning actions have been modelled using the `Action` class. Let's look at the source to see how the internal details of an action are implemented in Python." + "## PlanningProblem\n", + "\n", + "PDDL stands for Planning Domain Definition Language.\n", + "The `PlanningProblem` class is used to represent planning problems in this module. The following attributes are essential to be able to define a problem:\n", + "* an initial state\n", + "* a set of goals\n", + "* a set of viable actions that can be executed in the search space of the problem\n", + "\n", + "View the source to see how the Python code tries to realise these." ] }, { "cell_type": "code", - "execution_count": 2, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 80, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    class PlanningProblem:\n",
    +       "    """\n",
    +       "    Planning Domain Definition Language (PlanningProblem) used to define a search problem.\n",
    +       "    It stores states in a knowledge base consisting of first order logic statements.\n",
    +       "    The conjunction of these logical statements completely defines a state.\n",
    +       "    """\n",
    +       "\n",
    +       "    def __init__(self, init, goals, actions):\n",
    +       "        self.init = self.convert(init)\n",
    +       "        self.goals = self.convert(goals)\n",
    +       "        self.actions = actions\n",
    +       "\n",
    +       "    def convert(self, clauses):\n",
    +       "        """Converts strings into exprs"""\n",
    +       "        if not isinstance(clauses, Expr):\n",
    +       "            if len(clauses) > 0:\n",
    +       "                clauses = expr(clauses)\n",
    +       "            else:\n",
    +       "                clauses = []\n",
    +       "        try:\n",
    +       "            clauses = conjuncts(clauses)\n",
    +       "        except AttributeError:\n",
    +       "            clauses = clauses\n",
    +       "\n",
    +       "        new_clauses = []\n",
    +       "        for clause in clauses:\n",
    +       "            if clause.op == '~':\n",
    +       "                new_clauses.append(expr('Not' + str(clause.args[0])))\n",
    +       "            else:\n",
    +       "                new_clauses.append(clause)\n",
    +       "        return new_clauses\n",
    +       "\n",
    +       "    def goal_test(self):\n",
    +       "        """Checks if the goals have been reached"""\n",
    +       "        return all(goal in self.init for goal in self.goals)\n",
    +       "\n",
    +       "    def act(self, action):\n",
    +       "        """\n",
    +       "        Performs the action given as argument.\n",
    +       "        Note that action is an Expr like expr('Remove(Glass, Table)') or expr('Eat(Sandwich)')\n",
    +       "        """       \n",
    +       "        action_name = action.op\n",
    +       "        args = action.args\n",
    +       "        list_action = first(a for a in self.actions if a.name == action_name)\n",
    +       "        if list_action is None:\n",
    +       "            raise Exception("Action '{}' not found".format(action_name))\n",
    +       "        if not list_action.check_precond(self.init, args):\n",
    +       "            raise Exception("Action '{}' pre-conditions not satisfied".format(action))\n",
    +       "        self.init = list_action(self.init, args).clauses\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "%psource Action" + "psource(PlanningProblem)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "It is interesting to see the way preconditions and effects are represented here. Instead of just being a list of expressions each, they consist of two lists - `precond_pos` and `precond_neg`. This is to work around the fact that PDDL doesn't allow for negations. Thus, for each precondition, we maintain a seperate list of those preconditions that must hold true, and those whose negations must hold true. Similarly, instead of having a single list of expressions that are the result of executing an action, we have two. The first (`effect_add`) contains all the expressions that will evaluate to true if the action is executed, and the the second (`effect_neg`) contains all those expressions that would be false if the action is executed (ie. their negations would be true).\n", - "\n", - "The constructor parameters, however combine the two precondition lists into a single `precond` parameter, and the effect lists into a single `effect` parameter." + "The `init` attribute is an expression that forms the initial knowledge base for the problem.\n", + "
    \n", + "The `goals` attribute is an expression that indicates the goals to be reached by the problem.\n", + "
    \n", + "Lastly, `actions` contains a list of `Action` objects that may be executed in the search space of the problem.\n", + "
    \n", + "The `goal_test` method checks if the goal has been reached.\n", + "
    \n", + "The `act` method acts out the given action and updates the current state.\n", + "
    \n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "The `PDDL` class is used to represent planning problems in this module. The following attributes are essential to be able to define a problem:\n", - "* a goal test\n", - "* an initial state\n", - "* a set of viable actions that can be executed in the search space of the problem\n", + "## ACTION\n", "\n", - "View the source to see how the Python code tries to realise these." + "To be able to model a planning problem properly, it is essential to be able to represent an Action. Each action we model requires at least three things:\n", + "* preconditions that the action must meet\n", + "* the effects of executing the action\n", + "* some expression that represents the action" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The module models actions using the `Action` class" ] }, { "cell_type": "code", - "execution_count": 3, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 81, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    class Action:\n",
    +       "    """\n",
    +       "    Defines an action schema using preconditions and effects.\n",
    +       "    Use this to describe actions in PlanningProblem.\n",
    +       "    action is an Expr where variables are given as arguments(args).\n",
    +       "    Precondition and effect are both lists with positive and negative literals.\n",
    +       "    Negative preconditions and effects are defined by adding a 'Not' before the name of the clause\n",
    +       "    Example:\n",
    +       "    precond = [expr("Human(person)"), expr("Hungry(Person)"), expr("NotEaten(food)")]\n",
    +       "    effect = [expr("Eaten(food)"), expr("Hungry(person)")]\n",
    +       "    eat = Action(expr("Eat(person, food)"), precond, effect)\n",
    +       "    """\n",
    +       "\n",
    +       "    def __init__(self, action, precond, effect):\n",
    +       "        if isinstance(action, str):\n",
    +       "            action = expr(action)\n",
    +       "        self.name = action.op\n",
    +       "        self.args = action.args\n",
    +       "        self.precond = self.convert(precond)\n",
    +       "        self.effect = self.convert(effect)\n",
    +       "\n",
    +       "    def __call__(self, kb, args):\n",
    +       "        return self.act(kb, args)\n",
    +       "\n",
    +       "    def __repr__(self):\n",
    +       "        return '{}({})'.format(self.__class__.__name__, Expr(self.name, *self.args))\n",
    +       "\n",
    +       "    def convert(self, clauses):\n",
    +       "        """Converts strings into Exprs"""\n",
    +       "        if isinstance(clauses, Expr):\n",
    +       "            clauses = conjuncts(clauses)\n",
    +       "            for i in range(len(clauses)):\n",
    +       "                if clauses[i].op == '~':\n",
    +       "                    clauses[i] = expr('Not' + str(clauses[i].args[0]))\n",
    +       "\n",
    +       "        elif isinstance(clauses, str):\n",
    +       "            clauses = clauses.replace('~', 'Not')\n",
    +       "            if len(clauses) > 0:\n",
    +       "                clauses = expr(clauses)\n",
    +       "\n",
    +       "            try:\n",
    +       "                clauses = conjuncts(clauses)\n",
    +       "            except AttributeError:\n",
    +       "                pass\n",
    +       "\n",
    +       "        return clauses\n",
    +       "\n",
    +       "    def substitute(self, e, args):\n",
    +       "        """Replaces variables in expression with their respective Propositional symbol"""\n",
    +       "\n",
    +       "        new_args = list(e.args)\n",
    +       "        for num, x in enumerate(e.args):\n",
    +       "            for i, _ in enumerate(self.args):\n",
    +       "                if self.args[i] == x:\n",
    +       "                    new_args[num] = args[i]\n",
    +       "        return Expr(e.op, *new_args)\n",
    +       "\n",
    +       "    def check_precond(self, kb, args):\n",
    +       "        """Checks if the precondition is satisfied in the current state"""\n",
    +       "\n",
    +       "        if isinstance(kb, list):\n",
    +       "            kb = FolKB(kb)\n",
    +       "        for clause in self.precond:\n",
    +       "            if self.substitute(clause, args) not in kb.clauses:\n",
    +       "                return False\n",
    +       "        return True\n",
    +       "\n",
    +       "    def act(self, kb, args):\n",
    +       "        """Executes the action on the state's knowledge base"""\n",
    +       "\n",
    +       "        if isinstance(kb, list):\n",
    +       "            kb = FolKB(kb)\n",
    +       "\n",
    +       "        if not self.check_precond(kb, args):\n",
    +       "            raise Exception('Action pre-conditions not satisfied')\n",
    +       "        for clause in self.effect:\n",
    +       "            kb.tell(self.substitute(clause, args))\n",
    +       "            if clause.op[:3] == 'Not':\n",
    +       "                new_clause = Expr(clause.op[3:], *clause.args)\n",
    +       "\n",
    +       "                if kb.ask(self.substitute(new_clause, args)) is not False:\n",
    +       "                    kb.retract(self.substitute(new_clause, args))\n",
    +       "            else:\n",
    +       "                new_clause = Expr('Not' + clause.op, *clause.args)\n",
    +       "\n",
    +       "                if kb.ask(self.substitute(new_clause, args)) is not False:    \n",
    +       "                    kb.retract(self.substitute(new_clause, args))\n",
    +       "\n",
    +       "        return kb\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "%psource PDDL" + "psource(Action)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "The `initial_state` attribute is a list of `Expr` expressions that forms the initial knowledge base for the problem. Next, `actions` contains a list of `Action` objects that may be executed in the search space of the problem. Lastly, we pass a `goal_test` function as a parameter - this typically takes a knowledge base as a parameter, and returns whether or not the goal has been reached." + "This class represents an action given the expression, the preconditions and its effects. \n", + "A list `precond` stores the preconditions of the action and a list `effect` stores its effects.\n", + "Negative preconditions and effects are input using a `~` symbol before the clause, which are internally prefixed with a `Not` to make it easier to work with.\n", + "For example, the negation of `At(obj, loc)` will be input as `~At(obj, loc)` and internally represented as `NotAt(obj, loc)`. \n", + "This equivalently creates a new clause for each negative literal, removing the hassle of maintaining two separate knowledge bases.\n", + "This greatly simplifies algorithms like `GraphPlan` as we will see later.\n", + "The `convert` method takes an input string, parses it, removes conjunctions if any and returns a list of `Expr` objects.\n", + "The `check_precond` method checks if the preconditions for that action are valid, given a `kb`.\n", + "The `act` method carries out the action on the given knowledge base." 
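To make the roles of `substitute` and `check_precond` concrete, here is a minimal, standalone sketch of the same two ideas. It deliberately does not use the planning module: facts are plain tuples such as `('At', 'P1', 'SFO')`, the knowledge base is an ordinary Python set, and all names are illustrative only.

```python
# Standalone sketch of action-schema grounding (not the module's Action class).

def substitute(clause, params, args):
    """Replace schema variables in a (predicate, *terms) tuple with concrete args."""
    binding = dict(zip(params, args))
    return (clause[0],) + tuple(binding.get(term, term) for term in clause[1:])

def check_precond(kb, precond, params, args):
    """True only if every ground precondition clause is already a fact in kb."""
    return all(substitute(clause, params, args) in kb for clause in precond)

# Action schema: Fly(p, f, to) with precondition At(p, f)
params = ('p', 'f', 'to')
precond = [('At', 'p', 'f')]
kb = {('At', 'P1', 'SFO')}

can_fly = check_precond(kb, precond, params, ('P1', 'SFO', 'JFK'))  # plane P1 is at SFO
```

Here `substitute(('At', 'p', 'f'), params, ('P1', 'SFO', 'JFK'))` grounds the clause to `('At', 'P1', 'SFO')`, which is a fact in the knowledge base, so the precondition check succeeds; asking about a plane `P2` that the knowledge base knows nothing about would fail.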
] }, { @@ -109,10 +503,8 @@ }, { "cell_type": "code", - "execution_count": 4, - "metadata": { - "collapsed": false - }, + "execution_count": 82, + "metadata": {}, "outputs": [], "source": [ "from utils import *\n", @@ -140,10 +532,8 @@ }, { "cell_type": "code", - "execution_count": 5, - "metadata": { - "collapsed": true - }, + "execution_count": 83, + "metadata": {}, "outputs": [], "source": [ "knowledge_base.extend([\n", @@ -162,10 +552,8 @@ }, { "cell_type": "code", - "execution_count": 6, - "metadata": { - "collapsed": false - }, + "execution_count": 84, + "metadata": {}, "outputs": [ { "data": { @@ -182,7 +570,7 @@ " At(Sibiu)]" ] }, - "execution_count": 6, + "execution_count": 84, "metadata": {}, "output_type": "execute_result" } @@ -202,53 +590,39 @@ }, { "cell_type": "code", - "execution_count": 7, - "metadata": { - "collapsed": false - }, + "execution_count": 85, + "metadata": {}, "outputs": [], "source": [ "#Sibiu to Bucharest\n", - "precond_pos = [expr('At(Sibiu)')]\n", - "precond_neg = []\n", - "effect_add = [expr('At(Bucharest)')]\n", - "effect_rem = [expr('At(Sibiu)')]\n", - "fly_s_b = Action(expr('Fly(Sibiu, Bucharest)'), [precond_pos, precond_neg], [effect_add, effect_rem])\n", + "precond = 'At(Sibiu)'\n", + "effect = 'At(Bucharest) & ~At(Sibiu)'\n", + "fly_s_b = Action('Fly(Sibiu, Bucharest)', precond, effect)\n", "\n", "#Bucharest to Sibiu\n", - "precond_pos = [expr('At(Bucharest)')]\n", - "precond_neg = []\n", - "effect_add = [expr('At(Sibiu)')]\n", - "effect_rem = [expr('At(Bucharest)')]\n", - "fly_b_s = Action(expr('Fly(Bucharest, Sibiu)'), [precond_pos, precond_neg], [effect_add, effect_rem])\n", + "precond = 'At(Bucharest)'\n", + "effect = 'At(Sibiu) & ~At(Bucharest)'\n", + "fly_b_s = Action('Fly(Bucharest, Sibiu)', precond, effect)\n", "\n", "#Sibiu to Craiova\n", - "precond_pos = [expr('At(Sibiu)')]\n", - "precond_neg = []\n", - "effect_add = [expr('At(Craiova)')]\n", - "effect_rem = [expr('At(Sibiu)')]\n", - "fly_s_c = 
Action(expr('Fly(Sibiu, Craiova)'), [precond_pos, precond_neg], [effect_add, effect_rem])\n", + "precond = 'At(Sibiu)'\n", + "effect = 'At(Craiova) & ~At(Sibiu)'\n", + "fly_s_c = Action('Fly(Sibiu, Craiova)', precond, effect)\n", "\n", "#Craiova to Sibiu\n", - "precond_pos = [expr('At(Craiova)')]\n", - "precond_neg = []\n", - "effect_add = [expr('At(Sibiu)')]\n", - "effect_rem = [expr('At(Craiova)')]\n", - "fly_c_s = Action(expr('Fly(Craiova, Sibiu)'), [precond_pos, precond_neg], [effect_add, effect_rem])\n", + "precond = 'At(Craiova)'\n", + "effect = 'At(Sibiu) & ~At(Craiova)'\n", + "fly_c_s = Action('Fly(Craiova, Sibiu)', precond, effect)\n", "\n", "#Bucharest to Craiova\n", - "precond_pos = [expr('At(Bucharest)')]\n", - "precond_neg = []\n", - "effect_add = [expr('At(Craiova)')]\n", - "effect_rem = [expr('At(Bucharest)')]\n", - "fly_b_c = Action(expr('Fly(Bucharest, Craiova)'), [precond_pos, precond_neg], [effect_add, effect_rem])\n", + "precond = 'At(Bucharest)'\n", + "effect = 'At(Craiova) & ~At(Bucharest)'\n", + "fly_b_c = Action('Fly(Bucharest, Craiova)', precond, effect)\n", "\n", "#Craiova to Bucharest\n", - "precond_pos = [expr('At(Craiova)')]\n", - "precond_neg = []\n", - "effect_add = [expr('At(Bucharest)')]\n", - "effect_rem = [expr('At(Craiova)')]\n", - "fly_c_b = Action(expr('Fly(Craiova, Bucharest)'), [precond_pos, precond_neg], [effect_add, effect_rem])" + "precond = 'At(Craiova)'\n", + "effect = 'At(Bucharest) & ~At(Craiova)'\n", + "fly_c_b = Action('Fly(Craiova, Bucharest)', precond, effect)" ] }, { @@ -260,18 +634,30 @@ }, { "cell_type": "code", - "execution_count": 8, - "metadata": { - "collapsed": true - }, + "execution_count": 86, + "metadata": {}, "outputs": [], "source": [ "#Drive\n", - "precond_pos = [expr('At(x)')]\n", - "precond_neg = []\n", - "effect_add = [expr('At(y)')]\n", - "effect_rem = [expr('At(x)')]\n", - "drive = Action(expr('Drive(x, y)'), [precond_pos, precond_neg], [effect_add, effect_rem])" + "precond = 'At(x)'\n", + 
"effect = 'At(y) & ~At(x)'\n", + "drive = Action('Drive(x, y)', precond, effect)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Our goal is defined as" + ] + }, + { + "cell_type": "code", + "execution_count": 87, + "metadata": {}, + "outputs": [], + "source": [ + "goals = 'At(Bucharest)'" ] }, { @@ -283,14 +669,12 @@ }, { "cell_type": "code", - "execution_count": 9, - "metadata": { - "collapsed": true - }, + "execution_count": 88, + "metadata": {}, "outputs": [], "source": [ "def goal_test(kb):\n", - " return kb.ask(expr(\"At(Bucharest)\"))" + " return kb.ask(expr('At(Bucharest)'))" ] }, { @@ -302,32 +686,3161 @@ }, { "cell_type": "code", - "execution_count": 10, - "metadata": { - "collapsed": false - }, + "execution_count": 89, + "metadata": {}, "outputs": [], "source": [ - "prob = PDDL(knowledge_base, [fly_s_b, fly_b_s, fly_s_c, fly_c_s, fly_b_c, fly_c_b, drive], goal_test)" + "prob = PlanningProblem(knowledge_base, goals, [fly_s_b, fly_b_s, fly_s_c, fly_c_s, fly_b_c, fly_c_b, drive])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## PLANNING PROBLEMS\n", + "---\n", + "\n", + "## Air Cargo Problem" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In the Air Cargo problem, we start with cargo at two airports, SFO and JFK. Our goal is to send each cargo to the other airport. We have two airplanes to help us accomplish the task. \n", + "The problem can be defined with three actions: Load, Unload and Fly. \n", + "Let us look how the `air_cargo` problem has been defined in the module. " ] }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, + "execution_count": 90, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def air_cargo():\n",
    +       "    """\n",
    +       "    [Figure 10.1] AIR-CARGO-PROBLEM\n",
    +       "\n",
    +       "    An air-cargo shipment problem for delivering cargo to different locations,\n",
    +       "    given the starting location and airplanes.\n",
    +       "\n",
    +       "    Example:\n",
    +       "    >>> from planning import *\n",
    +       "    >>> ac = air_cargo()\n",
    +       "    >>> ac.goal_test()\n",
    +       "    False\n",
    +       "    >>> ac.act(expr('Load(C2, P2, JFK)'))\n",
    +       "    >>> ac.act(expr('Load(C1, P1, SFO)'))\n",
    +       "    >>> ac.act(expr('Fly(P1, SFO, JFK)'))\n",
    +       "    >>> ac.act(expr('Fly(P2, JFK, SFO)'))\n",
    +       "    >>> ac.act(expr('Unload(C2, P2, SFO)'))\n",
    +       "    >>> ac.goal_test()\n",
    +       "    False\n",
    +       "    >>> ac.act(expr('Unload(C1, P1, JFK)'))\n",
    +       "    >>> ac.goal_test()\n",
    +       "    True\n",
    +       "    >>>\n",
    +       "    """\n",
    +       "\n",
    +       "    return PlanningProblem(init='At(C1, SFO) & At(C2, JFK) & At(P1, SFO) & At(P2, JFK) & Cargo(C1) & Cargo(C2) & Plane(P1) & Plane(P2) & Airport(SFO) & Airport(JFK)', \n",
    +       "                goals='At(C1, JFK) & At(C2, SFO)',\n",
    +       "                actions=[Action('Load(c, p, a)', \n",
    +       "                                precond='At(c, a) & At(p, a) & Cargo(c) & Plane(p) & Airport(a)',\n",
    +       "                                effect='In(c, p) & ~At(c, a)'),\n",
    +       "                         Action('Unload(c, p, a)',\n",
    +       "                                precond='In(c, p) & At(p, a) & Cargo(c) & Plane(p) & Airport(a)',\n",
    +       "                                effect='At(c, a) & ~In(c, p)'),\n",
    +       "                         Action('Fly(p, f, to)',\n",
    +       "                                precond='At(p, f) & Plane(p) & Airport(f) & Airport(to)',\n",
    +       "                                effect='At(p, to) & ~At(p, f)')])\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(air_cargo)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**At(c, a):** The cargo **'c'** is at airport **'a'**.\n", + "\n", + "**~At(c, a):** The cargo **'c'** is _not_ at airport **'a'**.\n", + "\n", + "**In(c, p):** Cargo **'c'** is in plane **'p'**.\n", + "\n", + "**~In(c, p):** Cargo **'c'** is _not_ in plane **'p'**.\n", + "\n", + "**Cargo(c):** Declare **'c'** as cargo.\n", + "\n", + "**Plane(p):** Declare **'p'** as plane.\n", + "\n", + "**Airport(a):** Declare **'a'** as airport.\n", + "\n", + "\n", + "\n", + "In the `initial_state`, we have cargo C1, plane P1 at airport SFO and cargo C2, plane P2 at airport JFK. \n", + "Our goal state is to have cargo C1 at airport JFK and cargo C2 at airport SFO. We will discuss on how to achieve this. Let us now define an object of the `air_cargo` problem:" + ] + }, + { + "cell_type": "code", + "execution_count": 91, + "metadata": {}, "outputs": [], - "source": [] + "source": [ + "airCargo = air_cargo()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Before taking any actions, we will check if `airCargo` has reached its goal:" + ] }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, + "execution_count": 92, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "False\n" + ] + } + ], + "source": [ + "print(airCargo.goal_test())" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "It returns False because the goal state is not yet reached. 
Now, we define the sequence of actions that it should take in order to achieve the goal.\n", + "The actions are then carried out on the `airCargo` PlanningProblem.\n", + "\n", + "The actions available to us are the following: Load, Unload, Fly\n", + "\n", + "**Load(c, p, a):** Load cargo **'c'** into plane **'p'** from airport **'a'**.\n", + "\n", + "**Fly(p, f, t):** Fly the plane **'p'** from airport **'f'** to airport **'t'**.\n", + "\n", + "**Unload(c, p, a):** Unload cargo **'c'** from plane **'p'** to airport **'a'**.\n", + "\n", + "This problem can have multiple valid solutions.\n", + "One such solution is shown below." + ] + }, + { + "cell_type": "code", + "execution_count": 93, + "metadata": {}, + "outputs": [], + "source": [ + "solution = [expr(\"Load(C1, P1, SFO)\"),\n", + "            expr(\"Fly(P1, SFO, JFK)\"),\n", + "            expr(\"Unload(C1, P1, JFK)\"),\n", + "            expr(\"Load(C2, P2, JFK)\"),\n", + "            expr(\"Fly(P2, JFK, SFO)\"),\n", + "            expr(\"Unload(C2, P2, SFO)\")]\n", + "\n", + "for action in solution:\n", + "    airCargo.act(action)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As the `airCargo` has taken all the steps it needed in order to achieve the goal, we can now check if it has achieved its goal:" + ] + }, + { + "cell_type": "code", + "execution_count": 94, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "True\n" + ] + } + ], + "source": [ + "print(airCargo.goal_test())" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "It has now achieved its goal." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## The Spare Tire Problem" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's consider the problem of changing a flat tire of a car. \n", + "The goal is to mount a spare tire onto the car's axle, given that we have a flat tire on the axle and a spare tire in the trunk. 
" + ] + }, + { + "cell_type": "code", + "execution_count": 95, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def spare_tire():\n",
    +       "    """[Figure 10.2] SPARE-TIRE-PROBLEM\n",
    +       "\n",
    +       "    A problem involving changing the flat tire of a car\n",
    +       "    with a spare tire from the trunk.\n",
    +       "\n",
    +       "    Example:\n",
    +       "    >>> from planning import *\n",
    +       "    >>> st = spare_tire()\n",
    +       "    >>> st.goal_test()\n",
    +       "    False\n",
    +       "    >>> st.act(expr('Remove(Spare, Trunk)'))\n",
    +       "    >>> st.act(expr('Remove(Flat, Axle)'))\n",
    +       "    >>> st.goal_test()\n",
    +       "    False\n",
    +       "    >>> st.act(expr('PutOn(Spare, Axle)'))\n",
    +       "    >>> st.goal_test()\n",
    +       "    True\n",
    +       "    >>>\n",
    +       "    """\n",
    +       "\n",
    +       "    return PlanningProblem(init='Tire(Flat) & Tire(Spare) & At(Flat, Axle) & At(Spare, Trunk)',\n",
    +       "                goals='At(Spare, Axle) & At(Flat, Ground)',\n",
    +       "                actions=[Action('Remove(obj, loc)',\n",
    +       "                                precond='At(obj, loc)',\n",
    +       "                                effect='At(obj, Ground) & ~At(obj, loc)'),\n",
    +       "                         Action('PutOn(t, Axle)',\n",
    +       "                                precond='Tire(t) & At(t, Ground) & ~At(Flat, Axle)',\n",
    +       "                                effect='At(t, Axle) & ~At(t, Ground)'),\n",
    +       "                         Action('LeaveOvernight',\n",
    +       "                                precond='',\n",
    +       "                                effect='~At(Spare, Ground) & ~At(Spare, Axle) & ~At(Spare, Trunk) & \\\n",
    +       "                                        ~At(Flat, Ground) & ~At(Flat, Axle) & ~At(Flat, Trunk)')])\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(spare_tire)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**At(obj, loc):** object **'obj'** is at location **'loc'**.\n", + "\n", + "**~At(obj, loc):** object **'obj'** is _not_ at location **'loc'**.\n", + "\n", + "**Tire(t):** Declare a tire of type **'t'**.\n", + "\n", + "Let us now define an object of `spare_tire` problem:" + ] + }, + { + "cell_type": "code", + "execution_count": 96, + "metadata": {}, "outputs": [], - "source": [] + "source": [ + "spareTire = spare_tire()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Before taking any actions, we will check if `spare_tire` has reached its goal:" + ] + }, + { + "cell_type": "code", + "execution_count": 97, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "False\n" + ] + } + ], + "source": [ + "print(spareTire.goal_test())" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As we can see, it hasn't completed the goal. \n", + "We now define a possible solution that can help us reach the goal of having a spare tire mounted onto the car's axle. 
\n", + "The actions are then carried out on the `spareTire` PlanningProblem.\n", + "\n", + "The actions available to us are the following: Remove, PutOn\n", + "\n", + "**Remove(obj, loc):** Remove the tire **'obj'** from the location **'loc'**.\n", + "\n", + "**PutOn(t, Axle):** Attach the tire **'t'** on the Axle.\n", + "\n", + "**LeaveOvernight():** We live in a particularly bad neighborhood and all tires, flat or not, are stolen if we leave them overnight.\n", + "\n" + ] + }, + { + "cell_type": "code", + "execution_count": 98, + "metadata": {}, + "outputs": [], + "source": [ + "solution = [expr(\"Remove(Flat, Axle)\"),\n", + " expr(\"Remove(Spare, Trunk)\"),\n", + " expr(\"PutOn(Spare, Axle)\")]\n", + "\n", + "for action in solution:\n", + " spareTire.act(action)" + ] + }, + { + "cell_type": "code", + "execution_count": 99, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "True\n" + ] + } + ], + "source": [ + "print(spareTire.goal_test())" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This is a valid solution.\n", + "
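The same plan can be checked by hand with a tiny add/delete-list simulation over a set of ground facts. This is only a hedged sketch of the STRIPS-style bookkeeping (the `apply_action` helper is hypothetical, not the module's `PlanningProblem` class), and it ignores negative preconditions for brevity.

```python
# Simulate the spare-tire plan: each step checks its positive preconditions,
# then removes the delete list and adds the add list.

def apply_action(state, precond, add, delete=()):
    missing = set(precond) - state
    if missing:
        raise ValueError('preconditions not satisfied: {}'.format(missing))
    return (state - set(delete)) | set(add)

state = {'At(Flat, Axle)', 'At(Spare, Trunk)'}

state = apply_action(state, ['At(Flat, Axle)'],
                     add=['At(Flat, Ground)'], delete=['At(Flat, Axle)'])     # Remove(Flat, Axle)
state = apply_action(state, ['At(Spare, Trunk)'],
                     add=['At(Spare, Ground)'], delete=['At(Spare, Trunk)'])  # Remove(Spare, Trunk)
state = apply_action(state, ['At(Spare, Ground)'],
                     add=['At(Spare, Axle)'], delete=['At(Spare, Ground)'])   # PutOn(Spare, Axle)

goal_reached = {'At(Spare, Axle)', 'At(Flat, Ground)'} <= state
```

After the three steps the flat tire sits on the ground and the spare sits on the axle, so the goal set is a subset of the final state.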
    \n", + "Another possible solution is" + ] + }, + { + "cell_type": "code", + "execution_count": 100, + "metadata": {}, + "outputs": [], + "source": [ + "spareTire = spare_tire()\n", + "\n", + "solution = [expr('Remove(Spare, Trunk)'),\n", + " expr('Remove(Flat, Axle)'),\n", + " expr('PutOn(Spare, Axle)')]\n", + "\n", + "for action in solution:\n", + " spareTire.act(action)" + ] + }, + { + "cell_type": "code", + "execution_count": 101, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "True\n" + ] + } + ], + "source": [ + "print(spareTire.goal_test())" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice that both solutions work, which means that the problem can be solved irrespective of the order in which the `Remove` actions take place, as long as both `Remove` actions take place before the `PutOn` action." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We have successfully mounted a spare tire onto the axle." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Three Block Tower Problem" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This problem's domain consists of a set of cube-shaped blocks sitting on a table. \n", + "The blocks can be stacked, but only one block can fit directly on top of another.\n", + "A robot arm can pick up a block and move it to another position, either on the table or on top of another block. \n", + "The arm can pick up only one block at a time, so it cannot pick up a block that has another one on it. \n", + "The goal will always be to build one or more stacks of blocks. \n", + "In our case, we consider only three blocks.\n", + "The particular configuration we will use is called the Sussman anomaly after Prof. Gerry Sussman." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's take a look at the definition of `three_block_tower()` in the module." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 102, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def three_block_tower():\n",
    +       "    """\n",
    +       "    [Figure 10.3] THREE-BLOCK-TOWER\n",
    +       "\n",
    +       "    A blocks-world problem of stacking three blocks in a certain configuration,\n",
    +       "    also known as the Sussman Anomaly.\n",
    +       "\n",
    +       "    Example:\n",
    +       "    >>> from planning import *\n",
    +       "    >>> tbt = three_block_tower()\n",
    +       "    >>> tbt.goal_test()\n",
    +       "    False\n",
    +       "    >>> tbt.act(expr('MoveToTable(C, A)'))\n",
    +       "    >>> tbt.act(expr('Move(B, Table, C)'))\n",
    +       "    >>> tbt.goal_test()\n",
    +       "    False\n",
    +       "    >>> tbt.act(expr('Move(A, Table, B)'))\n",
    +       "    >>> tbt.goal_test()\n",
    +       "    True\n",
    +       "    >>>\n",
    +       "    """\n",
    +       "\n",
    +       "    return PlanningProblem(init='On(A, Table) & On(B, Table) & On(C, A) & Block(A) & Block(B) & Block(C) & Clear(B) & Clear(C)',\n",
    +       "                goals='On(A, B) & On(B, C)',\n",
    +       "                actions=[Action('Move(b, x, y)',\n",
    +       "                                precond='On(b, x) & Clear(b) & Clear(y) & Block(b) & Block(y)',\n",
    +       "                                effect='On(b, y) & Clear(x) & ~On(b, x) & ~Clear(y)'),\n",
    +       "                         Action('MoveToTable(b, x)',\n",
    +       "                                precond='On(b, x) & Clear(b) & Block(b)',\n",
    +       "                                effect='On(b, Table) & Clear(x) & ~On(b, x)')])\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(three_block_tower)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**On(b, x):** The block **'b'** is on **'x'**. **'x'** can be a table or a block.\n", + "\n", + "**~On(b, x):** The block **'b'** is _not_ on **'x'**. **'x'** can be a table or a block.\n", + "\n", + "**Block(b):** Declares **'b'** as a block.\n", + "\n", + "**Clear(x):** To indicate that there is nothing on **'x'** and it is free to be moved around.\n", + "\n", + "**~Clear(x):** To indicate that there is something on **'x'** and it cannot be moved.\n", + " \n", + " Let us now define an object of `three_block_tower` problem:" + ] + }, + { + "cell_type": "code", + "execution_count": 103, + "metadata": {}, + "outputs": [], + "source": [ + "threeBlockTower = three_block_tower()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Before taking any actions, we will check if `threeBlockTower` has reached its goal:" + ] + }, + { + "cell_type": "code", + "execution_count": 104, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "False\n" + ] + } + ], + "source": [ + "print(threeBlockTower.goal_test())" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As we can see, it hasn't completed the goal. \n", + "We now define a sequence of actions that can stack three blocks in the required order. 
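It may help to see why the Sussman anomaly needs exactly three moves. The following breadth-first search over block configurations is a standalone illustrative sketch, independent of the planning module: a state is a frozenset of `(block, support)` pairs, and `MoveToTable` is folded into a single `Move` with `Table` as the destination.

```python
from collections import deque

BLOCKS = ('A', 'B', 'C')

def clear(state, x):
    """A block is clear when nothing rests on it; the table is always clear."""
    return x == 'Table' or all(support != x for _, support in state)

def successors(state):
    """Yield (action, next_state) for every legal move in this configuration."""
    on = dict(state)
    for b in BLOCKS:
        if not clear(state, b):
            continue
        for dest in BLOCKS + ('Table',):
            if dest not in (b, on[b]) and clear(state, dest):
                yield ('Move({}, {}, {})'.format(b, on[b], dest),
                       frozenset(state - {(b, on[b])} | {(b, dest)}))

def bfs_plan(start, goal):
    """Breadth-first search returns a shortest action sequence reaching goal."""
    frontier, seen = deque([(start, [])]), {start}
    while frontier:
        state, plan = frontier.popleft()
        if goal <= state:
            return plan
        for action, nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, plan + [action]))

# Sussman anomaly: C sits on A; A and B are on the table. Goal: A on B on C.
start = frozenset({('A', 'Table'), ('B', 'Table'), ('C', 'A')})
goal = frozenset({('A', 'B'), ('B', 'C')})
plan = bfs_plan(start, goal)
```

The shortest plan is unique: C must first be cleared off A onto the table (putting it on B would block B), then B goes on C, then A on B. No two-move plan exists, because stacking B on C first leaves C on top of A, so A can never be picked up.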
\n", + "The actions are then carried out on the `threeBlockTower` PlanningProblem.\n", + "\n", + "The actions available to us are the following: MoveToTable, Move\n", + "\n", + "**MoveToTable(b, x):** Move box **'b'** stacked on **'x'** to the table, given that box **'b'** is clear.\n", + "\n", + "**Move(b, x, y):** Move box **'b'** stacked on **'x'** to the top of **'y'**, given that both **'b'** and **'y'** are clear.\n" + ] + }, + { + "cell_type": "code", + "execution_count": 105, + "metadata": {}, + "outputs": [], + "source": [ + "solution = [expr(\"MoveToTable(C, A)\"),\n", + "            expr(\"Move(B, Table, C)\"),\n", + "            expr(\"Move(A, Table, B)\")]\n", + "\n", + "for action in solution:\n", + "    threeBlockTower.act(action)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As the `threeBlockTower` has taken all the steps it needed in order to achieve the goal, we can now check if it has achieved its goal." + ] + }, + { + "cell_type": "code", + "execution_count": 106, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "True\n" + ] + } + ], + "source": [ + "print(threeBlockTower.goal_test())" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "It has now successfully achieved its goal, i.e., to build a stack of three blocks in the specified order." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The `three_block_tower` problem can also be defined in simpler terms using just two actions, `ToTable(x, y)` and `FromTable(x, y)`.\n", + "The underlying problem remains the same, however: stacking up three blocks in a certain configuration given a particular starting state.\n", + "Let's have a look at the alternative definition." + ] + }, + { + "cell_type": "code", + "execution_count": 107, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def simple_blocks_world():\n",
    +       "    """\n",
    +       "    SIMPLE-BLOCKS-WORLD\n",
    +       "\n",
    +       "    A simplified definition of the Sussman Anomaly problem.\n",
    +       "\n",
    +       "    Example:\n",
    +       "    >>> from planning import *\n",
    +       "    >>> sbw = simple_blocks_world()\n",
    +       "    >>> sbw.goal_test()\n",
    +       "    False\n",
    +       "    >>> sbw.act(expr('ToTable(A, B)'))\n",
    +       "    >>> sbw.act(expr('FromTable(B, A)'))\n",
    +       "    >>> sbw.goal_test()\n",
    +       "    False\n",
    +       "    >>> sbw.act(expr('FromTable(C, B)'))\n",
    +       "    >>> sbw.goal_test()\n",
    +       "    True\n",
    +       "    >>>\n",
    +       "    """\n",
    +       "\n",
    +       "    return PlanningProblem(init='On(A, B) & Clear(A) & OnTable(B) & OnTable(C) & Clear(C)',\n",
    +       "                goals='On(B, A) & On(C, B)',\n",
    +       "                actions=[Action('ToTable(x, y)',\n",
    +       "                                precond='On(x, y) & Clear(x)',\n",
    +       "                                effect='~On(x, y) & Clear(y) & OnTable(x)'),\n",
    +       "                         Action('FromTable(y, x)',\n",
    +       "                                precond='OnTable(y) & Clear(y) & Clear(x)',\n",
    +       "                                effect='~OnTable(y) & ~Clear(x) & On(y, x)')])\n",
    +       "
\n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(simple_blocks_world)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**On(x, y):** The block **'x'** is on **'y'**. Both **'x'** and **'y'** have to be blocks.\n", + "\n", + "**~On(x, y):** The block **'x'** is _not_ on **'y'**. Both **'x'** and **'y'** have to be blocks.\n", + "\n", + "**OnTable(x):** The block **'x'** is on the table.\n", + "\n", + "**~OnTable(x):** The block **'x'** is _not_ on the table.\n", + "\n", + "**Clear(x):** To indicate that there is nothing on **'x'** and it is free to be moved around.\n", + "\n", + "**~Clear(x):** To indicate that there is something on **'x'** and it cannot be moved.\n", + "\n", + "Let's now define a `simple_blocks_world` problem." + ] + }, + { + "cell_type": "code", + "execution_count": 108, + "metadata": {}, + "outputs": [], + "source": [ + "simpleBlocksWorld = simple_blocks_world()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Before taking any actions, we will check if `simpleBlocksWorld` has reached its goal." + ] + }, + { + "cell_type": "code", + "execution_count": 109, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "False" + ] + }, + "execution_count": 109, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "simpleBlocksWorld.goal_test()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As we can see, it hasn't completed the goal. \n", + "We now define a sequence of actions that can stack three blocks in the required order. 
\n", + "The actions are then carried out on the `simpleBlocksWorld` PlanningProblem.\n", + "\n", + "The actions available to us are the following: ToTable, FromTable\n", + "\n", + "**ToTable(x, y):** Move box **'x'** stacked on **'y'** to the table, given that box **'x'** is clear.\n", + "\n", + "**FromTable(x, y):** Move box **'x'** from the table to the top of **'y'**, given that both **'x'** and **'y'** are clear.\n" + ] + }, + { + "cell_type": "code", + "execution_count": 110, + "metadata": {}, + "outputs": [], + "source": [ + "solution = [expr('ToTable(A, B)'),\n", + "            expr('FromTable(B, A)'),\n", + "            expr('FromTable(C, B)')]\n", + "\n", + "for action in solution:\n", + "    simpleBlocksWorld.act(action)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As the `simpleBlocksWorld` has taken all the steps it needed in order to achieve the goal, we can now check if it has achieved its goal." + ] + }, + { + "cell_type": "code", + "execution_count": 111, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "True\n" + ] + } + ], + "source": [ + "print(simpleBlocksWorld.goal_test())" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "It has now successfully achieved its goal, i.e., to build a stack of three blocks in the specified order." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Shopping Problem" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This problem requires us to acquire a carton of milk, a banana and a drill.\n", + "Initially, we start from home and it is known to us that milk and bananas are available in the supermarket and the hardware store sells drills.\n", + "Let's take a look at the definition of the `shopping_problem` in the module."
+ ] + }, + { + "cell_type": "code", + "execution_count": 112, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def shopping_problem():\n",
    +       "    """\n",
    +       "    SHOPPING-PROBLEM\n",
    +       "\n",
    +       "    A problem of acquiring some items given their availability at certain stores.\n",
    +       "\n",
    +       "    Example:\n",
    +       "    >>> from planning import *\n",
    +       "    >>> sp = shopping_problem()\n",
    +       "    >>> sp.goal_test()\n",
    +       "    False\n",
    +       "    >>> sp.act(expr('Go(Home, HW)'))\n",
    +       "    >>> sp.act(expr('Buy(Drill, HW)'))\n",
    +       "    >>> sp.act(expr('Go(HW, SM)'))\n",
    +       "    >>> sp.act(expr('Buy(Banana, SM)'))\n",
    +       "    >>> sp.goal_test()\n",
    +       "    False\n",
    +       "    >>> sp.act(expr('Buy(Milk, SM)'))\n",
    +       "    >>> sp.goal_test()\n",
    +       "    True\n",
    +       "    >>>\n",
    +       "    """\n",
    +       "\n",
    +       "    return PlanningProblem(init='At(Home) & Sells(SM, Milk) & Sells(SM, Banana) & Sells(HW, Drill)',\n",
    +       "                goals='Have(Milk) & Have(Banana) & Have(Drill)', \n",
    +       "                actions=[Action('Buy(x, store)',\n",
    +       "                                precond='At(store) & Sells(store, x)',\n",
    +       "                                effect='Have(x)'),\n",
    +       "                         Action('Go(x, y)',\n",
    +       "                                precond='At(x)',\n",
    +       "                                effect='At(y) & ~At(x)')])\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(shopping_problem)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**At(x):** Indicates that we are currently at **'x'** where **'x'** can be Home, SM (supermarket) or HW (hardware store).\n", + "\n", + "**~At(x):** Indicates that we are currently _not_ at **'x'**.\n", + "\n", + "**Sells(s, x):** Indicates that item **'x'** can be bought from store **'s'**.\n", + "\n", + "**Have(x):** Indicates that we possess the item **'x'**." + ] + }, + { + "cell_type": "code", + "execution_count": 113, + "metadata": {}, + "outputs": [], + "source": [ + "shoppingProblem = shopping_problem()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's first check whether the goal state Have(Milk), Have(Banana), Have(Drill) is reached or not." + ] + }, + { + "cell_type": "code", + "execution_count": 114, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "False\n" + ] + } + ], + "source": [ + "print(shoppingProblem.goal_test())" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's look at the possible actions.\n", + "\n", + "**Buy(x, store):** Buy an item **'x'** from a **'store'** given that the **'store'** sells **'x'**.\n", + "\n", + "**Go(x, y):** Go to destination **'y'** starting from source **'x'**." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We now define a valid solution that will help us reach the goal.\n", + "The sequence of actions will then be carried out on the `shoppingProblem` PlanningProblem." 
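Such a shopping plan can also be sketched outside the module with sets of ground fluents, instantiating `Go` and `Buy` by hand. This is a hedged, simplified model: the tuples below are hand-ground assumptions, not the module's `Action` schemas.

```python
# Set-based sketch of the shopping plan: each action is
# (preconditions, add-list, delete-list) over fluent strings.
def apply(state, precond, add, delete):
    assert precond <= state, "pre-conditions not satisfied"
    return (state - delete) | add

state = {'At(Home)', 'Sells(SM,Milk)', 'Sells(SM,Banana)', 'Sells(HW,Drill)'}
plan = [
    ({'At(Home)'}, {'At(SM)'}, {'At(Home)'}),                    # Go(Home, SM)
    ({'At(SM)', 'Sells(SM,Milk)'}, {'Have(Milk)'}, set()),       # Buy(Milk, SM)
    ({'At(SM)', 'Sells(SM,Banana)'}, {'Have(Banana)'}, set()),   # Buy(Banana, SM)
    ({'At(SM)'}, {'At(HW)'}, {'At(SM)'}),                        # Go(SM, HW)
    ({'At(HW)', 'Sells(HW,Drill)'}, {'Have(Drill)'}, set()),     # Buy(Drill, HW)
]
for precond, add, delete in plan:
    state = apply(state, precond, add, delete)

print({'Have(Milk)', 'Have(Banana)', 'Have(Drill)'} <= state)  # True
```

Note that `Buy` has an empty delete-list, while `Go` both adds the new location and removes the old one, matching the `At(y) & ~At(x)` effect in the definition above.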
+ ] + }, + { + "cell_type": "code", + "execution_count": 115, + "metadata": {}, + "outputs": [], + "source": [ + "solution = [expr('Go(Home, SM)'),\n", + " expr('Buy(Milk, SM)'),\n", + " expr('Buy(Banana, SM)'),\n", + " expr('Go(SM, HW)'),\n", + " expr('Buy(Drill, HW)')]\n", + "\n", + "for action in solution:\n", + " shoppingProblem.act(action)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We have taken the steps required to acquire all the stuff we need. \n", + "Let's see if we have reached our goal." + ] + }, + { + "cell_type": "code", + "execution_count": 116, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "True" + ] + }, + "execution_count": 116, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "shoppingProblem.goal_test()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "It has now successfully achieved the goal." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Socks and Shoes" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This is a simple problem of putting on a pair of socks and shoes.\n", + "The problem is defined in the module as given below." + ] + }, + { + "cell_type": "code", + "execution_count": 117, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def socks_and_shoes():\n",
    +       "    """\n",
    +       "    SOCKS-AND-SHOES-PROBLEM\n",
    +       "\n",
    +       "    A task of wearing socks and shoes on both feet\n",
    +       "\n",
    +       "    Example:\n",
    +       "    >>> from planning import *\n",
    +       "    >>> ss = socks_and_shoes()\n",
    +       "    >>> ss.goal_test()\n",
    +       "    False\n",
    +       "    >>> ss.act(expr('RightSock'))\n",
    +       "    >>> ss.act(expr('RightShoe'))\n",
    +       "    >>> ss.act(expr('LeftSock'))\n",
    +       "    >>> ss.goal_test()\n",
    +       "    False\n",
    +       "    >>> ss.act(expr('LeftShoe'))\n",
    +       "    >>> ss.goal_test()\n",
    +       "    True\n",
    +       "    >>>\n",
    +       "    """\n",
    +       "\n",
    +       "    return PlanningProblem(init='',\n",
    +       "                goals='RightShoeOn & LeftShoeOn',\n",
    +       "                actions=[Action('RightShoe',\n",
    +       "                                precond='RightSockOn',\n",
    +       "                                effect='RightShoeOn'),\n",
    +       "                        Action('RightSock',\n",
    +       "                                precond='',\n",
    +       "                                effect='RightSockOn'),\n",
    +       "                        Action('LeftShoe',\n",
    +       "                                precond='LeftSockOn',\n",
    +       "                                effect='LeftShoeOn'),\n",
    +       "                        Action('LeftSock',\n",
    +       "                                precond='',\n",
    +       "                                effect='LeftSockOn')])\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(socks_and_shoes)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**LeftSockOn:** Indicates that we have already put on the left sock.\n", + "\n", + "**RightSockOn:** Indicates that we have already put on the right sock.\n", + "\n", + "**LeftShoeOn:** Indicates that we have already put on the left shoe.\n", + "\n", + "**RightShoeOn:** Indicates that we have already put on the right shoe.\n" + ] + }, + { + "cell_type": "code", + "execution_count": 118, + "metadata": {}, + "outputs": [], + "source": [ + "socksShoes = socks_and_shoes()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's first check whether the goal state is reached or not." + ] + }, + { + "cell_type": "code", + "execution_count": 119, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "False" + ] + }, + "execution_count": 119, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "socksShoes.goal_test()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As the goal state isn't reached, we will define a sequence of actions that might help us achieve the goal.\n", + "These actions will then be applied to the `socksShoes` PlanningProblem to check if the goal state is reached." 
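Before picking one order, note that the only constraints here are that each sock precedes its shoe; several total orders therefore solve the problem. A small set-based sketch (a simplified model, not the module's code) can count them by simulation:

```python
from itertools import permutations

# Each action: (preconditions, add-list). Socks have no preconditions;
# each shoe requires its sock to be on already.
actions = {
    'RightSock': (set(), {'RightSockOn'}),
    'RightShoe': ({'RightSockOn'}, {'RightShoeOn'}),
    'LeftSock':  (set(), {'LeftSockOn'}),
    'LeftShoe':  ({'LeftSockOn'}, {'LeftShoeOn'}),
}
goal = {'RightShoeOn', 'LeftShoeOn'}

def valid(order):
    """Simulate an order of all four actions; True iff it reaches the goal."""
    state = set()
    for name in order:
        precond, add = actions[name]
        if not precond <= state:
            return False
        state |= add
    return goal <= state

# Of the 24 total orders, exactly 6 respect both sock-before-shoe constraints.
print(sum(valid(p) for p in permutations(actions)))  # 6
```

This is the classic example of a partially ordered plan: any of those 6 linearizations would satisfy `socksShoes.goal_test()`.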
+ ] + }, + { + "cell_type": "code", + "execution_count": 120, + "metadata": {}, + "outputs": [], + "source": [ + "solution = [expr('RightSock'),\n", + " expr('RightShoe'),\n", + " expr('LeftSock'),\n", + " expr('LeftShoe')]" + ] + }, + { + "cell_type": "code", + "execution_count": 121, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "True" + ] + }, + "execution_count": 121, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "for action in solution:\n", + " socksShoes.act(action)\n", + " \n", + "socksShoes.goal_test()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We have reached our goal." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Cake Problem" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This problem requires us to reach the state of having a cake and having eaten a cake simultaneously, given a single cake.\n", + "Let's first take a look at the definition of the `have_cake_and_eat_cake_too` problem in the module." 

    \n", + "\n", + "
    def have_cake_and_eat_cake_too():\n",
    +       "    """\n",
    +       "    [Figure 10.7] CAKE-PROBLEM\n",
    +       "\n",
    +       "    A problem where we begin with a cake and want to \n",
    +       "    reach the state of having a cake and having eaten a cake.\n",
    +       "    The possible actions include baking a cake and eating a cake.\n",
    +       "\n",
    +       "    Example:\n",
    +       "    >>> from planning import *\n",
    +       "    >>> cp = have_cake_and_eat_cake_too()\n",
    +       "    >>> cp.goal_test()\n",
    +       "    False\n",
    +       "    >>> cp.act(expr('Eat(Cake)'))\n",
    +       "    >>> cp.goal_test()\n",
    +       "    False\n",
    +       "    >>> cp.act(expr('Bake(Cake)'))\n",
    +       "    >>> cp.goal_test()\n",
    +       "    True\n",
    +       "    >>>\n",
    +       "    """\n",
    +       "\n",
    +       "    return PlanningProblem(init='Have(Cake)',\n",
    +       "                goals='Have(Cake) & Eaten(Cake)',\n",
    +       "                actions=[Action('Eat(Cake)',\n",
    +       "                                precond='Have(Cake)',\n",
    +       "                                effect='Eaten(Cake) & ~Have(Cake)'),\n",
    +       "                         Action('Bake(Cake)',\n",
    +       "                                precond='~Have(Cake)',\n",
    +       "                                effect='Have(Cake)')])\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(have_cake_and_eat_cake_too)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Since this problem doesn't involve variables, states can be considered similar to symbols in propositional logic.\n", + "\n", + "**Have(Cake):** Declares that we have a **'Cake'**.\n", + "\n", + "**~Have(Cake):** Declares that we _don't_ have a **'Cake'**." + ] + }, + { + "cell_type": "code", + "execution_count": 123, + "metadata": {}, + "outputs": [], + "source": [ + "cakeProblem = have_cake_and_eat_cake_too()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "First, let us check whether the goal conditions 'Have(Cake)' and 'Eaten(Cake)' are reached or not." + ] + }, + { + "cell_type": "code", + "execution_count": 124, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "False\n" + ] + } + ], + "source": [ + "print(cakeProblem.goal_test())" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let us look at the possible actions.\n", + "\n", + "**Bake(x):** To bake **'x'**.\n", + "\n", + "**Eat(x):** To eat **'x'**." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We now define a valid solution that can help us reach the goal.\n", + "The sequence of actions will then be applied to the `cakeProblem` PlanningProblem." + ] + }, + { + "cell_type": "code", + "execution_count": 125, + "metadata": {}, + "outputs": [], + "source": [ + "solution = [expr(\"Eat(Cake)\"),\n", + " expr(\"Bake(Cake)\")]\n", + "\n", + "for action in solution:\n", + " cakeProblem.act(action)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We have now carried out the actions to eat the cake and then bake a new one. Let us check if we have reached the goal." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 126, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "True\n" + ] + } + ], + "source": [ + "print(cakeProblem.goal_test())" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "It has now successfully achieved its goal i.e, to have and eat the cake." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "One might wonder if the order of the actions matters for this problem.\n", + "Let's see for ourselves." + ] + }, + { + "cell_type": "code", + "execution_count": 128, + "metadata": {}, + "outputs": [ + { + "ename": "Exception", + "evalue": "Action 'Bake(Cake)' pre-conditions not satisfied", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mException\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 5\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 6\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0maction\u001b[0m \u001b[0;32min\u001b[0m \u001b[0msolution\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 7\u001b[0;31m \u001b[0mcakeProblem\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mact\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0maction\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m", + "\u001b[0;32m~/aima-python/planning.py\u001b[0m in \u001b[0;36mact\u001b[0;34m(self, action)\u001b[0m\n\u001b[1;32m 58\u001b[0m \u001b[0;32mraise\u001b[0m \u001b[0mException\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m\"Action '{}' not found\"\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mformat\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0maction_name\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 59\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0;32mnot\u001b[0m 
\u001b[0mlist_action\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcheck_precond\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0minit\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m---> 60\u001b[0;31m \u001b[0;32mraise\u001b[0m \u001b[0mException\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m\"Action '{}' pre-conditions not satisfied\"\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mformat\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0maction\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 61\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0minit\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mlist_action\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0minit\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mclauses\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 62\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mException\u001b[0m: Action 'Bake(Cake)' pre-conditions not satisfied" + ] + } + ], + "source": [ + "cakeProblem = have_cake_and_eat_cake_too()\n", + "\n", + "solution = [expr('Bake(Cake)'),\n", + " expr('Eat(Cake)')]\n", + "\n", + "for action in solution:\n", + " cakeProblem.act(action)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "It raises an exception.\n", + "Indeed, according to the problem, we cannot bake a cake if we already have one.\n", + "In planning terms, '~Have(Cake)' is a precondition to the action 'Bake(Cake)'.\n", + "Hence, this solution is invalid." 
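The ordering failure can be reproduced outside the module with a set-based sketch that supports negative preconditions (a simplified, hypothetical model; the real check is `Action.check_precond` on `Expr` clauses):

```python
# Sketch of why ordering matters: Bake(Cake) carries the negative
# precondition ~Have(Cake), so running it before Eat(Cake) must fail.
def act(state, precond_pos, precond_neg, add, delete):
    """Positive preconditions must hold; negative ones must not."""
    if not (precond_pos <= state) or (precond_neg & state):
        raise Exception("pre-conditions not satisfied")
    return (state - delete) | add

# (precond_pos, precond_neg, add, delete) for each ground action.
EAT = ({'Have(Cake)'}, set(), {'Eaten(Cake)'}, {'Have(Cake)'})
BAKE = (set(), {'Have(Cake)'}, {'Have(Cake)'}, set())

# Valid order: Eat first, then Bake a new cake.
state = {'Have(Cake)'}
state = act(state, *EAT)
state = act(state, *BAKE)
print(state)  # contains both Have(Cake) and Eaten(Cake)

# Invalid order: Bake first fails because Have(Cake) already holds.
try:
    act({'Have(Cake)'}, *BAKE)
except Exception as e:
    print(e)  # pre-conditions not satisfied
```

The `try` branch corresponds exactly to the exception raised by `cakeProblem.act(expr('Bake(Cake)'))` above.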
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## PLANNING IN THE REAL WORLD\n", + "---\n", + "## PROBLEM\n", + "The `Problem` class is a wrapper for `PlanningProblem` with some additional functionality and data structures to handle real-world planning problems that involve time and resource constraints.\n", + "The `Problem` class includes everything that the `PlanningProblem` class includes.\n", + "Additionally, it includes the following attributes essential to define a real-world planning problem:\n", + "- a list of `jobs` to be done\n", + "- a dictionary of `resources`\n", + "\n", + "It also overloads the `act` method to call the `do_action` method of the `HLA` class, \n", + "and includes a new method `refinements` that finds refinements or primitive actions for high-level actions.\n", + "
    \n", + "`hierarchical_search` and `angelic_search` are also built into the `Problem` class to solve such planning problems." + ] + }, + { + "cell_type": "code", + "execution_count": 129, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    class Problem(PlanningProblem):\n",
    +       "    """\n",
    +       "    Define real-world problems by aggregating resources as numerical quantities instead of\n",
    +       "    named entities.\n",
    +       "\n",
    +       "    This class is identical to PDLL, except that it overloads the act function to handle\n",
    +       "    resource and ordering conditions imposed by HLA as opposed to Action.\n",
    +       "    """\n",
    +       "    def __init__(self, init, goals, actions, jobs=None, resources=None):\n",
    +       "        super().__init__(init, goals, actions)\n",
    +       "        self.jobs = jobs\n",
    +       "        self.resources = resources or {}\n",
    +       "\n",
    +       "    def act(self, action):\n",
    +       "        """\n",
    +       "        Performs the HLA given as argument.\n",
    +       "\n",
    +       "        Note that this is different from the superclass action - where the parameter was an\n",
    +       "        Expression. For real world problems, an Expr object isn't enough to capture all the\n",
    +       "        detail required for executing the action - resources, preconditions, etc need to be\n",
    +       "        checked for too.\n",
    +       "        """\n",
    +       "        args = action.args\n",
    +       "        list_action = first(a for a in self.actions if a.name == action.name)\n",
    +       "        if list_action is None:\n",
    +       "            raise Exception("Action '{}' not found".format(action.name))\n",
    +       "        self.init = list_action.do_action(self.jobs, self.resources, self.init, args).clauses\n",
    +       "\n",
    +       "    def refinements(hla, state, library):  # refinements may be (multiple) HLA themselves ...\n",
    +       "        """\n",
    +       "        state is a Problem, containing the current state kb\n",
    +       "        library is a dictionary containing details for every possible refinement. eg:\n",
    +       "        {\n",
    +       "        'HLA': [\n",
    +       "            'Go(Home, SFO)',\n",
    +       "            'Go(Home, SFO)',\n",
    +       "            'Drive(Home, SFOLongTermParking)',\n",
    +       "            'Shuttle(SFOLongTermParking, SFO)',\n",
    +       "            'Taxi(Home, SFO)'\n",
    +       "            ],\n",
    +       "        'steps': [\n",
    +       "            ['Drive(Home, SFOLongTermParking)', 'Shuttle(SFOLongTermParking, SFO)'],\n",
    +       "            ['Taxi(Home, SFO)'],\n",
    +       "            [],\n",
    +       "            [],\n",
    +       "            []\n",
    +       "            ],\n",
    +       "        # empty refinements indicate a primitive action\n",
    +       "        'precond': [\n",
    +       "            ['At(Home) & Have(Car)'],\n",
    +       "            ['At(Home)'],\n",
    +       "            ['At(Home) & Have(Car)'],\n",
    +       "            ['At(SFOLongTermParking)'],\n",
    +       "            ['At(Home)']\n",
    +       "            ],\n",
    +       "        'effect': [\n",
    +       "            ['At(SFO) & ~At(Home)'],\n",
    +       "            ['At(SFO) & ~At(Home)'],\n",
    +       "            ['At(SFOLongTermParking) & ~At(Home)'],\n",
    +       "            ['At(SFO) & ~At(SFOLongTermParking)'],\n",
    +       "            ['At(SFO) & ~At(Home)']\n",
    +       "            ]\n",
    +       "        }\n",
    +       "        """\n",
    +       "        e = Expr(hla.name, hla.args)\n",
    +       "        indices = [i for i, x in enumerate(library['HLA']) if expr(x).op == hla.name]\n",
    +       "        for i in indices:\n",
    +       "            actions = []\n",
    +       "            for j in range(len(library['steps'][i])):\n",
    +       "                # find the index of the step [j]  of the HLA \n",
    +       "                index_step = [k for k,x in enumerate(library['HLA']) if x == library['steps'][i][j]][0]\n",
    +       "                precond = library['precond'][index_step][0] # preconditions of step [j]\n",
    +       "                effect = library['effect'][index_step][0] # effect of step [j]\n",
    +       "                actions.append(HLA(library['steps'][i][j], precond, effect))\n",
    +       "            yield actions\n",
    +       "\n",
    +       "    def hierarchical_search(problem, hierarchy):\n",
    +       "        """\n",
    +       "        [Figure 11.5] 'Hierarchical Search, a Breadth First Search implementation of Hierarchical\n",
    +       "        Forward Planning Search'\n",
    +       "        The problem is a real-world problem defined by the problem class, and the hierarchy is\n",
    +       "        a dictionary of HLA - refinements (see refinements generator for details)\n",
    +       "        """\n",
    +       "        act = Node(problem.actions[0])\n",
    +       "        frontier = deque()\n",
    +       "        frontier.append(act)\n",
    +       "        while True:\n",
    +       "            if not frontier:\n",
    +       "                return None\n",
    +       "            plan = frontier.popleft()\n",
    +       "            print(plan.state.name)\n",
    +       "            hla = plan.state  # first_or_null(plan)\n",
    +       "            prefix = None\n",
    +       "            if plan.parent:\n",
    +       "                prefix = plan.parent.state.action  # prefix, suffix = subseq(plan.state, hla)\n",
    +       "            outcome = Problem.result(problem, prefix)\n",
    +       "            if hla is None:\n",
    +       "                if outcome.goal_test():\n",
    +       "                    return plan.path()\n",
    +       "            else:\n",
    +       "                print("else")\n",
    +       "                for sequence in Problem.refinements(hla, outcome, hierarchy):\n",
    +       "                    print("...")\n",
    +       "                    frontier.append(Node(plan.state, plan.parent, sequence))\n",
    +       "\n",
    +       "    def result(state, actions):\n",
    +       "        """The outcome of applying an action to the current problem"""\n",
    +       "        for a in actions: \n",
    +       "            if a.check_precond(state, a.args):\n",
    +       "                state = a(state, a.args).clauses\n",
    +       "        return state\n",
    +       "    \n",
    +       "\n",
    +       "    def angelic_search(problem, hierarchy, initialPlan):\n",
    +       "        """\n",
    +       "\t[Figure 11.8] A hierarchical planning algorithm that uses angelic semantics to identify and\n",
    +       "\tcommit to high-level plans that work while avoiding high-level plans that don’t. \n",
    +       "\tThe predicate MAKING-PROGRESS checks to make sure that we aren’t stuck in an infinite regression\n",
    +       "\tof refinements. \n",
    +       "\tAt top level, call ANGELIC -SEARCH with [Act ] as the initialPlan .\n",
    +       "\n",
    +       "        initialPlan contains a sequence of HLA's with angelic semantics \n",
    +       "\n",
    +       "        The possible effects of an angelic HLA in initialPlan are : \n",
    +       "        ~ : effect remove\n",
    +       "        $+: effect possibly add\n",
    +       "        $-: effect possibly remove\n",
    +       "        $$: possibly add or remove\n",
    +       "\t"""\n",
    +       "        frontier = deque(initialPlan)\n",
    +       "        while True: \n",
    +       "            if not frontier:\n",
    +       "                return None\n",
    +       "            plan = frontier.popleft() # sequence of HLA/Angelic HLA's \n",
    +       "            opt_reachable_set = Problem.reach_opt(problem.init, plan)\n",
    +       "            pes_reachable_set = Problem.reach_pes(problem.init, plan)\n",
    +       "            if problem.intersects_goal(opt_reachable_set): \n",
    +       "                if Problem.is_primitive( plan, hierarchy ): \n",
    +       "                    return ([x for x in plan.action])\n",
    +       "                guaranteed = problem.intersects_goal(pes_reachable_set) \n",
    +       "                if guaranteed and Problem.making_progress(plan, plan):\n",
    +       "                    final_state = guaranteed[0] # any element of guaranteed \n",
    +       "                    #print('decompose')\n",
    +       "                    return Problem.decompose(hierarchy, problem, plan, final_state, pes_reachable_set)\n",
    +       "                (hla, index) = Problem.find_hla(plan, hierarchy) # there should be at least one HLA/Angelic_HLA, otherwise plan would be primitive.\n",
    +       "                prefix = plan.action[:index-1]\n",
    +       "                suffix = plan.action[index+1:]\n",
    +       "                outcome = Problem(Problem.result(problem.init, prefix), problem.goals , problem.actions )\n",
    +       "                for sequence in Problem.refinements(hla, outcome, hierarchy): # find refinements\n",
    +       "                    frontier.append(Angelic_Node(outcome.init, plan, prefix + sequence+ suffix, prefix+sequence+suffix))\n",
    +       "\n",
    +       "\n",
    +       "    def intersects_goal(problem, reachable_set):\n",
    +       "        """\n",
    +       "        Find the intersection of the reachable states and the goal\n",
    +       "        """\n",
    +       "        return [y for x in list(reachable_set.keys()) for y in reachable_set[x] if all(goal in y for goal in problem.goals)] \n",
    +       "\n",
    +       "\n",
    +       "    def is_primitive(plan,  library):\n",
    +       "        """\n",
    +       "        checks if the hla is primitive action \n",
    +       "        """\n",
    +       "        for hla in plan.action: \n",
    +       "            indices = [i for i, x in enumerate(library['HLA']) if expr(x).op == hla.name]\n",
    +       "            for i in indices:\n",
    +       "                if library["steps"][i]: \n",
    +       "                    return False\n",
    +       "        return True\n",
    +       "             \n",
    +       "\n",
    +       "\n",
    +       "    def reach_opt(init, plan): \n",
    +       "        """\n",
    +       "        Finds the optimistic reachable set of the sequence of actions in plan \n",
    +       "        """\n",
    +       "        reachable_set = {0: [init]}\n",
    +       "        optimistic_description = plan.action #list of angelic actions with optimistic description\n",
    +       "        return Problem.find_reachable_set(reachable_set, optimistic_description)\n",
    +       " \n",
    +       "\n",
    +       "    def reach_pes(init, plan): \n",
    +       "        """ \n",
    +       "        Finds the pessimistic reachable set of the sequence of actions in plan\n",
    +       "        """\n",
    +       "        reachable_set = {0: [init]}\n",
    +       "        pessimistic_description = plan.action_pes # list of angelic actions with pessimistic description\n",
    +       "        return Problem.find_reachable_set(reachable_set, pessimistic_description)\n",
    +       "\n",
    +       "    def find_reachable_set(reachable_set, action_description):\n",
    +       "        """\n",
    +       "\tFinds the reachable states of the action_description when applied in each state of reachable set.\n",
    +       "\t"""\n",
    +       "        for i in range(len(action_description)):\n",
    +       "            reachable_set[i+1]=[]\n",
    +       "            if type(action_description[i]) is Angelic_HLA:\n",
    +       "                possible_actions = action_description[i].angelic_action()\n",
    +       "            else: \n",
    +       "                possible_actions = action_description\n",
    +       "            for action in possible_actions:\n",
    +       "                for state in reachable_set[i]:\n",
    +       "                    if action.check_precond(state , action.args) :\n",
    +       "                        if action.effect[0] :\n",
    +       "                            new_state = action(state, action.args).clauses\n",
    +       "                            reachable_set[i+1].append(new_state)\n",
    +       "                        else: \n",
    +       "                            reachable_set[i+1].append(state)\n",
    +       "        return reachable_set\n",
    +       "\n",
    +       "    def find_hla(plan, hierarchy):\n",
    +       "        """\n",
    +       "        Finds the the first HLA action in plan.action, which is not primitive\n",
    +       "        and its corresponding index in plan.action\n",
    +       "        """\n",
    +       "        hla = None\n",
    +       "        index = len(plan.action)\n",
    +       "        for i in range(len(plan.action)): # find the first HLA in plan, that is not primitive\n",
    +       "            if not Problem.is_primitive(Node(plan.state, plan.parent, [plan.action[i]]), hierarchy):\n",
    +       "                hla = plan.action[i] \n",
    +       "                index = i\n",
    +       "                break\n",
    +       "        return (hla, index)\n",
    +       "\t\n",
    +       "    def making_progress(plan, initialPlan):\n",
    +       "        """ \n",
    +       "        Not correct\n",
    +       "\n",
    +       "        Normally should from infinite regression of refinements \n",
    +       "        \n",
    +       "        Only case covered: when plan contains one action (then there is no regression to be done)  \n",
    +       "        """\n",
    +       "        if (len(plan.action)==1):\n",
    +       "            return False\n",
    +       "        return True \n",
    +       "\n",
    +       "    def decompose(hierarchy, s_0, plan, s_f, reachable_set):\n",
    +       "        solution = [] \n",
    +       "        while plan.action_pes: \n",
    +       "            action = plan.action_pes.pop()\n",
    +       "            i = max(reachable_set.keys())\n",
    +       "            if i == 0:\n",
    +       "                return solution\n",
    +       "            s_i = Problem.find_previous_state(s_f, reachable_set, i, action)\n",
    +       "            problem = Problem(s_i, s_f, plan.action)\n",
    +       "            j = 0\n",
    +       "            for x in Problem.angelic_search(problem, hierarchy, [Angelic_Node(s_i, Node(None), [action], [action])]):\n",
    +       "                solution.insert(j, x)\n",
    +       "                j += 1\n",
    +       "            s_f = s_i\n",
    +       "        return solution\n",
    +       "\n",
    +       "\n",
    +       "    def find_previous_state(s_f, reachable_set, i, action):\n",
    +       "        """\n",
    +       "        Given a final state s_f and an action, finds a state s_i in reachable_set\n",
    +       "        such that applying the action to state s_i yields s_f.\n",
    +       "        """\n",
    +       "        s_i = reachable_set[i-1][0]\n",
    +       "        for state in reachable_set[i-1]:\n",
    +       "            if s_f in Problem.reach_pes(state, Angelic_Node(state, None, [action], [action]))[1]:\n",
    +       "                s_i = state\n",
    +       "                break\n",
    +       "        return s_i\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(Problem)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## HLA\n", + "To be able to model a real-world planning problem properly, it is essential to be able to represent a _high-level action (HLA)_ that can be hierarchically reduced to primitive actions." + ] + }, + { + "cell_type": "code", + "execution_count": 130, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    class HLA(Action):\n",
    +       "    """\n",
    +       "    Defines actions for the real world (that may be refined further) while\n",
    +       "    satisfying resource constraints.\n",
    +       "    """\n",
    +       "    unique_group = 1\n",
    +       "\n",
    +       "    def __init__(self, action, precond=None, effect=None, duration=0,\n",
    +       "                 consume=None, use=None):\n",
    +       "        """\n",
    +       "        Unlike regular actions, an HLA carries additional constraints:\n",
    +       "        duration holds the amount of time required to execute the task,\n",
    +       "        consume holds a dictionary of the resources the task consumes,\n",
    +       "        use holds a dictionary of the resources the task uses.\n",
    +       "        """\n",
    +       "        precond = precond or [None]\n",
    +       "        effect = effect or [None]\n",
    +       "        super().__init__(action, precond, effect)\n",
    +       "        self.duration = duration\n",
    +       "        self.consumes = consume or {}\n",
    +       "        self.uses = use or {}\n",
    +       "        self.completed = False\n",
    +       "        # self.priority = -1 #  must be assigned in relation to other HLAs\n",
    +       "        # self.job_group = -1 #  must be assigned in relation to other HLAs\n",
    +       "\n",
    +       "    def do_action(self, job_order, available_resources, kb, args):\n",
    +       "        """\n",
    +       "        An HLA-based version of act - along with updating the knowledge base, it handles\n",
    +       "        resource checks and ensures the actions are executed in the correct order.\n",
    +       "        """\n",
    +       "        # print(self.name)\n",
    +       "        if not self.has_usable_resource(available_resources):\n",
    +       "            raise Exception('Not enough usable resources to execute {}'.format(self.name))\n",
    +       "        if not self.has_consumable_resource(available_resources):\n",
    +       "            raise Exception('Not enough consumable resources to execute {}'.format(self.name))\n",
    +       "        if not self.inorder(job_order):\n",
    +       "            raise Exception("Can't execute {} - execute prerequisite actions first".\n",
    +       "                            format(self.name))\n",
    +       "        kb = super().act(kb, args)  # update knowledge base\n",
    +       "        for resource in self.consumes:  # remove consumed resources\n",
    +       "            available_resources[resource] -= self.consumes[resource]\n",
    +       "        self.completed = True  # set the task status to complete\n",
    +       "        return kb\n",
    +       "\n",
    +       "    def has_consumable_resource(self, available_resources):\n",
    +       "        """\n",
    +       "        Ensure there are enough consumable resources for this action to execute.\n",
    +       "        """\n",
    +       "        for resource in self.consumes:\n",
    +       "            if available_resources.get(resource) is None:\n",
    +       "                return False\n",
    +       "            if available_resources[resource] < self.consumes[resource]:\n",
    +       "                return False\n",
    +       "        return True\n",
    +       "\n",
    +       "    def has_usable_resource(self, available_resources):\n",
    +       "        """\n",
    +       "        Ensure there are enough usable resources for this action to execute.\n",
    +       "        """\n",
    +       "        for resource in self.uses:\n",
    +       "            if available_resources.get(resource) is None:\n",
    +       "                return False\n",
    +       "            if available_resources[resource] < self.uses[resource]:\n",
    +       "                return False\n",
    +       "        return True\n",
    +       "\n",
    +       "    def inorder(self, job_order):\n",
    +       "        """\n",
    +       "        Ensure that all the jobs that had to be executed before the current one have been\n",
    +       "        successfully executed.\n",
    +       "        """\n",
    +       "        for jobs in job_order:\n",
    +       "            if self in jobs:\n",
    +       "                for job in jobs:\n",
    +       "                    if job is self:\n",
    +       "                        return True\n",
    +       "                    if not job.completed:\n",
    +       "                        return False\n",
    +       "        return True\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(HLA)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In addition to preconditions and effects, an object of the `HLA` class also stores:\n", + "- the `duration` of the HLA\n", + "- the quantity of consumption of _consumable_ resources\n", + "- the quantity of _reusable_ resources used\n", + "- a bool `completed` denoting if the `HLA` has been completed\n", + "\n", + "The class also has some useful helper methods:\n", + "- `do_action`: checks if required consumable and reusable resources are available and if so, executes the action.\n", + "- `has_consumable_resource`: checks if there exists sufficient quantity of the required consumable resource.\n", + "- `has_usable_resource`: checks if reusable resources are available and not already engaged.\n", + "- `inorder`: ensures that all the jobs that had to be executed before the current one have been successfully executed." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## PLANNING PROBLEMS\n", + "---\n", + "## Job-shop Problem\n", + "This is a simple problem involving the assembly of two cars simultaneously.\n", + "The problem consists of two jobs, each of the form [`AddEngine`, `AddWheels`, `Inspect`] to be performed on two cars with different requirements and availability of resources.\n", + "
    \n", + "Let's look at how the `job_shop_problem` has been defined on the module." + ] + }, + { + "cell_type": "code", + "execution_count": 138, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def job_shop_problem():\n",
    +       "    """\n",
    +       "    [Figure 11.1] JOB-SHOP-PROBLEM\n",
    +       "\n",
    +       "    A job-shop scheduling problem for assembling two cars,\n",
    +       "    with resource and ordering constraints.\n",
    +       "\n",
    +       "    Example:\n",
    +       "    >>> from planning import *\n",
    +       "    >>> p = job_shop_problem()\n",
    +       "    >>> p.goal_test()\n",
    +       "    False\n",
    +       "    >>> p.act(p.jobs[1][0])\n",
    +       "    >>> p.act(p.jobs[1][1])\n",
    +       "    >>> p.act(p.jobs[1][2])\n",
    +       "    >>> p.act(p.jobs[0][0])\n",
    +       "    >>> p.act(p.jobs[0][1])\n",
    +       "    >>> p.goal_test()\n",
    +       "    False\n",
    +       "    >>> p.act(p.jobs[0][2])\n",
    +       "    >>> p.goal_test()\n",
    +       "    True\n",
    +       "    >>>\n",
    +       "    """\n",
    +       "    resources = {'EngineHoists': 1, 'WheelStations': 2, 'Inspectors': 2, 'LugNuts': 500}\n",
    +       "\n",
    +       "    add_engine1 = HLA('AddEngine1', precond='~Has(C1, E1)', effect='Has(C1, E1)', duration=30, use={'EngineHoists': 1})\n",
    +       "    add_engine2 = HLA('AddEngine2', precond='~Has(C2, E2)', effect='Has(C2, E2)', duration=60, use={'EngineHoists': 1})\n",
    +       "    add_wheels1 = HLA('AddWheels1', precond='~Has(C1, W1)', effect='Has(C1, W1)', duration=30, use={'WheelStations': 1}, consume={'LugNuts': 20})\n",
    +       "    add_wheels2 = HLA('AddWheels2', precond='~Has(C2, W2)', effect='Has(C2, W2)', duration=15, use={'WheelStations': 1}, consume={'LugNuts': 20})\n",
    +       "    inspect1 = HLA('Inspect1', precond='~Inspected(C1)', effect='Inspected(C1)', duration=10, use={'Inspectors': 1})\n",
    +       "    inspect2 = HLA('Inspect2', precond='~Inspected(C2)', effect='Inspected(C2)', duration=10, use={'Inspectors': 1})\n",
    +       "\n",
    +       "    actions = [add_engine1, add_engine2, add_wheels1, add_wheels2, inspect1, inspect2]\n",
    +       "\n",
    +       "    job_group1 = [add_engine1, add_wheels1, inspect1]\n",
    +       "    job_group2 = [add_engine2, add_wheels2, inspect2]\n",
    +       "\n",
    +       "    return Problem(init='Car(C1) & Car(C2) & Wheels(W1) & Wheels(W2) & Engine(E1) & Engine(E2) & ~Has(C1, E1) & ~Has(C2, E2) & ~Has(C1, W1) & ~Has(C2, W2) & ~Inspected(C1) & ~Inspected(C2)',\n",
    +       "                   goals='Has(C1, W1) & Has(C1, E1) & Inspected(C1) & Has(C2, W2) & Has(C2, E2) & Inspected(C2)',\n",
    +       "                   actions=actions,\n",
    +       "                   jobs=[job_group1, job_group2],\n",
    +       "                   resources=resources)\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(job_shop_problem)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The states of this problem are:\n", + "
    \n", + "
    \n", + "**Has(x, y)**: Car **'x'** _has_ **'y'** where **'y'** can be an Engine or a Wheel.\n", + "\n", + "**~Has(x, y)**: Car **'x'** does _not have_ **'y'** where **'y'** can be an Engine or a Wheel.\n", + "\n", + "**Inspected(c)**: Car **'c'** has been _inspected_.\n", + "\n", + "**~Inspected(c)**: Car **'c'** has _not_ been inspected.\n", + "\n", + "In the initial state, `C1` and `C2` are cars and neither have an engine or wheels and haven't been inspected.\n", + "`E1` and `E2` are engines.\n", + "`W1` and `W2` are wheels.\n", + "
    \n", + "Our goal is to have engines and wheels on both cars and to get them inspected. We will discuss how to achieve this.\n", + "
    \n", + "Let's define an object of the `job_shop_problem`." + ] + }, + { + "cell_type": "code", + "execution_count": 139, + "metadata": {}, + "outputs": [], + "source": [ + "jobShopProblem = job_shop_problem()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Before taking any actions, we will check if `jobShopProblem` has reached its goal." + ] + }, + { + "cell_type": "code", + "execution_count": 140, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "False\n" + ] + } + ], + "source": [ + "print(jobShopProblem.goal_test())" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We now define a possible solution that can help us reach the goal. \n", + "The actions are then carried out on the `jobShopProblem` object." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The following actions are available to us:\n", + "\n", + "**AddEngine1**: Adds an engine to the car C1. Takes 30 minutes to complete and uses an engine hoist.\n", + " \n", + "**AddEngine2**: Adds an engine to the car C2. Takes 60 minutes to complete and uses an engine hoist.\n", + "\n", + "**AddWheels1**: Adds wheels to car C1. Takes 30 minutes to complete. Uses a wheel station and consumes 20 lug nuts.\n", + "\n", + "**AddWheels2**: Adds wheels to car C2. Takes 15 minutes to complete. Uses a wheel station and consumes 20 lug nuts as well.\n", + "\n", + "**Inspect1**: Gets car C1 inspected. Requires 10 minutes of inspection by one inspector.\n", + "\n", + "**Inspect2**: Gets car C2 inspected. Requires 10 minutes of inspection by one inspector." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 141, + "metadata": {}, + "outputs": [], + "source": [ + "solution = [jobShopProblem.jobs[1][0],\n", + " jobShopProblem.jobs[1][1],\n", + " jobShopProblem.jobs[1][2],\n", + " jobShopProblem.jobs[0][0],\n", + " jobShopProblem.jobs[0][1],\n", + " jobShopProblem.jobs[0][2]]\n", + "\n", + "for action in solution:\n", + " jobShopProblem.act(action)" + ] + }, + { + "cell_type": "code", + "execution_count": 142, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "True\n" + ] + } + ], + "source": [ + "print(jobShopProblem.goal_test())" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This is a valid solution and one of many correct ways to solve this problem." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Double tennis problem\n", + "This problem is a simple case of a multiactor planning problem, where two agents act at once and can simultaneously change the current state of the problem. \n", + "A correct plan is one that, if executed by the actors, achieves the goal.\n", + "In the true multiagent setting, of course, the agents may not agree to execute any particular plan, but atleast they will know what plans _would_ work if they _did_ agree to execute them.\n", + "
    \n", + "In the double tennis problem, two actors A and B are playing together and can be in one of four locations: `LeftBaseLine`, `RightBaseLine`, `LeftNet` and `RightNet`.\n", + "The ball can be returned only if a player is in the right place.\n", + "Each action must include the actor as an argument.\n", + "
    \n", + "Let's first look at the definition of the `double_tennis_problem` in the module." + ] + }, + { + "cell_type": "code", + "execution_count": 172, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def double_tennis_problem():\n",
    +       "    """\n",
    +       "    [Figure 11.10] DOUBLE-TENNIS-PROBLEM\n",
    +       "\n",
    +       "    A multiagent planning problem involving two partner tennis players\n",
    +       "    trying to return an approaching ball and reposition themselves around the court.\n",
    +       "\n",
    +       "    Example:\n",
    +       "    >>> from planning import *\n",
    +       "    >>> dtp = double_tennis_problem()\n",
    +       "    >>> goal_test(dtp.goals, dtp.init)\n",
    +       "    False\n",
    +       "    >>> dtp.act(expr('Go(A, RightBaseLine, LeftBaseLine)'))\n",
    +       "    >>> dtp.act(expr('Hit(A, Ball, RightBaseLine)'))\n",
    +       "    >>> goal_test(dtp.goals, dtp.init)\n",
    +       "    False\n",
    +       "    >>> dtp.act(expr('Go(A, LeftNet, RightBaseLine)'))\n",
    +       "    >>> goal_test(dtp.goals, dtp.init)\n",
    +       "    True\n",
    +       "    >>>\n",
    +       "    """\n",
    +       "\n",
    +       "    return PlanningProblem(init='At(A, LeftBaseLine) & At(B, RightNet) & Approaching(Ball, RightBaseLine) & Partner(A, B) & Partner(B, A)',\n",
    +       "                             goals='Returned(Ball) & At(a, LeftNet) & At(a, RightNet)',\n",
    +       "                             actions=[Action('Hit(actor, Ball, loc)',\n",
    +       "                                             precond='Approaching(Ball, loc) & At(actor, loc)',\n",
    +       "                                             effect='Returned(Ball)'),\n",
    +       "                                      Action('Go(actor, to, loc)', \n",
    +       "                                             precond='At(actor, loc)',\n",
    +       "                                             effect='At(actor, to) & ~At(actor, loc)')])\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(double_tennis_problem)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The states of this problem are:\n", + "\n", + "**Approaching(Ball, loc)**: The `Ball` is approaching the location `loc`.\n", + "\n", + "**Returned(Ball)**: One of the actors successfully hit the approaching ball from the correct location which caused it to return to the other side.\n", + "\n", + "**At(actor, loc)**: `actor` is at location `loc`.\n", + "\n", + "**~At(actor, loc)**: `actor` is _not_ at location `loc`.\n", + "\n", + "Let's now define an object of `double_tennis_problem`.\n" + ] + }, + { + "cell_type": "code", + "execution_count": 173, + "metadata": {}, + "outputs": [], + "source": [ + "doubleTennisProblem = double_tennis_problem()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Before taking any actions, we will check if `doubleTennisProblem` has reached the goal." + ] + }, + { + "cell_type": "code", + "execution_count": 174, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "False\n" + ] + } + ], + "source": [ + "print(doubleTennisProblem.goal_test())" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As we can see, the goal hasn't been reached. \n", + "We now define a possible solution that can help us reach the goal of having the ball returned.\n", + "The actions will then be carried out on the `doubleTennisProblem` object." 
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The actions available to us are the following:\n", + "\n", + "**Hit(actor, ball, loc)**: returns an approaching ball if `actor` is present at the `loc` that the ball is approaching.\n", + "\n", + "**Go(actor, to, loc)**: moves an `actor` from location `loc` to location `to`.\n", + "\n", + "We notice something different in this problem though, \n", + "which is quite unlike any other problem we have seen so far. \n", + "The goal state of the problem contains a variable `a`.\n", + "This happens sometimes in multiagent planning problems \n", + "and it means that it doesn't matter _which_ actor is at the `LeftNet` or the `RightNet`, as long as there is atleast one actor at either `LeftNet` or `RightNet`." + ] + }, + { + "cell_type": "code", + "execution_count": 175, + "metadata": {}, + "outputs": [], + "source": [ + "solution = [expr('Go(A, RightBaseLine, LeftBaseLine)'),\n", + " expr('Hit(A, Ball, RightBaseLine)'),\n", + " expr('Go(A, LeftNet, RightBaseLine)')]\n", + "\n", + "for action in solution:\n", + " doubleTennisProblem.act(action)" + ] + }, + { + "cell_type": "code", + "execution_count": 178, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "False" + ] + }, + "execution_count": 178, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "doubleTennisProblem.goal_test()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "It has now successfully reached its goal, ie, to return the approaching ball." 
+ ] } ], "metadata": { @@ -346,9 +3859,9 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.4.3" + "version": "3.5.3" } }, "nbformat": 4, - "nbformat_minor": 0 + "nbformat_minor": 1 } diff --git a/planning.py b/planning.py index da00ee5d5..1e4a19209 100644 --- a/planning.py +++ b/planning.py @@ -1,26 +1,154 @@ -"""Planning (Chapters 10-11) -""" +"""Planning (Chapters 10-11)""" +import copy import itertools +from collections import deque, defaultdict +from functools import reduce as _reduce + +import numpy as np + +import search +from csp import sat_up, NaryCSP, Constraint, ac_search_solver, is_constraint +from logic import FolKB, conjuncts, unify_mm, associate, SAT_plan, cdcl_satisfiable from search import Node -from utils import Expr, expr, first, FIFOQueue -from logic import FolKB +from utils import Expr, expr, first -class PDDL: +class PlanningProblem: """ - Planning Domain Definition Language (PDDL) used to define a search problem. + Planning Domain Definition Language (PlanningProblem) used to define a search problem. It stores states in a knowledge base consisting of first order logic statements. The conjunction of these logical statements completely defines a state. 
""" - def __init__(self, initial_state, actions, goal_test): - self.kb = FolKB(initial_state) + def __init__(self, initial, goals, actions, domain=None): + self.initial = self.convert(initial) if domain is None else self.convert(initial) + self.convert(domain) + self.goals = self.convert(goals) self.actions = actions - self.goal_test_func = goal_test + self.domain = domain + + def convert(self, clauses): + """Converts strings into exprs""" + if not isinstance(clauses, Expr): + if len(clauses) > 0: + clauses = expr(clauses) + else: + clauses = [] + try: + clauses = conjuncts(clauses) + except AttributeError: + pass + + new_clauses = [] + for clause in clauses: + if clause.op == '~': + new_clauses.append(expr('Not' + str(clause.args[0]))) + else: + new_clauses.append(clause) + return new_clauses + + def expand_fluents(self, name=None): + + kb = None + if self.domain: + kb = FolKB(self.convert(self.domain)) + for action in self.actions: + if action.precond: + for fests in set(action.precond).union(action.effect).difference(self.convert(action.domain)): + if fests.op[:3] != 'Not': + kb.tell(expr(str(action.domain) + ' ==> ' + str(fests))) + + objects = set(arg for clause in set(self.initial + self.goals) for arg in clause.args) + fluent_list = [] + if name is not None: + for fluent in self.initial + self.goals: + if str(fluent) == name: + fluent_list.append(fluent) + break + else: + fluent_list = list(map(lambda fluent: Expr(fluent[0], *fluent[1]), + {fluent.op: fluent.args for fluent in self.initial + self.goals + + [clause for action in self.actions for clause in action.effect if + clause.op[:3] != 'Not']}.items())) + + expansions = [] + for fluent in fluent_list: + for permutation in itertools.permutations(objects, len(fluent.args)): + new_fluent = Expr(fluent.op, *permutation) + if (self.domain and kb.ask(new_fluent) is not False) or not self.domain: + expansions.append(new_fluent) + + return expansions + + def expand_actions(self, name=None): + """Generate all 
possible actions with variable bindings for precondition selection heuristic""" + + has_domains = all(action.domain for action in self.actions if action.precond) + kb = None + if has_domains: + kb = FolKB(self.initial) + for action in self.actions: + if action.precond: + kb.tell(expr(str(action.domain) + ' ==> ' + str(action))) + + objects = set(arg for clause in self.initial for arg in clause.args) + expansions = [] + action_list = [] + if name is not None: + for action in self.actions: + if str(action.name) == name: + action_list.append(action) + break + else: + action_list = self.actions + + for action in action_list: + for permutation in itertools.permutations(objects, len(action.args)): + bindings = unify_mm(Expr(action.name, *action.args), Expr(action.name, *permutation)) + if bindings is not None: + new_args = [] + for arg in action.args: + if arg in bindings: + new_args.append(bindings[arg]) + else: + new_args.append(arg) + new_expr = Expr(str(action.name), *new_args) + if (has_domains and kb.ask(new_expr) is not False) or ( + has_domains and not action.precond) or not has_domains: + new_preconds = [] + for precond in action.precond: + new_precond_args = [] + for arg in precond.args: + if arg in bindings: + new_precond_args.append(bindings[arg]) + else: + new_precond_args.append(arg) + new_precond = Expr(str(precond.op), *new_precond_args) + new_preconds.append(new_precond) + new_effects = [] + for effect in action.effect: + new_effect_args = [] + for arg in effect.args: + if arg in bindings: + new_effect_args.append(bindings[arg]) + else: + new_effect_args.append(arg) + new_effect = Expr(str(effect.op), *new_effect_args) + new_effects.append(new_effect) + expansions.append(Action(new_expr, new_preconds, new_effects)) + + return expansions + + def is_strips(self): + """ + Returns True if the problem does not contain negative literals in preconditions and goals + """ + return (all(clause.op[:3] != 'Not' for clause in self.goals) and + all(clause.op[:3] != 
'Not' for action in self.actions for clause in action.precond)) def goal_test(self): - return self.goal_test_func(self.kb) + """Checks if the goals have been reached""" + return all(goal in self.initial for goal in self.goals) def act(self, action): """ @@ -32,278 +160,662 @@ def act(self, action): list_action = first(a for a in self.actions if a.name == action_name) if list_action is None: raise Exception("Action '{}' not found".format(action_name)) - if not list_action.check_precond(self.kb, args): + if not list_action.check_precond(self.initial, args): raise Exception("Action '{}' pre-conditions not satisfied".format(action)) - list_action(self.kb, args) + self.initial = list_action(self.initial, args).clauses class Action: """ Defines an action schema using preconditions and effects. - Use this to describe actions in PDDL. + Use this to describe actions in PlanningProblem. action is an Expr where variables are given as arguments(args). - Precondition and effect are both lists with positive and negated literals. + Precondition and effect are both lists with positive and negative literals. 
+ Negative preconditions and effects are defined by adding a 'Not' before the name of the clause Example: - precond_pos = [expr("Human(person)"), expr("Hungry(Person)")] - precond_neg = [expr("Eaten(food)")] - effect_add = [expr("Eaten(food)")] - effect_rem = [expr("Hungry(person)")] - eat = Action(expr("Eat(person, food)"), [precond_pos, precond_neg], [effect_add, effect_rem]) + precond = [expr("Human(person)"), expr("Hungry(Person)"), expr("NotEaten(food)")] + effect = [expr("Eaten(food)"), expr("Hungry(person)")] + eat = Action(expr("Eat(person, food)"), precond, effect) """ - def __init__(self, action, precond, effect): + def __init__(self, action, precond, effect, domain=None): + if isinstance(action, str): + action = expr(action) self.name = action.op self.args = action.args - self.precond_pos = precond[0] - self.precond_neg = precond[1] - self.effect_add = effect[0] - self.effect_rem = effect[1] + self.precond = self.convert(precond) if domain is None else self.convert(precond) + self.convert(domain) + self.effect = self.convert(effect) + self.domain = domain def __call__(self, kb, args): return self.act(kb, args) + def __repr__(self): + return '{}'.format(Expr(self.name, *self.args)) + + def convert(self, clauses): + """Converts strings into Exprs""" + if isinstance(clauses, Expr): + clauses = conjuncts(clauses) + for i in range(len(clauses)): + if clauses[i].op == '~': + clauses[i] = expr('Not' + str(clauses[i].args[0])) + + elif isinstance(clauses, str): + clauses = clauses.replace('~', 'Not') + if len(clauses) > 0: + clauses = expr(clauses) + + try: + clauses = conjuncts(clauses) + except AttributeError: + pass + + return clauses + + def relaxed(self): + """ + Removes delete list from the action by removing all negative literals from action's effect + """ + return Action(Expr(self.name, *self.args), self.precond, + list(filter(lambda effect: effect.op[:3] != 'Not', self.effect))) + def substitute(self, e, args): """Replaces variables in expression with 
their respective Propositional symbol""" + new_args = list(e.args) for num, x in enumerate(e.args): - for i in range(len(self.args)): + for i, _ in enumerate(self.args): if self.args[i] == x: new_args[num] = args[i] return Expr(e.op, *new_args) def check_precond(self, kb, args): """Checks if the precondition is satisfied in the current state""" - # check for positive clauses - for clause in self.precond_pos: + + if isinstance(kb, list): + kb = FolKB(kb) + for clause in self.precond: if self.substitute(clause, args) not in kb.clauses: return False - # check for negative clauses - for clause in self.precond_neg: - if self.substitute(clause, args) in kb.clauses: - return False return True def act(self, kb, args): - """Executes the action on the state's kb""" - # check if the preconditions are satisfied + """Executes the action on the state's knowledge base""" + + if isinstance(kb, list): + kb = FolKB(kb) + if not self.check_precond(kb, args): - raise Exception("Action pre-conditions not satisfied") - # remove negative literals - for clause in self.effect_rem: - kb.retract(self.substitute(clause, args)) - # add positive literals - for clause in self.effect_add: + raise Exception('Action pre-conditions not satisfied') + for clause in self.effect: kb.tell(self.substitute(clause, args)) + if clause.op[:3] == 'Not': + new_clause = Expr(clause.op[3:], *clause.args) + + if kb.ask(self.substitute(new_clause, args)) is not False: + kb.retract(self.substitute(new_clause, args)) + else: + new_clause = Expr('Not' + clause.op, *clause.args) + + if kb.ask(self.substitute(new_clause, args)) is not False: + kb.retract(self.substitute(new_clause, args)) + + return kb + + +def goal_test(goals, state): + """Generic goal testing helper function""" + + if isinstance(state, list): + kb = FolKB(state) + else: + kb = state + return all(kb.ask(q) is not False for q in goals) def air_cargo(): - init = [expr('At(C1, SFO)'), - expr('At(C2, JFK)'), - expr('At(P1, SFO)'), - expr('At(P2, JFK)'), - 
expr('Cargo(C1)'), - expr('Cargo(C2)'), - expr('Plane(P1)'), - expr('Plane(P2)'), - expr('Airport(JFK)'), - expr('Airport(SFO)')] - - def goal_test(kb): - required = [expr('At(C1 , JFK)'), expr('At(C2 ,SFO)')] - return all([kb.ask(q) is not False for q in required]) - - # Actions - - # Load - precond_pos = [expr("At(c, a)"), expr("At(p, a)"), expr("Cargo(c)"), expr("Plane(p)"), - expr("Airport(a)")] - precond_neg = [] - effect_add = [expr("In(c, p)")] - effect_rem = [expr("At(c, a)")] - load = Action(expr("Load(c, p, a)"), [precond_pos, precond_neg], [effect_add, effect_rem]) - - # Unload - precond_pos = [expr("In(c, p)"), expr("At(p, a)"), expr("Cargo(c)"), expr("Plane(p)"), - expr("Airport(a)")] - precond_neg = [] - effect_add = [expr("At(c, a)")] - effect_rem = [expr("In(c, p)")] - unload = Action(expr("Unload(c, p, a)"), [precond_pos, precond_neg], [effect_add, effect_rem]) - - # Fly - # Used 'f' instead of 'from' because 'from' is a python keyword and expr uses eval() function - precond_pos = [expr("At(p, f)"), expr("Plane(p)"), expr("Airport(f)"), expr("Airport(to)")] - precond_neg = [] - effect_add = [expr("At(p, to)")] - effect_rem = [expr("At(p, f)")] - fly = Action(expr("Fly(p, f, to)"), [precond_pos, precond_neg], [effect_add, effect_rem]) - - return PDDL(init, [load, unload, fly], goal_test) + """ + [Figure 10.1] AIR-CARGO-PROBLEM + + An air-cargo shipment problem for delivering cargo to different locations, + given the starting location and airplanes. 
+ + Example: + >>> from planning import * + >>> ac = air_cargo() + >>> ac.goal_test() + False + >>> ac.act(expr('Load(C2, P2, JFK)')) + >>> ac.act(expr('Load(C1, P1, SFO)')) + >>> ac.act(expr('Fly(P1, SFO, JFK)')) + >>> ac.act(expr('Fly(P2, JFK, SFO)')) + >>> ac.act(expr('Unload(C2, P2, SFO)')) + >>> ac.goal_test() + False + >>> ac.act(expr('Unload(C1, P1, JFK)')) + >>> ac.goal_test() + True + >>> + """ + + return PlanningProblem(initial='At(C1, SFO) & At(C2, JFK) & At(P1, SFO) & At(P2, JFK)', + goals='At(C1, JFK) & At(C2, SFO)', + actions=[Action('Load(c, p, a)', + precond='At(c, a) & At(p, a)', + effect='In(c, p) & ~At(c, a)', + domain='Cargo(c) & Plane(p) & Airport(a)'), + Action('Unload(c, p, a)', + precond='In(c, p) & At(p, a)', + effect='At(c, a) & ~In(c, p)', + domain='Cargo(c) & Plane(p) & Airport(a)'), + Action('Fly(p, f, to)', + precond='At(p, f)', + effect='At(p, to) & ~At(p, f)', + domain='Plane(p) & Airport(f) & Airport(to)')], + domain='Cargo(C1) & Cargo(C2) & Plane(P1) & Plane(P2) & Airport(SFO) & Airport(JFK)') def spare_tire(): - init = [expr('Tire(Flat)'), - expr('Tire(Spare)'), - expr('At(Flat, Axle)'), - expr('At(Spare, Trunk)')] - - def goal_test(kb): - required = [expr('At(Spare, Axle)')] - return all(kb.ask(q) is not False for q in required) - - # Actions - - # Remove - precond_pos = [expr("At(obj, loc)")] - precond_neg = [] - effect_add = [expr("At(obj, Ground)")] - effect_rem = [expr("At(obj, loc)")] - remove = Action(expr("Remove(obj, loc)"), [precond_pos, precond_neg], [effect_add, effect_rem]) - - # PutOn - precond_pos = [expr("Tire(t)"), expr("At(t, Ground)")] - precond_neg = [expr("At(Flat, Axle)")] - effect_add = [expr("At(t, Axle)")] - effect_rem = [expr("At(t, Ground)")] - put_on = Action(expr("PutOn(t, Axle)"), [precond_pos, precond_neg], [effect_add, effect_rem]) - - # LeaveOvernight - precond_pos = [] - precond_neg = [] - effect_add = [] - effect_rem = [expr("At(Spare, Ground)"), expr("At(Spare, Axle)"), expr("At(Spare, Trunk)"), 
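The doctest above drives the cargo problem to its goal one ground action at a time; underneath, each `act` is just delete-effects removal plus add-effects insertion. A self-contained sketch with a few hand-ground actions (the action names mirror the problem, but the `(precond, add, delete)` tuple encoding over string facts is an assumption of this sketch, not the library's representation):

```python
# Ground STRIPS actions: name -> (preconditions, add effects, delete effects).
actions = {
    'Load(C1,P1,SFO)':   ({'At(C1,SFO)', 'At(P1,SFO)'}, {'In(C1,P1)'}, {'At(C1,SFO)'}),
    'Fly(P1,SFO,JFK)':   ({'At(P1,SFO)'}, {'At(P1,JFK)'}, {'At(P1,SFO)'}),
    'Unload(C1,P1,JFK)': ({'In(C1,P1)', 'At(P1,JFK)'}, {'At(C1,JFK)'}, {'In(C1,P1)'}),
}

def act(state, name):
    precond, add, delete = actions[name]
    if not precond <= state:
        raise Exception('Action pre-conditions not satisfied')
    return (state - delete) | add

state = {'At(C1,SFO)', 'At(P1,SFO)'}
for step in ['Load(C1,P1,SFO)', 'Fly(P1,SFO,JFK)', 'Unload(C1,P1,JFK)']:
    state = act(state, step)
# C1 ends up at JFK, exactly as the doctest's goal_test requires.
```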
- expr("At(Flat, Ground)"), expr("At(Flat, Axle)"), expr("At(Flat, Trunk)")] - leave_overnight = Action(expr("LeaveOvernight"), [precond_pos, precond_neg], - [effect_add, effect_rem]) - - return PDDL(init, [remove, put_on, leave_overnight], goal_test) + """ + [Figure 10.2] SPARE-TIRE-PROBLEM + + A problem involving changing the flat tire of a car + with a spare tire from the trunk. + + Example: + >>> from planning import * + >>> st = spare_tire() + >>> st.goal_test() + False + >>> st.act(expr('Remove(Spare, Trunk)')) + >>> st.act(expr('Remove(Flat, Axle)')) + >>> st.goal_test() + False + >>> st.act(expr('PutOn(Spare, Axle)')) + >>> st.goal_test() + True + >>> + """ + + return PlanningProblem(initial='At(Flat, Axle) & At(Spare, Trunk)', + goals='At(Spare, Axle) & At(Flat, Ground)', + actions=[Action('Remove(obj, loc)', + precond='At(obj, loc)', + effect='At(obj, Ground) & ~At(obj, loc)', + domain='Tire(obj)'), + Action('PutOn(t, Axle)', + precond='At(t, Ground) & ~At(Flat, Axle)', + effect='At(t, Axle) & ~At(t, Ground)', + domain='Tire(t)'), + Action('LeaveOvernight', + precond='', + effect='~At(Spare, Ground) & ~At(Spare, Axle) & ~At(Spare, Trunk) & \ + ~At(Flat, Ground) & ~At(Flat, Axle) & ~At(Flat, Trunk)')], + domain='Tire(Flat) & Tire(Spare)') def three_block_tower(): - init = [expr('On(A, Table)'), - expr('On(B, Table)'), - expr('On(C, A)'), - expr('Block(A)'), - expr('Block(B)'), - expr('Block(C)'), - expr('Clear(B)'), - expr('Clear(C)')] - - def goal_test(kb): - required = [expr('On(A, B)'), expr('On(B, C)')] - return all(kb.ask(q) is not False for q in required) - - # Actions - - # Move - precond_pos = [expr('On(b, x)'), expr('Clear(b)'), expr('Clear(y)'), expr('Block(b)'), - expr('Block(y)')] - precond_neg = [] - effect_add = [expr('On(b, y)'), expr('Clear(x)')] - effect_rem = [expr('On(b, x)'), expr('Clear(y)')] - move = Action(expr('Move(b, x, y)'), [precond_pos, precond_neg], [effect_add, effect_rem]) - - # MoveToTable - precond_pos = [expr('On(b, x)'), 
expr('Clear(b)'), expr('Block(b)')] - precond_neg = [] - effect_add = [expr('On(b, Table)'), expr('Clear(x)')] - effect_rem = [expr('On(b, x)')] - moveToTable = Action(expr('MoveToTable(b, x)'), [precond_pos, precond_neg], - [effect_add, effect_rem]) - - return PDDL(init, [move, moveToTable], goal_test) + """ + [Figure 10.3] THREE-BLOCK-TOWER + + A blocks-world problem of stacking three blocks in a certain configuration, + also known as the Sussman Anomaly. + + Example: + >>> from planning import * + >>> tbt = three_block_tower() + >>> tbt.goal_test() + False + >>> tbt.act(expr('MoveToTable(C, A)')) + >>> tbt.act(expr('Move(B, Table, C)')) + >>> tbt.goal_test() + False + >>> tbt.act(expr('Move(A, Table, B)')) + >>> tbt.goal_test() + True + >>> + """ + return PlanningProblem(initial='On(A, Table) & On(B, Table) & On(C, A) & Clear(B) & Clear(C)', + goals='On(A, B) & On(B, C)', + actions=[Action('Move(b, x, y)', + precond='On(b, x) & Clear(b) & Clear(y)', + effect='On(b, y) & Clear(x) & ~On(b, x) & ~Clear(y)', + domain='Block(b) & Block(y)'), + Action('MoveToTable(b, x)', + precond='On(b, x) & Clear(b)', + effect='On(b, Table) & Clear(x) & ~On(b, x)', + domain='Block(b) & Block(x)')], + domain='Block(A) & Block(B) & Block(C)') + + +def simple_blocks_world(): + """ + SIMPLE-BLOCKS-WORLD + + A simplified definition of the Sussman Anomaly problem. 
+ + Example: + >>> from planning import * + >>> sbw = simple_blocks_world() + >>> sbw.goal_test() + False + >>> sbw.act(expr('ToTable(A, B)')) + >>> sbw.act(expr('FromTable(B, A)')) + >>> sbw.goal_test() + False + >>> sbw.act(expr('FromTable(C, B)')) + >>> sbw.goal_test() + True + >>> + """ + + return PlanningProblem(initial='On(A, B) & Clear(A) & OnTable(B) & OnTable(C) & Clear(C)', + goals='On(B, A) & On(C, B)', + actions=[Action('ToTable(x, y)', + precond='On(x, y) & Clear(x)', + effect='~On(x, y) & Clear(y) & OnTable(x)'), + Action('FromTable(y, x)', + precond='OnTable(y) & Clear(y) & Clear(x)', + effect='~OnTable(y) & ~Clear(x) & On(y, x)')]) def have_cake_and_eat_cake_too(): - init = [expr('Have(Cake)')] + """ + [Figure 10.7] CAKE-PROBLEM + + A problem where we begin with a cake and want to + reach the state of having a cake and having eaten a cake. + The possible actions include baking a cake and eating a cake. + + Example: + >>> from planning import * + >>> cp = have_cake_and_eat_cake_too() + >>> cp.goal_test() + False + >>> cp.act(expr('Eat(Cake)')) + >>> cp.goal_test() + False + >>> cp.act(expr('Bake(Cake)')) + >>> cp.goal_test() + True + >>> + """ + + return PlanningProblem(initial='Have(Cake)', + goals='Have(Cake) & Eaten(Cake)', + actions=[Action('Eat(Cake)', + precond='Have(Cake)', + effect='Eaten(Cake) & ~Have(Cake)'), + Action('Bake(Cake)', + precond='~Have(Cake)', + effect='Have(Cake)')]) + + +def shopping_problem(): + """ + SHOPPING-PROBLEM + + A problem of acquiring some items given their availability at certain stores. 
+
+    Example:
+    >>> from planning import *
+    >>> sp = shopping_problem()
+    >>> sp.goal_test()
+    False
+    >>> sp.act(expr('Go(Home, HW)'))
+    >>> sp.act(expr('Buy(Drill, HW)'))
+    >>> sp.act(expr('Go(HW, SM)'))
+    >>> sp.act(expr('Buy(Banana, SM)'))
+    >>> sp.goal_test()
+    False
+    >>> sp.act(expr('Buy(Milk, SM)'))
+    >>> sp.goal_test()
+    True
+    >>>
+    """
+
+    return PlanningProblem(initial='At(Home) & Sells(SM, Milk) & Sells(SM, Banana) & Sells(HW, Drill)',
+                           goals='Have(Milk) & Have(Banana) & Have(Drill)',
+                           actions=[Action('Buy(x, store)',
+                                           precond='At(store) & Sells(store, x)',
+                                           effect='Have(x)',
+                                           domain='Store(store) & Item(x)'),
+                                    Action('Go(x, y)',
+                                           precond='At(x)',
+                                           effect='At(y) & ~At(x)',
+                                           domain='Place(x) & Place(y)')],
+                           domain='Place(Home) & Place(SM) & Place(HW) & Store(SM) & Store(HW) & '
+                                  'Item(Milk) & Item(Banana) & Item(Drill)')
+
+
+def socks_and_shoes():
+    """
+    SOCKS-AND-SHOES-PROBLEM
+
+    A task of wearing socks and shoes on both feet.
+
+    Example:
+    >>> from planning import *
+    >>> ss = socks_and_shoes()
+    >>> ss.goal_test()
+    False
+    >>> ss.act(expr('RightSock'))
+    >>> ss.act(expr('RightShoe'))
+    >>> ss.act(expr('LeftSock'))
+    >>> ss.goal_test()
+    False
+    >>> ss.act(expr('LeftShoe'))
+    >>> ss.goal_test()
+    True
+    >>>
+    """
+
+    return PlanningProblem(initial='',
+                           goals='RightShoeOn & LeftShoeOn',
+                           actions=[Action('RightShoe',
+                                           precond='RightSockOn',
+                                           effect='RightShoeOn'),
+                                    Action('RightSock',
+                                           precond='',
+                                           effect='RightSockOn'),
+                                    Action('LeftShoe',
+                                           precond='LeftSockOn',
+                                           effect='LeftShoeOn'),
+                                    Action('LeftSock',
+                                           precond='',
+                                           effect='LeftSockOn')])
+
+
+def double_tennis_problem():
+    """
+    [Figure 11.10] DOUBLE-TENNIS-PROBLEM
+
+    A multiagent planning problem involving two partner tennis players
+    trying to return an approaching ball and repositioning around the court.
+ + Example: + >>> from planning import * + >>> dtp = double_tennis_problem() + >>> goal_test(dtp.goals, dtp.initial) + False + >>> dtp.act(expr('Go(A, RightBaseLine, LeftBaseLine)')) + >>> dtp.act(expr('Hit(A, Ball, RightBaseLine)')) + >>> goal_test(dtp.goals, dtp.initial) + False + >>> dtp.act(expr('Go(A, LeftNet, RightBaseLine)')) + >>> goal_test(dtp.goals, dtp.initial) + True + >>> + """ + + return PlanningProblem( + initial='At(A, LeftBaseLine) & At(B, RightNet) & Approaching(Ball, RightBaseLine) & Partner(A, B) & Partner(B, A)', + goals='Returned(Ball) & At(a, LeftNet) & At(a, RightNet)', + actions=[Action('Hit(actor, Ball, loc)', + precond='Approaching(Ball, loc) & At(actor, loc)', + effect='Returned(Ball)'), + Action('Go(actor, to, loc)', + precond='At(actor, loc)', + effect='At(actor, to) & ~At(actor, loc)')]) + + +class ForwardPlan(search.Problem): + """ + [Section 10.2.1] + Forward state-space search + """ - def goal_test(kb): - required = [expr('Have(Cake)'), expr('Eaten(Cake)')] - return all(kb.ask(q) is not False for q in required) + def __init__(self, planning_problem): + super().__init__(associate('&', planning_problem.initial), associate('&', planning_problem.goals)) + self.planning_problem = planning_problem + self.expanded_actions = self.planning_problem.expand_actions() - # Actions + def actions(self, state): + return [action for action in self.expanded_actions if all(pre in conjuncts(state) for pre in action.precond)] - # Eat cake - precond_pos = [expr('Have(Cake)')] - precond_neg = [] - effect_add = [expr('Eaten(Cake)')] - effect_rem = [expr('Have(Cake)')] - eat_cake = Action(expr('Eat(Cake)'), [precond_pos, precond_neg], [effect_add, effect_rem]) + def result(self, state, action): + return associate('&', action(conjuncts(state), action.args).clauses) - # Bake Cake - precond_pos = [] - precond_neg = [expr('Have(Cake)')] - effect_add = [expr('Have(Cake)')] - effect_rem = [] - bake_cake = Action(expr('Bake(Cake)'), [precond_pos, precond_neg], 
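`ForwardPlan` supplies `actions`, `result`, and `goal_test` hooks to the generic `search` machinery rather than searching itself. A minimal stand-alone illustration of forward state-space search over set-based states (breadth-first here, which is an assumption of this sketch; the class leaves the search strategy to the caller), using the socks-and-shoes actions:

```python
from collections import deque

# Ground actions: name -> (preconditions, add effects, delete effects).
ACTIONS = {
    'RightSock': (set(), {'RightSockOn'}, set()),
    'RightShoe': ({'RightSockOn'}, {'RightShoeOn'}, set()),
    'LeftSock':  (set(), {'LeftSockOn'}, set()),
    'LeftShoe':  ({'LeftSockOn'}, {'LeftShoeOn'}, set()),
}
GOAL = {'RightShoeOn', 'LeftShoeOn'}

def forward_search(initial):
    """BFS from the initial state, applying every applicable action."""
    frontier = deque([(frozenset(initial), [])])
    explored = set()
    while frontier:
        state, plan = frontier.popleft()
        if GOAL <= state:
            return plan
        explored.add(state)
        for name, (pre, add, delete) in ACTIONS.items():
            if pre <= state:
                nxt = frozenset((state - delete) | add)
                if nxt not in explored:
                    frontier.append((nxt, plan + [name]))

plan = forward_search(set())
```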
[effect_add, effect_rem]) + def goal_test(self, state): + return all(goal in conjuncts(state) for goal in self.planning_problem.goals) - return PDDL(init, [eat_cake, bake_cake], goal_test) + def h(self, state): + """ + Computes ignore delete lists heuristic by creating a relaxed version of the original problem (we can do that + by removing the delete lists from all actions, i.e. removing all negative literals from effects) that will be + easier to solve through GraphPlan and where the length of the solution will serve as a good heuristic. + """ + relaxed_planning_problem = PlanningProblem(initial=state.state, + goals=self.goal, + actions=[action.relaxed() for action in + self.planning_problem.actions]) + try: + return len(linearize(GraphPlan(relaxed_planning_problem).execute())) + except: + return np.inf -class Level(): +class BackwardPlan(search.Problem): + """ + [Section 10.2.2] + Backward relevant-states search + """ + + def __init__(self, planning_problem): + super().__init__(associate('&', planning_problem.goals), associate('&', planning_problem.initial)) + self.planning_problem = planning_problem + self.expanded_actions = self.planning_problem.expand_actions() + + def actions(self, subgoal): + """ + Returns True if the action is relevant to the subgoal, i.e.: + - the action achieves an element of the effects + - the action doesn't delete something that needs to be achieved + - the preconditions are consistent with other subgoals that need to be achieved + """ + + def negate_clause(clause): + return Expr(clause.op.replace('Not', ''), *clause.args) if clause.op[:3] == 'Not' else Expr( + 'Not' + clause.op, *clause.args) + + subgoal = conjuncts(subgoal) + return [action for action in self.expanded_actions if + (any(prop in action.effect for prop in subgoal) and + not any(negate_clause(prop) in subgoal for prop in action.effect) and + not any(negate_clause(prop) in subgoal and negate_clause(prop) not in action.effect + for prop in action.precond))] + + def 
result(self, subgoal, action): + # g' = (g - effects(a)) + preconds(a) + return associate('&', set(set(conjuncts(subgoal)).difference(action.effect)).union(action.precond)) + + def goal_test(self, subgoal): + return all(goal in conjuncts(self.goal) for goal in conjuncts(subgoal)) + + def h(self, subgoal): + """ + Computes ignore delete lists heuristic by creating a relaxed version of the original problem (we can do that + by removing the delete lists from all actions, i.e. removing all negative literals from effects) that will be + easier to solve through GraphPlan and where the length of the solution will serve as a good heuristic. + """ + relaxed_planning_problem = PlanningProblem(initial=self.goal, + goals=subgoal.state, + actions=[action.relaxed() for action in + self.planning_problem.actions]) + try: + return len(linearize(GraphPlan(relaxed_planning_problem).execute())) + except: + return np.inf + + +def CSPlan(planning_problem, solution_length, CSP_solver=ac_search_solver, arc_heuristic=sat_up): + """ + [Section 10.4.3] + Planning as Constraint Satisfaction Problem + """ + + def st(var, stage): + """Returns a string for the var-stage pair that can be used as a variable""" + return str(var) + "_" + str(stage) + + def if_(v1, v2): + """If the second argument is v2, the first argument must be v1""" + + def if_fun(x1, x2): + return x1 == v1 if x2 == v2 else True + + if_fun.__name__ = "if the second argument is " + str(v2) + " then the first argument is " + str(v1) + " " + return if_fun + + def eq_if_not_in_(actset): + """First and third arguments are equal if action is not in actset""" + + def eq_if_not_in(x1, a, x2): + return x1 == x2 if a not in actset else True + + eq_if_not_in.__name__ = "first and third arguments are equal if action is not in " + str(actset) + " " + return eq_if_not_in + + expanded_actions = planning_problem.expand_actions() + fluent_values = planning_problem.expand_fluents() + for horizon in range(solution_length): + act_vars = 
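The regression step that `BackwardPlan.result` implements, g' = (g - effects(a)) + preconds(a), can be seen in isolation. A minimal sketch over string literals (the set encoding is an assumption of this sketch), regressing the air-cargo goal through an unload action:

```python
def regress(subgoal, precond, effects):
    # g' = (g - effects(a)) + preconds(a)
    return (set(subgoal) - set(effects)) | set(precond)

# Regressing the goal through Unload(C1, P1, JFK): the achieved literal
# At(C1,JFK) drops out, and the action's preconditions take its place.
goal = {'At(C1,JFK)', 'At(C2,SFO)'}
subgoal = regress(goal,
                  precond={'In(C1,P1)', 'At(P1,JFK)'},
                  effects={'At(C1,JFK)', 'NotIn(C1,P1)'})
```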
[st('action', stage) for stage in range(horizon + 1)] + domains = {av: list(map(lambda action: expr(str(action)), expanded_actions)) for av in act_vars} + domains.update({st(var, stage): {True, False} for var in fluent_values for stage in range(horizon + 2)}) + # initial state constraints + constraints = [Constraint((st(var, 0),), is_constraint(val)) + for (var, val) in {expr(str(fluent).replace('Not', '')): + True if fluent.op[:3] != 'Not' else False + for fluent in planning_problem.initial}.items()] + constraints += [Constraint((st(var, 0),), is_constraint(False)) + for var in {expr(str(fluent).replace('Not', '')) + for fluent in fluent_values if fluent not in planning_problem.initial}] + # goal state constraints + constraints += [Constraint((st(var, horizon + 1),), is_constraint(val)) + for (var, val) in {expr(str(fluent).replace('Not', '')): + True if fluent.op[:3] != 'Not' else False + for fluent in planning_problem.goals}.items()] + # precondition constraints + constraints += [Constraint((st(var, stage), st('action', stage)), if_(val, act)) + # st(var, stage) == val if st('action', stage) == act + for act, strps in {expr(str(action)): action for action in expanded_actions}.items() + for var, val in {expr(str(fluent).replace('Not', '')): + True if fluent.op[:3] != 'Not' else False + for fluent in strps.precond}.items() + for stage in range(horizon + 1)] + # effect constraints + constraints += [Constraint((st(var, stage + 1), st('action', stage)), if_(val, act)) + # st(var, stage + 1) == val if st('action', stage) == act + for act, strps in {expr(str(action)): action for action in expanded_actions}.items() + for var, val in {expr(str(fluent).replace('Not', '')): True if fluent.op[:3] != 'Not' else False + for fluent in strps.effect}.items() + for stage in range(horizon + 1)] + # frame constraints + constraints += [Constraint((st(var, stage), st('action', stage), st(var, stage + 1)), + eq_if_not_in_(set(map(lambda action: expr(str(action)), + {act for act in 
expanded_actions if var in act.effect + or Expr('Not' + var.op, *var.args) in act.effect})))) + for var in fluent_values for stage in range(horizon + 1)] + csp = NaryCSP(domains, constraints) + sol = CSP_solver(csp, arc_heuristic=arc_heuristic) + if sol: + return [sol[a] for a in act_vars] + + +def SATPlan(planning_problem, solution_length, SAT_solver=cdcl_satisfiable): + """ + [Section 10.4.1] + Planning as Boolean satisfiability + """ + + def expand_transitions(state, actions): + state = sorted(conjuncts(state)) + for action in filter(lambda act: act.check_precond(state, act.args), actions): + transition[associate('&', state)].update( + {Expr(action.name, *action.args): + associate('&', sorted(set(filter(lambda clause: clause.op[:3] != 'Not', + action(state, action.args).clauses)))) + if planning_problem.is_strips() + else associate('&', sorted(set(action(state, action.args).clauses)))}) + for state in transition[associate('&', state)].values(): + if state not in transition: + expand_transitions(expr(state), actions) + + transition = defaultdict(dict) + expand_transitions(associate('&', planning_problem.initial), planning_problem.expand_actions()) + + return SAT_plan(associate('&', sorted(planning_problem.initial)), transition, + associate('&', sorted(planning_problem.goals)), solution_length, SAT_solver=SAT_solver) + + +class Level: """ Contains the state of the planning problem and exhaustive list of actions which use the states as pre-condition. 
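The frame constraints built by `CSPlan` say that a fluent keeps its value from one stage to the next unless the chosen action mentions it in its effects. The constraint function itself can be exercised stand-alone (a sketch of the same `eq_if_not_in_` idea, with strings standing in for expressions):

```python
def eq_if_not_in(actset):
    """Frame axiom as a ternary constraint: var_t == var_{t+1} unless the action touches var."""
    def constraint(x1, a, x2):
        return x1 == x2 if a not in actset else True
    return constraint

# Suppose only Eat(Cake) affects the fluent Have(Cake):
frame = eq_if_not_in({'Eat(Cake)'})

unchanged_ok = frame(True, 'Bake(Cake)', True)   # untouched fluent persists
changed_bad = frame(True, 'Bake(Cake)', False)   # persistence violated
released = frame(True, 'Eat(Cake)', False)       # acting action releases the constraint
```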
""" - def __init__(self, poskb, negkb): - self.poskb = poskb - # Current state - self.current_state_pos = poskb.clauses - self.current_state_neg = negkb.clauses - # Current action to current state link - self.current_action_links_pos = {} - self.current_action_links_neg = {} - # Current state to action link - self.current_state_links_pos = {} - self.current_state_links_neg = {} - # Current action to next state link + def __init__(self, kb): + """Initializes variables to hold state and action details of a level""" + + self.kb = kb + # current state + self.current_state = kb.clauses + # current action to state link + self.current_action_links = {} + # current state to action link + self.current_state_links = {} + # current action to next state link self.next_action_links = {} - # Next state to current action link - self.next_state_links_pos = {} - self.next_state_links_neg = {} + # next state to current action link + self.next_state_links = {} + # mutually exclusive actions self.mutex = [] def __call__(self, actions, objects): self.build(actions, objects) self.find_mutex() + def separate(self, e): + """Separates an iterable of elements into positive and negative parts""" + + positive = [] + negative = [] + for clause in e: + if clause.op[:3] == 'Not': + negative.append(clause) + else: + positive.append(clause) + return positive, negative + def find_mutex(self): + """Finds mutually exclusive actions""" + # Inconsistent effects - for poseff in self.next_state_links_pos: - negeff = poseff - if negeff in self.next_state_links_neg: - for a in self.next_state_links_pos[poseff]: - for b in self.next_state_links_neg[negeff]: - if set([a, b]) not in self.mutex: - self.mutex.append(set([a, b])) - - # Interference - for posprecond in self.current_state_links_pos: - negeff = posprecond - if negeff in self.next_state_links_neg: - for a in self.current_state_links_pos[posprecond]: - for b in self.next_state_links_neg[negeff]: - if set([a, b]) not in self.mutex: - 
self.mutex.append(set([a, b])) - - for negprecond in self.current_state_links_neg: - poseff = negprecond - if poseff in self.next_state_links_pos: - for a in self.next_state_links_pos[poseff]: - for b in self.current_state_links_neg[negprecond]: - if set([a, b]) not in self.mutex: - self.mutex.append(set([a, b])) + pos_nsl, neg_nsl = self.separate(self.next_state_links) + + for negeff in neg_nsl: + new_negeff = Expr(negeff.op[3:], *negeff.args) + for poseff in pos_nsl: + if new_negeff == poseff: + for a in self.next_state_links[poseff]: + for b in self.next_state_links[negeff]: + if {a, b} not in self.mutex: + self.mutex.append({a, b}) + + # Interference will be calculated with the last step + pos_csl, neg_csl = self.separate(self.current_state_links) # Competing needs - for posprecond in self.current_state_links_pos: - negprecond = posprecond - if negprecond in self.current_state_links_neg: - for a in self.current_state_links_pos[posprecond]: - for b in self.current_state_links_neg[negprecond]: - if set([a, b]) not in self.mutex: - self.mutex.append(set([a, b])) + for pos_precond in pos_csl: + for neg_precond in neg_csl: + new_neg_precond = Expr(neg_precond.op[3:], *neg_precond.args) + if new_neg_precond == pos_precond: + for a in self.current_state_links[pos_precond]: + for b in self.current_state_links[neg_precond]: + if {a, b} not in self.mutex: + self.mutex.append({a, b}) # Inconsistent support state_mutex = [] @@ -314,33 +826,26 @@ def find_mutex(self): else: next_state_1 = self.next_action_links[list(pair)[0]] if (len(next_state_0) == 1) and (len(next_state_1) == 1): - state_mutex.append(set([next_state_0[0], next_state_1[0]])) + state_mutex.append({next_state_0[0], next_state_1[0]}) - self.mutex = self.mutex+state_mutex + self.mutex = self.mutex + state_mutex def build(self, actions, objects): + """Populates the lists and dictionaries containing the state action dependencies""" - # Add persistence actions for positive states - for clause in 
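The "inconsistent effects" case of `find_mutex` marks two actions mutex when one deletes what the other adds, detected through the same `Not` prefix. A stand-alone check of that test (string ops in place of `Expr`, an assumption of this sketch):

```python
def negate(op):
    return op[3:] if op.startswith('Not') else 'Not' + op

def inconsistent_effects(effects_a, effects_b):
    # Mutex if some effect of one action is the negation of an effect of the other.
    return any(negate(e) in effects_b for e in effects_a)

# Bake(Cake) adds HaveCake; Eat(Cake) deletes it (adds NotHaveCake): mutex.
m = inconsistent_effects({'HaveCake'}, {'NotHaveCake', 'EatenCake'})
# RightSock and LeftSock touch disjoint fluents: not mutex.
n = inconsistent_effects({'RightSockOn'}, {'LeftSockOn'})
```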
self.current_state_pos: - self.current_action_links_pos[Expr('Persistence', clause)] = [clause] - self.next_action_links[Expr('Persistence', clause)] = [clause] - self.current_state_links_pos[clause] = [Expr('Persistence', clause)] - self.next_state_links_pos[clause] = [Expr('Persistence', clause)] - - # Add persistence actions for negative states - for clause in self.current_state_neg: - not_expr = Expr('not'+clause.op, clause.args) - self.current_action_links_neg[Expr('Persistence', not_expr)] = [clause] - self.next_action_links[Expr('Persistence', not_expr)] = [clause] - self.current_state_links_neg[clause] = [Expr('Persistence', not_expr)] - self.next_state_links_neg[clause] = [Expr('Persistence', not_expr)] + for clause in self.current_state: + p_expr = Expr('P' + clause.op, *clause.args) + self.current_action_links[p_expr] = [clause] + self.next_action_links[p_expr] = [clause] + self.current_state_links[clause] = [p_expr] + self.next_state_links[clause] = [p_expr] for a in actions: num_args = len(a.args) possible_args = tuple(itertools.permutations(objects, num_args)) for arg in possible_args: - if a.check_precond(self.poskb, arg): + if a.check_precond(self.kb, arg): for num, symbol in enumerate(a.args): if not symbol.op.islower(): arg = list(arg) @@ -348,47 +853,31 @@ def build(self, actions, objects): arg = tuple(arg) new_action = a.substitute(Expr(a.name, *a.args), arg) - self.current_action_links_pos[new_action] = [] - self.current_action_links_neg[new_action] = [] + self.current_action_links[new_action] = [] - for clause in a.precond_pos: + for clause in a.precond: new_clause = a.substitute(clause, arg) - self.current_action_links_pos[new_action].append(new_clause) - if new_clause in self.current_state_links_pos: - self.current_state_links_pos[new_clause].append(new_action) + self.current_action_links[new_action].append(new_clause) + if new_clause in self.current_state_links: + self.current_state_links[new_clause].append(new_action) else: - 
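`build` creates a persistence (no-op) action for every current-state literal by prefixing the op with `P`, and that naming convention is what `Linearize.filter` and `linearize` later rely on to strip the no-ops from a solution. A minimal sketch of the two sides of the convention (string ops, an assumption):

```python
def persistence_name(op):
    # A no-op that carries a literal unchanged to the next level: 'HaveCake' -> 'PHaveCake'.
    return 'P' + op

def is_persistence(op):
    # Mirrors the filter used on solutions: 'P' followed by an uppercase letter.
    return op[0] == 'P' and op[1].isupper()

names = [persistence_name(op) for op in ('HaveCake', 'NotHaveCake')]
```

Note why the check demands an uppercase second letter: a real action such as `PutOn` also begins with `P`, and the lowercase `u` is what keeps it from being mistaken for a persistence action.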
self.current_state_links_pos[new_clause] = [new_action] - - for clause in a.precond_neg: - new_clause = a.substitute(clause, arg) - self.current_action_links_neg[new_action].append(new_clause) - if new_clause in self.current_state_links_neg: - self.current_state_links_neg[new_clause].append(new_action) - else: - self.current_state_links_neg[new_clause] = [new_action] + self.current_state_links[new_clause] = [new_action] self.next_action_links[new_action] = [] - for clause in a.effect_add: + for clause in a.effect: new_clause = a.substitute(clause, arg) - self.next_action_links[new_action].append(new_clause) - if new_clause in self.next_state_links_pos: - self.next_state_links_pos[new_clause].append(new_action) - else: - self.next_state_links_pos[new_clause] = [new_action] - for clause in a.effect_rem: - new_clause = a.substitute(clause, arg) self.next_action_links[new_action].append(new_clause) - if new_clause in self.next_state_links_neg: - self.next_state_links_neg[new_clause].append(new_action) + if new_clause in self.next_state_links: + self.next_state_links[new_clause].append(new_action) else: - self.next_state_links_neg[new_clause] = [new_action] + self.next_state_links[new_clause] = [new_action] def perform_actions(self): - new_kb_pos = FolKB(list(set(self.next_state_links_pos.keys()))) - new_kb_neg = FolKB(list(set(self.next_state_links_neg.keys()))) + """Performs the necessary actions and returns a new Level""" - return Level(new_kb_pos, new_kb_neg) + new_kb = FolKB(list(set(self.next_state_links.keys()))) + return Level(new_kb) class Graph: @@ -397,20 +886,25 @@ class Graph: Used in graph planning algorithm to extract a solution """ - def __init__(self, pddl, negkb): - self.pddl = pddl - self.levels = [Level(pddl.kb, negkb)] - self.objects = set(arg for clause in pddl.kb.clauses + negkb.clauses for arg in clause.args) + def __init__(self, planning_problem): + self.planning_problem = planning_problem + self.kb = FolKB(planning_problem.initial) + 
self.levels = [Level(self.kb)] + self.objects = set(arg for clause in self.kb.clauses for arg in clause.args) def __call__(self): self.expand_graph() def expand_graph(self): + """Expands the graph by a level""" + last_level = self.levels[-1] - last_level(self.pddl.actions, self.objects) + last_level(self.planning_problem.actions, self.objects) self.levels.append(last_level.perform_actions()) def non_mutex_goals(self, goals, index): + """Checks whether the goals are mutually exclusive""" + goal_perm = itertools.combinations(goals, 2) for g in goal_perm: if set(g) in self.levels[index].mutex: @@ -425,39 +919,37 @@ class GraphPlan: Returns solution for the planning problem """ - def __init__(self, pddl, negkb): - self.graph = Graph(pddl, negkb) - self.nogoods = [] + def __init__(self, planning_problem): + self.graph = Graph(planning_problem) + self.no_goods = [] self.solution = [] def check_leveloff(self): - first_check = (set(self.graph.levels[-1].current_state_pos) == - set(self.graph.levels[-2].current_state_pos)) - second_check = (set(self.graph.levels[-1].current_state_neg) == - set(self.graph.levels[-2].current_state_neg)) + """Checks if the graph has levelled off""" - if first_check and second_check: + check = (set(self.graph.levels[-1].current_state) == set(self.graph.levels[-2].current_state)) + + if check: return True - def extract_solution(self, goals_pos, goals_neg, index): + def extract_solution(self, goals, index): + """Extracts the solution""" + level = self.graph.levels[index] - if not self.graph.non_mutex_goals(goals_pos+goals_neg, index): - self.nogoods.append((level, goals_pos, goals_neg)) + if not self.graph.non_mutex_goals(goals, index): + self.no_goods.append((level, goals)) return - level = self.graph.levels[index-1] + level = self.graph.levels[index - 1] # Create all combinations of actions that satisfy the goal actions = [] - for goal in goals_pos: - actions.append(level.next_state_links_pos[goal]) - - for goal in goals_neg: - 
actions.append(level.next_state_links_neg[goal])
+        for goal in goals:
+            actions.append(level.next_state_links[goal])
 
         all_actions = list(itertools.product(*actions))
 
-        # Filter out the action combinations which contain mutexes
+        # Filter out the action combinations that contain mutexes
         non_mutex_actions = []
         for action_tuple in all_actions:
             action_pairs = itertools.combinations(list(set(action_tuple)), 2)
@@ -472,22 +964,17 @@ def extract_solution(self, goals_pos, goals_neg, index):
             if [action_list, index] not in self.solution:
                 self.solution.append([action_list, index])
 
-                new_goals_pos = []
-                new_goals_neg = []
-                for act in set(action_list):
-                    if act in level.current_action_links_pos:
-                        new_goals_pos = new_goals_pos + level.current_action_links_pos[act]
-
+                new_goals = []
                 for act in set(action_list):
-                    if act in level.current_action_links_neg:
-                        new_goals_neg = new_goals_neg + level.current_action_links_neg[act]
+                    if act in level.current_action_links:
+                        new_goals = new_goals + level.current_action_links[act]
 
-                if abs(index)+1 == len(self.graph.levels):
+                if abs(index) + 1 == len(self.graph.levels):
                     return
-                elif (level, new_goals_pos, new_goals_neg) in self.nogoods:
+                elif (level, new_goals) in self.no_goods:
                     return
                 else:
-                    self.extract_solution(new_goals_pos, new_goals_neg, index-1)
+                    self.extract_solution(new_goals, index - 1)
 
         # Level-Order multiple solutions
         solution = []
@@ -504,58 +991,428 @@ def extract_solution(self, goals_pos, goals_neg, index):
         return solution
self.graph.planning_problem.goals, -1)): + solution = self.extract_solution(self.graph.planning_problem.goals, -1) + if solution: + return solution - # Not sure - goals_pos = [expr('At(Spare, Axle)'), expr('At(Flat, Ground)')] - goals_neg = [] + if len(self.graph.levels) >= 2 and self.check_leveloff(): + return None - while True: - if (goal_test(graphplan.graph.levels[-1].poskb, goals_pos) and - graphplan.graph.non_mutex_goals(goals_pos+goals_neg, -1)): - solution = graphplan.extract_solution(goals_pos, goals_neg, -1) - if solution: - return solution - graphplan.graph.expand_graph() - if len(graphplan.graph.levels)>=2 and graphplan.check_leveloff(): - return None +class Linearize: + + def __init__(self, planning_problem): + self.planning_problem = planning_problem + + def filter(self, solution): + """Filter out persistence actions from a solution""" + + new_solution = [] + for section in solution[0]: + new_section = [] + for operation in section: + if not (operation.op[0] == 'P' and operation.op[1].isupper()): + new_section.append(operation) + new_solution.append(new_section) + return new_solution + + def orderlevel(self, level, planning_problem): + """Return valid linear order of actions for a given level""" + + for permutation in itertools.permutations(level): + temp = copy.deepcopy(planning_problem) + count = 0 + for action in permutation: + try: + temp.act(action) + count += 1 + except: + count = 0 + temp = copy.deepcopy(planning_problem) + break + if count == len(permutation): + return list(permutation), temp + return None -def double_tennis_problem(): - init = [expr('At(A, LeftBaseLine)'), - expr('At(B, RightNet)'), - expr('Approaching(Ball, RightBaseLine)'), - expr('Partner(A, B)'), - expr('Partner(B, A)')] + def execute(self): + """Finds total-order solution for a planning graph""" + + graphPlan_solution = GraphPlan(self.planning_problem).execute() + filtered_solution = self.filter(graphPlan_solution) + ordered_solution = [] + planning_problem = 
self.planning_problem + for level in filtered_solution: + level_solution, planning_problem = self.orderlevel(level, planning_problem) + for element in level_solution: + ordered_solution.append(element) + + return ordered_solution + + +def linearize(solution): + """Converts a level-ordered solution into a linear solution""" + + linear_solution = [] + for section in solution[0]: + for operation in section: + if not (operation.op[0] == 'P' and operation.op[1].isupper()): + linear_solution.append(operation) + + return linear_solution + + +class PartialOrderPlanner: + """ + [Section 10.13] PARTIAL-ORDER-PLANNER + + Partially ordered plans are created by a search through the space of plans + rather than a search through the state space. It views planning as a refinement of partially ordered plans. + A partially ordered plan is defined by a set of actions and a set of constraints of the form A < B, + which denotes that action A has to be performed before action B. + To summarize the working of a partial order planner, + 1. An open precondition is selected (a sub-goal that we want to achieve). + 2. An action that fulfils the open precondition is chosen. + 3. Temporal constraints are updated. + 4. Existing causal links are protected. Protection is a method that checks if the causal links conflict + and if they do, temporal constraints are added to fix the threats. + 5. The set of open preconditions is updated. + 6. Temporal constraints of the selected action and the next action are established. + 7. A new causal link is added between the selected action and the owner of the open precondition. + 8. The set of new causal links is checked for threats and if found, the threat is removed by either promotion or + demotion. If promotion or demotion is unable to solve the problem, the planning problem cannot be solved with + the current sequence of actions or it may not be solvable at all. + 9. These steps are repeated until the set of open preconditions is empty. 
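The ordering constraints collected by the steps above form a directed graph over actions, and any topological order of that graph is a valid linear plan. As a standalone sketch of that last step (illustrative action names; not the class's own `toposort` method, which works on successor sets and reverses the result for display), a layered topological sort over predecessor constraints looks like:

```python
def layered_toposort(graph):
    """Yield sets of actions all of whose predecessors were already yielded.

    graph maps each action to the set of actions that must come before it.
    Raises ValueError if the ordering constraints are cyclic.
    """
    graph = {k: set(v) for k, v in graph.items()}
    for k, v in graph.items():
        v.discard(k)  # ignore self-dependencies
    # add actions that only ever appear as predecessors
    extra = set().union(*graph.values()) - set(graph.keys())
    graph.update({e: set() for e in extra})
    while graph:
        ready = {a for a, deps in graph.items() if not deps}
        if not ready:
            raise ValueError('ordering constraints are cyclic')
        yield ready
        graph = {a: deps - ready for a, deps in graph.items() if a not in ready}

# 'Start' must precede everything and 'Finish' must come last
layers = list(layered_toposort({'Finish': {'A', 'B'}, 'A': {'Start'}, 'B': {'Start'}}))
# layers == [{'Start'}, {'A', 'B'}, {'Finish'}]
```

Actions within one yielded set are unordered with respect to each other, which is exactly the freedom a partial-order plan preserves.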
+ """ + + def __init__(self, planning_problem): + self.tries = 1 + self.planning_problem = planning_problem + self.causal_links = [] + self.start = Action('Start', [], self.planning_problem.initial) + self.finish = Action('Finish', self.planning_problem.goals, []) + self.actions = set() + self.actions.add(self.start) + self.actions.add(self.finish) + self.constraints = set() + self.constraints.add((self.start, self.finish)) + self.agenda = set() + for precond in self.finish.precond: + self.agenda.add((precond, self.finish)) + self.expanded_actions = planning_problem.expand_actions() + + def find_open_precondition(self): + """Find open precondition with the least number of possible actions""" + + number_of_ways = dict() + actions_for_precondition = dict() + for element in self.agenda: + open_precondition = element[0] + possible_actions = list(self.actions) + self.expanded_actions + for action in possible_actions: + for effect in action.effect: + if effect == open_precondition: + if open_precondition in number_of_ways: + number_of_ways[open_precondition] += 1 + actions_for_precondition[open_precondition].append(action) + else: + number_of_ways[open_precondition] = 1 + actions_for_precondition[open_precondition] = [action] + + number = sorted(number_of_ways, key=number_of_ways.__getitem__) + + for k, v in number_of_ways.items(): + if v == 0: + return None, None, None + + act1 = None + for element in self.agenda: + if element[0] == number[0]: + act1 = element[1] + break + + if number[0] in self.expanded_actions: + self.expanded_actions.remove(number[0]) + + return number[0], act1, actions_for_precondition[number[0]] + + def find_action_for_precondition(self, oprec): + """Find action for a given precondition""" + + # either + # choose act0 E Actions such that act0 achieves G + for action in self.actions: + for effect in action.effect: + if effect == oprec: + return action, 0 + + # or + # choose act0 E Actions such that act0 achieves G + for action in 
self.planning_problem.actions: + for effect in action.effect: + if effect.op == oprec.op: + bindings = unify_mm(effect, oprec) + if bindings is None: + break + return action, bindings + + def generate_expr(self, clause, bindings): + """Generate atomic expression from generic expression given variable bindings""" + + new_args = [] + for arg in clause.args: + if arg in bindings: + new_args.append(bindings[arg]) + else: + new_args.append(arg) + + try: + return Expr(str(clause.name), *new_args) + except: + return Expr(str(clause.op), *new_args) + + def generate_action_object(self, action, bindings): + """Generate action object given a generic action and variable bindings""" + + # if bindings is 0, it means the action already exists in self.actions + if bindings == 0: + return action + + # bindings cannot be None + else: + new_expr = self.generate_expr(action, bindings) + new_preconds = [] + for precond in action.precond: + new_precond = self.generate_expr(precond, bindings) + new_preconds.append(new_precond) + new_effects = [] + for effect in action.effect: + new_effect = self.generate_expr(effect, bindings) + new_effects.append(new_effect) + return Action(new_expr, new_preconds, new_effects) + + def cyclic(self, graph): + """Check cyclicity of a directed graph""" + + new_graph = dict() + for element in graph: + if element[0] in new_graph: + new_graph[element[0]].append(element[1]) + else: + new_graph[element[0]] = [element[1]] + + path = set() + + def visit(vertex): + path.add(vertex) + for neighbor in new_graph.get(vertex, ()): + if neighbor in path or visit(neighbor): + return True + path.remove(vertex) + return False + + value = any(visit(v) for v in new_graph) + return value + + def add_const(self, constraint, constraints): + """Add the constraint to constraints if the resulting graph is acyclic""" + + if constraint[0] == self.finish or constraint[1] == self.start: + return constraints + + new_constraints = set(constraints) + new_constraints.add(constraint) + + if 
self.cyclic(new_constraints): + return constraints + return new_constraints + + def is_a_threat(self, precondition, effect): + """Check if effect is a threat to precondition""" - def goal_test(kb): - required = [expr('Goal(Returned(Ball))'), expr('At(a, RightNet)'), expr('At(a, LeftNet)')] - return all(kb.ask(q) is not False for q in required) + if (str(effect.op) == 'Not' + str(precondition.op)) or ('Not' + str(effect.op) == str(precondition.op)): + if effect.args == precondition.args: + return True + return False - # Actions + def protect(self, causal_link, action, constraints): + """Check and resolve threats by promotion or demotion""" - # Hit - precond_pos = [expr("Approaching(Ball,loc)"), expr("At(actor,loc)")] - precond_neg = [] - effect_add = [expr("Returned(Ball)")] - effect_rem = [] - hit = Action(expr("Hit(actor, Ball)"), [precond_pos, precond_neg], [effect_add, effect_rem]) + threat = False + for effect in action.effect: + if self.is_a_threat(causal_link[1], effect): + threat = True + break - # Go - precond_pos = [expr("At(actor, loc)")] - precond_neg = [] - effect_add = [expr("At(actor, to)")] - effect_rem = [expr("At(actor, loc)")] - go = Action(expr("Go(actor, to)"), [precond_pos, precond_neg], [effect_add, effect_rem]) + if action != causal_link[0] and action != causal_link[2] and threat: + # try promotion + new_constraints = set(constraints) + new_constraints.add((action, causal_link[0])) + if not self.cyclic(new_constraints): + constraints = self.add_const((action, causal_link[0]), constraints) + else: + # try demotion + new_constraints = set(constraints) + new_constraints.add((causal_link[2], action)) + if not self.cyclic(new_constraints): + constraints = self.add_const((causal_link[2], action), constraints) + else: + # both promotion and demotion fail + print('Unable to resolve a threat caused by', action, 'onto', causal_link) + return + return constraints + + def convert(self, constraints): + """Convert constraints into a dict of Action to set 
orderings""" + + graph = dict() + for constraint in constraints: + if constraint[0] in graph: + graph[constraint[0]].add(constraint[1]) + else: + graph[constraint[0]] = set() + graph[constraint[0]].add(constraint[1]) + return graph + + def toposort(self, graph): + """Generate topological ordering of constraints""" - return PDDL(init, [hit, go], goal_test) + if len(graph) == 0: + return + + graph = graph.copy() + + for k, v in graph.items(): + v.discard(k) + + extra_elements_in_dependencies = _reduce(set.union, graph.values()) - set(graph.keys()) + + graph.update({element: set() for element in extra_elements_in_dependencies}) + while True: + ordered = set(element for element, dependency in graph.items() if len(dependency) == 0) + if not ordered: + break + yield ordered + graph = {element: (dependency - ordered) + for element, dependency in graph.items() + if element not in ordered} + if len(graph) != 0: + raise ValueError('The graph is not acyclic and cannot be linearly ordered') + + def display_plan(self): + """Display causal links, constraints and the plan""" + + print('Causal Links') + for causal_link in self.causal_links: + print(causal_link) + + print('\nConstraints') + for constraint in self.constraints: + print(constraint[0], '<', constraint[1]) + + print('\nPartial Order Plan') + print(list(reversed(list(self.toposort(self.convert(self.constraints)))))) + + def execute(self, display=True): + """Execute the algorithm""" + + step = 1 + while len(self.agenda) > 0: + step += 1 + # select from Agenda + try: + G, act1, possible_actions = self.find_open_precondition() + except IndexError: + print('Probably Wrong') + break + + act0 = possible_actions[0] + # remove from Agenda + self.agenda.remove((G, act1)) + + # For actions with variable number of arguments, use least commitment principle + # act0_temp, bindings = self.find_action_for_precondition(G) + # act0 = self.generate_action_object(act0_temp, bindings) + + # Actions = Actions U {act0} + 
self.actions.add(act0) + + # Constraints = add_const(start < act0, Constraints) + self.constraints = self.add_const((self.start, act0), self.constraints) + + # for each CL E CausalLinks do + # Constraints = protect(CL, act0, Constraints) + for causal_link in self.causal_links: + self.constraints = self.protect(causal_link, act0, self.constraints) + + # Agenda = Agenda U {<P, act0>: P is a precondition of act0} + for precondition in act0.precond: + self.agenda.add((precondition, act0)) + + # Constraints = add_const(act0 < act1, Constraints) + self.constraints = self.add_const((act0, act1), self.constraints) + + # CausalLinks = CausalLinks U {<act0, G, act1>} + if (act0, G, act1) not in self.causal_links: + self.causal_links.append((act0, G, act1)) + + # for each A E Actions do + # Constraints = protect(<act0, G, act1>, A, Constraints) + for action in self.actions: + self.constraints = self.protect((act0, G, act1), action, self.constraints) + + if step > 200: + print("Couldn't find a solution") + return None, None + + if display: + self.display_plan() + else: + return self.constraints, self.causal_links + + +def spare_tire_graphPlan(): + """Solves the spare tire problem using GraphPlan""" + return GraphPlan(spare_tire()).execute() + + +def three_block_tower_graphPlan(): + """Solves the Sussman Anomaly problem using GraphPlan""" + return GraphPlan(three_block_tower()).execute() + + +def air_cargo_graphPlan(): + """Solves the air cargo problem using GraphPlan""" + return GraphPlan(air_cargo()).execute() + + +def have_cake_and_eat_cake_too_graphPlan(): + """Solves the cake problem using GraphPlan""" + return [GraphPlan(have_cake_and_eat_cake_too()).execute()[1]] + + +def shopping_graphPlan(): + """Solves the shopping problem using GraphPlan""" + return GraphPlan(shopping_problem()).execute() + + +def socks_and_shoes_graphPlan(): + """Solves the socks and shoes problem using GraphPlan""" + return GraphPlan(socks_and_shoes()).execute() + + +def simple_blocks_world_graphPlan(): + """Solves the simple blocks world problem"""
+ return GraphPlan(simple_blocks_world()).execute() class HLA(Action): @@ -565,18 +1422,19 @@ class HLA(Action): """ unique_group = 1 - def __init__(self, action, precond=[None, None], effect=[None, None], duration=0, - consume={}, use={}): + def __init__(self, action, precond=None, effect=None, duration=0, consume=None, use=None): """ As opposed to actions, to define HLA, we have added constraints. duration holds the amount of time required to execute the task consumes holds a dictionary representing the resources the task consumes uses holds a dictionary representing the resources the task uses """ + precond = precond or [None] + effect = effect or [None] super().__init__(action, precond, effect) self.duration = duration - self.consumes = consume - self.uses = use + self.consumes = consume or {} + self.uses = use or {} self.completed = False # self.priority = -1 # must be assigned in relation to other HLAs # self.job_group = -1 # must be assigned in relation to other HLAs @@ -586,7 +1444,6 @@ def do_action(self, job_order, available_resources, kb, args): An HLA based version of act - along with knowledge base updation, it handles resource checks, and ensures the actions are executed in the correct order. """ - # print(self.name) if not self.has_usable_resource(available_resources): raise Exception('Not enough usable resources to execute {}'.format(self.name)) if not self.has_consumable_resource(available_resources): @@ -594,10 +1451,11 @@ def do_action(self, job_order, available_resources, kb, args): if not self.inorder(job_order): raise Exception("Can't execute {} - execute prerequisite actions first". 
format(self.name)) - super().act(kb, args) # update knowledge base + kb = super().act(kb, args) # update knowledge base for resource in self.consumes: # remove consumed resources available_resources[resource] -= self.consumes[resource] self.completed = True # set the task status to complete + return kb @@ -636,18 +1494,19 @@ def inorder(self, job_order): return True -class Problem(PDDL): +class RealWorldPlanningProblem(PlanningProblem): """ Define real-world problems by aggregating resources as numerical quantities instead of named entities. - This class is identical to PDLL, except that it overloads the act function to handle + This class is identical to PlanningProblem, except that it overloads the act function to handle resource and ordering conditions imposed by HLA as opposed to Action. """ - def __init__(self, initial_state, actions, goal_test, jobs=None, resources={}): - super().__init__(initial_state, actions, goal_test) + + def __init__(self, initial, goals, actions, jobs=None, resources=None): + super().__init__(initial, goals, actions) self.jobs = jobs - self.resources = resources + self.resources = resources or {} def act(self, action): """ @@ -662,106 +1521,249 @@ def act(self, action): list_action = first(a for a in self.actions if a.name == action.name) if list_action is None: raise Exception("Action '{}' not found".format(action.name)) - list_action.do_action(self.jobs, self.resources, self.kb, args) + self.initial = list_action.do_action(self.jobs, self.resources, self.initial, args).clauses - def refinements(hla, state, library): # TODO - refinements may be (multiple) HLA themselves ... + def refinements(self, library): # refinements may be (multiple) HLA themselves ... """ - state is a Problem, containing the current state kb - library is a dictionary containing details for every possible refinement. 
eg: + State is a Problem, containing the current state kb library is a + dictionary containing details for every possible refinement. e.g.: { - "HLA": [ - "Go(Home,SFO)", - "Go(Home,SFO)", - "Drive(Home, SFOLongTermParking)", - "Shuttle(SFOLongTermParking, SFO)", - "Taxi(Home, SFO)" - ], - "steps": [ - ["Drive(Home, SFOLongTermParking)", "Shuttle(SFOLongTermParking, SFO)"], - ["Taxi(Home, SFO)"], - [], # empty refinements ie primitive action + 'HLA': [ + 'Go(Home, SFO)', + 'Go(Home, SFO)', + 'Drive(Home, SFOLongTermParking)', + 'Shuttle(SFOLongTermParking, SFO)', + 'Taxi(Home, SFO)' + ], + 'steps': [ + ['Drive(Home, SFOLongTermParking)', 'Shuttle(SFOLongTermParking, SFO)'], + ['Taxi(Home, SFO)'], + [], [], [] - ], - "precond_pos": [ - ["At(Home), Have(Car)"], - ["At(Home)"], - ["At(Home)", "Have(Car)"] - ["At(SFOLongTermParking)"] - ["At(Home)"] - ], - "precond_neg": [[],[],[],[],[]], - "effect_pos": [ - ["At(SFO)"], - ["At(SFO)"], - ["At(SFOLongTermParking)"], - ["At(SFO)"], - ["At(SFO)"] - ], - "effect_neg": [ - ["At(Home)"], - ["At(Home)"], - ["At(Home)"], - ["At(SFOLongTermParking)"], - ["At(Home)"] - ] - } - """ - e = Expr(hla.name, hla.args) - indices = [i for i, x in enumerate(library["HLA"]) if expr(x).op == hla.name] + ], + # empty refinements indicate a primitive action + 'precond': [ + ['At(Home) & Have(Car)'], + ['At(Home)'], + ['At(Home) & Have(Car)'], + ['At(SFOLongTermParking)'], + ['At(Home)'] + ], + 'effect': [ + ['At(SFO) & ~At(Home)'], + ['At(SFO) & ~At(Home)'], + ['At(SFOLongTermParking) & ~At(Home)'], + ['At(SFO) & ~At(SFOLongTermParking)'], + ['At(SFO) & ~At(Home)'] + ]} + """ + indices = [i for i, x in enumerate(library['HLA']) if expr(x).op == self.name] for i in indices: - action = HLA(expr(library["steps"][i][0]), [ # TODO multiple refinements - [expr(x) for x in library["precond_pos"][i]], - [expr(x) for x in library["precond_neg"][i]] - ], - [ - [expr(x) for x in library["effect_pos"][i]], - [expr(x) for x in library["effect_neg"][i]] - 
]) - if action.check_precond(state.kb, action.args): - yield action - - def hierarchical_search(problem, hierarchy): - """ - [Figure 11.5] 'Hierarchical Search, a Breadth First Search implementation of Hierarchical + actions = [] + for j in range(len(library['steps'][i])): + # find the index of the step [j] of the HLA + index_step = [k for k, x in enumerate(library['HLA']) if x == library['steps'][i][j]][0] + precond = library['precond'][index_step][0] # preconditions of step [j] + effect = library['effect'][index_step][0] # effect of step [j] + actions.append(HLA(library['steps'][i][j], precond, effect)) + yield actions + + def hierarchical_search(self, hierarchy): + """ + [Figure 11.5] + 'Hierarchical Search, a Breadth First Search implementation of Hierarchical Forward Planning Search' - The problem is a real-world prodlem defined by the problem class, and the hierarchy is + The problem is a real-world problem defined by the problem class, and the hierarchy is a dictionary of HLA - refinements (see refinements generator for details) """ - act = Node(problem.actions[0]) - frontier = FIFOQueue() + act = Node(self.initial, None, [self.actions[0]]) + frontier = deque() frontier.append(act) - while(True): + while True: if not frontier: return None - plan = frontier.pop() - print(plan.state.name) - hla = plan.state # first_or_null(plan) - prefix = None - if plan.parent: - prefix = plan.parent.state.action # prefix, suffix = subseq(plan.state, hla) - outcome = Problem.result(problem, prefix) - if hla is None: + plan = frontier.popleft() + # finds the first non primitive hla in plan actions + (hla, index) = RealWorldPlanningProblem.find_hla(plan, hierarchy) + prefix = plan.action[:index] + outcome = RealWorldPlanningProblem( + RealWorldPlanningProblem.result(self.initial, prefix), self.goals, self.actions) + suffix = plan.action[index + 1:] + if not hla: # hla is None and plan is primitive if outcome.goal_test(): - return plan.path() + return plan.action else: - 
print("else") - for sequence in Problem.refinements(hla, outcome, hierarchy): - print("...") - frontier.append(Node(plan.state, plan.parent, sequence)) + for sequence in RealWorldPlanningProblem.refinements(hla, hierarchy): # find refinements + frontier.append(Node(outcome.initial, plan, prefix + sequence + suffix)) - def result(problem, action): + def result(state, actions): """The outcome of applying an action to the current problem""" - if action is not None: - problem.act(action) - return problem - else: - return problem + for a in actions: + if a.check_precond(state, a.args): + state = a(state, a.args).clauses + return state + + def angelic_search(self, hierarchy, initial_plan): + """ + [Figure 11.8] + A hierarchical planning algorithm that uses angelic semantics to identify and + commit to high-level plans that work while avoiding high-level plans that don’t. + The predicate MAKING-PROGRESS checks to make sure that we aren’t stuck in an infinite regression + of refinements. + At top level, call ANGELIC-SEARCH with [Act] as the initialPlan. 
+ + InitialPlan contains a sequence of HLA's with angelic semantics + + The possible effects of an angelic HLA in initialPlan are: + ~ : effect remove + $+: effect possibly add + $-: effect possibly remove + $$: possibly add or remove + """ + frontier = deque(initial_plan) + while True: + if not frontier: + return None + plan = frontier.popleft() # sequence of HLA/Angelic HLA's + opt_reachable_set = RealWorldPlanningProblem.reach_opt(self.initial, plan) + pes_reachable_set = RealWorldPlanningProblem.reach_pes(self.initial, plan) + if self.intersects_goal(opt_reachable_set): + if RealWorldPlanningProblem.is_primitive(plan, hierarchy): + return [x for x in plan.action] + guaranteed = self.intersects_goal(pes_reachable_set) + if guaranteed and RealWorldPlanningProblem.making_progress(plan, initial_plan): + final_state = guaranteed[0] # any element of guaranteed + return RealWorldPlanningProblem.decompose(hierarchy, plan, final_state, pes_reachable_set) + # there should be at least one HLA/AngelicHLA, otherwise plan would be primitive + hla, index = RealWorldPlanningProblem.find_hla(plan, hierarchy) + prefix = plan.action[:index] + suffix = plan.action[index + 1:] + outcome = RealWorldPlanningProblem( + RealWorldPlanningProblem.result(self.initial, prefix), self.goals, self.actions) + for sequence in RealWorldPlanningProblem.refinements(hla, hierarchy): # find refinements + frontier.append( + AngelicNode(outcome.initial, plan, prefix + sequence + suffix, prefix + sequence + suffix)) + + def intersects_goal(self, reachable_set): + """ + Find the intersection of the reachable states and the goal + """ + return [y for x in list(reachable_set.keys()) + for y in reachable_set[x] + if all(goal in y for goal in self.goals)] + + def is_primitive(plan, library): + """ + Checks if the hla is a primitive action + """ + for hla in plan.action: + indices = [i for i, x in enumerate(library['HLA']) if expr(x).op == hla.name] + for i in indices: + if library["steps"][i]: + return False + 
return True + + def reach_opt(init, plan): + """ + Finds the optimistic reachable set of the sequence of actions in plan + """ + reachable_set = {0: [init]} + optimistic_description = plan.action # list of angelic actions with optimistic description + return RealWorldPlanningProblem.find_reachable_set(reachable_set, optimistic_description) + + def reach_pes(init, plan): + """ + Finds the pessimistic reachable set of the sequence of actions in plan + """ + reachable_set = {0: [init]} + pessimistic_description = plan.action_pes # list of angelic actions with pessimistic description + return RealWorldPlanningProblem.find_reachable_set(reachable_set, pessimistic_description) + + def find_reachable_set(reachable_set, action_description): + """ + Finds the reachable states of the action_description when applied in each state of the reachable set. + """ + for i in range(len(action_description)): + reachable_set[i + 1] = [] + if type(action_description[i]) is AngelicHLA: + possible_actions = action_description[i].angelic_action() + else: + possible_actions = action_description + for action in possible_actions: + for state in reachable_set[i]: + if action.check_precond(state, action.args): + if action.effect[0]: + new_state = action(state, action.args).clauses + reachable_set[i + 1].append(new_state) + else: + reachable_set[i + 1].append(state) + return reachable_set + + def find_hla(plan, hierarchy): + """ + Finds the first HLA action in plan.action, which is not primitive + and its corresponding index in plan.action + """ + hla = None + index = len(plan.action) + for i in range(len(plan.action)): # find the first HLA in plan, that is not primitive + if not RealWorldPlanningProblem.is_primitive(Node(plan.state, plan.parent, [plan.action[i]]), hierarchy): + hla = plan.action[i] + index = i + break + return hla, index + + def making_progress(plan, initial_plan): + """ + Prevents infinite regression of refinements + + (infinite regression of refinements happens when the 
algorithm finds a plan whose + pessimistic reachable set intersects the goal inside a call to decompose on + the same plan, in the same circumstances) + """ + for i in range(len(initial_plan)): + if plan == initial_plan[i]: + return False + return True + + def decompose(hierarchy, plan, s_f, reachable_set): + solution = [] + i = max(reachable_set.keys()) + while plan.action_pes: + action = plan.action_pes.pop() + if i == 0: + return solution + s_i = RealWorldPlanningProblem.find_previous_state(s_f, reachable_set, i, action) + problem = RealWorldPlanningProblem(s_i, s_f, plan.action) + angelic_call = RealWorldPlanningProblem.angelic_search(problem, hierarchy, + [AngelicNode(s_i, Node(None), [action], [action])]) + if angelic_call: + for x in angelic_call: + solution.insert(0, x) + else: + return None + s_f = s_i + i -= 1 + return solution + + def find_previous_state(s_f, reachable_set, i, action): + """ + Given a final state s_f and an action, finds a state s_i in reachable_set + such that applying the action to state s_i yields s_f. + """ + s_i = reachable_set[i - 1][0] + for state in reachable_set[i - 1]: + if s_f in [x for x in RealWorldPlanningProblem.reach_pes( + state, AngelicNode(state, None, [action], [action]))[1]]: + s_i = state + break + return s_i def job_shop_problem(): """ - [figure 11.1] JOB-SHOP-PROBLEM + [Figure 11.1] JOB-SHOP-PROBLEM A job-shop scheduling problem for assembling two cars, with resource and ordering constraints. 
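The job-shop problem below leans on the HLA resource machinery, which distinguishes reusable resources (`use`, e.g. a wheel station that is freed after the action) from consumable ones (`consume`, e.g. lug nuts that are permanently deducted). A minimal, illustrative sketch of that bookkeeping (the function name and structure are assumptions for this example, not the library API):

```python
def apply_hla_resources(available, use=None, consume=None):
    """Toy version of the resource checks an HLA performs before acting."""
    use, consume = use or {}, consume or {}
    # reusable resources only need to be present in sufficient quantity
    for resource, needed in use.items():
        if available.get(resource, 0) < needed:
            raise ValueError('not enough usable resource: ' + resource)
    # consumable resources must be present and are then deducted for good
    for resource, needed in consume.items():
        if available.get(resource, 0) < needed:
            raise ValueError('not enough consumable resource: ' + resource)
    for resource, needed in consume.items():
        available[resource] -= needed
    return available

pool = {'EngineHoists': 1, 'WheelStations': 2, 'Inspectors': 2, 'LugNuts': 500}
apply_hla_resources(pool, use={'WheelStations': 1}, consume={'LugNuts': 20})
# pool['LugNuts'] is now 480, while pool['WheelStations'] is still 2
```

This mirrors why `AddWheels1` in the problem below can run repeatedly at a wheel station but steadily drains the shared `LugNuts` supply.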
@@ -783,82 +1785,226 @@ def job_shop_problem(): True >>> """ - init = [expr('Car(C1)'), - expr('Car(C2)'), - expr('Wheels(W1)'), - expr('Wheels(W2)'), - expr('Engine(E2)'), - expr('Engine(E2)')] - - def goal_test(kb): - # print(kb.clauses) - required = [expr('Has(C1, W1)'), expr('Has(C1, E1)'), expr('Inspected(C1)'), - expr('Has(C2, W2)'), expr('Has(C2, E2)'), expr('Inspected(C2)')] - for q in required: - # print(q) - # print(kb.ask(q)) - if kb.ask(q) is False: - return False - return True - resources = {'EngineHoists': 1, 'WheelStations': 2, 'Inspectors': 2, 'LugNuts': 500} - # AddEngine1 - precond_pos = [] - precond_neg = [expr("Has(C1,E1)")] - effect_add = [expr("Has(C1,E1)")] - effect_rem = [] - add_engine1 = HLA(expr("AddEngine1"), - [precond_pos, precond_neg], [effect_add, effect_rem], - duration=30, use={'EngineHoists': 1}) - - # AddEngine2 - precond_pos = [] - precond_neg = [expr("Has(C2,E2)")] - effect_add = [expr("Has(C2,E2)")] - effect_rem = [] - add_engine2 = HLA(expr("AddEngine2"), - [precond_pos, precond_neg], [effect_add, effect_rem], - duration=60, use={'EngineHoists': 1}) - - # AddWheels1 - precond_pos = [] - precond_neg = [expr("Has(C1,W1)")] - effect_add = [expr("Has(C1,W1)")] - effect_rem = [] - add_wheels1 = HLA(expr("AddWheels1"), - [precond_pos, precond_neg], [effect_add, effect_rem], - duration=30, consume={'LugNuts': 20}, use={'WheelStations': 1}) - - # AddWheels2 - precond_pos = [] - precond_neg = [expr("Has(C2,W2)")] - effect_add = [expr("Has(C2,W2)")] - effect_rem = [] - add_wheels2 = HLA(expr("AddWheels2"), - [precond_pos, precond_neg], [effect_add, effect_rem], - duration=15, consume={'LugNuts': 20}, use={'WheelStations': 1}) - - # Inspect1 - precond_pos = [] - precond_neg = [expr("Inspected(C1)")] - effect_add = [expr("Inspected(C1)")] - effect_rem = [] - inspect1 = HLA(expr("Inspect1"), - [precond_pos, precond_neg], [effect_add, effect_rem], - duration=10, use={'Inspectors': 1}) - - # Inspect2 - precond_pos = [] - precond_neg = 
[expr("Inspected(C2)")] - effect_add = [expr("Inspected(C2)")] - effect_rem = [] - inspect2 = HLA(expr("Inspect2"), - [precond_pos, precond_neg], [effect_add, effect_rem], - duration=10, use={'Inspectors': 1}) + add_engine1 = HLA('AddEngine1', precond='~Has(C1, E1)', effect='Has(C1, E1)', duration=30, use={'EngineHoists': 1}) + add_engine2 = HLA('AddEngine2', precond='~Has(C2, E2)', effect='Has(C2, E2)', duration=60, use={'EngineHoists': 1}) + add_wheels1 = HLA('AddWheels1', precond='~Has(C1, W1)', effect='Has(C1, W1)', duration=30, use={'WheelStations': 1}, + consume={'LugNuts': 20}) + add_wheels2 = HLA('AddWheels2', precond='~Has(C2, W2)', effect='Has(C2, W2)', duration=15, use={'WheelStations': 1}, + consume={'LugNuts': 20}) + inspect1 = HLA('Inspect1', precond='~Inspected(C1)', effect='Inspected(C1)', duration=10, use={'Inspectors': 1}) + inspect2 = HLA('Inspect2', precond='~Inspected(C2)', effect='Inspected(C2)', duration=10, use={'Inspectors': 1}) + + actions = [add_engine1, add_engine2, add_wheels1, add_wheels2, inspect1, inspect2] job_group1 = [add_engine1, add_wheels1, inspect1] job_group2 = [add_engine2, add_wheels2, inspect2] - return Problem(init, [add_engine1, add_engine2, add_wheels1, add_wheels2, inspect1, inspect2], - goal_test, [job_group1, job_group2], resources) + return RealWorldPlanningProblem( + initial='Car(C1) & Car(C2) & Wheels(W1) & Wheels(W2) & Engine(E2) & Engine(E2) & ~Has(C1, E1) & ~Has(C2, ' + 'E2) & ~Has(C1, W1) & ~Has(C2, W2) & ~Inspected(C1) & ~Inspected(C2)', + goals='Has(C1, W1) & Has(C1, E1) & Inspected(C1) & Has(C2, W2) & Has(C2, E2) & Inspected(C2)', + actions=actions, + jobs=[job_group1, job_group2], + resources=resources) + + +def go_to_sfo(): + """Go to SFO Problem""" + + go_home_sfo1 = HLA('Go(Home, SFO)', precond='At(Home) & Have(Car)', effect='At(SFO) & ~At(Home)') + go_home_sfo2 = HLA('Go(Home, SFO)', precond='At(Home)', effect='At(SFO) & ~At(Home)') + drive_home_sfoltp = HLA('Drive(Home, SFOLongTermParking)', 
precond='At(Home) & Have(Car)', + effect='At(SFOLongTermParking) & ~At(Home)') + shuttle_sfoltp_sfo = HLA('Shuttle(SFOLongTermParking, SFO)', precond='At(SFOLongTermParking)', + effect='At(SFO) & ~At(SFOLongTermParking)') + taxi_home_sfo = HLA('Taxi(Home, SFO)', precond='At(Home)', effect='At(SFO) & ~At(Home)') + + actions = [go_home_sfo1, go_home_sfo2, drive_home_sfoltp, shuttle_sfoltp_sfo, taxi_home_sfo] + + library = { + 'HLA': [ + 'Go(Home, SFO)', + 'Go(Home, SFO)', + 'Drive(Home, SFOLongTermParking)', + 'Shuttle(SFOLongTermParking, SFO)', + 'Taxi(Home, SFO)' + ], + 'steps': [ + ['Drive(Home, SFOLongTermParking)', 'Shuttle(SFOLongTermParking, SFO)'], + ['Taxi(Home, SFO)'], + [], + [], + [] + ], + 'precond': [ + ['At(Home) & Have(Car)'], + ['At(Home)'], + ['At(Home) & Have(Car)'], + ['At(SFOLongTermParking)'], + ['At(Home)'] + ], + 'effect': [ + ['At(SFO) & ~At(Home)'], + ['At(SFO) & ~At(Home)'], + ['At(SFOLongTermParking) & ~At(Home)'], + ['At(SFO) & ~At(SFOLongTermParking)'], + ['At(SFO) & ~At(Home)']]} + + return RealWorldPlanningProblem(initial='At(Home)', goals='At(SFO)', actions=actions), library + + +class AngelicHLA(HLA): + """ + Define Actions for the real-world (that may be refined further), under angelic semantics + """ + + def __init__(self, action, precond, effect, duration=0, consume=None, use=None): + super().__init__(action, precond, effect, duration, consume, use) + + def convert(self, clauses): + """ + Converts strings into Exprs + An HLA with angelic semantics can achieve the effects of simple HLA's (add / remove a variable) + and furthermore can have following effects on the variables: + Possibly add variable ( $+ ) + Possibly remove variable ( $- ) + Possibly add or remove a variable ( $$ ) + + Overrides HLA.convert function + """ + lib = {'~': 'Not', + '$+': 'PosYes', + '$-': 'PosNot', + '$$': 'PosYesNot'} + + if isinstance(clauses, Expr): + clauses = conjuncts(clauses) + for i in range(len(clauses)): + for ch in lib.keys(): + if 
clauses[i].op == ch: + clauses[i] = expr(lib[ch] + str(clauses[i].args[0])) + + elif isinstance(clauses, str): + for ch in lib.keys(): + clauses = clauses.replace(ch, lib[ch]) + if len(clauses) > 0: + clauses = expr(clauses) + + try: + clauses = conjuncts(clauses) + except AttributeError: + pass + + return clauses + + def angelic_action(self): + """ + Converts a high level action (HLA) with angelic semantics into all of its corresponding high level actions (HLA). + An HLA with angelic semantics can achieve the effects of simple HLA's (add / remove a variable) + and furthermore can have following effects for each variable: + + Possibly add variable ( $+: 'PosYes' ) --> corresponds to two HLAs: + HLA_1: add variable + HLA_2: leave variable unchanged + + Possibly remove variable ( $-: 'PosNot' ) --> corresponds to two HLAs: + HLA_1: remove variable + HLA_2: leave variable unchanged + + Possibly add / remove a variable ( $$: 'PosYesNot' ) --> corresponds to three HLAs: + HLA_1: add variable + HLA_2: remove variable + HLA_3: leave variable unchanged + + + example: the angelic action with effects possibly add A and possibly add or remove B corresponds to the + following 6 effects of HLAs: + + + '$+A & $$B': HLA_1: 'A & B' (add A and add B) + HLA_2: 'A & ~B' (add A and remove B) + HLA_3: 'A' (add A) + HLA_4: 'B' (add B) + HLA_5: '~B' (remove B) + HLA_6: ' ' (no effect) + + """ + + effects = [[]] + for clause in self.effect: + (n, w) = AngelicHLA.compute_parameters(clause) + effects = effects * n # create n copies of effects + it = range(1) + if len(effects) != 0: + # split effects into n sublists (separate n copies created in compute_parameters) + it = range(len(effects) // n) + for i in it: + if effects[i]: + if clause.args: + effects[i] = expr(str(effects[i]) + '&' + str( + Expr(clause.op[w:], clause.args[0]))) # make changes in the ith part of effects + if n == 3: + effects[i + len(effects) // 3] = expr( + str(effects[i + len(effects) // 3]) + '&' + 
str(Expr(clause.op[6:], clause.args[0]))) + else: + effects[i] = expr( + str(effects[i]) + '&' + str(expr(clause.op[w:]))) # make changes in the ith part of effects + if n == 3: + effects[i + len(effects) // 3] = expr( + str(effects[i + len(effects) // 3]) + '&' + str(expr(clause.op[6:]))) + + else: + if clause.args: + effects[i] = Expr(clause.op[w:], clause.args[0]) # make changes in the ith part of effects + if n == 3: + effects[i + len(effects) // 3] = Expr(clause.op[6:], clause.args[0]) + + else: + effects[i] = expr(clause.op[w:]) # make changes in the ith part of effects + if n == 3: + effects[i + len(effects) // 3] = expr(clause.op[6:]) + + return [HLA(Expr(self.name, self.args), self.precond, effects[i]) for i in range(len(effects))] + + def compute_parameters(clause): + """ + computes n,w + + n = number of HLA effects that the angelic HLA corresponds to + w = length of representation of angelic HLA effect + + n = 1, if effect is add + n = 1, if effect is remove + n = 2, if effect is possibly add + n = 2, if effect is possibly remove + n = 3, if effect is possibly add or remove + + """ + if clause.op[:9] == 'PosYesNot': + # possibly add/remove variable: three possible effects for the variable + n = 3 + w = 9 + elif clause.op[:6] == 'PosYes': # possibly add variable: two possible effects for the variable + n = 2 + w = 6 + elif clause.op[:6] == 'PosNot': # possibly remove variable: two possible effects for the variable + n = 2 + w = 3 # We want to keep 'Not' from 'PosNot' when adding action + else: # variable or ~variable + n = 1 + w = 0 + return n, w + + +class AngelicNode(Node): + """ + Extends the class Node. 
+ self.action: contains the optimistic description of an angelic HLA + self.action_pes: contains the pessimistic description of an angelic HLA + """ + + def __init__(self, state, parent=None, action_opt=None, action_pes=None, path_cost=0): + super().__init__(state, parent, action_opt, path_cost) + self.action_pes = action_pes diff --git a/planning_angelic_search.ipynb b/planning_angelic_search.ipynb new file mode 100644 index 000000000..71408e1d9 --- /dev/null +++ b/planning_angelic_search.ipynb @@ -0,0 +1,638 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Angelic Search \n", + "\n", + "Search using angelic semantics (a hierarchical search), where the agent chooses the implementation of the HLAs.
    \n", + "The algorithm's input is: problem, hierarchy and initialPlan\n", + "- problem is of type Problem \n", + "- hierarchy is a dictionary consisting of all the actions. \n", + "- initialPlan is an approximate description (optimistic and pessimistic) of the agent's choices for the implementation.
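The angelic effect annotations used throughout these plans mark effects as definite or merely possible: `~` removes a variable, `$+` possibly adds one, `$-` possibly removes one, and `$$` possibly adds or removes one. Each possibility multiplies the number of concrete (non-angelic) effect combinations an angelic effect stands for. A minimal standalone sketch of that counting (a hypothetical helper, not the repository's code):

```python
# Hypothetical helper, not part of aima-python: count how many concrete
# (non-angelic) effect combinations an angelic effect corresponds to.
#   plain literal or ~literal          -> 1 outcome
#   $+ (possibly add), $- (possibly remove) -> 2 outcomes each
#   $$ (possibly add or remove)        -> 3 outcomes
def outcome_count(effect_clauses):
    counts = {'$$': 3, '$+': 2, '$-': 2}
    total = 1
    for clause in effect_clauses:
        total *= counts.get(clause[:2], 1)
    return total

# '$+A & $$B' stands for 2 * 3 = 6 concrete effects, matching the six
# HLAs enumerated in the AngelicHLA.angelic_action docstring.
print(outcome_count(['$+A', '$$B']))  # -> 6
```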
    \n", + " initialPlan contains a sequence of HLAs with angelic semantics" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": {}, + "outputs": [], + "source": [ + "from planning import * \n", + "from notebook import psource" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The Angelic search algorithm consists of three parts. \n", + "- Search using angelic semantics\n", + "- Decompose\n", + "- A search in the space of refinements, similar to hierarchical search\n", + "\n", + "### Searching using angelic semantics\n", + "- Find the reachable set (optimistic and pessimistic) of the sequence of angelic HLAs in initialPlan\n", + " - If the optimistic reachable set doesn't intersect the goal, then there is no solution\n", + " - If the pessimistic reachable set intersects the goal, then we call decompose in order to find the sequence of actions that leads us to the goal. \n", + " - If the optimistic reachable set intersects the goal but the pessimistic one doesn't, we do some further refinements in order to see if there is a sequence of actions that achieves the goal. \n", + " \n", + "### Search in the space of refinements\n", + "- Create a search tree whose root is the action and whose children are its refinements\n", + "- Extend the frontier by adding each refinement, so that we keep looping until we find all primitive actions\n", + "- If we achieve that, we return the path of the solution (search tree); otherwise there is no solution and we return None.\n", + "\n", + " \n" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
        def angelic_search(problem, hierarchy, initialPlan):\n",
    +       "        """\n",
    +       "\t[Figure 11.8] A hierarchical planning algorithm that uses angelic semantics to identify and\n",
    +       "\tcommit to high-level plans that work while avoiding high-level plans that don’t. \n",
    +       "\tThe predicate MAKING-PROGRESS checks to make sure that we aren’t stuck in an infinite regression\n",
    +       "\tof refinements. \n",
    +       "\tAt top level, call ANGELIC-SEARCH with [Act] as the initialPlan.\n",
    +       "\n",
    +       "        initialPlan contains a sequence of HLA's with angelic semantics \n",
    +       "\n",
    +       "        The possible effects of an angelic HLA in initialPlan are : \n",
    +       "        ~ : effect remove\n",
    +       "        $+: effect possibly add\n",
    +       "        $-: effect possibly remove\n",
    +       "        $$: possibly add or remove\n",
    +       "\t"""\n",
    +       "        frontier = deque(initialPlan)\n",
    +       "        while True: \n",
    +       "            if not frontier:\n",
    +       "                return None\n",
    +       "            plan = frontier.popleft() # sequence of HLA/Angelic HLA's \n",
    +       "            opt_reachable_set = Problem.reach_opt(problem.init, plan)\n",
    +       "            pes_reachable_set = Problem.reach_pes(problem.init, plan)\n",
    +       "            if problem.intersects_goal(opt_reachable_set): \n",
    +       "                if Problem.is_primitive( plan, hierarchy ): \n",
    +       "                    return ([x for x in plan.action])\n",
    +       "                guaranteed = problem.intersects_goal(pes_reachable_set) \n",
    +       "                if guaranteed and Problem.making_progress(plan, initialPlan):\n",
    +       "                    final_state = guaranteed[0] # any element of guaranteed \n",
    +       "                    #print('decompose')\n",
    +       "                    return Problem.decompose(hierarchy, problem, plan, final_state, pes_reachable_set)\n",
    +       "                (hla, index) = Problem.find_hla(plan, hierarchy) # there should be at least one HLA/Angelic_HLA, otherwise plan would be primitive.\n",
    +       "                prefix = plan.action[:index]\n",
    +       "                suffix = plan.action[index+1:]\n",
    +       "                outcome = Problem(Problem.result(problem.init, prefix), problem.goals , problem.actions )\n",
    +       "                for sequence in Problem.refinements(hla, outcome, hierarchy): # find refinements\n",
    +       "                    frontier.append(Angelic_Node(outcome.init, plan, prefix + sequence+ suffix, prefix+sequence+suffix))\n",
    +       "
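The frontier mechanics of the loop above can be miniaturized in plain Python: repeatedly pop a plan, replace its first non-primitive step with each of its refinements, and stop when a fully primitive plan appears. This is a toy sketch with an assumed one-entry hierarchy, not the repository's `Problem.refinements` machinery:

```python
from collections import deque

# Toy hierarchy (assumed names): 'Go' refines into two alternative sequences.
REFINEMENTS = {'Go': [['Drive', 'Shuttle'], ['Taxi']]}

def first_primitive_plan(initial_plan):
    """BFS over refinements until every step in a plan is primitive."""
    frontier = deque([initial_plan])
    while frontier:
        plan = frontier.popleft()
        # index of the first step that still has refinements, if any
        i = next((k for k, step in enumerate(plan) if step in REFINEMENTS), None)
        if i is None:
            return plan  # all steps are primitive
        for seq in REFINEMENTS[plan[i]]:
            frontier.append(plan[:i] + seq + plan[i + 1:])
    return None

print(first_primitive_plan(['Go']))  # -> ['Drive', 'Shuttle']
```

As in the algorithm, each refinement is spliced in place of the refined step (the `prefix + sequence + suffix` pattern), and the first plan whose steps are all primitive is returned.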
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(Problem.angelic_search)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "### Decompose \n", + "- Finds recursively the sequence of states and actions that lead us from initial state to goal.\n", + "- For each of the above actions we find their refinements,if they are not primitive, by calling the angelic_search function. \n", + " If there are not refinements return None\n", + " \n" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
        def decompose(hierarchy, s_0, plan, s_f, reachable_set):\n",
    +       "        solution = [] \n",
    +       "        i = max(reachable_set.keys())\n",
    +       "        while plan.action_pes: \n",
    +       "            action = plan.action_pes.pop()\n",
    +       "            if (i==0): \n",
    +       "                return solution\n",
    +       "            s_i = Problem.find_previous_state(s_f, reachable_set,i, action) \n",
    +       "            problem = Problem(s_i, s_f , plan.action)\n",
    +       "            angelic_call = Problem.angelic_search(problem, hierarchy, [Angelic_Node(s_i, Node(None), [action],[action])])\n",
    +       "            if angelic_call:\n",
    +       "                for x in angelic_call: \n",
    +       "                    solution.insert(0,x)\n",
    +       "            else: \n",
    +       "                return None\n",
    +       "            s_f = s_i\n",
    +       "            i-=1\n",
    +       "        return solution\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(Problem.decompose)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Example\n", + "\n", + "Suppose that somebody wants to get to the airport. \n", + "The possible ways to do so is either get a taxi, or drive to the airport.
    \n", + "Those two actions have some preconditions and some effects. \n", + "If you get the taxi, you need to have cash, whereas if you drive you need to have a car.
    \n", + "Thus we define the following hierarchy of possible actions.\n", + "\n", + "##### hierarchy" + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [], + "source": [ + "library = {\n", + " 'HLA': ['Go(Home,SFO)', 'Go(Home,SFO)', 'Drive(Home, SFOLongTermParking)', 'Shuttle(SFOLongTermParking, SFO)', 'Taxi(Home, SFO)'],\n", + " 'steps': [['Drive(Home, SFOLongTermParking)', 'Shuttle(SFOLongTermParking, SFO)'], ['Taxi(Home, SFO)'], [], [], []],\n", + " 'precond': [['At(Home) & Have(Car)'], ['At(Home)'], ['At(Home) & Have(Car)'], ['At(SFOLongTermParking)'], ['At(Home)']],\n", + " 'effect': [['At(SFO) & ~At(Home)'], ['At(SFO) & ~At(Home) & ~Have(Cash)'], ['At(SFOLongTermParking) & ~At(Home)'], ['At(SFO) & ~At(LongTermParking)'], ['At(SFO) & ~At(Home) & ~Have(Cash)']] }\n", + "\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "The possible actions are the following:" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": {}, + "outputs": [], + "source": [ + "go_SFO = HLA('Go(Home,SFO)', precond='At(Home)', effect='At(SFO) & ~At(Home)')\n", + "taxi_SFO = HLA('Taxi(Home,SFO)', precond='At(Home)', effect='At(SFO) & ~At(Home) & ~Have(Cash)')\n", + "drive_SFOLongTermParking = HLA('Drive(Home, SFOLongTermParking)', 'At(Home) & Have(Car)','At(SFOLongTermParking) & ~At(Home)' )\n", + "shuttle_SFO = HLA('Shuttle(SFOLongTermParking, SFO)', 'At(SFOLongTermParking)', 'At(SFO) & ~At(LongTermParking)')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Suppose that (our preconditions are that) we are at Home and we have cash and a car, our goal is to get to SFO and maintain our cash, and our possible actions are the above.
    \n", + "##### Then our problem is: " + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [], + "source": [ + "prob = Problem('At(Home) & Have(Cash) & Have(Car)', 'At(SFO) & Have(Cash)', [go_SFO, taxi_SFO, drive_SFOLongTermParking,shuttle_SFO])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "An agent gives us some approximate information about the plan we will follow:
    \n", + "(initialPlan is an Angelic Node, where: \n", + "- state is the initial state of the problem, \n", + "- parent is None \n", + "- action: is a list of actions (Angelic HLA's) with the optimistic estimators of effects and \n", + "- action_pes: is a list of actions (Angelic HLA's) with the pessimistic approximations of the effects\n", + "##### InitialPlan" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": {}, + "outputs": [], + "source": [ + "angelic_opt_description = Angelic_HLA('Go(Home, SFO)', precond = 'At(Home)', effect ='$+At(SFO) & $-At(Home)' ) \n", + "angelic_pes_description = Angelic_HLA('Go(Home, SFO)', precond = 'At(Home)', effect ='$+At(SFO) & ~At(Home)' )\n", + "\n", + "initialPlan = [Angelic_Node(prob.init, None, [angelic_opt_description], [angelic_pes_description])] \n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We want to find the optimistic and pessimistic reachable set of initialPlan when applied to the problem:\n", + "##### Optimistic/Pessimistic reachable set" + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[[At(Home), Have(Cash), Have(Car)], [Have(Cash), Have(Car), At(SFO), NotAt(Home)], [Have(Cash), Have(Car), NotAt(Home)], [At(Home), Have(Cash), Have(Car), At(SFO)], [At(Home), Have(Cash), Have(Car)]] \n", + "\n", + "[[At(Home), Have(Cash), Have(Car)], [Have(Cash), Have(Car), At(SFO), NotAt(Home)], [Have(Cash), Have(Car), NotAt(Home)]]\n" + ] + } + ], + "source": [ + "opt_reachable_set = Problem.reach_opt(prob.init, initialPlan[0])\n", + "pes_reachable_set = Problem.reach_pes(prob.init, initialPlan[0])\n", + "print([x for y in opt_reachable_set.keys() for x in opt_reachable_set[y]], '\\n')\n", + "print([x for y in pes_reachable_set.keys() for x in pes_reachable_set[y]])\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "##### Refinements" + ] + 
}, + { + "cell_type": "code", + "execution_count": 9, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[HLA(Drive(Home, SFOLongTermParking)), HLA(Shuttle(SFOLongTermParking, SFO))]\n", + "[{'duration': 0, 'effect': [At(SFOLongTermParking), NotAt(Home)], 'args': (Home, SFOLongTermParking), 'uses': {}, 'consumes': {}, 'name': 'Drive', 'completed': False, 'precond': [At(Home), Have(Car)]}, {'duration': 0, 'effect': [At(SFO), NotAt(LongTermParking)], 'args': (SFOLongTermParking, SFO), 'uses': {}, 'consumes': {}, 'name': 'Shuttle', 'completed': False, 'precond': [At(SFOLongTermParking)]}] \n", + "\n", + "[HLA(Taxi(Home, SFO))]\n", + "[{'duration': 0, 'effect': [At(SFO), NotAt(Home), NotHave(Cash)], 'args': (Home, SFO), 'uses': {}, 'consumes': {}, 'name': 'Taxi', 'completed': False, 'precond': [At(Home)]}] \n", + "\n" + ] + } + ], + "source": [ + "for sequence in Problem.refinements(go_SFO, prob, library):\n", + " print (sequence)\n", + " print([x.__dict__ for x in sequence ], '\\n')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Run the angelic search\n", + "##### Top level call" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[HLA(Drive(Home, SFOLongTermParking)), HLA(Shuttle(SFOLongTermParking, SFO))] \n", + "\n", + "[{'duration': 0, 'effect': [At(SFOLongTermParking), NotAt(Home)], 'args': (Home, SFOLongTermParking), 'uses': {}, 'consumes': {}, 'name': 'Drive', 'completed': False, 'precond': [At(Home), Have(Car)]}, {'duration': 0, 'effect': [At(SFO), NotAt(LongTermParking)], 'args': (SFOLongTermParking, SFO), 'uses': {}, 'consumes': {}, 'name': 'Shuttle', 'completed': False, 'precond': [At(SFOLongTermParking)]}]\n" + ] + } + ], + "source": [ + "plan= Problem.angelic_search(prob, library, initialPlan)\n", + "print (plan, '\\n')\n", + "print ([x.__dict__ for x in plan])" + ] + }, + 
{ + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Example 2" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": {}, + "outputs": [], + "source": [ + "library_2 = {\n", + " 'HLA': ['Go(Home,SFO)', 'Go(Home,SFO)', 'Bus(Home, MetroStop)', 'Metro(MetroStop, SFO)' , 'Metro(MetroStop, SFO)', 'Metro1(MetroStop, SFO)', 'Metro2(MetroStop, SFO)' ,'Taxi(Home, SFO)'],\n", + " 'steps': [['Bus(Home, MetroStop)', 'Metro(MetroStop, SFO)'], ['Taxi(Home, SFO)'], [], ['Metro1(MetroStop, SFO)'], ['Metro2(MetroStop, SFO)'],[],[],[]],\n", + " 'precond': [['At(Home)'], ['At(Home)'], ['At(Home)'], ['At(MetroStop)'], ['At(MetroStop)'],['At(MetroStop)'], ['At(MetroStop)'] ,['At(Home) & Have(Cash)']],\n", + " 'effect': [['At(SFO) & ~At(Home)'], ['At(SFO) & ~At(Home) & ~Have(Cash)'], ['At(MetroStop) & ~At(Home)'], ['At(SFO) & ~At(MetroStop)'], ['At(SFO) & ~At(MetroStop)'], ['At(SFO) & ~At(MetroStop)'] , ['At(SFO) & ~At(MetroStop)'] ,['At(SFO) & ~At(Home) & ~Have(Cash)']] \n", + " }" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[HLA(Bus(Home, MetroStop)), HLA(Metro1(MetroStop, SFO))] \n", + "\n", + "[{'duration': 0, 'effect': [At(MetroStop), NotAt(Home)], 'args': (Home, MetroStop), 'uses': {}, 'consumes': {}, 'name': 'Bus', 'completed': False, 'precond': [At(Home)]}, {'duration': 0, 'effect': [At(SFO), NotAt(MetroStop)], 'args': (MetroStop, SFO), 'uses': {}, 'consumes': {}, 'name': 'Metro1', 'completed': False, 'precond': [At(MetroStop)]}]\n" + ] + } + ], + "source": [ + "plan_2 = Problem.angelic_search(prob, library_2, initialPlan)\n", + "print(plan_2, '\\n')\n", + "print([x.__dict__ for x in plan_2])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Example 3 \n", + "\n", + "Sometimes there is no plan that achieves the goal!" 
+ ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": {}, + "outputs": [], + "source": [ + "library_3 = {\n", + " 'HLA': ['Shuttle(SFOLongTermParking, SFO)', 'Go(Home, SFOLongTermParking)', 'Taxi(Home, SFOLongTermParking)', 'Drive(Home, SFOLongTermParking)', 'Drive(SFOLongTermParking, Home)', 'Get(Cash)', 'Go(Home, ATM)'],\n", + " 'steps': [['Get(Cash)', 'Go(Home, SFOLongTermParking)'], ['Taxi(Home, SFOLongTermParking)'], [], [], [], ['Drive(SFOLongTermParking, Home)', 'Go(Home, ATM)'], []],\n", + " 'precond': [['At(SFOLongTermParking)'], ['At(Home)'], ['At(Home) & Have(Cash)'], ['At(Home)'], ['At(SFOLongTermParking)'], ['At(SFOLongTermParking)'], ['At(Home)']],\n", + " 'effect': [['At(SFO)'], ['At(SFO)'], ['At(SFOLongTermParking) & ~Have(Cash)'], ['At(SFOLongTermParking)'] ,['At(Home) & ~At(SFOLongTermParking)'], ['At(Home) & Have(Cash)'], ['Have(Cash)'] ]\n", + " }\n" + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": {}, + "outputs": [], + "source": [ + "shuttle_SFO = HLA('Shuttle(SFOLongTermParking, SFO)', 'Have(Cash) & At(SFOLongTermParking)', 'At(SFO)')\n", + "prob_3 = Problem('At(SFOLongTermParking) & Have(Cash)', 'At(SFO) & Have(Cash)', [shuttle_SFO])\n", + "# optimistic/pessimistic descriptions\n", + "angelic_opt_description = Angelic_HLA('Shuttle(SFOLongTermParking, SFO)', precond = 'At(SFOLongTermParking)', effect ='$+At(SFO) & $-At(SFOLongTermParking)' ) \n", + "angelic_pes_description = Angelic_HLA('Shuttle(SFOLongTermParking, SFO)', precond = 'At(SFOLongTermParking)', effect ='$+At(SFO) & ~At(SFOLongTermParking)' ) \n", + "# initial Plan\n", + "initialPlan_3 = [Angelic_Node(prob.init, None, [angelic_opt_description], [angelic_pes_description])] " + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "None\n" + ] + } + ], + "source": [ + "plan_3 = prob_3.angelic_search(library_3, initialPlan_3)\n", + 
"print(plan_3)" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/planning_graphPlan.ipynb b/planning_graphPlan.ipynb new file mode 100644 index 000000000..bffecb937 --- /dev/null +++ b/planning_graphPlan.ipynb @@ -0,0 +1,1066 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## SOLVING PLANNING PROBLEMS\n", + "----\n", + "### GRAPHPLAN\n", + "
    \n", + "The GraphPlan algorithm is a popular method of solving classical planning problems.\n", + "Before we get into the details of the algorithm, let's look at a special data structure called a **planning graph**, which is used to give better heuristic estimates and plays a key role in the GraphPlan algorithm." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Planning Graph\n", + "A planning graph is a directed graph organized into levels. \n", + "Each level contains information about the current state of the knowledge base and the possible state-action links to and from that level.\n", + "The first level contains the initial state with nodes representing each fluent that holds in that level.\n", + "This level has state-action links linking each state to valid actions in that state.\n", + "Each action is linked to all its preconditions and its effect states.\n", + "Based on these effects, the next level is constructed.\n", + "The next level contains similarly structured information about the next state.\n", + "In this way, the graph is expanded using state-action links till we reach a state where all the required goals hold true simultaneously.\n", + "We can say that we have reached our goal if none of the goal states in the current level are mutually exclusive.\n", + "This will be explained in detail later.\n", + "
    \n", + "Planning graphs only work for propositional planning problems, hence we need to eliminate all variables by generating all possible substitutions.\n", + "
    \n", + "For example, the planning graph of the `have_cake_and_eat_cake_too` problem might look like this:\n", + "![title](images/cake_graph.jpg)\n", + "
    \n", + "The black lines indicate links between states and actions.\n", + "
    \n", + "In every planning problem, we are allowed to carry out the `no-op` action, i.e., we can choose no action for a particular state.\n", + "These are called 'Persistence' actions and are represented in the graph by the small square boxes.\n", + "In technical terms, a persistence action has the same effects as its preconditions.\n", + "This enables us to carry a state to the next level.\n", + "
    \n", + "
    \n", + "The gray lines indicate mutual exclusivity.\n", + "This means that the actions connected by a gray line cannot be taken together.\n", + "Mutual exclusivity (mutex) occurs in the following cases:\n", + "1. **Inconsistent effects**: One action negates the effect of the other. For example, _Eat(Cake)_ and the persistence of _Have(Cake)_ have inconsistent effects because they disagree on the effect _Have(Cake)_.\n", + "2. **Interference**: One of the effects of an action is the negation of a precondition of the other. For example, _Eat(Cake)_ interferes with the persistence of _Have(Cake)_ by negating its precondition.\n", + "3. **Competing needs**: One of the preconditions of one action is mutually exclusive with a precondition of the other. For example, _Bake(Cake)_ and _Eat(Cake)_ are mutex because they compete on the value of the _Have(Cake)_ precondition." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In the module, planning graphs have been implemented using two classes: `Level`, which stores data for a particular level, and `Graph`, which connects multiple levels together.\n", + "Let's look at the `Level` class." + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": {}, + "outputs": [], + "source": [ + "from planning import *\n", + "from notebook import psource" + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "
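The "inconsistent effects" condition can be sketched independently of the module's `Expr` machinery, using plain strings with a leading `~` for negation (an assumed toy encoding, not the repository's representation):

```python
def inconsistent_effects(effects_a, effects_b):
    """True if one action adds a literal that the other deletes."""
    def adds(eff):
        return {e for e in eff if not e.startswith('~')}
    def dels(eff):
        return {e[1:] for e in eff if e.startswith('~')}
    return bool(adds(effects_a) & dels(effects_b) or adds(effects_b) & dels(effects_a))

eat_cake = ['Eaten(Cake)', '~Have(Cake)']
persist_have = ['Have(Cake)']  # persistence (no-op) action for Have(Cake)
print(inconsistent_effects(eat_cake, persist_have))  # -> True
```

This mirrors the Have(Cake) example above: _Eat(Cake)_ deletes Have(Cake) while its persistence action adds it, so the pair is mutex.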

    \n", + "\n", + "
    class Level:\n",
    +       "    """\n",
    +       "    Contains the state of the planning problem\n",
    +       "    and exhaustive list of actions which use the\n",
    +       "    states as pre-condition.\n",
    +       "    """\n",
    +       "\n",
    +       "    def __init__(self, kb):\n",
    +       "        """Initializes variables to hold state and action details of a level"""\n",
    +       "\n",
    +       "        self.kb = kb\n",
    +       "        # current state\n",
    +       "        self.current_state = kb.clauses\n",
    +       "        # current action to state link\n",
    +       "        self.current_action_links = {}\n",
    +       "        # current state to action link\n",
    +       "        self.current_state_links = {}\n",
    +       "        # current action to next state link\n",
    +       "        self.next_action_links = {}\n",
    +       "        # next state to current action link\n",
    +       "        self.next_state_links = {}\n",
    +       "        # mutually exclusive actions\n",
    +       "        self.mutex = []\n",
    +       "\n",
    +       "    def __call__(self, actions, objects):\n",
    +       "        self.build(actions, objects)\n",
    +       "        self.find_mutex()\n",
    +       "\n",
    +       "    def separate(self, e):\n",
    +       "        """Separates an iterable of elements into positive and negative parts"""\n",
    +       "\n",
    +       "        positive = []\n",
    +       "        negative = []\n",
    +       "        for clause in e:\n",
    +       "            if clause.op[:3] == 'Not':\n",
    +       "                negative.append(clause)\n",
    +       "            else:\n",
    +       "                positive.append(clause)\n",
    +       "        return positive, negative\n",
    +       "\n",
    +       "    def find_mutex(self):\n",
    +       "        """Finds mutually exclusive actions"""\n",
    +       "\n",
    +       "        # Inconsistent effects\n",
    +       "        pos_nsl, neg_nsl = self.separate(self.next_state_links)\n",
    +       "\n",
    +       "        for negeff in neg_nsl:\n",
    +       "            new_negeff = Expr(negeff.op[3:], *negeff.args)\n",
    +       "            for poseff in pos_nsl:\n",
    +       "                if new_negeff == poseff:\n",
    +       "                    for a in self.next_state_links[poseff]:\n",
    +       "                        for b in self.next_state_links[negeff]:\n",
    +       "                            if {a, b} not in self.mutex:\n",
    +       "                                self.mutex.append({a, b})\n",
    +       "\n",
    +       "        # Interference will be calculated with the last step\n",
    +       "        pos_csl, neg_csl = self.separate(self.current_state_links)\n",
    +       "\n",
    +       "        # Competing needs\n",
    +       "        for posprecond in pos_csl:\n",
    +       "            for negprecond in neg_csl:\n",
    +       "                new_negprecond = Expr(negprecond.op[3:], *negprecond.args)\n",
    +       "                if new_negprecond == posprecond:\n",
    +       "                    for a in self.current_state_links[posprecond]:\n",
    +       "                        for b in self.current_state_links[negprecond]:\n",
    +       "                            if {a, b} not in self.mutex:\n",
    +       "                                self.mutex.append({a, b})\n",
    +       "\n",
    +       "        # Inconsistent support\n",
    +       "        state_mutex = []\n",
    +       "        for pair in self.mutex:\n",
    +       "            next_state_0 = self.next_action_links[list(pair)[0]]\n",
    +       "            if len(pair) == 2:\n",
    +       "                next_state_1 = self.next_action_links[list(pair)[1]]\n",
    +       "            else:\n",
    +       "                next_state_1 = self.next_action_links[list(pair)[0]]\n",
    +       "            if (len(next_state_0) == 1) and (len(next_state_1) == 1):\n",
    +       "                state_mutex.append({next_state_0[0], next_state_1[0]})\n",
    +       "        \n",
    +       "        self.mutex = self.mutex + state_mutex\n",
    +       "\n",
    +       "    def build(self, actions, objects):\n",
    +       "        """Populates the lists and dictionaries containing the state action dependencies"""\n",
    +       "\n",
    +       "        for clause in self.current_state:\n",
    +       "            p_expr = Expr('P' + clause.op, *clause.args)\n",
    +       "            self.current_action_links[p_expr] = [clause]\n",
    +       "            self.next_action_links[p_expr] = [clause]\n",
    +       "            self.current_state_links[clause] = [p_expr]\n",
    +       "            self.next_state_links[clause] = [p_expr]\n",
    +       "\n",
    +       "        for a in actions:\n",
    +       "            num_args = len(a.args)\n",
    +       "            possible_args = tuple(itertools.permutations(objects, num_args))\n",
    +       "\n",
    +       "            for arg in possible_args:\n",
    +       "                if a.check_precond(self.kb, arg):\n",
    +       "                    for num, symbol in enumerate(a.args):\n",
    +       "                        if not symbol.op.islower():\n",
    +       "                            arg = list(arg)\n",
    +       "                            arg[num] = symbol\n",
    +       "                            arg = tuple(arg)\n",
    +       "\n",
    +       "                    new_action = a.substitute(Expr(a.name, *a.args), arg)\n",
    +       "                    self.current_action_links[new_action] = []\n",
    +       "\n",
    +       "                    for clause in a.precond:\n",
    +       "                        new_clause = a.substitute(clause, arg)\n",
    +       "                        self.current_action_links[new_action].append(new_clause)\n",
    +       "                        if new_clause in self.current_state_links:\n",
    +       "                            self.current_state_links[new_clause].append(new_action)\n",
    +       "                        else:\n",
    +       "                            self.current_state_links[new_clause] = [new_action]\n",
    +       "                   \n",
    +       "                    self.next_action_links[new_action] = []\n",
    +       "                    for clause in a.effect:\n",
    +       "                        new_clause = a.substitute(clause, arg)\n",
    +       "\n",
    +       "                        self.next_action_links[new_action].append(new_clause)\n",
    +       "                        if new_clause in self.next_state_links:\n",
    +       "                            self.next_state_links[new_clause].append(new_action)\n",
    +       "                        else:\n",
    +       "                            self.next_state_links[new_clause] = [new_action]\n",
    +       "\n",
    +       "    def perform_actions(self):\n",
    +       "        """Performs the necessary actions and returns a new Level"""\n",
    +       "\n",
    +       "        new_kb = FolKB(list(set(self.next_state_links.keys())))\n",
    +       "        return Level(new_kb)\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(Level)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Each level stores the following data\n", + "1. The current state of the level in `current_state`\n", + "2. Links from an action to its preconditions in `current_action_links`\n", + "3. Links from a state to the possible actions in that state in `current_state_links`\n", + "4. Links from each action to its effects in `next_action_links`\n", + "5. Links from each possible next state from each action in `next_state_links`. This stores the same information as the `current_action_links` of the next level.\n", + "6. Mutex links in `mutex`.\n", + "
    \n", + "
    \n", + "The `find_mutex` method finds the mutex links according to the points given above.\n", + "
    \n", + "The `build` method populates the data structures storing the state and action information.\n", + "Persistence actions for each clause in the current state are also defined here. \n", + "The newly created persistence action has the same name as its state, prefixed with a 'P'." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's now look at the `Graph` class." + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    class Graph:\n",
    +       "    """\n",
    +       "    Contains levels of state and actions\n",
    +       "    Used in graph planning algorithm to extract a solution\n",
    +       "    """\n",
    +       "\n",
    +       "    def __init__(self, planningproblem):\n",
    +       "        self.planningproblem = planningproblem\n",
    +       "        self.kb = FolKB(planningproblem.init)\n",
    +       "        self.levels = [Level(self.kb)]\n",
    +       "        self.objects = set(arg for clause in self.kb.clauses for arg in clause.args)\n",
    +       "\n",
    +       "    def __call__(self):\n",
    +       "        self.expand_graph()\n",
    +       "\n",
    +       "    def expand_graph(self):\n",
    +       "        """Expands the graph by a level"""\n",
    +       "\n",
    +       "        last_level = self.levels[-1]\n",
    +       "        last_level(self.planningproblem.actions, self.objects)\n",
    +       "        self.levels.append(last_level.perform_actions())\n",
    +       "\n",
    +       "    def non_mutex_goals(self, goals, index):\n",
+       "        """Checks that the goals are not pairwise mutually exclusive"""\n",
    +       "\n",
    +       "        goal_perm = itertools.combinations(goals, 2)\n",
    +       "        for g in goal_perm:\n",
    +       "            if set(g) in self.levels[index].mutex:\n",
    +       "                return False\n",
    +       "        return True\n",
    +       "
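The pairwise check in `non_mutex_goals` can be illustrated in isolation with plain sets; a small sketch with made-up goal strings and a hypothetical mutex list:

```python
import itertools

# Hypothetical mutex pairs for one level of the planning graph
mutex = [{'At(C1, JFK)', 'In(C1, P1)'}]

def non_mutex(goals):
    # mirrors Graph.non_mutex_goals: every pair of goals must avoid the mutex set
    return all(set(pair) not in mutex
               for pair in itertools.combinations(goals, 2))
```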
\n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(Graph)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The class stores a problem definition in `planningproblem`, \n", + "a knowledge base in `kb`, \n", + "a list of `Level` objects in `levels` and \n", + "all the possible arguments found in the initial state of the problem in `objects`.\n", + "
    \n", + "The `expand_graph` method generates a new level of the graph.\n", + "This method is invoked when the goal conditions haven't been met in the current level or the actions that lead to it are mutually exclusive.\n", + "The `non_mutex_goals` method checks whether the goals in the current state are mutually exclusive.\n", + "
    \n", + "
    \n", + "Using these two classes, we can define a planning graph which can either be used to provide reliable heuristics for planning problems or used in the `GraphPlan` algorithm.\n", + "
    \n", + "Let's have a look at the `GraphPlan` class." + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    class GraphPlan:\n",
    +       "    """\n",
+       "    Class for formulating the GraphPlan algorithm\n",
    +       "    Constructs a graph of state and action space\n",
    +       "    Returns solution for the planning problem\n",
    +       "    """\n",
    +       "\n",
    +       "    def __init__(self, planningproblem):\n",
    +       "        self.graph = Graph(planningproblem)\n",
    +       "        self.nogoods = []\n",
    +       "        self.solution = []\n",
    +       "\n",
    +       "    def check_leveloff(self):\n",
    +       "        """Checks if the graph has levelled off"""\n",
    +       "\n",
    +       "        check = (set(self.graph.levels[-1].current_state) == set(self.graph.levels[-2].current_state))\n",
    +       "\n",
    +       "        if check:\n",
    +       "            return True\n",
    +       "\n",
    +       "    def extract_solution(self, goals, index):\n",
    +       "        """Extracts the solution"""\n",
    +       "\n",
    +       "        level = self.graph.levels[index]    \n",
    +       "        if not self.graph.non_mutex_goals(goals, index):\n",
    +       "            self.nogoods.append((level, goals))\n",
    +       "            return\n",
    +       "\n",
    +       "        level = self.graph.levels[index - 1]    \n",
    +       "\n",
    +       "        # Create all combinations of actions that satisfy the goal    \n",
    +       "        actions = []\n",
    +       "        for goal in goals:\n",
    +       "            actions.append(level.next_state_links[goal])    \n",
    +       "\n",
    +       "        all_actions = list(itertools.product(*actions))    \n",
    +       "\n",
    +       "        # Filter out non-mutex actions\n",
    +       "        non_mutex_actions = []    \n",
    +       "        for action_tuple in all_actions:\n",
    +       "            action_pairs = itertools.combinations(list(set(action_tuple)), 2)        \n",
    +       "            non_mutex_actions.append(list(set(action_tuple)))        \n",
    +       "            for pair in action_pairs:            \n",
    +       "                if set(pair) in level.mutex:\n",
    +       "                    non_mutex_actions.pop(-1)\n",
    +       "                    break\n",
    +       "    \n",
    +       "\n",
    +       "        # Recursion\n",
    +       "        for action_list in non_mutex_actions:        \n",
    +       "            if [action_list, index] not in self.solution:\n",
    +       "                self.solution.append([action_list, index])\n",
    +       "\n",
    +       "                new_goals = []\n",
    +       "                for act in set(action_list):                \n",
    +       "                    if act in level.current_action_links:\n",
    +       "                        new_goals = new_goals + level.current_action_links[act]\n",
    +       "\n",
    +       "                if abs(index) + 1 == len(self.graph.levels):\n",
    +       "                    return\n",
    +       "                elif (level, new_goals) in self.nogoods:\n",
    +       "                    return\n",
    +       "                else:\n",
    +       "                    self.extract_solution(new_goals, index - 1)\n",
    +       "\n",
    +       "        # Level-Order multiple solutions\n",
    +       "        solution = []\n",
    +       "        for item in self.solution:\n",
    +       "            if item[1] == -1:\n",
    +       "                solution.append([])\n",
    +       "                solution[-1].append(item[0])\n",
    +       "            else:\n",
    +       "                solution[-1].append(item[0])\n",
    +       "\n",
    +       "        for num, item in enumerate(solution):\n",
    +       "            item.reverse()\n",
    +       "            solution[num] = item\n",
    +       "\n",
    +       "        return solution\n",
    +       "\n",
    +       "    def goal_test(self, kb):\n",
    +       "        return all(kb.ask(q) is not False for q in self.graph.planningproblem.goals)\n",
    +       "\n",
    +       "    def execute(self):\n",
    +       "        """Executes the GraphPlan algorithm for the given problem"""\n",
    +       "\n",
    +       "        while True:\n",
    +       "            self.graph.expand_graph()\n",
    +       "            if (self.goal_test(self.graph.levels[-1].kb) and self.graph.non_mutex_goals(self.graph.planningproblem.goals, -1)):\n",
    +       "                solution = self.extract_solution(self.graph.planningproblem.goals, -1)\n",
    +       "                if solution:\n",
    +       "                    return solution\n",
    +       "            \n",
    +       "            if len(self.graph.levels) >= 2 and self.check_leveloff():\n",
    +       "                return None\n",
    +       "
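The termination condition in `check_leveloff` simply compares the state sets of the last two levels; a standalone sketch over plain string states:

```python
def levelled_off(levels):
    # mirrors GraphPlan.check_leveloff: the graph has levelled off when the
    # last two levels contain exactly the same states
    return len(levels) >= 2 and set(levels[-1]) == set(levels[-2])
```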
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(GraphPlan)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Given a planning problem defined as a PlanningProblem, `GraphPlan` creates a planning graph stored in `graph` and expands it till it reaches a state where all its required goals are present simultaneously without mutual exclusivity.\n", + "
\n", + "Once a goal is found, `extract_solution` is called.\n", + "This method recursively finds the path to a solution given a planning graph.\n", + "In the case where `extract_solution` fails to find a solution for a set of goals at a given level, we record the `(level, goals)` pair as a **no-good**.\n", + "Whenever `extract_solution` is called again with the same level and goals, we can find the recorded no-good and immediately return failure rather than searching again. \n", + "No-goods are also used in the termination test.\n", + "
\n", + "The `check_leveloff` method checks if the planning graph for the problem has **levelled off**, i.e., it has the same states, actions and mutex pairs as the previous level.\n", + "If the graph has already levelled off and we haven't found a solution, there is no point expanding the graph, as it won't lead to anything new.\n", + "In such a case, we can declare that the planning problem is unsolvable with the given constraints.\n", + "
    \n", + "
    \n", + "To summarize, the `GraphPlan` algorithm calls `expand_graph` and tests whether it has reached the goal and if the goals are non-mutex.\n", + "
    \n", + "If so, `extract_solution` is invoked which recursively reconstructs the solution from the planning graph.\n", + "
    \n", + "If not, then we check if our graph has levelled off and continue if it hasn't." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's solve a few planning problems that we had defined earlier." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Air cargo problem\n", + "In accordance with the summary above, we have defined a helper function to carry out `GraphPlan` on the `air_cargo` problem.\n", + "The function is pretty straightforward.\n", + "Let's have a look." + ] + }, + { + "cell_type": "code", + "execution_count": 18, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def air_cargo_graphplan():\n",
    +       "    """Solves the air cargo problem using GraphPlan"""\n",
    +       "    return GraphPlan(air_cargo()).execute()\n",
    +       "
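The `linearize` helper used below essentially flattens the level-ordered solution and drops the persistence actions. A rough sketch over plain strings (note: the name-prefix test is a simplification for this illustration; the real helper operates on action objects, and a bare prefix check would wrongly drop real actions such as `PutOn`):

```python
# Rough sketch of linearize, assuming actions are plain strings and
# persistence actions are exactly those whose names start with 'P'
# (a simplification: the real helper works on action objects)
def linearize_sketch(solution):
    return [act for level in solution[0] for act in level
            if not act.startswith('P')]
```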
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(air_cargo_graphplan)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's instantiate the problem and find a solution using this helper function." + ] + }, + { + "cell_type": "code", + "execution_count": 19, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "[[[Load(C2, P2, JFK),\n", + " PAirport(SFO),\n", + " PAirport(JFK),\n", + " PPlane(P2),\n", + " PPlane(P1),\n", + " Fly(P2, JFK, SFO),\n", + " PCargo(C2),\n", + " Load(C1, P1, SFO),\n", + " Fly(P1, SFO, JFK),\n", + " PCargo(C1)],\n", + " [Unload(C2, P2, SFO), Unload(C1, P1, JFK)]]]" + ] + }, + "execution_count": 19, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "airCargoG = air_cargo_graphplan()\n", + "airCargoG" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Each element in the solution is a valid action.\n", + "The solution is separated into lists for each level.\n", + "The actions prefixed with a 'P' are persistence actions and can be ignored.\n", + "They simply carry certain states forward.\n", + "We have another helper function `linearize` that presents the solution in a more readable format, much like a total-order planner, but it is _not_ a total-order planner." + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "[Load(C2, P2, JFK),\n", + " Fly(P2, JFK, SFO),\n", + " Load(C1, P1, SFO),\n", + " Fly(P1, SFO, JFK),\n", + " Unload(C2, P2, SFO),\n", + " Unload(C1, P1, JFK)]" + ] + }, + "execution_count": 20, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "linearize(airCargoG)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Indeed, this is a correct solution.\n", + "
    \n", + "There are similar helper functions for some other planning problems.\n", + "
\n", + "Let's try solving the spare tire problem." + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "[Remove(Spare, Trunk), Remove(Flat, Axle), PutOn(Spare, Axle)]" + ] + }, + "execution_count": 21, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "spareTireG = spare_tire_graphplan()\n", + "linearize(spareTireG)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Solution for the cake problem" + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "[Eat(Cake), Bake(Cake)]" + ] + }, + "execution_count": 22, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "cakeProblemG = have_cake_and_eat_cake_too_graphplan()\n", + "linearize(cakeProblemG)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Solution for Sussman's Anomaly configuration of three blocks." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 23, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "[MoveToTable(C, A), Move(B, Table, C), Move(A, Table, B)]" + ] + }, + "execution_count": 23, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "sussmanAnomalyG = three_block_tower_graphplan()\n", + "linearize(sussmanAnomalyG)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Solution for the socks and shoes problem" + ] + }, + { + "cell_type": "code", + "execution_count": 24, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "[RightSock, LeftSock, RightShoe, LeftShoe]" + ] + }, + "execution_count": 24, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "socksShoesG = socks_and_shoes_graphplan()\n", + "linearize(socksShoesG)" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/planning_hierarchical_search.ipynb b/planning_hierarchical_search.ipynb new file mode 100644 index 000000000..18e57b23b --- /dev/null +++ b/planning_hierarchical_search.ipynb @@ -0,0 +1,546 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Hierarchical Search \n", + "\n", + "Hierarchical search is a planning algorithm at a high level of abstraction.
\n", + "Instead of the primitive actions used in classical planning (chapter 10), we now use high-level actions (HLAs) (see planning.ipynb).
    \n", + "\n", + "## Refinements\n", + "\n", + "Each __HLA__ has one or more refinements into a sequence of actions, each of which may be an HLA or a primitive action (which has no refinements by definition).
\n", + "For example:\n", + "- (a) the high-level action \"Go to San Francisco airport\" (Go(Home, SFO)) might have two possible refinements, \"Drive to San Francisco airport\" and \"Taxi to San Francisco airport\". \n", + "
    \n", + "- (b) A recursive refinement for navigation in the vacuum world would be: to get to a\n", + "destination, take a step, and then go to the destination.\n", + "
    \n", + "![title](images/refinement.png)\n", + "
    \n", + "- __implementation__: An HLA refinement that contains only primitive actions is called an implementation of the HLA\n", + "- An implementation of a high-level plan (a sequence of HLAs) is the concatenation of implementations of each HLA in the sequence\n", + "- A high-level plan __achieves the goal__ from a given state if at least one of its implementations achieves the goal from that state\n", + "
\n", + "\n", + "The `refinements` function takes as input: \n", + "- __hla__: the HLA whose refinements we want to compute\n", + "- __state__: the knowledge base of the current problem (Problem.init)\n", + "- __library__: the hierarchy of the actions in the planning problem\n", + "\n"

    \n", + "\n", + "
        def refinements(hla, state, library):  # refinements may be (multiple) HLA themselves ...\n",
    +       "        """\n",
    +       "        state is a Problem, containing the current state kb\n",
    +       "        library is a dictionary containing details for every possible refinement. eg:\n",
    +       "        {\n",
    +       "        'HLA': [\n",
    +       "            'Go(Home, SFO)',\n",
    +       "            'Go(Home, SFO)',\n",
    +       "            'Drive(Home, SFOLongTermParking)',\n",
    +       "            'Shuttle(SFOLongTermParking, SFO)',\n",
    +       "            'Taxi(Home, SFO)'\n",
    +       "            ],\n",
    +       "        'steps': [\n",
    +       "            ['Drive(Home, SFOLongTermParking)', 'Shuttle(SFOLongTermParking, SFO)'],\n",
    +       "            ['Taxi(Home, SFO)'],\n",
    +       "            [],\n",
    +       "            [],\n",
    +       "            []\n",
    +       "            ],\n",
    +       "        # empty refinements indicate a primitive action\n",
    +       "        'precond': [\n",
    +       "            ['At(Home) & Have(Car)'],\n",
    +       "            ['At(Home)'],\n",
    +       "            ['At(Home) & Have(Car)'],\n",
    +       "            ['At(SFOLongTermParking)'],\n",
    +       "            ['At(Home)']\n",
    +       "            ],\n",
    +       "        'effect': [\n",
    +       "            ['At(SFO) & ~At(Home)'],\n",
    +       "            ['At(SFO) & ~At(Home)'],\n",
    +       "            ['At(SFOLongTermParking) & ~At(Home)'],\n",
    +       "            ['At(SFO) & ~At(SFOLongTermParking)'],\n",
    +       "            ['At(SFO) & ~At(Home)']\n",
    +       "            ]\n",
    +       "        }\n",
    +       "        """\n",
    +       "        e = Expr(hla.name, hla.args)\n",
    +       "        indices = [i for i, x in enumerate(library['HLA']) if expr(x).op == hla.name]\n",
    +       "        for i in indices:\n",
    +       "            actions = []\n",
    +       "            for j in range(len(library['steps'][i])):\n",
    +       "                # find the index of the step [j]  of the HLA \n",
    +       "                index_step = [k for k,x in enumerate(library['HLA']) if x == library['steps'][i][j]][0]\n",
    +       "                precond = library['precond'][index_step][0] # preconditions of step [j]\n",
    +       "                effect = library['effect'][index_step][0] # effect of step [j]\n",
    +       "                actions.append(HLA(library['steps'][i][j], precond, effect))\n",
    +       "            yield actions\n",
    +       "
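The index lookup performed by `refinements` can be seen in isolation with a trimmed-down library. In this sketch the comparison is done on the name before the parenthesis, a simplification of the `expr(x).op == hla.name` test in the real code:

```python
# Trimmed-down refinement library: empty step lists mark primitive actions
library = {
    'HLA': ['Go(Home,SFO)', 'Go(Home,SFO)', 'Drive(Home, SFOLongTermParking)',
            'Shuttle(SFOLongTermParking, SFO)', 'Taxi(Home, SFO)'],
    'steps': [['Drive(Home, SFOLongTermParking)', 'Shuttle(SFOLongTermParking, SFO)'],
              ['Taxi(Home, SFO)'], [], [], []],
}

def refinement_steps(name):
    # simplified stand-in for the expr(x).op == hla.name comparison
    return [library['steps'][i]
            for i, hla in enumerate(library['HLA'])
            if hla.split('(')[0] == name]
```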
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(Problem.refinements)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Hierarchical search \n", + "\n", + "Hierarchical search is a breadth-first implementation of hierarchical forward planning search in the space of refinements. (i.e. repeatedly choose an HLA in the current plan and replace it with one of its refinements, until the plan achieves the goal.) \n", + "\n", + "
\n", + "The algorithm's inputs are a problem and a hierarchy:\n", + "- __problem__: an instance of the Problem class \n", + "- __hierarchy__: a dictionary consisting of all the actions and the order in which they are performed. \n", + "
\n", + "\n", + "In the top-level call, `initialPlan` contains `[act]` (i.e. the action to be performed). " + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
        def hierarchical_search(problem, hierarchy):\n",
    +       "        """\n",
    +       "        [Figure 11.5] 'Hierarchical Search, a Breadth First Search implementation of Hierarchical\n",
    +       "        Forward Planning Search'\n",
    +       "        The problem is a real-world problem defined by the problem class, and the hierarchy is\n",
    +       "        a dictionary of HLA - refinements (see refinements generator for details)\n",
    +       "        """\n",
    +       "        act = Node(problem.init, None, [problem.actions[0]])\n",
    +       "        frontier = deque()\n",
    +       "        frontier.append(act)\n",
    +       "        while True:\n",
    +       "            if not frontier:\n",
    +       "                return None\n",
    +       "            plan = frontier.popleft()\n",
    +       "            (hla, index) = Problem.find_hla(plan, hierarchy) # finds the first non primitive hla in plan actions\n",
    +       "            prefix = plan.action[:index]\n",
    +       "            outcome = Problem(Problem.result(problem.init, prefix), problem.goals , problem.actions )\n",
    +       "            suffix = plan.action[index+1:]\n",
    +       "            if not hla: # hla is None and plan is primitive\n",
    +       "                if outcome.goal_test():\n",
    +       "                    return plan.action\n",
    +       "            else:\n",
    +       "                for sequence in Problem.refinements(hla, outcome, hierarchy): # find refinements\n",
    +       "                    frontier.append(Node(outcome.init, plan, prefix + sequence+ suffix))\n",
    +       "
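The control flow above can be condensed into a schematic breadth-first search over plans, with strings standing in for actions (uppercase names are treated as abstract HLAs, lowercase as primitive; the refinement table is hypothetical and the goal test is omitted for brevity):

```python
from collections import deque

# Hypothetical refinement table: each abstract action maps to its refinements
refinements = {'GO': [['DRIVE', 'shuttle'], ['taxi']], 'DRIVE': [['drive']]}

def bfs_refine(plan):
    frontier = deque([plan])
    while frontier:
        plan = frontier.popleft()
        # find the first non-primitive action, as hierarchical_search does
        idx = next((i for i, a in enumerate(plan) if a.isupper()), None)
        if idx is None:
            return plan  # fully primitive plan; the real algorithm tests the goal here
        for seq in refinements[plan[idx]]:
            frontier.append(plan[:idx] + seq + plan[idx + 1:])
```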
\n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(Problem.hierarchical_search)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Example\n", + "\n", + "Suppose that somebody wants to get to the airport. \n", + "There are two possible ways to do so: take a taxi, or drive to the airport.
\n", + "Those two actions have some preconditions and some effects. \n", + "If you take a taxi, you need to have cash, whereas if you drive you need to have a car.
\n", + "Thus we define the following hierarchy of possible actions.\n", + "\n", + "##### hierarchy" + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": {}, + "outputs": [], + "source": [ + "library = {\n", + " 'HLA': ['Go(Home,SFO)', 'Go(Home,SFO)', 'Drive(Home, SFOLongTermParking)', 'Shuttle(SFOLongTermParking, SFO)', 'Taxi(Home, SFO)'],\n", + " 'steps': [['Drive(Home, SFOLongTermParking)', 'Shuttle(SFOLongTermParking, SFO)'], ['Taxi(Home, SFO)'], [], [], []],\n", + " 'precond': [['At(Home) & Have(Car)'], ['At(Home)'], ['At(Home) & Have(Car)'], ['At(SFOLongTermParking)'], ['At(Home)']],\n", + " 'effect': [['At(SFO) & ~At(Home)'], ['At(SFO) & ~At(Home) & ~Have(Cash)'], ['At(SFOLongTermParking) & ~At(Home)'], ['At(SFO) & ~At(LongTermParking)'], ['At(SFO) & ~At(Home) & ~Have(Cash)']] }\n", + "\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "The possible actions are the following:" + ] + }, + { + "cell_type": "code", + "execution_count": 18, + "metadata": {}, + "outputs": [], + "source": [ + "go_SFO = HLA('Go(Home,SFO)', precond='At(Home)', effect='At(SFO) & ~At(Home)')\n", + "taxi_SFO = HLA('Taxi(Home,SFO)', precond='At(Home)', effect='At(SFO) & ~At(Home) & ~Have(Cash)')\n", + "drive_SFOLongTermParking = HLA('Drive(Home, SFOLongTermParking)', 'At(Home) & Have(Car)','At(SFOLongTermParking) & ~At(Home)' )\n", + "shuttle_SFO = HLA('Shuttle(SFOLongTermParking, SFO)', 'At(SFOLongTermParking)', 'At(SFO) & ~At(LongTermParking)')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Suppose that (our preconditions are that) we are at home and we have cash and a car, and our goal is to get to SFO while keeping our cash, and our possible actions are the above.
    \n", + "##### Then our problem is: " + ] + }, + { + "cell_type": "code", + "execution_count": 19, + "metadata": {}, + "outputs": [], + "source": [ + "prob = Problem('At(Home) & Have(Cash) & Have(Car)', 'At(SFO) & Have(Cash)', [go_SFO])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "##### Refinements\n", + "\n", + "The refinements of the action Go(Home, SFO), are defined as:
    \n", + "['Drive(Home,SFOLongTermParking)', 'Shuttle(SFOLongTermParking, SFO)'], ['Taxi(Home, SFO)']" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[HLA(Drive(Home, SFOLongTermParking)), HLA(Shuttle(SFOLongTermParking, SFO))]\n", + "[{'completed': False, 'args': (Home, SFOLongTermParking), 'name': 'Drive', 'uses': {}, 'duration': 0, 'effect': [At(SFOLongTermParking), NotAt(Home)], 'consumes': {}, 'precond': [At(Home), Have(Car)]}, {'completed': False, 'args': (SFOLongTermParking, SFO), 'name': 'Shuttle', 'uses': {}, 'duration': 0, 'effect': [At(SFO), NotAt(LongTermParking)], 'consumes': {}, 'precond': [At(SFOLongTermParking)]}] \n", + "\n", + "[HLA(Taxi(Home, SFO))]\n", + "[{'completed': False, 'args': (Home, SFO), 'name': 'Taxi', 'uses': {}, 'duration': 0, 'effect': [At(SFO), NotAt(Home), NotHave(Cash)], 'consumes': {}, 'precond': [At(Home)]}] \n", + "\n" + ] + } + ], + "source": [ + "for sequence in Problem.refinements(go_SFO, prob, library):\n", + " print (sequence)\n", + " print([x.__dict__ for x in sequence ], '\\n')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Run the hierarchical search\n", + "##### Top level call" + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[HLA(Drive(Home, SFOLongTermParking)), HLA(Shuttle(SFOLongTermParking, SFO))] \n", + "\n", + "[{'completed': False, 'args': (Home, SFOLongTermParking), 'name': 'Drive', 'uses': {}, 'duration': 0, 'effect': [At(SFOLongTermParking), NotAt(Home)], 'consumes': {}, 'precond': [At(Home), Have(Car)]}, {'completed': False, 'args': (SFOLongTermParking, SFO), 'name': 'Shuttle', 'uses': {}, 'duration': 0, 'effect': [At(SFO), NotAt(LongTermParking)], 'consumes': {}, 'precond': [At(SFOLongTermParking)]}]\n" + ] + } + ], + "source": [ + "plan= 
Problem.hierarchical_search(prob, library)\n", + "print (plan, '\\n')\n", + "print ([x.__dict__ for x in plan])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Example 2" + ] + }, + { + "cell_type": "code", + "execution_count": 23, + "metadata": {}, + "outputs": [], + "source": [ + "library_2 = {\n", + " 'HLA': ['Go(Home,SFO)', 'Go(Home,SFO)', 'Bus(Home, MetroStop)', 'Metro(MetroStop, SFO)' , 'Metro(MetroStop, SFO)', 'Metro1(MetroStop, SFO)', 'Metro2(MetroStop, SFO)' ,'Taxi(Home, SFO)'],\n", + " 'steps': [['Bus(Home, MetroStop)', 'Metro(MetroStop, SFO)'], ['Taxi(Home, SFO)'], [], ['Metro1(MetroStop, SFO)'], ['Metro2(MetroStop, SFO)'],[],[],[]],\n", + " 'precond': [['At(Home)'], ['At(Home)'], ['At(Home)'], ['At(MetroStop)'], ['At(MetroStop)'],['At(MetroStop)'], ['At(MetroStop)'] ,['At(Home) & Have(Cash)']],\n", + " 'effect': [['At(SFO) & ~At(Home)'], ['At(SFO) & ~At(Home) & ~Have(Cash)'], ['At(MetroStop) & ~At(Home)'], ['At(SFO) & ~At(MetroStop)'], ['At(SFO) & ~At(MetroStop)'], ['At(SFO) & ~At(MetroStop)'] , ['At(SFO) & ~At(MetroStop)'] ,['At(SFO) & ~At(Home) & ~Have(Cash)']] \n", + " }" + ] + }, + { + "cell_type": "code", + "execution_count": 25, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[HLA(Bus(Home, MetroStop)), HLA(Metro1(MetroStop, SFO))] \n", + "\n", + "[{'completed': False, 'args': (Home, MetroStop), 'name': 'Bus', 'uses': {}, 'duration': 0, 'effect': [At(MetroStop), NotAt(Home)], 'consumes': {}, 'precond': [At(Home)]}, {'completed': False, 'args': (MetroStop, SFO), 'name': 'Metro1', 'uses': {}, 'duration': 0, 'effect': [At(SFO), NotAt(MetroStop)], 'consumes': {}, 'precond': [At(MetroStop)]}]\n" + ] + } + ], + "source": [ + "plan_2 = Problem.hierarchical_search(prob, library_2)\n", + "print(plan_2, '\\n')\n", + "print([x.__dict__ for x in plan_2])" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, 
+ "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/planning_partial_order_planner.ipynb b/planning_partial_order_planner.ipynb new file mode 100644 index 000000000..4b1a98bb3 --- /dev/null +++ b/planning_partial_order_planner.ipynb @@ -0,0 +1,850 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### PARTIAL ORDER PLANNER\n", + "A partial-order planning algorithm is significantly different from a total-order planner.\n", + "The way a partial-order plan works enables it to take advantage of _problem decomposition_ and work on each subproblem separately.\n", + "It works on several subgoals independently, solves them with several subplans, and then combines the plan.\n", + "
    \n", + "A partial-order planner also follows the **least commitment** strategy, where it delays making choices for as long as possible.\n", + "Variables are not bound unless it is absolutely necessary and new actions are chosen only if the existing actions cannot fulfil the required precondition.\n", + "
    \n", + "Any planning algorithm that can place two actions into a plan without specifying which comes first is called a **partial-order planner**.\n", + "A partial-order planner searches through the space of plans rather than the space of states, which makes it perform better for certain problems.\n", + "
    \n", + "
    \n", + "Let's have a look at the `PartialOrderPlanner` class." + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [], + "source": [ + "from planning import *\n", + "from notebook import psource" + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    class PartialOrderPlanner:\n",
    +       "\n",
    +       "    def __init__(self, planningproblem):\n",
    +       "        self.planningproblem = planningproblem\n",
    +       "        self.initialize()\n",
    +       "\n",
    +       "    def initialize(self):\n",
    +       "        """Initialize all variables"""\n",
    +       "        self.causal_links = []\n",
    +       "        self.start = Action('Start', [], self.planningproblem.init)\n",
    +       "        self.finish = Action('Finish', self.planningproblem.goals, [])\n",
    +       "        self.actions = set()\n",
    +       "        self.actions.add(self.start)\n",
    +       "        self.actions.add(self.finish)\n",
    +       "        self.constraints = set()\n",
    +       "        self.constraints.add((self.start, self.finish))\n",
    +       "        self.agenda = set()\n",
    +       "        for precond in self.finish.precond:\n",
    +       "            self.agenda.add((precond, self.finish))\n",
    +       "        self.expanded_actions = self.expand_actions()\n",
    +       "\n",
    +       "    def expand_actions(self, name=None):\n",
    +       "        """Generate all possible actions with variable bindings for precondition selection heuristic"""\n",
    +       "\n",
    +       "        objects = set(arg for clause in self.planningproblem.init for arg in clause.args)\n",
    +       "        expansions = []\n",
    +       "        action_list = []\n",
    +       "        if name is not None:\n",
    +       "            for action in self.planningproblem.actions:\n",
    +       "                if str(action.name) == name:\n",
    +       "                    action_list.append(action)\n",
    +       "        else:\n",
    +       "            action_list = self.planningproblem.actions\n",
    +       "\n",
    +       "        for action in action_list:\n",
    +       "            for permutation in itertools.permutations(objects, len(action.args)):\n",
    +       "                bindings = unify(Expr(action.name, *action.args), Expr(action.name, *permutation))\n",
    +       "                if bindings is not None:\n",
    +       "                    new_args = []\n",
    +       "                    for arg in action.args:\n",
    +       "                        if arg in bindings:\n",
    +       "                            new_args.append(bindings[arg])\n",
    +       "                        else:\n",
    +       "                            new_args.append(arg)\n",
    +       "                    new_expr = Expr(str(action.name), *new_args)\n",
    +       "                    new_preconds = []\n",
    +       "                    for precond in action.precond:\n",
    +       "                        new_precond_args = []\n",
    +       "                        for arg in precond.args:\n",
    +       "                            if arg in bindings:\n",
    +       "                                new_precond_args.append(bindings[arg])\n",
    +       "                            else:\n",
    +       "                                new_precond_args.append(arg)\n",
    +       "                        new_precond = Expr(str(precond.op), *new_precond_args)\n",
    +       "                        new_preconds.append(new_precond)\n",
    +       "                    new_effects = []\n",
    +       "                    for effect in action.effect:\n",
    +       "                        new_effect_args = []\n",
    +       "                        for arg in effect.args:\n",
    +       "                            if arg in bindings:\n",
    +       "                                new_effect_args.append(bindings[arg])\n",
    +       "                            else:\n",
    +       "                                new_effect_args.append(arg)\n",
    +       "                        new_effect = Expr(str(effect.op), *new_effect_args)\n",
    +       "                        new_effects.append(new_effect)\n",
    +       "                    expansions.append(Action(new_expr, new_preconds, new_effects))\n",
    +       "\n",
    +       "        return expansions\n",
    +       "\n",
    +       "    def find_open_precondition(self):\n",
    +       "        """Find open precondition with the least number of possible actions"""\n",
    +       "\n",
    +       "        number_of_ways = dict()\n",
    +       "        actions_for_precondition = dict()\n",
    +       "        for element in self.agenda:\n",
    +       "            open_precondition = element[0]\n",
    +       "            possible_actions = list(self.actions) + self.expanded_actions\n",
    +       "            for action in possible_actions:\n",
    +       "                for effect in action.effect:\n",
    +       "                    if effect == open_precondition:\n",
    +       "                        if open_precondition in number_of_ways:\n",
    +       "                            number_of_ways[open_precondition] += 1\n",
    +       "                            actions_for_precondition[open_precondition].append(action)\n",
    +       "                        else:\n",
    +       "                            number_of_ways[open_precondition] = 1\n",
    +       "                            actions_for_precondition[open_precondition] = [action]\n",
    +       "\n",
    +       "        number = sorted(number_of_ways, key=number_of_ways.__getitem__)\n",
    +       "        \n",
    +       "        for k, v in number_of_ways.items():\n",
    +       "            if v == 0:\n",
    +       "                return None, None, None\n",
    +       "\n",
    +       "        act1 = None\n",
    +       "        for element in self.agenda:\n",
    +       "            if element[0] == number[0]:\n",
    +       "                act1 = element[1]\n",
    +       "                break\n",
    +       "\n",
    +       "        if number[0] in self.expanded_actions:\n",
    +       "            self.expanded_actions.remove(number[0])\n",
    +       "\n",
    +       "        return number[0], act1, actions_for_precondition[number[0]]\n",
    +       "\n",
    +       "    def find_action_for_precondition(self, oprec):\n",
    +       "        """Find action for a given precondition"""\n",
    +       "\n",
    +       "        # either\n",
    +       "        #   choose act0 E Actions such that act0 achieves G\n",
    +       "        for action in self.actions:\n",
    +       "            for effect in action.effect:\n",
    +       "                if effect == oprec:\n",
    +       "                    return action, 0\n",
    +       "\n",
    +       "        # or\n",
    +       "        #   choose act0 E Actions such that act0 achieves G\n",
    +       "        for action in self.planningproblem.actions:\n",
    +       "            for effect in action.effect:\n",
    +       "                if effect.op == oprec.op:\n",
    +       "                    bindings = unify(effect, oprec)\n",
    +       "                    if bindings is None:\n",
    +       "                        break\n",
    +       "                    return action, bindings\n",
    +       "\n",
    +       "    def generate_expr(self, clause, bindings):\n",
    +       "        """Generate atomic expression from generic expression given variable bindings"""\n",
    +       "\n",
    +       "        new_args = []\n",
    +       "        for arg in clause.args:\n",
    +       "            if arg in bindings:\n",
    +       "                new_args.append(bindings[arg])\n",
    +       "            else:\n",
    +       "                new_args.append(arg)\n",
    +       "\n",
    +       "        try:\n",
    +       "            return Expr(str(clause.name), *new_args)\n",
    +       "        except:\n",
    +       "            return Expr(str(clause.op), *new_args)\n",
    +       "        \n",
    +       "    def generate_action_object(self, action, bindings):\n",
+       "        """Generate action object given a generic action and variable bindings"""\n",
    +       "\n",
    +       "        # if bindings is 0, it means the action already exists in self.actions\n",
    +       "        if bindings == 0:\n",
    +       "            return action\n",
    +       "\n",
    +       "        # bindings cannot be None\n",
    +       "        else:\n",
    +       "            new_expr = self.generate_expr(action, bindings)\n",
    +       "            new_preconds = []\n",
    +       "            for precond in action.precond:\n",
    +       "                new_precond = self.generate_expr(precond, bindings)\n",
    +       "                new_preconds.append(new_precond)\n",
    +       "            new_effects = []\n",
    +       "            for effect in action.effect:\n",
    +       "                new_effect = self.generate_expr(effect, bindings)\n",
    +       "                new_effects.append(new_effect)\n",
    +       "            return Action(new_expr, new_preconds, new_effects)\n",
    +       "\n",
    +       "    def cyclic(self, graph):\n",
    +       "        """Check cyclicity of a directed graph"""\n",
    +       "\n",
    +       "        new_graph = dict()\n",
    +       "        for element in graph:\n",
    +       "            if element[0] in new_graph:\n",
    +       "                new_graph[element[0]].append(element[1])\n",
    +       "            else:\n",
    +       "                new_graph[element[0]] = [element[1]]\n",
    +       "\n",
    +       "        path = set()\n",
    +       "\n",
    +       "        def visit(vertex):\n",
    +       "            path.add(vertex)\n",
    +       "            for neighbor in new_graph.get(vertex, ()):\n",
    +       "                if neighbor in path or visit(neighbor):\n",
    +       "                    return True\n",
    +       "            path.remove(vertex)\n",
    +       "            return False\n",
    +       "\n",
    +       "        value = any(visit(v) for v in new_graph)\n",
    +       "        return value\n",
    +       "\n",
    +       "    def add_const(self, constraint, constraints):\n",
    +       "        """Add the constraint to constraints if the resulting graph is acyclic"""\n",
    +       "\n",
    +       "        if constraint[0] == self.finish or constraint[1] == self.start:\n",
    +       "            return constraints\n",
    +       "\n",
    +       "        new_constraints = set(constraints)\n",
    +       "        new_constraints.add(constraint)\n",
    +       "\n",
    +       "        if self.cyclic(new_constraints):\n",
    +       "            return constraints\n",
    +       "        return new_constraints\n",
    +       "\n",
    +       "    def is_a_threat(self, precondition, effect):\n",
    +       "        """Check if effect is a threat to precondition"""\n",
    +       "\n",
    +       "        if (str(effect.op) == 'Not' + str(precondition.op)) or ('Not' + str(effect.op) == str(precondition.op)):\n",
    +       "            if effect.args == precondition.args:\n",
    +       "                return True\n",
    +       "        return False\n",
    +       "\n",
    +       "    def protect(self, causal_link, action, constraints):\n",
    +       "        """Check and resolve threats by promotion or demotion"""\n",
    +       "\n",
    +       "        threat = False\n",
    +       "        for effect in action.effect:\n",
    +       "            if self.is_a_threat(causal_link[1], effect):\n",
    +       "                threat = True\n",
    +       "                break\n",
    +       "\n",
    +       "        if action != causal_link[0] and action != causal_link[2] and threat:\n",
    +       "            # try promotion\n",
    +       "            new_constraints = set(constraints)\n",
    +       "            new_constraints.add((action, causal_link[0]))\n",
    +       "            if not self.cyclic(new_constraints):\n",
    +       "                constraints = self.add_const((action, causal_link[0]), constraints)\n",
    +       "            else:\n",
    +       "                # try demotion\n",
    +       "                new_constraints = set(constraints)\n",
    +       "                new_constraints.add((causal_link[2], action))\n",
    +       "                if not self.cyclic(new_constraints):\n",
    +       "                    constraints = self.add_const((causal_link[2], action), constraints)\n",
    +       "                else:\n",
    +       "                    # both promotion and demotion fail\n",
    +       "                    print('Unable to resolve a threat caused by', action, 'onto', causal_link)\n",
    +       "                    return\n",
    +       "        return constraints\n",
    +       "\n",
    +       "    def convert(self, constraints):\n",
    +       "        """Convert constraints into a dict of Action to set orderings"""\n",
    +       "\n",
    +       "        graph = dict()\n",
    +       "        for constraint in constraints:\n",
    +       "            if constraint[0] in graph:\n",
    +       "                graph[constraint[0]].add(constraint[1])\n",
    +       "            else:\n",
    +       "                graph[constraint[0]] = set()\n",
    +       "                graph[constraint[0]].add(constraint[1])\n",
    +       "        return graph\n",
    +       "\n",
    +       "    def toposort(self, graph):\n",
    +       "        """Generate topological ordering of constraints"""\n",
    +       "\n",
    +       "        if len(graph) == 0:\n",
    +       "            return\n",
    +       "\n",
    +       "        graph = graph.copy()\n",
    +       "\n",
    +       "        for k, v in graph.items():\n",
    +       "            v.discard(k)\n",
    +       "\n",
    +       "        extra_elements_in_dependencies = _reduce(set.union, graph.values()) - set(graph.keys())\n",
    +       "\n",
    +       "        graph.update({element:set() for element in extra_elements_in_dependencies})\n",
    +       "        while True:\n",
    +       "            ordered = set(element for element, dependency in graph.items() if len(dependency) == 0)\n",
    +       "            if not ordered:\n",
    +       "                break\n",
    +       "            yield ordered\n",
    +       "            graph = {element: (dependency - ordered) for element, dependency in graph.items() if element not in ordered}\n",
    +       "        if len(graph) != 0:\n",
    +       "            raise ValueError('The graph is not acyclic and cannot be linearly ordered')\n",
    +       "\n",
    +       "    def display_plan(self):\n",
    +       "        """Display causal links, constraints and the plan"""\n",
    +       "\n",
    +       "        print('Causal Links')\n",
    +       "        for causal_link in self.causal_links:\n",
    +       "            print(causal_link)\n",
    +       "\n",
    +       "        print('\\nConstraints')\n",
    +       "        for constraint in self.constraints:\n",
    +       "            print(constraint[0], '<', constraint[1])\n",
    +       "\n",
    +       "        print('\\nPartial Order Plan')\n",
    +       "        print(list(reversed(list(self.toposort(self.convert(self.constraints))))))\n",
    +       "\n",
    +       "    def execute(self, display=True):\n",
    +       "        """Execute the algorithm"""\n",
    +       "\n",
    +       "        step = 1\n",
    +       "        self.tries = 1\n",
    +       "        while len(self.agenda) > 0:\n",
    +       "            step += 1\n",
    +       "            # select <G, act1> from Agenda\n",
    +       "            try:\n",
    +       "                G, act1, possible_actions = self.find_open_precondition()\n",
    +       "            except IndexError:\n",
    +       "                print('Probably Wrong')\n",
    +       "                break\n",
    +       "\n",
    +       "            act0 = possible_actions[0]\n",
    +       "            # remove <G, act1> from Agenda\n",
    +       "            self.agenda.remove((G, act1))\n",
    +       "\n",
    +       "            # For actions with variable number of arguments, use least commitment principle\n",
    +       "            # act0_temp, bindings = self.find_action_for_precondition(G)\n",
    +       "            # act0 = self.generate_action_object(act0_temp, bindings)\n",
    +       "\n",
    +       "            # Actions = Actions U {act0}\n",
    +       "            self.actions.add(act0)\n",
    +       "\n",
    +       "            # Constraints = add_const(start < act0, Constraints)\n",
    +       "            self.constraints = self.add_const((self.start, act0), self.constraints)\n",
    +       "\n",
    +       "            # for each CL E CausalLinks do\n",
    +       "            #   Constraints = protect(CL, act0, Constraints)\n",
    +       "            for causal_link in self.causal_links:\n",
    +       "                self.constraints = self.protect(causal_link, act0, self.constraints)\n",
    +       "\n",
    +       "            # Agenda = Agenda U {<P, act0>: P is a precondition of act0}\n",
    +       "            for precondition in act0.precond:\n",
    +       "                self.agenda.add((precondition, act0))\n",
    +       "\n",
    +       "            # Constraints = add_const(act0 < act1, Constraints)\n",
    +       "            self.constraints = self.add_const((act0, act1), self.constraints)\n",
    +       "\n",
    +       "            # CausalLinks U {<act0, G, act1>}\n",
    +       "            if (act0, G, act1) not in self.causal_links:\n",
    +       "                self.causal_links.append((act0, G, act1))\n",
    +       "\n",
    +       "            # for each A E Actions do\n",
    +       "            #   Constraints = protect(<act0, G, act1>, A, Constraints)\n",
    +       "            for action in self.actions:\n",
    +       "                self.constraints = self.protect((act0, G, act1), action, self.constraints)\n",
    +       "\n",
    +       "            if step > 200:\n",
    +       "                print('Couldn\\'t find a solution')\n",
    +       "                return None, None\n",
    +       "\n",
    +       "        if display:\n",
    +       "            self.display_plan()\n",
    +       "        else:\n",
    +       "            return self.constraints, self.causal_links                \n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(PartialOrderPlanner)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We will first describe the data-structures and helper methods used, followed by the algorithm used to find a partial-order plan." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Each plan has the following four components:\n", + "\n", + "1. **`actions`**: a set of actions that make up the steps of the plan.\n", + "`actions` is always a subset of `pddl.actions` the set of possible actions for the given planning problem. \n", + "The `start` and `finish` actions are dummy actions defined to bring uniformity to the problem. The `start` action has no preconditions and its effects constitute the initial state of the planning problem. \n", + "The `finish` action has no effects and its preconditions constitute the goal state of the planning problem.\n", + "The empty plan consists of just these two dummy actions.\n", + "2. **`constraints`**: a set of temporal constraints that define the order of performing the actions relative to each other.\n", + "`constraints` does not define a linear ordering, rather it usually represents a directed graph which is also acyclic if the plan is consistent.\n", + "Each ordering is of the form A < B, which reads as \"A before B\" and means that action A _must_ be executed sometime before action B, but not necessarily immediately before.\n", + "`constraints` stores these as a set of tuples `(Action(A), Action(B))` which is interpreted as given above.\n", + "A constraint cannot be added to `constraints` if it breaks the acyclicity of the existing graph.\n", + "3. **`causal_links`**: a set of causal-links. 
\n", + "A causal link between two actions _A_ and _B_ in the plan is written as _A_ --_p_--> _B_ and is read as \"A achieves p for B\".\n", + "This implies that _p_ is an effect of _A_ and a precondition of _B_.\n", + "It also asserts that _p_ must remain true from the time of action _A_ to the time of action _B_.\n", + "Any violation of this rule is called a threat and must be resolved immediately by adding suitable ordering constraints.\n", + "`causal_links` stores this information as tuples `(Action(A), precondition(p), Action(B))` which is interpreted as given above.\n", + "Causal-links can also be called **protection-intervals**, because the link _A_ --_p_--> _B_ protects _p_ from being negated over the interval from _A_ to _B_.\n", + "4. **`agenda`**: a set of open-preconditions.\n", + "A precondition is open if it is not achieved by some action in the plan.\n", + "Planners will work to reduce the set of open preconditions to the empty set, without introducing a contradiction.\n", + "`agenda` stores this information as tuples `(precondition(p), Action(A))` where p is a precondition of the action A.\n", + "\n", + "A **consistent plan** is a plan in which there are no cycles in the ordering constraints and no conflicts with the causal-links.\n", + "A consistent plan with no open preconditions is a **solution**.\n", + "
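As a minimal sketch (using plain strings in place of the `Action` and `Expr` objects from `planning.py`, so all names here are purely illustrative), the empty plan for a two-goal problem would hold exactly these four components:

```python
# Illustrative sketch of the four plan components; plain strings stand in
# for the Action and Expr objects used by the real planner.
start, finish = 'Start', 'Finish'     # the two dummy actions

actions = {start, finish}             # the empty plan contains only these
constraints = {(start, finish)}       # (A, B) means "A before B"
causal_links = []                     # no causal links yet

# Every goal precondition starts out open, owed to the Finish action.
goal_preconds = ['LeftShoeOn', 'RightShoeOn']
agenda = {(p, finish) for p in goal_preconds}

print(sorted(agenda))  # [('LeftShoeOn', 'Finish'), ('RightShoeOn', 'Finish')]
```

The planner's job is then to grow `actions`, `constraints` and `causal_links` until `agenda` is empty.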
    \n", + "
    \n", + "Let's briefly glance over the helper functions before going into the actual algorithm.\n", + "
\n", + "**`expand_actions`**: generates all possible actions with variable bindings, for use by the open-precondition selection heuristic.\n", + "
    \n", + "**`find_open_precondition`**: finds a precondition from the agenda with the least number of actions that fulfil that precondition.\n", + "This heuristic helps form mandatory ordering constraints and causal-links to further simplify the problem and reduce the probability of encountering a threat.\n", + "
    \n", + "**`find_action_for_precondition`**: finds an action that fulfils the given precondition along with the absolutely necessary variable bindings in accordance with the principle of _least commitment_.\n", + "In case of multiple possible actions, the action with the least number of effects is chosen to minimize the chances of encountering a threat.\n", + "
    \n", + "**`cyclic`**: checks if a directed graph is cyclic.\n", + "
    \n", + "**`add_const`**: adds `constraint` to `constraints` if the newly formed graph is acyclic and returns `constraints` otherwise.\n", + "
    \n", + "**`is_a_threat`**: checks if the given `effect` negates the given `precondition`.\n", + "
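The negation test behind `is_a_threat` can be mimicked with plain tuples (a toy sketch; the real method compares the `op` and `args` of `Expr` objects, where negation is spelled with a `Not` prefix):

```python
def is_a_threat(precondition, effect):
    """True if effect is the negation of precondition (toy tuple version).

    Each literal is a (op, args) pair; negation is a 'Not' prefix on op.
    """
    op_p, args_p = precondition
    op_e, args_e = effect
    negates = (op_e == 'Not' + op_p) or ('Not' + op_e == op_p)
    return negates and args_p == args_e

print(is_a_threat(('At', ('Spare', 'Axle')), ('NotAt', ('Spare', 'Axle'))))  # True
print(is_a_threat(('At', ('Spare', 'Axle')), ('NotAt', ('Flat', 'Axle'))))   # False
```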
\n", + "**`protect`**: checks if the given `action` poses a threat to the given `causal_link`.\n", + "If so, the threat is resolved by either promotion or demotion, whichever generates acyclic temporal constraints.\n", + "If neither promotion nor demotion works, the chosen action is not the correct fit or the planning problem cannot be solved at all.\n", + "
    \n", + "**`convert`**: converts a graph from a list of edges to an `Action` : `set` mapping, for use in topological sorting.\n", + "
\n", + "**`toposort`**: a generator function that generates a topological ordering of a given graph as a list of sets.\n", + "Each set contains an action or several actions.\n", + "If a set has more than one action in it, any permutation of those actions also produces a valid plan.\n", + "
    \n", + "**`display_plan`**: displays the `causal_links`, `constraints` and the partial order plan generated from `toposort`.\n", + "
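The layering that `toposort` performs can be sketched on its own; this is a hedged re-implementation (a predecessor-map variant, not the notebook's exact code) applied to a hypothetical socks-and-shoes dependency graph:

```python
from functools import reduce

def layered_toposort(graph):
    """Yield sets of mutually unordered nodes, earliest layer first.

    `graph` maps each node to the set of nodes that must come *before* it.
    """
    graph = {node: set(deps) for node, deps in graph.items()}
    # Ensure every node mentioned only as a dependency is also a key.
    extra = reduce(set.union, graph.values(), set()) - set(graph)
    graph.update({node: set() for node in extra})
    while graph:
        ready = {node for node, deps in graph.items() if not deps}
        if not ready:
            raise ValueError('The graph is not acyclic and cannot be linearly ordered')
        yield ready
        graph = {node: deps - ready
                 for node, deps in graph.items() if node not in ready}

# Hypothetical dependencies: each shoe requires its sock first.
deps = {'LeftShoe': {'LeftSock'}, 'RightShoe': {'RightSock'}}
print(list(layered_toposort(deps)))  # socks in the first layer, shoes in the second
```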
    " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The **`execute`** method executes the algorithm, which is summarized below:\n", + "
\n", + "1. An open precondition is selected (a sub-goal that we want to achieve).\n", + "2. An action that fulfils the open precondition is chosen.\n", + "3. Temporal constraints are updated.\n", + "4. Existing causal links are protected. Protection is a method that checks if the causal links conflict\n", + "   and if they do, temporal constraints are added to fix the threats.\n", + "5. The set of open preconditions is updated.\n", + "6. Temporal constraints of the selected action and the next action are established.\n", + "7. A new causal link is added between the selected action and the owner of the open precondition.\n", + "8. The set of new causal links is checked for threats and if found, the threat is removed by either promotion or demotion.\n", + "   If neither promotion nor demotion is able to solve the problem, the planning problem cannot be solved with the current sequence of actions\n", + "   or it may not be solvable at all.\n", + "9. These steps are repeated until the set of open preconditions is empty." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "A partial-order plan can be used to generate different valid total-order plans.\n", + "This step is called **linearization** of the partial-order plan.\n", + "All possible linearizations of a partial-order plan for `socks_and_shoes` look like this.\n", + "
    \n", + "![title](images/pop.jpg)\n", + "
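These linearizations can also be enumerated programmatically by filtering the permutations of the actions against the ordering constraints (a brute-force sketch with `Start` and `Finish` omitted; fine for tiny plans, though it is factorial in the number of actions):

```python
from itertools import permutations

actions = ['LeftSock', 'RightSock', 'LeftShoe', 'RightShoe']
constraints = [('LeftSock', 'LeftShoe'), ('RightSock', 'RightShoe')]

def linearizations(actions, constraints):
    """Yield every total order of actions consistent with (before, after) pairs."""
    for order in permutations(actions):
        if all(order.index(a) < order.index(b) for a, b in constraints):
            yield order

plans = list(linearizations(actions, constraints))
print(len(plans))  # 6 valid total-order plans
```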
\n", + "Linearization can be carried out in many ways, but the most efficient way is to represent the set of temporal constraints as a directed graph.\n", + "We can easily realize that the graph should also be acyclic, as cycles in the constraints mean that the constraints are inconsistent.\n", + "This acyclicity is enforced by the `add_const` method, which adds a new constraint only if the acyclicity of the existing graph is not violated.\n", + "The `protect` method also checks for acyclicity of the newly-added temporal constraints to make a decision between promotion and demotion in case of a threat.\n", + "This property of a graph created from the temporal constraints of a valid partial-order plan allows us to use topological sort to order the constraints linearly.\n", + "A topological sort may produce several different valid solutions for a given directed acyclic graph." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now that we know how `PartialOrderPlanner` works, let's solve a few problems using it." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Causal Links\n", + "(Action(PutOn(Spare, Axle)), At(Spare, Axle), Action(Finish))\n", + "(Action(Start), Tire(Spare), Action(PutOn(Spare, Axle)))\n", + "(Action(Remove(Flat, Axle)), NotAt(Flat, Axle), Action(PutOn(Spare, Axle)))\n", + "(Action(Start), At(Flat, Axle), Action(Remove(Flat, Axle)))\n", + "(Action(Remove(Spare, Trunk)), At(Spare, Ground), Action(PutOn(Spare, Axle)))\n", + "(Action(Start), At(Spare, Trunk), Action(Remove(Spare, Trunk)))\n", + "(Action(Remove(Flat, Axle)), At(Flat, Ground), Action(Finish))\n", + "\n", + "Constraints\n", + "Action(Remove(Flat, Axle)) < Action(PutOn(Spare, Axle))\n", + "Action(Start) < Action(Finish)\n", + "Action(Remove(Spare, Trunk)) < Action(PutOn(Spare, Axle))\n", + "Action(Start) < Action(Remove(Spare, Trunk))\n", + "Action(Start) < Action(Remove(Flat, Axle))\n", + "Action(Remove(Flat, Axle)) < Action(Finish)\n", + "Action(PutOn(Spare, Axle)) < Action(Finish)\n", + "Action(Start) < Action(PutOn(Spare, Axle))\n", + "\n", + "Partial Order Plan\n", + "[{Action(Start)}, {Action(Remove(Flat, Axle)), Action(Remove(Spare, Trunk))}, {Action(PutOn(Spare, Axle))}, {Action(Finish)}]\n" + ] + } + ], + "source": [ + "st = spare_tire()\n", + "pop = PartialOrderPlanner(st)\n", + "pop.execute()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We observe that in the given partial order plan, Remove(Flat, Axle) and Remove(Spare, Trunk) are in the same set.\n", + "This means that the order of performing these actions does not affect the final outcome.\n", + "That aside, we also see that the PutOn(Spare, Axle) action has to be performed after both the Remove actions are complete, which seems logically consistent." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Causal Links\n", + "(Action(FromTable(C, B)), On(C, B), Action(Finish))\n", + "(Action(FromTable(B, A)), On(B, A), Action(Finish))\n", + "(Action(Start), OnTable(B), Action(FromTable(B, A)))\n", + "(Action(Start), OnTable(C), Action(FromTable(C, B)))\n", + "(Action(Start), Clear(C), Action(FromTable(C, B)))\n", + "(Action(Start), Clear(A), Action(FromTable(B, A)))\n", + "(Action(ToTable(A, B)), Clear(B), Action(FromTable(C, B)))\n", + "(Action(Start), On(A, B), Action(ToTable(A, B)))\n", + "(Action(ToTable(A, B)), Clear(B), Action(FromTable(B, A)))\n", + "(Action(Start), Clear(A), Action(ToTable(A, B)))\n", + "\n", + "Constraints\n", + "Action(Start) < Action(FromTable(C, B))\n", + "Action(FromTable(B, A)) < Action(FromTable(C, B))\n", + "Action(Start) < Action(FromTable(B, A))\n", + "Action(Start) < Action(ToTable(A, B))\n", + "Action(Start) < Action(Finish)\n", + "Action(FromTable(B, A)) < Action(Finish)\n", + "Action(FromTable(C, B)) < Action(Finish)\n", + "Action(ToTable(A, B)) < Action(FromTable(B, A))\n", + "Action(ToTable(A, B)) < Action(FromTable(C, B))\n", + "\n", + "Partial Order Plan\n", + "[{Action(Start)}, {Action(ToTable(A, B))}, {Action(FromTable(B, A))}, {Action(FromTable(C, B))}, {Action(Finish)}]\n" + ] + } + ], + "source": [ + "sbw = simple_blocks_world()\n", + "pop = PartialOrderPlanner(sbw)\n", + "pop.execute()" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "We see that this plan does not have flexibility in selecting actions, ie, actions should be performed in this order and this order only, to successfully reach the goal state." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Causal Links\n", + "(Action(RightShoe), RightShoeOn, Action(Finish))\n", + "(Action(LeftShoe), LeftShoeOn, Action(Finish))\n", + "(Action(LeftSock), LeftSockOn, Action(LeftShoe))\n", + "(Action(RightSock), RightSockOn, Action(RightShoe))\n", + "\n", + "Constraints\n", + "Action(LeftSock) < Action(LeftShoe)\n", + "Action(RightSock) < Action(RightShoe)\n", + "Action(Start) < Action(RightShoe)\n", + "Action(Start) < Action(Finish)\n", + "Action(LeftShoe) < Action(Finish)\n", + "Action(Start) < Action(RightSock)\n", + "Action(Start) < Action(LeftShoe)\n", + "Action(Start) < Action(LeftSock)\n", + "Action(RightShoe) < Action(Finish)\n", + "\n", + "Partial Order Plan\n", + "[{Action(Start)}, {Action(LeftSock), Action(RightSock)}, {Action(LeftShoe), Action(RightShoe)}, {Action(Finish)}]\n" + ] + } + ], + "source": [ + "ss = socks_and_shoes()\n", + "pop = PartialOrderPlanner(ss)\n", + "pop.execute()" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "This plan again doesn't have constraints in selecting socks or shoes.\n", + "As long as both socks are worn before both shoes, we are fine.\n", + "Notice however, there is one valid solution,\n", + "
    \n", + "LeftSock -> LeftShoe -> RightSock -> RightShoe\n", + "
    \n", + "that the algorithm could not find as it cannot be represented as a general partially-ordered plan but is a specific total-order solution." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Runtime differences\n", + "Let's briefly take a look at the running time of all the three algorithms on the `socks_and_shoes` problem." + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": {}, + "outputs": [], + "source": [ + "ss = socks_and_shoes()" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "198 µs ± 3.53 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)\n" + ] + } + ], + "source": [ + "%%timeit\n", + "GraphPlan(ss).execute()" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "844 µs ± 23.8 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)\n" + ] + } + ], + "source": [ + "%%timeit\n", + "Linearize(ss).execute()" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "258 µs ± 4.03 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)\n" + ] + } + ], + "source": [ + "%%timeit\n", + "PartialOrderPlanner(ss).execute(display=False)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We observe that `GraphPlan` is about 4 times faster than `Linearize` because `Linearize` essentially runs a `GraphPlan` subroutine under the hood and then carries out some transformations on the solved planning-graph.\n", + "
    \n", + "We also find that `GraphPlan` is slightly faster than `PartialOrderPlanner`, but this is mainly due to the `expand_actions` method in `PartialOrderPlanner` that slows it down as it generates all possible permutations of actions and variable bindings.\n", + "
    \n", + "Without heuristic functions, `PartialOrderPlanner` will be at least as fast as `GraphPlan`, if not faster, but will have a higher tendency to encounter threats and conflicts, which might take additional time to resolve.\n", +
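Outside the notebook, a comparison like the `%%timeit` cells above can be reproduced with the standard `timeit` module. This is a generic sketch: the `planner` lambda below is a stand-in workload, to be replaced with e.g. `lambda: GraphPlan(ss).execute()` when the `planning` module is importable.

```python
import timeit

def per_call_seconds(fn, repeat=7, number=1000):
    """Best mean per-call time over several repeats, mirroring %%timeit."""
    return min(timeit.repeat(fn, repeat=repeat, number=number)) / number

# stand-in workload; swap in a real planner call such as
# lambda: PartialOrderPlanner(ss).execute(display=False)
planner = lambda: sorted(range(1000))
print('{:.1f} µs per loop'.format(per_call_seconds(planner) * 1e6))
```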
    \n", + "Different planning algorithms work differently for different problems." + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/planning_total_order_planner.ipynb b/planning_total_order_planner.ipynb new file mode 100644 index 000000000..b94941ece --- /dev/null +++ b/planning_total_order_planner.ipynb @@ -0,0 +1,341 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### TOTAL ORDER PLANNER\n", + "\n", + "In mathematical terminology, **total order**, **linear order** or **simple order** refers to a set *X* which is said to be totally ordered under ≤ if the following statements hold for all *a*, *b* and *c* in *X*:\n", + "
    \n", + "If *a* ≤ *b* and *b* ≤ *a*, then *a* = *b* (antisymmetry).\n", + "
    \n", + "If *a* ≤ *b* and *b* ≤ *c*, then *a* ≤ *c* (transitivity).\n", + "
    \n", + "*a* ≤ *b* or *b* ≤ *a* (connex relation).\n", + "\n", + "
    \n", + "In simpler terms, a total-order plan is a linear ordering of actions to be taken to reach the goal state.\n", + "There may be several different total-order plans for a particular goal depending on the problem.\n", +
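The three axioms above are easy to check mechanically on a finite set. A self-contained sketch (not part of this module):

```python
from itertools import product

def is_total_order(X, leq):
    """Check antisymmetry, transitivity and the connex property of leq on X."""
    antisymmetric = all(a == b for a, b in product(X, X) if leq(a, b) and leq(b, a))
    transitive = all(leq(a, c) for a, b, c in product(X, repeat=3)
                     if leq(a, b) and leq(b, c))
    connex = all(leq(a, b) or leq(b, a) for a, b in product(X, X))
    return antisymmetric and transitive and connex

# integers under <= are totally ordered
print(is_total_order({1, 2, 3}, lambda a, b: a <= b))
# the pointwise order on pairs is only partial: (0, 1) and (1, 0) are incomparable
print(is_total_order({(0, 1), (1, 0)}, lambda a, b: a[0] <= b[0] and a[1] <= b[1]))
```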
    \n", + "
    \n", + "In the module, the `Linearize` class solves problems using this paradigm.\n", + "At its core, the `Linearize` uses a solved planning graph from `GraphPlan` and finds a valid total-order solution for it.\n", + "Let's have a look at the class." + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": {}, + "outputs": [], + "source": [ + "from planning import *\n", + "from notebook import psource" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    class Linearize:\n",
    +       "\n",
    +       "    def __init__(self, planningproblem):\n",
    +       "        self.planningproblem = planningproblem\n",
    +       "\n",
    +       "    def filter(self, solution):\n",
    +       "        """Filter out persistence actions from a solution"""\n",
    +       "\n",
    +       "        new_solution = []\n",
    +       "        for section in solution[0]:\n",
    +       "            new_section = []\n",
    +       "            for operation in section:\n",
    +       "                if not (operation.op[0] == 'P' and operation.op[1].isupper()):\n",
    +       "                    new_section.append(operation)\n",
    +       "            new_solution.append(new_section)\n",
    +       "        return new_solution\n",
    +       "\n",
    +       "    def orderlevel(self, level, planningproblem):\n",
    +       "        """Return valid linear order of actions for a given level"""\n",
    +       "\n",
    +       "        for permutation in itertools.permutations(level):\n",
    +       "            temp = copy.deepcopy(planningproblem)\n",
    +       "            count = 0\n",
    +       "            for action in permutation:\n",
    +       "                try:\n",
    +       "                    temp.act(action)\n",
    +       "                    count += 1\n",
    +       "                except:\n",
    +       "                    count = 0\n",
    +       "                    temp = copy.deepcopy(planningproblem)\n",
    +       "                    break\n",
    +       "            if count == len(permutation):\n",
    +       "                return list(permutation), temp\n",
    +       "        return None\n",
    +       "\n",
    +       "    def execute(self):\n",
    +       "        """Finds total-order solution for a planning graph"""\n",
    +       "\n",
    +       "        graphplan_solution = GraphPlan(self.planningproblem).execute()\n",
    +       "        filtered_solution = self.filter(graphplan_solution)\n",
    +       "        ordered_solution = []\n",
    +       "        planningproblem = self.planningproblem\n",
    +       "        for level in filtered_solution:\n",
    +       "            level_solution, planningproblem = self.orderlevel(level, planningproblem)\n",
    +       "            for element in level_solution:\n",
    +       "                ordered_solution.append(element)\n",
    +       "\n",
    +       "        return ordered_solution\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(Linearize)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The `filter` method removes the persistence actions (if any) from the planning graph representation.\n", + "
    \n", + "The `orderlevel` method finds a valid total-ordering of a specified level of the planning-graph, given the state of the graph after the previous level.\n", + "
    \n", + "The `execute` method sequentially calls `orderlevel` for all the levels in the planning-graph and returns the final total-order solution.\n", + "
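A stripped-down sketch of that per-level search, with actions modeled as hypothetical precondition/effect sets rather than this module's `PlanningProblem`, so it runs standalone:

```python
import itertools

def order_level(level, state, preconds, effects):
    """Try permutations of the level's actions until one is executable in
    sequence from `state`; return (ordered_actions, new_state) or None."""
    for perm in itertools.permutations(level):
        s = set(state)
        for action in perm:
            if not preconds[action] <= s:
                break            # a precondition is missing: try the next permutation
            s |= effects[action]
        else:
            return list(perm), s
    return None

# toy socks-and-shoes level: each shoe requires its sock to be on already
preconds = {'LeftShoe': {'LeftSockOn'}, 'RightShoe': {'RightSockOn'}}
effects = {'LeftShoe': {'LeftShoeOn'}, 'RightShoe': {'RightShoeOn'}}
order, state = order_level(['LeftShoe', 'RightShoe'],
                           {'LeftSockOn', 'RightSockOn'}, preconds, effects)
```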
    \n", + "
    \n", + "Let's look at some examples." + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "[Load(C1, P1, SFO),\n", + " Fly(P1, SFO, JFK),\n", + " Load(C2, P2, JFK),\n", + " Fly(P2, JFK, SFO),\n", + " Unload(C2, P2, SFO),\n", + " Unload(C1, P1, JFK)]" + ] + }, + "execution_count": 3, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# total-order solution for air_cargo problem\n", + "Linearize(air_cargo()).execute()" + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "[Remove(Spare, Trunk), Remove(Flat, Axle), PutOn(Spare, Axle)]" + ] + }, + "execution_count": 4, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# total-order solution for spare_tire problem\n", + "Linearize(spare_tire()).execute()" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "[MoveToTable(C, A), Move(B, Table, C), Move(A, Table, B)]" + ] + }, + "execution_count": 5, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# total-order solution for three_block_tower problem\n", + "Linearize(three_block_tower()).execute()" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "[ToTable(A, B), FromTable(B, A), FromTable(C, B)]" + ] + }, + "execution_count": 6, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# total-order solution for simple_blocks_world problem\n", + "Linearize(simple_blocks_world()).execute()" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "[RightSock, LeftSock, RightShoe, LeftShoe]" + ] + }, + "execution_count": 7, + "metadata": {}, + "output_type": "execute_result" + } + ], + 
"source": [ + "# total-order solution for socks_and_shoes problem\n", + "Linearize(socks_and_shoes()).execute()" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/probabilistic_learning.py b/probabilistic_learning.py new file mode 100644 index 000000000..1138e702d --- /dev/null +++ b/probabilistic_learning.py @@ -0,0 +1,154 @@ +"""Learning probabilistic models. (Chapters 20)""" + +import heapq + +from utils import weighted_sampler, product, gaussian + + +class CountingProbDist: + """ + A probability distribution formed by observing and counting examples. + If p is an instance of this class and o is an observed value, then + there are 3 main operations: + p.add(o) increments the count for observation o by 1. + p.sample() returns a random element from the distribution. + p[o] returns the probability for o (as in a regular ProbDist). + """ + + def __init__(self, observations=None, default=0): + """ + Create a distribution, and optionally add in some observations. + By default this is an unsmoothed distribution, but saying default=1, + for example, gives you add-one smoothing. + """ + if observations is None: + observations = [] + self.dictionary = {} + self.n_obs = 0 + self.default = default + self.sampler = None + + for o in observations: + self.add(o) + + def add(self, o): + """Add an observation o to the distribution.""" + self.smooth_for(o) + self.dictionary[o] += 1 + self.n_obs += 1 + self.sampler = None + + def smooth_for(self, o): + """ + Include o among the possible observations, whether or not + it's been observed yet. 
+ """ + if o not in self.dictionary: + self.dictionary[o] = self.default + self.n_obs += self.default + self.sampler = None + + def __getitem__(self, item): + """Return an estimate of the probability of item.""" + self.smooth_for(item) + return self.dictionary[item] / self.n_obs + + # (top() and sample() are not used in this module, but elsewhere.) + + def top(self, n): + """Return (count, obs) tuples for the n most frequent observations.""" + return heapq.nlargest(n, [(v, k) for (k, v) in self.dictionary.items()]) + + def sample(self): + """Return a random sample from the distribution.""" + if self.sampler is None: + self.sampler = weighted_sampler(list(self.dictionary.keys()), list(self.dictionary.values())) + return self.sampler() + + +def NaiveBayesLearner(dataset, continuous=True, simple=False): + if simple: + return NaiveBayesSimple(dataset) + if continuous: + return NaiveBayesContinuous(dataset) + else: + return NaiveBayesDiscrete(dataset) + + +def NaiveBayesSimple(distribution): + """ + A simple naive bayes classifier that takes as input a dictionary of + CountingProbDist objects and classifies items according to these distributions. + The input dictionary is in the following form: + (ClassName, ClassProb): CountingProbDist + """ + target_dist = {c_name: prob for c_name, prob in distribution.keys()} + attr_dists = {c_name: count_prob for (c_name, _), count_prob in distribution.items()} + + def predict(example): + """Predict the target value for example. Calculate probabilities for each + class and pick the max.""" + + def class_probability(target_val): + attr_dist = attr_dists[target_val] + return target_dist[target_val] * product(attr_dist[a] for a in example) + + return max(target_dist.keys(), key=class_probability) + + return predict + + +def NaiveBayesDiscrete(dataset): + """ + Just count how many times each value of each input attribute + occurs, conditional on the target value. Count the different + target values too. 
+ """ + + target_vals = dataset.values[dataset.target] + target_dist = CountingProbDist(target_vals) + attr_dists = {(gv, attr): CountingProbDist(dataset.values[attr]) for gv in target_vals for attr in dataset.inputs} + for example in dataset.examples: + target_val = example[dataset.target] + target_dist.add(target_val) + for attr in dataset.inputs: + attr_dists[target_val, attr].add(example[attr]) + + def predict(example): + """ + Predict the target value for example. Consider each possible value, + and pick the most likely by looking at each attribute independently. + """ + + def class_probability(target_val): + return (target_dist[target_val] * product(attr_dists[target_val, attr][example[attr]] + for attr in dataset.inputs)) + + return max(target_vals, key=class_probability) + + return predict + + +def NaiveBayesContinuous(dataset): + """ + Count how many times each target value occurs. + Also, find the means and deviations of input attribute values for each target value. + """ + means, deviations = dataset.find_means_and_deviations() + + target_vals = dataset.values[dataset.target] + target_dist = CountingProbDist(target_vals) + + def predict(example): + """Predict the target value for example. 
Consider each possible value, + and pick the most likely by looking at each attribute independently.""" + + def class_probability(target_val): + prob = target_dist[target_val] + for attr in dataset.inputs: + prob *= gaussian(means[target_val][attr], deviations[target_val][attr], example[attr]) + return prob + + return max(target_vals, key=class_probability) + + return predict diff --git a/probability.ipynb b/probability.ipynb index 7b1cd3605..fe9643a83 100644 --- a/probability.ipynb +++ b/probability.ipynb @@ -2,55 +2,243 @@ "cells": [ { "cell_type": "markdown", - "metadata": { - "collapsed": false - }, + "metadata": {}, "source": [ "# Probability \n", "\n", - "This IPy notebook acts as supporting material for **Chapter 13 Quantifying Uncertainty**, **Chapter 14 Probabilistic Reasoning** and **Chapter 15 Probabilistic Reasoning over Time** of the book* Artificial Intelligence: A Modern Approach*. This notebook makes use of the implementations in probability.py module. Let us import everything from the probability module. It might be helpful to view the source of some of our implementations. Please refer to the Introductory IPy file for more details on how to do so." + "This IPy notebook acts as supporting material for topics covered in **Chapter 13 Quantifying Uncertainty**, **Chapter 14 Probabilistic Reasoning**, **Chapter 15 Probabilistic Reasoning over Time**, **Chapter 16 Making Simple Decisions** and parts of **Chapter 25 Robotics** of the book* Artificial Intelligence: A Modern Approach*. This notebook makes use of the implementations in probability.py module. Let us import everything from the probability module. It might be helpful to view the source of some of our implementations. Please refer to the Introductory IPy file for more details on how to do so." 
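As an aside, the `CountingProbDist` class added in `probabilistic_learning.py` above supports add-one smoothing via `default=1`. A minimal self-contained sketch of that behavior (`MiniCountingDist` is a toy analogue, not an import of the module):

```python
class MiniCountingDist:
    """Toy analogue of CountingProbDist: unseen values receive `default`
    pseudo-counts the first time they are added or queried."""
    def __init__(self, default=0):
        self.default, self.counts, self.n_obs = default, {}, 0

    def smooth_for(self, o):
        if o not in self.counts:
            self.counts[o] = self.default
            self.n_obs += self.default

    def add(self, o):
        self.smooth_for(o)
        self.counts[o] += 1
        self.n_obs += 1

    def __getitem__(self, o):
        self.smooth_for(o)
        return self.counts[o] / self.n_obs

p = MiniCountingDist(default=1)
for o in ['a', 'a', 'b']:
    p.add(o)
# 'a' has 2 real observations plus 1 pseudo-count out of 5 total,
# and the never-seen 'c' still gets nonzero probability mass
```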
] }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, + "execution_count": 1, + "metadata": {}, "outputs": [], "source": [ - "from probability import *" + "from probability import *\n", + "from utils import print_table\n", + "from notebook import psource, pseudocode, heatmap" ] }, { "cell_type": "markdown", - "metadata": { - "collapsed": true - }, + "metadata": {}, + "source": [ + "## CONTENTS\n", + "- Probability Distribution\n", + " - Joint probability distribution\n", + " - Inference using full joint distributions\n", + "
    \n", + "- Bayesian Networks\n", + " - BayesNode\n", + " - BayesNet\n", + " - Exact Inference in Bayesian Networks\n", + " - Enumeration\n", + " - Variable elimination\n", + " - Approximate Inference in Bayesian Networks\n", + " - Prior sample\n", + " - Rejection sampling\n", + " - Likelihood weighting\n", + " - Gibbs sampling\n", + "
    \n", + "- Hidden Markov Models\n", + " - Inference in Hidden Markov Models\n", + " - Forward-backward\n", + " - Fixed lag smoothing\n", + " - Particle filtering\n", + "
    \n", + "
    \n", + "- Monte Carlo Localization\n", + "- Decision Theoretic Agent\n", + "- Information Gathering Agent" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, "source": [ - "## Probability Distribution\n", + "## PROBABILITY DISTRIBUTION\n", "\n", "Let us begin by specifying discrete probability distributions. The class **ProbDist** defines a discrete probability distribution. We name our random variable and then assign probabilities to the different values of the random variable. Assigning probabilities to the values works similar to that of using a dictionary with keys being the Value and we assign to it the probability. This is possible because of the magic methods **_ _getitem_ _** and **_ _setitem_ _** which store the probabilities in the prob dict of the object. You can keep the source window open alongside while playing with the rest of the code to get a better understanding." ] }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, - "outputs": [], + "execution_count": 2, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    class ProbDist:\n",
    +       "    """A discrete probability distribution. You name the random variable\n",
    +       "    in the constructor, then assign and query probability of values.\n",
    +       "    >>> P = ProbDist('Flip'); P['H'], P['T'] = 0.25, 0.75; P['H']\n",
    +       "    0.25\n",
    +       "    >>> P = ProbDist('X', {'lo': 125, 'med': 375, 'hi': 500})\n",
    +       "    >>> P['lo'], P['med'], P['hi']\n",
    +       "    (0.125, 0.375, 0.5)\n",
    +       "    """\n",
    +       "\n",
    +       "    def __init__(self, varname='?', freqs=None):\n",
    +       "        """If freqs is given, it is a dictionary of values - frequency pairs,\n",
    +       "        then ProbDist is normalized."""\n",
    +       "        self.prob = {}\n",
    +       "        self.varname = varname\n",
    +       "        self.values = []\n",
    +       "        if freqs:\n",
    +       "            for (v, p) in freqs.items():\n",
    +       "                self[v] = p\n",
    +       "            self.normalize()\n",
    +       "\n",
    +       "    def __getitem__(self, val):\n",
    +       "        """Given a value, return P(value)."""\n",
    +       "        try:\n",
    +       "            return self.prob[val]\n",
    +       "        except KeyError:\n",
    +       "            return 0\n",
    +       "\n",
    +       "    def __setitem__(self, val, p):\n",
    +       "        """Set P(val) = p."""\n",
    +       "        if val not in self.values:\n",
    +       "            self.values.append(val)\n",
    +       "        self.prob[val] = p\n",
    +       "\n",
    +       "    def normalize(self):\n",
    +       "        """Make sure the probabilities of all values sum to 1.\n",
    +       "        Returns the normalized distribution.\n",
    +       "        Raises a ZeroDivisionError if the sum of the values is 0."""\n",
    +       "        total = sum(self.prob.values())\n",
    +       "        if not isclose(total, 1.0):\n",
    +       "            for val in self.prob:\n",
    +       "                self.prob[val] /= total\n",
    +       "        return self\n",
    +       "\n",
    +       "    def show_approx(self, numfmt='{:.3g}'):\n",
    +       "        """Show the probabilities rounded and sorted by key, for the\n",
    +       "        sake of portable doctests."""\n",
    +       "        return ', '.join([('{}: ' + numfmt).format(v, p)\n",
    +       "                          for (v, p) in sorted(self.prob.items())])\n",
    +       "\n",
    +       "    def __repr__(self):\n",
    +       "        return "P({})".format(self.varname)\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "%psource ProbDist" + "psource(ProbDist)" ] }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 3, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "0.75" + ] + }, + "execution_count": 3, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "p = ProbDist('Flip')\n", "p['H'], p['T'] = 0.25, 0.75\n", @@ -61,28 +249,46 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The first parameter of the constructor **varname** has a default value of '?'. So if the name is not passed it defaults to ?. The keyword argument **freqs** can be a dictionary of values of random variable:probability. These are then normalized such that the probability values sum upto 1 using the **normalize** method." + "The first parameter of the constructor **varname** has a default value of '?'. So if the name is not passed it defaults to ?. The keyword argument **freqs** can be a dictionary of values of random variable: probability. These are then normalized such that the probability values sum upto 1 using the **normalize** method." 
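That normalization step amounts to dividing every frequency by their total. As a standalone sketch mirroring what `ProbDist.normalize` does (not calling it):

```python
def normalize(freqs):
    """Scale a {value: frequency} dict so the entries sum to 1.
    Like ProbDist.normalize, this raises ZeroDivisionError on all-zero input."""
    total = sum(freqs.values())
    return {v: f / total for v, f in freqs.items()}

probs = normalize({'low': 125, 'medium': 375, 'high': 500})
# probs == {'low': 0.125, 'medium': 0.375, 'high': 0.5}
```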
] }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 4, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "'?'" + ] + }, + "execution_count": 4, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "p = ProbDist(freqs={'low': 125, 'medium': 375, 'high': 500})\n", - "p.varname\n" + "p.varname" ] }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 5, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "(0.125, 0.375, 0.5)" + ] + }, + "execution_count": 5, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "(p['low'], p['medium'], p['high'])" ] @@ -96,11 +302,20 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 6, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "['low', 'medium', 'high']" + ] + }, + "execution_count": 6, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "p.values" ] @@ -109,16 +324,25 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The distribution by default is not normalized if values are added incremently. We can still force normalization by invoking the **normalize** method." + "The distribution by default is not normalized if values are added incrementally. We can still force normalization by invoking the **normalize** method." 
] }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 7, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "(50, 114, 64)" + ] + }, + "execution_count": 7, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "p = ProbDist('Y')\n", "p['Cat'] = 50\n", @@ -129,11 +353,20 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 8, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "(0.21929824561403508, 0.5, 0.2807017543859649)" + ] + }, + "execution_count": 8, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "p.normalize()\n", "(p['Cat'], p['Dog'], p['Mice'])" @@ -148,11 +381,20 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 9, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "'Cat: 0.219, Dog: 0.5, Mice: 0.281'" + ] + }, + "execution_count": 9, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "p.show_approx()" ] @@ -171,35 +413,175 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 10, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "(8, 10)" + ] + }, + "execution_count": 10, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "event = {'A': 10, 'B': 9, 'C': 8}\n", "variables = ['C', 'A']\n", - "event_values (event, variables)" + "event_values(event, variables)" ] }, { "cell_type": "markdown", - "metadata": { - "collapsed": true - }, + "metadata": {}, "source": [ "_A probability model is completely determined by the joint distribution for all of the random variables._ (**Section 13.3**) The probability module implements these as the class **JointProbDist** which inherits 
from the **ProbDist** class. This class specifies a discrete probability distribute over a set of variables. " ] }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, - "outputs": [], + "execution_count": 11, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    class JointProbDist(ProbDist):\n",
    +       "    """A discrete probability distribute over a set of variables.\n",
    +       "    >>> P = JointProbDist(['X', 'Y']); P[1, 1] = 0.25\n",
    +       "    >>> P[1, 1]\n",
    +       "    0.25\n",
    +       "    >>> P[dict(X=0, Y=1)] = 0.5\n",
    +       "    >>> P[dict(X=0, Y=1)]\n",
    +       "    0.5"""\n",
    +       "\n",
    +       "    def __init__(self, variables):\n",
    +       "        self.prob = {}\n",
    +       "        self.variables = variables\n",
    +       "        self.vals = defaultdict(list)\n",
    +       "\n",
    +       "    def __getitem__(self, values):\n",
    +       "        """Given a tuple or dict of values, return P(values)."""\n",
    +       "        values = event_values(values, self.variables)\n",
    +       "        return ProbDist.__getitem__(self, values)\n",
    +       "\n",
    +       "    def __setitem__(self, values, p):\n",
    +       "        """Set P(values) = p.  Values can be a tuple or a dict; it must\n",
    +       "        have a value for each of the variables in the joint. Also keep track\n",
    +       "        of the values we have seen so far for each variable."""\n",
    +       "        values = event_values(values, self.variables)\n",
    +       "        self.prob[values] = p\n",
    +       "        for var, val in zip(self.variables, values):\n",
    +       "            if val not in self.vals[var]:\n",
    +       "                self.vals[var].append(val)\n",
    +       "\n",
    +       "    def values(self, var):\n",
    +       "        """Return the set of possible values for a variable."""\n",
    +       "        return self.vals[var]\n",
    +       "\n",
    +       "    def __repr__(self):\n",
    +       "        return "P({})".format(self.variables)\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "%psource JointProbDist" + "psource(JointProbDist)" ] }, { @@ -213,11 +595,20 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 12, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "P(['X', 'Y'])" + ] + }, + "execution_count": 12, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "variables = ['X', 'Y']\n", "j = JointProbDist(variables)\n", @@ -234,11 +625,20 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 13, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "(0.2, 0.5)" + ] + }, + "execution_count": 13, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "j[1,1] = 0.2\n", "j[dict(X=0, Y=1)] = 0.5\n", @@ -255,11 +655,20 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 14, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "[1, 0]" + ] + }, + "execution_count": 14, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "j.values('X')" ] @@ -274,7 +683,7 @@ "\n", "This is illustrated in **Section 13.3** of the book. The functions **enumerate_joint** and **enumerate_joint_ask** implement this functionality. Under the hood they implement **Equation 13.9** from the book.\n", "\n", - "$$\\textbf{P}(X | \\textbf{e}) = α \\textbf{P}(X, \\textbf{e}) = α \\sum_{y} \\textbf{P}(X, \\textbf{e}, \\textbf{y})$$\n", + "$$\\textbf{P}(X | \\textbf{e}) = \\alpha \\textbf{P}(X, \\textbf{e}) = \\alpha \\sum_{y} \\textbf{P}(X, \\textbf{e}, \\textbf{y})$$\n", "\n", "Here **α** is the normalizing factor. **X** is our query variable and **e** is the evidence. 
According to the equation we enumerate on the remaining variables **y** (not in evidence or query variable) i.e. all possible combinations of **y**\n", "\n", @@ -283,10 +692,8 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, + "execution_count": 15, + "metadata": {}, "outputs": [], "source": [ "full_joint = JointProbDist(['Cavity', 'Toothache', 'Catch'])\n", @@ -309,13 +716,119 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, - "outputs": [], + "execution_count": 16, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "
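The same inference can be carried out by hand on the dentist joint distribution from the book (entries taken from AIMA Figure 13.3); this is a standalone sketch that sums over hidden variables directly, independent of `enumerate_joint`'s API:

```python
variables = ('Cavity', 'Toothache', 'Catch')
joint = {  # P(Cavity, Toothache, Catch), AIMA Figure 13.3
    (True, True, True): 0.108,   (True, True, False): 0.012,
    (True, False, True): 0.072,  (True, False, False): 0.008,
    (False, True, True): 0.016,  (False, True, False): 0.064,
    (False, False, True): 0.144, (False, False, False): 0.576,
}

def prob_of(evidence):
    """P(evidence): sum the joint over every variable not fixed by evidence."""
    return sum(p for event, p in joint.items()
               if all(dict(zip(variables, event))[v] == val
                      for v, val in evidence.items()))

p_toothache = prob_of({'Toothache': True})             # ≈ 0.2
p_both = prob_of({'Cavity': True, 'Toothache': True})  # ≈ 0.12
p_cavity_given_toothache = p_both / p_toothache        # ≈ 0.6
```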

    \n", + "\n", + "
    def enumerate_joint(variables, e, P):\n",
    +       "    """Return the sum of those entries in P consistent with e,\n",
    +       "    provided variables is P's remaining variables (the ones not in e)."""\n",
    +       "    if not variables:\n",
    +       "        return P[e]\n",
    +       "    Y, rest = variables[0], variables[1:]\n",
    +       "    return sum([enumerate_joint(rest, extend(e, Y, y), P)\n",
    +       "                for y in P.values(Y)])\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "%psource enumerate_joint" + "psource(enumerate_joint)" ] }, { @@ -327,11 +840,20 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 17, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "0.19999999999999998" + ] + }, + "execution_count": 17, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "evidence = dict(Toothache=True)\n", "variables = ['Cavity', 'Catch'] # variables not part of evidence\n", @@ -348,11 +870,20 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 18, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "0.12" + ] + }, + "execution_count": 18, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "evidence = dict(Cavity=True, Toothache=True)\n", "variables = ['Catch'] # variables not part of evidence\n", @@ -371,11 +902,20 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 19, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "0.6" + ] + }, + "execution_count": 19, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "ans2/ans1" ] @@ -389,13 +929,125 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, - "outputs": [], + "execution_count": 20, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def enumerate_joint_ask(X, e, P):\n",
    +       "    """Return a probability distribution over the values of the variable X,\n",
    +       "    given the {var:val} observations e, in the JointProbDist P. [Section 13.3]\n",
    +       "    >>> P = JointProbDist(['X', 'Y'])\n",
    +       "    >>> P[0,0] = 0.25; P[0,1] = 0.5; P[1,1] = P[2,1] = 0.125\n",
    +       "    >>> enumerate_joint_ask('X', dict(Y=1), P).show_approx()\n",
    +       "    '0: 0.667, 1: 0.167, 2: 0.167'\n",
    +       "    """\n",
    +       "    assert X not in e, "Query variable must be distinct from evidence"\n",
    +       "    Q = ProbDist(X)  # probability distribution for X, initially empty\n",
    +       "    Y = [v for v in P.variables if v != X and v not in e]  # hidden variables.\n",
    +       "    for xi in P.values(X):\n",
    +       "        Q[xi] = enumerate_joint(Y, extend(e, X, xi), P)\n",
    +       "    return Q.normalize()\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "%psource enumerate_joint_ask" + "psource(enumerate_joint_ask)" ] }, { @@ -407,11 +1059,20 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 21, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "(0.6, 0.39999999999999997)" + ] + }, + "execution_count": 21, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "query_variable = 'Cavity'\n", "evidence = dict(Toothache=True)\n", @@ -430,7 +1091,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Bayesian Networks\n", + "## BAYESIAN NETWORKS\n", "\n", "A Bayesian network is a representation of the joint probability distribution encoding a collection of conditional independence statements.\n", "\n", @@ -441,13 +1102,182 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 22, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "
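The enumeration above can be replayed stand-alone with a plain dict over the full joint. This is a sketch, not the repository's `JointProbDist`/`enumerate_joint` API: `enumerate_joint_sketch` is a hypothetical helper, and the numbers are the standard toothache/cavity/catch table (AIMA Fig. 13.3), which reproduce the 0.2, 0.12 and 0.6 results shown in the cells above.

```python
# Minimal, self-contained sketch of joint enumeration: the full joint is a
# dict mapping (Cavity, Toothache, Catch) value tuples to probabilities.
joint = {
    (True, True, True): 0.108, (True, True, False): 0.012,
    (True, False, True): 0.072, (True, False, False): 0.008,
    (False, True, True): 0.016, (False, True, False): 0.064,
    (False, False, True): 0.144, (False, False, False): 0.576,
}
VARS = ('Cavity', 'Toothache', 'Catch')

def enumerate_joint_sketch(evidence):
    """Sum all joint entries consistent with the {var: value} evidence."""
    return sum(p for vals, p in joint.items()
               if all(vals[VARS.index(var)] == val
                      for var, val in evidence.items()))

ans1 = enumerate_joint_sketch({'Toothache': True})                  # ≈ 0.2
ans2 = enumerate_joint_sketch({'Cavity': True, 'Toothache': True})  # ≈ 0.12
print(round(ans2 / ans1, 3))  # P(Cavity=True | Toothache=True) ≈ 0.6
```

The division in the last line is exactly the normalization step that `enumerate_joint_ask` performs for every value of the query variable at once.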

    \n", + "\n", + "
    class BayesNode:\n",
    +       "    """A conditional probability distribution for a boolean variable,\n",
    +       "    P(X | parents). Part of a BayesNet."""\n",
    +       "\n",
    +       "    def __init__(self, X, parents, cpt):\n",
    +       "        """X is a variable name, and parents a sequence of variable\n",
    +       "        names or a space-separated string.  cpt, the conditional\n",
    +       "        probability table, takes one of these forms:\n",
    +       "\n",
    +       "        * A number, the unconditional probability P(X=true). You can\n",
    +       "          use this form when there are no parents.\n",
    +       "\n",
    +       "        * A dict {v: p, ...}, the conditional probability distribution\n",
    +       "          P(X=true | parent=v) = p. When there's just one parent.\n",
    +       "\n",
    +       "        * A dict {(v1, v2, ...): p, ...}, the distribution P(X=true |\n",
    +       "          parent1=v1, parent2=v2, ...) = p. Each key must have as many\n",
    +       "          values as there are parents. You can use this form always;\n",
    +       "          the first two are just conveniences.\n",
    +       "\n",
    +       "        In all cases the probability of X being false is left implicit,\n",
    +       "        since it follows from P(X=true).\n",
    +       "\n",
    +       "        >>> X = BayesNode('X', '', 0.2)\n",
    +       "        >>> Y = BayesNode('Y', 'P', {T: 0.2, F: 0.7})\n",
    +       "        >>> Z = BayesNode('Z', 'P Q',\n",
    +       "        ...    {(T, T): 0.2, (T, F): 0.3, (F, T): 0.5, (F, F): 0.7})\n",
    +       "        """\n",
    +       "        if isinstance(parents, str):\n",
    +       "            parents = parents.split()\n",
    +       "\n",
    +       "        # We store the table always in the third form above.\n",
    +       "        if isinstance(cpt, (float, int)):  # no parents, 0-tuple\n",
    +       "            cpt = {(): cpt}\n",
    +       "        elif isinstance(cpt, dict):\n",
    +       "            # one parent, 1-tuple\n",
    +       "            if cpt and isinstance(list(cpt.keys())[0], bool):\n",
    +       "                cpt = {(v,): p for v, p in cpt.items()}\n",
    +       "\n",
    +       "        assert isinstance(cpt, dict)\n",
    +       "        for vs, p in cpt.items():\n",
    +       "            assert isinstance(vs, tuple) and len(vs) == len(parents)\n",
    +       "            assert all(isinstance(v, bool) for v in vs)\n",
    +       "            assert 0 <= p <= 1\n",
    +       "\n",
    +       "        self.variable = X\n",
    +       "        self.parents = parents\n",
    +       "        self.cpt = cpt\n",
    +       "        self.children = []\n",
    +       "\n",
    +       "    def p(self, value, event):\n",
    +       "        """Return the conditional probability\n",
    +       "        P(X=value | parents=parent_values), where parent_values\n",
    +       "        are the values of parents in event. (event must assign each\n",
    +       "        parent a value.)\n",
    +       "        >>> bn = BayesNode('X', 'Burglary', {T: 0.2, F: 0.625})\n",
    +       "        >>> bn.p(False, {'Burglary': False, 'Earthquake': True})\n",
    +       "        0.375"""\n",
    +       "        assert isinstance(value, bool)\n",
    +       "        ptrue = self.cpt[event_values(event, self.parents)]\n",
    +       "        return ptrue if value else 1 - ptrue\n",
    +       "\n",
    +       "    def sample(self, event):\n",
    +       "        """Sample from the distribution for this variable conditioned\n",
    +       "        on event's values for parent_variables. That is, return True/False\n",
    +       "        at random according to the conditional probability given the\n",
    +       "        parents."""\n",
    +       "        return probability(self.p(True, event))\n",
    +       "\n",
    +       "    def __repr__(self):\n",
    +       "        return repr((self.variable, ' '.join(self.parents)))\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "%psource BayesNode" + "psource(BayesNode)" ] }, { @@ -465,10 +1295,8 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, + "execution_count": 23, + "metadata": {}, "outputs": [], "source": [ "alarm_node = BayesNode('Alarm', ['Burglary', 'Earthquake'], \n", @@ -484,15 +1312,13 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, + "execution_count": 24, + "metadata": {}, "outputs": [], "source": [ "john_node = BayesNode('JohnCalls', ['Alarm'], {True: 0.90, False: 0.05})\n", "mary_node = BayesNode('MaryCalls', 'Alarm', {(True, ): 0.70, (False, ): 0.01}) # Using string for parents.\n", - "# Equvivalant to john_node definition. " + "# Equivalant to john_node definition." ] }, { @@ -504,10 +1330,8 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, + "execution_count": 25, + "metadata": {}, "outputs": [], "source": [ "burglary_node = BayesNode('Burglary', '', 0.001)\n", @@ -523,11 +1347,20 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 26, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "0.09999999999999998" + ] + }, + "execution_count": 26, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "john_node.p(False, {'Alarm': True, 'Burglary': True}) # P(JohnCalls=False | Alarm=True)" ] @@ -541,13 +1374,148 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, - "outputs": [], + "execution_count": 27, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "
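Stripped of the class machinery, `BayesNode.p` is just a table lookup keyed by the parents' values taken from the event, followed by the `ptrue`/`1 - ptrue` choice. A stand-alone sketch (using the alarm CPT numbers from the burglary example; `p` here is a bare function, not the repository method):

```python
# Sketch of the BayesNode.p lookup with a bare dict CPT (third form above):
# keys are tuples of parent values, values are P(X=true | parents).
alarm_cpt = {(True, True): 0.95, (True, False): 0.94,
             (False, True): 0.29, (False, False): 0.001}
alarm_parents = ('Burglary', 'Earthquake')

def p(value, event):
    """P(Alarm=value | parent values taken from event)."""
    ptrue = alarm_cpt[tuple(event[var] for var in alarm_parents)]
    return ptrue if value else 1 - ptrue

print(p(True, {'Burglary': True, 'Earthquake': False}))   # 0.94
# Extra keys in the event are simply ignored, as in event_values:
print(p(False, {'Burglary': False, 'Earthquake': False, 'JohnCalls': True}))
```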

    \n", + "\n", + "
    class BayesNet:\n",
    +       "    """Bayesian network containing only boolean-variable nodes."""\n",
    +       "\n",
    +       "    def __init__(self, node_specs=None):\n",
    +       "        """Nodes must be ordered with parents before children."""\n",
    +       "        self.nodes = []\n",
    +       "        self.variables = []\n",
    +       "        node_specs = node_specs or []\n",
    +       "        for node_spec in node_specs:\n",
    +       "            self.add(node_spec)\n",
    +       "\n",
    +       "    def add(self, node_spec):\n",
    +       "        """Add a node to the net. Its parents must already be in the\n",
    +       "        net, and its variable must not."""\n",
    +       "        node = BayesNode(*node_spec)\n",
    +       "        assert node.variable not in self.variables\n",
    +       "        assert all((parent in self.variables) for parent in node.parents)\n",
    +       "        self.nodes.append(node)\n",
    +       "        self.variables.append(node.variable)\n",
    +       "        for parent in node.parents:\n",
    +       "            self.variable_node(parent).children.append(node)\n",
    +       "\n",
    +       "    def variable_node(self, var):\n",
    +       "        """Return the node for the variable named var.\n",
    +       "        >>> burglary.variable_node('Burglary').variable\n",
    +       "        'Burglary'"""\n",
    +       "        for n in self.nodes:\n",
    +       "            if n.variable == var:\n",
    +       "                return n\n",
    +       "        raise Exception("No such variable: {}".format(var))\n",
    +       "\n",
    +       "    def variable_values(self, var):\n",
    +       "        """Return the domain of var."""\n",
    +       "        return [True, False]\n",
    +       "\n",
    +       "    def __repr__(self):\n",
    +       "        return 'BayesNet({0!r})'.format(self.nodes)\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "%psource BayesNet" + "psource(BayesNet)" ] }, { @@ -572,11 +1540,20 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 28, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "BayesNet([('Burglary', ''), ('Earthquake', ''), ('Alarm', 'Burglary Earthquake'), ('JohnCalls', 'Alarm'), ('MaryCalls', 'Alarm')])" + ] + }, + "execution_count": 28, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "burglary" ] @@ -590,22 +1567,43 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 29, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "probability.BayesNode" + ] + }, + "execution_count": 29, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "type(burglary.variable_node('Alarm'))" ] }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 30, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{(True, True): 0.95,\n", + " (True, False): 0.94,\n", + " (False, True): 0.29,\n", + " (False, False): 0.001}" + ] + }, + "execution_count": 30, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "burglary.variable_node('Alarm').cpt" ] @@ -627,20 +1625,132 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, - "outputs": [], + "execution_count": 31, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "
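One invariant worth making explicit before moving on to inference: `BayesNet.add` asserts that every parent is already in the net, so a node-spec list must be topologically ordered. A quick stand-alone check of that property (a sketch of the invariant, not repository code; the specs restate the burglary network above):

```python
# Node specs for the burglary network, in the order BayesNet requires
# (every parent appears before its children).
node_specs = [
    ('Burglary', [], 0.001),
    ('Earthquake', [], 0.002),
    ('Alarm', ['Burglary', 'Earthquake'],
     {(True, True): 0.95, (True, False): 0.94,
      (False, True): 0.29, (False, False): 0.001}),
    ('JohnCalls', ['Alarm'], {(True,): 0.90, (False,): 0.05}),
    ('MaryCalls', ['Alarm'], {(True,): 0.70, (False,): 0.01}),
]

def topologically_ordered(specs):
    """True iff each node's parents occur earlier in the list."""
    seen = set()
    for var, parents, _ in specs:
        if not all(parent in seen for parent in parents):
            return False
        seen.add(var)
    return True

print(topologically_ordered(node_specs))  # True
```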

    \n", + "\n", + "
    def enumerate_all(variables, e, bn):\n",
    +       "    """Return the sum of those entries in P(variables | e{others})\n",
    +       "    consistent with e, where P is the joint distribution represented\n",
    +       "    by bn, and e{others} means e restricted to bn's other variables\n",
    +       "    (the ones other than variables). Parents must precede children in variables."""\n",
    +       "    if not variables:\n",
    +       "        return 1.0\n",
    +       "    Y, rest = variables[0], variables[1:]\n",
    +       "    Ynode = bn.variable_node(Y)\n",
    +       "    if Y in e:\n",
    +       "        return Ynode.p(e[Y], e) * enumerate_all(rest, e, bn)\n",
    +       "    else:\n",
    +       "        return sum(Ynode.p(y, e) * enumerate_all(rest, extend(e, Y, y), bn)\n",
    +       "                   for y in bn.variable_values(Y))\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "%psource enumerate_all" + "psource(enumerate_all)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "**enumerate__all** recursively evaluates a general form of the **Equation 14.4** in the book.\n", + "**enumerate_all** recursively evaluates a general form of the **Equation 14.4** in the book.\n", "\n", "$$\\textbf{P}(X | \\textbf{e}) = α \\textbf{P}(X, \\textbf{e}) = α \\sum_{y} \\textbf{P}(X, \\textbf{e}, \\textbf{y})$$ \n", "\n", @@ -651,29 +1761,147 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, - "outputs": [], + "execution_count": 32, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "
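The recursion in **enumerate_all** is easiest to trace on a two-node net. The sketch below re-implements it over a hand-rolled CPT dict (the net and its `CPT`/`ORDER` representation are assumptions for illustration, not the repository's `BayesNet`):

```python
# Hand-rolled two-node net (Cloudy -> Rain) to trace enumerate_all's recursion:
# each variable maps to (parents, dict of P(var=True | parent values)).
CPT = {'Cloudy': ((), {(): 0.5}),
       'Rain': (('Cloudy',), {(True,): 0.8, (False,): 0.2})}
ORDER = ['Cloudy', 'Rain']  # parents before children, as required

def p(var, value, e):
    parents, table = CPT[var]
    ptrue = table[tuple(e[v] for v in parents)]
    return ptrue if value else 1 - ptrue

def enumerate_all_sketch(variables, e):
    if not variables:
        return 1.0
    Y, rest = variables[0], variables[1:]
    if Y in e:
        return p(Y, e[Y], e) * enumerate_all_sketch(rest, e)
    return sum(p(Y, y, e) * enumerate_all_sketch(rest, {**e, Y: y})
               for y in (True, False))

# P(Rain=True) = P(c)P(r|c) + P(~c)P(r|~c) = 0.5*0.8 + 0.5*0.2
print(enumerate_all_sketch(ORDER, {'Rain': True}))  # ≈ 0.5
```

Note how `Cloudy`, being hidden, is summed over both values, while the evidence variable `Rain` contributes a single factor, exactly the two branches of the `if Y in e` test above.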

    \n", + "\n", + "
    def enumeration_ask(X, e, bn):\n",
    +       "    """Return the conditional probability distribution of variable X\n",
    +       "    given evidence e, from BayesNet bn. [Figure 14.9]\n",
    +       "    >>> enumeration_ask('Burglary', dict(JohnCalls=T, MaryCalls=T), burglary\n",
    +       "    ...  ).show_approx()\n",
    +       "    'False: 0.716, True: 0.284'"""\n",
    +       "    assert X not in e, "Query variable must be distinct from evidence"\n",
    +       "    Q = ProbDist(X)\n",
    +       "    for xi in bn.variable_values(X):\n",
    +       "        Q[xi] = enumerate_all(bn.variables, extend(e, X, xi), bn)\n",
    +       "    return Q.normalize()\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "%psource enumeration_ask" + "psource(enumeration_ask)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Let us solve the problem of finding out **P(Burglary=True | JohnCalls=True, MaryCalls=True)** using the **burglary** network.**enumeration_ask** takes three arguments **X** = variable name, **e** = Evidence (in form a dict like previously explained), **bn** = The Bayes Net to do inference on." + "Let us solve the problem of finding out **P(Burglary=True | JohnCalls=True, MaryCalls=True)** using the **burglary** network. **enumeration_ask** takes three arguments **X** = variable name, **e** = Evidence (in form a dict like previously explained), **bn** = The Bayes Net to do inference on." ] }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 33, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "0.2841718353643929" + ] + }, + "execution_count": 33, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "ans_dist = enumeration_ask('Burglary', {'JohnCalls': True, 'MaryCalls': True}, burglary)\n", "ans_dist[True]" @@ -699,13 +1927,120 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, - "outputs": [], + "execution_count": 34, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "
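As a fully self-contained cross-check of the ≈0.284 posterior, the whole computation can be redone in a few lines by restating the burglary CPTs shown earlier (a sketch with its own `CPT` dict representation, not an import of `probability.py`):

```python
# Self-contained enumeration_ask on the burglary network; CPT values
# restated from the cells above.
T, F = True, False
CPT = {
    'Burglary':   ((), {(): 0.001}),
    'Earthquake': ((), {(): 0.002}),
    'Alarm':      (('Burglary', 'Earthquake'),
                   {(T, T): 0.95, (T, F): 0.94, (F, T): 0.29, (F, F): 0.001}),
    'JohnCalls':  (('Alarm',), {(T,): 0.90, (F,): 0.05}),
    'MaryCalls':  (('Alarm',), {(T,): 0.70, (F,): 0.01}),
}
ORDER = ['Burglary', 'Earthquake', 'Alarm', 'JohnCalls', 'MaryCalls']

def p(var, value, e):
    parents, table = CPT[var]
    ptrue = table[tuple(e[v] for v in parents)]
    return ptrue if value else 1 - ptrue

def enumerate_all(variables, e):
    if not variables:
        return 1.0
    Y, rest = variables[0], variables[1:]
    if Y in e:
        return p(Y, e[Y], e) * enumerate_all(rest, e)
    return sum(p(Y, y, e) * enumerate_all(rest, {**e, Y: y}) for y in (T, F))

def enumeration_ask(X, e):
    q = {x: enumerate_all(ORDER, {**e, X: x}) for x in (T, F)}
    norm = sum(q.values())
    return {x: v / norm for x, v in q.items()}

dist = enumeration_ask('Burglary', {'JohnCalls': T, 'MaryCalls': T})
print(round(dist[True], 3), round(dist[False], 3))  # 0.284 0.716
```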

    \n", + "\n", + "
    def make_factor(var, e, bn):\n",
    +       "    """Return the factor for var in bn's joint distribution given e.\n",
    +       "    That is, bn's full joint distribution, projected to accord with e,\n",
    +       "    is the pointwise product of these factors for bn's variables."""\n",
    +       "    node = bn.variable_node(var)\n",
    +       "    variables = [X for X in [var] + node.parents if X not in e]\n",
    +       "    cpt = {event_values(e1, variables): node.p(e1[var], e1)\n",
    +       "           for e1 in all_events(variables, bn, e)}\n",
    +       "    return Factor(variables, cpt)\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "%psource make_factor" + "psource(make_factor)" ] }, { @@ -721,13 +2056,120 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, - "outputs": [], + "execution_count": 35, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def all_events(variables, bn, e):\n",
    +       "    """Yield every way of extending e with values for all variables."""\n",
    +       "    if not variables:\n",
    +       "        yield e\n",
    +       "    else:\n",
    +       "        X, rest = variables[0], variables[1:]\n",
    +       "        for e1 in all_events(rest, bn, e):\n",
    +       "            for x in bn.variable_values(X):\n",
    +       "                yield extend(e1, X, x)\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "%psource all_events" + "psource(all_events)" ] }, { @@ -741,10 +2183,8 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, + "execution_count": 36, + "metadata": {}, "outputs": [], "source": [ "f5 = make_factor('MaryCalls', {'JohnCalls': True, 'MaryCalls': True}, burglary)" @@ -752,33 +2192,60 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 37, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 37, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "f5" ] }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 38, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{(True,): 0.7, (False,): 0.01}" + ] + }, + "execution_count": 38, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "f5.cpt" ] }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 39, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "['Alarm']" + ] + }, + "execution_count": 39, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "f5.variables" ] @@ -792,10 +2259,8 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, + "execution_count": 40, + "metadata": {}, "outputs": [], "source": [ "new_factor = make_factor('MaryCalls', {'Alarm': True}, burglary)" @@ -803,11 +2268,20 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 41, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{(True,): 0.7, 
(False,): 0.30000000000000004}" + ] + }, + "execution_count": 41, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "new_factor.cpt" ] @@ -825,13 +2299,117 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, - "outputs": [], + "execution_count": 42, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
        def pointwise_product(self, other, bn):\n",
    +       "        """Multiply two factors, combining their variables."""\n",
    +       "        variables = list(set(self.variables) | set(other.variables))\n",
    +       "        cpt = {event_values(e, variables): self.p(e) * other.p(e)\n",
    +       "               for e in all_events(variables, bn, {})}\n",
    +       "        return Factor(variables, cpt)\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "%psource Factor.pointwise_product" + "psource(Factor.pointwise_product)" ] }, { @@ -843,13 +2421,113 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, - "outputs": [], + "execution_count": 43, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "
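The pointwise product can be sketched without the `Factor` class by representing a factor as a bare `(variables, cpt)` pair; the factor values below are made up for illustration:

```python
from itertools import product

def factor_value(variables, cpt, e):
    """Look up a factor entry for the event e (extra keys in e are ignored)."""
    return cpt[tuple(e[v] for v in variables)]

def pointwise_product_sketch(f, g):
    """Multiply two (variables, cpt) factors over the union of their variables."""
    fv, fc = f
    gv, gc = g
    variables = sorted(set(fv) | set(gv))
    cpt = {}
    for vals in product((True, False), repeat=len(variables)):
        e = dict(zip(variables, vals))
        cpt[vals] = factor_value(fv, fc, e) * factor_value(gv, gc, e)
    return variables, cpt

f_a = (['A'], {(True,): 0.3, (False,): 0.7})
f_ab = (['A', 'B'], {(True, True): 0.9, (True, False): 0.1,
                     (False, True): 0.4, (False, False): 0.6})
variables, cpt = pointwise_product_sketch(f_a, f_ab)
print(variables)                       # ['A', 'B']
print(round(cpt[(True, True)], 3))     # 0.3 * 0.9 ≈ 0.27
```

Note that the result ranges over the *union* of the two factors' variables, which is why repeated products can blow up in size; that is the cost the elimination ordering tries to control.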

    \n", + "\n", + "
    def pointwise_product(factors, bn):\n",
    +       "    return reduce(lambda f, g: f.pointwise_product(g, bn), factors)\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "%psource pointwise_product" + "psource(pointwise_product)" ] }, { @@ -861,13 +2539,118 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, - "outputs": [], + "execution_count": 44, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
        def sum_out(self, var, bn):\n",
    +       "        """Make a factor eliminating var by summing over its values."""\n",
    +       "        variables = [X for X in self.variables if X != var]\n",
    +       "        cpt = {event_values(e, variables): sum(self.p(extend(e, var, val))\n",
    +       "                                               for val in bn.variable_values(var))\n",
    +       "               for e in all_events(variables, bn, {})}\n",
    +       "        return Factor(variables, cpt)\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "%psource Factor.sum_out" + "psource(Factor.sum_out)" ] }, { @@ -879,13 +2662,118 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, - "outputs": [], + "execution_count": 45, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "
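Summing a variable out can likewise be sketched over a bare `(variables, cpt)` pair (hypothetical helper, made-up numbers): every pair of entries that agree on the remaining variables collapses into one.

```python
def sum_out_sketch(var, f):
    """Eliminate var from a (variables, cpt) factor by summing its values."""
    fv, fc = f
    variables = [v for v in fv if v != var]
    cpt = {}
    for vals, prob in fc.items():
        e = dict(zip(fv, vals))
        key = tuple(e[v] for v in variables)
        cpt[key] = cpt.get(key, 0.0) + prob
    return variables, cpt

f_ab = (['A', 'B'], {(True, True): 0.27, (True, False): 0.03,
                     (False, True): 0.28, (False, False): 0.42})
variables, cpt = sum_out_sketch('A', f_ab)
print(variables)                                      # ['B']
print(round(cpt[(True,)], 2), round(cpt[(False,)], 2))  # ≈ 0.55 0.45
```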

    \n", + "\n", + "
    def sum_out(var, factors, bn):\n",
    +       "    """Eliminate var from all factors by summing over its values."""\n",
    +       "    result, var_factors = [], []\n",
    +       "    for f in factors:\n",
    +       "        (var_factors if var in f.variables else result).append(f)\n",
    +       "    result.append(pointwise_product(var_factors, bn).sum_out(var, bn))\n",
    +       "    return result\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "%psource sum_out" + "psource(sum_out)" ] }, { @@ -910,26 +2798,226 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, - "outputs": [], + "execution_count": 46, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def elimination_ask(X, e, bn):\n",
    +       "    """Compute bn's P(X|e) by variable elimination. [Figure 14.11]\n",
    +       "    >>> elimination_ask('Burglary', dict(JohnCalls=T, MaryCalls=T), burglary\n",
    +       "    ...  ).show_approx()\n",
    +       "    'False: 0.716, True: 0.284'"""\n",
    +       "    assert X not in e, "Query variable must be distinct from evidence"\n",
    +       "    factors = []\n",
    +       "    for var in reversed(bn.variables):\n",
    +       "        factors.append(make_factor(var, e, bn))\n",
    +       "        if is_hidden(var, X, e):\n",
    +       "            factors = sum_out(var, factors, bn)\n",
    +       "    return pointwise_product(factors, bn).normalize()\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "%psource elimination_ask" + "psource(elimination_ask)" ] }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 47, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "'False: 0.716, True: 0.284'" + ] + }, + "execution_count": 47, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "elimination_ask('Burglary', dict(JohnCalls=True, MaryCalls=True), burglary).show_approx()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Elimination Ask Optimizations\n", + "\n", + "`elimination_ask` has some critical point to consider and some optimizations could be performed:\n", + "\n", + "- **Operation on factors**:\n", + "\n", + " `sum_out` and `pointwise_product` function used in `elimination_ask` is where space and time complexity arise in the variable elimination algorithm (AIMA3e pg. 526).\n", + "\n", + ">The only trick is to notice that any factor that does not depend on the variable to be summed out can be moved outside the summation.\n", + "\n", + "- **Variable ordering**:\n", + "\n", + " Elimination ordering is important, every choice of ordering yields a valid algorithm, but different orderings cause different intermediate factors to be generated during the calculation (AIMA3e pg. 527). In this case the algorithm applies a reversed order.\n", + "\n", + "> In general, the time and space requirements of variable elimination are dominated by the size of the largest factor constructed during the operation of the algorithm. This in turn is determined by the order of elimination of variables and by the structure of the network. It turns out to be intractable to determine the optimal ordering, but several good heuristics are available. 
One fairly effective method is a greedy one: eliminate whichever variable minimizes the size of the next factor to be constructed. \n", + "\n", + "- **Variable relevance**\n", + " \n", + " Some variables could be irrelevant to resolve a query (i.e. sums to 1). A variable elimination algorithm can therefore remove all these variables before evaluating the query (AIMA3e pg. 528).\n", + "\n", + "> An optimization is to remove 'every variable that is not an ancestor of a query variable or evidence variable is irrelevant to the query'." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Runtime comparison\n", + "Let's see how the runtimes of these two algorithms compare.\n", + "We expect variable elimination to outperform enumeration by a large margin as we reduce the number of repetitive calculations significantly." + ] + }, + { + "cell_type": "code", + "execution_count": 48, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "105 µs ± 11.9 µs per loop (mean ± std. dev. of 7 runs, 10000 loops each)\n" + ] + } + ], "source": [ + "%%timeit\n", + "enumeration_ask('Burglary', dict(JohnCalls=True, MaryCalls=True), burglary).show_approx()" + ] + }, + { + "cell_type": "code", + "execution_count": 49, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "262 µs ± 54.7 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)\n" + ] + } + ], + "source": [ + "%%timeit\n", "elimination_ask('Burglary', dict(JohnCalls=True, MaryCalls=True), burglary).show_approx()" ] }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In this test case we observe that variable elimination is slower than what we expected. It has something to do with number of threads, how Python tries to optimize things and this happens because the network is very small, with just 5 nodes. 
The `elimination_ask` function has some critical points, and some optimizations can be performed, as seen above.\n",
    \n", + "Of course, for more complicated networks, variable elimination will be significantly faster and runtime will drop not just by a constant factor, but by a polynomial factor proportional to the number of nodes, due to the reduction in repeated calculations." + ] + }, { "cell_type": "markdown", "metadata": {}, @@ -941,13 +3029,117 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 50, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
        def sample(self, event):\n",
    +       "        """Sample from the distribution for this variable conditioned\n",
    +       "        on event's values for parent_variables. That is, return True/False\n",
    +       "        at random according to the conditional probability given the\n",
    +       "        parents."""\n",
    +       "        return probability(self.p(True, event))\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "%psource BayesNode.sample" + "psource(BayesNode.sample)" ] }, { @@ -963,13 +3155,118 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, - "outputs": [], + "execution_count": 51, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def prior_sample(bn):\n",
    +       "    """Randomly sample from bn's full joint distribution. The result\n",
    +       "    is a {variable: value} dict. [Figure 14.13]"""\n",
    +       "    event = {}\n",
    +       "    for node in bn.nodes:\n",
    +       "        event[node.variable] = node.sample(event)\n",
    +       "    return event\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "%psource prior_sample" + "psource(prior_sample)" ] }, { @@ -980,15 +3277,25 @@ "\n", "\n", "\n", - "We store the samples on the observations. Let us find **P(Rain=True)**" + "Traversing the graph in topological order is important.\n", + "There are two possible topological orderings for this particular directed acyclic graph.\n", + "
    \n", + "1. `Cloudy -> Sprinkler -> Rain -> Wet Grass`\n", + "2. `Cloudy -> Rain -> Sprinkler -> Wet Grass`\n", + "
    \n", + "
    \n", + "We can follow either of these two orderings to sample from the network.\n", + "Any ordering other than these two, however, cannot be used.\n", + "
    \n", + "One way to think about this is that `Cloudy` can be seen as a precondition of both `Rain` and `Sprinkler`: just as we have seen in planning, preconditions need to be satisfied before a certain action can be executed.\n", + "
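The whole sampling loop can be sketched end-to-end on the sprinkler network (a self-contained re-statement of `prior_sample` with the CPT numbers of AIMA Fig. 14.12; the seed is fixed only to make the run repeatable):

```python
import random

random.seed(0)  # fixed only for repeatability of this sketch
T, F = True, False
CPT = {
    'Cloudy':    ((), {(): 0.5}),
    'Sprinkler': (('Cloudy',), {(T,): 0.10, (F,): 0.50}),
    'Rain':      (('Cloudy',), {(T,): 0.80, (F,): 0.20}),
    'WetGrass':  (('Sprinkler', 'Rain'),
                  {(T, T): 0.99, (T, F): 0.90, (F, T): 0.90, (F, F): 0.00}),
}
ORDER = ['Cloudy', 'Sprinkler', 'Rain', 'WetGrass']  # topological order

def prior_sample_sketch():
    event = {}
    for var in ORDER:  # parents are sampled before their children
        parents, table = CPT[var]
        ptrue = table[tuple(event[v] for v in parents)]
        event[var] = random.random() < ptrue
    return event

N = 10_000
samples = [prior_sample_sketch() for _ in range(N)]
estimate = sum(s['Rain'] for s in samples) / N
print(round(estimate, 1))  # ≈ 0.5, the exact value of P(Rain=True)
```

The topological `ORDER` is exactly what makes the CPT lookup possible: by the time a variable is sampled, all of its parents already have values in `event`.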
    \n", + "We store the samples on the observations. Let us find **P(Rain=True)** by taking 1000 random samples from the network." ] }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, + "execution_count": 52, + "metadata": {}, "outputs": [], "source": [ "N = 1000\n", @@ -1004,10 +3311,8 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, + "execution_count": 53, + "metadata": {}, "outputs": [], "source": [ "rain_true = [observation for observation in all_observations if observation['Rain'] == True]" @@ -1022,11 +3327,17 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 54, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "0.503\n" + ] + } + ], "source": [ "answer = len(rain_true) / N\n", "print(answer)" @@ -1036,16 +3347,50 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "To evaluate a conditional distribution. We can use a two-step filtering process. We first separate out the variables that are consistent with the evidence. Then for each value of query variable, we can find probabilities. For example to find **P(Cloudy=True | Rain=True)**. We have already filtered out the values consistent with our evidence in **rain_true**. 
Now we apply a second filtering step on **rain_true** to find **P(Rain=True and Cloudy=True)**" + "Sampling this another time might give different results as we have no control over the distribution of the random samples" ] }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 55, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "0.519\n" + ] + } + ], + "source": [ + "N = 1000\n", + "all_observations = [prior_sample(sprinkler) for x in range(N)]\n", + "rain_true = [observation for observation in all_observations if observation['Rain'] == True]\n", + "answer = len(rain_true) / N\n", + "print(answer)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To evaluate a conditional distribution. We can use a two-step filtering process. We first separate out the variables that are consistent with the evidence. Then for each value of query variable, we can find probabilities. For example to find **P(Cloudy=True | Rain=True)**. We have already filtered out the values consistent with our evidence in **rain_true**. Now we apply a second filtering step on **rain_true** to find **P(Rain=True and Cloudy=True)**" + ] + }, + { + "cell_type": "code", + "execution_count": 56, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "0.8265895953757225\n" + ] + } + ], "source": [ "rain_and_cloudy = [observation for observation in rain_true if observation['Cloudy'] == True]\n", "answer = len(rain_and_cloudy) / len(rain_true)\n", @@ -1058,18 +3403,156 @@ "source": [ "### Rejection Sampling\n", "\n", - "Rejection Sampling is based on an idea similar to what we did just now. First, it generates samples from the prior distribution specified by the network. Then, it rejects all those that do not match the evidence. 
The function **rejection_sampling** implements the algorithm described by **Figure 14.14**" + "Rejection Sampling is based on an idea similar to what we did just now. \n", + "First, it generates samples from the prior distribution specified by the network. \n", + "Then, it rejects all those that do not match the evidence. \n", + "
    \n", + "Rejection sampling is advantageous only when we know the query beforehand.\n", + "While prior sampling generally works for any query, it might fail in some scenarios.\n", + "
    \n", + "Let's say we have a generic Bayesian network and we have evidence `e`, and we want to know how many times a state `A` is true, given evidence `e` is true.\n", + "Normally, prior sampling can answer this question, but let's assume that the probability of evidence `e` being true in our actual probability distribution is very small.\n", + "In this situation, it might be possible that sampling never encounters a data-point where `e` is true.\n", + "If our sampled data has no instance of `e` being true, `P(e) = 0`, and therefore `P(A | e) / P(e) = 0/0`, which is undefined.\n", + "We cannot find the required value using this sample.\n", + "
    \n", + "We can definitely increase the number of sample points, but we can never guarantee that we will encounter the case where `e` is non-zero (assuming our actual probability distribution has atleast one case where `e` is true).\n", + "To guarantee this, we would have to consider every single data point, which means we lose the speed advantage that approximation provides us and we essentially have to calculate the exact inference model of the Bayesian network.\n", + "
    \n", + "
    \n", + "Rejection sampling will be useful in this situation, as we already know the query.\n", + "
    \n", + "While sampling from the network, we will reject any sample which is inconsistent with the evidence variables of the given query (in this example, the only evidence variable is `e`).\n", + "We will only consider samples that do not violate **any** of the evidence variables.\n", + "In this way, we will have enough data with the required evidence to infer queries involving a subset of that evidence.\n", + "
    \n", + "
    \n", + "The function **rejection_sampling** implements the algorithm described by **Figure 14.14**" ] }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, - "outputs": [], + "execution_count": 57, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def rejection_sampling(X, e, bn, N=10000):\n",
    +       "    """Estimate the probability distribution of variable X given\n",
    +       "    evidence e in BayesNet bn, using N samples.  [Figure 14.14]\n",
    +       "    Raises a ZeroDivisionError if all the N samples are rejected,\n",
    +       "    i.e., inconsistent with e.\n",
    +       "    >>> random.seed(47)\n",
    +       "    >>> rejection_sampling('Burglary', dict(JohnCalls=T, MaryCalls=T),\n",
    +       "    ...   burglary, 10000).show_approx()\n",
    +       "    'False: 0.7, True: 0.3'\n",
    +       "    """\n",
    +       "    counts = {x: 0 for x in bn.variable_values(X)}  # bold N in [Figure 14.14]\n",
    +       "    for j in range(N):\n",
    +       "        sample = prior_sample(bn)  # boldface x in [Figure 14.14]\n",
    +       "        if consistent_with(sample, e):\n",
    +       "            counts[sample[X]] += 1\n",
    +       "    return ProbDist(X, counts)\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "%psource rejection_sampling" + "psource(rejection_sampling)" ] }, { @@ -1083,13 +3566,115 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, - "outputs": [], + "execution_count": 58, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def consistent_with(event, evidence):\n",
    +       "    """Is event consistent with the given evidence?"""\n",
    +       "    return all(evidence.get(k, v) == v\n",
    +       "               for k, v in event.items())\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "%psource consistent_with" + "psource(consistent_with)" ] }, { @@ -1101,11 +3686,20 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 59, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "0.8035019455252919" + ] + }, + "execution_count": 59, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "p = rejection_sampling('Cloudy', dict(Rain=True), sprinkler, 1000)\n", "p[True]" @@ -1117,6 +3711,7 @@ "source": [ "### Likelihood Weighting\n", "\n", + "Rejection sampling takes a long time to run when the probability of finding consistent evidence is low. It is also slow for larger networks and more evidence variables.\n", "Rejection sampling tends to reject a lot of samples if our evidence consists of a large number of variables. Likelihood Weighting solves this by fixing the evidence (i.e. not sampling it) and then using weights to make sure that our overall sampling is still consistent.\n", "\n", "The pseudocode in **Figure 14.15** is implemented as **likelihood_weighting** and **weighted_sample**." @@ -1124,13 +3719,124 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, - "outputs": [], + "execution_count": 60, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def weighted_sample(bn, e):\n",
    +       "    """Sample an event from bn that's consistent with the evidence e;\n",
    +       "    return the event and its weight, the likelihood that the event\n",
    +       "    accords to the evidence."""\n",
    +       "    w = 1\n",
    +       "    event = dict(e)  # boldface x in [Figure 14.15]\n",
    +       "    for node in bn.nodes:\n",
    +       "        Xi = node.variable\n",
    +       "        if Xi in e:\n",
    +       "            w *= node.p(e[Xi], event)\n",
    +       "        else:\n",
    +       "            event[Xi] = node.sample(event)\n",
    +       "    return event, w\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "%psource weighted_sample" + "psource(weighted_sample)" ] }, { @@ -1145,24 +3851,144 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 61, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "({'Rain': True, 'Cloudy': False, 'Sprinkler': True, 'WetGrass': True}, 0.2)" + ] + }, + "execution_count": 61, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "weighted_sample(sprinkler, dict(Rain=True))" ] }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, - "outputs": [], + "execution_count": 62, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def likelihood_weighting(X, e, bn, N=10000):\n",
    +       "    """Estimate the probability distribution of variable X given\n",
    +       "    evidence e in BayesNet bn.  [Figure 14.15]\n",
    +       "    >>> random.seed(1017)\n",
    +       "    >>> likelihood_weighting('Burglary', dict(JohnCalls=T, MaryCalls=T),\n",
    +       "    ...   burglary, 10000).show_approx()\n",
    +       "    'False: 0.702, True: 0.298'\n",
    +       "    """\n",
    +       "    W = {x: 0 for x in bn.variable_values(X)}\n",
    +       "    for j in range(N):\n",
    +       "        sample, weight = weighted_sample(bn, e)  # boldface x, w in [Figure 14.15]\n",
    +       "        W[sample[X]] += weight\n",
    +       "    return ProbDist(X, W)\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "%psource likelihood_weighting" + "psource(likelihood_weighting)" ] }, { @@ -1174,11 +4000,20 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 63, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "'False: 0.2, True: 0.8'" + ] + }, + "execution_count": 63, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "likelihood_weighting('Cloudy', dict(Rain=True), sprinkler, 200).show_approx()" ] @@ -1196,13 +4031,124 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, - "outputs": [], + "execution_count": 64, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def gibbs_ask(X, e, bn, N=1000):\n",
    +       "    """[Figure 14.16]"""\n",
    +       "    assert X not in e, "Query variable must be distinct from evidence"\n",
    +       "    counts = {x: 0 for x in bn.variable_values(X)}  # bold N in [Figure 14.16]\n",
    +       "    Z = [var for var in bn.variables if var not in e]\n",
    +       "    state = dict(e)  # boldface x in [Figure 14.16]\n",
    +       "    for Zi in Z:\n",
    +       "        state[Zi] = random.choice(bn.variable_values(Zi))\n",
    +       "    for j in range(N):\n",
    +       "        for Zi in Z:\n",
    +       "            state[Zi] = markov_blanket_sample(Zi, state, bn)\n",
    +       "            counts[state[X]] += 1\n",
    +       "    return ProbDist(X, counts)\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "%psource gibbs_ask" + "psource(gibbs_ask)" ] }, { @@ -1214,39 +4160,2377 @@ }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": false - }, - "outputs": [], + "execution_count": 65, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "'False: 0.215, True: 0.785'" + ] + }, + "execution_count": 65, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "gibbs_ask('Cloudy', dict(Rain=True), sprinkler, 200).show_approx()" ] - } - ], - "metadata": { - "kernelspec": { - "display_name": "Python 3", - "language": "python", - "name": "python3" }, - "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", - "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.4.3" + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Runtime analysis\n", + "Let's take a look at how much time each algorithm takes." + ] + }, + { + "cell_type": "code", + "execution_count": 66, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "13.2 ms ± 3.45 ms per loop (mean ± std. dev. of 7 runs, 100 loops each)\n" + ] + } + ], + "source": [ + "%%timeit\n", + "all_observations = [prior_sample(sprinkler) for x in range(1000)]\n", + "rain_true = [observation for observation in all_observations if observation['Rain'] == True]\n", + "len([observation for observation in rain_true if observation['Cloudy'] == True]) / len(rain_true)" + ] + }, + { + "cell_type": "code", + "execution_count": 67, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "11 ms ± 687 µs per loop (mean ± std. dev. 
of 7 runs, 10 loops each)\n" + ] + } + ], + "source": [ + "%%timeit\n", + "rejection_sampling('Cloudy', dict(Rain=True), sprinkler, 1000)" + ] + }, + { + "cell_type": "code", + "execution_count": 68, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "2.12 ms ± 554 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)\n" + ] + } + ], + "source": [ + "%%timeit\n", + "likelihood_weighting('Cloudy', dict(Rain=True), sprinkler, 200)" + ] + }, + { + "cell_type": "code", + "execution_count": 69, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "14.4 ms ± 2.16 ms per loop (mean ± std. dev. of 7 runs, 100 loops each)\n" + ] + } + ], + "source": [ + "%%timeit\n", + "gibbs_ask('Cloudy', dict(Rain=True), sprinkler, 200)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As expected, all algorithms have a very similar runtime.\n", + "However, rejection sampling would be a lot faster and more accurate when the probabiliy of finding data-points consistent with the required evidence is small.\n", + "
    \n", + "Likelihood weighting is the fastest out of all as it doesn't involve rejecting samples, but also has a quite high variance." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## HIDDEN MARKOV MODELS" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Often, we need to carry out probabilistic inference on temporal data or a sequence of observations where the order of observations matter.\n", + "We require a model similar to a Bayesian Network, but one that grows over time to keep up with the latest evidences.\n", + "If you are familiar with the `mdp` module or Markov models in general, you can probably guess that a Markov model might come close to representing our problem accurately.\n", + "
    \n", + "A Markov model is basically a chain-structured Bayesian Network in which there is one state for each time step and each node has an identical probability distribution.\n", + "The first node, however, has a different distribution, called the prior distribution which models the initial state of the process.\n", + "A state in a Markov model depends only on the previous state and the latest evidence and not on the states before it.\n", + "
    \n", + "A **Hidden Markov Model** or **HMM** is a special case of a Markov model in which the state of the process is described by a single discrete random variable.\n", + "The possible values of the variable are the possible states of the world.\n", + "
    \n", + "But what if we want to model a process with two or more state variables?\n", + "In that case, we can still fit the process into the HMM framework by redefining our state variables as a single \"megavariable\".\n", + "We do this because carrying out inference on HMMs have standard optimized algorithms.\n", + "A HMM is very similar to an MDP, but we don't have the option of taking actions like in MDPs, instead, the process carries on as new evidence appears.\n", + "
    \n", + "If a HMM is truncated at a fixed length, it becomes a Bayesian network and general BN inference can be used on it to answer queries.\n", + "\n", + "Before we start, it will be helpful to understand the structure of a temporal model. We will use the example of the book with the guard and the umbrella. In this example, the state $\\textbf{X}$ is whether it is a rainy day (`X = True`) or not (`X = False`) at Day $\\textbf{t}$. In the sensor or observation model, the observation or evidence $\\textbf{U}$ is whether the professor holds an umbrella (`U = True`) or not (`U = False`) on **Day** $\\textbf{t}$. Based on that, the transition model is \n", + "\n", + "| $X_{t-1}$ | $X_{t}$ | **P**$(X_{t}| X_{t-1})$| \n", + "| ------------- |------------- | ----------------------------------|\n", + "| ***${False}$*** | ***${False}$*** | 0.7 |\n", + "| ***${False}$*** | ***${True}$*** | 0.3 |\n", + "| ***${True}$*** | ***${False}$*** | 0.3 |\n", + "| ***${True}$*** | ***${True}$*** | 0.7 |\n", + "\n", + "And the the sensor model will be,\n", + "\n", + "| $X_{t}$ | $U_{t}$ | **P**$(U_{t}|X_{t})$| \n", + "| :-------------: |:-------------: | :------------------------:|\n", + "| ***${False}$*** | ***${True}$*** | 0.2 |\n", + "| ***${False}$*** | ***${False}$*** | 0.8 |\n", + "| ***${True}$*** | ***${True}$*** | 0.9 |\n", + "| ***${True}$*** | ***${False}$*** | 0.1 |\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "HMMs are implemented in the **`HiddenMarkovModel`** class.\n", + "Let's have a look." + ] + }, + { + "cell_type": "code", + "execution_count": 70, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    class HiddenMarkovModel:\n",
    +       "    """A Hidden markov model which takes Transition model and Sensor model as inputs"""\n",
    +       "\n",
    +       "    def __init__(self, transition_model, sensor_model, prior=None):\n",
    +       "        self.transition_model = transition_model\n",
    +       "        self.sensor_model = sensor_model\n",
    +       "        self.prior = prior or [0.5, 0.5]\n",
    +       "\n",
    +       "    def sensor_dist(self, ev):\n",
    +       "        if ev is True:\n",
    +       "            return self.sensor_model[0]\n",
    +       "        else:\n",
    +       "            return self.sensor_model[1]\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(HiddenMarkovModel)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We instantiate the object **`hmm`** of the class using a list of lists for both the transition and the sensor model." + ] + }, + { + "cell_type": "code", + "execution_count": 71, + "metadata": {}, + "outputs": [], + "source": [ + "umbrella_transition_model = [[0.7, 0.3], [0.3, 0.7]]\n", + "umbrella_sensor_model = [[0.9, 0.2], [0.1, 0.8]]\n", + "hmm = HiddenMarkovModel(umbrella_transition_model, umbrella_sensor_model)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The **`sensor_dist()`** method returns a list with the conditional probabilities of the sensor model." + ] + }, + { + "cell_type": "code", + "execution_count": 72, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "[0.9, 0.2]" + ] + }, + "execution_count": 72, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "hmm.sensor_dist(ev=True)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now that we have defined an HMM object, our task here is to compute the belief $B_{t}(x)= P(X_{t}|U_{1:t})$ given evidence **U** at each time step **t**.\n", + "
    \n", + "The basic inference tasks that must be solved are:\n", + "1. **Filtering**: Computing the posterior probability distribution over the most recent state, given all the evidence up to the current time step.\n", + "2. **Prediction**: Computing the posterior probability distribution over the future state.\n", + "3. **Smoothing**: Computing the posterior probability distribution over a past state. Smoothing provides a better estimation as it incorporates more evidence.\n", + "4. **Most likely explanation**: Finding the most likely sequence of states for a given observation\n", + "5. **Learning**: The transition and sensor models can be learnt, if not yet known, just like in an information gathering agent\n", + "
    \n", + "
    \n", + "\n", + "There are three primary methods to carry out inference in Hidden Markov Models:\n", + "1. The Forward-Backward algorithm\n", + "2. Fixed lag smoothing\n", + "3. Particle filtering\n", + "\n", + "Let's have a look at how we can carry out inference and answer queries based on our umbrella HMM using these algorithms." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### FORWARD-BACKWARD\n", + "This is a general algorithm that works for all Markov models, not just HMMs.\n", + "In the filtering task (inference) we are given evidence **U** in each time **t** and we want to compute the belief $B_{t}(x)= P(X_{t}|U_{1:t})$. \n", + "We can think of it as a three step process:\n", + "1. In every step we start with the current belief $P(X_{t}|e_{1:t})$\n", + "2. We update it for time\n", + "3. We update it for evidence\n", + "\n", + "The forward algorithm performs the step 2 and 3 at once. It updates, or better say reweights, the initial belief using the transition and the sensor model. Let's see the umbrella example. On **Day 0** no observation is available, and for that reason we will assume that we have equal possibilities to rain or not. In the **`HiddenMarkovModel`** class, the prior probabilities for **Day 0** are by default [0.5, 0.5]. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The observation update is calculated with the **`forward()`** function. Basically, we update our belief using the observation model. The function returns a list with the probabilities of **raining or not** on **Day 1**." + ] + }, + { + "cell_type": "code", + "execution_count": 73, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def forward(HMM, fv, ev):\n",
    +       "    prediction = vector_add(scalar_vector_product(fv[0], HMM.transition_model[0]),\n",
    +       "                            scalar_vector_product(fv[1], HMM.transition_model[1]))\n",
    +       "    sensor_dist = HMM.sensor_dist(ev)\n",
    +       "\n",
    +       "    return normalize(element_wise_product(sensor_dist, prediction))\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(forward)" + ] + }, + { + "cell_type": "code", + "execution_count": 74, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "The probability of raining on day 1 is 0.82\n" + ] + } + ], + "source": [ + "umbrella_prior = [0.5, 0.5]\n", + "belief_day_1 = forward(hmm, umbrella_prior, ev=True)\n", + "print ('The probability of raining on day 1 is {:.2f}'.format(belief_day_1[0]))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In **Day 2** our initial belief is the updated belief of **Day 1**.\n", + "Again using the **`forward()`** function we can compute the probability of raining in **Day 2**" + ] + }, + { + "cell_type": "code", + "execution_count": 75, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "The probability of raining in day 2 is 0.88\n" + ] + } + ], + "source": [ + "belief_day_2 = forward(hmm, belief_day_1, ev=True)\n", + "print ('The probability of raining in day 2 is {:.2f}'.format(belief_day_2[0]))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In the smoothing part we are interested in computing the distribution over past states given evidence up to the present. Assume that we want to compute the distribution for the time **k**, for $0\\leq k\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def backward(HMM, b, ev):\n",
    +       "    sensor_dist = HMM.sensor_dist(ev)\n",
    +       "    prediction = element_wise_product(sensor_dist, b)\n",
    +       "\n",
    +       "    return normalize(vector_add(scalar_vector_product(prediction[0], HMM.transition_model[0]),\n",
    +       "                                scalar_vector_product(prediction[1], HMM.transition_model[1])))\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(backward)" + ] + }, + { + "cell_type": "code", + "execution_count": 77, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "[0.6272727272727272, 0.37272727272727274]" + ] + }, + "execution_count": 77, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "b = [1, 1]\n", + "backward(hmm, b, ev=True)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Some may notice that the result is not the same as in the book. The main reason is that in the book the normalization step is not used. If we want to normalize the result, one can use the **`normalize()`** helper function.\n", + "\n", + "In order to find the smoothed estimate for raining in **Day k**, we will use the **`forward_backward()`** function. As in the example in the book, the umbrella is observed in both days and the prior distribution is [0.5, 0.5]" + ] + }, + { + "cell_type": "code", + "execution_count": 78, + "metadata": {}, + "outputs": [ + { + "data": { + "text/markdown": [ + "### AIMA3e\n", + "__function__ FORWARD-BACKWARD(__ev__, _prior_) __returns__ a vector of probability distributions \n", + " __inputs__: __ev__, a vector of evidence values for steps 1,…,_t_ \n", + "     _prior_, the prior distribution on the initial state, __P__(__X__0) \n", + " __local variables__: __fv__, a vector of forward messages for steps 0,…,_t_ \n", + "        __b__, a representation of the backward message, initially all 1s \n", + "        __sv__, a vector of smoothed estimates for steps 1,…,_t_ \n", + "\n", + " __fv__\\[0\\] ← _prior_ \n", + " __for__ _i_ = 1 __to__ _t_ __do__ \n", + "   __fv__\\[_i_\\] ← FORWARD(__fv__\\[_i_ − 1\\], __ev__\\[_i_\\]) \n", + " __for__ _i_ = _t_ __downto__ 1 __do__ \n", + "   __sv__\\[_i_\\] ← NORMALIZE(__fv__\\[_i_\\] × __b__) \n", + "   __b__ ← BACKWARD(__b__, __ev__\\[_i_\\]) \n", 
+ " __return__ __sv__\n", + "\n", + "---\n", + "__Figure ??__ The forward\\-backward algorithm for smoothing: computing posterior probabilities of a sequence of states given a sequence of observations. The FORWARD and BACKWARD operators are defined by Equations (__??__) and (__??__), respectively." + ], + "text/plain": [ + "" + ] + }, + "execution_count": 78, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "pseudocode('Forward-Backward')" + ] + }, + { + "cell_type": "code", + "execution_count": 79, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "The probability of raining in Day 0 is 0.65 and in Day 1 is 0.88\n" + ] + } + ], + "source": [ + "umbrella_prior = [0.5, 0.5]\n", + "prob = forward_backward(hmm, ev=[T, T], prior=umbrella_prior)\n", + "print ('The probability of raining in Day 0 is {:.2f} and in Day 1 is {:.2f}'.format(prob[0][0], prob[1][0]))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "Since HMMs are represented as single variable systems, we can represent the transition model and sensor model as matrices.\n", + "The `forward_backward` algorithm can be easily carried out on this representation (as we have done here) with a time complexity of $O({S}^{2} t)$ where t is the length of the sequence and each step multiplies a vector of size $S$ with a matrix of dimensions $SxS$.\n", + "
    \n", + "Additionally, the forward pass stores $t$ vectors of size $S$ which makes the auxiliary space requirement equivalent to $O(St)$.\n", + "
    \n", + "
    \n", + "Is there any way we can improve the time or space complexity?\n", + "
    \n", + "Fortunately, the matrix representation of HMM properties allows us to do so.\n", + "
    \n", + "If $f$ and $b$ represent the forward and backward messages respectively, we can modify the smoothing algorithm by first\n", + "running the standard forward pass to compute $f_{t:t}$ (forgetting all the intermediate results) and then running\n", + "backward pass for both $b$ and $f$ together, using them to compute the smoothed estimate at each step.\n", + "This optimization reduces auxlilary space requirement to constant (irrespective of the length of the sequence) provided\n", + "the transition matrix is invertible and the sensor model has no zeros (which is sometimes hard to accomplish)\n", + "
    \n", + "
    \n", + "Let's look at another algorithm, that carries out smoothing in a more optimized way." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### FIXED LAG SMOOTHING\n", + "The matrix formulation allows to optimize online smoothing with a fixed lag.\n", + "
    \n", + "Since smoothing can be done in constant, there should exist an algorithm whose time complexity is independent of the length of the lag.\n", + "For smoothing a time slice $t - d$ where $d$ is the lag, we need to compute $\\alpha f_{1:t-d}$ x $b_{t-d+1:t}$ incrementally.\n", + "
    \n", + "As we already know, the forward equation is\n", + "
    \n", + "$$f_{1:t+1} = \\alpha O_{t+1}{T}^{T}f_{1:t}$$\n", + "
    \n", + "and the backward equation is\n", + "
    \n", + "$$b_{k+1:t} = TO_{k+1}b_{k+2:t}$$\n", + "
    \n", + "where $T$ and $O$ are the transition and sensor models respectively.\n", + "
    \n", + "For smoothing, the forward message is easy to compute but there exists no simple relation between the backward message of this time step and the one at the previous time step, hence we apply the backward equation $d$ times to get\n", + "
    \n", + "$$b_{t-d+1:t} = \\left ( \\prod_{i=t-d+1}^{t}{TO_i} \\right )b_{t+1:t} = B_{t-d+1:t}1$$\n", + "
    \n", + "where $B_{t-d+1:t}$ is the product of the sequence of $T$ and $O$ matrices.\n", + "
    \n", + "Here's how the `probability` module implements `fixed_lag_smoothing`.\n", + "
    " + ] }, - "widgets": { - "state": {}, - "version": "1.1.1" + { + "cell_type": "code", + "execution_count": 80, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def fixed_lag_smoothing(e_t, HMM, d, ev, t):\n",
    +       "    """[Figure 15.6]\n",
    +       "    Smoothing algorithm with a fixed time lag of 'd' steps.\n",
    +       "    Online algorithm that outputs the new smoothed estimate if observation\n",
    +       "    for new time step is given."""\n",
    +       "    ev.insert(0, None)\n",
    +       "\n",
    +       "    T_model = HMM.transition_model\n",
    +       "    f = HMM.prior\n",
    +       "    B = [[1, 0], [0, 1]]\n",
    +       "    evidence = []\n",
    +       "\n",
    +       "    evidence.append(e_t)\n",
    +       "    O_t = vector_to_diagonal(HMM.sensor_dist(e_t))\n",
    +       "    if t > d:\n",
    +       "        f = forward(HMM, f, e_t)\n",
    +       "        O_tmd = vector_to_diagonal(HMM.sensor_dist(ev[t - d]))\n",
    +       "        B = matrix_multiplication(inverse_matrix(O_tmd), inverse_matrix(T_model), B, T_model, O_t)\n",
    +       "    else:\n",
    +       "        B = matrix_multiplication(B, T_model, O_t)\n",
    +       "    t += 1\n",
    +       "\n",
    +       "    if t > d:\n",
    +       "        # always returns a 1x2 matrix\n",
    +       "        return [normalize(i) for i in matrix_multiplication([f], B)][0]\n",
    +       "    else:\n",
    +       "        return None\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(fixed_lag_smoothing)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This algorithm applies `forward` as usual and optimizes the smoothing step by using the equations above.\n", + "This optimization could be achieved only because HMM properties can be represented as matrices.\n", + "
    \n", + "`vector_to_diagonal`, `matrix_multiplication` and `inverse_matrix` are matrix manipulation functions to simplify the implementation.\n", + "
    \n", + "`normalize` is used to normalize the output before returning it." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here's how we can use `fixed_lag_smoothing` for inference on our umbrella HMM." + ] + }, + { + "cell_type": "code", + "execution_count": 81, + "metadata": {}, + "outputs": [], + "source": [ + "umbrella_transition_model = [[0.7, 0.3], [0.3, 0.7]]\n", + "umbrella_sensor_model = [[0.9, 0.2], [0.1, 0.8]]\n", + "hmm = HiddenMarkovModel(umbrella_transition_model, umbrella_sensor_model)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Given evidence T, F, T, F and T, we want to calculate the probability distribution for the fourth day with a fixed lag of 2 days.\n", + "
    \n", + "Let `e_t = False`" + ] + }, + { + "cell_type": "code", + "execution_count": 82, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "[0.1111111111111111, 0.8888888888888888]" + ] + }, + "execution_count": 82, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "e_t = F\n", + "evidence = [T, F, T, F, T]\n", + "fixed_lag_smoothing(e_t, hmm, d=2, ev=evidence, t=4)" + ] + }, + { + "cell_type": "code", + "execution_count": 83, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "[0.9938650306748466, 0.006134969325153394]" + ] + }, + "execution_count": 83, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "e_t = T\n", + "evidence = [T, T, F, T, T]\n", + "fixed_lag_smoothing(e_t, hmm, d=1, ev=evidence, t=4)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We cannot calculate probability distributions when $t$ is less than $d$" + ] + }, + { + "cell_type": "code", + "execution_count": 84, + "metadata": {}, + "outputs": [], + "source": [ + "fixed_lag_smoothing(e_t, hmm, d=5, ev=evidence, t=4)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As expected, the output is `None`" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### PARTICLE FILTERING\n", + "The filtering problem is too expensive to solve using the previous methods for problems with large or continuous state spaces.\n", + "Particle filtering is a method that can solve the same problem but when the state space is a lot larger, where we wouldn't be able to do these computations in a reasonable amount of time as fast, as time goes by, and we want to keep track of things as they happen.\n", + "
    \n", + "The downside is that it is a sampling method and hence isn't accurate, but the more samples we're willing to take, the more accurate we'd get.\n", + "
    \n", + "In this method, instead of keping track of the probability distribution, we will drop particles in a similar proportion at the required regions.\n", + "The internal representation of this distribution is usually a list of particles with coordinates in the state-space.\n", + "A particle is just a new name for a sample.\n", + "\n", + "Particle filtering can be divided into four steps:\n", + "1. __Initialization__: \n", + "If we have some idea about the prior probability distribution, we drop the initial particles accordingly, or else we just drop them uniformly over the state space.\n", + "\n", + "2. __Forward pass__: \n", + "As time goes by and measurements come in, we are going to move the selected particles into the grid squares that makes the most sense in terms of representing the distribution that we are trying to track.\n", + "When time goes by, we just loop through all our particles and try to simulate what could happen to each one of them by sampling its next position from the transition model.\n", + "This is like prior sampling - samples' frequencies reflect the transition probabilities.\n", + "If we have enough samples we are pretty close to exact values.\n", + "We work through the list of particles, one particle at a time, all we do is stochastically simulate what the outcome might be.\n", + "If we had no dimension of time, and we had no new measurements come in, this would be exactly the same as what we did in prior sampling.\n", + "\n", + "3. __Reweight__:\n", + "As observations come in, don't sample the observations, fix them and downweight the samples based on the evidence just like in likelihood weighting.\n", + "$$w(x) = P(e/x)$$\n", + "$$B(X) \\propto P(e/X)B'(X)$$\n", + "
    \n", + "As before, the probabilities don't sum to one, since most have been downweighted.\n", + "They sum to an approximation of $P(e)$.\n", + "To normalize the resulting distribution, we can divide by $P(e)$\n", + "
    \n", + "Likelihood weighting wasn't the best thing for Bayesian networks, because we were not accounting for the incoming evidence so we were getting samples from the prior distribution, in some sense not the right distribution, so we might end up with a lot of particles with low weights. \n", + "These samples were very uninformative and the way we fixed it then was by using __Gibbs sampling__.\n", + "Theoretically, Gibbs sampling can be run on a HMM, but as we iterated over the process infinitely many times in a Bayesian network, we cannot do that here as we have new incoming evidence and we also need computational cycles to propagate through time.\n", + "
    \n", + "A lot of samples with very low weight and they are not representative of the _actual probability distribution_.\n", + "So if we keep running likelihood weighting, we keep propagating the samples with smaller weights and carry out computations for that even though these samples have no significant contribution to the actual probability distribution.\n", + "Which is why we require this last step.\n", + "\n", + "4. __Resample__:\n", + "Rather than tracking weighted samples, we _resample_.\n", + "We choose from our weighted sample distribution as many times as the number of particles we initially had and we replace these particles too, so that we have a constant number of particles.\n", + "This is equivalent to renormalizing the distribution.\n", + "The samples with low weight are rarely chosen in the new distribution after resampling.\n", + "This newer set of particles after resampling is in some sense more representative of the actual distribution and so we are better allocating our computational cycles.\n", + "Now the update is complete for this time step, continue with the next one.\n", + "\n", + "
    \n", + "Let's see how this is implemented in the module." + ] + }, + { + "cell_type": "code", + "execution_count": 85, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def particle_filtering(e, N, HMM):\n",
    +       "    """Particle filtering considering two states variables."""\n",
    +       "    dist = [0.5, 0.5]\n",
    +       "    # Weight Initialization\n",
    +       "    w = [0 for _ in range(N)]\n",
    +       "    # STEP 1\n",
    +       "    # Propagate one step using transition model given prior state\n",
    +       "    dist = vector_add(scalar_vector_product(dist[0], HMM.transition_model[0]),\n",
    +       "                      scalar_vector_product(dist[1], HMM.transition_model[1]))\n",
    +       "    # Assign state according to probability\n",
    +       "    s = ['A' if probability(dist[0]) else 'B' for _ in range(N)]\n",
    +       "    w_tot = 0\n",
    +       "    # Calculate importance weight given evidence e\n",
    +       "    for i in range(N):\n",
    +       "        if s[i] == 'A':\n",
    +       "            # P(U|A)*P(A)\n",
    +       "            w_i = HMM.sensor_dist(e)[0] * dist[0]\n",
    +       "        if s[i] == 'B':\n",
    +       "            # P(U|B)*P(B)\n",
    +       "            w_i = HMM.sensor_dist(e)[1] * dist[1]\n",
    +       "        w[i] = w_i\n",
    +       "        w_tot += w_i\n",
    +       "\n",
    +       "    # Normalize all the weights\n",
    +       "    for i in range(N):\n",
    +       "        w[i] = w[i] / w_tot\n",
    +       "\n",
    +       "    # Limit weights to 4 digits\n",
    +       "    for i in range(N):\n",
    +       "        w[i] = float("{0:.4f}".format(w[i]))\n",
    +       "\n",
    +       "    # STEP 2\n",
    +       "\n",
    +       "    s = weighted_sample_with_replacement(N, s, w)\n",
    +       "\n",
    +       "    return s\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(particle_filtering)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here, `scalar_vector_product` and `vector_add` are helper functions to help with vector math and `weighted_sample_with_replacement` resamples from a weighted sample and replaces the original sample, as is obvious from the name.\n", + "
    \n", + "This implementation considers two state variables with generic names 'A' and 'B'.\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here's how we can use `particle_filtering` on our umbrella HMM, though it doesn't make much sense using particle filtering on a problem with such a small state space.\n", + "It is just to get familiar with the syntax." + ] + }, + { + "cell_type": "code", + "execution_count": 86, + "metadata": {}, + "outputs": [], + "source": [ + "umbrella_transition_model = [[0.7, 0.3], [0.3, 0.7]]\n", + "umbrella_sensor_model = [[0.9, 0.2], [0.1, 0.8]]\n", + "hmm = HiddenMarkovModel(umbrella_transition_model, umbrella_sensor_model)" + ] + }, + { + "cell_type": "code", + "execution_count": 87, + "metadata": { + "scrolled": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "['A', 'A', 'A', 'A', 'A', 'A', 'A', 'A', 'A', 'A']" + ] + }, + "execution_count": 87, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "particle_filtering(T, 10, hmm)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We got 5 samples from state `A` and 5 samples from state `B`" + ] + }, + { + "cell_type": "code", + "execution_count": 88, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "['A', 'B', 'A', 'B', 'B', 'B', 'B', 'B', 'B', 'B']" + ] + }, + "execution_count": 88, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "particle_filtering([F, T, F, F, T], 10, hmm)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This time we got 2 samples from state `A` and 8 samples from state `B`" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Comparing runtimes for these algorithms will not be useful, as each solves the filtering task efficiently for a different scenario.\n", + "
    \n", + "`forward_backward` calculates the exact probability distribution.\n", + "
    \n", + "`fixed_lag_smoothing` calculates an approximate distribution and its runtime will depend on the value of the lag chosen.\n", + "
    \n", + "`particle_filtering` is an efficient method for approximating distributions for a very large or continuous state space." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## MONTE CARLO LOCALIZATION\n", + "In the domain of robotics, particle filtering is used for _robot localization_.\n", + "__Localization__ is the problem of finding out where things are, in this case, we want to find the position of a robot in a continuous state space.\n", + "
    \n", + "__Monte Carlo Localization__ is an algorithm for robots to _localize_ using a _particle filter_.\n", + "Given a map of the environment, the algorithm estimates the position and orientation of a robot as it moves and senses the environment.\n", + "
    \n", + "Initially, particles are distributed uniformly over the state space, ie the robot has no information of where it is and assumes it is equally likely to be at any point in space.\n", + "
    \n", + "When the robot moves, it analyses the incoming evidence to shift and change the probability to better approximate the probability distribution of its position.\n", + "The particles are then resampled based on their weights.\n", + "
    \n", + "Gradually, as more evidence comes in, the robot gets better at approximating its location and the particles converge towards the actual position of the robot.\n", + "
    \n", + "The pose of a robot is defined by its two Cartesian coordinates with values $x$ and $y$ and its direction with value $\\theta$.\n", + "We use the kinematic equations of motion to model a deterministic state prediction.\n", + "This is our motion model (or transition model).\n", + "
    \n", + "Next, we need a sensor model.\n", + "There can be two kinds of sensor models, the first assumes that the sensors detect _stable_, _recognizable_ features of the environment called __landmarks__.\n", + "The robot senses the location and bearing of each landmark and updates its belief according to that.\n", + "We can also assume the noise in measurements to be Gaussian, to simplify things.\n", + "
    \n", + "Another kind of sensor model is used for an array of range sensors, each of which has a fixed bearing relative to the robot.\n", + "These sensors provide a set of range values in each direction.\n", + "This will also be corrupted by Gaussian noise, but we can assume that the errors for different beam directions are independent and identically distributed.\n", + "
    \n", + "After evidence comes in, the robot updates its belief state and reweights the particle distribution to better aproximate the actual distribution.\n", + "
    \n", + "
    \n", + "Let's have a look at how this algorithm is implemented in the module" + ] + }, + { + "cell_type": "code", + "execution_count": 89, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def monte_carlo_localization(a, z, N, P_motion_sample, P_sensor, m, S=None):\n",
    +       "    """Monte Carlo localization algorithm from Fig 25.9"""\n",
    +       "\n",
    +       "    def ray_cast(sensor_num, kin_state, m):\n",
    +       "        return m.ray_cast(sensor_num, kin_state)\n",
    +       "\n",
    +       "    M = len(z)\n",
    +       "    W = [0]*N\n",
    +       "    S_ = [0]*N\n",
    +       "    W_ = [0]*N\n",
    +       "    v = a['v']\n",
    +       "    w = a['w']\n",
    +       "\n",
    +       "    if S is None:\n",
    +       "        S = [m.sample() for _ in range(N)]\n",
    +       "\n",
    +       "    for i in range(N):\n",
    +       "        S_[i] = P_motion_sample(S[i], v, w)\n",
    +       "        W_[i] = 1\n",
    +       "        for j in range(M):\n",
    +       "            z_ = ray_cast(j, S_[i], m)\n",
    +       "            W_[i] = W_[i] * P_sensor(z[j], z_)\n",
    +       "\n",
    +       "    S = weighted_sample_with_replacement(N, S_, W_)\n",
    +       "    return S\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(monte_carlo_localization)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Our implementation of Monte Carlo Localization uses the range scan method.\n", + "The `ray_cast` helper function casts rays in different directions and stores the range values.\n", + "
    \n", + "`a` stores the `v` and `w` components of the robot's velocity.\n", + "
    \n", + "`z` is a range scan.\n", + "
    \n", + "`P_motion_sample` is the motion or transition model.\n", + "
    \n", + "`P_sensor` is the range sensor noise model.\n", + "
    \n", + "`m` is the 2D map of the environment\n", + "
    \n", + "`S` is a vector of samples of size N" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We'll now define a simple 2D map to run Monte Carlo Localization on.\n", + "
    \n", + "Let's say this is the map we want\n", + "
    " + ] + }, + { + "cell_type": "code", + "execution_count": 90, + "metadata": { + "scrolled": true + }, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAfAAAAFaCAYAAADhKw9uAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8QZhcZAAASOUlEQVR4nO3df4ztd13n8dd779hAKSwlvaj9oaVaUJao0JGARFYpxIJIMbvZBcUUf6SJP6AQFIsmaGI0ZDWoiQZTC7aJDailArqKdPEHmrDVuQWEclEaiu2FSoclCLrGWnz7x5yScXrnzvSc750zn9PHI7mZ8+M75/v+3Dszz/s958w51d0BAMbyn5Y9AADw4Ak4AAxIwAFgQAIOAAMScAAYkIADwIAEHAAGJOBwiFXVx6vq2Tsue2lV/cUEt91V9dWL3g6wHAIOAAMScBhYVZ1bVW+tqs2quqOqXr7tuqdW1Xur6rNVdXdV/UpVnTG77j2zzT5QVf9YVf+zqr6lqk5U1aur6p7Z57ywqp5XVX9bVZ+pqp/Yz+3Pru+qenlVfayqPl1VP19VfubARHwzwaBmMfy9JB9Icl6SS5O8oqq+bbbJF5K8Msk5SZ4+u/6HkqS7nznb5uu7+6zu/q3Z+S9L8rDZ7b02ya8neUmSS5J8c5LXVtVFe93+Nt+ZZD3JU5JcnuT7plg7kJTXQofDq6o+nq1A3rft4jOS3JrkVUl+p7u/Ytv2r0ny+O7+3pPc1iuS/Nfu/s7Z+U5ycXffPjv/LUn+MMlZ3f2Fqnpkks8leVp33zLb5liSn+nut+3z9p/b3e+cnf+hJP+tuy9d4K8EmFlb9gDAnl7Y3f/n/jNV9dIkP5DkK5OcW1Wf3bbtkSR/Ptvu8Ulen60j4DOz9f1+bI99/b/u/sLs9D/PPn5q2/X/nOSsB3H7d207/XdJzt1j/8A+uQsdxnVXkju6+9Hb/jyyu583u/4NST6SraPsRyX5iSQ14f73c/sXbDv9FUk+OeH+4SFNwGFcf5nkc1X141X18Ko6UlVPqqpvnF1//13g/1hVX5PkB3d8/qeSXJT57XX7SfJjVXV2VV2Q5Kokv3WSbYA5CDgManZX93ck+YYkdyT5dJJrk/zn2SY/muS7knw+W09G2xnPn05y/exZ5P9jjhH2uv0keXu27lZ/f5L/neSNc+wHOAlPYgNOi51PkgOm5QgcAAYk4AAwIHehA8CAHIEDwIAO9IVczjnnnL7wwgsPcpfAijh2bK/XoGE/LrnkkmWPcFoc9NfHQf49Hjt27NPdfXTn5Qd6F/r6+npvbGwc2P6A1VE15WvQPHSt6sOmB/31cZB/j1V1rLvXd17uLnQAGJCAA8CABBwABiTgADAgAQeAAQk4AAxIwAFgQAIOAAMScAAY0EIBr6rLqupvqur2qrp6qqEAgFObO+BVdSTJryZ5bpInJnlxVT1xqsEAgN0tcgT+1CS3d/fHuvveJG9Jcvk0YwEAp7JIwM9Lcte28ydml/0HVXVlVW1U1cbm5uYCuwMA7rdIwE/21i8PeHuW7r6mu9e7e/3o0Qe8GxoAMIdFAn4iyQXbzp+f5JOLjQMA7MciAf+rJBdX1eOq6owkL0ryjmnGAgBOZW3eT+zu+6rqR5L8UZIjSd7U3bdNNhkAsKu5A54k3f0HSf5golkAgH3ySmwAMCABB4ABCTgADEjAAWBAAg4AAxJwABiQgAPAgBb6PXAA2E3Vyd4yg6k4AgeAAQk4AAxIwAFgQAIOAAMScAAYkIADwIAEHAAGJOAAMCABB4ABCTgADEjAAWBAAg4AAxJwABiQgAPAgAQcAAYk4A
AwIAEHgAEJOAAMSMABYEACDgADEnAAGJCAA8CABBwABiTgADAgAQeAAQk4AAyouvvgdlZ1cDuDh6iD/J4+SFW17BFWwgH/zD+wfR20A/57PNbd6zsvdwQOAAMScAAYkIADwIAEHAAGJOAAMCABB4ABCTgADEjAAWBAAg4AAxJwABjQ3AGvqguq6k+q6nhV3VZVV005GACwu7UFPve+JK/q7lur6pFJjlXVzd394YlmAwB2MfcReHff3d23zk5/PsnxJOdNNRgAsLtFjsC/qKouTPLkJLec5Lork1w5xX4AgC0Lv51oVZ2V5M+S/Gx337THtqv5PodwiHg7UU7F24lOY/i3E62qL0ny1iQ37BVvAGA6izwLvZK8Mcnx7n79dCMBAHtZ5Aj8GUm+J8mzqur9sz/Pm2guAOAU5n4SW3f/RZLVfYADAA4xr8QGAAMScAAYkIADwIAEHAAGJOAAMCABB4ABCTgADEjAAWBAk7wb2X5dcskl2djYOMhdAsBKcgQOAAMScAAYkIADwIAEHAAGJOAAMCABB4ABCTgADEjAAWBAAg4AAxJwABiQgAPAgAQcAAYk4AAwIAEHgAEJOAAMSMABYEACDgADEnAAGJCAA8CABBwABiTgADAgAQeAAQk4AAxIwAFgQAIOAAMScAAY0NqyBzhdqmrZIwDAaeMIHAAGJOAAMCABB4ABCTgADEjAAWBAAg4AAxJwABiQgAPAgAQcAAYk4AAwoIUDXlVHqup9VfX7UwwEAOxtiiPwq5Icn+B2AIB9WijgVXV+km9Pcu004wAA+7HoEfgvJXl1kn/bbYOqurKqNqpqY3Nzc8HdAQDJAgGvqucnuae7j51qu+6+prvXu3v96NGj8+4OANhmkSPwZyR5QVV9PMlbkjyrqn5zkqkAgFOaO+Dd/ZruPr+7L0zyoiR/3N0vmWwyAGBXfg8cAAa0NsWNdPefJvnTKW4LANibI3AAGJCAA8CABBwABiTgADAgAQeAAQk4AAxIwAFgQJP8Hvhh1N3LHgGYUFUtewQ4VByBA8CABBwABiTgADAgAQeAAQk4AAxIwAFgQAIOAAMScAAYkIADwIAEHAAGJOAAMCABB4ABCTgADEjAAWBAAg4AAxJwABiQgAPAgAQcAAYk4AAwIAEHgAEJOAAMSMABYEACDgADEnAAGJCAA8CABBwABiTgADAgAQeAAQk4AAxIwAFgQAIOAAMScAAYkIADwIAEHAAGJOAAMCABB4ABCTgADGihgFfVo6vqxqr6SFUdr6qnTzUYALC7tQU//5eTvLO7/3tVnZHkzAlmAgD2MHfAq+pRSZ6Z5KVJ0t33Jrl3mrEAgFNZ5C70i5JsJvmNqnpfVV1bVY/YuVFVXVlVG1W1sbm5ucDuAID7LRLwtSRPSfKG7n5ykn9KcvXOjbr7mu5e7+71o0ePLrA7AOB+iwT8RJIT3X3L7PyN2Qo6AHCazR3w7v77JHdV1RNmF12a5MOTTAUAnNKiz0J/WZIbZs9A/1iS7118JABgLwsFvLvfn2R9olkAgH3ySmwAMCABB4ABCTgADEjAAWBAAg4AAxJwABiQgAPAgAQcAAa06CuxkaSqlj0Ch1x3L3sEYMU4AgeAAQk4AAxIwAFgQAIOAAMScAAYkIADwIAEHAAGJOAAMCABB4ABCTgADEjAAWBAAg4AAxJwABiQgAPAgAQcAAYk4AAwIAEHgAEJOAAMSMABYEACDgADEnAAGJCAA8CABBwABiTgADAgAQeAAQk4AAxobdkDAOxHdy97BB6kg/w3q6oD29dh4QgcAAYk4AAwIAEHgAEJOAAMSMABYEACDgADEnAAGJCAA8CABBwABiTgADCghQJeVa+sqtuq6kNV9eaqethUgwEAu5s74FV1XpKXJ1nv7iclOZLkRVMNBgDsbtG70NeSPLyq1pKcmeSTi48EAOxl7oB39yeS/EKSO5PcneQfuvtdO7erqiuraqOqNjY3N+efFAD4okXuQj87yeVJHp
fk3CSPqKqX7Nyuu6/p7vXuXj969Oj8kwIAX7TIXejPTnJHd292978muSnJN00zFgBwKosE/M4kT6uqM2vrndQvTXJ8mrEAgFNZ5DHwW5LcmOTWJB+c3dY1E80FAJzC2iKf3N0/leSnJpoFANgnr8QGAAMScAAYkIADwIAEHAAGJOAAMCABB4ABCTgADGih3wNnS3cvewSAQ2frRTo5XRyBA8CABBwABiTgADAgAQeAAQk4AAxIwAFgQAIOAAMScAAYkIADwIAEHAAGJOAAMCABB4ABCTgADEjAAWBAAg4AAxJwABiQgAPAgAQcAAYk4AAwIAEHgAEJOAAMSMABYEACDgADEnAAGJCAA8CABBwABiTgADCgtWUPsAqqatkjcMh197JHGJ7vs2kc5NfiQe7rofj14QgcAAYk4AAwIAEHgAEJOAAMSMABYEACDgADEnAAGJCAA8CABBwABrRnwKvqTVV1T1V9aNtlj6mqm6vqo7OPZ5/eMQGA7fZzBH5dkst2XHZ1knd398VJ3j07DwAckD0D3t3vSfKZHRdfnuT62enrk7xw4rkAgFOY9zHwL+3uu5Nk9vGxu21YVVdW1UZVbWxubs65OwBgu9P+JLbuvqa717t7/ejRo6d7dwDwkDBvwD9VVV+eJLOP90w3EgCwl3kD/o4kV8xOX5Hk7dOMAwDsx35+jezNSd6b5AlVdaKqvj/J65I8p6o+muQ5s/MAwAFZ22uD7n7xLlddOvEsAMA+eSU2ABiQgAPAgAQcAAYk4AAwIAEHgAEJOAAMSMABYEACDgADqu4+uJ1VbSb5uwf5aeck+fRpGOcwWNW1req6ktVd26quK1ndta3qupLVXdu86/rK7n7Au4EdaMDnUVUb3b2+7DlOh1Vd26quK1ndta3qupLVXduqritZ3bVNvS53oQPAgAQcAAY0QsCvWfYAp9Gqrm1V15Ws7tpWdV3J6q5tVdeVrO7aJl3XoX8MHAB4oBGOwAGAHQQcAAZ0qANeVZdV1d9U1e1VdfWy55lCVV1QVX9SVcer6raqumrZM02tqo5U1fuq6veXPctUqurRVXVjVX1k9m/39GXPNJWqeuXsa/FDVfXmqnrYsmeaR1W9qaruqaoPbbvsMVV1c1V9dPbx7GXOOK9d1vbzs6/Hv66q362qRy9zxnmcbF3brvvRquqqOmcZsy1qt7VV1ctmXbutqv7XIvs4tAGvqiNJfjXJc5M8McmLq+qJy51qEvcleVV3f22SpyX54RVZ13ZXJTm+7CEm9stJ3tndX5Pk67Mi66uq85K8PMl6dz8pyZEkL1ruVHO7LsllOy67Osm7u/viJO+enR/RdXng2m5O8qTu/rokf5vkNQc91ASuywPXlaq6IMlzktx50ANN6LrsWFtVfWuSy5N8XXf/lyS/sMgODm3Akzw1ye3d/bHuvjfJW7K18KF1993dfevs9OezFYLzljvVdKrq/CTfnuTaZc8ylap6VJJnJnljknT3vd392eVONam1JA+vqrUkZyb55JLnmUt3vyfJZ3ZcfHmS62enr0/ywgMdaiInW1t3v6u775ud/b9Jzj/wwRa0y79ZkvxiklcnGfZZ1rus7QeTvK67/2W2zT2L7OMwB/y8JHdtO38iKxS6JKmqC5M8Ockty51kUr+UrW+8f1v2IBO6KMlmkt+YPTRwbVU9YtlDTaG7P5Gto4A7k9yd5B+6+13LnWpSX9rddydb/3lO8tglz3O6fF+SP1z2EFOoqhck+UR3f2DZs5wGj0/yzVV1S1X9WVV94yI3dpgDXie5bNj/je1UVWcleWuSV3T355Y9zxSq6vlJ7unuY8ueZWJrSZ6S5A3d/eQk/5Rx74r9D2aPCV+e5HFJzk3yiKp6yXKn4sGoqp/M1kNzNyx7lkVV1ZlJfjLJa5c9y2myluTsbD18+mNJfruqTta6fTnMAT+R5IJt58/PoHft7VRVX5KteN/Q3Tcte54JPSPJC6rq49l6yONZVfWbyx1pEi
eSnOju++8puTFbQV8Fz05yR3dvdve/JrkpyTcteaYpfaqqvjxJZh8XusvysKmqK5I8P8l392q8qMdXZes/kx+Y/Rw5P8mtVfVlS51qOieS3NRb/jJb91TO/SS9wxzwv0pycVU9rqrOyNYTa96x5JkWNvvf1huTHO/u1y97nil192u6+/zuvjBb/15/3N3DH811998nuauqnjC76NIkH17iSFO6M8nTqurM2dfmpVmRJ+jNvCPJFbPTVyR5+xJnmVRVXZbkx5O8oLv//7LnmUJ3f7C7H9vdF85+jpxI8pTZ9+AqeFuSZyVJVT0+yRlZ4F3XDm3AZ0/O+JEkf5StHyi/3d23LXeqSTwjyfdk6+j0/bM/z1v2UOzpZUluqKq/TvINSX5uyfNMYnavwo1Jbk3ywWz9TBjyZSyr6s1J3pvkCVV1oqq+P8nrkjynqj6arWc1v26ZM85rl7X9SpJHJrl59nPk15Y65Bx2WddK2GVtb0py0exXy96S5IpF7jnxUqoAMKBDewQOAOxOwAFgQAIOAAMScAAYkIADwIAEHAAGJOAAMKB/B24h+wUcnnY9AAAAAElFTkSuQmCC\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + } + ], + "source": [ + "m = MCLmap([[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 0, 0, 1, 0],\n", + " [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 0, 1, 1, 0],\n", + " [1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 1, 1, 1, 0, 1, 1, 0],\n", + " [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 0, 1, 1, 0],\n", + " [0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 1, 1, 0],\n", + " [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 0, 1, 1, 0],\n", + " [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 0, 1, 1, 0],\n", + " [0, 0, 1, 1, 1, 1, 1, 0, 0, 0, 1, 1, 1, 0, 1, 1, 0],\n", + " [0, 0, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0],\n", + " [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 0],\n", + " [0, 0, 1, 1, 1, 1, 1, 0, 0, 0, 1, 1, 1, 0, 0, 1, 0]])\n", + "\n", + "heatmap(m.m, cmap='binary')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's define the motion model as a function `P_motion_sample`." + ] + }, + { + "cell_type": "code", + "execution_count": 91, + "metadata": {}, + "outputs": [], + "source": [ + "def P_motion_sample(kin_state, v, w):\n", + " \"\"\"Sample from possible kinematic states.\n", + " Returns from a single element distribution (no uncertainity in motion)\"\"\"\n", + " pos = kin_state[:2]\n", + " orient = kin_state[2]\n", + "\n", + " # for simplicity the robot first rotates and then moves\n", + " orient = (orient + w)%4\n", + " for _ in range(orient):\n", + " v = (v[1], -v[0])\n", + " pos = vector_add(pos, v)\n", + " return pos + (orient,)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Define the sensor model as a function `P_sensor`." + ] + }, + { + "cell_type": "code", + "execution_count": 92, + "metadata": {}, + "outputs": [], + "source": [ + "def P_sensor(x, y):\n", + " \"\"\"Conditional probability for sensor reading\"\"\"\n", + " # Need not be exact probability. 
Can use a scaled value.\n", + " if x == y:\n", + " return 0.8\n", + " elif abs(x - y) <= 2:\n", + " return 0.05\n", + " else:\n", + " return 0" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Initializing variables." + ] + }, + { + "cell_type": "code", + "execution_count": 93, + "metadata": {}, + "outputs": [], + "source": [ + "a = {'v': (0, 0), 'w': 0}\n", + "z = (2, 4, 1, 6)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's run `monte_carlo_localization` with these parameters to find a sample distribution S." + ] + }, + { + "cell_type": "code", + "execution_count": 94, + "metadata": {}, + "outputs": [], + "source": [ + "S = monte_carlo_localization(a, z, 1000, P_motion_sample, P_sensor, m)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's plot the values in the sample distribution `S`." + ] + }, + { + "cell_type": "code", + "execution_count": 95, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "GRID:\n", + " 0 0 12 0 143 14 0 0 0 0 0 0 0 0 0 0 0\n", + " 0 0 0 0 17 52 201 6 0 0 0 0 0 0 0 0 0\n", + " 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0\n", + " 0 0 0 3 5 19 9 3 0 0 0 0 0 0 0 0 0\n", + " 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0\n", + " 0 0 6 166 0 21 0 0 0 0 0 0 0 0 0 0 0\n", + " 0 0 0 1 11 75 0 0 0 0 0 0 0 0 0 0 0\n", + " 73 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0\n", + "124 0 0 0 0 0 0 1 0 3 0 0 0 0 0 0 0\n", + " 0 0 0 14 4 15 1 0 0 0 0 0 0 0 0 0 0\n", + " 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0\n" + ] + }, + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAfAAAAFaCAYAAADhKw9uAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8QZhcZAAATEElEQVR4nO3df6zldX3n8debuSK/F2Swlt+yi7pq2upOjdbU7Qqs+KNis5td7dJg2w1Ju1U0thZtIt1s0pi2cdukjV0WLSQl2i7S6nZbFW271qyLHVBUxFYiCKMIA4aCXSsF3vvHPSS317lzh3u+c858Lo9HMrn3nPO95/P+zNy5z/mee+6Z6u4AAGM5bNkDAACPn4ADwIAEHAAGJOAAMCABB4ABCTgADEjAAWBAAg6HsKq6varOXXfd66vqkxPcd1fVP5v3foDlEHAAGJCAw8Cq6uSq+kBV7a2q26rqjWtue0FVfaqq7q+qu6rqt6rq8Nltn5gddlNVfauq/n1V/UhV7amqt1bVPbOPeU1VvaKq/qaqvllVbz+Q+5/d3lX1xqr6SlXdW1W/VlW+5sBE/GWCQc1i+D+T3JTklCTnJHlTVb1sdsgjSd6cZGeSF81u/9kk6e6XzI75/u4+prt/f3b5aUmOmN3fO5L89yQXJvkXSX44yTuq6qzN7n+NH0uyK8nzk1yQ5Kem2DuQlNdCh0NXVd2e1UA+vObqw5PcmOQtSf5Hd5++5vi3JXlGd//kPu7rTUn+ZXf/2OxyJzm7u2+dXf6RJH+a5JjufqSqjk3yQJIXdvf1s2NuSPJfuvuPDvD+X97dH55d/tkk/6a7z5njtwSYWVn2AMCmXtPdH3vsQlW9Psl/THJGkpOr6v41x+5I8pez456R5F1ZPQM+Kqt/32/YZK37uvuR2fvfnr29e83t305yzOO4/zvXvP/VJCdvsj5wgDyEDuO6M8lt3X38ml/HdvcrZre/O8mXsnqWfVyStyepCdc/kPs/bc37pyf5+oTrwxOagMO4Pp3kgar6xao6sqp2VNVzq+oHZ7c/9hD4t6rqWUl+Zt3H353krGzdZvefJL9QVSdU1WlJLkny+/s4BtgCAYdBzR7q/tEkP5DktiT3JrkiyT+ZHfLzSX48yYNZfTLa+nj+cpKrZs8i/3dbGGGz+0+SD2b1YfXPJvlfSd6zhXWAffAkNuCgWP8kOWBazsABYEACDgAD8hA6AAzIGTgADGihL+Syc+eJfebpp21+4GgefWTzY6Z02I6FLfXQbZ9b2FqHn/Gcha21yN9DgHnc8Jmb7u3uk9Zfv9CAn3n6adn9yY9tfuBg+u8fWOh6dcRxC1vr9gtPWdhaZ1zxhwtbq444fmFrAcyjjj7pq/u63kPoADAgAQeAAQk4AAxIwAFgQAIOAAMScAAYkIADwIAEHAAGJOAAMKC5Al5V51fVX1fVrVV16VRDAQD7t+WAV9WOJL+d5OVJnp3kdVX17KkGAwA2Ns8Z+AuS3NrdX+nuh5K8P8kF04wFAOzPPAE/Jcmday7vmV33j1TVxVW1u6p27733vjmWAwAeM0/Aax/X9Xdd0X15d+/q7l0n7TxxjuUAgMfME/A9Sdb+596nJvn6fOMAAAdinoD/VZKzq+rpVXV4ktcm+dA0YwEA+7Oy1Q/s7oer6ueSfCTJjiTv7e6bJ5sMANjQlgOeJN39J0n+ZKJZAIAD5JXYAGBAAg4AAxJwABiQgAPAgAQcAAYk4AAwIAEHgAHN9XPgrKojjlv2CAfNGZfftLC1+va/XNha//nHL17YWkly2advX9hatfLkha0FLI8zcAAYkIADwIAEHAAGJOAAMCABB4ABCTgADEjAAWBAAg4AAxJwABiQgAPAgAQcAAYk4AAwIAEHgAEJOAAMSMABYEACDgADEnAAGJCAA8CABBwABiTgADAgAQeAAQk4AAxIwAFgQAIOAAMScAAYkIADwIBWlj0Ah7Y6aufi1nrWjy5
srV++8a6FrQVwMDgDB4ABCTgADEjAAWBAAg4AAxJwABiQgAPAgAQcAAYk4AAwIAEHgAEJOAAMaMsBr6rTqurPq+qWqrq5qi6ZcjAAYGPzvBb6w0ne0t03VtWxSW6oquu6+4sTzQYAbGDLZ+DdfVd33zh7/8EktyQ5ZarBAICNTfI98Ko6M8nzkly/j9surqrdVbV77733TbEcADzhzR3wqjomyQeSvKm7H1h/e3df3t27unvXSTtPnHc5ACBzBryqnpTVeF/d3ddOMxIAsJl5noVeSd6T5Jbuftd0IwEAm5nnDPzFSX4iyUur6rOzX6+YaC4AYD+2/GNk3f3JJDXhLADAAfJKbAAwIAEHgAEJOAAMSMABYEACDgADEnAAGJCAA8CABBwABjTP/wfOkvSjjyxwsQWu9e37F7fW4Ucvbq0kWTliYUvVYTsWthawPM7AAWBAAg4AAxJwABiQgAPAgAQcAAYk4AAwIAEHgAEJOAAMSMABYEACDgADEnAAGJCAA8CABBwABiTgADAgAQeAAQk4AAxIwAFgQAIOAAMScAAYkIADwIAEHAAGJOAAMCABB4ABCTgADEjAAWBAAg4AA1pZ9gA8fnXYjgWutsC1jnnq4tYCGJwzcAAYkIADwIAEHAAGJOAAMCABB4ABCTgADEjAAWBAAg4AAxJwABiQgAPAgOYOeFXtqKrPVNUfTzEQALC5Kc7AL0lyywT3AwAcoLkCXlWnJnllkiumGQcAOBDznoH/RpK3Jnl0owOq6uKq2l1Vu/fee9+cywEAyRwBr6pXJbmnu2/Y33HdfXl37+ruXSftPHGrywEAa8xzBv7iJK+uqtuTvD/JS6vq9yaZCgDYry0HvLvf1t2ndveZSV6b5M+6+8LJJgMANuTnwAFgQCtT3El3/0WSv5jivgCAzTkDB4ABCTgADEjAAWBAAg4AAxJwABiQgAPAgAQcAAY0yc+BP9H1w99Z6HrXvfL0ha31rz9y98LW6ge/sbC16tinLWwtgIPBGTgADEjAAWBAAg4AAxJwABiQgAPAgAQcAAYk4AAwIAEHgAEJOAAMSMABYEACDgADEnAAGJCAA8CABBwABiTgADAgAQeAAQk4AAxIwAFgQAIOAAMScAAYkIADwIAEHAAGJOAAMCABB4ABCTgADEjAAWBAAg4AA1pZ9gDbQa08eaHrnffhbyxsrf7Og4tb6//8t4WtVS+7bGFrARwMzsABYEACDgADEnAAGJCAA8CABBwABiTgADAgAQeAAQk4AAxIwAFgQHMFvKqOr6prqupLVXVLVb1oqsEAgI3N+1Kqv5nkw939b6vq8CRHTTATALCJLQe8qo5L8pIkr0+S7n4oyUPTjAUA7M88D6GflWRvkt+tqs9U1RVVdfT6g6rq4qraXVW799573xzLAQCPmSfgK0men+Td3f28JH+X5NL1B3X35d29q7t3nbTzxDmWAwAeM0/A9yTZ093Xzy5fk9WgAwAH2ZYD3t3fSHJnVT1zdtU5Sb44yVQAwH7N+yz0NyS5evYM9K8k+cn5RwIANjNXwLv7s0l2TTQLAHCAvBIbAAxIwAFgQAIOAAMScAAYkIADwIAEHAAGJOAAMCABB4ABzftKbI/PA3fl0Y/9ykKWOuzcty9knWWoqsUt9uRjF7ZUveyyha3FNLp7YWst9PMeBuAMHAAGJOAAMCABB4ABCTgADEjAAWBAAg4AAxJwABiQgAPAgAQcAAYk4AAwIAEHgAEJOAAMSMABYEACDgADEnAAGJCAA8CABBwABiTgADAgAQeAAQk4AAxIwAFgQAIOAAMScAAYkIADwIAEHAAGJOAAMKCVRS72yAPfzLc+evVC1jru3LcvZB04EN29sLWqamFrpR9d3Fq1Y3FrwQCcgQPAgAQcAAYk4AAwIAEHgAEJOAAMSMABYEACDgADEnAAGJCAA8CABBwABjRXwKvqzVV1c1V9oareV1VHTDUYALCxLQe8qk5J8sYku7r7uUl2JHntVIM
BABub9yH0lSRHVtVKkqOSfH3+kQCAzWw54N39tSS/nuSOJHcl+dvu/uj646rq4qraXVW77/v2Av/nIgDYxuZ5CP2EJBckeXqSk5McXVUXrj+uuy/v7l3dvevEIz1nDgCmME9Rz01yW3fv7e5/SHJtkh+aZiwAYH/mCfgdSV5YVUdVVSU5J8kt04wFAOzPPN8Dvz7JNUluTPL52X1dPtFcAMB+rMzzwd19WZLLJpoFADhAnlUGAAMScAAYkIADwIAEHAAGJOAAMCABB4ABCTgADGiunwN/vHac+pwc96sfW+SS21L//f2LW+xJRy9urYceXNxaR5ywuLWSrL5Y4fZTh+1Y9gjwhOUMHAAGJOAAMCABB4ABCTgADEjAAWBAAg4AAxJwABiQgAPAgAQcAAYk4AAwIAEHgAEJOAAMSMABYEACDgADEnAAGJCAA8CABBwABiTgADAgAQeAAQk4AAxIwAFgQAIOAAMScAAYkIADwIAEHAAGJOAAMCABB4ABrSx7AB6/OuL4ZY9wcBz5lGVPADAMZ+AAMCABB4ABCTgADEjAAWBAAg4AAxJwABiQgAPAgAQcAAYk4AAwoE0DXlXvrap7quoLa657SlVdV1Vfnr094eCOCQCsdSBn4FcmOX/ddZcm+Xh3n53k47PLAMCCbBrw7v5Ekm+uu/qCJFfN3r8qyWsmngsA2I+tfg/8e7r7riSZvX3qRgdW1cVVtbuqdu+9974tLgcArHXQn8TW3Zd3967u3nXSzhMP9nIA8ISw1YDfXVXfmySzt/dMNxIAsJmtBvxDSS6avX9Rkg9OMw4AcCAO5MfI3pfkU0meWVV7quqnk7wzyXlV9eUk580uAwALsrLZAd39ug1uOmfiWQCAA+SV2ABgQAIOAAMScAAYkIADwIAEHAAGJOAAMCABB4ABCTgADKi6e3GLVe1N8tXH+WE7k9x7EMY5FGzXvW3XfSXbd2/bdV/J9t3bdt1Xsn33ttV9ndHdJ62/cqEB34qq2t3du5Y9x8GwXfe2XfeVbN+9bdd9Jdt3b9t1X8n23dvU+/IQOgAMSMABYEAjBPzyZQ9wEG3XvW3XfSXbd2/bdV/J9t3bdt1Xsn33Num+DvnvgQMA322EM3AAYB0BB4ABHdIBr6rzq+qvq+rWqrp02fNMoapOq6o/r6pbqurmqrpk2TNNrap2VNVnquqPlz3LVKrq+Kq6pqq+NPuze9GyZ5pKVb159rn4hap6X1UdseyZtqKq3ltV91TVF9Zc95Squq6qvjx7e8IyZ9yqDfb2a7PPx89V1R9W1fHLnHEr9rWvNbf9fFV1Ve1cxmzz2mhvVfWGWddurqpfnWeNQzbgVbUjyW8neXmSZyd5XVU9e7lTTeLhJG/p7n+e5IVJ/tM22ddalyS5ZdlDTOw3k3y4u5+V5PuzTfZXVackeWOSXd393CQ7krx2uVNt2ZVJzl933aVJPt7dZyf5+OzyiK7Md+/tuiTP7e7vS/I3Sd626KEmcGW+e1+pqtOSnJfkjkUPNKErs25vVfWvklyQ5Pu6+zlJfn2eBQ7ZgCd5QZJbu/sr3f1QkvdndeND6+67uvvG2fsPZjUEpyx3qulU1alJXpnkimXPMpWqOi7JS5K8J0m6+6Huvn+5U01qJcmRVbWS5KgkX1/yPFvS3Z9I8s11V1+Q5KrZ+1clec1Ch5rIvvbW3R/t7odnF/9vklMXPticNvgzS5L/muStSYZ9lvUGe/uZJO/s7u/MjrlnnjUO5YCfkuTONZf3ZBuFLkmq6swkz0ty/XInmdRvZPUv3qPLHmRCZyXZm+R3Z98auKKqjl72UFPo7q9l9SzgjiR3Jfnb7v7ocqea1Pd0913J6j+ekzx1yfMcLD+V5E+XPcQUqurVSb7W3Tcte5aD4BlJfriqrq+q/11VPzjPnR3KAa99XDfsv8bWq6pjknwgyZu6+4FlzzOFqnpVknu6+4ZlzzKxlSTPT/Lu7n5ekr/LuA/F/iOz7wlfkOTpSU5
OcnRVXbjcqXg8quqXsvqtuauXPcu8quqoJL+U5B3LnuUgWUlyQla/ffoLSf6gqvbVugNyKAd8T5LT1lw+NYM+tLdeVT0pq/G+uruvXfY8E3pxkldX1e1Z/ZbHS6vq95Y70iT2JNnT3Y89UnJNVoO+HZyb5Lbu3tvd/5Dk2iQ/tOSZpnR3VX1vkszezvWQ5aGmqi5K8qok/6G3x4t6/NOs/mPyptnXkVOT3FhVT1vqVNPZk+TaXvXprD5SueUn6R3KAf+rJGdX1dOr6vCsPrHmQ0ueaW6zf229J8kt3f2uZc8zpe5+W3ef2t1nZvXP68+6e/izue7+RpI7q+qZs6vOSfLFJY40pTuSvLCqjpp9bp6TbfIEvZkPJblo9v5FST64xFkmVVXnJ/nFJK/u7v+37Hmm0N2f7+6ndveZs68je5I8f/Z3cDv4oyQvTZKqekaSwzPH/7p2yAZ89uSMn0vykax+QfmD7r55uVNN4sVJfiKrZ6efnf16xbKHYlNvSHJ1VX0uyQ8k+ZUlzzOJ2aMK1yS5Mcnns/o1YciXsayq9yX5VJJnVtWeqvrpJO9Mcl5VfTmrz2p+5zJn3KoN9vZbSY5Nct3s68jvLHXILdhgX9vCBnt7b5KzZj9a9v4kF83zyImXUgWAAR2yZ+AAwMYEHAAGJOAAMCABB4ABCTgADEjAAWBAAg4AA/r/85kBLqIO9qEAAAAASUVORK5CYII=\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + } + ], + "source": [ + "grid = [[0]*17 for _ in range(11)]\n", + "for x, y, _ in S:\n", + " if 0 <= x < 11 and 0 <= y < 17:\n", + " grid[x][y] += 1\n", + "print(\"GRID:\")\n", + "print_table(grid)\n", + "heatmap(grid, cmap='Oranges')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The distribution is highly concentrated at `(5, 3)`, but the robot is not very confident about its position as some other cells also have high probability values." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's look at another scenario." + ] + }, + { + "cell_type": "code", + "execution_count": 96, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "GRID:\n", + "0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0\n", + "0 0 0 0 0 0 0 0 1000 0 0 0 0 0 0 0 0\n", + "0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0\n", + "0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0\n", + "0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0\n", + "0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0\n", + "0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0\n", + "0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0\n", + "0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0\n", + "0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0\n", + "0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0\n" + ] + }, + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAfAAAAFaCAYAAADhKw9uAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8QZhcZAAARl0lEQVR4nO3df6zld13n8dd7OzbQFpbaKUp/YOluwWWJSnckIJF1KWQLshSzmxV2MUXdNNEVCkGxaIIkm2zIalhNNJhuwTaxAd1SBV1FKv5gSdjqtFChFKWh0A5UOlOCoGu2Ft/7xz01l8vcucM9Z+bM+/J4JJN7fnzv+b4/nbn3eb/fc+5pdXcAgFn+0boHAAC+dgIOAAMJOAAMJOAAMJCAA8BAAg4AAwk4AAwk4HAKq6pPVdXzttz2iqr6wAoeu6vqny77OMB6CDgADCTgMFhVnVdV76yqw1V1T1W9atN9z6iqD1bVF6rq/qr6xao6fXHf+xeb3VFVf11V319V31NVh6rqdVX1wOJzXlJVL6yqv6iqz1fVTx3P4y/u76p6VVV9sqqOVNXPVpXvObAivphgqEUMfyvJHUnOT3JZkldX1b9ebPLlJK9Jsj/Jsxb3/2iSdPdzFtt8e3ef1d2/trj+zUketXi8NyT5H0lenuRfJPnuJG+oqot3evxNvi/JgSSXJrkiyQ+tYu1AUt4LHU5dVfWpbATy4U03n57k9iSvTfI/u/uJm7Z/fZInd/cPHuWxXp3kX3b39y2ud5JLuvvuxfXvSfK7Sc7q7i9X1WOSfDHJM7v71sU2tyX5L939m8f5+C/o7vcsrv9okn/b3Zct8Z8EWNi37gGAHb2ku3//kStV9Yok/ynJtyQ5r6q+sGnb05L878V2T07y5mwcAZ+Rja/323bY14Pd/eXF5b9dfPzcpvv/NslZX8Pj37fp8qeTnLfD/oHj5BQ6zHVfknu6+3Gb/jymu1+4uP8tST6ejaPsxyb5qSS1wv0fz+NfuOnyE5N8doX7h69rAg5z/UmSL1bVT1bVo6vqtKp6WlV95+L+R06B/3VVfWuSH9ny+Z9LcnF2b6fHT5KfqKqzq+rCJFcn+bWjbAPsgoDDUItT3f8myXckuSfJkSTXJfnHi01+PMl/SPKlbLwYbWs835jkhsWryP/9LkbY6fGT5F3ZOK3+4ST/K8lbd7Ef4Ci8iA04Iba+SA5YLUfgADCQgAPAQE6hA8BAjsABYKCT+kYu+/ef0xc98cKdNwQAkiS3feiOI9197tbbT2rAL3rihTn4gd/feUMAIElSZ5776aPd7hQ6AAwk4AAwkIADwEACDgADCTgADCTgADCQgAPAQAIOAAMJOAAMtFTAq+ryqvrzqrq7qq5Z1VAAwLHtOuBVdVqSX0rygiRPTfKyqnrqqgYDALa3zBH4M5Lc3d2f7O6HkrwjyRWrGQsAOJZlAn5+kvs2XT+0uO0rVNVVVXWwqg4ePvLgErsDAB6xTMDrKLf1V93QfW13H+juA+fuP2eJ3QEAj1gm4IeSbP6fe1+Q5LPLjQMAHI9lAv6nSS6pqidV1elJXprk3asZCwA4ln27/cTufriqfizJ7yU5LcnbuvvOlU0GAGxr1wFPku7+nSS/s6JZAIDj5J3YAGAgAQeAgQQcAAYScAAYSMABYCABB4CBBBwABlrq98CBU88bL33CydvX7feftH0BX8kROAAMJOAAMJCAA8BAAg4AAwk4AAwk4AAwkIADwEACDgADCTgADCTgADCQgAPAQAIOAAMJOAAMJOAAMJCAA8BAAg4AAwk4AAwk4AAwkIADwEACDgADCTgADCTgADCQgAPAQAIOAAMJOAAMJOAAMNC+dQ8ArNYbb79/3SMAJ4EjcAAYSMABYCABB4CBBBwABhJwABhIwAFgIAEHgIEEHAAGEnAAGEjAAWCgXQe8qi6sqj+sqruq6s6qunqVgwEA21vmvdAfTvLa7r69qh6T5LaquqW
7P7ai2QCAbez6CLy77+/u2xeXv5TkriTnr2owAGB7K3kOvKouSvL0JLce5b6rqupgVR08fOTBVewOAL7uLR3wqjoryTuTvLq7v7j1/u6+trsPdPeBc/efs+zuAIAsGfCq+oZsxPvG7r55NSMBADtZ5lXoleStSe7q7jevbiQAYCfLHIE/O8kPJHluVX148eeFK5oLADiGXf8aWXd/IEmtcBYA4Dh5JzYAGEjAAWAgAQeAgQQcAAYScAAYSMABYCABB4CBBBwABhJwABhIwAFgIAEHgIEEHAAGEnAAGEjAAWAgAQeAgQQcAAYScAAYSMABYCABB4CBBBwABhJwABhIwAFgIAEHgIEEHAAGEnAAGEjAAWAgAQeAgQQcAAYScAAYSMABYCABB4CBBBwABhJwABhIwAFgIAEHgIEEHAAGEnAAGEjAAWAgAQeAgQQcAAYScAAYSMABYCABB4CBBBwABhJwABhIwAFgoKUDXlWnVdWHquq3VzEQALCzVRyBX53krhU8DgBwnJYKeFVdkOR7k1y3mnEAgOOx7BH4zyd5XZK/326Dqrqqqg5W1cHDRx5ccncAQLJEwKvqRUke6O7bjrVdd1/b3Qe6+8C5+8/Z7e4AgE2WOQJ/dpIXV9WnkrwjyXOr6ldXMhUAcEy7Dnh3v767L+jui5K8NMkfdPfLVzYZALAtvwcOAAPtW8WDdPcfJfmjVTwWALAzR+AAMJCAA8BAAg4AAwk4AAwk4AAwkIADwEACDgADCTgADCTgADCQgAPAQAIOAAMJOAAMJOAAMJCAA8BAAg4AAwk4AAwk4AAwkIADwEACDgADCTgADCTgADCQgAPAQAIOAAMJOAAMJOAAMJCAA8BAAg4AAwk4AAwk4AAwkIADwEACDgADCTgADCTgADCQgAPAQAIOAAMJOAAMJOAAMJCAA8BAAg4AAwk4AAwk4AAwkIADwEACDgADCTgADCTgADDQUgGvqsdV1U1V9fGququqnrWqwQCA7e1b8vN/Icl7uvvfVdXpSc5YwUwAwA52HfCqemyS5yR5RZJ090NJHlrNWADAsSxzCv3iJIeT/EpVfaiqrquqM7duVFVXVdXBqjp4+MiDS+wOAHjEMgHfl+TSJG/p7qcn+Zsk12zdqLuv7e4D3X3g3P3nLLE7AOARywT8UJJD3X3r4vpN2Qg6AHCC7Trg3f2XSe6rqqcsbrosycdWMhUAcEzLvgr9lUluXLwC/ZNJfnD5kQCAnSwV8O7+cJIDK5oFADhO3okNAAYScAAYSMABYCABB4CBBBwABhJwABhIwAFgIAEHgIEEHAAGEnAAGEjAAWAgAQeAgQQcAAYScAAYSMABYCABB4CBBBwABhJwABhIwAFgIAEHgIEEHAAGEnAAGEjAAWAgAQeAgQQcAAYScAAYSMABYCABB4CBBBwABhJwABhIwAFgIAEHgIEEHAAGEnAAGEjAAWAgAQeAgQQcAAYScAAYSMABYCABB4CBBBwABhJwABhIwAFgIAEHgIEEHAAGEnAAGGipgFfVa6rqzqr6aFW9vaoetarBAIDt7TrgVXV+klclOdDdT0tyWpKXrmowAGB7y55C35fk0VW1L8kZST67/EgAwE52HfDu/kySn0tyb5L7k/xVd79363ZVdVVVHayqg4ePPLj7SQGAf7DMKfSzk1yR5ElJzktyZlW9fOt23X1tdx/o7gPn7j9n95MCAP9gmVPoz0tyT3cf7u6/S3Jzku9azVgAwLEsE/B7kzyzqs6oqkpyWZK7VjMWAHAsyzwHfmuSm5LcnuQji8e6dkVzAQDHsG+ZT+7un0nyMyuaBQA4Tt6JDQAGEnAAGEjAAWAgAQeAgQQcAAYScAAYSMABYCABB4CBBBwABhJwABhIwAFgIAEHgIEEHAAGEnAAGEjAAWAgAQeAgQQcAAYScAAYSMABYCABB4CBBBwABhJwABhIwAFgIAEHgIEEHAAGEnAAGEjAAWAgAQeAgQQcAAYScAAYSMABYCA
BB4CBBBwABhJwABhIwAFgIAEHgIEEHAAGEnAAGEjAAWAgAQeAgQQcAAYScAAYSMABYCABB4CBBBwABtox4FX1tqp6oKo+uum2b6yqW6rqE4uPZ5/YMQGAzY7nCPz6JJdvue2aJO/r7kuSvG9xHQA4SXYMeHe/P8nnt9x8RZIbFpdvSPKSFc8FABzDbp8D/6buvj9JFh8fv92GVXVVVR2sqoOHjzy4y90BAJud8Bexdfe13X2guw+cu/+cE707APi6sNuAf66qnpAki48PrG4kAGAnuw34u5Ncubh8ZZJ3rWYcAOB4HM+vkb09yQeTPKWqDlXVDyd5U5LnV9Unkjx/cR0AOEn27bRBd79sm7suW/EsAMBx8k5sADCQgAPAQAIOAAMJOAAMJOAAMJCAA8BAAg4AAwk4AAxU3X3ydlZ1OMmnv8ZP25/kyAkY51SwV9e2V9eV7N217dV1JXt3bXt1XcneXdtu1/Ut3X3u1htPasB3o6oOdveBdc9xIuzVte3VdSV7d217dV3J3l3bXl1XsnfXtup1OYUOAAMJOAAMNCHg1657gBNor65tr64r2btr26vrSvbu2vbqupK9u7aVruuUfw4cAPhqE47AAYAtBBwABjqlA15Vl1fVn1fV3VV1zbrnWYWqurCq/rCq7qqqO6vq6nXPtGpVdVpVfaiqfnvds6xKVT2uqm6qqo8v/u6ete6ZVqWqXrP4t/jRqnp7VT1q3TPtRlW9raoeqKqPbrrtG6vqlqr6xOLj2euccbe2WdvPLv49/llV/UZVPW6dM+7G0da16b4fr6quqv3rmG1Z262tql656NqdVfXfltnHKRvwqjotyS8leUGSpyZ5WVU9db1TrcTDSV7b3f8syTOT/Oc9sq7Nrk5y17qHWLFfSPKe7v7WJN+ePbK+qjo/yauSHOjupyU5LclL1zvVrl2f5PItt12T5H3dfUmS9y2uT3R9vnpttyR5Wnd/W5K/SPL6kz3UClyfr15XqurCJM9Pcu/JHmiFrs+WtVXVv0pyRZJv6+5/nuTnltnBKRvwJM9Icnd3f7K7H0ryjmwsfLTuvr+7b19c/lI2QnD+eqdanaq6IMn3Jrlu3bOsSlU9Nslzkrw1Sbr7oe7+wnqnWql9SR5dVfuSnJHks2ueZ1e6+/1JPr/l5iuS3LC4fEOSl5zUoVbkaGvr7vd298OLq/8nyQUnfbAlbfN3liT/Pcnrkox9lfU2a/uRJG/q7v+32OaBZfZxKgf8/CT3bbp+KHsodElSVRcleXqSW9c7yUr9fDa+8P5+3YOs0MVJDif5lcVTA9dV1ZnrHmoVuvsz2TgKuDfJ/Un+qrvfu96pVuqbuvv+ZOOH5ySPX/M8J8oPJfnddQ+xClX14iSf6e471j3LCfDkJN9dVbdW1R9X1Xcu82CncsDrKLeN/Wlsq6o6K8k7k7y6u7+47nlWoapelOSB7r5t3bOs2L4klyZ5S3c/PcnfZO6p2K+weE74iiRPSnJekjOr6uXrnYqvRVX9dDaemrtx3bMsq6rOSPLTSd6w7llOkH1Jzs7G06c/keTXq+porTsup3LADyW5cNP1CzL01N5WVfUN2Yj3jd1987rnWaFnJ3lxVX0qG095PLeqfnW9I63EoSSHuvuRMyU3ZSPoe8HzktzT3Ye7+++S3Jzku9Y80yp9rqqekCSLj0udsjzVVNWVSV6U5D/23nhTj3+SjR8m71h8H7kgye1V9c1rnWp1DiW5uTf8STbOVO76RXqncsD/NMklVfWkqjo9Gy+sefeaZ1ra4qettya5q7vfvO55Vqm7X9/dF3T3Rdn4+/qD7h5/NNfdf5nkvqp6yuKmy5J8bI0jrdK9SZ5ZVWcs/m1elj3yAr2Fdye5cnH5yiTvWuMsK1VVlyf5ySQv7u7/u+55VqG7P9Ldj+/uixbfRw4luXTxNbgX/GaS5yZJVT05yelZ4v+6dsoGfPHijB9L8nvZ+Iby691953qnWolnJ/mBbBydfnjx54X
rHoodvTLJjVX1Z0m+I8l/XfM8K7E4q3BTktuTfCQb3xNGvo1lVb09yQeTPKWqDlXVDyd5U5LnV9UnsvGq5jetc8bd2mZtv5jkMUluWXwf+eW1DrkL26xrT9hmbW9LcvHiV8vekeTKZc6ceCtVABjolD0CBwC2J+AAMJCAA8BAAg4AAwk4AAwk4AAwkIADwED/H3ZBvi8oWJldAAAAAElFTkSuQmCC\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + } + ], + "source": [ + "a = {'v': (0, 1), 'w': 0}\n", + "z = (2, 3, 5, 7)\n", + "S = monte_carlo_localization(a, z, 1000, P_motion_sample, P_sensor, m, S)\n", + "grid = [[0]*17 for _ in range(11)]\n", + "for x, y, _ in S:\n", + " if 0 <= x < 11 and 0 <= y < 17:\n", + " grid[x][y] += 1\n", + "print(\"GRID:\")\n", + "print_table(grid)\n", + "heatmap(grid, cmap='Oranges')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In this case, the robot is 99.9% certain that it is at position `(6, 7)`." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## DECISION THEORETIC AGENT\n", + "We now move into the domain of probabilistic decision making.\n", + "
    \n", +    "To make choices between different possible plans in a certain situation in a given environment, an agent must have _preferences_ among the possible outcomes of the various plans.\n", +    "
    \n", +    "__Utility theory__ is used to represent and reason with preferences.\n", +    "The agent prefers states with a higher _utility_.\n", +    "While constructing multi-agent systems, one major element in the design is the mechanism the agents use for making decisions about which actions to adopt in order to achieve their goals.\n", +    "What is usually required is a mechanism which ensures that the actions adopted lead to benefits both for individual agents and for the community of which they are part.\n", +    "The utility of a state is _relative_ to an agent.\n", +    "
    \n", + "Preferences, as expressed by utilities, are combined with probabilities in the general theory of rational decisions called __decision theory__.\n", + "
    \n", + "An agent is said to be _rational_ if and only if it chooses the action that yields the highest expected utility, averaged over all the possible outcomes of the action." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here we'll see how a decision-theoretic agent is implemented in the module." + ] + }, + { + "cell_type": "code", + "execution_count": 97, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def DTAgentProgram(belief_state):\n",
    +       "    """A decision-theoretic agent. [Figure 13.1]"""\n",
    +       "    def program(percept):\n",
    +       "        belief_state.observe(program.action, percept)\n",
    +       "        program.action = argmax(belief_state.actions(),\n",
    +       "                                key=belief_state.expected_outcome_utility)\n",
    +       "        return program.action\n",
    +       "    program.action = None\n",
    +       "    return program\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(DTAgentProgram)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The `DTAgentProgram` function is pretty self-explanatory.\n", + "
    \n", +    "It encapsulates a function `program` that takes in an observation or a `percept`, updates its `belief_state` and returns the action that maximizes the `expected_outcome_utility`."    ]   },   {    "cell_type": "markdown",    "metadata": {},    "source": [     "## INFORMATION GATHERING AGENT\n", +    "Before we discuss what an information gathering agent is, we'll need to know what decision networks are.\n", +    "For an agent in an environment, a decision network represents information about the agent's current state, its possible actions, the state that will result from the agent's action, and the utility of that state.\n", +    "Decision networks have three primary kinds of nodes:\n", +    "1. __Chance nodes__: These represent random variables, just like in Bayesian networks.\n", +    "2. __Decision nodes__: These represent points where the decision maker has a choice between different actions and tries to find the optimal decision at these nodes with regard to the cost, safety and resulting utility.\n", +    "3. __Utility nodes__: These represent the agent's utility function.\n", +    "A description of the agent's utility as a function is associated with a utility node.\n", +    "
    \n", + "
    \n", +    "To evaluate a decision network, we do the following:\n", +    "1. Initialize the evidence variables according to the current state.\n", +    "2. For each possible value of the decision node, calculate the posterior probabilities and the utility resulting from that action.\n", +    "3. Return the action with the highest utility.\n", +    "
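The three evaluation steps above can be sketched with a hand-rolled toy example (this is a standalone illustration, not the module's `DecisionNetwork` API; the umbrella scenario, probabilities and utilities are made up):

```python
# Toy decision-network evaluation: one decision node with two actions,
# one chance node (weather) and a utility table U(action, weather).

def evaluate_decision(actions, p_weather, utility):
    """Return the action with the highest expected utility, and that utility."""
    best_action, best_eu = None, float('-inf')
    for action in actions:                       # step 2: each decision value...
        eu = sum(p * utility[(action, w)]        # ...average utility over outcomes
                 for w, p in p_weather.items())
        if eu > best_eu:
            best_action, best_eu = action, eu
    return best_action, best_eu                  # step 3: best action

p_weather = {'rain': 0.3, 'sun': 0.7}            # step 1: posterior given evidence
utility = {('take_umbrella', 'rain'): 70, ('take_umbrella', 'sun'): 50,
           ('leave_umbrella', 'rain'): 0, ('leave_umbrella', 'sun'): 100}

print(evaluate_decision(['take_umbrella', 'leave_umbrella'], p_weather, utility))
```

With these numbers the expected utilities are 56 for taking the umbrella and 70 for leaving it, so the network recommends leaving it behind.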
    \n", + "Let's have a look at the implementation of the `DecisionNetwork` class." + ] + }, + { + "cell_type": "code", + "execution_count": 98, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    class DecisionNetwork(BayesNet):\n",
    +       "    """An abstract class for a decision network as a wrapper for a BayesNet.\n",
    +       "    Represents an agent's current state, its possible actions, reachable states\n",
    +       "    and utilities of those states."""\n",
    +       "\n",
    +       "    def __init__(self, action, infer):\n",
    +       "        """action: a single action node\n",
    +       "        infer: the preferred method to carry out inference on the given BayesNet"""\n",
    +       "        super(DecisionNetwork, self).__init__()\n",
    +       "        self.action = action\n",
    +       "        self.infer = infer\n",
    +       "\n",
    +       "    def best_action(self):\n",
    +       "        """Return the best action in the network"""\n",
    +       "        return self.action\n",
    +       "\n",
    +       "    def get_utility(self, action, state):\n",
    +       "        """Return the utility for a particular action and state in the network"""\n",
    +       "        raise NotImplementedError\n",
    +       "\n",
    +       "    def get_expected_utility(self, action, evidence):\n",
    +       "        """Compute the expected utility given an action and evidence"""\n",
    +       "        u = 0.0\n",
    +       "        prob_dist = self.infer(action, evidence, self).prob\n",
    +       "        for item, _ in prob_dist.items():\n",
    +       "            u += prob_dist[item] * self.get_utility(action, item)\n",
    +       "\n",
    +       "        return u\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(DecisionNetwork)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The `DecisionNetwork` class inherits from `BayesNet` and has a few extra helper methods.\n", + "
    \n", + "`best_action` returns the best action in the network.\n", + "
    \n", + "`get_utility` is an abstract method which is supposed to return the utility of a particular action and state in the network.\n", + "
    \n", + "`get_expected_utility` computes the expected utility, given an action and evidence.\n", + "
    " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Before we proceed, we need to know a few more terms.\n", + "
    \n", + "Having __perfect information__ refers to a state of being fully aware of the current state, the cost functions and the outcomes of actions.\n", + "This in turn allows an agent to find the exact utility value of each state.\n", + "If an agent has perfect information about the environment, maximum expected utility calculations are exact and can be computed with absolute certainty.\n", + "
    \n", + "In decision theory, the __value of perfect information__ (VPI) is the price that an agent would be willing to pay in order to gain access to _perfect information_.\n", + "VPI calculations are extensively used to calculate expected utilities for nodes in a decision network.\n", + "
    \n", +    "For a random variable $E_j$ whose value is currently unknown, the value of discovering $E_j$, given current information $e$, must average over all possible values $e_{jk}$ that we might discover for $E_j$, using our _current_ beliefs about its value.\n", +    "The VPI of $E_j$ is then given by:\n", +    "
    \n", + "
    \n", + "$$VPI_e(E_j) = \\left(\\sum_{k}P(E_j=e_{jk}\\ |\\ e) EU(\\alpha_{e_{jk}}\\ |\\ e, E_j=e_{jk})\\right) - EU(\\alpha\\ |\\ e)$$\n", + "
    \n", +    "VPI is _non-negative_, _non-additive_ and _order-independent_."    ]   },   {    "cell_type": "markdown",    "metadata": {},    "source": [     "An information gathering agent is an agent with certain properties that explores decision networks as and when required, with heuristics driven by VPI calculations of nodes.\n", +    "A sensible agent should ask questions in a reasonable order, should avoid asking irrelevant questions, should take into account the importance of each piece of information in relation to its cost and should stop asking questions when that is appropriate.\n", +    "_VPI_ is used as the primary heuristic to consider all these points in an information gathering agent, as the agent ultimately wants to maximize the utility and needs to find the optimal cost and extent of finding the required information.\n", +    "
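The VPI formula above can be checked numerically on a toy umbrella problem (all numbers here are invented for illustration; $E_j$ is the weather and $\alpha$ is the best action under current beliefs):

```python
# Numeric sketch of VPI_e(E_j) = sum_k P(e_jk | e) EU(alpha_{e_jk} | e, e_jk) - EU(alpha | e)

p = {'rain': 0.3, 'sun': 0.7}                    # P(E_j = e_jk | e)
utility = {('take', 'rain'): 70, ('take', 'sun'): 50,
           ('leave', 'rain'): 0, ('leave', 'sun'): 100}
actions = ['take', 'leave']

def eu(action, dist):
    """Expected utility of an action under a belief distribution."""
    return sum(pr * utility[(action, w)] for w, pr in dist.items())

# EU(alpha | e): best we can do with only current beliefs
eu_alpha = max(eu(a, p) for a in actions)

# sum_k P(e_jk | e) * EU(alpha_{e_jk} | e, E_j = e_jk): best action per outcome
eu_informed = sum(pr * max(utility[(a, w)] for a in actions)
                  for w, pr in p.items())

vpi = eu_informed - eu_alpha
print(vpi)  # 91.0 - 70.0 = 21.0
```

Knowing the weather lets the agent switch actions per outcome, so the perfect information is worth 21 utility units here; as the text notes, this quantity is always non-negative.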
    \n", + "As an overview, an information gathering agent works by repeatedly selecting the observations with the highest information value, until the cost of the next observation is greater than its expected benefit.\n", + "
    \n", + "The `InformationGatheringAgent` class is an abstract class that inherits from `Agent` and works on the principles discussed above.\n", + "Let's have a look.\n", + "
    " + ] + }, + { + "cell_type": "code", + "execution_count": 99, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    class InformationGatheringAgent(Agent):\n",
    +       "    """A simple information gathering agent. The agent works by repeatedly selecting\n",
    +       "    the observation with the highest information value, until the cost of the next\n",
    +       "    observation is greater than its expected benefit. [Figure 16.9]"""\n",
    +       "\n",
    +       "    def __init__(self, decnet, infer, initial_evidence=None):\n",
    +       "        """decnet: a decision network\n",
    +       "        infer: the preferred method to carry out inference on the given decision network\n",
    +       "        initial_evidence: initial evidence"""\n",
    +       "        self.decnet = decnet\n",
    +       "        self.infer = infer\n",
    +       "        self.observation = initial_evidence or []\n",
    +       "        self.variables = self.decnet.nodes\n",
    +       "\n",
    +       "    def integrate_percept(self, percept):\n",
    +       "        """Integrate the given percept into the decision network"""\n",
    +       "        raise NotImplementedError\n",
    +       "\n",
    +       "    def execute(self, percept):\n",
    +       "        """Execute the information gathering algorithm"""\n",
    +       "        self.observation = self.integrate_percept(percept)\n",
    +       "        vpis = self.vpi_cost_ratio(self.variables)\n",
    +       "        j = argmax(vpis)\n",
    +       "        variable = self.variables[j]\n",
    +       "\n",
    +       "        if self.vpi(variable) > self.cost(variable):\n",
    +       "            return self.request(variable)\n",
    +       "\n",
    +       "        return self.decnet.best_action()\n",
    +       "\n",
    +       "    def request(self, variable):\n",
    +       "        """Return the value of the given random variable as the next percept"""\n",
    +       "        raise NotImplementedError\n",
    +       "\n",
    +       "    def cost(self, var):\n",
    +       "        """Return the cost of obtaining evidence through tests, consultants or questions"""\n",
    +       "        raise NotImplementedError\n",
    +       "\n",
    +       "    def vpi_cost_ratio(self, variables):\n",
    +       "        """Return the VPI to cost ratio for the given variables"""\n",
    +       "        v_by_c = []\n",
    +       "        for var in variables:\n",
    +       "            v_by_c.append(self.vpi(var) / self.cost(var))\n",
    +       "        return v_by_c\n",
    +       "\n",
    +       "    def vpi(self, variable):\n",
    +       "        """Return VPI for a given variable"""\n",
    +       "        vpi = 0.0\n",
    +       "        prob_dist = self.infer(variable, self.observation, self.decnet).prob\n",
    +       "        for item, _ in prob_dist.items():\n",
    +       "            post_prob = prob_dist[item]\n",
    +       "            new_observation = list(self.observation)\n",
    +       "            new_observation.append(item)\n",
    +       "            expected_utility = self.decnet.get_expected_utility(variable, new_observation)\n",
    +       "            vpi += post_prob * expected_utility\n",
    +       "\n",
    +       "        vpi -= self.decnet.get_expected_utility(variable, self.observation)\n",
    +       "        return vpi\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(InformationGatheringAgent)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The `cost` method is an abstract method that returns the cost of obtaining the evidence through tests, consultants, questions or any other means.\n", + "
    \n", + "The `request` method returns the value of the given random variable as the next percept.\n", + "
    \n", + "The `vpi_cost_ratio` method returns a list of VPI divided by cost for each variable in the `variables` list provided to it.\n", + "
    \n", +    "The `vpi` method calculates the VPI for a given variable.\n", +    "
    \n", + "And finally, the `execute` method executes the general information gathering algorithm, as described in __figure 16.9__ in the book.\n", + "
    \n", +    "Our agent implements a form of information gathering that is called __myopic__, as the VPI formula is used shortsightedly here.\n", +    "It calculates the value of information as if only a single evidence variable will be acquired.\n", +    "This is similar to greedy search, where we do not look at the bigger picture and aim for local optimizations to hopefully reach the global optimum.\n", +    "This often works well in practice, but a myopic agent might hastily take an action when it would have been better to request more variables before acting.\n", +    "A _conditional plan_, on the other hand, might work better for some scenarios.\n", +    "
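The myopic loop described above can be sketched as follows (a standalone toy, not the module's `InformationGatheringAgent`; the VPI and cost values are invented and held fixed, whereas a real agent would recompute VPI after each observation):

```python
# Myopic information gathering: keep requesting the observation with the
# best VPI-to-cost ratio while its value exceeds its cost.

def myopic_gather(vpi, cost, variables):
    """Return the list of observations worth requesting, in order."""
    requested, remaining = [], list(variables)
    while remaining:
        var = max(remaining, key=lambda v: vpi[v] / cost[v])
        if vpi[var] <= cost[var]:   # next observation is not worth its price
            break
        requested.append(var)
        remaining.remove(var)
    return requested

vpi = {'Weather': 21.0, 'Traffic': 2.0, 'Mood': 0.1}
cost = {'Weather': 5.0, 'Traffic': 1.0, 'Mood': 1.0}

print(myopic_gather(vpi, cost, ['Weather', 'Traffic', 'Mood']))
```

Here the agent requests `Weather` and `Traffic` but stops before `Mood`, whose expected benefit is below its cost — exactly the stopping condition of the algorithm.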
    \n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "With this we conclude this notebook." + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.9" } }, "nbformat": 4, - "nbformat_minor": 0 + "nbformat_minor": 2 } diff --git a/probability.py b/probability.py index 5c9e28245..e1e77d224 100644 --- a/probability.py +++ b/probability.py @@ -1,30 +1,27 @@ -"""Probability models. (Chapter 13-15) -""" +"""Probability models (Chapter 13-15)""" -from utils import ( - product, argmax, element_wise_product, matrix_multiplication, - vector_to_diagonal, vector_add, scalar_vector_product, inverse_matrix, - weighted_sample_with_replacement, isclose, probability, normalize -) -from logic import extend - -import random from collections import defaultdict from functools import reduce -# ______________________________________________________________________________ +from agents import Agent +from utils import * def DTAgentProgram(belief_state): - """A decision-theoretic agent. [Figure 13.1]""" + """ + [Figure 13.1] + A decision-theoretic agent. 
+ """ + def program(percept): belief_state.observe(program.action, percept) - program.action = argmax(belief_state.actions(), - key=belief_state.expected_outcome_utility) + program.action = max(belief_state.actions(), key=belief_state.expected_outcome_utility) return program.action + program.action = None return program + # ______________________________________________________________________________ @@ -38,14 +35,14 @@ class ProbDist: (0.125, 0.375, 0.5) """ - def __init__(self, varname='?', freqs=None): - """If freqs is given, it is a dictionary of values - frequency pairs, + def __init__(self, var_name='?', freq=None): + """If freq is given, it is a dictionary of values - frequency pairs, then ProbDist is normalized.""" self.prob = {} - self.varname = varname + self.var_name = var_name self.values = [] - if freqs: - for (v, p) in freqs.items(): + if freq: + for (v, p) in freq.items(): self[v] = p self.normalize() @@ -67,7 +64,7 @@ def normalize(self): Returns the normalized distribution. Raises a ZeroDivisionError if the sum of the values is 0.""" total = sum(self.prob.values()) - if not isclose(total, 1.0): + if not np.isclose(total, 1.0): for val in self.prob: self.prob[val] /= total return self @@ -75,11 +72,10 @@ def normalize(self): def show_approx(self, numfmt='{:.3g}'): """Show the probabilities rounded and sorted by key, for the sake of portable doctests.""" - return ', '.join([('{}: ' + numfmt).format(v, p) - for (v, p) in sorted(self.prob.items())]) + return ', '.join([('{}: ' + numfmt).format(v, p) for (v, p) in sorted(self.prob.items())]) def __repr__(self): - return "P({})".format(self.varname) + return "P({})".format(self.var_name) class JointProbDist(ProbDist): @@ -102,7 +98,7 @@ def __getitem__(self, values): return ProbDist.__getitem__(self, values) def __setitem__(self, values, p): - """Set P(values) = p. Values can be a tuple or a dict; it must + """Set P(values) = p. 
Values can be a tuple or a dict; it must have a value for each of the variables in the joint. Also keep track of the values we have seen so far for each variable.""" values = event_values(values, self.variables) @@ -131,12 +127,15 @@ def event_values(event, variables): else: return tuple([event[var] for var in variables]) + # ______________________________________________________________________________ def enumerate_joint_ask(X, e, P): - """Return a probability distribution over the values of the variable X, - given the {var:val} observations e, in the JointProbDist P. [Section 13.3] + """ + [Section 13.3] + Return a probability distribution over the values of the variable X, + given the {var:val} observations e, in the JointProbDist P. >>> P = JointProbDist(['X', 'Y']) >>> P[0,0] = 0.25; P[0,1] = 0.5; P[1,1] = P[2,1] = 0.125 >>> enumerate_joint_ask('X', dict(Y=1), P).show_approx() @@ -156,8 +155,8 @@ def enumerate_joint(variables, e, P): if not variables: return P[e] Y, rest = variables[0], variables[1:] - return sum([enumerate_joint(rest, extend(e, Y, y), P) - for y in P.values(Y)]) + return sum([enumerate_joint(rest, extend(e, Y, y), P) for y in P.values(Y)]) + # ______________________________________________________________________________ @@ -165,10 +164,11 @@ def enumerate_joint(variables, e, P): class BayesNet: """Bayesian network containing only boolean-variable nodes.""" - def __init__(self, node_specs=[]): + def __init__(self, node_specs=None): """Nodes must be ordered with parents before children.""" self.nodes = [] self.variables = [] + node_specs = node_specs or [] for node_spec in node_specs: self.add(node_spec) @@ -200,13 +200,105 @@ def __repr__(self): return 'BayesNet({0!r})'.format(self.nodes) +class DecisionNetwork(BayesNet): + """An abstract class for a decision network as a wrapper for a BayesNet. 
+    Represents an agent's current state, its possible actions, reachable states +    and utilities of those states.""" + +    def __init__(self, action, infer): +        """action: a single action node +        infer: the preferred method to carry out inference on the given BayesNet""" +        super(DecisionNetwork, self).__init__() +        self.action = action +        self.infer = infer + +    def best_action(self): +        """Return the best action in the network""" +        return self.action + +    def get_utility(self, action, state): +        """Return the utility for a particular action and state in the network""" +        raise NotImplementedError + +    def get_expected_utility(self, action, evidence): +        """Compute the expected utility given an action and evidence""" +        u = 0.0 +        prob_dist = self.infer(action, evidence, self).prob +        for item, _ in prob_dist.items(): +            u += prob_dist[item] * self.get_utility(action, item) + +        return u + + +class InformationGatheringAgent(Agent): +    """ +    [Figure 16.9] +    A simple information gathering agent. The agent works by repeatedly selecting +    the observation with the highest information value, until the cost of the next +    observation is greater than its expected benefit.""" + +    def __init__(self, decnet, infer, initial_evidence=None): +        """decnet: a decision network +        infer: the preferred method to carry out inference on the given decision network +        initial_evidence: initial evidence""" +        self.decnet = decnet +        self.infer = infer +        self.observation = initial_evidence or [] +        self.variables = self.decnet.nodes + +    def integrate_percept(self, percept): +        """Integrate the given percept into the decision network""" +        raise NotImplementedError + +    def execute(self, percept): +        """Execute the information gathering algorithm""" +        self.observation = self.integrate_percept(percept) +        vpis = self.vpi_cost_ratio(self.variables) +        j = vpis.index(max(vpis)) +        variable = self.variables[j] + +        if self.vpi(variable) > self.cost(variable): +            return self.request(variable) + +        return self.decnet.best_action() + +    def 
request(self, variable): + """Return the value of the given random variable as the next percept""" + raise NotImplementedError + + def cost(self, var): + """Return the cost of obtaining evidence through tests, consultants or questions""" + raise NotImplementedError + + def vpi_cost_ratio(self, variables): + """Return the VPI to cost ratio for the given variables""" + v_by_c = [] + for var in variables: + v_by_c.append(self.vpi(var) / self.cost(var)) + return v_by_c + + def vpi(self, variable): + """Return VPI for a given variable""" + vpi = 0.0 + prob_dist = self.infer(variable, self.observation, self.decnet).prob + for item, _ in prob_dist.items(): + post_prob = prob_dist[item] + new_observation = list(self.observation) + new_observation.append(item) + expected_utility = self.decnet.get_expected_utility(variable, new_observation) + vpi += post_prob * expected_utility + + vpi -= self.decnet.get_expected_utility(variable, self.observation) + return vpi + + class BayesNode: """A conditional probability distribution for a boolean variable, P(X | parents). Part of a BayesNet.""" def __init__(self, X, parents, cpt): """X is a variable name, and parents a sequence of variable - names or a space-separated string. cpt, the conditional + names or a space-separated string. cpt, the conditional probability table, takes one of these forms: * A number, the unconditional probability P(X=true). 
You can @@ -277,21 +369,22 @@ def __repr__(self): T, F = True, False -burglary = BayesNet([ - ('Burglary', '', 0.001), - ('Earthquake', '', 0.002), - ('Alarm', 'Burglary Earthquake', - {(T, T): 0.95, (T, F): 0.94, (F, T): 0.29, (F, F): 0.001}), - ('JohnCalls', 'Alarm', {T: 0.90, F: 0.05}), - ('MaryCalls', 'Alarm', {T: 0.70, F: 0.01}) -]) +burglary = BayesNet([('Burglary', '', 0.001), + ('Earthquake', '', 0.002), + ('Alarm', 'Burglary Earthquake', + {(T, T): 0.95, (T, F): 0.94, (F, T): 0.29, (F, F): 0.001}), + ('JohnCalls', 'Alarm', {T: 0.90, F: 0.05}), + ('MaryCalls', 'Alarm', {T: 0.70, F: 0.01})]) + # ______________________________________________________________________________ def enumeration_ask(X, e, bn): - """Return the conditional probability distribution of variable X - given evidence e, from BayesNet bn. [Figure 14.9] + """ + [Figure 14.9] + Return the conditional probability distribution of variable X + given evidence e, from BayesNet bn. >>> enumeration_ask('Burglary', dict(JohnCalls=T, MaryCalls=T), burglary ... ).show_approx() 'False: 0.716, True: 0.284'""" @@ -317,11 +410,14 @@ def enumerate_all(variables, e, bn): return sum(Ynode.p(y, e) * enumerate_all(rest, extend(e, Y, y), bn) for y in bn.variable_values(Y)) + # ______________________________________________________________________________ def elimination_ask(X, e, bn): - """Compute bn's P(X|e) by variable elimination. [Figure 14.11] + """ + [Figure 14.11] + Compute bn's P(X|e) by variable elimination. >>> elimination_ask('Burglary', dict(JohnCalls=T, MaryCalls=T), burglary ... 
).show_approx() 'False: 0.716, True: 0.284'""" @@ -373,23 +469,20 @@ def __init__(self, variables, cpt): def pointwise_product(self, other, bn): """Multiply two factors, combining their variables.""" variables = list(set(self.variables) | set(other.variables)) - cpt = {event_values(e, variables): self.p(e) * other.p(e) - for e in all_events(variables, bn, {})} + cpt = {event_values(e, variables): self.p(e) * other.p(e) for e in all_events(variables, bn, {})} return Factor(variables, cpt) def sum_out(self, var, bn): """Make a factor eliminating var by summing over its values.""" variables = [X for X in self.variables if X != var] - cpt = {event_values(e, variables): sum(self.p(extend(e, var, val)) - for val in bn.variable_values(var)) + cpt = {event_values(e, variables): sum(self.p(extend(e, var, val)) for val in bn.variable_values(var)) for e in all_events(variables, bn, {})} return Factor(variables, cpt) def normalize(self): """Return my probabilities; must be down to one variable.""" assert len(self.variables) == 1 - return ProbDist(self.variables[0], - {k: v for ((k,), v) in self.cpt.items()}) + return ProbDist(self.variables[0], {k: v for ((k,), v) in self.cpt.items()}) def p(self, e): """Look up my value tabulated for e.""" @@ -406,35 +499,42 @@ def all_events(variables, bn, e): for x in bn.variable_values(X): yield extend(e1, X, x) + # ______________________________________________________________________________ # [Figure 14.12a]: sprinkler network -sprinkler = BayesNet([ - ('Cloudy', '', 0.5), - ('Sprinkler', 'Cloudy', {T: 0.10, F: 0.50}), - ('Rain', 'Cloudy', {T: 0.80, F: 0.20}), - ('WetGrass', 'Sprinkler Rain', - {(T, T): 0.99, (T, F): 0.90, (F, T): 0.90, (F, F): 0.00})]) +sprinkler = BayesNet([('Cloudy', '', 0.5), + ('Sprinkler', 'Cloudy', {T: 0.10, F: 0.50}), + ('Rain', 'Cloudy', {T: 0.80, F: 0.20}), + ('WetGrass', 'Sprinkler Rain', + {(T, T): 0.99, (T, F): 0.90, (F, T): 0.90, (F, F): 0.00})]) + # 
______________________________________________________________________________ def prior_sample(bn): - """Randomly sample from bn's full joint distribution. The result - is a {variable: value} dict. [Figure 14.13]""" + """ + [Figure 14.13] + Randomly sample from bn's full joint distribution. + The result is a {variable: value} dict. + """ event = {} for node in bn.nodes: event[node.variable] = node.sample(event) return event + # _________________________________________________________________________ -def rejection_sampling(X, e, bn, N): - """Estimate the probability distribution of variable X given - evidence e in BayesNet bn, using N samples. [Figure 14.14] +def rejection_sampling(X, e, bn, N=10000): + """ + [Figure 14.14] + Estimate the probability distribution of variable X given + evidence e in BayesNet bn, using N samples. Raises a ZeroDivisionError if all the N samples are rejected, i.e., inconsistent with e. >>> random.seed(47) @@ -452,15 +552,17 @@ def rejection_sampling(X, e, bn, N): def consistent_with(event, evidence): """Is event consistent with the given evidence?""" - return all(evidence.get(k, v) == v - for k, v in event.items()) + return all(evidence.get(k, v) == v for k, v in event.items()) + # _________________________________________________________________________ -def likelihood_weighting(X, e, bn, N): - """Estimate the probability distribution of variable X given - evidence e in BayesNet bn. [Figure 14.15] +def likelihood_weighting(X, e, bn, N=10000): + """ + [Figure 14.15] + Estimate the probability distribution of variable X given + evidence e in BayesNet bn. >>> random.seed(1017) >>> likelihood_weighting('Burglary', dict(JohnCalls=T, MaryCalls=T), ... 
burglary, 10000).show_approx() @@ -474,9 +576,11 @@ def likelihood_weighting(X, e, bn, N): def weighted_sample(bn, e): - """Sample an event from bn that's consistent with the evidence e; + """ + Sample an event from bn that's consistent with the evidence e; return the event and its weight, the likelihood that the event - accords to the evidence.""" + accords to the evidence. + """ w = 1 event = dict(e) # boldface x in [Figure 14.15] for node in bn.nodes: @@ -487,10 +591,11 @@ def weighted_sample(bn, e): event[Xi] = node.sample(event) return event, w + # _________________________________________________________________________ -def gibbs_ask(X, e, bn, N): +def gibbs_ask(X, e, bn, N=1000): """[Figure 14.16]""" assert X not in e, "Query variable must be distinct from evidence" counts = {x: 0 for x in bn.variable_values(X)} # bold N in [Figure 14.16] @@ -514,22 +619,22 @@ def markov_blanket_sample(X, e, bn): Q = ProbDist(X) for xi in bn.variable_values(X): ei = extend(e, X, xi) - # [Equation 14.12:] - Q[xi] = Xnode.p(xi, e) * product(Yj.p(ei[Yj.variable], ei) - for Yj in Xnode.children) + # [Equation 14.12] + Q[xi] = Xnode.p(xi, e) * product(Yj.p(ei[Yj.variable], ei) for Yj in Xnode.children) # (assuming a Boolean variable here) return probability(Q.normalize()[True]) + # _________________________________________________________________________ class HiddenMarkovModel: """A Hidden markov model which takes Transition model and Sensor model as inputs""" - def __init__(self, transition_model, sensor_model, prior=[0.5, 0.5]): + def __init__(self, transition_model, sensor_model, prior=None): self.transition_model = transition_model self.sensor_model = sensor_model - self.prior = prior + self.prior = prior or [0.5, 0.5] def sensor_dist(self, ev): if ev is True: @@ -554,52 +659,95 @@ def backward(HMM, b, ev): scalar_vector_product(prediction[1], HMM.transition_model[1]))) -def forward_backward(HMM, ev, prior): - """[Figure 15.4] +def forward_backward(HMM, ev): + """ + [Figure 
15.4] Forward-Backward algorithm for smoothing. Computes posterior probabilities - of a sequence of states given a sequence of observations.""" + of a sequence of states given a sequence of observations. + """ t = len(ev) ev.insert(0, None) # to make the code look similar to pseudo code - fv = [[0.0, 0.0] for i in range(len(ev))] + fv = [[0.0, 0.0] for _ in range(len(ev))] b = [1.0, 1.0] - bv = [b] # we don't need bv; but we will have a list of all backward messages here - sv = [[0, 0] for i in range(len(ev))] + sv = [[0, 0] for _ in range(len(ev))] - fv[0] = prior + fv[0] = HMM.prior for i in range(1, t + 1): fv[i] = forward(HMM, fv[i - 1], ev[i]) for i in range(t, -1, -1): sv[i - 1] = normalize(element_wise_product(fv[i], b)) b = backward(HMM, b, ev[i]) - bv.append(b) sv = sv[::-1] return sv + +def viterbi(HMM, ev): + """ + [Equation 15.11] + Viterbi algorithm to find the most likely sequence. Computes the best path and the + corresponding probabilities, given an HMM model and a sequence of observations. 
+ """ + t = len(ev) + ev = ev.copy() + ev.insert(0, None) + + m = [[0.0, 0.0] for _ in range(len(ev) - 1)] + + # the recursion is initialized with m1 = forward(P(X0), e1) + m[0] = forward(HMM, HMM.prior, ev[1]) + # keep track of maximizing predecessors + backtracking_graph = [] + + for i in range(1, t): + m[i] = element_wise_product(HMM.sensor_dist(ev[i + 1]), + [max(element_wise_product(HMM.transition_model[0], m[i - 1])), + max(element_wise_product(HMM.transition_model[1], m[i - 1]))]) + backtracking_graph.append([np.argmax(element_wise_product(HMM.transition_model[0], m[i - 1])), + np.argmax(element_wise_product(HMM.transition_model[1], m[i - 1]))]) + + # computed probabilities + ml_probabilities = [0.0] * (len(ev) - 1) + # most likely sequence + ml_path = [True] * (len(ev) - 1) + + # the construction of the most likely sequence starts in the final state with the largest probability, and + # runs backwards; the algorithm needs to store for each xt its predecessor xt-1 maximizing its probability + i_max = np.argmax(m[-1]) + + for i in range(t - 1, -1, -1): + ml_probabilities[i] = m[i][i_max] + ml_path[i] = True if i_max == 0 else False + if i > 0: + i_max = backtracking_graph[i - 1][i_max] + + return ml_path, ml_probabilities + + # _________________________________________________________________________ def fixed_lag_smoothing(e_t, HMM, d, ev, t): - """[Figure 15.6] + """ + [Figure 15.6] Smoothing algorithm with a fixed time lag of 'd' steps. Online algorithm that outputs the new smoothed estimate if observation - for new time step is given.""" + for new time step is given. 
+ """ ev.insert(0, None) T_model = HMM.transition_model f = HMM.prior B = [[1, 0], [0, 1]] - evidence = [] - evidence.append(e_t) - O_t = vector_to_diagonal(HMM.sensor_dist(e_t)) + O_t = np.diag(HMM.sensor_dist(e_t)) if t > d: f = forward(HMM, f, e_t) - O_tmd = vector_to_diagonal(HMM.sensor_dist(ev[t - d])) - B = matrix_multiplication(inverse_matrix(O_tmd), inverse_matrix(T_model), B, T_model, O_t) + O_tmd = np.diag(HMM.sensor_dist(ev[t - d])) + B = matrix_multiplication(np.linalg.inv(O_tmd), np.linalg.inv(T_model), B, T_model, O_t) else: B = matrix_multiplication(B, T_model, O_t) t += 1 @@ -610,6 +758,7 @@ def fixed_lag_smoothing(e_t, HMM, d, ev, t): else: return None + # _________________________________________________________________________ @@ -645,17 +794,19 @@ def particle_filtering(e, N, HMM): w[i] = float("{0:.4f}".format(w[i])) # STEP 2 - s = weighted_sample_with_replacement(N, s, w) return s + # _________________________________________________________________________ -## TODO: Implement continous map for MonteCarlo similar to Fig25.10 from the book +# TODO: Implement continuous map for MonteCarlo similar to Fig25.10 from the book + class MCLmap: """Map which provides probability distributions and sensor readings. 
Consists of discrete cells which are either an obstacle or empty""" + def __init__(self, m): self.m = m self.nrows = len(m) @@ -672,34 +823,36 @@ def sample(self): return kin_state def ray_cast(self, sensor_num, kin_state): - """Returns distace to nearest obstacle or map boundary in the direction of sensor""" + """Returns distance to nearest obstacle or map boundary in the direction of sensor""" pos = kin_state[:2] orient = kin_state[2] # sensor layout when orientation is 0 (towards North) # 0 # 3R1 # 2 - delta = ((sensor_num%2 == 0)*(sensor_num - 1), (sensor_num%2 == 1)*(2 - sensor_num)) + delta = ((sensor_num % 2 == 0) * (sensor_num - 1), (sensor_num % 2 == 1) * (2 - sensor_num)) # sensor direction changes based on orientation for _ in range(orient): delta = (delta[1], -delta[0]) range_count = 0 - while (0 <= pos[0] < self.nrows) and (0 <= pos[1] < self.nrows) and (not self.m[pos[0]][pos[1]]): + while 0 <= pos[0] < self.nrows and 0 <= pos[1] < self.nrows and not self.m[pos[0]][pos[1]]: pos = vector_add(pos, delta) range_count += 1 return range_count def monte_carlo_localization(a, z, N, P_motion_sample, P_sensor, m, S=None): - """Monte Carlo localization algorithm from Fig 25.9""" + """ + [Figure 25.9] + Monte Carlo localization algorithm + """ def ray_cast(sensor_num, kin_state, m): return m.ray_cast(sensor_num, kin_state) M = len(z) - W = [0]*N - S_ = [0]*N - W_ = [0]*N + S_ = [0] * N + W_ = [0] * N v = a['v'] w = a['w'] diff --git a/probability-4e.ipynb b/probability4e.ipynb similarity index 100% rename from probability-4e.ipynb rename to probability4e.ipynb diff --git a/probability4e.py b/probability4e.py new file mode 100644 index 000000000..d413a55ae --- /dev/null +++ b/probability4e.py @@ -0,0 +1,776 @@ +"""Probability models (Chapter 12-13)""" + +import copy +import random +from collections import defaultdict +from functools import reduce + +import numpy as np + +from utils4e import product, probability, extend + + +# 
______________________________________________________________________________
+# Chapter 12 Quantifying Uncertainty
+# 12.1 Acting Under Uncertainty
+
+
+def DTAgentProgram(belief_state):
+    """A decision-theoretic agent. [Figure 12.1]"""
+
+    def program(percept):
+        belief_state.observe(program.action, percept)
+        program.action = max(belief_state.actions(), key=belief_state.expected_outcome_utility)
+        return program.action
+
+    program.action = None
+    return program
+
+
+# ______________________________________________________________________________
+# 12.2 Basic Probability Notation
+
+
+class ProbDist:
+    """A discrete probability distribution. You name the random variable
+    in the constructor, then assign and query probability of values.
+    >>> P = ProbDist('Flip'); P['H'], P['T'] = 0.25, 0.75; P['H']
+    0.25
+    >>> P = ProbDist('X', {'lo': 125, 'med': 375, 'hi': 500})
+    >>> P['lo'], P['med'], P['hi']
+    (0.125, 0.375, 0.5)
+    """
+
+    def __init__(self, varname='?', freqs=None):
+        """If freqs is given, it is a dictionary of value-frequency pairs,
+        and the ProbDist is then normalized."""
+        self.prob = {}
+        self.varname = varname
+        self.values = []
+        if freqs:
+            for (v, p) in freqs.items():
+                self[v] = p
+            self.normalize()
+
+    def __getitem__(self, val):
+        """Given a value, return P(value)."""
+        try:
+            return self.prob[val]
+        except KeyError:
+            return 0
+
+    def __setitem__(self, val, p):
+        """Set P(val) = p."""
+        if val not in self.values:
+            self.values.append(val)
+        self.prob[val] = p
+
+    def normalize(self):
+        """Make sure the probabilities of all values sum to 1.
+        Returns the normalized distribution.
+        Raises a ZeroDivisionError if the sum of the values is 0."""
+        total = sum(self.prob.values())
+        if not np.isclose(total, 1.0):
+            for val in self.prob:
+                self.prob[val] /= total
+        return self
+
+    def show_approx(self, numfmt='{:.3g}'):
+        """Show the probabilities rounded and sorted by key, for the
+        sake of portable doctests."""
+        return ', '.join([('{}: ' + numfmt).format(v, p)
+                          for (v, p) in sorted(self.prob.items())])
+
+    def __repr__(self):
+        return "P({})".format(self.varname)
+
+
+# ______________________________________________________________________________
+# 12.3 Inference Using Full Joint Distributions
+
+
+class JointProbDist(ProbDist):
+    """A discrete probability distribution over a set of variables.
+    >>> P = JointProbDist(['X', 'Y']); P[1, 1] = 0.25
+    >>> P[1, 1]
+    0.25
+    >>> P[dict(X=0, Y=1)] = 0.5
+    >>> P[dict(X=0, Y=1)]
+    0.5"""
+
+    def __init__(self, variables):
+        self.prob = {}
+        self.variables = variables
+        self.vals = defaultdict(list)
+
+    def __getitem__(self, values):
+        """Given a tuple or dict of values, return P(values)."""
+        values = event_values(values, self.variables)
+        return ProbDist.__getitem__(self, values)
+
+    def __setitem__(self, values, p):
+        """Set P(values) = p. Values can be a tuple or a dict; it must
+        have a value for each of the variables in the joint. Also keep track
+        of the values we have seen so far for each variable."""
+        values = event_values(values, self.variables)
+        self.prob[values] = p
+        for var, val in zip(self.variables, values):
+            if val not in self.vals[var]:
+                self.vals[var].append(val)
+
+    def values(self, var):
+        """Return the set of possible values for a variable."""
+        return self.vals[var]
+
+    def __repr__(self):
+        return "P({})".format(self.variables)
+
+
+def event_values(event, variables):
+    """Return a tuple of the values of variables in event.
+    >>> event_values({'A': 10, 'B': 9, 'C': 8}, ['C', 'A'])
+    (8, 10)
+    >>> event_values((1, 2), ['C', 'A'])
+    (1, 2)
+    """
+    if isinstance(event, tuple) and len(event) == len(variables):
+        return event
+    else:
+        return tuple([event[var] for var in variables])
+
+
+def enumerate_joint_ask(X, e, P):
+    """Return a probability distribution over the values of the variable X,
+    given the {var:val} observations e, in the JointProbDist P. [Section 12.3]
+    >>> P = JointProbDist(['X', 'Y'])
+    >>> P[0,0] = 0.25; P[0,1] = 0.5; P[1,1] = P[2,1] = 0.125
+    >>> enumerate_joint_ask('X', dict(Y=1), P).show_approx()
+    '0: 0.667, 1: 0.167, 2: 0.167'
+    """
+    assert X not in e, "Query variable must be distinct from evidence"
+    Q = ProbDist(X)  # probability distribution for X, initially empty
+    Y = [v for v in P.variables if v != X and v not in e]  # hidden variables
+    for xi in P.values(X):
+        Q[xi] = enumerate_joint(Y, extend(e, X, xi), P)
+    return Q.normalize()
+
+
+def enumerate_joint(variables, e, P):
+    """Return the sum of those entries in P consistent with e,
+    provided variables is P's remaining variables (the ones not in e)."""
+    if not variables:
+        return P[e]
+    Y, rest = variables[0], variables[1:]
+    return sum([enumerate_joint(rest, extend(e, Y, y), P)
+                for y in P.values(Y)])
+
+
+# ______________________________________________________________________________
+# 12.4 Independence
+
+
+def is_independent(variables, P):
+    """
+    Return whether the given list of variables are mutually independent,
+    according to their joint distribution P (an instance of JointProbDist)
+    >>> P = JointProbDist(['X', 'Y'])
+    >>> P[0,0] = 0.25; P[0,1] = 0.5; P[1,1] = P[1,0] = 0.125
+    >>> is_independent(['X', 'Y'], P)
+    False
+    """
+    for var in variables:
+        event_vars = variables[:]
+        event_vars.remove(var)
+        event = {}
+        distribution = enumerate_joint_ask(var, event, P)
+        events = gen_possible_events(event_vars, P)
+        for e in events:
+            conditional_distr = enumerate_joint_ask(var, e, P)
+            if conditional_distr.prob != 
distribution.prob:
+                return False
+    return True
+
+
+def gen_possible_events(vars, P):
+    """Generate all possible events for the collection of vars, according to the distribution P"""
+    events = []
+
+    def backtrack(vars, P, temp):
+        if not vars:
+            events.append(temp)
+            return
+        var = vars[0]
+        for val in P.values(var):
+            temp[var] = val
+            backtrack([v for v in vars if v != var], P, copy.copy(temp))
+
+    backtrack(vars, P, {})
+    return events
+
+
+# ______________________________________________________________________________
+# Chapter 13 Probabilistic Reasoning
+# 13.1 Representing Knowledge in an Uncertain Domain
+
+
+class BayesNet:
+    """Bayesian network containing only boolean-variable nodes."""
+
+    def __init__(self, node_specs=None):
+        """
+        Nodes must be ordered with parents before children.
+        :param node_specs: a nested iterable object, one element per node,
+            each containing (variable name, parent names, cpt)
+        """
+
+        self.nodes = []
+        self.variables = []
+        node_specs = node_specs or []
+        for node_spec in node_specs:
+            self.add(node_spec)
+
+    def add(self, node_spec):
+        """
+        Add a node to the net. Its parents must already be in the
+        net, and its variable must not.
+        Initialize Bayes nodes by detecting the length of the input node spec.
+        """
+        if len(node_spec) >= 5:
+            node = ContinuousBayesNode(*node_spec)
+        else:
+            node = BayesNode(*node_spec)
+        assert node.variable not in self.variables
+        assert all((parent in self.variables) for parent in node.parents)
+        self.nodes.append(node)
+        self.variables.append(node.variable)
+        for parent in node.parents:
+            self.variable_node(parent).children.append(node)
+
+    def variable_node(self, var):
+        """
+        Return the node for the variable named var.
+ >>> burglary.variable_node('Burglary').variable + 'Burglary' + """ + for n in self.nodes: + if n.variable == var: + return n + raise Exception("No such variable: {}".format(var)) + + def variable_values(self, var): + """Return the domain of var.""" + return [True, False] + + def __repr__(self): + return 'BayesNet({0!r})'.format(self.nodes) + + +class BayesNode: + """ + A conditional probability distribution for a boolean variable, + P(X | parents). Part of a BayesNet. + """ + + def __init__(self, X, parents, cpt): + """ + :param X: variable name, + :param parents: a sequence of variable names or a space-separated string. Representing the names of parent nodes + :param cpt: the conditional probability table, takes one of these forms: + + * A number, the unconditional probability P(X=true). You can + use this form when there are no parents. + + * A dict {v: p, ...}, the conditional probability distribution + P(X=true | parent=v) = p. When there's just one parent. + + * A dict {(v1, v2, ...): p, ...}, the distribution P(X=true | + parent1=v1, parent2=v2, ...) = p. Each key must have as many + values as there are parents. You can use this form always; + the first two are just conveniences. + + In all cases the probability of X being false is left implicit, + since it follows from P(X=true). + + >>> X = BayesNode('X', '', 0.2) + >>> Y = BayesNode('Y', 'P', {T: 0.2, F: 0.7}) + >>> Z = BayesNode('Z', 'P Q', + ... {(T, T): 0.2, (T, F): 0.3, (F, T): 0.5, (F, F): 0.7}) + """ + if isinstance(parents, str): + parents = parents.split() + + # We store the table always in the third form above. 
+        if isinstance(cpt, (float, int)):  # no parents, 0-tuple
+            cpt = {(): cpt}
+        elif isinstance(cpt, dict):
+            # one parent, 1-tuple
+            if cpt and isinstance(list(cpt.keys())[0], bool):
+                cpt = {(v,): p for v, p in cpt.items()}
+
+        assert isinstance(cpt, dict)
+        for vs, p in cpt.items():
+            assert isinstance(vs, tuple) and len(vs) == len(parents)
+            assert all(isinstance(v, bool) for v in vs)
+            assert 0 <= p <= 1
+
+        self.variable = X
+        self.parents = parents
+        self.cpt = cpt
+        self.children = []
+
+    def p(self, value, event):
+        """
+        Return the conditional probability
+        P(X=value | parents=parent_values), where parent_values
+        are the values of parents in event. (event must assign each
+        parent a value.)
+        >>> bn = BayesNode('X', 'Burglary', {T: 0.2, F: 0.625})
+        >>> bn.p(False, {'Burglary': False, 'Earthquake': True})
+        0.375
+        """
+        assert isinstance(value, bool)
+        ptrue = self.cpt[event_values(event, self.parents)]
+        return ptrue if value else 1 - ptrue
+
+    def sample(self, event):
+        """
+        Sample from the distribution for this variable, conditioned
+        on event's values for parent_variables. That is, return True/False
+        at random, according to the conditional probability given the
+        parents.
+        """
+        return probability(self.p(True, event))
+
+    def __repr__(self):
+        return repr((self.variable, ' '.join(self.parents)))
+
+
+# Burglary example [Figure 13.2]
+
+
+T, F = True, False
+
+burglary = BayesNet([
+    ('Burglary', '', 0.001),
+    ('Earthquake', '', 0.002),
+    ('Alarm', 'Burglary Earthquake',
+     {(T, T): 0.95, (T, F): 0.94, (F, T): 0.29, (F, F): 0.001}),
+    ('JohnCalls', 'Alarm', {T: 0.90, F: 0.05}),
+    ('MaryCalls', 'Alarm', {T: 0.70, F: 0.01})
+])
+
+
+# ______________________________________________________________________________
+# Section 13.2.
The Semantics of Bayesian Networks
+# Bayesian nets with continuous variables
+
+
+def gaussian_probability(param, event, value):
+    """
+    Gaussian probability of a continuous Bayesian network node, conditioned on
+    a certain event and on the parameters determined by that event
+    :param param: parameters determined by discrete parent events of current node
+    :param event: a dict, continuous event of current node, the values are used
+        as parameters in calculating distribution
+    :param value: float, the value of current continuous node
+    :return: float, the calculated probability
+    >>> param = {'sigma':0.5, 'b':1, 'a':{'h1':0.5, 'h2': 1.5}}
+    >>> event = {'h1':0.6, 'h2': 0.3}
+    >>> gaussian_probability(param, event, 1)
+    0.2590351913317835
+    """
+
+    assert isinstance(event, dict)
+    assert isinstance(param, dict)
+    buff = 0
+    for k, v in event.items():
+        # buffer variable to calculate h1*a_h1 + h2*a_h2
+        buff += param['a'][k] * v
+    res = 1 / (param['sigma'] * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((value - buff - param['b']) / param['sigma']) ** 2)
+    return res
+
+
+def logistic_probability(param, event, value):
+    """
+    Logistic probability of a discrete node in a Bayesian network with continuous parents.
+    :param param: a dict, parameters determined by discrete parents of current node
+    :param event: a dict, names and values of continuous parent variables of current node
+    :param value: boolean, True or False
+    :return: float, the calculated probability
+    """
+
+    buff = 1
+    for _, v in event.items():
+        # buffer variable to calculate (value-mu)/sigma
+        buff *= (v - param['mu']) / param['sigma']
+    p = 1 - 1 / (1 + np.exp(-4 / np.sqrt(2 * np.pi) * buff))
+    return p if value else 1 - p
+
+
+class ContinuousBayesNode:
+    """A Bayesian network node with a continuous distribution or with continuously distributed parents"""
+
+    def __init__(self, name, d_parents, c_parents, parameters, type):
+        """
+        A continuous Bayesian node has two types of parents: discrete and continuous.
+        :param d_parents: str, names of the discrete parents, whose values determine the distribution parameters
+        :param c_parents: str, names of the continuous parents, whose values are used to calculate the distribution
+        :param parameters: a dict, parameters for the distribution of the current node; keys correspond to discrete parents
+        :param type: str, type of current node's value, either 'd' (discrete) or 'c' (continuous)
+        """
+
+        self.parameters = parameters
+        self.type = type
+        self.d_parents = d_parents.split()
+        self.c_parents = c_parents.split()
+        self.parents = self.d_parents + self.c_parents
+        self.variable = name
+        self.children = []
+
+    def continuous_p(self, value, c_event, d_event):
+        """
+        Probability given the value of current node and its parents
+        :param c_event: event of continuous nodes
+        :param d_event: event of discrete nodes
+        """
+        assert isinstance(c_event, dict)
+        assert isinstance(d_event, dict)
+
+        d_event_vals = event_values(d_event, self.d_parents)
+        if len(d_event_vals) == 1:
+            d_event_vals = d_event_vals[0]
+        param = self.parameters[d_event_vals]
+        if self.type == "c":
+            p = gaussian_probability(param, c_event, value)
+        elif self.type == "d":
+            p = logistic_probability(param, c_event, value)
+        return p
+
+
+# harvest-buy example [Figure 13.5]
+
+
+harvest_buy = BayesNet([
+    ('Subsidy', '', 0.001),
+    ('Harvest', '', 0.002),
+    ('Cost', 'Subsidy', 'Harvest',
+     {True: {'sigma': 0.5, 'b': 1, 'a': {'Harvest': 0.5}},
+      False: {'sigma': 0.6, 'b': 1, 'a': {'Harvest': 0.5}}}, 'c'),
+    ('Buys', '', 'Cost', {T: {'mu': 0.5, 'sigma': 0.5}, F: {'mu': 0.6, 'sigma': 0.6}}, 'd')])
+
+
+# ______________________________________________________________________________
+# 13.3 Exact Inference in Bayesian Networks
+# 13.3.1 Inference by enumeration
+
+
+def enumeration_ask(X, e, bn):
+    """
+    Return the conditional probability distribution of variable X
+    given evidence e, from BayesNet bn.
[Figure 13.10] + >>> enumeration_ask('Burglary', dict(JohnCalls=T, MaryCalls=T), burglary + ... ).show_approx() + 'False: 0.716, True: 0.284' + """ + + assert X not in e, "Query variable must be distinct from evidence" + Q = ProbDist(X) + for xi in bn.variable_values(X): + Q[xi] = enumerate_all(bn.variables, extend(e, X, xi), bn) + return Q.normalize() + + +def enumerate_all(variables, e, bn): + """ + Return the sum of those entries in P(variables | e{others}) + consistent with e, where P is the joint distribution represented + by bn, and e{others} means e restricted to bn's other variables + (the ones other than variables). Parents must precede children in variables. + """ + + if not variables: + return 1.0 + Y, rest = variables[0], variables[1:] + Ynode = bn.variable_node(Y) + if Y in e: + return Ynode.p(e[Y], e) * enumerate_all(rest, e, bn) + else: + return sum(Ynode.p(y, e) * enumerate_all(rest, extend(e, Y, y), bn) + for y in bn.variable_values(Y)) + + +# ______________________________________________________________________________ +# 13.3.2 The variable elimination algorithm + + +def elimination_ask(X, e, bn): + """ + Compute bn's P(X|e) by variable elimination. [Figure 13.12] + >>> elimination_ask('Burglary', dict(JohnCalls=T, MaryCalls=T), burglary + ... ).show_approx() + 'False: 0.716, True: 0.284' + """ + assert X not in e, "Query variable must be distinct from evidence" + factors = [] + for var in reversed(bn.variables): + factors.append(make_factor(var, e, bn)) + if is_hidden(var, X, e): + factors = sum_out(var, factors, bn) + return pointwise_product(factors, bn).normalize() + + +def is_hidden(var, X, e): + """Is var a hidden variable when querying P(X|e)?""" + return var != X and var not in e + + +def make_factor(var, e, bn): + """ + Return the factor for var in bn's joint distribution given e. + That is, bn's full joint distribution, projected to accord with e, + is the pointwise product of these factors for bn's variables. 
+ """ + node = bn.variable_node(var) + variables = [X for X in [var] + node.parents if X not in e] + cpt = {event_values(e1, variables): node.p(e1[var], e1) + for e1 in all_events(variables, bn, e)} + return Factor(variables, cpt) + + +def pointwise_product(factors, bn): + return reduce(lambda f, g: f.pointwise_product(g, bn), factors) + + +def sum_out(var, factors, bn): + """Eliminate var from all factors by summing over its values.""" + result, var_factors = [], [] + for f in factors: + (var_factors if var in f.variables else result).append(f) + result.append(pointwise_product(var_factors, bn).sum_out(var, bn)) + return result + + +class Factor: + """A factor in a joint distribution.""" + + def __init__(self, variables, cpt): + self.variables = variables + self.cpt = cpt + + def pointwise_product(self, other, bn): + """Multiply two factors, combining their variables.""" + variables = list(set(self.variables) | set(other.variables)) + cpt = {event_values(e, variables): self.p(e) * other.p(e) + for e in all_events(variables, bn, {})} + return Factor(variables, cpt) + + def sum_out(self, var, bn): + """Make a factor eliminating var by summing over its values.""" + variables = [X for X in self.variables if X != var] + cpt = {event_values(e, variables): sum(self.p(extend(e, var, val)) + for val in bn.variable_values(var)) + for e in all_events(variables, bn, {})} + return Factor(variables, cpt) + + def normalize(self): + """Return my probabilities; must be down to one variable.""" + assert len(self.variables) == 1 + return ProbDist(self.variables[0], + {k: v for ((k,), v) in self.cpt.items()}) + + def p(self, e): + """Look up my value tabulated for e.""" + return self.cpt[event_values(e, self.variables)] + + +def all_events(variables, bn, e): + """Yield every way of extending e with values for all variables.""" + if not variables: + yield e + else: + X, rest = variables[0], variables[1:] + for e1 in all_events(rest, bn, e): + for x in bn.variable_values(X): + yield 
extend(e1, X, x) + + +# ______________________________________________________________________________ +# 13.3.4 Clustering algorithms +# [Figure 13.14a]: sprinkler network + + +sprinkler = BayesNet([ + ('Cloudy', '', 0.5), + ('Sprinkler', 'Cloudy', {T: 0.10, F: 0.50}), + ('Rain', 'Cloudy', {T: 0.80, F: 0.20}), + ('WetGrass', 'Sprinkler Rain', + {(T, T): 0.99, (T, F): 0.90, (F, T): 0.90, (F, F): 0.00})]) + + +# ______________________________________________________________________________ +# 13.4 Approximate Inference for Bayesian Networks +# 13.4.1 Direct sampling methods + + +def prior_sample(bn): + """ + Randomly sample from bn's full joint distribution. The result + is a {variable: value} dict. [Figure 13.15] + """ + event = {} + for node in bn.nodes: + event[node.variable] = node.sample(event) + return event + + +# _________________________________________________________________________ + + +def rejection_sampling(X, e, bn, N=10000): + """ + [Figure 13.16] + Estimate the probability distribution of variable X given + evidence e in BayesNet bn, using N samples. + Raises a ZeroDivisionError if all the N samples are rejected, + i.e., inconsistent with e. + >>> random.seed(47) + >>> rejection_sampling('Burglary', dict(JohnCalls=T, MaryCalls=T), + ... burglary, 10000).show_approx() + 'False: 0.7, True: 0.3' + """ + counts = {x: 0 for x in bn.variable_values(X)} # bold N in [Figure 13.16] + for j in range(N): + sample = prior_sample(bn) # boldface x in [Figure 13.16] + if consistent_with(sample, e): + counts[sample[X]] += 1 + return ProbDist(X, counts) + + +def consistent_with(event, evidence): + """Is event consistent with the given evidence?""" + return all(evidence.get(k, v) == v + for k, v in event.items()) + + +# _________________________________________________________________________ + + +def likelihood_weighting(X, e, bn, N=10000): + """ + [Figure 13.17] + Estimate the probability distribution of variable X given + evidence e in BayesNet bn. 
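The rejection-sampling loop above can be exercised standalone on the same hypothetical two-variable Rain/Wet network used for hand-checking (all names and numbers here are illustrative, not the module's `burglary` network):

```python
import random


def sample_rain_wet(rng):
    """Forward-sample one event from the toy Rain -> Wet network."""
    rain = rng.random() < 0.2
    wet = rng.random() < (0.9 if rain else 0.1)
    return {'Rain': rain, 'Wet': wet}


def rejection_sample_rain_given_wet(n=20000, seed=0):
    """Estimate P(Rain | Wet=True) by discarding samples with Wet=False."""
    rng = random.Random(seed)
    kept = rain_count = 0
    for _ in range(n):
        s = sample_rain_wet(rng)
        if s['Wet']:            # reject samples inconsistent with the evidence
            kept += 1
            rain_count += s['Rain']
    if kept == 0:
        raise ZeroDivisionError('all samples rejected')
    return rain_count / kept


estimate = rejection_sample_rain_given_wet()
```

The estimate converges toward the exact posterior 0.18/0.26 ≈ 0.69; note that only about 26% of samples survive rejection here, which is why the method degrades badly as evidence becomes less likely.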
+    >>> random.seed(1017)
+    >>> likelihood_weighting('Burglary', dict(JohnCalls=T, MaryCalls=T),
+    ...   burglary, 10000).show_approx()
+    'False: 0.702, True: 0.298'
+    """
+
+    W = {x: 0 for x in bn.variable_values(X)}
+    for j in range(N):
+        sample, weight = weighted_sample(bn, e)  # boldface x, w in [Figure 13.17]
+        W[sample[X]] += weight
+    return ProbDist(X, W)
+
+
+def weighted_sample(bn, e):
+    """
+    Sample an event from bn that's consistent with the evidence e;
+    return the event and its weight, the likelihood that the event
+    accords with the evidence.
+    """
+
+    w = 1
+    event = dict(e)  # boldface x in [Figure 13.17]
+    for node in bn.nodes:
+        Xi = node.variable
+        if Xi in e:
+            w *= node.p(e[Xi], event)
+        else:
+            event[Xi] = node.sample(event)
+    return event, w
+
+
+# _________________________________________________________________________
+# 13.4.2 Inference by Markov chain simulation
+
+
+def gibbs_ask(X, e, bn, N=1000):
+    """[Figure 13.19]"""
+    assert X not in e, "Query variable must be distinct from evidence"
+    counts = {x: 0 for x in bn.variable_values(X)}  # bold N in [Figure 13.19]
+    Z = [var for var in bn.variables if var not in e]
+    state = dict(e)  # boldface x in [Figure 13.19]
+    for Zi in Z:
+        state[Zi] = random.choice(bn.variable_values(Zi))
+    for j in range(N):
+        for Zi in Z:
+            state[Zi] = markov_blanket_sample(Zi, state, bn)
+            counts[state[X]] += 1
+    return ProbDist(X, counts)
+
+
+def markov_blanket_sample(X, e, bn):
+    """
+    Return a sample from P(X | mb) where mb denotes that the
+    variables in the Markov blanket of X take their values from event
+    e (which must assign a value to each). The Markov blanket of X is
+    X's parents, children, and children's parents.
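Likelihood weighting avoids rejection entirely: evidence variables are fixed to their observed values and each sample is weighted by the probability of that evidence. A hedged standalone sketch on the same illustrative Rain/Wet network (names and numbers are assumptions, not the module's network):

```python
import random


def likelihood_weight_rain_given_wet(n=20000, seed=0):
    """Estimate P(Rain | Wet=True): sample only the non-evidence
    variable and weight each sample by P(Wet=True | rain)."""
    rng = random.Random(seed)
    w_rain = w_total = 0.0
    for _ in range(n):
        rain = rng.random() < 0.2   # sample the non-evidence variable Rain
        w = 0.9 if rain else 0.1    # weight = P(Wet=True | rain)
        w_total += w
        if rain:
            w_rain += w
    return w_rain / w_total


estimate = likelihood_weight_rain_given_wet()
```

Every sample contributes here, so for a fixed N the variance is typically lower than rejection sampling's; the estimate again converges to 0.18/0.26 ≈ 0.69.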
+    """
+    Xnode = bn.variable_node(X)
+    Q = ProbDist(X)
+    for xi in bn.variable_values(X):
+        ei = extend(e, X, xi)
+        # [Equation 13.12:]
+        Q[xi] = Xnode.p(xi, e) * product(Yj.p(ei[Yj.variable], ei)
+                                         for Yj in Xnode.children)
+    # (assuming a Boolean variable here)
+    return probability(Q.normalize()[True])
+
+
+# _________________________________________________________________________
+# 13.4.3 Compiling approximate inference
+
+
+class compiled_burglary:
+    """Compiled version of the burglary network."""
+
+    def Burglary(self, sample):
+        if sample['Alarm']:
+            if sample['Earthquake']:
+                return probability(0.00327)
+            else:
+                return probability(0.485)
+        else:
+            if sample['Earthquake']:
+                return probability(7.05e-05)
+            else:
+                return probability(6.01e-05)
+
+    def Earthquake(self, sample):
+        if sample['Alarm']:
+            if sample['Burglary']:
+                return probability(0.0020212)
+            else:
+                return probability(0.36755)
+        else:
+            if sample['Burglary']:
+                return probability(0.0016672)
+            else:
+                return probability(0.0014222)
+
+    def MaryCalls(self, sample):
+        if sample['Alarm']:
+            return probability(0.7)
+        else:
+            return probability(0.01)
+
+    def JohnCalls(self, sample):
+        if sample['Alarm']:
+            return probability(0.9)
+        else:
+            return probability(0.05)
+
+    def Alarm(self, sample):
+        raise NotImplementedError
diff --git a/pytest.ini b/pytest.ini
index 7d983c3fc..1561b6fe6 100644
--- a/pytest.ini
+++ b/pytest.ini
@@ -1,3 +1,5 @@
 [pytest]
 filterwarnings =
-    ignore::ResourceWarning
+    ignore::DeprecationWarning
+    ignore::UserWarning
+    ignore::RuntimeWarning
diff --git a/reinforcement_learning.ipynb b/reinforcement_learning.ipynb
new file mode 100644
index 000000000..ee3b6a5eb
--- /dev/null
+++ b/reinforcement_learning.ipynb
@@ -0,0 +1,644 @@
+{
+ "cells": [
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "# Reinforcement Learning\n",
+    "\n",
+    "This Jupyter notebook acts as supporting material for **Chapter 21 Reinforcement Learning** of the book *Artificial Intelligence: A Modern
Approach*. This notebook makes use of the implementations in the `reinforcement_learning.py` module. We also make use of the implementation of MDPs in the `mdp.py` module to test our agents. It might be helpful to first go through the Jupyter notebook dealing with Markov decision processes. Let us import everything from the `reinforcement_learning` module. It might be helpful to view the source of some of our implementations. Please refer to the Introductory Jupyter notebook for more details."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {
+    "collapsed": true
+   },
+   "outputs": [],
+   "source": [
+    "from reinforcement_learning import *"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## CONTENTS\n",
+    "\n",
+    "* Overview\n",
+    "* Passive Reinforcement Learning\n",
+    "    - Direct Utility Estimation\n",
+    "    - Adaptive Dynamic Programming\n",
+    "    - Temporal-Difference Agent\n",
+    "* Active Reinforcement Learning\n",
+    "    - Q-Learning"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {
+    "collapsed": true
+   },
+   "source": [
+    "## OVERVIEW\n",
+    "\n",
+    "Before we start playing with the actual implementations, let us review a couple of things about RL.\n",
+    "\n",
+    "1. Reinforcement Learning is concerned with how software agents ought to take actions in an environment so as to maximize some notion of cumulative reward.\n",
+    "\n",
+    "2. Reinforcement learning differs from standard supervised learning in that correct input/output pairs are never presented, nor sub-optimal actions explicitly corrected. Further, there is a focus on on-line performance, which involves finding a balance between exploration (of uncharted territory) and exploitation (of current knowledge).\n",
+    "\n",
+    "-- Source: [Wikipedia](https://en.wikipedia.org/wiki/Reinforcement_learning)\n",
+    "\n",
+    "In summary, we have a sequence of state-action transitions with rewards associated with some states.
Our goal is to find the optimal policy $\\pi$ which tells us what action to take in each state."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## PASSIVE REINFORCEMENT LEARNING\n",
+    "\n",
+    "In passive Reinforcement Learning the agent follows a fixed policy $\\pi$. Passive learning attempts to evaluate the given policy $\\pi$ - without any knowledge of the reward function $R(s)$ and the transition model $P(s'\\ |\\ s, a)$.\n",
+    "\n",
+    "This is usually done by some method of **utility estimation**. The agent attempts to directly learn the utility of each state that would result from following the policy. Note that at each step, it has to *perceive* the reward and the state - it has no global knowledge of these. Thus, if the entire set of actions offers a very low probability of attaining some state $s_+$, the agent may never perceive the reward $R(s_+)$.\n",
+    "\n",
+    "Consider a situation where an agent is given a policy to follow. Thus, at any point it knows only its current state and current reward, and the action it must take next. This action may lead it to more than one state, with different probabilities.\n",
+    "\n",
+    "For a series of actions given by $\\pi$, the estimated utility $U$ is:\n",
+    "$$U^{\\pi}(s) = E\\left[\\sum_{t=0}^{\\infty} \\gamma^t R(s_t)\\right]$$\n",
+    "i.e., the expected value of the summed discounted rewards until termination.\n",
+    "\n",
+    "Based on this concept, we discuss three methods of estimating utility:\n",
+    "\n",
+    "1. **Direct Utility Estimation (DUE)**\n",
+    "    \n",
+    "    The first, most naive method of estimating utility comes from the simplest interpretation of the above definition. We construct an agent that follows the policy until it reaches the terminal state. At each step, it logs its current state and reward.
Once it reaches the terminal state, it can estimate the utility for each state for *that* iteration, by simply summing the discounted rewards from that state to the terminal one.\n",
+    "\n",
+    "    It can now run this 'simulation' $n$ times, and calculate the average utility of each state. If a state occurs more than once in a simulation, each of its utility values is counted separately.\n",
+    "    \n",
+    "    Note that this method may be prohibitively slow for very large state spaces. Besides, **it pays no attention to the transition probability $P(s'\\ |\\ s, a)$.** It misses out on information that it is capable of collecting (say, by recording the number of times an action from one state led to another state). The next method addresses this issue.\n",
+    "    \n",
+    "2. **Adaptive Dynamic Programming (ADP)**\n",
+    "    \n",
+    "    This method makes use of knowledge of the past state $s$, the action $a$, and the new perceived state $s'$ to estimate the transition probability $P(s'\\ |\\ s,a)$. It does this by simply counting the new states resulting from previous states and actions.
    \n", + " The program runs through the policy a number of times, keeping track of:\n", + " - each occurrence of state $s$ and the policy-recommended action $a$ in $N_{sa}$\n", + " - each occurrence of $s'$ resulting from $a$ on $s$ in $N_{s'|sa}$.\n", + " \n", + " It can thus estimate $P(s'\\ |\\ s,a)$ as $N_{s'|sa}/N_{sa}$, which in the limit of infinite trials, will converge to the true value.
    \n", + " Using the transition probabilities thus estimated, it can apply `POLICY-EVALUATION` to estimate the utilities $U(s)$ using properties of convergence of the Bellman functions.\n", + "\n", + "3. **Temporal-difference learning (TD)**\n", + " \n", + " Instead of explicitly building the transition model $P$, the temporal-difference model makes use of the expected closeness between the utilities of two consecutive states $s$ and $s'$.\n", + " For the transition $s$ to $s'$, the update is written as:\n", + "$$U^{\\pi}(s) \\leftarrow U^{\\pi}(s) + \\alpha \\left( R(s) + \\gamma U^{\\pi}(s') - U^{\\pi}(s) \\right)$$\n", + " This model implicitly incorporates the transition probabilities by being weighed for each state by the number of times it is achieved from the current state. Thus, over a number of iterations, it converges similarly to the Bellman equations.\n", + " The advantage of the TD learning model is its relatively simple computation at each step, rather than having to keep track of various counts.\n", + " For $n_s$ states and $n_a$ actions the ADP model would have $n_s \\times n_a$ numbers $N_{sa}$ and $n_s^2 \\times n_a$ numbers $N_{s'|sa}$ to keep track of. The TD model must only keep track of a utility $U(s)$ for each state." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Demonstrating Passive agents\n", + "\n", + "Passive agents are implemented in `rl.py` as various `Agent-Class`es.\n", + "\n", + "To demonstrate these agents, we make use of the `GridMDP` object from the `MDP` module. `sequential_decision_environment` is similar to that used for the `MDP` notebook but has discounting with $\\gamma = 0.9$.\n", + "\n", + "The `Agent-Program` can be obtained by creating an instance of the relevant `Agent-Class`. The `__call__` method allows the `Agent-Class` to be called as a function. The class needs to be instantiated with a policy ($\\pi$) and an `MDP` whose utility of states will be estimated." 
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {
+    "collapsed": true
+   },
+   "outputs": [],
+   "source": [
+    "from mdp import sequential_decision_environment"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "The `sequential_decision_environment` is a GridMDP object as shown below. The rewards are **+1** and **-1** in the terminal states, and **-0.04** in the rest. Now we define actions and a policy similar to **Fig 21.1** in the book."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {
+    "collapsed": true
+   },
+   "outputs": [],
+   "source": [
+    "# Action Directions\n",
+    "north = (0, 1)\n",
+    "south = (0,-1)\n",
+    "west = (-1, 0)\n",
+    "east = (1, 0)\n",
+    "\n",
+    "policy = {\n",
+    "    (0, 2): east, (1, 2): east, (2, 2): east, (3, 2): None,\n",
+    "    (0, 1): north, (2, 1): north, (3, 1): None,\n",
+    "    (0, 0): north, (1, 0): west, (2, 0): west, (3, 0): west,\n",
+    "}\n"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Direct Utility Estimation Agent\n",
+    "\n",
+    "The `PassiveDUEAgent` class in the `reinforcement_learning` module implements the Agent Program for direct utility estimation. `PassiveDUEAgent` sums over rewards to find the estimated utility for each state. It thus requires running a number of trials."
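The quantity a DUE agent records is easy to reproduce by hand. A minimal sketch, for one trial only (the state names, rewards, and helper function are illustrative assumptions, not the module's implementation):

```python
def reward_to_go(states, rewards, gamma=0.9):
    """Map each visited state to its list of discounted reward-to-go
    samples collected during a single trial."""
    samples = {}
    for i, s in enumerate(states):
        # Discounted sum of rewards from this visit to the end of the trial.
        g = sum((gamma ** (t - i)) * rewards[t] for t in range(i, len(rewards)))
        samples.setdefault(s, []).append(g)
    return samples


# One illustrative trial: visit A, B, A, then the terminal state T.
trial = reward_to_go(['A', 'B', 'A', 'T'], [-0.04, -0.04, -0.04, 1.0])
```

Note that the state `A` is visited twice, so it contributes two separate utility samples; DUE then averages such samples across all trials.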
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "%psource PassiveDUEAgent" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "DUEagent = PassiveDUEAgent(policy, sequential_decision_environment)\n", + "for i in range(200):\n", + " run_single_trial(DUEagent, sequential_decision_environment)\n", + " DUEagent.estimate_U()\n", + "\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The calculated utilities are:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "print('\\n'.join([str(k)+':'+str(v) for k, v in DUEagent.U.items()]))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Adaptive Dynamic Programming Agent\n", + "\n", + "The `PassiveADPAgent` class in the `rl` module implements the Agent Program described in **Fig 21.2** of the AIMA Book. `PassiveADPAgent` uses state transition and occurrence counts to estimate $P$, and then $U$. Go through the source below to understand the agent." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "%psource PassiveADPAgent" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We instantiate a `PassiveADPAgent` below with the `GridMDP` shown and train it over 200 iterations. The `rl` module has a simple implementation to simulate iterations. The function is called **run_single_trial**." 
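The counting at the heart of ADP — estimating $P(s'\,|\,s,a)$ as $N_{s'|sa}/N_{sa}$ — can be sketched on its own. The transition triples below are illustrative, not taken from the GridMDP:

```python
from collections import defaultdict

Nsa = defaultdict(int)     # N_{sa}: times action a was taken in state s
Ns1_sa = defaultdict(int)  # N_{s'|sa}: times that (s, a) led to outcome s'


def record(s, a, s1):
    """Log one observed transition."""
    Nsa[(s, a)] += 1
    Ns1_sa[(s1, s, a)] += 1


def estimated_P(s, a):
    """Estimate P(s' | s, a) as N_{s'|sa} / N_{sa}."""
    return {s1: n / Nsa[(s, a)]
            for (s1, s_, a_), n in Ns1_sa.items() if (s_, a_) == (s, a)}


for outcome in ['B', 'B', 'B', 'C']:   # four observed outcomes of ('A', 'east')
    record('A', 'east', outcome)
P = estimated_P('A', 'east')           # {'B': 0.75, 'C': 0.25}
```

`PassiveADPAgent` maintains exactly these two count tables (`Nsa`, `Ns1_sa`) and then hands the estimated model to policy evaluation.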
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "ADPagent = PassiveADPAgent(policy, sequential_decision_environment)\n", + "for i in range(200):\n", + " run_single_trial(ADPagent, sequential_decision_environment)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The calculated utilities are:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "print('\\n'.join([str(k)+':'+str(v) for k, v in ADPagent.U.items()]))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Passive Temporal Difference Agent\n", + "\n", + "`PassiveTDAgent` uses temporal differences to learn utility estimates. We learn the difference between the states and backup the values to previous states. Let us look into the source before we see some usage examples." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "%psource PassiveTDAgent" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In creating the `TDAgent`, we use the **same learning rate** $\\alpha$ as given in the footnote of the book on **page 837**." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "TDagent = PassiveTDAgent(policy, sequential_decision_environment, alpha = lambda n: 60./(59+n))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now we run **200 trials** for the agent to estimate Utilities." 
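The TD update rule given earlier is a one-liner per observed transition. A hedged sketch, applied to a single hand-picked transition (state names, utilities, and the fixed `alpha` are illustrative):

```python
def td_update(U, s, s1, reward, alpha=0.5, gamma=0.9):
    """Apply U(s) <- U(s) + alpha * (R(s) + gamma * U(s') - U(s))
    for one observed transition s -> s'."""
    U[s] += alpha * (reward + gamma * U[s1] - U[s])
    return U


U = {'A': 0.0, 'B': 1.0}
td_update(U, 'A', 'B', -0.04)   # one observed transition A -> B
```

After the update, `U['A']` moves halfway (since `alpha=0.5`) toward the target `-0.04 + 0.9 * U['B']`, i.e. to 0.43 — no transition counts are needed, which is the advantage over ADP noted above.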
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "for i in range(200):\n", + " run_single_trial(TDagent,sequential_decision_environment)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The calculated utilities are:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "print('\\n'.join([str(k)+':'+str(v) for k, v in TDagent.U.items()]))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Comparison with value iteration method\n", + "\n", + "We can also compare the utility estimates learned by our agent to those obtained via **value iteration**.\n", + "\n", + "**Note that value iteration has a priori knowledge of the transition table $P$, the rewards $R$, and all the states $s$.**" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "from mdp import value_iteration" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The values calculated by value iteration:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "U_values = value_iteration(sequential_decision_environment)\n", + "print('\\n'.join([str(k)+':'+str(v) for k, v in U_values.items()]))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Evolution of utility estimates over iterations\n", + "\n", + "We can explore how these estimates vary with time by using plots similar to **Fig 21.5a**. We will first enable matplotlib using the inline backend. We also define a function to collect the values of utilities at each iteration." 
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {
+    "collapsed": true
+   },
+   "outputs": [],
+   "source": [
+    "%matplotlib inline\n",
+    "import matplotlib.pyplot as plt\n",
+    "\n",
+    "def graph_utility_estimates(agent_program, mdp, no_of_iterations, states_to_graph):\n",
+    "    graphs = {state: [] for state in states_to_graph}\n",
+    "    for iteration in range(1, no_of_iterations + 1):\n",
+    "        run_single_trial(agent_program, mdp)\n",
+    "        for state in states_to_graph:\n",
+    "            graphs[state].append((iteration, agent_program.U[state]))\n",
+    "    for state, value in graphs.items():\n",
+    "        state_x, state_y = zip(*value)\n",
+    "        plt.plot(state_x, state_y, label=str(state))\n",
+    "    plt.ylim([0, 1.2])\n",
+    "    plt.legend(loc='lower right')\n",
+    "    plt.xlabel('Iterations')\n",
+    "    plt.ylabel('U')"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Here is a plot of state $(2,2)$."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "agent = PassiveTDAgent(policy, sequential_decision_environment, alpha=lambda n: 60./(59+n))\n",
+    "graph_utility_estimates(agent, sequential_decision_environment, 500, [(2,2)])"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "It is also possible to plot multiple states on the same plot. As expected, the utility of the terminal state $(3,2)$ stays constant and is equal to $R((3,2)) = 1$."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "graph_utility_estimates(agent, sequential_decision_environment, 500, [(2,2), (3,2)])"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {
+    "collapsed": true
+   },
+   "source": [
+    "## ACTIVE REINFORCEMENT LEARNING\n",
+    "\n",
+    "Unlike Passive Reinforcement Learning, in Active Reinforcement Learning we are not bound by a policy $\\pi$ and we need to select our actions. In other words, the agent needs to learn an optimal policy.
The fundamental tradeoff the agent needs to face is that of exploration vs. exploitation. "
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### QLearning Agent\n",
+    "\n",
+    "The `QLearningAgent` class in the `reinforcement_learning` module implements the Agent Program described in **Fig 21.8** of the AIMA Book. In Q-Learning the agent learns an action-value function Q which gives the utility of taking a given action in a particular state. Q-Learning does not require a transition model and hence is a model-free method. Let us look into the source before we see some usage examples."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {
+    "collapsed": true
+   },
+   "outputs": [],
+   "source": [
+    "%psource QLearningAgent"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "The Agent Program can be obtained by creating an instance of the class and passing the appropriate parameters. Because of the `__call__` method, the object that is created behaves like a callable and returns an appropriate action, as most Agent Programs do. To instantiate the object we need an MDP, similar to the PassiveTDAgent.\n",
+    "\n",
+    "Let us use the same GridMDP object we used above. **Figure 17.1 (sequential_decision_environment)** is similar to **Figure 21.1** but has discounting with **gamma = 0.9**. The class also implements an exploration function **f** which returns a fixed **Rplus** until the agent has visited a state-action pair **Ne** number of times. This is the same as the one defined on page **842** of the book. The method **actions_in_state** returns the actions possible in a given state. It is useful when applying max and argmax operations."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Let us create our object now. We also use the **same alpha** as given in the footnote of the book on **page 837**. We use **Rplus = 2** and **Ne = 5** as defined on page 843.
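The two pieces just described — the optimistic exploration function and the Q-update — can be sketched in isolation. A hedged, self-contained sketch (state names, the transition, and the fixed `alpha` are illustrative, not the module's agent):

```python
from collections import defaultdict

Rplus, Ne = 2.0, 5


def f(u, n):
    """Exploration function: return the optimistic Rplus until
    the state-action pair has been tried Ne times, then u."""
    return Rplus if n < Ne else u


def q_update(Q, s, a, r, s1, actions, alpha=0.5, gamma=0.9):
    """Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
    best_next = max(Q[(s1, a1)] for a1 in actions)
    Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
    return Q


Q = defaultdict(float)
Q[('B', 'east')] = 1.0                               # assumed prior estimate
q_update(Q, 'A', 'east', -0.04, 'B', ['east', 'west'])
```

With `alpha=0.5`, `Q[('A', 'east')]` moves halfway toward the target `-0.04 + 0.9 * 1.0`, landing at 0.43; `f` is what pushes the real agent to keep trying under-explored actions.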
(See **Fig 21.7**.)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {
+    "collapsed": true
+   },
+   "outputs": [],
+   "source": [
+    "q_agent = QLearningAgent(sequential_decision_environment, Ne=5, Rplus=2, \n",
+    "                         alpha=lambda n: 60./(59+n))"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Now to try out `q_agent` we make use of the **run_single_trial** function in `reinforcement_learning.py` (which was also used above). Let us use **200** iterations."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {
+    "collapsed": true
+   },
+   "outputs": [],
+   "source": [
+    "for i in range(200):\n",
+    "    run_single_trial(q_agent,sequential_decision_environment)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Now let us see the Q values. The keys are state-action pairs, where the actions correspond to:\n",
+    "\n",
+    "north = (0, 1)\n",
+    "south = (0,-1)\n",
+    "west = (-1, 0)\n",
+    "east = (1, 0)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "q_agent.Q"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "The utility **U** of each state is related to **Q** by the following equation:\n",
+    "\n",
+    "$$U(s) = \\max_a Q(s, a)$$\n",
+    "\n",
+    "Let us convert the Q values above into U estimates.\n",
+    "\n"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {
+    "collapsed": true
+   },
+   "outputs": [],
+   "source": [
+    "U = defaultdict(lambda: -1000.
# Very Large Negative Value for Comparison see below.\n", + "for state_action, value in q_agent.Q.items():\n", + " state, action = state_action\n", + " if U[state] < value:\n", + " U[state] = value" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "U" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let us finally compare these estimates to value_iteration results." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "print(value_iteration(sequential_decision_environment))" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.3" + }, + "pycharm": { + "stem_cell": { + "cell_type": "raw", + "source": [], + "metadata": { + "collapsed": false + } + } + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} \ No newline at end of file diff --git a/reinforcement_learning.py b/reinforcement_learning.py new file mode 100644 index 000000000..4cb91af0f --- /dev/null +++ b/reinforcement_learning.py @@ -0,0 +1,337 @@ +"""Reinforcement Learning (Chapter 21)""" + +import random +from collections import defaultdict + +from mdp import MDP, policy_evaluation + + +class PassiveDUEAgent: + """ + Passive (non-learning) agent that uses direct utility estimation + on a given MDP and policy. 
+ + import sys + from mdp import sequential_decision_environment + north = (0, 1) + south = (0,-1) + west = (-1, 0) + east = (1, 0) + policy = {(0, 2): east, (1, 2): east, (2, 2): east, (3, 2): None, (0, 1): north, (2, 1): north, + (3, 1): None, (0, 0): north, (1, 0): west, (2, 0): west, (3, 0): west,} + agent = PassiveDUEAgent(policy, sequential_decision_environment) + for i in range(200): + run_single_trial(agent,sequential_decision_environment) + agent.estimate_U() + agent.U[(0, 0)] > 0.2 + True + """ + + def __init__(self, pi, mdp): + self.pi = pi + self.mdp = mdp + self.U = {} + self.s = None + self.a = None + self.s_history = [] + self.r_history = [] + self.init = mdp.init + + def __call__(self, percept): + s1, r1 = percept + self.s_history.append(s1) + self.r_history.append(r1) + ## + ## + if s1 in self.mdp.terminals: + self.s = self.a = None + else: + self.s, self.a = s1, self.pi[s1] + return self.a + + def estimate_U(self): + # this function can be called only if the MDP has reached a terminal state + # it will also reset the mdp history + assert self.a is None, 'MDP is not in terminal state' + assert len(self.s_history) == len(self.r_history) + # calculating the utilities based on the current iteration + U2 = {s: [] for s in set(self.s_history)} + for i in range(len(self.s_history)): + s = self.s_history[i] + U2[s] += [sum(self.r_history[i:])] + U2 = {k: sum(v) / max(len(v), 1) for k, v in U2.items()} + # resetting history + self.s_history, self.r_history = [], [] + # setting the new utilities to the average of the previous + # iteration and this one + for k in U2.keys(): + if k in self.U.keys(): + self.U[k] = (self.U[k] + U2[k]) / 2 + else: + self.U[k] = U2[k] + return self.U + + def update_state(self, percept): + """To be overridden in most cases. 
The default case + assumes the percept to be of type (state, reward)""" + return percept + + +class PassiveADPAgent: + """ + [Figure 21.2] + Passive (non-learning) agent that uses adaptive dynamic programming + on a given MDP and policy. + + import sys + from mdp import sequential_decision_environment + north = (0, 1) + south = (0,-1) + west = (-1, 0) + east = (1, 0) + policy = {(0, 2): east, (1, 2): east, (2, 2): east, (3, 2): None, (0, 1): north, (2, 1): north, + (3, 1): None, (0, 0): north, (1, 0): west, (2, 0): west, (3, 0): west,} + agent = PassiveADPAgent(policy, sequential_decision_environment) + for i in range(100): + run_single_trial(agent,sequential_decision_environment) + + agent.U[(0, 0)] > 0.2 + True + agent.U[(0, 1)] > 0.2 + True + """ + + class ModelMDP(MDP): + """Class for implementing modified Version of input MDP with + an editable transition model P and a custom function T.""" + + def __init__(self, init, actlist, terminals, gamma, states): + super().__init__(init, actlist, terminals, states=states, gamma=gamma) + nested_dict = lambda: defaultdict(nested_dict) + # StackOverflow:whats-the-best-way-to-initialize-a-dict-of-dicts-in-python + self.P = nested_dict() + + def T(self, s, a): + """Return a list of tuples with probabilities for states + based on the learnt model P.""" + return [(prob, res) for (res, prob) in self.P[(s, a)].items()] + + def __init__(self, pi, mdp): + self.pi = pi + self.mdp = PassiveADPAgent.ModelMDP(mdp.init, mdp.actlist, + mdp.terminals, mdp.gamma, mdp.states) + self.U = {} + self.Nsa = defaultdict(int) + self.Ns1_sa = defaultdict(int) + self.s = None + self.a = None + self.visited = set() # keeping track of visited states + + def __call__(self, percept): + s1, r1 = percept + mdp = self.mdp + R, P, terminals, pi = mdp.reward, mdp.P, mdp.terminals, self.pi + s, a, Nsa, Ns1_sa, U = self.s, self.a, self.Nsa, self.Ns1_sa, self.U + + if s1 not in self.visited: # Reward is only known for visited state. 
+ U[s1] = R[s1] = r1 + self.visited.add(s1) + if s is not None: + Nsa[(s, a)] += 1 + Ns1_sa[(s1, s, a)] += 1 + # for each t such that Ns′|sa [t, s, a] is nonzero + for t in [res for (res, state, act), freq in Ns1_sa.items() + if (state, act) == (s, a) and freq != 0]: + P[(s, a)][t] = Ns1_sa[(t, s, a)] / Nsa[(s, a)] + + self.U = policy_evaluation(pi, U, mdp) + ## + ## + self.Nsa, self.Ns1_sa = Nsa, Ns1_sa + if s1 in terminals: + self.s = self.a = None + else: + self.s, self.a = s1, self.pi[s1] + return self.a + + def update_state(self, percept): + """To be overridden in most cases. The default case + assumes the percept to be of type (state, reward).""" + return percept + + +class PassiveTDAgent: + """ + [Figure 21.4] + The abstract class for a Passive (non-learning) agent that uses + temporal differences to learn utility estimates. Override update_state + method to convert percept to state and reward. The mdp being provided + should be an instance of a subclass of the MDP Class. + + import sys + from mdp import sequential_decision_environment + north = (0, 1) + south = (0,-1) + west = (-1, 0) + east = (1, 0) + policy = {(0, 2): east, (1, 2): east, (2, 2): east, (3, 2): None, (0, 1): north, (2, 1): north, + (3, 1): None, (0, 0): north, (1, 0): west, (2, 0): west, (3, 0): west,} + agent = PassiveTDAgent(policy, sequential_decision_environment, alpha=lambda n: 60./(59+n)) + for i in range(200): + run_single_trial(agent,sequential_decision_environment) + + agent.U[(0, 0)] > 0.2 + True + agent.U[(0, 1)] > 0.2 + True + """ + + def __init__(self, pi, mdp, alpha=None): + + self.pi = pi + self.U = {s: 0. 
for s in mdp.states} + self.Ns = {s: 0 for s in mdp.states} + self.s = None + self.a = None + self.r = None + self.gamma = mdp.gamma + self.terminals = mdp.terminals + + if alpha: + self.alpha = alpha + else: + self.alpha = lambda n: 1 / (1 + n) # udacity video + + def __call__(self, percept): + s1, r1 = self.update_state(percept) + pi, U, Ns, s, r = self.pi, self.U, self.Ns, self.s, self.r + alpha, gamma, terminals = self.alpha, self.gamma, self.terminals + if not Ns[s1]: + U[s1] = r1 + if s is not None: + Ns[s] += 1 + U[s] += alpha(Ns[s]) * (r + gamma * U[s1] - U[s]) + if s1 in terminals: + self.s = self.a = self.r = None + else: + self.s, self.a, self.r = s1, pi[s1], r1 + return self.a + + def update_state(self, percept): + """To be overridden in most cases. The default case + assumes the percept to be of type (state, reward).""" + return percept + + +class QLearningAgent: + """ + [Figure 21.8] + An exploratory Q-learning agent. It avoids having to learn the transition + model because the Q-value of a state can be related directly to those of + its neighbors. 
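The TD backup that `PassiveTDAgent.__call__` applies above reduces to `U[s] += alpha(Ns[s]) * (r + gamma * U[s1] - U[s])`. A minimal standalone sketch of that update on a toy two-state chain (toy rewards, not the grid MDP used in the docstrings, and not using the classes in this patch):

```python
# Sketch of the passive TD update, applied repeatedly to the single observed
# transition A -> B, where B is terminal with utility 1 and the step reward is -0.04.
gamma = 0.9

def td_update(U, Ns, s, r, s1, alpha=lambda n: 1 / (1 + n)):
    """Apply one temporal-difference backup to U[s]."""
    Ns[s] += 1
    U[s] += alpha(Ns[s]) * (r + gamma * U[s1] - U[s])

U = {'A': 0.0, 'B': 1.0}   # terminal utility already known
Ns = {'A': 0, 'B': 0}
for _ in range(1000):       # replay the same observed transition
    td_update(U, Ns, 'A', -0.04, 'B')

# U['A'] approaches the fixed point r + gamma * U['B'] = -0.04 + 0.9 = 0.86
```

With the decaying `alpha(n) = 1/(1 + n)` the error shrinks like 1/n, so after 1000 backups `U['A']` sits within about 0.001 of 0.86.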
+ + import sys + from mdp import sequential_decision_environment + north = (0, 1) + south = (0,-1) + west = (-1, 0) + east = (1, 0) + policy = {(0, 2): east, (1, 2): east, (2, 2): east, (3, 2): None, (0, 1): north, (2, 1): north, + (3, 1): None, (0, 0): north, (1, 0): west, (2, 0): west, (3, 0): west,} + q_agent = QLearningAgent(sequential_decision_environment, Ne=5, Rplus=2, alpha=lambda n: 60./(59+n)) + for i in range(200): + run_single_trial(q_agent,sequential_decision_environment) + + q_agent.Q[((0, 1), (0, 1))] >= -0.5 + True + q_agent.Q[((1, 0), (0, -1))] <= 0.5 + True + """ + + def __init__(self, mdp, Ne, Rplus, alpha=None): + + self.gamma = mdp.gamma + self.terminals = mdp.terminals + self.all_act = mdp.actlist + self.Ne = Ne # iteration limit in exploration function + self.Rplus = Rplus # large value to assign before iteration limit + self.Q = defaultdict(float) + self.Nsa = defaultdict(float) + self.s = None + self.a = None + self.r = None + + if alpha: + self.alpha = alpha + else: + self.alpha = lambda n: 1. / (1 + n) # udacity video + + def f(self, u, n): + """Exploration function. Returns fixed Rplus until + agent has visited state, action a Ne number of times. + Same as ADP agent in book.""" + if n < self.Ne: + return self.Rplus + else: + return u + + def actions_in_state(self, state): + """Return actions possible in given state. 
+ Useful for max and argmax.""" + if state in self.terminals: + return [None] + else: + return self.all_act + + def __call__(self, percept): + s1, r1 = self.update_state(percept) + Q, Nsa, s, a, r = self.Q, self.Nsa, self.s, self.a, self.r + alpha, gamma, terminals = self.alpha, self.gamma, self.terminals, + actions_in_state = self.actions_in_state + + if s in terminals: + Q[s, None] = r1 + if s is not None: + Nsa[s, a] += 1 + Q[s, a] += alpha(Nsa[s, a]) * (r + gamma * max(Q[s1, a1] + for a1 in actions_in_state(s1)) - Q[s, a]) + if s in terminals: + self.s = self.a = self.r = None + else: + self.s, self.r = s1, r1 + self.a = max(actions_in_state(s1), key=lambda a1: self.f(Q[s1, a1], Nsa[s1, a1])) + return self.a + + def update_state(self, percept): + """To be overridden in most cases. The default case + assumes the percept to be of type (state, reward).""" + return percept + + +def run_single_trial(agent_program, mdp): + """Execute trial for given agent_program + and mdp. mdp should be an instance of subclass + of mdp.MDP """ + + def take_single_action(mdp, s, a): + """ + Select outcome of taking action a + in state s. Weighted Sampling. 
+ """ + x = random.uniform(0, 1) + cumulative_probability = 0.0 + for probability_state in mdp.T(s, a): + probability, state = probability_state + cumulative_probability += probability + if x < cumulative_probability: + break + return state + + current_state = mdp.init + while True: + current_reward = mdp.R(current_state) + percept = (current_state, current_reward) + next_action = agent_program(percept) + if next_action is None: + break + current_state = take_single_action(mdp, current_state, next_action) diff --git a/reinforcement_learning4e.py b/reinforcement_learning4e.py new file mode 100644 index 000000000..eaaba3e5a --- /dev/null +++ b/reinforcement_learning4e.py @@ -0,0 +1,353 @@ +"""Reinforcement Learning (Chapter 21)""" + +import random +from collections import defaultdict + +from mdp4e import MDP, policy_evaluation + + +# _________________________________________ +# 21.2 Passive Reinforcement Learning +# 21.2.1 Direct utility estimation + + +class PassiveDUEAgent: + """ + Passive (non-learning) agent that uses direct utility estimation + on a given MDP and policy. 
+ + import sys + from mdp import sequential_decision_environment + north = (0, 1) + south = (0,-1) + west = (-1, 0) + east = (1, 0) + policy = {(0, 2): east, (1, 2): east, (2, 2): east, (3, 2): None, (0, 1): north, (2, 1): north, + (3, 1): None, (0, 0): north, (1, 0): west, (2, 0): west, (3, 0): west,} + agent = PassiveDUEAgent(policy, sequential_decision_environment) + for i in range(200): + run_single_trial(agent,sequential_decision_environment) + agent.estimate_U() + agent.U[(0, 0)] > 0.2 + True + """ + + def __init__(self, pi, mdp): + self.pi = pi + self.mdp = mdp + self.U = {} + self.s = None + self.a = None + self.s_history = [] + self.r_history = [] + self.init = mdp.init + + def __call__(self, percept): + s1, r1 = percept + self.s_history.append(s1) + self.r_history.append(r1) + ## + ## + if s1 in self.mdp.terminals: + self.s = self.a = None + else: + self.s, self.a = s1, self.pi[s1] + return self.a + + def estimate_U(self): + # this function can be called only if the MDP has reached a terminal state + # it will also reset the mdp history + assert self.a is None, 'MDP is not in terminal state' + assert len(self.s_history) == len(self.r_history) + # calculating the utilities based on the current iteration + U2 = {s: [] for s in set(self.s_history)} + for i in range(len(self.s_history)): + s = self.s_history[i] + U2[s] += [sum(self.r_history[i:])] + U2 = {k: sum(v) / max(len(v), 1) for k, v in U2.items()} + # resetting history + self.s_history, self.r_history = [], [] + # setting the new utilities to the average of the previous + # iteration and this one + for k in U2.keys(): + if k in self.U.keys(): + self.U[k] = (self.U[k] + U2[k]) / 2 + else: + self.U[k] = U2[k] + return self.U + + def update_state(self, percept): + """To be overridden in most cases. 
The default case + assumes the percept to be of type (state, reward)""" + return percept + + +# 21.2.2 Adaptive dynamic programming + + +class PassiveADPAgent: + """ + [Figure 21.2] + Passive (non-learning) agent that uses adaptive dynamic programming + on a given MDP and policy. + + import sys + from mdp import sequential_decision_environment + north = (0, 1) + south = (0,-1) + west = (-1, 0) + east = (1, 0) + policy = {(0, 2): east, (1, 2): east, (2, 2): east, (3, 2): None, (0, 1): north, (2, 1): north, + (3, 1): None, (0, 0): north, (1, 0): west, (2, 0): west, (3, 0): west,} + agent = PassiveADPAgent(policy, sequential_decision_environment) + for i in range(100): + run_single_trial(agent,sequential_decision_environment) + + agent.U[(0, 0)] > 0.2 + True + agent.U[(0, 1)] > 0.2 + True + """ + + class ModelMDP(MDP): + """Class for implementing modified Version of input MDP with + an editable transition model P and a custom function T.""" + + def __init__(self, init, actlist, terminals, gamma, states): + super().__init__(init, actlist, terminals, states=states, gamma=gamma) + nested_dict = lambda: defaultdict(nested_dict) + # StackOverflow:whats-the-best-way-to-initialize-a-dict-of-dicts-in-python + self.P = nested_dict() + + def T(self, s, a): + """Return a list of tuples with probabilities for states + based on the learnt model P.""" + return [(prob, res) for (res, prob) in self.P[(s, a)].items()] + + def __init__(self, pi, mdp): + self.pi = pi + self.mdp = PassiveADPAgent.ModelMDP(mdp.init, mdp.actlist, + mdp.terminals, mdp.gamma, mdp.states) + self.U = {} + self.Nsa = defaultdict(int) + self.Ns1_sa = defaultdict(int) + self.s = None + self.a = None + self.visited = set() # keeping track of visited states + + def __call__(self, percept): + s1, r1 = percept + mdp = self.mdp + R, P, terminals, pi = mdp.reward, mdp.P, mdp.terminals, self.pi + s, a, Nsa, Ns1_sa, U = self.s, self.a, self.Nsa, self.Ns1_sa, self.U + + if s1 not in self.visited: # Reward is only known 
for visited state. + U[s1] = R[s1] = r1 + self.visited.add(s1) + if s is not None: + Nsa[(s, a)] += 1 + Ns1_sa[(s1, s, a)] += 1 + # for each t such that Ns′|sa [t, s, a] is nonzero + for t in [res for (res, state, act), freq in Ns1_sa.items() + if (state, act) == (s, a) and freq != 0]: + P[(s, a)][t] = Ns1_sa[(t, s, a)] / Nsa[(s, a)] + + self.U = policy_evaluation(pi, U, mdp) + ## + ## + self.Nsa, self.Ns1_sa = Nsa, Ns1_sa + if s1 in terminals: + self.s = self.a = None + else: + self.s, self.a = s1, self.pi[s1] + return self.a + + def update_state(self, percept): + """To be overridden in most cases. The default case + assumes the percept to be of type (state, reward).""" + return percept + + +# 21.2.3 Temporal-difference learning + + +class PassiveTDAgent: + """ + [Figure 21.4] + The abstract class for a Passive (non-learning) agent that uses + temporal differences to learn utility estimates. Override update_state + method to convert percept to state and reward. The mdp being provided + should be an instance of a subclass of the MDP Class. + + import sys + from mdp import sequential_decision_environment + north = (0, 1) + south = (0,-1) + west = (-1, 0) + east = (1, 0) + policy = {(0, 2): east, (1, 2): east, (2, 2): east, (3, 2): None, (0, 1): north, (2, 1): north, + (3, 1): None, (0, 0): north, (1, 0): west, (2, 0): west, (3, 0): west,} + agent = PassiveTDAgent(policy, sequential_decision_environment, alpha=lambda n: 60./(59+n)) + for i in range(200): + run_single_trial(agent,sequential_decision_environment) + + agent.U[(0, 0)] > 0.2 + True + agent.U[(0, 1)] > 0.2 + True + """ + + def __init__(self, pi, mdp, alpha=None): + + self.pi = pi + self.U = {s: 0. 
for s in mdp.states} + self.Ns = {s: 0 for s in mdp.states} + self.s = None + self.a = None + self.r = None + self.gamma = mdp.gamma + self.terminals = mdp.terminals + + if alpha: + self.alpha = alpha + else: + self.alpha = lambda n: 1 / (1 + n) # udacity video + + def __call__(self, percept): + s1, r1 = self.update_state(percept) + pi, U, Ns, s, r = self.pi, self.U, self.Ns, self.s, self.r + alpha, gamma, terminals = self.alpha, self.gamma, self.terminals + if not Ns[s1]: + U[s1] = r1 + if s is not None: + Ns[s] += 1 + U[s] += alpha(Ns[s]) * (r + gamma * U[s1] - U[s]) + if s1 in terminals: + self.s = self.a = self.r = None + else: + self.s, self.a, self.r = s1, pi[s1], r1 + return self.a + + def update_state(self, percept): + """To be overridden in most cases. The default case + assumes the percept to be of type (state, reward).""" + return percept + + +# __________________________________________ +# 21.3. Active Reinforcement Learning +# 21.3.2 Learning an action-utility function + + +class QLearningAgent: + """ + [Figure 21.8] + An exploratory Q-learning agent. It avoids having to learn the transition + model because the Q-value of a state can be related directly to those of + its neighbors. 
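The backup `QLearningAgent.__call__` applies is the standard Q-learning rule `Q[s, a] += alpha * (r + gamma * max_a1 Q[s1, a1] - Q[s, a])`. A standalone sketch on a deterministic two-step chain (toy problem with a fixed learning rate, not the exploratory agent defined here):

```python
from collections import defaultdict

# Deterministic chain: s0 --go--> s1 --go--> terminal (reward 1 on the final step).
gamma, alpha = 0.9, 0.5
Q = defaultdict(float)
episode = [('s0', 'go', 0.0, 's1'), ('s1', 'go', 1.0, 'terminal')]
actions = {'s0': ['go'], 's1': ['go'], 'terminal': [None]}

for _ in range(100):
    for s, a, r, s1 in episode:
        best_next = max(Q[s1, a1] for a1 in actions[s1])
        Q[s, a] += alpha * (r + gamma * best_next - Q[s, a])

# Q['s1', 'go'] converges to 1.0 and Q['s0', 'go'] to gamma * 1.0 = 0.9
```

The full agent replaces the fixed `alpha` with a visit-count schedule and chooses actions through the exploration function `f`; the backup itself is unchanged.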
+ + import sys + from mdp import sequential_decision_environment + north = (0, 1) + south = (0,-1) + west = (-1, 0) + east = (1, 0) + policy = {(0, 2): east, (1, 2): east, (2, 2): east, (3, 2): None, (0, 1): north, (2, 1): north, + (3, 1): None, (0, 0): north, (1, 0): west, (2, 0): west, (3, 0): west,} + q_agent = QLearningAgent(sequential_decision_environment, Ne=5, Rplus=2, alpha=lambda n: 60./(59+n)) + for i in range(200): + run_single_trial(q_agent,sequential_decision_environment) + + q_agent.Q[((0, 1), (0, 1))] >= -0.5 + True + q_agent.Q[((1, 0), (0, -1))] <= 0.5 + True + """ + + def __init__(self, mdp, Ne, Rplus, alpha=None): + + self.gamma = mdp.gamma + self.terminals = mdp.terminals + self.all_act = mdp.actlist + self.Ne = Ne # iteration limit in exploration function + self.Rplus = Rplus # large value to assign before iteration limit + self.Q = defaultdict(float) + self.Nsa = defaultdict(float) + self.s = None + self.a = None + self.r = None + + if alpha: + self.alpha = alpha + else: + self.alpha = lambda n: 1. / (1 + n) # udacity video + + def f(self, u, n): + """Exploration function. Returns fixed Rplus until + agent has visited state, action a Ne number of times. + Same as ADP agent in book.""" + if n < self.Ne: + return self.Rplus + else: + return u + + def actions_in_state(self, state): + """Return actions possible in given state. 
+ Useful for max and argmax.""" + if state in self.terminals: + return [None] + else: + return self.all_act + + def __call__(self, percept): + s1, r1 = self.update_state(percept) + Q, Nsa, s, a, r = self.Q, self.Nsa, self.s, self.a, self.r + alpha, gamma, terminals = self.alpha, self.gamma, self.terminals, + actions_in_state = self.actions_in_state + + if s in terminals: + Q[s, None] = r1 + if s is not None: + Nsa[s, a] += 1 + Q[s, a] += alpha(Nsa[s, a]) * (r + gamma * max(Q[s1, a1] + for a1 in actions_in_state(s1)) - Q[s, a]) + if s in terminals: + self.s = self.a = self.r = None + else: + self.s, self.r = s1, r1 + self.a = max(actions_in_state(s1), key=lambda a1: self.f(Q[s1, a1], Nsa[s1, a1])) + return self.a + + def update_state(self, percept): + """To be overridden in most cases. The default case + assumes the percept to be of type (state, reward).""" + return percept + + +def run_single_trial(agent_program, mdp): + """Execute trial for given agent_program + and mdp. mdp should be an instance of subclass + of mdp.MDP """ + + def take_single_action(mdp, s, a): + """ + Select outcome of taking action a + in state s. Weighted Sampling. 
+ """ + x = random.uniform(0, 1) + cumulative_probability = 0.0 + for probability_state in mdp.T(s, a): + probability, state = probability_state + cumulative_probability += probability + if x < cumulative_probability: + break + return state + + current_state = mdp.init + while True: + current_reward = mdp.R(current_state) + percept = (current_state, current_reward) + next_action = agent_program(percept) + if next_action is None: + break + current_state = take_single_action(mdp, current_state, next_action) diff --git a/requirements.txt b/requirements.txt index 6b7eb8f47..dd6b1be8a 100644 --- a/requirements.txt +++ b/requirements.txt @@ -1,2 +1,18 @@ -networkx==1.11 +cvxopt +image +ipython +ipythonblocks +ipywidgets jupyter +keras +matplotlib +networkx +numpy +opencv-python +pandas +pillow +pytest-cov +qpsolvers +scipy +sortedcontainers +tensorflow \ No newline at end of file diff --git a/rl.ipynb b/rl.ipynb deleted file mode 100644 index b0920b8ed..000000000 --- a/rl.ipynb +++ /dev/null @@ -1,563 +0,0 @@ -{ - "cells": [ - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Reinforcement Learning\n", - "\n", - "This IPy notebook acts as supporting material for **Chapter 21 Reinforcement Learning** of the book* Artificial Intelligence: A Modern Approach*. This notebook makes use of the implementations in rl.py module. We also make use of implementation of MDPs in the mdp.py module to test our agents. It might be helpful if you have already gone through the IPy notebook dealing with Markov decision process. Let us import everything from the rl module. It might be helpful to view the source of some of our implementations. Please refer to the Introductory IPy file for more details." 
- ] - }, - { - "cell_type": "code", - "execution_count": 1, - "metadata": { - "collapsed": true - }, - "outputs": [], - "source": [ - "from rl import *" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## CONTENTS\n", - "\n", - "* Overview\n", - "* Passive Reinforcement Learning\n", - "* Active Reinforcement Learning" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "collapsed": true - }, - "source": [ - "## OVERVIEW\n", - "\n", - "Before we start playing with the actual implementations let us review a couple of things about RL.\n", - "\n", - "1. Reinforcement Learning is concerned with how software agents ought to take actions in an environment so as to maximize some notion of cumulative reward. \n", - "\n", - "2. Reinforcement learning differs from standard supervised learning in that correct input/output pairs are never presented, nor sub-optimal actions explicitly corrected. Further, there is a focus on on-line performance, which involves finding a balance between exploration (of uncharted territory) and exploitation (of current knowledge).\n", - "\n", - "-- Source: [Wikipedia](https://en.wikipedia.org/wiki/Reinforcement_learning)\n", - "\n", - "In summary we have a sequence of state action transitions with rewards associated with some states. Our goal is to find the optimal policy (pi) which tells us what action to take in each state." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## PASSIVE REINFORCEMENT LEARNING\n", - "\n", - "In passive Reinforcement Learning the agent follows a fixed policy and tries to learn the Reward function and the Transition model (if it is not aware of that)." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Passive Temporal Difference Agent\n", - "\n", - "The PassiveTDAgent class in the rl module implements the Agent Program (notice the usage of word Program) described in **Fig 21.4** of the AIMA Book. 
PassiveTDAgent uses temporal differences to learn utility estimates. In simple terms we learn the difference between the states and backup the values to previous states while following a fixed policy. Let us look into the source before we see some usage examples." - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "metadata": { - "collapsed": true - }, - "outputs": [], - "source": [ - "%psource PassiveTDAgent" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The Agent Program can be obtained by creating the instance of the class by passing the appropriate parameters. Because of the __ call __ method the object that is created behaves like a callable and returns an appropriate action as most Agent Programs do. To instantiate the object we need a policy(pi) and a mdp whose utility of states will be estimated. Let us import a GridMDP object from the mdp module. **Figure 17.1 (sequential_decision_environment)** is similar to **Figure 21.1** but has some discounting as **gamma = 0.9**." - ] - }, - { - "cell_type": "code", - "execution_count": 3, - "metadata": { - "collapsed": true - }, - "outputs": [], - "source": [ - "from mdp import sequential_decision_environment" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "**Figure 17.1 (sequential_decision_environment)** is a GridMDP object and is similar to the grid shown in **Figure 21.1**. The rewards in the terminal states are **+1** and **-1** and **-0.04** in rest of the states. Now we define a policy similar to **Fig 21.1** in the book." 
- ] - }, - { - "cell_type": "code", - "execution_count": 4, - "metadata": { - "collapsed": true - }, - "outputs": [], - "source": [ - "# Action Directions\n", - "north = (0, 1)\n", - "south = (0,-1)\n", - "west = (-1, 0)\n", - "east = (1, 0)\n", - "\n", - "policy = {\n", - " (0, 2): east, (1, 2): east, (2, 2): east, (3, 2): None,\n", - " (0, 1): north, (2, 1): north, (3, 1): None,\n", - " (0, 0): north, (1, 0): west, (2, 0): west, (3, 0): west, \n", - "}\n" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Let us create our object now. We also use the **same alpha** as given in the footnote of the book on **page 837**." - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "metadata": { - "collapsed": true - }, - "outputs": [], - "source": [ - "our_agent = PassiveTDAgent(policy, sequential_decision_environment, alpha=lambda n: 60./(59+n))" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The rl module also has a simple implementation to simulate iterations. The function is called **run_single_trial**. Now we can try our implementation. 
We can also compare the utility estimates learned by our agent to those obtained via **value iteration**.\n" - ] - }, - { - "cell_type": "code", - "execution_count": 6, - "metadata": { - "collapsed": true - }, - "outputs": [], - "source": [ - "from mdp import value_iteration" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The values calculated by value iteration:" - ] - }, - { - "cell_type": "code", - "execution_count": 7, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "{(0, 1): 0.3984432178350045, (1, 2): 0.649585681261095, (3, 2): 1.0, (0, 0): 0.2962883154554812, (3, 0): 0.12987274656746342, (3, 1): -1.0, (2, 1): 0.48644001739269643, (2, 0): 0.3447542300124158, (2, 2): 0.7953620878466678, (1, 0): 0.25386699846479516, (0, 2): 0.5093943765842497}\n" - ] - } - ], - "source": [ - "print(value_iteration(sequential_decision_environment))" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Now the values estimated by our agent after **200 trials**." - ] - }, - { - "cell_type": "code", - "execution_count": 8, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "{(0, 1): 0.3892840731173828, (1, 2): 0.6211579621949068, (3, 2): 1, (0, 0): 0.3022330060485855, (2, 0): 0.0, (3, 0): 0.0, (1, 0): 0.18020445259687815, (3, 1): -1, (2, 2): 0.822969605478094, (2, 1): -0.8456690895152308, (0, 2): 0.49454878907979766}\n" - ] - } - ], - "source": [ - "for i in range(200):\n", - " run_single_trial(our_agent,sequential_decision_environment)\n", - "print(our_agent.U)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "We can also explore how these estimates vary with time by using plots similar to **Fig 21.5a**. To do so we define a function to help us with the same. We will first enable matplotlib using the inline backend." 
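The comparison made here, learned estimates against exact values, can be sketched in miniature with a hand-rolled Bellman loop (a two-state toy chain rather than the 4x3 grid, using the convention that a terminal state's utility equals its reward):

```python
# Miniature value iteration for a chain A -> B, where B is terminal with
# reward +1 and A carries the usual -0.04 step reward (gamma = 0.9).
gamma = 0.9
R = {'A': -0.04, 'B': 1.0}
U = {'A': 0.0, 'B': 0.0}
for _ in range(100):
    # Only one action exists (A -> B), so the Bellman max is trivial.
    U = {'A': R['A'] + gamma * U['B'], 'B': R['B']}
# U['A'] settles at -0.04 + 0.9 * 1.0 = 0.86
```

A passive TD agent run on the same chain should approach the same number, which is exactly the kind of agreement the notebook checks against `value_iteration`.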
- ] - }, - { - "cell_type": "code", - "execution_count": 9, - "metadata": { - "collapsed": true - }, - "outputs": [], - "source": [ - "%matplotlib inline\n", - "import matplotlib.pyplot as plt\n", - "\n", - "def graph_utility_estimates(agent_program, mdp, no_of_iterations, states_to_graph):\n", - " graphs = {state:[] for state in states_to_graph}\n", - " for iteration in range(1,no_of_iterations+1):\n", - " run_single_trial(agent_program, mdp)\n", - " for state in states_to_graph:\n", - " graphs[state].append((iteration, agent_program.U[state]))\n", - " for state, value in graphs.items():\n", - " state_x, state_y = zip(*value)\n", - " plt.plot(state_x, state_y, label=str(state))\n", - " plt.ylim([0,1.2])\n", - " plt.legend(loc='lower right')\n", - " plt.xlabel('Iterations')\n", - " plt.ylabel('U')" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Here is a plot of state (2,2)." - ] - }, - { - "cell_type": "code", - "execution_count": 10, - "metadata": {}, - "outputs": [ - { - "data": { - "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAYUAAAEKCAYAAAD9xUlFAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzt3Xd4HOW1+PHv2VXvsoqbbOResY2RDQbTDTHNlBBKIAkB\nLuQmIYUkXFIggYSEJDck9/4C3BAgdAghFIeOQzHY2Lj3Jne5qdhqVt3d9/fHFI2kVbVWkqXzeR4/\n1s7Ojt5Z7c6Z97xNjDEopZRSAL6eLoBSSqneQ4OCUkoplwYFpZRSLg0KSimlXBoUlFJKuTQoKKWU\nckUsKIjIEyJSKCLrW3j+ehFZKyLrRGSxiEyNVFmUUkq1TyRrCk8Cc1t5fidwljHmROCXwKMRLItS\nSql2iIrUgY0xC0Ukt5XnF3seLgFyIlUWpZRS7ROxoNBBNwNvt/SkiNwK3AqQmJh48vjx47urXEop\n1SesWLGi2BiT1dZ+PR4UROQcrKAwu6V9jDGPYqeX8vLyzPLly7updEop1TeIyO727NejQUFEpgCP\nARcaY0p6sixKKaV6sEuqiAwHXgG+YozZ2lPlUEop1SBiNQUReQE4G8gUkQLg50A0gDHm/4B7gAzg\nYREBCBhj8iJVHqWUUm2LZO+j69p4/hbglkj9fqWUUh2nI5qVUkq5NCgopZRyaVBQSinl0qCglFLK\npUFBKaWUS4OCUkoplwYFpZRSLg0KSimlXBoUlFJKuTQoKKWUcmlQUEop5dKgoJRSyqVBQSmllEuD\nglJKKZcGBaWUUi4NCkoppVwaFJRSSrk0KCillHJpUFBKKeXSoKCUUsqlQUEppZRLg4JSSimXBgWl\nlFIuDQpKKaVcGhSUUkq5NCgopZRyaVBQSinlilhQEJEnRKRQRNa38LyIyP+KSL6IrBWR6ZEqi1JK\nqfaJZE3hSWBuK89fCIyx/90KPBLBsiillGqHiAUFY8xC4HAru1wGPG0sS4A0ERkcqfIopZRqW0+2\nKQwF9noeF9jblFJK9ZDjoqFZRG4VkeUisryoqKini6OUUn1WTwaFfcAwz+Mce1szxphHjTF5xpi8\nrKysbimcUkr1Rz0ZFOYDX7V7IZ0KlBljDvRgeZRSqt+LitSBReQF4GwgU0QKgJ8D0QDGmP8D3gIu\nAvKBKuDrkSqLUkqp9olYUDDGXNfG8wb4VqR+v1JKqY47LhqalVJKdQ8NCkoppVwaFJRSSrk0KCil\nlHJpUFBKKeXSoKCUUsqlQUEppZRLg4JSSimXBgWllFIuDQpKKaVcGhSUUkq5NCgopZRyaVBQSinl\n0qCglFLKpUFBKaWUS4OCUkoplwYFpZRSLg0KSimlXBoUlFJKuSK2RnNv9ObaA9QHQ9QFQry6ah/B\nkOHuSyZyYk6qu09VXYCnP9vNnAnZjM5ODnuc6rog2wormJKT1uLvqguE+Nea/SzZUUJ8jJ97501C\nRDpU3sKKGuav3k9WciyXTRvaodd6bT1UwaL8YnaXVFFYUUNmUmyb5akLhFiyo4TNB8vZWVzFlJxU\nrps5vNNl6Gpl1fWkxEV1+D1VSrWuXwWFbz2/EgCfwIDEWIora1myo4QTc1LZfLCcYekJ/PKNjby4\nbC8Hy2q4cvpQPs0v5ptnj3aPEQwZ8n71Pkfrgqz9xQWkxEU3+z3VdUGu/esS1uwtdbf94PxxpCY0\n39frxc/38Nb6g/ztxhk8uXgXD7y9ifqgITbK1+6gsHLPEd7feIgfXjCOkspafvraet7feAiA5Lgo\nKmoCAPzoC+NIDlP2UMjw7NLd/OG9rZRV17vb314fzXUzh7OnpIo//XsrX545nLzcAe0qk6M2EOTp\nxbsZkhbPxVMGEwiGWLitiFNHZpAQE/6jGAwZ/D7rwl8fDPHKygKeWrybjQfK+d0Xp3D1jGEdKoNS\nqnX9JigcPlrn/hwy8OKtpzDnwYXUBUNU1QWY+6dPOG1UBit2H
3H3m/fnRQAs3XGYh6+fTmJsFB9v\nLeRoXRCwLv7hgsIjH+WzZm8p/3PtNMqr67n79Q3UBoNAy0Fhxe4j3PXKOgCeX7qbX76xkfMnDqQu\nEGLVniNhXxMIhvhgcyFzJgzE5xM2HSjnyocXA3Du+GzufHktB8tq+MH5Y7ny5ByGpsXzt0U7ufdf\nGwkETbPjhUKGH/xjDa+u2sfs0ZncNDuXk4cP4PnP9/Dbdzbz702HuP2FVVTVBUlPiOlQUCiurOWG\nx5ay+WAFo7OTOGtcFjc8tpTVe0v55eWT+cqpJzR7zdOf7eJXb27iya/PIDcjkW8+t5LVe0uZNCSF\npNgoVuw+0qeCQihkKK2uZ0BiTIdeV1UXIL+wstWaq1Lt1W/aFLwX1rhoHyMykwAIBA37S6sBWLy9\nhNpACMD9H+DjrUUsyi8G4LVV+93tdZ59HIFgiBeW7eXc8dlcNm0osVF+d9/6YPP9/2fBNqbe+x4P\nvr/F3Xb36xsYnZ3En798EqOzkwg1v34D8Ju3N3PrMytYuK0IYww/eXWd+9x3XljFnsNVPPn1Gdx+\n3hiGpsUDEOW3/uT1oeZlee7zPby6ah/fmzOGZ26eybnjB5KaEM0JGQkA3PzUcobYxwl3Li2pqgvw\n1cc/Z1fJUWaPziS/sJLr7YAAcLiyrtlrnly0k3te30BdIMTb6w5y7aNLyC+s5P9ddxJv3D6bE4em\nsuVQRbvLEElVdQHuf3Mj24sqO32Msqp6zv3DR0z/5fuNbmDasmF/GRPveZd5f17EnpKqTv/+3qA+\nGOJ//72N6b98n6U7So7pWMYYjGnhi9NOoZDhX2v2u9/9/qLfBIVhAxLcnxNiovD7BJ9YH8S9h6ub\n7d/0C56RFAvApgPl7rZAmKv1pgMVFFXUcvlJVronJsp6iytrA4z56dv8acHWRvv/cYGVplmUX8LF\nUwa72289YySxUX6ifEIgzAU8FDI8sWgnAPPX7OeDzYWs2lPKj74wDoADZTVcnTeMU0ZmNHpdjN9J\nxTQue1VdgAff28JpozL47nljGuXqh3veu7985WSGpMZRZdeW2uPB97ay8UA5j9xwMl882Xpf1uwt\n5f9ddxLx0X4qauob7b+uoIz739rEnAnZjMxM5JkluzlYVsPTN8/k0qlDEBHGDUpm26EKQmH+BhU1\n9dzx99XkF3b+It1etYEgNz6xjL9+spN/rdnf9gvCOHy0jqv/8hm77It6UUVtu163KL+YLz6y2H28\nv8z6HAdDpkNBuzc4UFbNVY8s5sH3t3L4aB1rCkpb3b+qLkAoZDhaG2Dv4cbB8NVVBUy7733eWHug\nQ2UwxrB0RwkVNfXsK63mhseXcvsLq7j+saWsbKG23hf1m6AwdmAyN88eAUB8tHX3Hu338eKyPfx9\n2d5G+6bERXGwrKbRtmDIUBsIsrP4KLn2nXO4L97afdaH+aRhVlXeCQrrCsoAGl04DpU3/h232OUD\n+MKkQQD4fUIwzIVvdUEpzo3QKyv3cfNTy0mM8XPT6Q3HuOn03Gavi/JZ5Qk0Kftrq/ZzpKqeO84f\n26zxNjczEYDZozMZlZVEfIyfqrqA+/ydL6/hvn9tbPa7AOti/tlurs7L4Zxx2YzOshrvhw2I59Kp\nQ0iJb2jnAOuLeffr6xmQGMPvr5oKdlHuvmQC04enu/udkJHA0bogR6qsu+rfvbOZRxduB+CBtzfz\nyqp9PLtkd9gydaU/vr+Nz3cdBgibkmtLIBji28+vZGfJUW47cyRg3UC0ZV1BGf/x9HJOGJDIc7ec\nAkBJZR019UFm3r+AG//2eYfL0lO2F1Vy2Z8XkV9YyUNfnk60Xzh8tL7F/T/ZVsTkn7/LT15dx6Sf\nv8sZv/uQQDCEMYbfvL2J7/99DWXV9Y3a9NoSDBl+9tp6rnl0CTf+bRlz/7SQ1XtLOW98NgBXPryY\nxz/dGfYmpK/pN0EBrLQRQ
HyMFRRi/D6KK+t4Z8NBd5+k2CjSEmKaBYVAKMTO4qMEQoZJQ63eSuHS\nR+sKykhLiCYnPd79HQDLd1l3GpOGNPR02rC/rNFrp3pywk6jtFVTaP5B/GhzodsA6zhnfLZ7bgBj\nBjbvPRVtBymnplBaVcfX//Y5D32Yz8isRE4+Ib3Za5Jio1hwx5k8ceMMwKppOTWFsqp6Xlpe4NZa\nKmsDnPabf7NwaxFgtY8EQiFuP3eMff4p3HXheP75n6cBkBwXTUWtdQGoqLFqTKv3lvK9OWNJT4zh\n7osnctXJOXz5lMZtDk5bTmVtgM0Hy3n4o+38+q3NFFXU8o8VBQBsK2xIL+UXVnDFw4ua/V2Pxeq9\npTy6cDvX5A0jNT7avZiv3HOEbzyzotFNw4KNh/ivl9c2O8ZfFu5g8fYS7r98MhdMGuieU2uq6gJ8\n58VVpMVH88zNMxk3yPo7Hz5ay+/f3ULJ0ToW5R9b+iWSquoCBEOGz7aXUHCkihseW0rIGP75zdO4\neMpgBiTGcPho+NrS0h0l3PLUcqtd0HMzd6Cshl+9uYm/fLyD608ZzrAB8RxqZ40rGDJ854VVPLd0\nD2C17w1Jjeft757BX75ystve9cs3NrKqA4HmeNVvGpqhoYYQZV9Mo6N80ORzMyg1jpAx1Nlf6B9f\nOJ7fvL2ZYMhwoNS6oIyy75zDXay3F1UydmCye7ft1BRW7bWCgpOTByvV5OXzCT+7eEKjVJfPJxhj\npYt8niCwam8p4wYmc7C8xs1BO6mit797hnuOTUV7evIAvLpqHx9usS7g3z5ndItdPL3dc62aghUU\n3vUEVLAufvvLanh04Q7OGJPJK6v2cfroTPecfD7hG2eNcvd3ekSVVdUz9b73AEhPiOYKO/12zvhs\nzrHv1ryS4qyPbkVNoFGN4Ddvb6I+GGL26ExW7TmCMQYR4dZnVrCj6ChrC0oZlDoo7DkCbh66ta6u\nWw5W8Ks3N1JaVc+AxFh+eskEPs0vprymnlDIuI39ew9XMTIriZr6ILc8vRyAX14+2f1MHCir5s8f\n5DN30iC+lDeMzQet1ORRT1AIBEN88ZHFXDp1CF8/fQSHymt47JOd7Co5yvO3nEp2ShzBkEEE3l5/\nkM/sXLw35debbNhfxsX/+ymThqSwYX85yXFRGAMv3TaL8YNSAEhPiAlbUyg4UsWtz6wgJz2e/5o7\nnv9+bwtXnJTDb9/ZzK/e3Mi7Gw5x42m5/PzSiVzz6BIKy9u+ATDG8Iv5G3hz3QF+fOF4Lp4ymGeX\n7OE/zx5Farx14+H8zR7/dCd7D1eFvXHqS/pZTcEKCs4XPtrf/Is/MCWWaDvF4hPIy7U+AIGgobTa\nuvhmpcQB1oX1nfUHyL3rTffCfKi8lsGpce7xnAvAjqKjAIQ8jV+bDpQzbEA8N56Wy5+/fBIAt5wx\n0k0dQUMAC3peZ4xhbUEZU4el8tZ3znA/vCfb6ZUJg1PC1hKgoaHZSXV420hOG5UR9jVNJcT4qbaD\nwsd2jcCpGX2wuRCw0nWbDlRQcKSaS6cOafFYyXHRlNcEeH/TIXfbZdOGun+rFl8XawWFoopa5q/Z\nz+Sh1gXllZX7OH/CQM4dn22nl+qprgu677/z92jJs0v3cMqv/93owtzUXa+s5ZNtxazbV8Z/nDGC\nlLhokuOiqKwJsHBbkbtfoX2n+vRnu9xtzmcI4I/vbyVoDD+9eAIAiXa33EpPOu2VlftYU1DGb9/Z\nzC/mb+C0Bz7giUU7uXbGMGbZfy+/T0iLj2bx9hKGpMZzyZTBYWuxPS0QDPHDf1i1pQ37rc9dRU2A\nB754IhOHpLj7ZSQ1rykEgiFuf2EVwZDh8a/N4IJJg3jv+2e5Nw/vbjjEnAnZ3H3JRESE7ORYlu48\nzOUPLWJHKx0AXlm5j2eW7Oa2M0dy21mjyElP4K4Lx7vfKccPL7Da6gqOHN+N+e0R0aAgInN
FZIuI\n5IvIXWGeHy4iH4rIKhFZKyIXRbI8TmrFuYl28uteiTFRRNnBYkBijNt7KBAylFZZdy9ZdqNzfSDE\nIx/vAGBn8VGMMRwqr2FgSkNQiLUvQk6twts+sOdwFSMyk/jFvElcMiX8hdNvlzEYsu5ocu96k32l\n1ZRV1zN5aCqDUuN4/Vun853zxjB+UPhA4OUEQqcmtNzTBfek4e27A0qw2xRCIcOnds+M2oCV03Xu\nVI/WBtyAcfbYrBaPlRIXxZq9pfzwH2vcbU4apTXOGIs31x2gqi7IXXMnuM9dnTeMIWnW32B/aTUf\nbil0nwvXPuP1/NI9FFbUcs/rG3ht1b5mzxdX1rJhX0Mg/fIpw+3yRFFZG+DJxbvc5woragmGDH9b\n1LDN+QyVVNby2ur9XJ2X49aiku3aj5M+Msbw10+sz1d2chzPeGpE3zlvTKNyHbGP+53zRpOWEO3+\nfVtSVl3P+n1lre7T1Z76bDebDpTzn2eP4rJpQ3jptlk8ePXUZp/9AYmx7vk4nlu6h1V7Srn/islu\nGxdAdnIscdE+spNj+f1VU92Ualay9R1dvbfUTWU2taekirtfX8/MEQO4c+74VsseH+MnIzGGxz7d\nyZEO9A47HkUsfSQifuAh4HygAFgmIvONMd4WyZ8BLxljHhGRicBbQG6kyhTv1hSsx+HuGmOifO7d\ndGZSrPshC4ZCHKmqR8S6kwGoDxlq7Dvm2Cgf5dUBagMhsu0PZLjf4b0oHSyrYcKgFFrj1BQCIeNe\ncJyuhyMyrC9HbmYid5w/to2zt0T7Gxqa//LxdnYUHeXiKYOZN3VIo/aI1sRHR1FdF2RXyVHKqutJ\niPFTWx9kR/FRt+dMeU09S3aUMHZgEtmeINmUE3QBZuYO4KThaZwyou0ai5M+enXVPlLjozl15ADm\nTBjIgk2HmD0mk80HrdTcwbIa3lzX0AslXMrPkV9Y4dac/rmygH+uLHB7kTleXlFAXTDEr684kWED\n4t3g5IybKK8J8PXTc/nbol0UVdSyKL+YA2U1XDdzOC98vod31x+k4EgVmw9WUBcI8bVZue6xE+3a\nT3FlLftKq9lRVMm2wkpS46PZZ3ebnjosjYsmD2Jwanyjcg1Ni2dfaTVXTs9hy8HKNmsKsx/4gIra\nALseuLjV/bpKeU09f1qwlbPGZnHnF8a5tfWZI5qPdRmQEE1JZUNNoaSylv9+bwuzR2cyr0mt0+cT\nfvvFKYzKSiLdM74j3lPTbJqmddz3hnUp+tM105q1z4VTYgeDW59Zzj++cVqb+3elUMjwzedWcpH9\nXY2kSLYpzATyjTE7AETkReAywBsUDOBcFVOBzvXpayc3fWR3aQmXd4+J8rl595T4aHef+qChrKqO\nlLho9+6/PhCiut4KCsGQ4VCFlcMMV1NwOOmjQDBEcWUtA1NiaY0blDw9W3YUW6mQ4Rkdzxs7QaGw\nopbfvL0ZgEtOHNwoZdWWhBg/VfVBt9vgzBED+Gx7idvbIyUuirLqejYeKOcLE1s/7s7ihqr9JVMH\n81XPRbI1zl11MGQ4fXQGUX4ff/7ySZRX1xMX7XdTeLtKjvLh5kLyTkhn+e4jrdYU5q/ej0/gwsmD\nGwUSr/c2HGRKTqpbQ2goj5UGAysF+NySPRRW1LB6bymp8dF8KS+HFz7fwx/e30puRgIhA7NGZjRK\n80X7fcRE+Xj4o+288PkeTj5hAJlJMdw0ewS/e2cLk4em8Pq3Tg9brle/dRrGNByjtaBwsKyGCk9t\npDumCnl2yW4qagL88IJxbf6+7JQ4ymsCPPRhPgfKqkmMtWphP790YtjXhhvtf/PsEYzKSuLlFQVs\nsttqvKPjF24tYsGmQ/zX3PGN2vla882zR/HwR9tZtusIVXWBFkfhh7NyzxGeX7qH331xits2aIwh\nGDLuTWhrXlq+l3c2HOSc8S3XurtKJNNHQwFvX88Ce5v
XL4AbRKQAq5Zwe7gDicitIrJcRJYXFYWv\nCrZH05pCOLFRPjd9FOupNQTt0aZpCdHuhfXT/GK3wbU2EOKA3bNlkLdNwd/47tu5KBVV1hIyMDC1\n5btoaAgK3rEK+YWVRPul2d1iezjnts3Th3/84NZrK00l2A3N6wrKiY/2M3lIKrWBEBv2lxMb5WPq\nsDTWFZRRWlXPtOGtj7IdnW0NIvzm2aO4Oq/9o5OTYhu+kLNGZQJW0HdqJZlJsUT5hLfXH6SqLuim\npMLVFA4frWPVniN8uKWIvNwB3HvZJPe5ukCIRxdup7ym3tpvbynnjGu54XvSkBSGpsWTlRxLweFq\n3ttwkHlThzDIc6Owq6SKPYermDet+R1fgl1bO1JVzwebDzFv6lBG2umSL89sPurbkZ0c596MxET5\nqLO7aIYzf01DWqy1mlNXqakP8vgnOzlrbFajecZaMs4OlL9/dwvPLtnDowt3cNGJg1tsJwsnIymW\nL56cw7hByeQXVrJhfxmjfvIWi/OLMcbw4PtbyUmP56bZue0+5p1zx3Of/dmo7sA4naDd+eDlFQVu\nbQPg2y+s4tTf/LvN11fU1PP7d7cwIze9Q9+RzurphubrgCeNMTnARcAzItKsTMaYR40xecaYvKys\nzkfKhpqCfdww+8T4fW5bQ2yUr1H6prSqnrT4hqDw5OJdFNvV3KU7Snhp2V7io/2MsS900Dx95NQU\nnK6Rg1pJrUBDUHBqJGB1tcxJT2hXlbcppxF9mz0a+IcXjGWEJ0fbHvExfuoCIbYeqmBUdqKbdlqz\nt5Txg5IZkBjj3olOHtL6ReAX8ybxzvfO4M6549tsXPby7psXpjeI3ycMSo1jxe4jiOCmpIJhBgJe\n85fPuOLhxWzYX8apIwaQmRTLD+x03DsbDvLrtzZz+/Or+GRbEcYQtjdUvX1nft4EK/gMSo3jg82F\n1AZCnDshm/SE5lNXnBfmOKWeXHrIwBcmDeTscdn89KIJXDm9ffNfObXTltoV2hqV3xpjDN98bkWz\nQZjhlFXXY4zhvY2HKDlax3+cMbJdv2P84IaL/4lDUzEGvuWZf6wjhg1IoKouyL32OJqF24r5bIfV\n7fkbZ41qlL5sD+dz5/0+gtWmeNc/14Ydu/T2+oZap9NeVHCkijfXHqC4sq7ZmKGm/u/j7ZQcrXMb\n0SMtkkFhH+ANazn2Nq+bgZcAjDGfAXFAZqQK5FygnTc23J1UjKemEBPlcy+8pVV1HKmqIzUhxh17\n4PWH97fy5roDXH7SUNI8F4CmQcHp9eMMXBvYRlBwglLBkYZR13sPV7cZTFoSHdVQU/AJ3HrmqDZe\n0ZxzN7tuXxmjspLci9DagjLGD0ppNB/UqOzWA05CTJTbFbGzvEHYK9ducxk3MNmdTyjcADOn1hQy\nMMPOcTv5/Q/t3lTLdh3mw82FZCTGMGVo80Dn3BycM866aRmdlUR1fZAYv49TRgwgPsZPbJTPfa+m\nDktrta0FICPRml8qLtrPf5w5st1B0/l8hrvgbztUwcYD5e4AzI4EhWDIsOlABW+tO8ifFmxj4/7y\nFvfdsL+Mqfe+x7/WHuDlFQUMTYtvd++2oWnxpCVEc8qIAfzjG7N48zuzG/VO6ginV9znO60BhlV1\nAR5duIOs5FiuOjmnw8dzsg01TYLCF/64kBeX7WV3k6lGjDH8+YN897HTq+2xT3a620qrWx6oV1pV\nx98W7eLSqUO6bW6rSAaFZcAYERkhIjHAtcD8JvvsAc4DEJEJWEGh8/mhdnKCbbjatTW1hK/hZztA\n/OrNTawtKCM1PtrdFk5WUuM7wmYNzcYZNGZ9ENqa/MwJSt7RzwfKqslMbr0toiXOue0oqmRIWnyb\nXTTDibdzqWXV9YzMTHIvVnXBECOyEkmJt54fmhbfobxrZ7WUk83NtC58U3PS3Pcx1OSP7p0J1icN\nPbCc9NRHds+lqro
gn+0oYdaojEbjRRz3XDqJH5w/lmn2SPYxA61ANf2ENPc9uP6UE/jJRRMQgQsm\nhu9hdcHEgczITSczKYaLThzcqdqg8zcNd8F/z54x12lAb6uXkuOTbUVM+vk7/G1Rw8XM250Z4KnF\nu9wR+6+utO7/3t1wkE+3FXHl9KFh37dwRIQnbpzBg9dMIy7a32jAZ0cNS29odxOxgsPHW4u4bsaw\nDtVMHU5QqK5reN8+2lLovo9Nawofby1i88EKNwBV1AQ4crSOvy/b694grN9Xxvi73+az7Q0DDo8c\nraOqzhqDU1UX5Jtnd/zmrbMi9o01xgRE5NvAu4AfeMIYs0FE7gOWG2PmAz8A/ioi38fK5txojnUW\nq3ZwUihh00dRPrfbpjeV5IiP9rnpo3Dim1wEvbWK1Phod5h8uT3fj9Ng2hInAHnnw6kPGjI6OJNm\n0/KEDI3GU3REmqcP94isRGo9d00nDEhw21ZGZnUsLdVR//2lqWQmtfw+OO05OenxjdKAXt673UlD\nUt1g4NQUjlTVkxofTVl1PYfKa5nSQk58RGYit3u6iTptJWeMaUh33nPpRAAmD011x1U09ehX8wCr\nK224lFN7xLSSPvp0WzETBqe4EyS2t6bwxpoD1NSHeGXVPqYPT2NtQRn5dv//I0friI/x8/P5GwC4\ndOoQd7zGx1uKCBma9eJqy/R2do9uy1C7pjB1WBoDk2PdoPilTubmnVSpkz76bHsJN/5tmft8bZP3\n84XP95CRGMN1M4fx8ooCjtYG+PvyvVTXB7nnkonc98ZGHnx/KzX1Ieav2c+sURkEQ4bLHlrE9OFp\nfJpfzNnjspjQwXa/YxHR2zhjzFtYDcjebfd4ft4IhO9OEQHThqXxtVkncIud22x61wiNu6R600fe\n58OljxyJsY3vPrwD5FLjo3GyFxU1AXzSMGCpJT67WlNU2XgwT2sXw9Z4azltpS9a4g0mOenxjVJb\nJ2Qkcta4LBJj/e0e99BZbVX/Z+Sm88SinZwyMsPTtbjx33yr3baSHBvFuZ4cv/fveMaYTHdytclh\nUkfhzBwxgC9OzwnbDtCeEbHt7RETTkvpo+q6ICt2H+HG03PdwNH0IgbWe/Taqn3MmzaEaL8PY4w7\n5iQYMpw3YSDlNQG2F1ayKL+Y6x9b2uj1BUeq2HrIChiVtQFGZiUyKit8ii/SUuOjuf6U4Vw4eTCf\n7SjmvY2HOG1URqNZAzrCqV04c3+9bE+pcunUIfxrzf5GN0iF5TUs2FTILWeMcAN8ZW2Al5bvZUZu\nOqfaMxA0ZZCiAAAc7ElEQVSstedFc2oOC7cVsedwFQVHqggZ+LpnPrPu0K+mufD7hHsvm+w+Dlcn\nifE3dEmN8TQ0NzzvbzV9FN+kSuptGIr2i1tTqKgJkBQb1WaV2qmpNJ05MzOpk+kjT9k72y7h7V01\nNC2eYk/ZTshIICEmimtm9PwqbReeOJilPzmPgSlxlNnpuqZtClsOVZAaH83yn83B7/lbeXs3nTk2\nq8NBISEmij9cPfVYT6FTWkoffb7rMHXBEKePzqTavqh596mqC1AfMLy+Zh/3vL6BqvogXzn1BLYV\nVnLQk748e1wWq/eWsqvkaLNpTnLS4/nInjZlYEosh8prOTdMb63udP8VJwJw0vA0BiTGMnt055st\nvW0KVXUB3l5/gGvyhnH1jBwrKHjez3+utFZ3vHbGcPcmY+HWInYUHeUbZ45qljp2Op+8ZM/pFDIw\nJDXumMrbGT3d+6hHmTAJpKYNzU0DQGwb6aPE2JbjrHfG0/LqelLiW1+JzXkNNA8KGZ0MCt5aTltj\nJFqSndwQFDKTYon1BMLWzr8nOA35fn8LNYWDFYwbmEy039coQDvnkRDjd+/sR2Qmhl1UqbdpqRaw\nOL+YGL+PmbkDwqaYzn9wIVPve4+V9ih3Z2Dmx/ZFfnBqHNnJsUwcnEJ2cixFFbWs2
tMwQdxl04ZQ\nUx/koy1F5KTHc7rdVfjcCT0bFByJsVHcPHuEO4FgZ3jTR+9vPERVXZArpjesm+J9z99ad4Bpw9IY\nkZno3mS8smofCTF+LpoymDR70svkuChm5KbzzoaD3PO6tVKiM3fVVXnDOtWudCx61ze4m4Xpndgo\nZRSuTSHG7ws7Z5IjoZVRwT4Rt6G5vCYQdjnMpqJaDAqdTR95g0Lnagrexmm/T5oN0OuNwrUpGGPY\ncqiCy8MMfnK+xBMGp7hTJrS3ltDTWmpTWLW3lElDU4iP8bvtLU5NwRjjjpp22goO29OSf77rMCMy\nE7l33iTqAiFEhMwkayqKsuoyxg1M5tKpg6msDVJWXc/i7cVcOX0ouRmJfLajhBkdXLa1N/M2NC/e\nXkxmUiwzcwe466/UBqxAuvdwFev2lfGTi8Y3eh1YXZqdz1d2cixzJw9iiT09zNOfWVOZ/PqKE3ll\nVUHYFQkjrV8HhXBio3xusIiJ8tE0SMdE+VrtK9xabxu/Txo1NKe00cgMDXe4TYNCWjtqGeF4A1pW\nJ2sbTXWmF0d3805X4jhQVkNFTYCxYe4cnZrCpCEpJMdGMWdCdsSnF+gqsWHaFIIhw4Z9ZW47TNMU\nk3ehqS32FCEFR6rZfLCc1XtLOWN0Jmd65rByer+FDPz4ovGcPS6bhz/Kpz5oqA8GmTkig0unDOYr\ns05otWZ9vHEu7hU19Xy8tYgLJw/C55OGmkK99X46YxMunGwtnOW9ZnhnD3jj9tmkxEdz92vr3XaY\nnPR4Th+dwewx3Zs2cvTroBB2nILf5zZAx4YJAG3dFbdWU/D7GmoKFTUBtwdIa5w8d8nROrLsKjs0\njKDtqGhPzSe9kz2YABbccabbCO68J+0Jcj3FeR+9NQVnOc9xYUbKpsZHc+HkQe5Kb499bUb3FLQL\nhGtT2FlcydG6ICfmNF78qS5o3dl6Vzpz1tr415r9bhfTpiPTvV2vnQ4F3prvySekIyIdHhzW28XF\nWO/bJ9uKqagJcO54q2txbHTjlN37Gw8xaUhK2AZtZywLNHT2uO+yycRG+3h2yR4unDyoWwaptaTv\nhPBOCNclNTba566JHC4AdCYozJkwkNvOHNmoTaGipt7tz98ab0P3SM/I487mtr15cyen2Rmjs5MZ\nafcoccqY1skulN3BZy+/6m1T2G3PIRVuRLffJzxyw8nHZerDGxQe+2QHP3l1HevsGVFPtFNgTXso\nbWwy5sAZb9HSYyelNnZgkjvNtHNTMDAlliGd7O7c28V4priJ8fs4w76bj3XbcYJU1gZYtaeUs8LM\nDhzj94VNG8fH+Ll82lB8AvOmdqz7blfrvbd23SBsl1S/361BtDSLamvCNbQ+9jWr7/lVjyxu3NDc\njgu7t5FpVHYSS+2RmV2Rx0+L75qLuNP4dm6YaRt6kyifr1FNYX9ZDTFRvk537+2tvG0Kv3pzE2B9\nXuKj/Yyyx440bYz2DkRLT4jmyulDWe1ZZazpqHOn95u3e63TccKpJfRF3vO6+YwR7vfd29C8ZHsJ\ngZBplv5Z9tM5rV4/8nIHsPLu83v85qpf1xTCzQUWE9WQPgqXC20rKLTa0OypKVTXB1vd1+Ht/eTt\n690VXzpnedJjlZOewDvfO4OfXTyh7Z17UNP1rveVVjM0Lb7PXcCcu1mn0ROsUbMTh6S4HQ1im6SY\nNu4vZ6o9MG/y0NRm7SdNP/eDU+M5aXhao7UQnJucrhp41tt9b07DYEU3yNaH+DS/mPhof7PxKFnJ\nsc0W72mqpwMC9POaQrhxClF+cYNFuK5gTWc9barVhmYRAqEQwZChPmja1UDr97QBjOriEcJdeTE8\n1vmLukOUTxqNU9hfWu0uxtOXOBco7zw8mw9UcKlnVlZvbaK4spbCilquzhvGmoIypuSkkpYQw/Kf\nzeFQeU2zsTfO61/9ZuNxpxMGJ3P5tCEtLhjV1
3jbS/w+IdovVNbW8+GWQmaOGHDctqf066AQLgXj\n7TYa7qLZVtqmtT7Ffp9QGzDuZFrtSQF5B1S1p2FatcyqqXl6H5XWuDnhvsS54K/3rBBXURto1Cbl\n1Cb2Hal2U0enjcpgzMAkTrcHS2UmxXZokGRCTBR/uvakYy5/b/f0TTPDvi9xUX7+ak9099OLenet\nuTX9Oig8ddNMvvCnhY1SCkJDr6Rw1/fW0kcv3Tar1d/n8wlB0zDDYvtqCp5pMo6hYVjZNQX7b10f\nDHGoouaYppPorZJjrc9J0+U2velH53P88Efb3dHKEwancNox9EjrL85sYXnZ2GgfFbUwcXAKF3Rg\n0arepl+3KYzOTuKN22c32uaThryzL0xNwf0yXT/d3faHL03ln/95WtilBb38Yi2rV2PncduT0/e2\nKThfdtU5fp80Ws/CmL5Z+4qP8RMX7Ws0NQU07mXlvblZlF9MRmLMMXVRVg1tkG0tLNXb9eugANbd\n0a4HLnZHDqYnRrttCq0FhYtOHOxumz0ms12TnPl9wrp9ZayzJ8DqaE2hqxqGH7jyRJ6+aWaXHOt4\n4m1T2G+P3u2LNQWg2Qyr0X5x1xaAxl2dD5XXckInlnZVjTmzAzftvnu86dfpI6+7L5nIdTOHk5Oe\n4KaPwrUPdGbsgsMJMt94doX9unb0PvKUwWnjSDrG+YWundnzk9X1BL+/oRa4v8wKCoP7YEMzWEHh\nQFkNo7IS2V50lBMyEhtNcdK0vcxZkEgdu5M0KPQNMVE+d3WnhppC8/3CB4X29TJoGmRi23Hn37S2\n8vI3ZpGTrnd1neEdp7C/1LqrG9KJda6PB+mJVqpxSk4a24uONmpkDucEDQpdpqemCe8q/T59FM5P\nLprA7NGZzAqzfGC4LqntrSk0DQpx7akpNJl8Ly93QKOpq1X7eccp7D1cRUZijDvwrq9x0kfOokAj\nw1yoPvjBWe7Pzip1qvNyMxIYkBjT7hXmeiutKYQxOjuJZ285Jexz4XoftfdD0CwotKOm0N3T5vZl\nVu8jq5F/W2GluzpaX+QEhUlDUvnNlSeG7TEzMiuJjMQYSo7WaU2hCyy446ywA2KPNxoUOuhYppfw\nS9Og0J42Ba3MdRWnpmCMYevBig4vEXk8cXoS5WYktNorLjU+mpKjdeRqQ/Mxa2mt8OONBoUO8qZz\n0hKiKa2qb2XvxprWKDra+0gdG2ecQsGRaipqw0+Z3VfMmzoYv4g7cV1LUhOiSY2P7hXTK6jeQYNC\nB2QmxTaaxuL9759FcZO1k1vTtKbQrhHNGhS6jN8nfLy1iD+8twW/TzgtTJtRXzE6O5nvzmk76OVm\nJLa5TrjqX/TT0AHLfzan0eOs5Ng278S8OlNTaLpGtOq8umAIY+C11fuZMyH7uO8l0hV+c+WJYWcL\nVv2XBoV2mDMhm12eycU6q+n1XRuau1dFTcD9ObuTS5H2NcfDqnmqe2lQaIeuWnWrac+E9oxvaJpy\nUp3nDQrpOo+UUmH1jeby44R3hs5ov7SrFnC893nuTSpqGjoFNJ0GQill0ZpCNwo2xIR2DVxzRPmE\n758/NgIl6l/qPWspaFBQKjwNCt3I26AX3YHxDvm/vigSxenXnGkglFKNafqoG3nXB9ZeRT1L++Ur\nFV5Eg4KIzBWRLSKSLyJ3tbDP1SKyUUQ2iMjzkSxPTwt5gkK49Z9V9wm3xKRSKoLpIxHxAw8B5wMF\nwDIRmW+M2ejZZwzwY+B0Y8wREcmOVHl6g4CnobnpRHcq8px5fhJi/AwfoNM6KBVOq0FBRO5osskA\nxcCnxpidbRx7JpBvjNlhH+tF4DJgo2ef/wAeMsYcATDGFHag7Mcdb0Ozpo+634I7zqKyNsAwDQhK\ntaitHEZyk38pQB7wtohc28ZrhwJ7PY8L7G1eY4GxIrJIRJaIyNxwBxKRW0VkuYgsLyoqauPX9l6N\nu6Rq+qi7p
SfGaEBQqg2t1hSMMfeG2y4iA4AFwItd8PvHAGcDOcBCETnRGFPapByPAo8C5OXlHbdj\n8hs1NGv6SCnVC3XqdtUYcxho66q2DxjmeZxjb/MqAOYbY+rtdNRWrCDRJ3m7pOqU2Eqp3qhTVyYR\nOQc40sZuy4AxIjJCRGKAa4H5TfZ5DauWgIhkYqWTdnSmTMeDQFC7pCqlere2GprXYTUuew0A9gNf\nbe21xpiAiHwbeBfwA08YYzaIyH3AcmPMfPu5C0RkIxAEfmSMKencqfR+3ppC07WXlVKqN2irS+ol\nTR4boMQYc7Q9BzfGvAW81WTbPZ6fDXCH/a/PC3pnxNOYoJTqhdpqaN7dXQXpD7yzomr2SCnVG2lr\nZzf64zXTSI6z4rBoVUEp1QtpUOhGg1LjuMOe7VSbFJRSvZEGhW7m9DrShmalVG+kQaGb+e3xCRoT\nlFK9kQaFbubMbiEaFZRSvZAGhW7m1hR6uBxKKRWOBoVu5rQpaEVBKdUbaVDoZj5taFZK9WIaFLqZ\nW1Po4XIopVQ4GhS6mVND0IqCUqo30qDQzZxgoL2PlFK9kQaFbmbsmVI1JCileiMNCt3MmT1bKwpK\nqd5Ig0I3cybP1t5HSqneSINCN3MW2tGYoJTqjTQodDM3faStCkqpXkiDQjdz0kdaU1BK9UYaFLqZ\n2/tIo4JSqhfSoNBDonQ9TqVUL9TqGs2q682dPIjrTxnO9+0V2JRSqjfRoNDNYqP83H/FiT1dDKWU\nCkvTR0oppVwaFJRSSrk0KCillHJpUFBKKeXSoKCUUsoV0aAgInNFZIuI5IvIXa3s90URMSKSF8ny\nKKWUal3EgoKI+IGHgAuBicB1IjIxzH7JwHeBpZEqi1JKqfaJZE1hJpBvjNlhjKkDXgQuC7PfL4Hf\nAjURLItSSql2iGRQGArs9TwusLe5RGQ6MMwY82ZrBxKRW0VkuYgsLyoq6vqSKqWUAnqwoVlEfMCD\nwA/a2tcY86gxJs8Yk5eVlRX5wimlVD8VyaCwDxjmeZxjb3MkA5OBj0RkF3AqMF8bm5VSqudEMigs\nA8aIyAgRiQGuBeY7TxpjyowxmcaYXGNMLrAEmGeMWR7BMimllGpFxIKCMSYAfBt4F9gEvGSM2SAi\n94nIvEj9XqWUUp0X0VlSjTFvAW812XZPC/ueHcmyKKWUapuOaFZKKeXSoKCUUsqlQUEppZRLg4JS\nSimXBgWllFIuDQpKKaVcGhSUUkq5NCgopZRyaVBQSinl0qCglFLKpUFBKaWUS4OCUkoplwYFpZRS\nLg0KSimlXBoUlFJKuTQoKKWUcmlQUEop5dKgoJRSyqVBQSmllEuDglJKKZcGBaWUUi4NCkoppVwa\nFJRSSrk0KCillHJpUFBKKeXSoKCUUsqlQUEppZQrokFBROaKyBYRyReRu8I8f4eIbBSRtSLybxE5\nIZLlUUop1bqIBQUR8QMPARcCE4HrRGRik91WAXnGmCnAy8DvIlUepZRSbYtkTWEmkG+M2WGMqQNe\nBC7z7mCM+dAYU2U/XALkRLA8Siml2hDJoDAU2Ot5XGBva8nNwNsRLI9SSqk2RPV0AQBE5AYgDzir\nhedvBW4FGD58eDeWTCml+pdI1hT2AcM8j3PsbY2IyBzgp8A8Y0xtuAMZYx41xuQZY/KysrIiUlil\nlFKRDQrLgDEiMkJEYoBrgfneHUTkJOAvWAGhMIJlUUop1Q4RCwrGmADwbeBdYBPwkjFmg4jcJyLz\n7N1+DyQB/xCR1SIyv4XDKaWU6gYRbVMwxrwFvNVk2z2en+dE8vcrpZTqGB3RrJRSyqVBQSmllEuD\nglJKKZcGBaWUUi4NCkoppVwaFJRSSrk0KCillHJpUFBKKeXqFRPiKaVUV6uvr6egoICampqeLkq3\niouLIycnh+jo6E69XoOCUqpPKigoIDk5mdzcXESkp4vTLYwxlJSUUFBQwIg
RIzp1DE0fKaX6pJqa\nGjIyMvpNQAAQETIyMo6pdqRBQSnVZ/WngOA41nPWoKCUUsqlQUEppSKkurqas846i2AwyOrVq5k1\naxaTJk1iypQp/P3vf2/z9Q8++CATJ05kypQpnHfeeezevRuAoqIi5s6dG5Eya1BQSqkIeeKJJ7jy\nyivx+/0kJCTw9NNPs2HDBt555x2+973vUVpa2urrTzrpJJYvX87atWu56qqruPPOOwHIyspi8ODB\nLFq0qMvLrL2PlFJ93r3/2sDG/eVdesyJQ1L4+aWTWt3nueee4/nnnwdg7Nix7vYhQ4aQnZ1NUVER\naWlpLb7+nHPOcX8+9dRTefbZZ93Hl19+Oc899xynn356Z08hLK0pKKVUBNTV1bFjxw5yc3ObPff5\n559TV1fHqFGj2n28xx9/nAsvvNB9nJeXxyeffNIVRW1EawpKqT6vrTv6SCguLg5bCzhw4ABf+cpX\neOqpp/D52ndf/uyzz7J8+XI+/vhjd1t2djb79+/vsvI6NCgopVQExMfHNxsvUF5ezsUXX8z999/P\nqaee2q7jLFiwgPvvv5+PP/6Y2NhYd3tNTQ3x8fFdWmbQ9JFSSkVEeno6wWDQDQx1dXVcccUVfPWr\nX+Wqq65qtO+Pf/xjXn311WbHWLVqFbfddhvz588nOzu70XNbt25l8uTJXV5uDQpKKRUhF1xwAZ9+\n+ikAL730EgsXLuTJJ59k2rRpTJs2jdWrVwOwbt06Bg0a1Oz1P/rRj6isrORLX/oS06ZNY968ee5z\nH374IRdffHGXl1nTR0opFSHf+ta3+OMf/8icOXO44YYbuOGGG8LuV19fz6xZs5ptX7BgQYvHnj9/\nPq+//nqXldWhNQWllIqQ6dOnc8455xAMBlvd79133+3QcYuKirjjjjtIT08/luKFpTUFpZSKoJtu\nuqnLj5mVlcXll1/e5ccFrSkopfowY0xPF6HbHes5a1BQSvVJcXFxlJSU9KvA4KynEBcX1+ljaPpI\nKdUn5eTkUFBQQFFRUU8XpVs5K691lgYFpVSfFB0d3enVx/qziKaPRGSuiGwRkXwRuSvM87Ei8nf7\n+aUikhvJ8iillGpdxIKCiPiBh4ALgYnAdSIyscluNwNHjDGjgT8Cv41UeZRSSrUtkjWFmUC+MWaH\nMaYOeBG4rMk+lwFP2T+/DJwn/XH9PKWU6iUi2aYwFNjreVwAnNLSPsaYgIiUARlAsXcnEbkVuNV+\nWCkiWzpZpsymx+4H9Jz7Bz3n/uFYzvmE9ux0XDQ0G2MeBR491uOIyHJjTF4XFOm4oefcP+g59w/d\ncc6RTB/tA4Z5HufY28LuIyJRQCpQEsEyKaWUakUkg8IyYIyIjBCRGOBaYH6TfeYDX7N/vgr4wPSn\nkSZKKdXLRCx9ZLcRfBt4F/ADTxhjNojIfcByY8x84HHgGRHJBw5jBY5IOuYU1HFIz7l/0HPuHyJ+\nzqI35koppRw695FSSimXBgWllFKufhEU2ppu43glIk+ISKGIrPdsGyAi74vINvv/dHu7iMj/2u/B\nWhGZ3nMl7zwRGSYiH4rIRhHZICLftbf32fMWkTgR+VxE1tjnfK+9fYQ9PUy+PV1MjL29z0wfIyJ+\nEVklIm/Yj/v0OYvILhFZJyKrRWS5va1bP9t9Pii0c7qN49WTwNwm2+4C/m2MGQP8234M1vmPsf/d\nCjzSTWXsagHgB8aYicCpwLfsv2dfPu9a4FxjzFRgGjBXRE7Fmhbmj/Y0MUewpo2BvjV9zHeBTZ7H\n/eGczzHGTPOMR+jez7Yxpk//A2YB73oe/xj4cU+XqwvPLxdY73m8BRhs/zwY2GL//BfgunD7Hc//\ngNeB8/vLeQMJwEqs2QGKgSh7u/s5x+rxN8v+OcreT3q67J041xysi+C5wBuA9INz3gVkNtnWrZ/t\nPl9TIPx0G0N7qCzdYaAx5oD980FgoP1
zn3sf7BTBScBS+vh522mU1UAh8D6wHSg1xgTsXbzn1Wj6\nGMCZPuZ48yfgTiBkP86g75+zAd4TkRX29D7QzZ/t42KaC9U5xhgjIn2yz7GIJAH/BL5njCn3zqPY\nF8/bGBMEpolIGvAqML6HixRRInIJUGiMWSEiZ/d0ebrRbGPMPhHJBt4Xkc3eJ7vjs90fagrtmW6j\nLzkkIoMB7P8L7e195n0QkWisgPCcMeYVe3OfP28AY0wp8CFW6iTNnh4GGp9XX5g+5nRgnojswpph\n+Vzgf+jb54wxZp/9fyFW8J9JN3+2+0NQaM90G32Jd+qQr2Hl3J3tX7V7LJwKlHmqpMcNsaoEjwOb\njDEPep7qs+ctIll2DQERicdqQ9mEFRyusndres7H9fQxxpgfG2NyjDG5WN/ZD4wx19OHz1lEEkUk\n2fkZuABYT3d/tnu6YaWbGm8uArZi5WF/2tPl6cLzegE4ANRj5RNvxsqj/hvYBiwABtj7ClYvrO3A\nOiCvp8vfyXOejZV3XQustv9d1JfPG5gCrLLPeT1wj719JPA5kA/8A4i1t8fZj/Pt50f29Dkc4/mf\nDbzR18/ZPrc19r8NzrWquz/bOs2FUkopV39IHymllGonDQpKKaVcGhSUUkq5NCgopZRyaVBQSinl\n0qCg+h0RqbT/zxWRL3fxsX/S5PHirjy+UpGmQUH1Z7lAh4KCZzRtSxoFBWPMaR0sk1I9SoOC6s8e\nAM6w567/vj3p3O9FZJk9P/1tACJytoh8IiLzgY32ttfsScs2OBOXicgDQLx9vOfsbU6tROxjr7fn\ny7/Gc+yPRORlEdksIs/Zo7YRkQfEWjdirYj8d7e/O6pf0gnxVH92F/BDY8wlAPbFvcwYM0NEYoFF\nIvKeve90YLIxZqf9+CZjzGF72ollIvJPY8xdIvJtY8y0ML/rSqy1EKYCmfZrFtrPnQRMAvYDi4DT\nRWQTcAUw3hhjnGkulIo0rSko1eACrLlkVmNNx52BtYAJwOeegADwHRFZAyzBmpRsDK2bDbxgjAka\nYw4BHwMzPMcuMMaEsKbtyMWa+rkGeFxErgSqjvnslGoHDQpKNRDgdmOtejXNGDPCGOPUFI66O1lT\nOc/BWtRlKta8RHHH8HtrPT8HsRaRCWDNkPkycAnwzjEcX6l206Cg+rMKINnz+F3gP+2puRGRsfZs\nlU2lYi39WCUi47GWBXXUO69v4hPgGrvdIgs4E2vitrDs9SJSjTFvAd/HSjspFXHapqD6s7VA0E4D\nPYk1X38usNJu7C0CLg/zuneAb9h5/y1YKSTHo8BaEVlprKmeHa9irYGwBmuW1zuNMQftoBJOMvC6\niMRh1WDu6NwpKtUxOkuqUkopl6aPlFJKuTQoKKWUcmlQUEop5dKgoJRSyqVBQSmllEuDglJKKZcG\nBaWUUq7/D2ktlL9G6rguAAAAAElFTkSuQmCC\n", - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], - "source": [ - "agent = PassiveTDAgent(policy, sequential_decision_environment, alpha=lambda n: 60./(59+n))\n", - "graph_utility_estimates(agent, sequential_decision_environment, 500, [(2,2)])" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "It is also possible to plot multiple states on the same plot." 
- ] - }, - { - "cell_type": "code", - "execution_count": 11, - "metadata": {}, - "outputs": [ - { - "data": { - "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYUAAAEKCAYAAAD9xUlFAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzt3XecVNX5x/HPA0sv0hEEAQ2gKEVcFSs2CFiwYSIRS2Is\niUaNkUSTXzQxMbEkaozGBCNiQVGJxtUoGNSIYF0ElyaK1AWVpYkodff8/nju3J1dtrOz9ft+vfY1\nM/eeuXPuzp37nHbPtRACIiIiAA2qOwMiIlJzKCiIiEhMQUFERGIKCiIiElNQEBGRmIKCiIjEUhYU\nzGyCma01s/nFrD/fzLLMbJ6ZvWVmA1OVFxERKZtU1hQmAiNKWL8MGBpC6A/8DhifwryIiEgZpKVq\nwyGEGWbWs4T1byW9fAfolqq8iIhI2aQsKJTTJcDLxa00s8uAywBatGhx6AEHHFBV+RIRqRNmz569\nLoTQsbR01R4UzOwEPCgcU1yaEMJ4oual9PT0kJmZWUW5ExGpG8xsRVnSVWtQMLMBwD+BkSGE9dWZ\nFxERqcYhqWa2L/AscEEI4ePqyoeIiORLWU3BzJ4Ejgc6mFk2cDPQCCCE8HfgJqA98DczA9gVQkhP\nVX5ERKR0qRx9NKaU9T8EfpiqzxcRkfLTFc0iIhJTUBARkZiCgoiIxBQUREQkpqAgIiIxBQUREYkp\nKIiISExBQUREYgoKIiISU1AQEZGYgoKIiMQUFEREJKagICIiMQUFERGJKSiIiEhMQUFERGIKCiIi\nElNQEBGRmIKCiIjEFBRERCSmoCAiIjEFBRERiSkoiIhITEFBRERiCgoiIhJTUBARkZiCgoiIxFIW\nFMxsgpmtNbP5xaw3M7vXzJaYWZaZDU5VXkREpGxSWVOYCIwoYf1IoHf0dxnwQArzIiIiZZCyoBBC\nmAFsKCHJGcCjwb0DtDGzLqnKj4iIlC6tGj97H2BV0uvsaNlnqfiw376wgIVrNqdi0yIiVaJf19bc\nfPpBKf2MWtHRbGaXmVmmmWXm5ORUd3ZEROqs6qwprAa6J73uFi3bTQhhPDAeID09PVTkw1IdXUVE\n6oLqrClkABdGo5CGAF+GEFLSdCQiImWTspqCmT0JHA90MLNs4GagEUAI4e/AS8ApwBLgG+D7qcqL\niIiUTcqCQghhTCnrA3Blqj5fRETKr1Z0NIuISNVQUBARkZiCgoiIxBQUREQkpqAgIiIxBQUREYkp\nKIiISExBQUREYgoKIiISU1AQEZGYgoKIiMQUFEREJKagICIiMQUFERGJKSiIiEhMQUFERGIKCiIi\nElNQEBGRmIKCiIjEFBRERCSmoCAiIjEFBRERiSkoiIhITEFBRERi9TcobFgKXyyEjcth6o2wYRn8\n+8fw7j+qO2ci+fLy4ONpsHFFdedE6om06s5Atfj4FXjmYmiYBg0awTfr4P1/Qu4O+GIBHHF52baz\n6j3/sQ44N6XZrZM+y4I5j8EJv4Rmbas7NzXTirdh6i/gsw/hkLFwxv3VnSOpB+pfUFg8FZ46H9r0\ngA2fQtte0P1wWPUudOgDaxeWbTufvgZPjoHGLepWUNiyFjJ+Ao1bwsq34Yz7YP8TK/czPngM/vMz\nyN0OX6+Dho1hn0PhiMvKt53cnbDyHeh5DJhVbh6r046vYfpv4b1/wF7doUUn/17qmtWz4avPofsQ\naNG+unNTNXZug0ZNqzsXJapfQWHlu/D0hbB3f7jweW8+6nQANGkNIQ/e/bufCL/ZAM3bFb+dZW/C\nE+f5SY0Knozy8uDFa6BZOxj224pto7J99Tk8cjqs+zh/WfbsygsKIcDrf4AZd8B+x0P73vD+g74u\naz
L0Phna7Ve2bW3fAs9cBEumw/eegT7DYcc30Lh55eS1uqz/FCZ/D3I+giOugJNugqcugG/Wl39b\nG5bCnMfhuHHQqFn53pu7C3ZthSatyv+5pW57J7z2e5h1T/6yn30MrTpX/mfVFN9sgFd+DXMnweUz\noMuA6s5RsVLap2BmI8xssZktMbMbili/r5m9bmZzzCzLzE5JWWa++hyeGgt77QNjn4Wme0GPI73p\nokFDaNjIaw0AG5cVv50Ny+DpC6BdLzjsUv/h5OWVPz///TV88CgseqFi+1PZtqz1gPDlavjuJDjh\n/3z59i8rZ/shwLRfeUA45AL/Dob/Hob8GL79B0/zRRlraVs3wSOneW0NYO0CeHc8/KGLB+zaaukb\nMP54/y4ueA5G3u410ebt/aRSHivegnsPgTf/DMtmlO+9G1fA34+Bfw4r3/sSdnwNL1wLb/1193Wb\nP/PjbNY9MPB70GZfX752oQeLNXP8WClOSetqqmUz4G9DYO7jQID1S6o7RyVKWVAws4bA/cBIoB8w\nxsz6FUr2f8DTIYRDgPOAv6UqP6x6z0v25z1RfC2gXSIoLC96fe5OmPJ9r1Wc9wTs1c2X7/ymfHmZ\n/yy8fZ8HpC9XVSyoAGz7Et66D7ZtLvt7Ni73/Ui2a4eXRjetgrH/ggNPg6HjoPU+5T8ZFWfWX+Cd\n+730O+qvHogbNYURf4SDzvY0W74oeRtb1sLC5+GJ78Ln8/07aLk3zJkEL4/zNOs/qZz8VrXFL8Ok\nc/2Yuux/BWtnyUFh6Rvw92O9cFKcBf+GR8+Ahk38dXFNTyHAOw/Aew/mL1v9AfzzJMhZ5H9bN5Zv\nP75eD4+MgtkPw2u35i/f9iWsXeTb/iwLznkIznoAfjDN16+ZA4+f40Fxxp0Ft7l1kx8/Uy6BP/et\nvGMy1fLy4I07/bto2gbGPOXLt39VvfkqRSprCocDS0IIS0MIO4DJwBmF0gSgdfR8L2BNynLTbxRc\nkwWdDiw+Tdue/rhhmTdFFPbmXX7wnn4vtN/fS3EAO7eWPR+bVsEL10C3w2DoDd65PX4oZE7IT7Nr\ne/En+hDgy2x/zPgJvPIrfyyL5bPgL4Ng7hMFl0+9AVa94/0HPY7MX968XeX8ABdmwPSb/eT/7T/u\n3v7fogNg+Sevr9f5ST/Zjq9hwghv/lv1DpzzIPQdCR37eCDYu7+nK+9JrCb45L9ei+3cDy7+D7Tt\nUXB983ZeY1s2Ax4dBZ9nwefzit7Wohdgyg+g62C4Zq4v+7qYoPDa7/y7T5ToV38Aj57pTU3DoxP6\nh5PLvh+bP4OHR8AX82GvfT3whwALnoPb9vXScl4uXDIN+o/297TqAo2ae16Wz/RlWU/lb/PrdV6z\n+O9NMH+KFxyyM8uep+qycxtMuRhe/z0cfA5c+hr0OMrXbS9HIW5LDjx8iu9/FUllUNgHWJX0Ojta\nluw3wFgzywZeAoo8u5nZZWaWaWaZOTk5Fc9RszYlr2/cAlp29gP0D10KlrA2LIU3/+Rf8EFn+rJG\nUfv1zq9L3u6Xq70PYvMaeOl6yNsF5/zTAwv4j/zNu/15CDD5fD8QivLeg3D3QX6QLHzel330Hz9p\nlmTbZnjuCiB4PhIWvQCZD8FRV+f/UBOatatYW3ayjSt8qO8+6XDm36BBEYdcw0ZeGt7yhQfNR8/w\n5qFEDSoRADd8Cl0PgdET4KCzfN23hvm2L8zw76O2lCIT1syBpy+CTv18H4qqxSaWPXEepEV9A0UF\nvyWvwjPf9077sVOgdVdo3KromsJb93nTUotOsGmld/o+eiY028sDU+IYn3pDwVrJrh1eKEm2a4en\neewsP7YueA6GXAE7tvgIsymXeLp9j4If/jc/gIMXEDr09sEGY56EY37qx0zuLg8ID5/ifVwn/hrO\n/idYA1id6cfE2/d7nwlA5sPeT7FrR+n/88oQQvGftW2z13oWPu/B
9ewHoUlLH7yBla1mv+4T38Y9\n/WHFrPI3Ae6B6u5oHgNMDCH82cyOBB4zs4NDCAXaU0II44HxAOnp6altVGzbK78ZY/NqaNnJn7/y\nax++OjypSpzovCuqVvHNBnjnb3Dsz/wE/vHL8EIefDLNt9G2Z8FmnFadYc1cePI8+OozP/iTRyrk\n5fnyV6NO6bfuhe5HeCfipNE+CudbJ/m63F1eSksukU+7ETZn+z4kTihbN8F/rvcf6Uk3774Pzdt7\nwKqovNwoEAHnPlxyZ2fLzt7kMPvh/GUbl3ngnP0wzP+X5/HY6wq+7+ir/Q+iIFaLgsJXX3hTWPP2\ncP4z0LR10emaRyNzGjWDizLggaN2DwrrlnhA6NjXt5XoIG7ZcfegMP9Zr2EeOAoOOBWeu9xP6I2b\ne0Bos6+f9Np/y9u/3/wTnHYPYPDXwd7kedMGP8Z2bvXmouz3/Ni64FkvEScKExk/iYL2835iLMqo\n+/xY3bu/5zVvpwepqTfAphXepNnzGE876x4fKTjzLnj1Fl+WtwtevNaft9kXBl9Y5q+gQnZuhSe+\n481Al77uec/LhWm/9O9o5TuQ/b4HseSRiQ0a+KCW0moKaz/y2tHXa/2Y7jLQC0RVJJU1hdVA96TX\n3aJlyS4BngYIIbwNNAU6pDBPpUv0K4CPcAGvrn70Ihz7U2jdJX99Sc1HM+7M/5s/xZd9Mg069PV2\ndfAf3TkPQd9TvDbx7KVRQGjo/RYblnq6Bc/B7T3g+Sv9B9Cqq6c59a78Kunq2f6Ylwd/PcSbEHJ3\n+o971Xteojrqau9o37bJ006/2Q+8UX/1azYKa9LKTwoz/lT+/yN4e/XKt+CUO/I7FIvzZVKlcuD3\n/PGzD71U+spN0GuolyJL0rwdbK0lQSEvz7/vbZvhe09Bq72LT9tloJ8wv/e01ygaNs4PCrm74Pmr\n4L5D/Tsc82TBGnHjlrDgWT9Rgbfr//vHPgz07Ad9GDZ4qXfMk/nfkxlc+b4/n/M4zHvGB0ckvqev\n1/mx9Z/rPSC06QFjJkOv43z9XtFPv93+vn/FBQTwkTiJ2kNi9NmE4bDmAzh3Yn5AAN/+sjfzAwLA\niz+F/aMCUUl9LZUhd5f/tpbN8Fre+k/9//DyL3z04sy7PWid88+ih6o3bV1yTWHTSg/QZvCjt+Da\nLB+V93WO1+q3bkrdvkVSGRTeB3qbWS8za4x3JGcUSrMSOAnAzA7Eg8IetA9VgiE/hvQf+PNERJ/x\nJ+8UPuJHBdMW13y0eQ28/5A/n3k3tOjobbzgQwwTJ2Azb7Lp2Be+WuPV5ONv9Ko/+OtdO+C/N3te\nlr7u+TvtLr+Qae+Doyavvb3KvfYjD0CbVvqJ4N5DvET18i+87fa4cd7htXWjd/bNnujb63pI0f+L\nvNz8fSivLTnwxu3QezgMHFN6+kTT1Q2r4PS/QFpTyLjam8pCLoy6t/RrEZq33/Pmrqoy8y5Y9oYH\nzM6Fx18U0m4/uGImdDvU/wfN2uYHhVl3exMNwLmP7B58E9/h7InexPjMxR7sv/OI10I7HQg9j/Um\nucLHQYMGcHh0Iefbf/Oa717R9r9aAx884iNqjhuXf/JK2Lu/H8sXPBf1GZVRos+vYRMY/bD3GyXb\n/0Q/HnocDRe/FL3nIN+fVl1LH6xQHtu37D4IZOoNsPil/ALKJ694J/j7D8KgsZ6XMx/Ib94srEmr\n4msK32yAx87272nss9D5IE+fCLBPjYXpv6mUXStJypqPQgi7zOwqYBrQEJgQQlhgZrcAmSGEDOBn\nwINm9lO80/niEKp5zFmXAXDkVd7xu/0rv8L545fhhF/tXtpJBIXCzUdv3ecHbtM2Xio/8kpo3gGW\n9/HqemGtuvpjh77+A9u13WsCqzP9JLdphbclpzWBo6/ZvW+kzb4w7+loyFuSL1fBzL94J+UZf/P8\nJ04or97i+TtuXPH/ixNu9G3u
c2jp/zfwYZCtu3rT2P/+6Af38FvLdmHZKX/ytInrDHoclT/kdNgt\n+YMAStK8vf+varqcxfC/2/zEccgF5X9/s7Z+8pv1Fx/hc/A5cNY/vG+msLP/4cNLt3/lQ4JzFsOF\n/86vmTRqBhe/WPxnnXIHfDbXS79dBsKI270zeen/4PU/+kn6+Bt3f1+DhnD8bqPQS9e8HfzfWj/W\ni7L/Sd4s03uYN8WMvAP6neEnz1adfeh5Zdi43EdCHXll/m9k7hN+8j/qJ3Dyb3yU1/v/9GbOg86O\nRtWVUs4urvkod5ePbNy0wpva9j44f10iKHQ8wD83xVLapxBCeAnvQE5edlPS84XA0anMQ4U0idp2\nt3/lHbtpTeGwH+6eLnEC2/KFlyqatPT3zHnMf/A7voEVMyH9Eq82Di7mBJAYbTL05/5jatzcD/p5\nU/zH0e0wPxhCKLqzvG0Pr8InHHqxlwzBA0K7/WHAd/11szZe4wA/2ZbU+b5XN/jWyWVrp9+8xjuI\nDxzl+zF7ote4OvYp/b2Qv98Jx//S/6dn3F/2bdSGmkIIfjV34xYw8s6KXYndrC18PNX/WnX1gFpU\nQAAvsff+to/s2bbJCzz7HV++z9t3iF9DMjqpX+i13/vv5Kzx/t1VpuICAvhJN7lZJnlKmpZ7F2yG\nrKjcnd45vnWjFwrBR3u9cK03X530G1/W42gvNHXq5yP3SgsI4OeBT17xocX7Dc1f/totHmhH/TW/\nSThhn0O96ffwy0ofLFMJ6u+EeCVJdPhtXu1tqQefU/SokERN4YWrffw1wNwnvSRwxI9g5G3w/ZeL\n70BM+NbJcNEL/jkJA8d4/8LG5V4y6XkM9Dq26Pe3iYJKxwPhgNNg2O+g76n5zV3HXZ/fZJWYZ6hZ\n26IDXWFl6RgDL7Xm7vCLkN64w08eFSkpJnQ/zEeqlDUggHeqbvvSO+iry8blkPVM8WPRs56C5W96\nkG/ZsWKfsWmlP3Y7zEv5JV19D17L2rbJa5Qn/LL8n3fCr+DqOd7p36KTD4LI2wWn3FnxfUiF5u19\nOOzMe0pPW5TEpIOv3+q19GbtfDTcru3wbHRCHv1w/m/pwNO9Oe07j+X3L5ZmUxS0Hh2VPwrwk+n+\n+zn0+0V3kjdqCsN/B226774uBap79FHNlNbEO/MyH/ZhdYd+v+h0jZJKtjkfeSlw9sPef9CtjE0u\n4CWtRAddQt+R3rzTdC8/0ZfkgFO92jnqvvzRSmOe8JJ25375tQTwDivwgFCWA7m0jjHw/oPZE330\nydqF/nfU1eVrS64MLaMmkS1fFBzrv2YOfPiUXzldltJcReXlebvv5/M8IJ/8m4Lz3Ozc5nMa7XMo\nDL6o4p/zrZO9PX/sv/z4KE2H3v546t1lP3klS2uSf/JvmObTk3TsU7AQUxMkajFv/hmOubZ87503\nBf51CZz8Wz9BHzIWMB/u/dL1fkx/7+mCx3TfEf5XHkN+5IVI8OCe1hQyrvKmoRG3lW9bKaKaQnGa\ntPLSVYe+0C296DTJzR1pzXz45tqF0QG1h9KawHce9ZJJadXzfQb7aIfCE201aeklj+T3J66ULUst\nAcpWU5g9EXZtK/hDHPLjsm2/MrWM5s4p3Nk47f/g3Qd8tE/h6zmevRxe+nnlfP5HL+RfVPbuA/Dk\nd334Z2JEzOyJ3kF70s17FpxO/bN3yJclIAAMOh8umV6wI3hPXPoqjJ5Y8yYhHPoLf0wEwbJaM8eb\n9MBH5LXo6P1brffx0WwfPOp9P32+ved5PPQiuGKWP9+8Bl7+uRfUzvpHjZkoT0GhOInO44PPLv7g\nT1xIlPDhU17DKG7kQXntN7R8NY6ySP+Bd+SVNAQyWZPWfsIv7kKd3J1+8dv+J3p79WGXenU6eehu\nVUlMqLZkev6y1R94vw74yKwPHs1f98UCn4gvcSXtngjBr3hvt5+fhMHbiKd83+/RseMbL8H2PL
Zg\nW3JFNGxUepNkssbNvTmusjRpVfQQ5urWsqPXir9e569XvJXfJ1Ccz+fBgyd5c19i5NbI272pKNFc\nM/giH/5dWVpHA0s+nOzN08eNg66DKm/7e6gGfrM1xK7o2oPEvDxFadDAR0Isf9Or8x8+6UMwS2vj\nrW4ldeQV1jSp0z2tiOmNF7/kfR+n3eM/pFMreE1DZUg0H71xuzdtdOzrc0w1bgU7ojb+qTdAl0E+\nnUdieoctlTBiZen/fJTO6X/x605WzMqfQ2vzam9W/Hqt1/4kdVp09KCwejY8PNJfjytmAroQ/AZb\njVv4HEwbl/n7+kVXcw/4bn5LQWXWipq19SG3n0zzQkRp199UMdUUitM4Gn5aWkfngHO9+Qa8qllZ\ntYSaIh6JVcxsqXOf8GsgeldwRs3KlNzeu/5T79Rb8G+vsl/3kV8jAh7ItuREo7ua+YilPZ0eYeZd\nHpQGjvGr4E+/N3/d5tV+YVOPowvOLSWVr0UHv27owaiZtGERBaCpv/Qmw09e8QLdib/2vrcDTvVj\nJBEAGjbyGlZlN5OZ5dekR9xevkJaFVBQKM5PPoCfLS5b2mZRzaBBmncC1iWJmsLLv9h93ZYcn8xt\nwHcqf1hiRTRoCMdF/QMbl/nc9SHPhy227uJTjrTp4TWbrMk+nUJ6NIjgk2ll+4wPJ/tcRcm+WOBX\nuB754/wfeI+jfWrwvqd46XPTyrLf0U8qrkXSaKhWXfOnqUlYu8gvwps/xa+ladMj/xioSj2Ogf7n\n+n1AahgFheK06lz2dvdEc1HPY6pkHHGV6niAP37yio+umTfF+xFyd/oPK+SW7YrlqnLCL712s2GZ\nN+f1OrbgVb6tu3oH3wePQbfDffoM8FFDpU1hnpfnF50tfN4vNkrMXfXBo96XNChpgEHDNB9K3LFv\n9LndfJiwpFaihr/f8d7sU3ha+//dBgSvHa6Z4+35xV3jkUpn3u+DQ2ogBYXK0CqqCpY2dLQ2ar8/\nHHOdDzfNfMiH7T3xHbithw/Z7dy/5OnIq5qZj8uf+4S36SfmUUpo3dXb+9ct9pFZySXL0uZNWjEz\nugFTgC/mwe09fV6qDyf7mPWibimZODYOu6Rmds7WNfuf6Bd5jX7Y+wqSR5ut/9QDevcj/HWbHjDw\nvOrJZw2moFAZ2u/v0x4Xdz1Dbdeigze1LPi3v/70NW+3XbfY71NR03QZ6Plr2MRP1skSJ+lGLbz/\nZ+/++c1/z1zskw4WlrhYLHnk0tt/82tYXvypD10ubmbOnsd6qfXQiyu+P1J2TVv7RXXN2+0eFN55\nwGsFo+7zC91Ouql6agk1nIJCZdlvaN0tCTaPOnBXvrX7usIn3Zqgd9RO27HP7vNVJYLCoRf5urTG\nPosneKfj8lkF0698x+e0XzLdL2RKNKcteNYft270EmfPQhcfJnTu53PZ1PQRaXVRo+b5zUffbPA+\npv7f8eNi3Ke73z9EAAUFKYtEs0jybS4apPmcSomTZE3yrZO9FnDW+N3XDb7A70GduC805AcK2H3u\npKyn/XHWvX6COfxSf523K7+Gcdy41F4pLRXTuIVfY5OX6xcO7vzGBwNAzbvwrgapo0VbqVSJmkKD\nNJ+Rcsl0+O7jXhKriT+uxs19Hv6iNN3L70GdLHElNPjV25kTfB6pbofBomi292VveP9Dv7P86ldr\nCJe97jWF4qYel+qVmNIj8Z32GurTUUuJFBSkdIk7f+17pM/s+c368k1UV9OlNfaZLnds8f6DF3/q\ngwYO+2H+XFHgTWXN2/lIo17HeYd2Wab0luqRmJts8cs+e+qw31ZvfmoJBQUpXctOfhXmwWd7U1JR\no2xquytm+d31no6mN9+43K9+btHJ78a36l2/0tUsusFR/xI3JzVAYnjqu//w47cujg5MAQUFKV1a\nE7hukc/oWFc1aFBweOoX8/3x7Af93tlfrvYL0sAv1pOaLz
Fh5Wdz/Q5yNezK4ZpKQUHKplGz0tPU\ndolpMqyhX5QHPjPmgacXvCeF1A7J04RXxszF9YSGTIgktNnX+woOjaax6NTPO6YbNdOQ0tqoURQU\nrIHfZlfKREUfkYS0Jn4HvK++8L+Rt1d3jmRPdBngNzs66qrqzkmtoqAgUlirzn7nOqndGjXzW+JK\nuaj5SEREYgoKIiISU1AQEZGYgoKIiMQUFEREJKagICIiMQUFERGJpTQomNkIM1tsZkvM7IZi0nzH\nzBaa2QIz0+BwEZFqlLKL18ysIXA/MAzIBt43s4wQwsKkNL2BG4GjQwgbzaxTqvIjIiKlKzEomNl1\nhRYFYB0wM4SwrJRtHw4sCSEsjbY1GTgDWJiU5lLg/hDCRoAQwtpy5F1ERCpZac1HrQr9tQbSgZfN\n7LxS3rsPsCrpdXa0LFkfoI+ZzTKzd8xsRFEbMrPLzCzTzDJzcnKKSiIiIpWgxJpCCKHIWxWZWTtg\nOjC5Ej6/N3A80A2YYWb9QwibCuVjPDAeID09PezhZ4qISDEq1NEcQtgAlHZz3tVA96TX3aJlybKB\njBDCzqg56mM8SIiISDWoUFAwsxOAjaUkex/obWa9zKwxcB6QUSjNv/FaAmbWAW9OWlqRPImIyJ4r\nraN5Ht65nKwdsAa4sKT3hhB2mdlVwDSgITAhhLDAzG4BMkMIGdG64Wa2EMgFxoUQ1ldsV0REZE9Z\nCMU30ZtZj0KLArA+hPB1SnNVgvT09JCZmVldHy8iUiuZ2ewQQnpp6UrraF5ReVkSEZGaTtNciIhI\nTEFBRERiCgoiIhJTUBARkZiCgoiIxBQUREQkpqAgIiIxBQUREYkpKIiISExBQUREYgoKIiISU1AQ\nEZGYgoKIiMQUFEREJKagICIiMQUFERGJKSiIiEhMQUFERGIKCiIiElNQEBGRmIKCiIjEFBRERCSm\noCAiIjEFBRERiSkoiIhITEFBRERiKQ0KZjbCzBab2RIzu6GEdOeYWTCz9FTmR0RESpayoGBmDYH7\ngZFAP2CMmfUrIl0r4Brg3VTlRUREyiaVNYXDgSUhhKUhhB3AZOCMItL9Drgd2JbCvIiISBmkMijs\nA6xKep0dLYuZ2WCgewjhPyVtyMwuM7NMM8vMycmp/JyKiAhQjR3NZtYAuAv4WWlpQwjjQwjpIYT0\njh07pj5zIiL1VCqDwmqge9LrbtGyhFbAwcD/zGw5MATIUGeziEj1SWVQeB/obWa9zKwxcB6QkVgZ\nQvgyhNAhhNAzhNATeAcYFULITGGeRESkBCkLCiGEXcBVwDRgEfB0CGGBmd1iZqNS9bkiIlJxaanc\neAjhJeClQstuKibt8anMi4iIlE5XNIuISExBQUREYgoKIiISU1AQEZGYgoKIiMQUFEREJKagICIi\nMQUFEREi6Yw0AAANwklEQVSJKSiIiEhMQUFERGIKCiIiElNQEBGRmIKCiIjEFBRERCSmoCAiIjEF\nBRERiSkoiIhITEFBRERiCgoiIhJTUBARkZiCgoiIxBQUREQkpqAgIiIxBQUREYkpKIiISExBQURE\nYgoKIiISS2lQMLMRZrbYzJaY2Q1FrL/OzBaaWZaZvWpmPVKZHxERKVnKgoKZNQTuB0YC/YAxZtav\nULI5QHoIYQAwBbgjVfkREZHSpaVw24cDS0IISwHMbDJwBrAwkSCE8HpS+neAsSnMj4jUIzt37iQ7\nO5tt27ZVd1aqVNOmTenWrRuNGjWq0PtTGRT2AVYlvc4Gjigh/SXAyynMj4jUI9nZ2bRq1YqePXti\nZtWdnSoRQmD9+vVkZ2fTq1evCm2jRnQ0m9lYIB24s5j1l5lZppll5uTkVG3mRKRW2rZtG+3bt683\nAQHAzGjfvv0e1Y5SGRRWA92TXneLlhVgZicDvwJGhRC2F7WhEML4EEJ6CCG9Y8eOKcmsiNQ99Skg\nJOzpPqcyKLwP9DazXm
bWGDgPyEhOYGaHAP/AA8LaFOZFRETKIGVBIYSwC7gKmAYsAp4OISwws1vM\nbFSU7E6gJfCMmc01s4xiNiciUuts3bqVoUOHkpuby4oVKxg8eDCDBg3ioIMO4u9//3up7x83bhwH\nHHAAAwYM4KyzzmLTpk0AzJs3j4svvjgleU5pn0II4aUQQp8Qwv4hhFujZTeFEDKi5yeHEDqHEAZF\nf6NK3qKISO0xYcIEzj77bBo2bEiXLl14++23mTt3Lu+++y633XYba9asKfH9w4YNY/78+WRlZdGn\nTx/++Mc/AtC/f3+ys7NZuXJlpec5laOPRERqhN++sICFazZX6jb7dW3NzacfVGKaSZMm8cQTTwDQ\nuHHjePn27dvJy8sr9TOGDx8ePx8yZAhTpkyJX59++ulMnjyZn//85+XNeolqxOgjEZG6ZseOHSxd\nupSePXvGy1atWsWAAQPo3r07v/jFL+jatWuZtzdhwgRGjhwZv05PT+fNN9+szCwDqimISD1QWok+\nFdatW0ebNm0KLOvevTtZWVmsWbOGM888k9GjR9O5c+dSt3XrrbeSlpbG+eefHy/r1KlTqc1PFaGa\ngohICjRr1qzY6wW6du3KwQcfXKaS/sSJE3nxxReZNGlSgeGm27Zto1mzZpWW3wQFBRGRFGjbti25\nublxYMjOzmbr1q0AbNy4kZkzZ9K3b18ALrzwQt57773dtjF16lTuuOMOMjIyaN68eYF1H3/8MQcf\nfHCl51tBQUQkRYYPH87MmTMBWLRoEUcccQQDBw5k6NChXH/99fTv3x+ArKysIvsXrrrqKr766iuG\nDRvGoEGDuOKKK+J1r7/+Oqeeemql51l9CiIiKXLllVdy9913c/LJJzNs2DCysrJ2S7N582Z69+5N\nt27ddlu3ZMmSIre7fft2MjMzueeeeyo9z6opiIikyODBgznhhBPIzc0tNk3r1q155plnyrXdlStX\nctttt5GWVvnletUURERS6Ac/+EGlb7N379707t270rcLqimIiEgSBQUREYkpKIiISExBQUREYgoK\nIiIpkjx19ty5cznyyCM56KCDGDBgAE899VSp77/rrrvo168fAwYM4KSTTmLFihUA5OTkMGLEiJTk\nWUFBRCRFkqfObt68OY8++igLFixg6tSpXHvttfH9EYpzyCGHkJmZSVZWFqNHj45nRO3YsSNdunRh\n1qxZlZ5nDUkVkbrv5Rvg83mVu829+8PI20pMkjx1dp8+feLlXbt2pVOnTuTk5Ow2aV6yE044IX4+\nZMgQHn/88fj1mWeeyaRJkzj66KMrugdFUk1BRCQFipo6O+G9995jx44d7L///mXe3kMPPaSps0VE\nKkUpJfpUKGrqbIDPPvuMCy64gEceeYQGDcpWLn/88cfJzMzkjTfeiJelaupsBQURkRQoaurszZs3\nc+qpp3LrrbcyZMiQMm1n+vTp3Hrrrbzxxhs0adIkXq6ps0VEapHCU2fv2LGDs846iwsvvJDRo0cX\nSHvjjTfy3HPP7baNOXPmcPnll5ORkUGnTp0KrNPU2SIitUzy1NlPP/00M2bMYOLEiQwaNIhBgwYx\nd+5cAObNm8fee++92/vHjRvHli1bOPfccxk0aBCjRo2K12nqbBGRWiZ56uyxY8cyduzYItPt3LmT\nI488crfl06dPL3bbGRkZPP/885WW1wTVFEREUqQsU2cDTJs2rVzbzcnJ4brrrqNt27Z7kr0iqaYg\nIpJCqZg6u2PHjpx55pmVvl1QTUFE6rAQQnVnocrt6T4rKIhIndS0aVPWr19frwJDCIH169fTtGnT\nCm9DzUciUid169aN7OxscnJyqjsrVapp06ZF3u+5rBQURKROatSoEb169arubNQ6KW0+MrMRZrbY\nzJaY2Q1FrG9iZk9F6981s56pzI+IiJQsZUHBzBoC9wMjgX7AGDPrVyjZJcDGEMK3gLuB
21OVHxER\nKV0qawqHA0tCCEtDCDuAycAZhdKcATwSPZ8CnGRmlsI8iYhICVLZp7APsCrpdTZwRHFpQgi7zOxL\noD2wLjmRmV0GXBa93GJmiyuYpw6Ft10PaJ/rB+1z/bAn+9yjLIlqRUdzCGE8MH5Pt2NmmSGE9ErI\nUq2hfa4ftM/1Q1Xscyqbj1YD3ZNed4uWFZnGzNKAvYD1KcyTiIiUIJVB4X2gt5n1MrPGwHlARqE0\nGcBF0fPRwGuhPl1pIiJSw6Ss+SjqI7gKmAY0BCaEEBaY2S1AZgghA3gIeMzMlgAb8MCRSnvcBFUL\naZ/rB+1z/ZDyfTYVzEVEJEFzH4mISExBQUREYvUiKJQ23UZtZWYTzGytmc1PWtbOzP5rZp9Ej22j\n5WZm90b/gywzG1x9Oa84M+tuZq+b2UIzW2Bm10TL6+x+m1lTM3vPzD6M9vm30fJe0fQwS6LpYhpH\ny+vM9DFm1tDM5pjZi9HrOr3PZrbczOaZ2Vwzy4yWVemxXeeDQhmn26itJgIjCi27AXg1hNAbeDV6\nDb7/vaO/y4AHqiiPlW0X8LMQQj9gCHBl9H3W5f3eDpwYQhgIDAJGmNkQfFqYu6NpYjbi08ZA3Zo+\n5hpgUdLr+rDPJ4QQBiVdj1C1x3YIoU7/AUcC05Je3wjcWN35qsT96wnMT3q9GOgSPe8CLI6e/wMY\nU1S62vwHPA8Mqy/7DTQHPsBnB1gHpEXL4+McH/F3ZPQ8LUpn1Z33CuxrN/wkeCLwImD1YJ+XAx0K\nLavSY7vO1xQoerqNfaopL1Whcwjhs+j550Dn6Hmd+z9ETQSHAO9Sx/c7akaZC6wF/gt8CmwKIeyK\nkiTvV4HpY4DE9DG1zT3Az4G86HV76v4+B+AVM5sdTe8DVXxs14ppLqRiQgjBzOrkmGMzawn8C7g2\nhLA5eR7FurjfIYRcYJCZtQGeAw6o5iyllJmdBqwNIcw2s+OrOz9V6JgQwmoz6wT818w+Sl5ZFcd2\nfagplGW6jbrkCzPrAhA9ro2W15n/g5k1wgPCpBDCs9HiOr/fACGETcDreNNJm2h6GCi4X3Vh+pij\ngVFmthyfYflE4C/U7X0mhLA6elyLB//DqeJjuz4EhbJMt1GXJE8dchHe5p5YfmE0YmEI8GVSlbTW\nMK8SPAQsCiHclbSqzu63mXWMagiYWTO8D2URHhxGR8kK73Otnj4mhHBjCKFbCKEn/pt9LYRwPnV4\nn82shZm1SjwHhgPzqepju7o7Vqqo8+YU4GO8HfZX1Z2fStyvJ4HPgJ14e+IleDvqq8AnwHSgXZTW\n8FFYnwLzgPTqzn8F9/kYvN01C5gb/Z1Sl/cbGADMifZ5PnBTtHw/4D1gCfAM0CRa3jR6vSRav191\n78Me7v/xwIt1fZ+jffsw+luQOFdV9bGtaS5ERCRWH5qPRESkjBQUREQkpqAgIiIxBQUREYkpKIiI\nSExBQeodM9sSPfY0s+9V8rZ/Wej1W5W5fZFUU1CQ+qwnUK6gkHQ1bXEKBIUQwlHlzJNItVJQkPrs\nNuDYaO76n0aTzt1pZu9H89NfDmBmx5vZm2aWASyMlv07mrRsQWLiMjO7DWgWbW9StCxRK7Fo2/Oj\n+fK/m7Tt/5nZFDP7yMwmRVdtY2a3md83IsvM/lTl/x2plzQhntRnNwDXhxBOA4hO7l+GEA4zsybA\nLDN7JUo7GDg4hLAsev2DEMKGaNqJ983sXyGEG8zsqhDCoCI+62z8XggDgQ7Re2ZE6w4BDgLWALOA\no81sEXAWcEAIISSmuRBJNdUURPINx+eSmYtPx90ev4EJwHtJAQHgajP7EHgHn5SsNyU7BngyhJAb\nQvgCeAM4LGnb2SGEPHzajp741M/bgIfM7Gzgmz3eO5EyUFAQyWfAT4Lf9WpQCKFXCCFRU/g6TuRT\nOZ+M39RlID4vUdM9+NztSc9z8ZvI7MJnyJwCnAZM
3YPti5SZgoLUZ18BrZJeTwN+FE3NjZn1iWar\nLGwv/NaP35jZAfhtQRN2Jt5fyJvAd6N+i47AcfjEbUWK7hexVwjhJeCneLOTSMqpT0HqsywgN2oG\nmojP198T+CDq7M0BzizifVOBK6J2/8V4E1LCeCDLzD4IPtVzwnP4PRA+xGd5/XkI4fMoqBSlFfC8\nmTXFazDXVWwXRcpHs6SKiEhMzUciIhJTUBARkZiCgoiIxBQUREQkpqAgIiIxBQUREYkpKIiISOz/\nAW4Hvin6vj2yAAAAAElFTkSuQmCC\n", - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], - "source": [ - "graph_utility_estimates(agent, sequential_decision_environment, 500, [(2,2), (3,2)])" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "collapsed": true - }, - "source": [ - "## ACTIVE REINFORCEMENT LEARNING\n", - "\n", - "Unlike Passive Reinforcement Learning in Active Reinforcement Learning we are not bound by a policy pi and we need to select our actions. In other words the agent needs to learn an optimal policy. The fundamental tradeoff the agent needs to face is that of exploration vs. exploitation. " - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### QLearning Agent\n", - "\n", - "The QLearningAgent class in the rl module implements the Agent Program described in **Fig 21.8** of the AIMA Book. In Q-Learning the agent learns an action-value function Q which gives the utility of taking a given action in a particular state. Q-Learning does not required a transition model and hence is a model free method. Let us look into the source before we see some usage examples." - ] - }, - { - "cell_type": "code", - "execution_count": 12, - "metadata": { - "collapsed": true - }, - "outputs": [], - "source": [ - "%psource QLearningAgent" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The Agent Program can be obtained by creating the instance of the class by passing the appropriate parameters. Because of the __ call __ method the object that is created behaves like a callable and returns an appropriate action as most Agent Programs do. 
To instantiate the object we need an MDP, similar to the PassiveTDAgent.\n", - "\n", - " Let us use the same GridMDP object we used above. **Figure 17.1 (sequential_decision_environment)** is similar to **Figure 21.1** but with discounting: **gamma = 0.9**. The class also implements an exploration function **f** which returns a fixed **Rplus** until the agent has visited a given (state, action) pair **Ne** times. This is the same as the one defined on page **842** of the book. The method **actions_in_state** returns the actions possible in a given state. It is useful when applying the max and argmax operations." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Let us create our object now. We also use the **same alpha** as given in the footnote of the book on **page 837**. We use **Rplus = 2** and **Ne = 5** as defined on page **843** and shown in **Fig 21.7**." - ] - }, - { - "cell_type": "code", - "execution_count": 13, - "metadata": { - "collapsed": true - }, - "outputs": [], - "source": [ - "q_agent = QLearningAgent(sequential_decision_environment, Ne=5, Rplus=2, \n", - " alpha=lambda n: 60./(59+n))" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Now, to try out the q_agent, we make use of the **run_single_trial** function in rl.py (which was also used above). Let us use **200** iterations." - ] - }, - { - "cell_type": "code", - "execution_count": 14, - "metadata": { - "collapsed": true - }, - "outputs": [], - "source": [ - "for i in range(200):\n", - " run_single_trial(q_agent, sequential_decision_environment)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Now let us see the Q values. The keys are state-action pairs. 
Where differnt actions correspond according to:\n", - "\n", - "north = (0, 1)\n", - "south = (0,-1)\n", - "west = (-1, 0)\n", - "east = (1, 0)" - ] - }, - { - "cell_type": "code", - "execution_count": 15, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "defaultdict(float,\n", - " {((0, 0), (-1, 0)): -0.12953971401732597,\n", - " ((0, 0), (0, -1)): -0.12753699595470713,\n", - " ((0, 0), (0, 1)): -0.01158029172666495,\n", - " ((0, 0), (1, 0)): -0.13035841083471436,\n", - " ((0, 1), (-1, 0)): -0.04,\n", - " ((0, 1), (0, -1)): -0.1057916516323444,\n", - " ((0, 1), (0, 1)): 0.13072636267769677,\n", - " ((0, 1), (1, 0)): -0.07323076923076924,\n", - " ((0, 2), (-1, 0)): 0.12165200587479848,\n", - " ((0, 2), (0, -1)): 0.09431411803674361,\n", - " ((0, 2), (0, 1)): 0.14047883620608154,\n", - " ((0, 2), (1, 0)): 0.19224095989491635,\n", - " ((1, 0), (-1, 0)): -0.09696833851887868,\n", - " ((1, 0), (0, -1)): -0.15641263417341367,\n", - " ((1, 0), (0, 1)): -0.15340385689815017,\n", - " ((1, 0), (1, 0)): -0.15224266498911238,\n", - " ((1, 2), (-1, 0)): 0.18537063683043895,\n", - " ((1, 2), (0, -1)): 0.17757702529142774,\n", - " ((1, 2), (0, 1)): 0.17562120416256435,\n", - " ((1, 2), (1, 0)): 0.27484289408254886,\n", - " ((2, 0), (-1, 0)): -0.16785234970594098,\n", - " ((2, 0), (0, -1)): -0.1448679824723624,\n", - " ((2, 0), (0, 1)): -0.028114098214323924,\n", - " ((2, 0), (1, 0)): -0.16267477943781278,\n", - " ((2, 1), (-1, 0)): -0.2301056003129034,\n", - " ((2, 1), (0, -1)): -0.4332722098873507,\n", - " ((2, 1), (0, 1)): 0.2965645851500498,\n", - " ((2, 1), (1, 0)): -0.90815406879654,\n", - " ((2, 2), (-1, 0)): 0.1905755278897695,\n", - " ((2, 2), (0, -1)): 0.07306332481110034,\n", - " ((2, 2), (0, 1)): 0.1793881607466996,\n", - " ((2, 2), (1, 0)): 0.34260576652777697,\n", - " ((3, 0), (-1, 0)): -0.16576962655130892,\n", - " ((3, 0), (0, -1)): -0.16840120349372995,\n", - " ((3, 0), (0, 1)): -0.5090288592720464,\n", - " ((3, 0), (1, 0)): -0.88375,\n", - " 
((3, 1), None): -0.6897322258069369,\n", - " ((3, 2), None): 0.388990723935834})" - ] - }, - "execution_count": 15, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "q_agent.Q" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The Utility **U** of each state is related to **Q** by the following equation:\n", - "\n", - "**U(s) = max<sub>a</sub> Q(s, a)**\n", - "\n", - "Let us convert the Q Values above into U estimates.\n", - "\n" - ] - }, - { - "cell_type": "code", - "execution_count": 16, - "metadata": { - "collapsed": true - }, - "outputs": [], - "source": [ - "U = defaultdict(lambda: -1000.) # A very large negative initial value for the max comparison below.\n", - "for state_action, value in q_agent.Q.items():\n", - " state, action = state_action\n", - " if U[state] < value:\n", - " U[state] = value" - ] - }, - { - "cell_type": "code", - "execution_count": 17, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "defaultdict(<function <lambda>>,\n", - " {(0, 0): -0.01158029172666495,\n", - " (0, 1): 0.13072636267769677,\n", - " (0, 2): 0.19224095989491635,\n", - " (1, 0): -0.09696833851887868,\n", - " (1, 2): 0.27484289408254886,\n", - " (2, 0): -0.028114098214323924,\n", - " (2, 1): 0.2965645851500498,\n", - " (2, 2): 0.34260576652777697,\n", - " (3, 0): -0.16576962655130892,\n", - " (3, 1): -0.6897322258069369,\n", - " (3, 2): 0.388990723935834})" - ] - }, - "execution_count": 17, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "U" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Let us finally compare these estimates to the value_iteration results." 
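The utility extraction in the deleted cell can be checked in isolation. A minimal standalone sketch of U(s) = max over a of Q(s, a); the tiny Q-table below is made up for illustration, not the learned values from the notebook:

```python
from collections import defaultdict

# Hypothetical Q-table {(state, action): value}; states are grid
# coordinates and actions are direction vectors, as in the notebook.
Q = {((0, 0), (0, 1)): -0.011, ((0, 0), (1, 0)): -0.130,
     ((0, 1), (0, 1)): 0.131, ((0, 1), (-1, 0)): -0.040}

# U(s) = max over actions a of Q(s, a): keep the best value seen per state.
U = defaultdict(lambda: float('-inf'))
for (state, action), value in Q.items():
    U[state] = max(U[state], value)

print(dict(U))  # {(0, 0): -0.011, (0, 1): 0.131}
```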
- ] - }, - { - "cell_type": "code", - "execution_count": 18, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "{(0, 1): 0.3984432178350045, (1, 2): 0.649585681261095, (3, 2): 1.0, (0, 0): 0.2962883154554812, (3, 0): 0.12987274656746342, (3, 1): -1.0, (2, 1): 0.48644001739269643, (2, 0): 0.3447542300124158, (2, 2): 0.7953620878466678, (1, 0): 0.25386699846479516, (0, 2): 0.5093943765842497}\n" - ] - } - ], - "source": [ - "print(value_iteration(sequential_decision_environment))" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, - "outputs": [], - "source": [] - } - ], - "metadata": { - "kernelspec": { - "display_name": "Python 3", - "language": "python", - "name": "python3" - }, - "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", - "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.5.2+" - } - }, - "nbformat": 4, - "nbformat_minor": 1 -} diff --git a/rl.py b/rl.py deleted file mode 100644 index 20a392592..000000000 --- a/rl.py +++ /dev/null @@ -1,203 +0,0 @@ -"""Reinforcement Learning (Chapter 21)""" - -from collections import defaultdict - from utils import argmax - from mdp import MDP, policy_evaluation - -import random - - -class PassiveADPAgent: - - """Passive (non-learning) agent that uses adaptive dynamic programming - on a given MDP and policy. [Figure 21.2]""" - - class ModelMDP(MDP): - """ Class for implementing a modified version of the input MDP with - an editable transition model P and a custom function T. 
""" - def __init__(self, init, actlist, terminals, gamma, states): - super().__init__(init, actlist, terminals, gamma) - nested_dict = lambda: defaultdict(nested_dict) - # StackOverflow:whats-the-best-way-to-initialize-a-dict-of-dicts-in-python - self.P = nested_dict() - - def T(self, s, a): - """Returns a list of tuples with probabilities for states - based on the learnt model P.""" - return [(prob, res) for (res, prob) in self.P[(s, a)].items()] - - def __init__(self, pi, mdp): - self.pi = pi - self.mdp = PassiveADPAgent.ModelMDP(mdp.init, mdp.actlist, - mdp.terminals, mdp.gamma, mdp.states) - self.U = {} - self.Nsa = defaultdict(int) - self.Ns1_sa = defaultdict(int) - self.s = None - self.a = None - - def __call__(self, percept): - s1, r1 = percept - self.mdp.states.add(s1) # Model keeps track of visited states. - R, P, mdp, pi = self.mdp.reward, self.mdp.P, self.mdp, self.pi - s, a, Nsa, Ns1_sa, U = self.s, self.a, self.Nsa, self.Ns1_sa, self.U - - if s1 not in R: # Reward is only available for visted state. - U[s1] = R[s1] = r1 - if s is not None: - Nsa[(s, a)] += 1 - Ns1_sa[(s1, s, a)] += 1 - # for each t such that Ns′|sa [t, s, a] is nonzero - for t in [res for (res, state, act), freq in Ns1_sa.items() - if (state, act) == (s, a) and freq != 0]: - P[(s, a)][t] = Ns1_sa[(t, s, a)] / Nsa[(s, a)] - - U = policy_evaluation(pi, U, mdp) - if s1 in mdp.terminals: - self.s = self.a = None - else: - self.s, self.a = s1, self.pi[s1] - return self.a - - def update_state(self, percept): - '''To be overridden in most cases. The default case - assumes the percept to be of type (state, reward)''' - return percept - - -class PassiveTDAgent: - """The abstract class for a Passive (non-learning) agent that uses - temporal differences to learn utility estimates. Override update_state - method to convert percept to state and reward. The mdp being provided - should be an instance of a subclass of the MDP Class. 
[Figure 21.4] - """ - - def __init__(self, pi, mdp, alpha=None): - - self.pi = pi - self.U = {s: 0. for s in mdp.states} - self.Ns = {s: 0 for s in mdp.states} - self.s = None - self.a = None - self.r = None - self.gamma = mdp.gamma - self.terminals = mdp.terminals - - if alpha: - self.alpha = alpha - else: - self.alpha = lambda n: 1./(1+n) # Udacity video - - def __call__(self, percept): - s1, r1 = self.update_state(percept) - pi, U, Ns, s, r = self.pi, self.U, self.Ns, self.s, self.r - alpha, gamma, terminals = self.alpha, self.gamma, self.terminals - if not Ns[s1]: - U[s1] = r1 - if s is not None: - Ns[s] += 1 - U[s] += alpha(Ns[s]) * (r + gamma * U[s1] - U[s]) - if s1 in terminals: - self.s = self.a = self.r = None - else: - self.s, self.a, self.r = s1, pi[s1], r1 - return self.a - - def update_state(self, percept): - ''' To be overridden in most cases. The default case - assumes the percept to be of type (state, reward)''' - return percept - - -class QLearningAgent: - """ An exploratory Q-learning agent. It avoids having to learn the transition - model because the Q-value of a state can be related directly to those of - its neighbors. [Figure 21.8] - """ - def __init__(self, mdp, Ne, Rplus, alpha=None): - - self.gamma = mdp.gamma - self.terminals = mdp.terminals - self.all_act = mdp.actlist - self.Ne = Ne # iteration limit in exploration function - self.Rplus = Rplus # large value to assign before iteration limit - self.Q = defaultdict(float) - self.Nsa = defaultdict(float) - self.s = None - self.a = None - self.r = None - - if alpha: - self.alpha = alpha - else: - self.alpha = lambda n: 1./(1+n) # Udacity video - - def f(self, u, n): - """ Exploration function. Returns the fixed Rplus until the - agent has visited the state-action pair Ne times. - Same as the ADP agent in the book.""" - if n < self.Ne: - return self.Rplus - else: - return u - - def actions_in_state(self, state): - """ Returns the actions possible in a given state. - Useful for max and argmax. 
""" - if state in self.terminals: - return [None] - else: - return self.all_act - - def __call__(self, percept): - s1, r1 = self.update_state(percept) - Q, Nsa, s, a, r = self.Q, self.Nsa, self.s, self.a, self.r - alpha, gamma, terminals = self.alpha, self.gamma, self.terminals, - actions_in_state = self.actions_in_state - - if s in terminals: - Q[s, None] = r1 - if s is not None: - Nsa[s, a] += 1 - Q[s, a] += alpha(Nsa[s, a]) * (r + gamma * max(Q[s1, a1] - for a1 in actions_in_state(s1)) - Q[s, a]) - if s in terminals: - self.s = self.a = self.r = None - else: - self.s, self.r = s1, r1 - self.a = argmax(actions_in_state(s1), key=lambda a1: self.f(Q[s1, a1], Nsa[s1, a1])) - return self.a - - def update_state(self, percept): - ''' To be overridden in most cases. The default case - assumes the percept to be of type (state, reward)''' - return percept - - -def run_single_trial(agent_program, mdp): - ''' Execute trial for given agent_program - and mdp. mdp should be an instance of subclass - of mdp.MDP ''' - - def take_single_action(mdp, s, a): - ''' - Selects outcome of taking action a - in state s. Weighted Sampling. 
- ''' - x = random.uniform(0, 1) - cumulative_probability = 0.0 - for probability_state in mdp.T(s, a): - probability, state = probability_state - cumulative_probability += probability - if x < cumulative_probability: - break - return state - - current_state = mdp.init - while True: - current_reward = mdp.R(current_state) - percept = (current_state, current_reward) - next_action = agent_program(percept) - if next_action is None: - break - current_state = take_single_action(mdp, current_state, next_action) diff --git a/search-4e.ipynb b/search-4e.ipynb deleted file mode 100644 index 785596ef0..000000000 --- a/search-4e.ipynb +++ /dev/null @@ -1,2151 +0,0 @@ -{ - "cells": [ - { - "cell_type": "markdown", - "metadata": { - "button": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "source": [ - "*Note: This is not yet ready, but shows the direction I'm leaning in for Fourth Edition Search.*\n", - "\n", - "# State-Space Search\n", - "\n", - "This notebook describes several state-space search algorithms, and how they can be used to solve a variety of problems. We start with a simple algorithm and a simple domain: finding a route from city to city. 
Later we will explore other algorithms and domains.\n", - "\n", - "## The Route-Finding Domain\n", - "\n", - "Like all state-space search problems, in a route-finding problem you will be given:\n", - "- A start state (for example, `'A'` for the city Arad).\n", - "- A goal state (for example, `'B'` for the city Bucharest).\n", - "- Actions that can change state (for example, driving from `'A'` to `'S'`).\n", - "\n", - "You will be asked to find:\n", - "- A path from the start state, through intermediate states, to the goal state.\n", - "\n", - "We'll use this map:\n", - "\n", - "\n", - "\n", - "A state-space search problem can be represented by a *graph*, where the vertexes of the graph are the states of the problem (in this case, cities) and the edges of the graph are the actions (in this case, driving along a road).\n", - "\n", - "We'll represent a city by its single initial letter. \n", - "We'll represent the graph of connections as a `dict` that maps each city to a list of the neighboring cities (connected by a road). For now we don't explicitly represent the actions, nor the distances\n", - "between cities." 
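The adjacency-dict representation described above can be sanity-checked on a small fragment before the full map is defined. A minimal sketch; the `mini_map` fragment below is illustrative, and only the full `romania` dict in the notebook is authoritative:

```python
# A fragment of the adjacency-dict representation described above:
# each city (its single initial letter) maps to its road-connected neighbors.
mini_map = {
    'A': ['Z', 'T', 'S'],
    'Z': ['O', 'A'],
    'T': ['A', 'L'],
    'S': ['A', 'O', 'F', 'R'],
}

# Roads are two-way, so every link whose endpoints are both in the
# fragment must appear in both directions.
for city, neighbors in mini_map.items():
    for n in neighbors:
        if n in mini_map:
            assert city in mini_map[n], (city, n)
```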
- ] - }, - { - "cell_type": "code", - "execution_count": 1, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [], - "source": [ - "romania = {\n", - " 'A': ['Z', 'T', 'S'],\n", - " 'B': ['F', 'P', 'G', 'U'],\n", - " 'C': ['D', 'R', 'P'],\n", - " 'D': ['M', 'C'],\n", - " 'E': ['H'],\n", - " 'F': ['S', 'B'],\n", - " 'G': ['B'],\n", - " 'H': ['U', 'E'],\n", - " 'I': ['N', 'V'],\n", - " 'L': ['T', 'M'],\n", - " 'M': ['L', 'D'],\n", - " 'N': ['I'],\n", - " 'O': ['Z', 'S'],\n", - " 'P': ['R', 'C', 'B'],\n", - " 'R': ['S', 'C', 'P'],\n", - " 'S': ['A', 'O', 'F', 'R'],\n", - " 'T': ['A', 'L'],\n", - " 'U': ['B', 'V', 'H'],\n", - " 'V': ['U', 'I'],\n", - " 'Z': ['O', 'A']}" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "button": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "source": [ - "Suppose we want to get from `A` to `B`. Where can we go from the start state, `A`?" - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [ - { - "data": { - "text/plain": [ - "['Z', 'T', 'S']" - ] - }, - "execution_count": 2, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "romania['A']" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "button": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "source": [ - "We see that from `A` we can get to any of the three cities `['Z', 'T', 'S']`. Which should we choose? *We don't know.* That's the whole point of *search*: we don't know which immediate action is best, so we'll have to explore, until we find a *path* that leads to the goal. \n", - "\n", - "How do we explore? 
We'll start with a simple algorithm that will get us from `A` to `B`. We'll keep a *frontier*—a collection of not-yet-explored states—and expand the frontier outward until it reaches the goal. To be more precise:\n", - "\n", - "- Initially, the only state in the frontier is the start state, `'A'`.\n", - "- Until we reach the goal, or run out of states in the frontier to explore, do the following:\n", - " - Remove the first state from the frontier. Call it `s`.\n", - " - If `s` is the goal, we're done. Return the path to `s`.\n", - " - Otherwise, consider all the neighboring states of `s`. For each one:\n", - " - If we have not previously explored the state, add it to the end of the frontier.\n", - " - Also keep track of the previous state that led to this new neighboring state; we'll need this to reconstruct the path to the goal, and to keep us from re-visiting previously explored states.\n", - " \n", - "# A Simple Search Algorithm: `breadth_first`\n", - " \n", - "The function `breadth_first` implements this strategy:" - ] - }, - { - "cell_type": "code", - "execution_count": 3, - "metadata": { - "button": false, - "collapsed": true, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [], - "source": [ - "from collections import deque # Double-ended queue: pop from the left, append to the right.\n", - "\n", - "def breadth_first(start, goal, neighbors):\n", - " \"Find a shortest sequence of states from start to the goal.\"\n", - " frontier = deque([start]) # A queue of states\n", - " previous = {start: None} # start has no previous state; other states will\n", - " while frontier:\n", - " s = frontier.popleft()\n", - " if s == goal:\n", - " return path(previous, s)\n", - " for s2 in neighbors[s]:\n", - " if s2 not in previous:\n", - " frontier.append(s2)\n", - " previous[s2] = s\n", - " \n", - "def path(previous, s): \n", - " \"Return a list of states that lead to state s, according to the previous dict.\"\n", - " return [] 
if (s is None) else path(previous, previous[s]) + [s]" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "button": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "source": [ - "A few things to note: \n", - "\n", - "1. We always add new states to the end of the frontier queue. That means that all the states that are adjacent to the start state will come first in the queue, then all the states that are two steps away, then three steps, etc.\n", - "That's what we mean by *breadth-first* search.\n", - "2. We recover the path to an `end` state by following the trail of `previous[end]` pointers, all the way back to `start`.\n", - "The dict `previous` is a map of `{state: previous_state}`. \n", - "3. When we finally get an `s` that is the goal state, we know we have found a shortest path, because any other state in the queue must correspond to a path that is as long or longer.\n", - "4. Note that `previous` contains all the states that are currently in `frontier` as well as all the states that were in `frontier` in the past.\n", - "5. If no path to the goal is found, then `breadth_first` returns `None`. 
If a path is found, it returns the sequence of states on the path.\n", - "\n", - "Some examples:" - ] - }, - { - "cell_type": "code", - "execution_count": 4, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [ - { - "data": { - "text/plain": [ - "['A', 'S', 'F', 'B']" - ] - }, - "execution_count": 4, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "breadth_first('A', 'B', romania)" - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [ - { - "data": { - "text/plain": [ - "['L', 'T', 'A', 'S', 'F', 'B', 'U', 'V', 'I', 'N']" - ] - }, - "execution_count": 5, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "breadth_first('L', 'N', romania)" - ] - }, - { - "cell_type": "code", - "execution_count": 6, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [ - { - "data": { - "text/plain": [ - "['N', 'I', 'V', 'U', 'B', 'F', 'S', 'A', 'T', 'L']" - ] - }, - "execution_count": 6, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "breadth_first('N', 'L', romania)" - ] - }, - { - "cell_type": "code", - "execution_count": 7, - "metadata": { - "collapsed": false - }, - "outputs": [ - { - "data": { - "text/plain": [ - "['E']" - ] - }, - "execution_count": 7, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "breadth_first('E', 'E', romania)" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "button": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "source": [ - "Now let's try a different kind of problem that can be solved with 
the same search function.\n", - "\n", - "## Word Ladders Problem\n", - "\n", - "A *word ladder* problem is this: given a start word and a goal word, find the shortest way to transform the start word into the goal word by changing one letter at a time, such that each change results in a word. For example, starting with `green` we can reach `grass` in 7 steps:\n", - "\n", - "`green` → `greed` → `treed` → `trees` → `tress` → `cress` → `crass` → `grass`\n", - "\n", - "We will need a dictionary of words. We'll use 5-letter words from the [Stanford GraphBase](http://www-cs-faculty.stanford.edu/~uno/sgb.html) project for this purpose. Let's get that file from the aima-data repository." - ] - }, - { - "cell_type": "code", - "execution_count": 8, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [], - "source": [ - "from search import *\n", - "sgb_words = open_data(\"EN-text/sgb-words.txt\")" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "button": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "source": [ - "We can assign `WORDS` to be the set of all the words in this file:" - ] - }, - { - "cell_type": "code", - "execution_count": 9, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [ - { - "data": { - "text/plain": [ - "5757" - ] - }, - "execution_count": 9, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "WORDS = set(sgb_words.read().split())\n", - "len(WORDS)" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "button": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "source": [ - "And define `neighboring_words` to return the set of all words that are a one-letter change away from a given `word`:" - ] - }, - { - 
"cell_type": "code", - "execution_count": 10, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [], - "source": [ - "def neighboring_words(word):\n", - " \"All words that are one letter away from this word.\"\n", - " neighbors = {word[:i] + c + word[i+1:]\n", - " for i in range(len(word))\n", - " for c in 'abcdefghijklmnopqrstuvwxyz'\n", - " if c != word[i]}\n", - " return neighbors & WORDS" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "For example:" - ] - }, - { - "cell_type": "code", - "execution_count": 11, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [ - { - "data": { - "text/plain": [ - "{'cello', 'hallo', 'hells', 'hullo', 'jello'}" - ] - }, - "execution_count": 11, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "neighboring_words('hello')" - ] - }, - { - "cell_type": "code", - "execution_count": 12, - "metadata": { - "collapsed": false - }, - "outputs": [ - { - "data": { - "text/plain": [ - "{'would'}" - ] - }, - "execution_count": 12, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "neighboring_words('world')" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "button": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "source": [ - "Now we can create `word_neighbors` as a dict of `{word: {neighboring_word, ...}}`: " - ] - }, - { - "cell_type": "code", - "execution_count": 13, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [], - "source": [ - "word_neighbors = {word: neighboring_words(word)\n", - " for word in WORDS}" - ] - }, - { - "cell_type": "markdown", - "metadata": { - 
"button": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "source": [ - "Now the `breadth_first` function can be used to solve a word ladder problem:" - ] - }, - { - "cell_type": "code", - "execution_count": 14, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [ - { - "data": { - "text/plain": [ - "['green', 'greed', 'treed', 'trees', 'treys', 'greys', 'grays', 'grass']" - ] - }, - "execution_count": 14, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "breadth_first('green', 'grass', word_neighbors)" - ] - }, - { - "cell_type": "code", - "execution_count": 15, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [ - { - "data": { - "text/plain": [ - "['smart',\n", - " 'start',\n", - " 'stars',\n", - " 'sears',\n", - " 'bears',\n", - " 'beans',\n", - " 'brans',\n", - " 'brand',\n", - " 'braid',\n", - " 'brain']" - ] - }, - "execution_count": 15, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "breadth_first('smart', 'brain', word_neighbors)" - ] - }, - { - "cell_type": "code", - "execution_count": 16, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [ - { - "data": { - "text/plain": [ - "['frown',\n", - " 'flown',\n", - " 'flows',\n", - " 'slows',\n", - " 'stows',\n", - " 'stoas',\n", - " 'stoae',\n", - " 'stole',\n", - " 'stile',\n", - " 'smile']" - ] - }, - "execution_count": 16, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "breadth_first('frown', 'smile', word_neighbors)" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "button": false, - "deletable": true, - "new_sheet": false, - 
"run_control": { - "read_only": false - } - }, - "source": [ - "# More General Search Algorithms\n", - "\n", - "Now we'll embelish the `breadth_first` algorithm to make a family of search algorithms with more capabilities:\n", - "\n", - "1. We distinguish between an *action* and the *result* of an action.\n", - "3. We allow different measures of the cost of a solution (not just the number of steps in the sequence).\n", - "4. We search through the state space in an order that is more likely to lead to an optimal solution quickly.\n", - "\n", - "Here's how we do these things:\n", - "\n", - "1. Instead of having a graph of neighboring states, we instead have an object of type *Problem*. A Problem\n", - "has one method, `Problem.actions(state)` to return a collection of the actions that are allowed in a state,\n", - "and another method, `Problem.result(state, action)` that says what happens when you take an action.\n", - "2. We keep a set, `explored` of states that have already been explored. We also have a class, `Frontier`, that makes it efficient to ask if a state is on the frontier.\n", - "3. Each action has a cost associated with it (in fact, the cost can vary with both the state and the action).\n", - "4. 
The `Frontier` class acts as a priority queue, allowing the \"best\" state to be explored next.\n", - "We represent a sequence of actions and resulting states as a linked list of `Node` objects.\n", - "\n", - "The algorithm `breadth_first_search` is basically the same as `breadth_first`, but using our new conventions:" - ] - }, - { - "cell_type": "code", - "execution_count": 17, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [], - "source": [ - "def breadth_first_search(problem):\n", - " \"Search for goal; paths with least number of steps first.\"\n", - " if problem.is_goal(problem.initial): \n", - " return Node(problem.initial)\n", - " frontier = FrontierQ(Node(problem.initial), LIFO=False)\n", - " explored = set()\n", - " while frontier:\n", - " node = frontier.pop()\n", - " explored.add(node.state)\n", - " for action in problem.actions(node.state):\n", - " child = node.child(problem, action)\n", - " if child.state not in explored and child.state not in frontier:\n", - " if problem.is_goal(child.state):\n", - " return child\n", - " frontier.add(child)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Next is `uniform_cost_search`, in which each step can have a different cost, and we expand first the state with the minimum path cost so far." 
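The uniform-cost idea can also be shown without the `FrontierPQ` machinery. A minimal standalone sketch using only `heapq`; the function name and structure are this sketch's own, not the notebook's, and the `roads` distances are taken from the book's Romania map:

```python
import heapq

def uniform_cost(start, goal, graph):
    """Minimal uniform-cost search over a weighted adjacency dict
    {state: {neighbor: step_cost}}; returns (cost, path) or None."""
    frontier = [(0, start, [start])]  # heap of (path_cost, state, path)
    explored = set()
    while frontier:
        cost, state, path = heapq.heappop(frontier)
        if state == goal:
            return cost, path
        if state in explored:
            continue  # a cheaper path to this state was already expanded
        explored.add(state)
        for nbr, step in graph.get(state, {}).items():
            if nbr not in explored:
                heapq.heappush(frontier, (cost + step, nbr, path + [nbr]))
    return None

# One-way fragment of the Romania road distances from the book.
roads = {'A': {'S': 140, 'T': 118}, 'T': {'L': 111},
         'S': {'F': 99, 'R': 80}, 'F': {'B': 211},
         'R': {'P': 97}, 'P': {'B': 101}}
# A-S-R-P-B costs 140+80+97+101 = 418, cheaper than A-S-F-B at 450.
print(uniform_cost('A', 'B', roads))  # (418, ['A', 'S', 'R', 'P', 'B'])
```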
- ] - }, - { - "cell_type": "code", - "execution_count": 18, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [], - "source": [ - "def uniform_cost_search(problem, costfn=lambda node: node.path_cost):\n", - " frontier = FrontierPQ(Node(problem.initial), costfn)\n", - " explored = set()\n", - " while frontier:\n", - " node = frontier.pop()\n", - " if problem.is_goal(node.state):\n", - " return node\n", - " explored.add(node.state)\n", - " for action in problem.actions(node.state):\n", - " child = node.child(problem, action)\n", - " if child.state not in explored and child.state not in frontier:\n", - " frontier.add(child)\n", - " elif child.state in frontier and frontier.costfn(child) < frontier.costfn(frontier.states[child.state]):\n", - " frontier.replace(child)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Finally, `astar_search`, in which the cost includes an estimate of the distance to the goal as well as the distance travelled so far." - ] - }, - { - "cell_type": "code", - "execution_count": 19, - "metadata": { - "button": false, - "collapsed": true, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [], - "source": [ - "def astar_search(problem, heuristic):\n", - " costfn = lambda node: node.path_cost + heuristic(node.state)\n", - " return uniform_cost_search(problem, costfn)" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "button": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "source": [ - "# Search Tree Nodes\n", - "\n", - "The solution to a search problem is now a linked list of `Node`s, where each `Node`\n", - "includes a `state` and the `path_cost` of getting to the state. 
In addition, for every `Node` except for the first (root) `Node`, there is a previous `Node` (indicating the state that led to this `Node`) and an `action` (indicating the action taken to get here)." - ] - }, - { - "cell_type": "code", - "execution_count": 20, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [], - "source": [ - "class Node(object):\n", - " \"\"\"A node in a search tree. A search tree is a spanning tree over states.\n", - " A Node contains a state, the previous node in the tree, the action that\n", - " takes us from the previous state to this state, and the path cost to get to \n", - " this state. If a state is arrived at by two paths, then there are two nodes \n", - " with the same state.\"\"\"\n", - "\n", - " def __init__(self, state, previous=None, action=None, step_cost=1):\n", - " \"Create a search tree Node, derived from a previous Node by an action.\"\n", - " self.state = state\n", - " self.previous = previous\n", - " self.action = action\n", - " self.path_cost = 0 if previous is None else (previous.path_cost + step_cost)\n", - "\n", - " def __repr__(self): return \"<Node {}: {}>\".format(self.state, self.path_cost)\n", - " \n", - " def __lt__(self, other): return self.path_cost < other.path_cost\n", - " \n", - " def child(self, problem, action):\n", - " \"The Node you get by taking an action from this Node.\"\n", - " result = problem.result(self.state, action)\n", - " return Node(result, self, action, \n", - " problem.step_cost(self.state, action, result)) " - ] - }, - { - "cell_type": "markdown", - "metadata": { - "button": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "source": [ - "# Frontiers\n", - "\n", - "A frontier is a collection of Nodes that acts like both a Queue and a Set. 
A frontier, `f`, supports these operations:\n", - "\n", - "* `f.add(node)`: Add a node to the Frontier.\n", - "\n", - "* `f.pop()`: Remove and return the \"best\" node from the frontier.\n", - "\n", - "* `f.replace(node)`: Add this node and remove a previous node with the same state.\n", - "\n", - "* `state in f`: Test if some node in the frontier has arrived at state.\n", - "\n", - "* `f[state]`: Return the node corresponding to this state in the frontier.\n", - "\n", - "* `len(f)`: The number of Nodes in the frontier. When the frontier is empty, `f` is *false*.\n", - "\n", - "We provide two kinds of frontiers: One for \"regular\" queues, either first-in-first-out (for breadth-first search) or last-in-first-out (for depth-first search), and one for priority queues, where you can specify what cost function on nodes you are trying to minimize." - ] - }, - { - "cell_type": "code", - "execution_count": 21, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [], - "source": [ - "from collections import OrderedDict\n", - "import heapq\n", - "\n", - "class FrontierQ(OrderedDict):\n", - " \"A Frontier that supports FIFO or LIFO Queue ordering.\"\n", - " \n", - " def __init__(self, initial, LIFO=False):\n", - " \"\"\"Initialize Frontier with an initial Node.\n", - " If LIFO is True, pop from the end first; otherwise from front first.\"\"\"\n", - " self.LIFO = LIFO\n", - " self.add(initial)\n", - " \n", - " def add(self, node):\n", - " \"Add a node to the frontier.\"\n", - " self[node.state] = node\n", - " \n", - " def pop(self):\n", - " \"Remove and return the next Node in the frontier.\"\n", - " (state, node) = self.popitem(self.LIFO)\n", - " return node\n", - " \n", - " def replace(self, node):\n", - " \"Make this node replace the old node with the same state.\"\n", - " del self[node.state]\n", - " self.add(node)" - ] - }, - { - "cell_type": "code", - 
"execution_count": 22, - "metadata": { - "button": false, - "collapsed": true, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [], - "source": [ - "class FrontierPQ:\n", - " \"A Frontier ordered by a cost function; a Priority Queue.\"\n", - " \n", - " def __init__(self, initial, costfn=lambda node: node.path_cost):\n", - " \"Initialize Frontier with an initial Node, and specify a cost function.\"\n", - " self.heap = []\n", - " self.states = {}\n", - " self.costfn = costfn\n", - " self.add(initial)\n", - " \n", - " def add(self, node):\n", - " \"Add node to the frontier.\"\n", - " cost = self.costfn(node)\n", - " heapq.heappush(self.heap, (cost, node))\n", - " self.states[node.state] = node\n", - " \n", - " def pop(self):\n", - " \"Remove and return the Node with minimum cost.\"\n", - " (cost, node) = heapq.heappop(self.heap)\n", - " self.states.pop(node.state, None) # remove state\n", - " return node\n", - " \n", - " def replace(self, node):\n", - " \"Make this node replace a previous node with the same state.\"\n", - " if node.state not in self:\n", - " raise ValueError('{} not there to replace'.format(node.state))\n", - " for (i, (cost, old_node)) in enumerate(self.heap):\n", - " if old_node.state == node.state:\n", - " self.heap[i] = (self.costfn(node), node)\n", - " heapq._siftdown(self.heap, 0, i)\n", - " return\n", - "\n", - " def __contains__(self, state): return state in self.states\n", - " \n", - " def __len__(self): return len(self.heap)" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "button": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "source": [ - "# Search Problems\n", - "\n", - "`Problem` is the abstract class for all search problems. You can define your own class of problems as a subclass of `Problem`. You will need to override the `actions` and `result` method to describe how your problem works. 
You will also have to either override `is_goal` or pass a collection of goal states to the initialization method. If actions have different costs, you should override the `step_cost` method. " - ] - }, - { - "cell_type": "code", - "execution_count": 23, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [], - "source": [ - "class Problem(object):\n", - " \"\"\"The abstract class for a search problem.\"\"\"\n", - "\n", - " def __init__(self, initial=None, goals=(), **additional_keywords):\n", - " \"\"\"Provide an initial state and optional goal states.\n", - " A subclass can have additional keyword arguments.\"\"\"\n", - " self.initial = initial # The initial state of the problem.\n", - " self.goals = goals # A collection of possible goal states.\n", - " self.__dict__.update(**additional_keywords)\n", - "\n", - " def actions(self, state):\n", - " \"Return a list of actions executable in this state.\"\n", - " raise NotImplementedError # Override this!\n", - "\n", - " def result(self, state, action):\n", - " \"The state that results from executing this action in this state.\"\n", - " raise NotImplementedError # Override this!\n", - "\n", - " def is_goal(self, state):\n", - " \"True if the state is a goal.\" \n", - " return state in self.goals # Optionally override this!\n", - "\n", - " def step_cost(self, state, action, result=None):\n", - " \"The cost of taking this action from this state.\"\n", - " return 1 # Override this if actions have different costs " - ] - }, - { - "cell_type": "code", - "execution_count": 24, - "metadata": { - "collapsed": true - }, - "outputs": [], - "source": [ - "def action_sequence(node):\n", - " \"The sequence of actions to get to this node.\"\n", - " actions = []\n", - " while node.previous:\n", - " actions.append(node.action)\n", - " node = node.previous\n", - " return actions[::-1]\n", - "\n", - "def state_sequence(node):\n", 
- " \"The sequence of states to get to this node.\"\n", - " states = [node.state]\n", - " while node.previous:\n", - " node = node.previous\n", - " states.append(node.state)\n", - " return states[::-1]" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "button": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "source": [ - "# Two Location Vacuum World" - ] - }, - { - "cell_type": "code", - "execution_count": 25, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [], - "source": [ - "dirt = '*'\n", - "clean = ' '\n", - "\n", - "class TwoLocationVacuumProblem(Problem):\n", - " \"\"\"A Vacuum in a world with two locations, and dirt.\n", - " Each state is a tuple of (location, dirt_in_W, dirt_in_E).\"\"\"\n", - "\n", - " def actions(self, state): return ('W', 'E', 'Suck')\n", - " \n", - " def is_goal(self, state): return dirt not in state\n", - " \n", - " def result(self, state, action):\n", - " \"The state that results from executing this action in this state.\" \n", - " (loc, dirtW, dirtE) = state\n", - " if action == 'W': return ('W', dirtW, dirtE)\n", - " elif action == 'E': return ('E', dirtW, dirtE)\n", - " elif action == 'Suck' and loc == 'W': return (loc, clean, dirtE)\n", - " elif action == 'Suck' and loc == 'E': return (loc, dirtW, clean) \n", - " else: raise ValueError('unknown action: ' + action)" - ] - }, - { - "cell_type": "code", - "execution_count": 26, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [ - { - "data": { - "text/plain": [ - "" - ] - }, - "execution_count": 26, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "problem = TwoLocationVacuumProblem(initial=('W', dirt, dirt))\n", - "result = uniform_cost_search(problem)\n", - "result" - 
] - }, - { - "cell_type": "code", - "execution_count": 27, - "metadata": { - "collapsed": false - }, - "outputs": [ - { - "data": { - "text/plain": [ - "['Suck', 'E', 'Suck']" - ] - }, - "execution_count": 27, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "action_sequence(result)" - ] - }, - { - "cell_type": "code", - "execution_count": 28, - "metadata": { - "collapsed": false - }, - "outputs": [ - { - "data": { - "text/plain": [ - "[('W', '*', '*'), ('W', ' ', '*'), ('E', ' ', '*'), ('E', ' ', ' ')]" - ] - }, - "execution_count": 28, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "state_sequence(result)" - ] - }, - { - "cell_type": "code", - "execution_count": 29, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [ - { - "data": { - "text/plain": [ - "['Suck']" - ] - }, - "execution_count": 29, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "problem = TwoLocationVacuumProblem(initial=('E', clean, dirt))\n", - "result = uniform_cost_search(problem)\n", - "action_sequence(result)" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "button": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "source": [ - "# Water Pouring Problem\n", - "\n", - "Here is another problem domain, to show you how to define one. The idea is that we have a number of water jugs and a water tap, and the goal is to measure out a specific amount of water (in, say, ounces or liters). You can completely fill or empty a jug, but because the jugs don't have markings on them, you can't partially fill them with a specific amount. You can, however, pour one jug into another, stopping when the second is full or the first is empty." 
- ] - }, - { - "cell_type": "code", - "execution_count": 30, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [], - "source": [ - "class PourProblem(Problem):\n", - " \"\"\"Problem about pouring water between jugs to achieve some water level.\n", - " Each state is a tuple of levels. In the initialization, provide a tuple of \n", - " capacities, e.g. PourProblem(capacities=(8, 16, 32), initial=(2, 4, 3), goals={7}), \n", - " which means three jugs of capacity 8, 16, 32, currently filled with 2, 4, 3 units of \n", - " water, respectively, and the goal is to get a level of 7 in any one of the jugs.\"\"\"\n", - " \n", - " def actions(self, state):\n", - " \"\"\"The actions executable in this state.\"\"\"\n", - " jugs = range(len(state))\n", - " return ([('Fill', i) for i in jugs if state[i] != self.capacities[i]] +\n", - " [('Dump', i) for i in jugs if state[i] != 0] +\n", - " [('Pour', i, j) for i in jugs for j in jugs if i != j])\n", - "\n", - " def result(self, state, action):\n", - " \"\"\"The state that results from executing this action in this state.\"\"\"\n", - " result = list(state)\n", - " act, i, j = action[0], action[1], action[-1]\n", - " if act == 'Fill': # Fill i to capacity\n", - " result[i] = self.capacities[i]\n", - " elif act == 'Dump': # Empty i\n", - " result[i] = 0\n", - " elif act == 'Pour':\n", - " a, b = state[i], state[j]\n", - " result[i], result[j] = ((0, a + b) \n", - " if (a + b <= self.capacities[j]) else\n", - " (a + b - self.capacities[j], self.capacities[j]))\n", - " else:\n", - " raise ValueError('unknown action', action)\n", - " return tuple(result)\n", - "\n", - " def is_goal(self, state):\n", - " \"\"\"True if any of the jugs has a level equal to one of the goal levels.\"\"\"\n", - " return any(level in self.goals for level in state)" - ] - }, - { - "cell_type": "code", - "execution_count": 31, - "metadata": { - 
"button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [ - { - "data": { - "text/plain": [ - "(2, 13)" - ] - }, - "execution_count": 31, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "p7 = PourProblem(initial=(2, 0), capacities=(5, 13), goals={7})\n", - "p7.result((2, 0), ('Fill', 1))" - ] - }, - { - "cell_type": "code", - "execution_count": 32, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [ - { - "data": { - "text/plain": [ - "[('Pour', 0, 1), ('Fill', 0), ('Pour', 0, 1)]" - ] - }, - "execution_count": 32, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "result = uniform_cost_search(p7)\n", - "action_sequence(result)" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "button": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "source": [ - "# Visualization Output" - ] - }, - { - "cell_type": "code", - "execution_count": 33, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [], - "source": [ - "def showpath(searcher, problem):\n", - " \"Show what happens when searcher solves problem.\"\n", - " problem = Instrumented(problem)\n", - " print('\\n{}:'.format(searcher.__name__))\n", - " result = searcher(problem)\n", - " if result:\n", - " actions = action_sequence(result)\n", - " state = problem.initial\n", - " path_cost = 0\n", - " for steps, action in enumerate(actions, 1):\n", - " path_cost += problem.step_cost(state, action, 0)\n", - " result = problem.result(state, action)\n", - " print(' {} =={}==> {}; cost {} after {} steps{}'\n", - " .format(state, action, result, path_cost, steps,\n", - " '; GOAL!' 
if problem.is_goal(result) else ''))\n", - " state = result\n", - " msg = 'GOAL FOUND' if result else 'no solution'\n", - " print('{} after {} results and {} goal checks'\n", - " .format(msg, problem._counter['result'], problem._counter['is_goal']))\n", - " \n", - "from collections import Counter\n", - "\n", - "class Instrumented:\n", - " \"Instrument an object to count all the attribute accesses in _counter.\"\n", - " def __init__(self, obj):\n", - " self._object = obj\n", - " self._counter = Counter()\n", - " def __getattr__(self, attr):\n", - " self._counter[attr] += 1\n", - " return getattr(self._object, attr) " - ] - }, - { - "cell_type": "code", - "execution_count": 34, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "\n", - "uniform_cost_search:\n", - " (2, 0) ==('Pour', 0, 1)==> (0, 2); cost 1 after 1 steps\n", - " (0, 2) ==('Fill', 0)==> (5, 2); cost 2 after 2 steps\n", - " (5, 2) ==('Pour', 0, 1)==> (0, 7); cost 3 after 3 steps\n", - "GOAL FOUND after 83 results and 22 goal checks\n" - ] - } - ], - "source": [ - "showpath(uniform_cost_search, p7)" - ] - }, - { - "cell_type": "code", - "execution_count": 35, - "metadata": { - "collapsed": false - }, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "\n", - "uniform_cost_search:\n", - " (0, 0) ==('Fill', 0)==> (7, 0); cost 1 after 1 steps\n", - " (7, 0) ==('Pour', 0, 1)==> (0, 7); cost 2 after 2 steps\n", - " (0, 7) ==('Fill', 0)==> (7, 7); cost 3 after 3 steps\n", - " (7, 7) ==('Pour', 0, 1)==> (1, 13); cost 4 after 4 steps\n", - " (1, 13) ==('Dump', 1)==> (1, 0); cost 5 after 5 steps\n", - " (1, 0) ==('Pour', 0, 1)==> (0, 1); cost 6 after 6 steps\n", - " (0, 1) ==('Fill', 0)==> (7, 1); cost 7 after 7 steps\n", - " (7, 1) ==('Pour', 0, 1)==> (0, 8); cost 8 after 8 steps\n", - " (0, 8) ==('Fill', 
0)==> (7, 8); cost 9 after 9 steps\n", - " (7, 8) ==('Pour', 0, 1)==> (2, 13); cost 10 after 10 steps\n", - "GOAL FOUND after 110 results and 32 goal checks\n" - ] - } - ], - "source": [ - "p = PourProblem(initial=(0, 0), capacities=(7, 13), goals={2})\n", - "showpath(uniform_cost_search, p)" - ] - }, - { - "cell_type": "code", - "execution_count": 36, - "metadata": { - "collapsed": true - }, - "outputs": [], - "source": [ - "class GreenPourProblem(PourProblem): \n", - " def step_cost(self, state, action, result=None):\n", - " \"The cost is the amount of water used in a fill.\"\n", - " if action[0] == 'Fill':\n", - " i = action[1]\n", - " return self.capacities[i] - state[i]\n", - " return 0" - ] - }, - { - "cell_type": "code", - "execution_count": 37, - "metadata": { - "collapsed": false - }, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "\n", - "uniform_cost_search:\n", - " (0, 0) ==('Fill', 0)==> (7, 0); cost 7 after 1 steps\n", - " (7, 0) ==('Pour', 0, 1)==> (0, 7); cost 7 after 2 steps\n", - " (0, 7) ==('Fill', 0)==> (7, 7); cost 14 after 3 steps\n", - " (7, 7) ==('Pour', 0, 1)==> (1, 13); cost 14 after 4 steps\n", - " (1, 13) ==('Dump', 1)==> (1, 0); cost 14 after 5 steps\n", - " (1, 0) ==('Pour', 0, 1)==> (0, 1); cost 14 after 6 steps\n", - " (0, 1) ==('Fill', 0)==> (7, 1); cost 21 after 7 steps\n", - " (7, 1) ==('Pour', 0, 1)==> (0, 8); cost 21 after 8 steps\n", - " (0, 8) ==('Fill', 0)==> (7, 8); cost 28 after 9 steps\n", - " (7, 8) ==('Pour', 0, 1)==> (2, 13); cost 28 after 10 steps\n", - "GOAL FOUND after 184 results and 48 goal checks\n" - ] - } - ], - "source": [ - "p = GreenPourProblem(initial=(0, 0), capacities=(7, 13), goals={2})\n", - "showpath(uniform_cost_search, p)" - ] - }, - { - "cell_type": "code", - "execution_count": 38, - "metadata": { - "button": false, - "collapsed": true, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [], - "source": [ - "def 
compare_searchers(problem, searchers=None):\n", - " \"Apply each of the search algorithms to the problem, and show results\"\n", - " if searchers is None: \n", - " searchers = (breadth_first_search, uniform_cost_search)\n", - " for searcher in searchers:\n", - " showpath(searcher, problem)" - ] - }, - { - "cell_type": "code", - "execution_count": 39, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "\n", - "breadth_first_search:\n", - " (0, 0) ==('Fill', 0)==> (7, 0); cost 7 after 1 steps\n", - " (7, 0) ==('Pour', 0, 1)==> (0, 7); cost 7 after 2 steps\n", - " (0, 7) ==('Fill', 0)==> (7, 7); cost 14 after 3 steps\n", - " (7, 7) ==('Pour', 0, 1)==> (1, 13); cost 14 after 4 steps\n", - " (1, 13) ==('Dump', 1)==> (1, 0); cost 14 after 5 steps\n", - " (1, 0) ==('Pour', 0, 1)==> (0, 1); cost 14 after 6 steps\n", - " (0, 1) ==('Fill', 0)==> (7, 1); cost 21 after 7 steps\n", - " (7, 1) ==('Pour', 0, 1)==> (0, 8); cost 21 after 8 steps\n", - " (0, 8) ==('Fill', 0)==> (7, 8); cost 28 after 9 steps\n", - " (7, 8) ==('Pour', 0, 1)==> (2, 13); cost 28 after 10 steps\n", - "GOAL FOUND after 100 results and 31 goal checks\n", - "\n", - "uniform_cost_search:\n", - " (0, 0) ==('Fill', 0)==> (7, 0); cost 7 after 1 steps\n", - " (7, 0) ==('Pour', 0, 1)==> (0, 7); cost 7 after 2 steps\n", - " (0, 7) ==('Fill', 0)==> (7, 7); cost 14 after 3 steps\n", - " (7, 7) ==('Pour', 0, 1)==> (1, 13); cost 14 after 4 steps\n", - " (1, 13) ==('Dump', 1)==> (1, 0); cost 14 after 5 steps\n", - " (1, 0) ==('Pour', 0, 1)==> (0, 1); cost 14 after 6 steps\n", - " (0, 1) ==('Fill', 0)==> (7, 1); cost 21 after 7 steps\n", - " (7, 1) ==('Pour', 0, 1)==> (0, 8); cost 21 after 8 steps\n", - " (0, 8) ==('Fill', 0)==> (7, 8); cost 28 after 9 steps\n", - " (7, 8) ==('Pour', 0, 1)==> (2, 13); cost 28 after 10 steps\n", - "GOAL FOUND 
after 184 results and 48 goal checks\n" - ] - } - ], - "source": [ - "compare_searchers(p)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Random Grid\n", - "\n", - "An environment where you can move in any of 4 directions, unless there is an obstacle there.\n", - "\n", - "\n", - "\n" - ] - }, - { - "cell_type": "code", - "execution_count": 40, - "metadata": { - "collapsed": false - }, - "outputs": [ - { - "data": { - "text/plain": [ - "{(0, 0): [(0, 1), (1, 0)],\n", - " (0, 1): [(0, 2), (0, 0), (1, 1)],\n", - " (0, 2): [(0, 3), (0, 1), (1, 2)],\n", - " (0, 3): [(0, 4), (0, 2), (1, 3)],\n", - " (0, 4): [(0, 3), (1, 4)],\n", - " (1, 0): [(1, 1), (2, 0), (0, 0)],\n", - " (1, 1): [(1, 2), (1, 0), (2, 1), (0, 1)],\n", - " (1, 2): [(1, 3), (1, 1), (2, 2), (0, 2)],\n", - " (1, 3): [(1, 4), (1, 2), (2, 3), (0, 3)],\n", - " (1, 4): [(1, 3), (2, 4), (0, 4)],\n", - " (2, 0): [(2, 1), (3, 0), (1, 0)],\n", - " (2, 1): [(2, 2), (2, 0), (3, 1), (1, 1)],\n", - " (2, 2): [(2, 3), (2, 1), (3, 2), (1, 2)],\n", - " (2, 3): [(2, 4), (2, 2), (1, 3)],\n", - " (2, 4): [(2, 3), (1, 4)],\n", - " (3, 0): [(3, 1), (4, 0), (2, 0)],\n", - " (3, 1): [(3, 2), (3, 0), (4, 1), (2, 1)],\n", - " (3, 2): [(3, 1), (4, 2), (2, 2)],\n", - " (3, 3): [(3, 2), (4, 3), (2, 3)],\n", - " (3, 4): [(4, 4), (2, 4)],\n", - " (4, 0): [(4, 1), (3, 0)],\n", - " (4, 1): [(4, 2), (4, 0), (3, 1)],\n", - " (4, 2): [(4, 3), (4, 1), (3, 2)],\n", - " (4, 3): [(4, 4), (4, 2)],\n", - " (4, 4): [(4, 3)]}" - ] - }, - "execution_count": 40, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "import random\n", - "\n", - "N, S, E, W = DIRECTIONS = [(0, 1), (0, -1), (1, 0), (-1, 0)]\n", - "\n", - "def Grid(width, height, obstacles=0.1):\n", - " \"\"\"A 2-D grid, width x height, with obstacles that are either a collection of points,\n", - " or a fraction between 0 and 1 indicating the density of obstacles, chosen at random.\"\"\"\n", - " grid = {(x, y) for x in range(width) 
for y in range(height)}\n", - " if isinstance(obstacles, (float, int)):\n", - " obstacles = random.sample(grid, int(width * height * obstacles))\n", - " def neighbors(x, y):\n", - " for (dx, dy) in DIRECTIONS:\n", - " (nx, ny) = (x + dx, y + dy)\n", - " if (nx, ny) not in obstacles and 0 <= nx < width and 0 <= ny < height:\n", - " yield (nx, ny)\n", - " return {(x, y): list(neighbors(x, y))\n", - " for x in range(width) for y in range(height)}\n", - "\n", - "Grid(5, 5)" - ] - }, - { - "cell_type": "code", - "execution_count": 41, - "metadata": { - "collapsed": true - }, - "outputs": [], - "source": [ - "class GridProblem(Problem):\n", - " \"Create with a call like GridProblem(grid=Grid(10, 10), initial=(0, 0), goal=(9, 9))\"\n", - " def actions(self, state): return DIRECTIONS\n", - " def result(self, state, action):\n", - " #print('ask for result of', state, action)\n", - " (x, y) = state\n", - " (dx, dy) = action\n", - " r = (x + dx, y + dy)\n", - " return r if r in self.grid[state] else state" - ] - }, - { - "cell_type": "code", - "execution_count": 42, - "metadata": { - "collapsed": false - }, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "\n", - "uniform_cost_search:\n", - "no solution after 132 results and 33 goal checks\n" - ] - } - ], - "source": [ - "gp = GridProblem(grid=Grid(5, 5, 0.3), initial=(0, 0), goals={(4, 4)})\n", - "showpath(uniform_cost_search, gp)\n" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "button": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "source": [ - "# Finding a hard PourProblem\n", - "\n", - "What solvable two-jug PourProblem requires the most steps? We can define the hardness as the number of steps, and then iterate over all PourProblems with capacities up to size M, keeping the hardest one." 
- ] - }, - { - "cell_type": "code", - "execution_count": 43, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [], - "source": [ - "def hardness(problem):\n", - " L = breadth_first_search(problem)\n", - " #print('hardness', problem.initial, problem.capacities, problem.goals, L)\n", - " return len(action_sequence(L)) if (L is not None) else 0" - ] - }, - { - "cell_type": "code", - "execution_count": 44, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [ - { - "data": { - "text/plain": [ - "3" - ] - }, - "execution_count": 44, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "hardness(p7)" - ] - }, - { - "cell_type": "code", - "execution_count": 45, - "metadata": { - "collapsed": false - }, - "outputs": [ - { - "data": { - "text/plain": [ - "[('Pour', 0, 1), ('Fill', 0), ('Pour', 0, 1)]" - ] - }, - "execution_count": 45, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "action_sequence(breadth_first_search(p7))" - ] - }, - { - "cell_type": "code", - "execution_count": 46, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [ - { - "data": { - "text/plain": [ - "((0, 0), (7, 9), {8})" - ] - }, - "execution_count": 46, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "C = 9 # Maximum capacity to consider\n", - "\n", - "phard = max((PourProblem(initial=(a, b), capacities=(A, B), goals={goal})\n", - " for A in range(C+1) for B in range(C+1)\n", - " for a in range(A) for b in range(B)\n", - " for goal in range(max(A, B))),\n", - " key=hardness)\n", - "\n", - "phard.initial, phard.capacities, phard.goals" - ] - }, - { - "cell_type": "code", - "execution_count": 
47, - "metadata": { - "collapsed": false - }, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "\n", - "breadth_first_search:\n", - " (0, 0) ==('Fill', 1)==> (0, 9); cost 1 after 1 steps\n", - " (0, 9) ==('Pour', 1, 0)==> (7, 2); cost 2 after 2 steps\n", - " (7, 2) ==('Dump', 0)==> (0, 2); cost 3 after 3 steps\n", - " (0, 2) ==('Pour', 1, 0)==> (2, 0); cost 4 after 4 steps\n", - " (2, 0) ==('Fill', 1)==> (2, 9); cost 5 after 5 steps\n", - " (2, 9) ==('Pour', 1, 0)==> (7, 4); cost 6 after 6 steps\n", - " (7, 4) ==('Dump', 0)==> (0, 4); cost 7 after 7 steps\n", - " (0, 4) ==('Pour', 1, 0)==> (4, 0); cost 8 after 8 steps\n", - " (4, 0) ==('Fill', 1)==> (4, 9); cost 9 after 9 steps\n", - " (4, 9) ==('Pour', 1, 0)==> (7, 6); cost 10 after 10 steps\n", - " (7, 6) ==('Dump', 0)==> (0, 6); cost 11 after 11 steps\n", - " (0, 6) ==('Pour', 1, 0)==> (6, 0); cost 12 after 12 steps\n", - " (6, 0) ==('Fill', 1)==> (6, 9); cost 13 after 13 steps\n", - " (6, 9) ==('Pour', 1, 0)==> (7, 8); cost 14 after 14 steps\n", - "GOAL FOUND after 150 results and 44 goal checks\n" - ] - } - ], - "source": [ - "showpath(breadth_first_search, PourProblem(initial=(0, 0), capacities=(7, 9), goals={8}))" - ] - }, - { - "cell_type": "code", - "execution_count": 48, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "\n", - "uniform_cost_search:\n", - " (0, 0) ==('Fill', 1)==> (0, 9); cost 1 after 1 steps\n", - " (0, 9) ==('Pour', 1, 0)==> (7, 2); cost 2 after 2 steps\n", - " (7, 2) ==('Dump', 0)==> (0, 2); cost 3 after 3 steps\n", - " (0, 2) ==('Pour', 1, 0)==> (2, 0); cost 4 after 4 steps\n", - " (2, 0) ==('Fill', 1)==> (2, 9); cost 5 after 5 steps\n", - " (2, 9) ==('Pour', 1, 0)==> (7, 4); cost 6 after 6 steps\n", - " (7, 4) ==('Dump', 0)==> (0, 4); cost 7 after 7 steps\n", - " (0, 4) 
==('Pour', 1, 0)==> (4, 0); cost 8 after 8 steps\n", - " (4, 0) ==('Fill', 1)==> (4, 9); cost 9 after 9 steps\n", - " (4, 9) ==('Pour', 1, 0)==> (7, 6); cost 10 after 10 steps\n", - " (7, 6) ==('Dump', 0)==> (0, 6); cost 11 after 11 steps\n", - " (0, 6) ==('Pour', 1, 0)==> (6, 0); cost 12 after 12 steps\n", - " (6, 0) ==('Fill', 1)==> (6, 9); cost 13 after 13 steps\n", - " (6, 9) ==('Pour', 1, 0)==> (7, 8); cost 14 after 14 steps\n", - "GOAL FOUND after 159 results and 45 goal checks\n" - ] - } - ], - "source": [ - "showpath(uniform_cost_search, phard)" - ] - }, - { - "cell_type": "code", - "execution_count": 49, - "metadata": { - "button": false, - "collapsed": true, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [], - "source": [ - "class GridProblem(Problem):\n", - " \"\"\"A Grid.\"\"\"\n", - "\n", - " def actions(self, state): return ['N', 'S', 'E', 'W'] \n", - " \n", - " def result(self, state, action):\n", - " \"\"\"The state that results from executing this action in this state.\"\"\" \n", - " (W, H) = self.size\n", - " if action == 'N' and state >= W: return state - W\n", - " if action == 'S' and state + W < W * H: return state + W\n", - " if action == 'E' and (state + 1) % W != 0: return state + 1\n", - " if action == 'W' and state % W != 0: return state - 1\n", - " return state" - ] - }, - { - "cell_type": "code", - "execution_count": 50, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "\n", - "breadth_first_search:\n", - " 0 ==S==> 10; cost 1 after 1 steps\n", - " 10 ==S==> 20; cost 2 after 2 steps\n", - " 20 ==S==> 30; cost 3 after 3 steps\n", - " 30 ==S==> 40; cost 4 after 4 steps\n", - " 40 ==E==> 41; cost 5 after 5 steps\n", - " 41 ==E==> 42; cost 6 after 6 steps\n", - " 42 ==E==> 43; cost 7 after 7 steps\n", - " 43 
==E==> 44; cost 8 after 8 steps\n", - "GOAL FOUND after 135 results and 49 goal checks\n", - "\n", - "uniform_cost_search:\n", - " 0 ==S==> 10; cost 1 after 1 steps\n", - " 10 ==S==> 20; cost 2 after 2 steps\n", - " 20 ==E==> 21; cost 3 after 3 steps\n", - " 21 ==E==> 22; cost 4 after 4 steps\n", - " 22 ==E==> 23; cost 5 after 5 steps\n", - " 23 ==S==> 33; cost 6 after 6 steps\n", - " 33 ==S==> 43; cost 7 after 7 steps\n", - " 43 ==E==> 44; cost 8 after 8 steps\n", - "GOAL FOUND after 1036 results and 266 goal checks\n" - ] - } - ], - "source": [ - "compare_searchers(GridProblem(initial=0, goals={44}, size=(10, 10)))" - ] - }, - { - "cell_type": "code", - "execution_count": 51, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [ - { - "data": { - "text/plain": [ - "'test_frontier ok'" - ] - }, - "execution_count": 51, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "def test_frontier():\n", - " \n", - " #### Breadth-first search with FIFO Q\n", - " f = FrontierQ(Node(1), LIFO=False)\n", - " assert 1 in f and len(f) == 1\n", - " f.add(Node(2))\n", - " f.add(Node(3))\n", - " assert 1 in f and 2 in f and 3 in f and len(f) == 3\n", - " assert f.pop().state == 1\n", - " assert 1 not in f and 2 in f and 3 in f and len(f) == 2\n", - " assert f\n", - " assert f.pop().state == 2\n", - " assert f.pop().state == 3\n", - " assert not f\n", - " \n", - " #### Depth-first search with LIFO Q\n", - " f = FrontierQ(Node('a'), LIFO=True)\n", - " for s in 'bcdef': f.add(Node(s))\n", - " assert len(f) == 6 and 'a' in f and 'c' in f and 'f' in f\n", - " for s in 'fedcba': assert f.pop().state == s\n", - " assert not f\n", - "\n", - " #### Best-first search with Priority Q\n", - " f = FrontierPQ(Node(''), lambda node: len(node.state))\n", - " assert '' in f and len(f) == 1 and f\n", - " for s in ['book', 'boo', 'bookie', 'bookies', 'cook', 'look', 
'b']:\n", - " assert s not in f\n", - " f.add(Node(s))\n", - " assert s in f\n", - " assert f.pop().state == ''\n", - " assert f.pop().state == 'b'\n", - " assert f.pop().state == 'boo'\n", - " assert {f.pop().state for _ in '123'} == {'book', 'cook', 'look'}\n", - " assert f.pop().state == 'bookie'\n", - " \n", - " #### Romania: Two paths to Bucharest; cheapest one found first\n", - " S = Node('S')\n", - " SF = Node('F', S, 'S->F', 99)\n", - " SFB = Node('B', SF, 'F->B', 211)\n", - " SR = Node('R', S, 'S->R', 80)\n", - " SRP = Node('P', SR, 'R->P', 97)\n", - " SRPB = Node('B', SRP, 'P->B', 101)\n", - " f = FrontierPQ(S)\n", - " f.add(SF); f.add(SR); f.add(SRP); f.add(SRPB); f.add(SFB)\n", - " def cs(n): return (n.path_cost, n.state) # cs: cost and state\n", - " assert cs(f.pop()) == (0, 'S')\n", - " assert cs(f.pop()) == (80, 'R')\n", - " assert cs(f.pop()) == (99, 'F')\n", - " assert cs(f.pop()) == (177, 'P')\n", - " assert cs(f.pop()) == (278, 'B')\n", - " return 'test_frontier ok'\n", - "\n", - "test_frontier()" - ] - }, - { - "cell_type": "code", - "execution_count": 52, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [ - { - "data": { - "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAXEAAAEACAYAAABF+UbAAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAGf5JREFUeJzt3XuQVPWd9/H3h4vGy8JiVjAqIRFXJG4lEl0vQWMb77gB\nk31C5ImumsdNJRo1bio6ums5qYpVasol5GbiRhHjJYouQlx9QBZboiZeAG8RWSMrXhmzXFzRCqvw\n3T/OGRzHhjk93T2nT/fnVdU1p5tzur814odf/87voojAzMyKaVDeBZiZWf85xM3MCswhbmZWYA5x\nM7MCc4ibmRWYQ9zMrMAyhbik8yQ9lT7OTV8bIWmBpBWS5ksa3thSzcystz5DXNJ+wP8DDgT2B/5G\n0ligA1gYEeOARcBFjSzUzMw+KEtLfDzwcERsjIhNwGLgi8BkYFZ6zizgpMaUaGZmW5MlxJ8GDk+7\nT3YEJgGjgVER0QUQEauBkY0r08zMKhnS1wkR8aykK4B7gQ3AMmBTpVPrXJuZmfWhzxAHiIiZwEwA\nSZcBLwFdkkZFRJek3YDXK10ryeFuZtYPEaG+zsk6OmXX9OdHgS8ANwPzgNPTU04D5m6jkKZ6XHrp\npbnXUISamrUu1+Sa2qGurDK1xIE7JO0CvAOcFRH/nXax3Cbpq8AqYGrmTzUzs7rI2p3y2QqvrQWO\nrntFZmaWWVvO2CyVSnmX8AHNWBM0Z12uKRvXlF2z1pWFqul76dcHSNHozzAzazWSiHrd2DQzs+bk\nEDczKzCHuJlZgTnEzcwKzCFuZlZgDnEzswJziJuZFZhD3MyswBziZmYF5hA3Myswh7iZWYE5xM3M\nCswhbmZWYA5xM7MCy7o92/mSnpb0pKSbJG0naYSkBZJWSJovaXijizUzs/frM8Ql7Q6cA3w6Ij5J\nshvQNKADWBgR44BFwEWNLNTMrF1cfnn2c7N2pwwGdpI0BNgBeAWYAsxK/3wWcFL2jzUzs0pmzIAb\nbsh+fp8hHhGvAlcBL5KE9xsRsRAYFRFd6TmrgZH9KdjMzBJ33AHf/z7cc0/2a/rcKFnSn5O0uscA\nbwCzJX0F6L3n2lb3YOvs7NxyXCqVCr2fnZlZI/zoR2U6OsqccgrMnJn9uj732JT0f4DjIuLv0+en\nAocAnwNKEdElaTfgvogYX+F677FpZrYNK1bAEUck3SjHHpu8Vs89Nl8EDpH0IUkCjgKeAeYBp6fn\nnAbM7UftZmZtbfVqOOGE5GZmd4BXI9Nu95IuBU4G3gGWAWcCfwbcBowGVgFTI2J9hWvdEjczq2DD\nBiiVYMoUuOSS9/9Z1pZ4phCvhUPczOyD3n0XJk+GPfaAa64B9YrrenanmJlZHUXAN76RHP/0px8M\n8Gr0OTrFzMzq63vfg6VL4f77YejQ2t7LIW5mNoCuvz4ZQvjQQ7DzzrW/n/vEzcwGyPz5cNppSQt8\n3Lhtn5u1T9wtcTOzAbBsGZx6KsyZ03eAV8M3Ns3MGmzVKvj85+Hqq2HixPq+t0PczKyB1q1LJvNc\ncAH87d/W//3dJ25m1iB/+hMcdxwceCBcdVV113qyj5lZjjZvhmnTkuNbboFBVfZ7+MammVmOLrgA\nXnsNFiyoPsCr4RA3M6uzGTPg7rvhgQfgQx9q7Gc5xM3M6qh7Y4cHH4Rddmn85znEzczq5MEHkzVR\n5s+HMWMG5jM9xNDMrA6efTYZQnjjjTBhwsB9rkPczKxGq1fDpEn939ihFg5xM7MabNgAJ54Ip5+e\nPAZalj029wFuJdkIWcBewCXAL9PXxwAvkOzs80aF6z1O3MxaUvfGDrvvDv/yL7WtC95bQyb7SBoE\nvAwcDHwTWBMRV0q6EBgRER0VrnGIm1nLiYCvfQ1eeQXmzq19XfDeGrWzz9HA8xHxEjAFmJW+Pgs4\nqcr3MjMrrO6NHW67rf4BXo1qhxh+Gbg5PR4VEV0AEbFa0si6V
mZm1qTqvbFDLTKHuKShwGTgwvSl\n3n0kW+0z6ezs3HJcKpUolUqZCzQzaybz50NHR7Kxw2671e99y+Uy5XK56usy94lLmgycFRHHp8+X\nA6WI6JK0G3BfRIyvcJ37xM2sJSxblqxKOGdO/dcF760RfeLTgFt6PJ8HnJ4enwbMreK9zMwKpZEb\nO9QiU0tc0o7AKmCviHgzfW0X4DZgdPpnUyNifYVr3RI3s0JbuxYOOwy+/nU499yB+UyvJ25mVgd/\n+lMyC/Ov/7r6jR1q4RA3M6vR5s1w8snJJJ7+bOxQC28KYWZWo+98J1kXpdEbO9TCIW5mVsGMGXDP\nPQOzsUMtHOJmZr0M9MYOtXCIm5n1kMfGDrVo0l4eM7OBl9fGDrVwiJuZke/GDrVwiJtZ28t7Y4da\neJy4mbW1d95JNnbYY4/6b+xQi0atJ25m1jIikpuYUrImSrMEeDU8OsXM2lIEnHMOPP00LFyY78YO\ntXBL3MzaTneAP/ZYMpQw740dauEQN7O20jvAhw/Pu6LaOMTNrG20WoCDQ9zM2kQrBjg4xM2sDbRq\ngEPGEJc0XNJsScsl/V7SwZJGSFogaYWk+ZJa6NdiZq2ilQMcsrfEZwB3pxshfwp4FugAFkbEOGAR\ncFFjSjQz659WD3DIMGNT0jBgWUSM7fX6s8ARPXa7L0fEvhWu94xNMxtwRQ/wes7Y/DjwX5JmSloq\n6Zp04+RREdEFEBGrgZG1lWxmVh9FD/BqZJmxOQT4NHB2RDwmaTpJV0rv5vVWm9udnZ1bjkulEqVS\nqepCzcyyKGqAl8tlyuVy1ddl6U4ZBfw2IvZKnx9GEuJjgVKP7pT70j7z3te7O8XMBkRRA7ySunWn\npF0mL0naJ33pKOD3wDzg9PS104C5/SvVzKx2rRTg1ci0FK2kTwG/AIYCK4EzgMHAbcBoYBUwNSLW\nV7jWLXEza6hWDPCsLXGvJ25mhdaKAQ5eT9zM2kCrBng1HOJmVkgO8IRD3MwKxwH+Hoe4mRWKA/z9\nHOJmVhgO8A9yiJtZITjAK3OIm1nTc4BvnUPczJqaA3zbHOJm1rQc4H1ziJtZU3KAZ+MQN7Om4wDP\nziFuZk3FAV4dh7iZNQ0HePUc4mbWFBzg/eMQN7PcOcD7L8sem0h6AXgD2Ay8ExEHSRoB3AqMAV4g\n2RTijQbVaWYtygFem6wt8c0k+2lOiIiD0tc6gIURMQ5YBFzUiALNrHU5wGuXNcRV4dwpwKz0eBZw\nUr2KMrPW5wCvj6whHsC9kh6VdGb62qh0E2UiYjUwshEFmlnrcYDXT6Y+cWBiRLwmaVdggaQVJMHe\nkzfSNLM+OcDrK1OIR8Rr6c8/SroTOAjokjQqIrok7Qa8vrXrOzs7txyXSiVKpVItNZtZQTnAt65c\nLlMul6u+rs/d7iXtCAyKiA2SdgIWAN8FjgLWRsQVki4ERkRER4Xrvdu9mTnAq5R1t/ssIf5xYA5J\nd8kQ4KaIuFzSLsBtwGhgFckQw/UVrneIm7W5jRvhq1+FF16Au+92gGdRtxCvQyEOcbM2tmYNfOEL\nMGoU3HAD7LBD3hUVQ9YQ94xNM2uY55+Hz3wGDjkEbr3VAd4IDnEza4jf/Q4OOwy+9S248koY5LRp\niKxDDM3MMrvjDvj612HWLJg0Ke9qWptD3MzqJgL++Z9h+nRYsAAmTMi7otbnEDezunj3XTjvPPjN\nb+C3v4XRo/OuqD04xM2sZhs2wMknw//8DzzwAAwblndF7cO3GsysJq++Cp/9LHzkI/Bv/+YAH2gO\ncTPrt6eegkMPhS99Ca65BoYOzbui9uPuFDPrl3vvha98BWbMgGnT8q6mfbklbmZVu+46OPXUZCih\nAzxfbombWWYRcMkl8Ktfwf33w7hxeVdkDnEzy6R7EauVK5MhhLvumndFBu5OMbMM1q6FY45JhhAu\nWuQAbyYOcTPbJi9i1dwc4
ma2Vd2LWJ13nhexalbuEzeziu64A77xDbj+ei9i1cwy/7sqaZCkpZLm\npc9HSFogaYWk+ZK8V4dZC4iAq65KlpCdP98B3uyq+XJ0HvBMj+cdwMKIGAcsAi6qZ2FmNvDefRe+\n+c1kCdmHHvIqhEWQKcQl7QlMAn7R4+UpwKz0eBZwUn1LM7OBtGEDnHQSPPdcsoiVVyEshqwt8enA\nd0g2S+42KiK6ACJiNTCyzrWZ2QDxIlbF1WeISzoR6IqIx4Ftbdrp3ZDNCsiLWBVbltEpE4HJkiYB\nOwB/JumXwGpJoyKiS9JuwOtbe4POzs4tx6VSiVKpVFPRZlYfXsSqeZTLZcrlctXXKSJ7A1rSEcC3\nI2KypCuBNRFxhaQLgRER0VHhmqjmM8xsYFx3HVx8McyeDYcfnnc11pskImJbvR9AbePELwduk/RV\nYBUwtYb3MrMB4kWsWktVLfF+fYBb4mZNo+ciVvPmeQ2UZpa1Je5JtGZtwotYtSaHuFkbWLnSi1i1\nKoe4WYvzIlatzQtgmbUwL2LV+hziZi0oAqZPTx7z53sNlFbmEDdrMRs2JItYLV2aLGLlNVBam3vH\nzFrIsmVwwAEweHCyD6YDvPU5xM1aQAT86Edw3HHQ2QnXXgs77ZR3VTYQ3J1iVnBr1iQTeF59NWl9\njx2bd0U2kNwSNyuwxYuTm5Z/+Zfw4IMO8HbklrhZAW3aBJddBldfnSxkdcIJeVdkeXGImxXMK68k\ny8cOHgxLlsDuu+ddkeXJ3SlmBXLXXcnok2OOgQULHODmlrhZIWzcCBdeCHPmJLMwJ07MuyJrFg5x\nsyb33HPw5S/Dxz6WjAPfZZe8K7Jm4u4UsyZ2443J6oNnnpm0wB3g1lufLXFJ2wOLge3S82+PiO9K\nGgHcCowBXgCmRsQbDazVrG1s2ABnnw2PPAL//u/wyU/mXZE1qz5b4hGxETgyIiYA+wMnSDoI6AAW\nRsQ4YBFwUUMrNWsT3VPnhwyBxx5zgNu2ZepOiYi308PtSVrjAUwBZqWvzwJOqnt1Zm3EU+etPzLd\n2JQ0CFgCjAV+EhGPShoVEV0AEbFa0sgG1mnW0jx13vorU4hHxGZggqRhwBxJ+5G0xt932tau7+zs\n3HJcKpUolUpVF2rWqhYvhlNOgalTYfZs2G67vCuyPJTLZcrlctXXVb3bvaRLgLeBM4FSRHRJ2g24\nLyLGVzjfu92bVbBpE3zve/Czn3nqvH1Q3Xa7l/QXkoanxzsAxwDLgXnA6elppwFz+12tWZt5+WU4\n6qikFb5kiQPc+i/Ljc2PAPdJehx4GJgfEXcDVwDHSFoBHAVc3rgyzVrHXXfBgQd66rzVR9XdKVV/\ngLtTzID3T52/+WZPnbdty9qd4mn3ZgPAU+etUTzt3qzBPHXeGsktcbMG6Z46//DDsHAhfOpTeVdk\nrcgtcbMG6Dl1fskSB7g1jkPcrI4i4Ic/hGOPhUsv9dR5azx3p5jVyZo1cMYZ702d33vvvCuyduCW\nuFkddO86v88+8NBDDnAbOG6Jm9XgrbeSXednzvTUecuHW+Jm/RCRTNr5xCfgP/8Tli51gFs+3BI3\nq9Jzz8E558CLL8L118ORR+ZdkbUzt8TNMnrrLfjHf4RDD4Wjj4YnnnCAW/7cEjfrQwTceSd861vJ\nzMsnnoA99si7KrOEQ9xsG9x1Ys3O3SlmFbz9NvzTPyVdJ8cc464Ta15uiZv10N11cv75SYC768Sa\nnUPcLNWz62TmTLe8rRiybM+2p6RFkn4v6SlJ56avj5C0QNIKSfO7t3AzKxp3nViRZekTfxf4h4jY\nDzgUOFvSvkAHsDAixgGLgIsaV6ZZ/fWcsPP880l4f/vbMHRo3pWZZdef3e7vBH6cPo7osdt9OSL2\nrXC+t2ezpvPcc3DuubBqFfzkJ255W/Op2273vd70Y8D+wO+AURHRBRARq4GR1ZdpNrB6dp1
4wo61\ngsw3NiXtDNwOnBcRGyT1bl5vtbnd2dm55bhUKlEqlaqr0qxGPUedeMKONaNyuUy5XK76ukzdKZKG\nAHcB90TEjPS15UCpR3fKfRExvsK17k6xXLnrxIqo3t0p1wHPdAd4ah5wenp8GjC3qgrNGsxdJ9YO\n+myJS5oILAaeIukyCeBi4BHgNmA0sAqYGhHrK1zvlrgNqN5dJ9//vrtOrHiytsSrHp3Sj0Ic4jZg\nurtOXnwRfvxjt7ytuBoyOsWsWfXuOnn8cQe4tQeHuBVazwk7K1d6wo61H6+dYoXVs+vEa51Yu3JL\n3ArHXSdm73GIW2Fs3gyzZ7vrxKwnd6dY09u4EW66Ca68EoYNc9eJWU8OcWtab74J11wD06fDX/0V\nXH01lEqgPgddmbUPh7g1nddfhx/+EH72s2R971//GiZMyLsqs+bkPnFrGitXwllnwb77wtq18PDD\ncMstDnCzbXGIW+4efxymTYODDoIRI2D5cvjpT2Hs2LwrM2t+DnHLRQSUy3D88XDiiXDAAUlL/LLL\nYNSovKszKw73iduA2rwZ5s6Fyy+H9evhgguS59tvn3dlZsXkELcB0XuYYEcHTJkCgwfnXZlZsTnE\nraHefBN+/nP4wQ88TNCsERzi1hBdXckwwZ//3MMEzRrJNzatrrqHCY4fD+vWeZigWaP1GeKSrpXU\nJenJHq+NkLRA0gpJ8yUNb2yZ1uw8TNAsH1la4jOB43q91gEsjIhxwCLgonoXZs3PwwTN8pd1t/sx\nwK8j4pPp82eBI3rsdF+OiH23cq23Z2sxlYYJnnKKhwma1VPW7dn6e2NzZER0AUTEakkj+/k+ViAb\nN8KNNyYbD3uYoFlzqNfolG02tTs7O7ccl0olSqVSnT7WBoKHCZo1XrlcplwuV31df7tTlgOlHt0p\n90XE+K1c6+6Uguo9TPCCCzzKxGyg1Hu3e6WPbvOA09Pj04C5VVVnTWv9erjhBvj852HcOK8maNbs\n+myJS7oZKAEfBrqAS4E7gdnAaGAVMDUi1m/lerfEm9z69TBvXrL12f33w+c+B1/6UhLkw4blXZ1Z\ne8raEs/UnVJjIQ7xJuTgNmtuDnH7AAe3WXE4xA1wcJsVlUO8jTm4zYrPId5mHNxmrcUh3gYc3Gat\nyyHeonoH95FHwtSpDm6zVuMQbyEObrP24xAvOAe3WXtziBeQg9vMujnEC2DdOliyJHn85jeweLGD\n28wSDvEm0zOwux+vvw777w8HHggHHwyTJjm4zSzhEM9RX4F9wAHJY599vKGCmVXmEB8g69bB0qXw\n2GPvD+wJE94Lawe2mVXLId4ADmwzGygO8Ro5sM0sTwMS4pKOB35AskPQtRFxRYVzmj7EuwN7yZL3\nQvuPf0z6sB3YZpaHem/PVukDBgE/Bo4D9gOmSdq3v+/XaJs2wZo18Ic/wFVXlbnyymQo39ixMGYM\nfPe78NprMHky3HVXEuyLF8P06XDKKTB+fGMDvD8bpA6EZqzLNWXjmrJr1rqyqGW3+4OA5yJiFYCk\nXwFTgGfrUVglmzYlE2LWrev7sXbt+59v2JAM3xsxAjZtKvPFL5aYPDkJ72ZoYZfLZUqlUr5FVNCM\ndbmmbFxTds1aVxa1hPgewEs9nr9MEuzbVGsQDx+eBHGlx4c/DHvvXfnPhg+HQen3js7O5GFmVnS1\nhHhmEya8F8RvvfVei7iWIDYzsxpubEo6BOiMiOPT5x1A9L65Kam572qamTWpho5OkTQYWAEcBbwG\nPAJMi4jl/XpDMzOrWr+7UyJik6RvAgt4b4ihA9zMbAA1fLKPmZk1TsNuE0o6XtKzkv5D0oWN+pxq\nSLpWUpekJ/OupZukPSUtkvR7SU9JOrcJatpe0sOSlqU1XZp3Td0kDZK0VNK8vGvpJukFSU+kv69H\n8q4HQNJwSbMlLU//bh2ccz37pL+fpenPN5rk7/r5kp6
W9KSkmyRt1wQ1nZf+f5ctDyKi7g+Sfxz+\nAIwBhgKPA/s24rOqrOswYH/gybxr6VHTbsD+6fHOJPcZmuF3tWP6czDwO+CgvGtK6zkfuBGYl3ct\nPWpaCYzIu45eNV0PnJEeDwGG5V1Tj9oGAa8Co3OuY/f0v9126fNbgb/Luab9gCeB7dP/9xYAe23r\nmka1xLdMBIqId4DuiUC5iogHgHV519FTRKyOiMfT4w3AcpIx+LmKiLfTw+1JQiD3fjdJewKTgF/k\nXUsvooHfaqslaRhweETMBIiIdyPiv3Muq6ejgecj4qU+z2y8wcBOkoYAO5L845Kn8cDDEbExIjYB\ni4EvbuuCRv3FqzQRKPdganaSPkbyTeHhfCvZ0m2xDFgN3BsRj+ZdEzAd+A5N8A9KLwHcK+lRSX+f\ndzHAx4H/kjQz7b64RtIOeRfVw5eBW/IuIiJeBa4CXgReAdZHxMJ8q+Jp4HBJIyTtSNJoGb2tC5qm\n9dDuJO0M3A6cl7bIcxURmyNiArAncLCkT+RZj6QTga70W4vSR7OYGBGfJvkf7mxJh+VczxDg08BP\n0rreBjryLSkhaSgwGZjdBLX8OUkPwRiSrpWdJf3fPGuKiGeBK4B7gbuBZcCmbV3TqBB/Bfhoj+d7\npq9ZBelXuduBX0bE3Lzr6Sn9Gn4fcHzOpUwEJktaSdKKO1LSDTnXBEBEvJb+/CMwhwzLTzTYy8BL\nEfFY+vx2klBvBicAS9LfVd6OBlZGxNq06+Jfgc/kXBMRMTMiDoyIErAe+I9tnd+oEH8U2FvSmPRu\n78lAs4wmaLZWHMB1wDMRMSPvQgAk/YWk4enxDsAxNHBhsywi4uKI+GhE7EXy92lRRPxdnjUBSNox\n/RaFpJ2AY0m+EucmIrqAlyTtk750FPBMjiX1NI0m6EpJvQgcIulDkkTye8p9roukXdOfHwW+ANy8\nrfMbsnZKNOlEIEk3AyXgw5JeBC7tvvmTY00Tga8AT6V90AFcHBH/P8eyPgLMSpcbHgTcGhF351hP\nMxsFzEmXlxgC3BQRC3KuCeBc4Ka0+2IlcEbO9ZD28R4NfC3vWgAi4hFJt5N0WbyT/rwm36oAuEPS\nLiQ1ndXXTWlP9jEzKzDf2DQzKzCHuJlZgTnEzcwKzCFuZlZgDnEzswJziJuZFZhD3MyswBziZmYF\n9r8varwUoYrZVQAAAABJRU5ErkJggg==\n", - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], - "source": [ - "%matplotlib inline\n", - "import matplotlib.pyplot as plt\n", - "\n", - "p = plt.plot([i**2 for i in range(10)])\n", - "plt.savefig('destination_path.eps', format='eps', dpi=1200)" - ] - }, - { - "cell_type": "code", - "execution_count": 53, - "metadata": { - "button": false, - "collapsed": false, - "deletable": true, - "new_sheet": false, - "run_control": { - "read_only": false - } - }, - "outputs": [ - { - "data": { - "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAe8AAAHaCAYAAAApPsHTAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzt209MVPe///HXmT+JDhg7OjAEBIGIGbBYDBLSUha6ICwq\n1IgpzdXvzVfytRujDUlj2t5v6aQ3JN2QUN3YtIsmpbXFaSSmCZpYFrYbF99qid4QSUACDWMwRhmm\niQNnfoveO8nU2u/8gGH4HJ6P3ZlzTny//HzOvGYQrWQyKQAAYA5XrgcAAAD/fyhvAAAMQ3kDAGAY\nyhsAAMNQ3gAAGMaT6wEy9eGHH85alhXM9RzZkkwmbcuyHPthysn53G63vbS05MhskrPXTiKf6Zz8\n/Hk8nuj7779f9Kfn1nqY5bIsKxiNRnM9RtYEg0EX+cwUDAZdPT09uR4ja8LhsGPXTnL23pQ2Rj6n\nPn/hcPi5X1gd+WkFAAAno7wBADAM5Q0AgGEobwAADEN5AwBgGMobAADDUN4AABiG8gYAwDCUNwAA\nhqG8AQAwDOUNAIBhKG8AAAxDeQMAYBjKGwAAw1DeAAAYhvIGAMAwlDcAAIahvAEAMAzlDQCAYShv\nAAAMQ3kDAGAYyhsAAMN4cj3AWpuamtJPP/2kZDKp6upq7du3L+38o0ePNDIyorm5OTU2Nuqll16S\nJMViMV2/fl2//fabLMtSdXW19u7dm4sIf4l8ZucbHh7W22+/Ldu21dXVpbNnz6adHxsb09///nf9\n61//Um9vr7q7uyVJ09PT+tvf/qZoNCqXy6V//OMfOn36dC4iPJfT1458Zucz7dnbUOWdTCZ148YN\ntbW1yefzKRKJqLy8XH6/P3XNpk2b1NzcrImJibR7XS6XmpqaFAgElEgkNDg4qNLS0rR7c418Zuez\nbVunTp3S9evXVVxcrIaGBrW3tysUCqWu2b59u86dO6fLly+n3evxeNTX16e6ujrFYjHV19erpaUl\n7d5ccvrakc/sfCY+exvqx+bRaFRbt27Vli1b5Ha7tWvXLk1OTqZds3nzZhUUFMiyrLTXfT6fAoGA\nJMnr9crv92thYWGtRs8I+czOd/PmTVVVVWnnzp3yer3q7OzU0NBQ2jWBQED19fXyeNI/dxcVFamu\nrk6SlJ+fr+rqas3MzKzZ7P+O09eOfGbnM/HZ21DlvbCwoPz8/NRxfn7+sjbRkydP9PDhQwWDwdUc\nb8XIl5n1mm9mZkalpaWp4x07dizrTWByclK3bt1SY2Pjao63Ik5fO/JlZr3mM/HZ21DlvRoSiYSu\nXbumpqYmeb3eXI+z6shntlgspo6ODvX396e92TqB09eOfGZb62dvQ5V3Xl6eYrFY6jgWiykvLy/j\n+23b1tWrV7V7925VVFRkY8QVId9fW+/5SkpKNDU1lTqenp5WSUlJxvcvLi6qo6NDx48fV3t7ezZG\nXDanrx35/tp6z2fis7ehyruwsFCPHz/W/Py8lpaWND4+rvLy8uden0wm045HRkbk9/vX5W9KSuT7\nI9PyNTQ0aHx8XPfv39fTp0918eJFtbW1Pff6P+Y7ceKEampqdObMmWyP+v/N6WtHvnSm5TPx2dtQ\nv23ucrnU3NysK1euSJJCoZD8fr/u3Lkjy7JUU1OjeDyuS5cuKZFIyLIsjY6OqrOzU3Nzc7p37562\nbdumwcFBSVJjY6PKyspyGSkN+czO53a7df78ebW0tKT+u0p1dbUuXLggy7J08uRJRaNR7d+/X/Pz\n83K5XOrv79fdu3d1+/ZtDQwMqLa2Vvv27ZNlWert7VVra2uuY0ly/tqRz+x8Jj571h8/QaxX4XA4\nGY1Gcz1G1gSDQZHPTMFgUD09PbkeI2vC4bBj105y9t6UNkY+pz5/4XBYPT091p+d21A/NgcAwAko\nbwAADEN5AwBgGMobAADDUN4AABiG8gYAwDCUNwAAhqG8AQAwD
OUNAIBhKG8AAAxDeQMAYBjKGwAA\nw1DeAAAYhvIGAMAwlDcAAIahvAEAMAzlDQCAYShvAAAMQ3kDAGAYyhsAAMNQ3gAAGIbyBgDAMJQ3\nAACGsZLJZK5nyMh///d/Ly0tLTn2w4bL5ZJt27keI2ucnM/J2STJ4/FocXEx12NkjdPXL5lMyrKs\nXI+RNW63W0tLS7keIys8Ho/9/vvvu//03FoPs1xLS0uunp6eXI+RNeFwWEeOHMn1GFkTiUQcm8/J\n2aTf8/HsmSsSiSgajeZ6jKwJBoOO3Z/hcPi5X1gd+00WAACnorwBADAM5Q0AgGEobwAADEN5AwBg\nGMobAADDUN4AABiG8gYAwDCUNwAAhqG8AQAwDOUNAIBhKG8AAAxDeQMAYBjKGwAAw1DeAAAYhvIG\nAMAwlDcAAIahvAEAMAzlDQCAYShvAAAMQ3kDAGAYyhsAAMNsuPIeHh5WKBTS7t279fHHHz9zfmxs\nTK+88oo2bdqkvr6+1OvT09M6ePCg9uzZo9raWn3yySdrOXbGfvzxRx06dEivvfaaPv/882fOT0xM\n6NixY6qvr9cXX3yRen12dlZdXV16/fXXdfjwYQ0MDKzl2Bkjn7n5ePbMXTtJmpqa0tdff62vvvpK\nP//88zPnHz16pO+++06ffvqpbt++nXo9FotpaGhIFy9e1DfffKNffvllLcfOmGn707Mmf8o6Ydu2\nTp06pevXr6u4uFgNDQ1qb29XKBRKXbN9+3adO3dOly9fTrvX4/Gor69PdXV1isViqq+vV0tLS9q9\nuWbbtnp7e/XZZ5+poKBAb775pg4cOKDKysrUNS+88ILeffdd/fDDD2n3ejwevfPOOwqFQorH43rj\njTf08ssvp92ba+QzNx/PnrlrJ0nJZFI3btxQW1ubfD6fIpGIysvL5ff7U9ds2rRJzc3NmpiYSLvX\n5XKpqalJgUBAiURCg4ODKi0tTbs310zcnxvqm/fNmzdVVVWlnTt3yuv1qrOzU0NDQ2nXBAIB1dfX\ny+NJ/1xTVFSkuro6SVJ+fr6qq6s1MzOzZrNnYnR0VGVlZSouLpbX61Vra6tGRkbSrvH7/dqzZ88z\n+QKBQGqz+Xw+VVRU6MGDB2s2eybIZ24+nj1z106SotGotm7dqi1btsjtdmvXrl2anJxMu2bz5s0q\nKCiQZVlpr/t8PgUCAUmS1+uV3+/XwsLCWo2eERP354Yq75mZGZWWlqaOd+zYsay/5MnJSd26dUuN\njY2rOd6KPXjwQEVFRanjYDC4rDeBmZkZjY2Nae/evas53oqRLzPrMR/PXmbW49pJ0sLCgvLz81PH\n+fn5yyrgJ0+e6OHDhwoGg6s53oqZuD83VHmvhlgspo6ODvX396dtZqeIx+Pq7u7W2bNn5fP5cj3O\nqiOfuXj2zJZIJHTt2jU1NTXJ6/XmepxVt9b7c0OVd0lJiaamplLH09PTKikpyfj+xcVFdXR06Pjx\n42pvb8/GiCtSWFio2dnZ1HE0GlVhYWHG9y8uLqq7u1uHDh3SwYMHszHiipDvr63nfDx7f209r50k\n5eXlKRaLpY5jsZjy8vIyvt+2bV29elW7d+9WRUVFNkZcERP354Yq74aGBo2Pj+v+/ft6+vSpLl68\nqLa2tuden0wm045PnDihmpoanTlzJtujLsuLL76oqakp/frrr0okEhoeHtaBAwcyvv+DDz5QZWWl\njh07lsUpl498f2095+PZ+2vree2k3z+cPH78WPPz81paWtL4+LjKy8ufe/0f129kZER+v3/d/XPA\n/zFxf26o3zZ3u906f/68WlpaZNu2urq6VF1drQsXLsiyLJ08eVLRaFT79+/X/Py8XC6X+vv7dffu\nXd2+fVsDAwOqra3Vvn37ZFmWent71dramutYKW63W++9957eeust2batw4cPq7KyUt9++60sy9LR\no0c1Nzenzs5OxeNxWZalL
7/8UkNDQxobG9P333+vqqoqHT16VJZl6fTp03r11VdzHSuFfObm49kz\nd+2k339jvLm5WVeuXJEkhUIh+f1+3blzR5ZlqaamRvF4XJcuXVIikZBlWRodHVVnZ6fm5uZ07949\nbdu2TYODg5KkxsZGlZWV5TJSGhP3p/XHTxDrVTgcTvb09OR6jKwJh8M6cuRIrsfImkgk4th8Ts4m\n/Z6PZ89ckUhE0Wg012NkTTAYdOz+DIfD6unpsf7s3Ib6sTkAAE5AeQMAYBjKGwAAw1DeAAAYhvIG\nAMAwlDcAAIahvAEAMAzlDQCAYShvAAAMQ3kDAGAYyhsAAMNQ3gAAGIbyBgDAMJQ3AACGobwBADAM\n5Q0AgGEobwAADEN5AwBgGMobAADDUN4AABiG8gYAwDCUNwAAhqG8AQAwjJVMJnM9Q0Y++uijJdu2\nHfthw+PxaHFxMddjZI2T8zk5m0Q+05HPXB6Px37//ffdf3purYdZLtu2XUeOHMn1GFkTiUTU09OT\n6zGyJhwOOzafk7NJ5DMd+cwVDoef+4XVsd9kAQBwKsobAADDUN4AABiG8gYAwDCUNwAAhqG8AQAw\nDOUNAIBhKG8AAAxDeQMAYBjKGwAAw1DeAAAYhvIGAMAwlDcAAIahvAEAMAzlDQCAYShvAAAMQ3kD\nAGAYyhsAAMNQ3gAAGIbyBgDAMJQ3AACGobwBADDMhivvH3/8UYcOHdJrr72mzz///JnzExMTOnbs\nmOrr6/XFF1+kXp+dnVVXV5def/11HT58WAMDA2s5dsaGh4cVCoW0e/duffzxx8+cHxsb0yuvvKJN\nmzapr68v9fr09LQOHjyoPXv2qLa2Vp988slajp0x8pmbz8nZJPKRb23zedbkT1knbNtWb2+vPvvs\nMxUUFOjNN9/UgQMHVFlZmbrmhRde0Lvvvqsffvgh7V6Px6N33nlHoVBI8Xhcb7zxhl5++eW0e3PN\ntm2dOnVK169fV3FxsRoaGtTe3q5QKJS6Zvv27Tp37pwuX76cdq/H41FfX5/q6uoUi8VUX1+vlpaW\ntHtzjXzm5nNyNol8EvnWOt+G+uY9OjqqsrIyFRcXy+v1qrW1VSMjI2nX+P1+7dmzRx5P+ueaQCCQ\nWgyfz6eKigo9ePBgzWbPxM2bN1VVVaWdO3fK6/Wqs7NTQ0NDadcEAgHV19c/k6+oqEh1dXWSpPz8\nfFVXV2tmZmbNZs8E+czN5+RsEvkk8klrm29DlfeDBw9UVFSUOg4Gg8sq4JmZGY2NjWnv3r2rOd6K\nzczMqLS0NHW8Y8eOZW2iyclJ3bp1S42Njas53oqRLzPrMZ+Ts0nkyxT5Vs+GKu/VEI/H1d3drbNn\nz8rn8+V6nFUXi8XU0dGh/v5+5efn53qcVUc+czk5m0Q+0611vg1V3oWFhZqdnU0dR6NRFRYWZnz/\n4uKiuru7dejQIR08eDAbI65ISUmJpqamUsfT09MqKSnJ+P7FxUV1dHTo+PHjam9vz8aIK0K+v7ae\n8zk5m0S+f4d8q29DlfeLL76oqakp/frrr0okEhoeHtaBAwcyvv+DDz5QZWWljh07lsUpl6+hoUHj\n4+O6f/++nj59qosXL6qtre251yeTybTjEydOqKamRmfOnMn2qMtCvnQm5XNyNol8f0S+7NtQv23u\ndrv13nvv6a233pJt2zp8+LAqKyv17bffyrIsHT16VHNzc+rs7FQ8HpdlWfryyy81NDSksbExff/9\n96qqqtLRo0dlWZZOnz6tV199NdexUtxut86fP6+WlhbZtq2uri5VV1frwoULsixLJ0+eVDQa1f79\n+zU/Py+Xy6X+/n7dvXtXt2/f1sDAgGpra7Vv3z5ZlqXe3l61trbmOlYK+czN5+RsEvnIt/b5rD9+\nglivwuFw8siRI7keI2sikYh6enpyPUbWhMNhx+ZzcjaJfKYjn7n+N5v1Z+c21I/NAQBwAso
bAADD\nUN4AABiG8gYAwDCUNwAAhqG8AQAwDOUNAIBhKG8AAAxDeQMAYBjKGwAAw1DeAAAYhvIGAMAwlDcA\nAIahvAEAMAzlDQCAYShvAAAMQ3kDAGAYyhsAAMNQ3gAAGIbyBgDAMJQ3AACGobwBADAM5Q0AgGGs\nZDKZ6xky8uGHHy5ZluXYDxtut1tLS0u5HiNrPB6PFhcXcz1GViSTSVmWlesxssbp+Zz+7Dl9/Zyc\nL5lM2h9++KH7z8551nqY5bIsyxWNRnM9RtYEg0H19PTkeoysCYfDjs0XDofl9L3p9HxO3ZsS+9Nk\nwWDwuV9YHftNFgAAp6K8AQAwDOUNAIBhKG8AAAxDeQMAYBjKGwAAw1DeAAAYhvIGAMAwlDcAAIah\nvAEAMAzlDQCAYShvAAAMQ3kDAGAYyhsAAMNQ3gAAGIbyBgDAMJQ3AACGobwBADAM5Q0AgGEobwAA\nDEN5AwBgGE+uB1hrU1NT+umnn5RMJlVdXa19+/alnX/06JFGRkY0NzenxsZGvfTSS5KkWCym69ev\n67fffpNlWaqurtbevXtzEeEvDQ8P6+2335Zt2+rq6tLZs2fTzo+Njenvf/+7/vWvf6m3t1fd3d2S\npOnpaf3tb39TNBqVy+XSP/7xD50+fToXEf6S0/M5eX86OZvE3jR9/UzLt6HKO5lM6saNG2pra5PP\n51MkElF5ebn8fn/qmk2bNqm5uVkTExNp97pcLjU1NSkQCCiRSGhwcFClpaVp9+aabds6deqUrl+/\nruLiYjU0NKi9vV2hUCh1zfbt23Xu3Dldvnw57V6Px6O+vj7V1dUpFoupvr5eLS0taffmmtPzOXl/\nOjmbxN6UzF4/E/NtqB+bR6NRbd26VVu2bJHb7dauXbs0OTmZds3mzZtVUFAgy7LSXvf5fAoEApIk\nr9crv9+vhYWFtRo9Izdv3lRVVZV27twpr9erzs5ODQ0NpV0TCARUX18vjyf9c1tRUZHq6uokSfn5\n+aqurtbMzMyazZ4Jp+dz8v50cjaJvSmZvX4m5ttQ5b2wsKD8/PzUcX5+/rL+kp88eaKHDx8qGAyu\n5ngrNjMzo9LS0tTxjh07lvUmMDk5qVu3bqmxsXE1x1sxp+dz8v50cjaJvZmp9bp+JubbUOW9GhKJ\nhK5du6ampiZ5vd5cj7PqYrGYOjo61N/fn7aZncLp+Zy8P52cTWJvmm6t822o8s7Ly1MsFksdx2Ix\n5eXlZXy/bdu6evWqdu/erYqKimyMuCIlJSWamppKHU9PT6ukpCTj+xcXF9XR0aHjx4+rvb09GyOu\niNPzOXl/OjmbxN78d9b7+pmYb0OVd2FhoR4/fqz5+XktLS1pfHxc5eXlz70+mUymHY+MjMjv96/L\n35SUpIaGBo2Pj+v+/ft6+vSpLl68qLa2tude/8d8J06cUE1Njc6cOZPtUZfF6fmcvD+dnE1ib/6R\naetnYr4N9dvmLpdLzc3NunLliiQpFArJ7/frzp07sixLNTU1isfjunTpkhKJhCzL0ujoqDo7OzU3\nN6d79+5p27ZtGhwclCQ1NjaqrKwsl5HSuN1unT9/Xi0tLan/rlJdXa0LFy7IsiydPHlS0WhU+/fv\n1/z8vFwul/r7+3X37l3dvn1bAwMDqq2t1b59+2RZlnp7e9Xa2prrWClOz+fk/enkbBJ70/T1MzGf\n9cdPEOtVOBxORqPRXI+RNcFgUD09PbkeI2vC4bBj84XDYTl9bzo9n1P3psT+NNn/7k3rz85tqB+b\nAwDgBJQ3AACGobwBADAM5Q0AgGEobwAADEN5AwBgGMobAADDUN4AABiG8gYAwDCUNwAAhqG8AQAw\nDOUNAIBhKG8AAAxDeQMAYBjKGwAAw1DeAAAYhvIGAMAwlDcAAIahvAEAMAzlDQCAYShvAAAMQ3kD\nAGAYyhsAAMNQ3gAAGMZKJpO5niEjH3300ZJt2479sJF
MJmVZVq7HyBon53NyNklyuVyybTvXY2SN\nx+PR4uJirsfIGqfvTyfnSyaT9ocffuj+s3OetR5muWzbdh05ciTXY2RNJBJRNBrN9RhZEwwGHZvP\nydmk3/M5/dnr6enJ9RhZEw6HHb8/nZovGAw+9wurY7/JAgDgVJQ3AACGobwBADAM5Q0AgGEobwAA\nDEN5AwBgGMobAADDUN4AABiG8gYAwDCUNwAAhqG8AQAwDOUNAIBhKG8AAAxDeQMAYBjKGwAAw1De\nAAAYhvIGAMAwlDcAAIahvAEAMAzlDQCAYShvAAAM48n1AGvtxx9/1Mcff6xkMqnDhw+rq6sr7fzE\nxIT++c9/6n/+5390+vRp/ed//qckaXZ2Vu+//74ePnwoy7LU0dGh//iP/8hFhL80NTWln376Sclk\nUtXV1dq3b1/a+UePHmlkZERzc3NqbGzUSy+9JEmKxWK6fv26fvvtN1mWperqau3duzcXEf4S+czN\n5/Rnb3h4WG+//bZs21ZXV5fOnj2bdn5sbEx///vf9a9//Uu9vb3q7u6WJE1PT+tvf/ubotGoXC6X\n/vGPf+j06dO5iPCXnLw3JfPybajytm1bvb29+uyzz1RQUKA333xTBw4cUGVlZeqaF154Qe+++65+\n+OGHtHs9Ho/eeecdhUIhxeNxvfHGG3r55ZfT7s21ZDKpGzduqK2tTT6fT5FIROXl5fL7/alrNm3a\npObmZk1MTKTd63K51NTUpEAgoEQiocHBQZWWlqbdm2vkMzef058927Z16tQpXb9+XcXFxWpoaFB7\ne7tCoVDqmu3bt+vcuXO6fPly2r0ej0d9fX2qq6tTLBZTfX29Wlpa0u7NNSfvTcnMfBvqx+ajo6Mq\nKytTcXGxvF6vWltbNTIyknaN3+/Xnj175PGkf64JBAKph8nn86miokIPHjxYs9kzEY1GtXXrVm3Z\nskVut1u7du3S5ORk2jWbN29WQUGBLMtKe93n8ykQCEiSvF6v/H6/FhYW1mr0jJDP3HxOf/Zu3ryp\nqqoq7dy5U16vV52dnRoaGkq7JhAIqL6+/pl8RUVFqqurkyTl5+erurpaMzMzazZ7Jpy8NyUz822o\n8n7w4IGKiopSx8FgcFlvAjMzMxobG1t3P/pZWFhQfn5+6jg/P39Zm+jJkyd6+PChgsHgao63YuTL\nzHrM5/Rnb2ZmRqWlpanjHTt2LKuAJycndevWLTU2Nq7meCvm5L0pmZlvQ5X3aojH4+ru7tbZs2fl\n8/lyPc6qSyQSunbtmpqamuT1enM9zqojn7mc/uzFYjF1dHSov78/rUicwsl7U1r7fBuqvAsLCzU7\nO5s6jkajKiwszPj+xcVFdXd369ChQzp48GA2RlyRvLw8xWKx1HEsFlNeXl7G99u2ratXr2r37t2q\nqKjIxogrQr6/tp7zOf3ZKykp0dTUVOp4enpaJSUlGd+/uLiojo4OHT9+XO3t7dkYcUWcvDclM/Nt\nqPJ+8cUXNTU1pV9//VWJRELDw8M6cOBAxvd/8MEHqqys1LFjx7I45fIVFhbq8ePHmp+f19LSksbH\nx1VeXv7c65PJZNrxyMiI/H7/uvuR5P8hXzqT8jn92WtoaND4+Lju37+vp0+f6uLFi2pra3vu9X9c\nuxMnTqimpkZnzpzJ9qjL4uS9KZmZb0P9trnb7dZ7772nt956S7Zt6/Dhw6qsrNS3334ry7J09OhR\nzc3NqbOzU/F4XJZl6csvv9TQ0JDGxsb0/fffq6qqSkePHpVlWTp9+rReffXVXMdKcblcam5u1pUr\nVyRJoVBIfr9fd+7ckWVZqqmpUTwe16VLl5RIJGRZlkZHR9XZ2am5uTndu3dP27Zt0+DgoCSpsbFR\nZWVluYyUhnzm5nP6s+d2u3X+/Hm1tLSk/qtYdXW1Lly4IMuydPLkSUWjUe3fv1/z8/NyuVzq7+/X\n3bt3dfv2bQ0MDKi
2tlb79u2TZVnq7e1Va2trrmOlOHlvSmbms/74CWK9CofDySNHjuR6jKyJRCKK\nRqO5HiNrgsGgY/M5OZv0ez6nP3s9PT25HiNrwuGw4/enU/MFg0H19PRYf3ZuQ/3YHAAAJ6C8AQAw\nDOUNAIBhKG8AAAxDeQMAYBjKGwAAw1DeAAAYhvIGAMAwlDcAAIahvAEAMAzlDQCAYShvAAAMQ3kD\nAGAYyhsAAMNQ3gAAGIbyBgDAMJQ3AACGobwBADAM5Q0AgGEobwAADEN5AwBgGMobAADDUN4AABjG\nSiaTuZ4hIx999NGSbduO/bCRTCZlWVaux8gal8sl27ZzPUZWODmbJHk8Hi0uLuZ6jKxx+vo5PZ+T\n96fb7bb/67/+y/1n5zxrPcxy2bbtOnLkSK7HyJpIJKJoNJrrMbImGAzKqesXiUQcm036PV9PT0+u\nx8iacDjs+PVzej6n7s9wOPzcL6yO/SYLAIBTUd4AABiG8gYAwDCUNwAAhqG8AQAwDOUNAIBhKG8A\nAAxDeQMAYBjKGwAAw1DeAAAYhvIGAMAwlDcAAIahvAEAMAzlDQCAYShvAAAMQ3kDAGAYyhsAAMNQ\n3gAAGIbyBgDAMJQ3AACGobwBADAM5Q0AgGE2XHn/+OOPOnTokF577TV9/vnnz5yfmJjQsWPHVF9f\nry+++CL1+uzsrLq6uvT666/r8OHDGhgYWMuxMzY1NaWvv/5aX331lX7++ednzj969EjfffedPv30\nU92+fTv1eiwW09DQkC5evKhvvvlGv/zyy1qOnTGnr5+T8w0PDysUCmn37t36+OOPnzk/NjamV155\nRZs2bVJfX1/q9enpaR08eFB79uxRbW2tPvnkk7UcO2NOXjvJ+flM25+eNflT1gnbttXb26vPPvtM\nBQUFevPNN3XgwAFVVlamrnnhhRf07rvv6ocffki71+Px6J133lEoFFI8Htcbb7yhl19+Oe3eXEsm\nk7px44ba2trk8/kUiURUXl4uv9+fumbTpk1qbm7WxMRE2r0ul0tNTU0KBAJKJBIaHBxUaWlp2r25\n5vT1c3I+27Z16tQpXb9+XcXFxWpoaFB7e7tCoVDqmu3bt+vcuXO6fPly2r0ej0d9fX2qq6tTLBZT\nfX29Wlpa0u7NNSevnbQx8pm2PzfUN+/R0VGVlZWpuLhYXq9Xra2tGhkZSbvG7/drz5498njSP9cE\nAoHUYvh8PlVUVOjBgwdrNnsmotGotm7dqi1btsjtdmvXrl2anJxMu2bz5s0qKCiQZVlpr/t8PgUC\nAUmS1+uqVgErAAARmklEQVSV3+/XwsLCWo2eEaevn5Pz3bx5U1VVVdq5c6e8Xq86Ozs1NDSUdk0g\nEFB9ff0z2YqKilRXVydJys/PV3V1tWZmZtZs9kw4ee0k5+czcX9uqPJ+8OCBioqKUsfBYHBZm2hm\nZkZjY2Pau3fvao63YgsLC8rPz08d5+fnL6uAnzx5oocPHyoYDK7meCvm9PVzcr6ZmRmVlpamjnfs\n2LGsN7jJyUndunVLjY2Nqzneijl57STn5zNxf26o8l4N8Xhc3d3dOnv2rHw+X67HWXWJRELXrl1T\nU1OTvF5vrsdZdU5fPyfni8Vi6ujoUH9/f9qHVKdw8tpJzs+31vtzQ5V3YWGhZmdnU8fRaFSFhYUZ\n37+4uKju7m4dOnRIBw8ezMaIK5KXl6dYLJY6jsViysvLy/h+27Z19epV7d69WxUVFdkYcUWcvn5O\nzldSUqKpqanU8fT0tEpKSjK+f3FxUR0dHTp+/Lja29uzMeKKOHntJOfnM3F/bqjyfvHFFzU1NaVf\nf/1ViURCw8PDOnDgQMb3f/DBB6qsrNSxY8eyOOXyFRYW6vHjx5qfn9fS0pLGx8dVXl7+3OuTyWTa\n8cjIiPx+/7r7kdb/cfr6OTlfQ0ODxsfHdf/+fT19+lQXL15UW1vbc6//4948ceKEa
mpqdObMmWyP\nuixOXjvJ+flM3J8b6rfN3W633nvvPb311luybVuHDx9WZWWlvv32W1mWpaNHj2pubk6dnZ2Kx+Oy\nLEtffvmlhoaGNDY2pu+//15VVVU6evSoLMvS6dOn9eqrr+Y6VorL5VJzc7OuXLkiSQqFQvL7/bpz\n544sy1JNTY3i8bguXbqkRCIhy7I0Ojqqzs5Ozc3N6d69e9q2bZsGBwclSY2NjSorK8tlpDROXz8n\n53O73Tp//rxaWlpk27a6urpUXV2tCxcuyLIsnTx5UtFoVPv379f8/LxcLpf6+/t19+5d3b59WwMD\nA6qtrdW+fftkWZZ6e3vV2tqa61gpTl47aWPkM21/Wn/8BLFehcPh5JEjR3I9RtZEIhFFo9Fcj5E1\nwWBQTl2/SCTi2GzS7/l6enpyPUbWhMNhx6+f0/M5dX+Gw2H19PRYf3ZuQ/3YHAAAJ6C8AQAwDOUN\nAIBhKG8AAAxDeQMAYBjKGwAAw1DeAAAYhvIGAMAwlDcAAIahvAEAMAzlDQCAYShvAAAMQ3kDAGAY\nyhsAAMNQ3gAAGIbyBgDAMJQ3AACGobwBADAM5Q0AgGEobwAADEN5AwBgGMobAADDUN4AABjGSiaT\nuZ4hIx999NGSbduO/bDh8Xi0uLiY6zGyxsn5XC6XbNvO9RhZ4+S1k1g/0yWTSVmWlesxsiKZTNof\nfvih+8/OedZ6mOWybdt15MiRXI+RNZFIRD09PbkeI2vC4bBj84XDYbE3zcX6mS0cDisajeZ6jKwI\nBoPP/cLq2G+yAAA4FeUNAIBhKG8AAAxDeQMAYBjKGwAAw1DeAAAYhvIGAMAwlDcAAIahvAEAMAzl\nDQCAYShvAAAMQ3kDAGAYyhsAAMNQ3gAAGIbyBgDAMJQ3AACGobwBADAM5Q0AgGEobwAADEN5AwBg\nGMobAADDUN4AABhmw5X3jz/+qEOHDum1117T559//sz5iYkJHTt2TPX19friiy9Sr8/Ozqqrq0uv\nv/66Dh8+rIGBgbUcO2PDw8MKhULavXu3Pv7442fOj42N6ZVXXtGmTZvU19eXen16eloHDx7Unj17\nVFtbq08++WQtx86Y0/M5eX+yduauneT89ZuamtLXX3+tr776Sj///PMz5x89eqTvvvtOn376qW7f\nvp16PRaLaWhoSBcvXtQ333yjX375ZU3m9azJn7JO2Lat3t5effbZZyooKNCbb76pAwcOqLKyMnXN\nCy+8oHfffVc//PBD2r0ej0fvvPOOQqGQ4vG43njjDb388stp9+aabds6deqUrl+/ruLiYjU0NKi9\nvV2hUCh1zfbt23Xu3Dldvnw57V6Px6O+vj7V1dUpFoupvr5eLS0taffm2kbI59T9ydqZu3aS89cv\nmUzqxo0bamtrk8/nUyQSUXl5ufx+f+qaTZs2qbm5WRMTE2n3ulwuNTU1KRAIKJFIaHBwUKWlpWn3\nZsOG+uY9OjqqsrIyFRcXy+v1qrW1VSMjI2nX+P1+7dmzRx5P+ueaQCCQ2mw+n08VFRV68ODBms2e\niZs3b6qqqko7d+6U1+tVZ2enhoaG0q4JBAKqr69/Jl9RUZHq6uokSfn5+aqurtbMzMyazZ4Jp+dz\n8v5k7cxdO8n56xeNRrV161Zt2bJFbrdbu3bt0uTkZNo1mzdvVkFBgSzLSnvd5/MpEAhIkrxer/x+\nvxYWFrI+84Yq7wcPHqioqCh1HAwGl/WQzMzMaGxsTHv37l3N8VZsZmZGpaWlqeMdO3Ys6yGZnJzU\nrVu31NjYuJrjrZjT8zl5f7J2mVmPayc5f/0WFhaUn5+fOs7Pz19WAT958kQPHz5UMBhczfH+1IYq\n79UQj8fV3d2ts2fPyufz5XqcVReLxdTR0aH+/v60zewUTs/n5P3J2pnN6euXSCR07do1NTU1yev1\nZv3P21DlXVhYqNnZ2dRxNBpVYWFhxvcvLi6qu
7tbhw4d0sGDB7Mx4oqUlJRoamoqdTw9Pa2SkpKM\n719cXFRHR4eOHz+u9vb2bIy4Ik7P5+T9ydr9tfW8dpLz1y8vL0+xWCx1HIvFlJeXl/H9tm3r6tWr\n2r17tyoqKrIx4jM2VHm/+OKLmpqa0q+//qpEIqHh4WEdOHAg4/s/+OADVVZW6tixY1mccvkaGho0\nPj6u+/fv6+nTp7p48aLa2tqee30ymUw7PnHihGpqanTmzJlsj7osTs/n5P3J2v219bx2kvPXr7Cw\nUI8fP9b8/LyWlpY0Pj6u8vLy517/x3wjIyPy+/1r+s8dG+q3zd1ut9577z299dZbsm1bhw8fVmVl\npb799ltZlqWjR49qbm5OnZ2disfjsixLX375pYaGhjQ2Nqbvv/9eVVVVOnr0qCzL0unTp/Xqq6/m\nOlaK2+3W+fPn1dLSItu21dXVperqal24cEGWZenkyZOKRqPav3+/5ufn5XK51N/fr7t37+r27dsa\nGBhQbW2t9u3bJ8uy1Nvbq9bW1lzHStkI+Zy6P1k7c9dOcv76uVwuNTc368qVK5KkUCgkv9+vO3fu\nyLIs1dTUKB6P69KlS0okErIsS6Ojo+rs7NTc3Jzu3bunbdu2aXBwUJLU2NiosrKyrM5s/fETxHoV\nDoeTR44cyfUYWROJRNTT05PrMbImHA47Nl84HBZ701ysn9nC4bCi0Wiux8iKYDConp4e68/Obagf\nmwMA4ASUNwAAhqG8AQAwDOUNAIBhKG8AAAxDeQMAYBjKGwAAw1DeAAAYhvIGAMAwlDcAAIahvAEA\nMAzlDQCAYShvAAAMQ3kDAGAYyhsAAMNQ3gAAGIbyBgDAMJQ3AACGobwBADAM5Q0AgGEobwAADEN5\nAwBgGMobAADDWMlkMtczZOSjjz5asm3bsR82PB6PFhcXcz1G1rhcLtm2nesxsiKZTMqyrFyPkTVO\nz+d2u7W0tJTrMbLG6evn5PcWl8tl//Of/3T/2TnPWg+zXLZtu44cOZLrMbImEomop6cn12NkTTgc\nllPXLxKJKBqN5nqMrAkGg47P5/Rnz+nr5+D3lud+YXXsN1kAAJyK8gYAwDCUNwAAhqG8AQAwDOUN\nAIBhKG8AAAxDeQMAYBjKGwAAw1DeAAAYhvIGAMAwlDcAAIahvAEAMAzlDQCAYShvAAAMQ3kDAGAY\nyhsAAMNQ3gAAGIbyBgDAMJQ3AACGobwBADAM5Q0AgGEobwAADLPhyvvHH3/UoUOH9Nprr+nzzz9/\n5vzExISOHTum+vp6ffHFF6nXZ2dn1dXVpddff12HDx/WwMDAWo6dseHhYYVCIe3evVsff/zxM+fH\nxsb0yiuvaNOmTerr60u9Pj09rYMHD2rPnj2qra3VJ598spZjZ8zp6zc1NaWvv/5aX331lX7++edn\nzj969EjfffedPv30U92+fTv1eiwW09DQkC5evKhvvvlGv/zyy1qOnREnZ5Oc/+w5ff1Me2/xrMmf\nsk7Ytq3e3l599tlnKigo0JtvvqkDBw6osrIydc0LL7ygd999Vz/88EPavR6PR++8845CoZDi8bje\neOMNvfzyy2n35ppt2zp16pSuX7+u4uJiNTQ0qL29XaFQKHXN9u3bde7cOV2+fDntXo/Ho76+PtXV\n1SkWi6m+vl4tLS1p9+aa09cvmUzqxo0bamtrk8/nUyQSUXl5ufx+f+qaTZs2qbm5WRMTE2n3ulwu\nNTU1KRAIKJFIaHBwUKWlpWn35pKTs0nOf/Y2wvqZ9t6yob55j46OqqysTMXFxfJ6vWptbdXIyEja\nNX6/X3v27JHHk/65JhAIpB4mn8+niooKPXjwYM1mz8TNmzdVVVWlnTt3yuv1qrOzU0NDQ2nXBAIB\n1dfXP5OvqKhIdXV1kqT8/HxVV1drZmZmzWbPhNPXLxqNauvWrdqyZYvcbrd27dqlycnJtGs2b96s\ngoICWZaV9
rrP51MgEJAkeb1e+f1+LSwsrNXo/5aTs0nOf/acvn4mvrdsqPJ+8OCBioqKUsfBYHBZ\nf8kzMzMaGxvT3r17V3O8FZuZmVFpaWnqeMeOHct6E5icnNStW7fU2Ni4muOtmNPXb2FhQfn5+anj\n/Pz8Zb3JPXnyRA8fPlQwGFzN8VbEydkk5z97Tl8/E99bNlR5r4Z4PK7u7m6dPXtWPp8v1+Osulgs\npo6ODvX396c9rE7h9PVLJBK6du2ampqa5PV6cz3OqnJyNsn5z57T12+t31s2VHkXFhZqdnY2dRyN\nRlVYWJjx/YuLi+ru7tahQ4d08ODBbIy4IiUlJZqamkodT09Pq6SkJOP7FxcX1dHRoePHj6u9vT0b\nI66I09cvLy9PsVgsdRyLxZSXl5fx/bZt6+rVq9q9e7cqKiqyMeKyOTmb5Pxnz+nrZ+J7y4Yq7xdf\nfFFTU1P69ddflUgkNDw8rAMHDmR8/wcffKDKykodO3Ysi1MuX0NDg8bHx3X//n09ffpUFy9eVFtb\n23OvTyaTaccnTpxQTU2Nzpw5k+1Rl8Xp61dYWKjHjx9rfn5eS0tLGh8fV3l5+XOv/+P6jYyMyO/3\nr7t/DpCcnU1y/rPn9PUz8b1lQ/22udvt1nvvvae33npLtm3r8OHDqqys1LfffivLsnT06FHNzc2p\ns7NT8XhclmXpyy+/1NDQkMbGxvT999+rqqpKR48elWVZOn36tF599dVcx0pxu906f/68WlpaZNu2\nurq6VF1drQsXLsiyLJ08eVLRaFT79+/X/Py8XC6X+vv7dffuXd2+fVsDAwOqra3Vvn37ZFmWent7\n1dramutYKU5fP5fLpebmZl25ckWSFAqF5Pf7defOHVmWpZqaGsXjcV26dEmJREKWZWl0dFSdnZ2a\nm5vTvXv3tG3bNg0ODkqSGhsbVVZWlstIKU7OJjn/2dsI62fae4v1x09I61U4HE4eOXIk12NkTSQS\nUU9PT67HyJpwOCynrl8kElE0Gs31GFkTDAYdn8/pz57T18/J7y09PT3Wn53bUD82BwDACShvAAAM\nQ3kDAGAYyhsAAMNQ3gAAGIbyBgDAMJQ3AACGobwBADAM5Q0AgGEobwAADEN5AwBgGMobAADDUN4A\nABiG8gYAwDCUNwAAhqG8AQAwDOUNAIBhKG8AAAxDeQMAYBjKGwAAw1DeAAAYhvIGAMAwlDcAAIax\nkslkrmfIyEcffTRr23Yw13Nki8fjsRcXFx37Ycrlctm2bTsyXzKZtC3LcmQ2yfn53G63vbS05Nh8\nTl8/J7+3uFyu6D//+c+iPztnTHkDAIDfOfLTCgAATkZ5AwBgGMobAADDUN4AABiG8gYAwDCUNwAA\nhqG8AQAwDOUNAIBhKG8AAAxDeQMAYBjKGwAAw1DeAAAYhvIGAMAwlDcAAIahvAEAMAzlDQCAYShv\nAAAMQ3kDAGAYyhsAAMP8P1qBrT7BINI0AAAAAElFTkSuQmCC\n", - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], - "source": [ - "import itertools\n", - "import random\n", - "# http://stackoverflow.com/questions/10194482/custom-matplotlib-plot-chess-board-like-table-with-colored-cells\n", - "\n", - "from matplotlib.table import Table\n", - "\n", - "def main():\n", - " grid_table(8, 8)\n", - " plt.axis('scaled')\n", - " plt.show()\n", - "\n", - "def grid_table(nrows, ncols):\n", - " fig, ax 
= plt.subplots()\n", - " ax.set_axis_off()\n", - " colors = ['white', 'lightgrey', 'dimgrey']\n", - " tb = Table(ax, bbox=[0,0,2,2])\n", - " for i,j in itertools.product(range(ncols), range(nrows)):\n", - " tb.add_cell(i, j, 2./ncols, 2./nrows, text='{:0.2f}'.format(0.1234), \n", - " loc='center', facecolor=random.choice(colors), edgecolor='grey') # facecolors=\n", - " ax.add_table(tb)\n", - " #ax.plot([0, .3], [.2, .2])\n", - " #ax.add_line(plt.Line2D([0.3, 0.5], [0.7, 0.7], linewidth=2, color='blue'))\n", - " return fig\n", - "\n", - "main()" - ] - }, - { - "cell_type": "code", - "execution_count": 55, - "metadata": { - "collapsed": false - }, - "outputs": [], - "source": [ - "import collections\n", - "class defaultkeydict(collections.defaultdict):\n", - " \"\"\"Like defaultdict, but the default_factory is a function of the key.\n", - " >>> d = defaultkeydict(abs); d[-42]\n", - " 42\n", - " \"\"\"\n", - " def __missing__(self, key):\n", - " self[key] = self.default_factory(key)\n", - " return self[key]" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, - "outputs": [], - "source": [] - } - ], - "metadata": { - "kernelspec": { - "display_name": "Python 3", - "language": "python", - "name": "python3" - }, - "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", - "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.5.1" - }, - "widgets": { - "state": {}, - "version": "1.1.1" - } - }, - "nbformat": 4, - "nbformat_minor": 0 -} diff --git a/search.ipynb b/search.ipynb index d27d42f22..caf231dcc 100644 --- a/search.ipynb +++ b/search.ipynb @@ -15,12 +15,12 @@ "cell_type": "code", "execution_count": 1, "metadata": { - "collapsed": true, "scrolled": true }, "outputs": [], "source": [ "from search import *\n", + "from notebook import psource, heatmap, gaussian_kernel, show_map, 
final_path_colors, display_visual, plot_NQueens\n",
    "\n",
    "# Needed to hide warnings in the matplotlib sections\n",
    "import warnings\n",
@@ -35,12 +35,21 @@
    "\n",
    "* Overview\n",
    "* Problem\n",
+    "* Node\n",
+    "* Simple Problem Solving Agent\n",
    "* Search Algorithms Visualization\n",
    "* Breadth-First Tree Search\n",
    "* Breadth-First Search\n",
+    "* Best First Search\n",
    "* Uniform Cost Search\n",
+    "* Greedy Best First Search\n",
    "* A\\* Search\n",
-    "* Genetic Algorithm"
+    "* Hill Climbing\n",
+    "* Simulated Annealing\n",
+    "* Genetic Algorithm\n",
+    "* AND-OR Graph Search\n",
+    "* Online DFS Agent\n",
+    "* LRTA* Agent"
   ]
  },
  {
@@ -49,26 +58,52 @@
   "source": [
    "## OVERVIEW\n",
    "\n",
-    "Here, we learn about problem solving. Building goal-based agents that can plan ahead to solve problems, in particular, navigation problem/route finding problem. First, we will start the problem solving by precisely defining **problems** and their **solutions**. We will look at several general-purpose search algorithms. Broadly, search algorithms are classified into two types:\n",
+    "Here, we learn about a specific kind of problem solving: building goal-based agents that can plan ahead to solve problems. In particular, we examine the navigation problem (also known as the route-finding problem). We must begin by precisely defining **problems** and their **solutions**. We will look at several general-purpose search algorithms.\n",
+    "\n",
+    "Search algorithms can be classified into two types:\n",
    "\n",
    "* **Uninformed search algorithms**: Search algorithms which explore the search space without having any information about the problem other than its definition.\n",
-    "* Examples:\n",
-    "  1. Breadth First Search\n",
-    "  2. Depth First Search\n",
-    "  3. Depth Limited Search\n",
-    "  4. Iterative Deepening Search\n",
+    "  * Examples:\n",
+    "      1. Breadth First Search\n",
+    "      2. Depth First Search\n",
+    "      3. Depth Limited Search\n",
+    "      4. Iterative Deepening Search\n",
    "\n",
    "\n",
    "* **Informed search algorithms**: These types of algorithms leverage any information (heuristics, path cost) on the problem to search through the search space to find the solution efficiently.\n",
-    "* Examples:\n",
-    "  1. Best First Search\n",
-    "  2. Uniform Cost Search\n",
-    "  3. A\\* Search\n",
-    "  4. Recursive Best First Search\n",
+    "  * Examples:\n",
+    "      1. Best First Search\n",
+    "      2. Uniform Cost Search\n",
+    "      3. A\\* Search\n",
+    "      4. Recursive Best First Search\n",
    "\n",
    "*Don't miss the visualisations of these algorithms solving the route-finding problem defined on Romania map at the end of this notebook.*"
   ]
  },
+ {
+  "cell_type": "markdown",
+  "metadata": {},
+  "source": [
+   "For visualisations, we use networkx and matplotlib to show the map in the notebook, and we use ipywidgets to interact with the map to see how the searching algorithm works. These are imported as required in `notebook.py`."
+  ]
+ },
+ {
+  "cell_type": "code",
+  "execution_count": 2,
+  "metadata": {},
+  "outputs": [],
+  "source": [
+   "%matplotlib inline\n",
+   "import networkx as nx\n",
+   "import matplotlib.pyplot as plt\n",
+   "from matplotlib import lines\n",
+   "\n",
+   "from ipywidgets import interact\n",
+   "import ipywidgets as widgets\n",
+   "from IPython.display import display\n",
+   "import time"
+  ]
+ },
 {
  "cell_type": "markdown",
  "metadata": {},
@@ -80,13 +115,161 @@
 },
 {
  "cell_type": "code",
-  "execution_count": 2,
-  "metadata": {
-   "collapsed": true
-  },
-  "outputs": [],
+  "execution_count": 3,
+  "metadata": {},
+  "outputs": [
+   {
+    "data": {
+     "text/html": [
+      "\n",
+      "\n",
+      "\n",
+      "\n",
+      "    \n",
+      "    \n",
+      "\n",
+      "\n",
+      "    \n",
+      "class Problem(object):\n",
    +       "\n",
    +       "    """The abstract class for a formal problem. You should subclass\n",
    +       "    this and implement the methods actions and result, and possibly\n",
    +       "    __init__, goal_test, and path_cost. Then you will create instances\n",
    +       "    of your subclass and solve them with the various search functions."""\n",
    +       "\n",
    +       "    def __init__(self, initial, goal=None):\n",
    +       "        """The constructor specifies the initial state, and possibly a goal\n",
    +       "        state, if there is a unique goal. Your subclass's constructor can add\n",
    +       "        other arguments."""\n",
    +       "        self.initial = initial\n",
    +       "        self.goal = goal\n",
    +       "\n",
    +       "    def actions(self, state):\n",
    +       "        """Return the actions that can be executed in the given\n",
    +       "        state. The result would typically be a list, but if there are\n",
    +       "        many actions, consider yielding them one at a time in an\n",
    +       "        iterator, rather than building them all at once."""\n",
    +       "        raise NotImplementedError\n",
    +       "\n",
    +       "    def result(self, state, action):\n",
    +       "        """Return the state that results from executing the given\n",
    +       "        action in the given state. The action must be one of\n",
    +       "        self.actions(state)."""\n",
    +       "        raise NotImplementedError\n",
    +       "\n",
    +       "    def goal_test(self, state):\n",
    +       "        """Return True if the state is a goal. The default method compares the\n",
    +       "        state to self.goal or checks for state in self.goal if it is a\n",
    +       "        list, as specified in the constructor. Override this method if\n",
    +       "        checking against a single self.goal is not enough."""\n",
    +       "        if isinstance(self.goal, list):\n",
    +       "            return is_in(state, self.goal)\n",
    +       "        else:\n",
    +       "            return state == self.goal\n",
    +       "\n",
    +       "    def path_cost(self, c, state1, action, state2):\n",
    +       "        """Return the cost of a solution path that arrives at state2 from\n",
    +       "        state1 via action, assuming cost c to get up to state1. If the problem\n",
    +       "        is such that the path doesn't matter, this function will only look at\n",
    +       "        state2.  If the path does matter, it will consider c and maybe state1\n",
    +       "        and action. The default method costs 1 for every step in the path."""\n",
    +       "        return c + 1\n",
    +       "\n",
    +       "    def value(self, state):\n",
    +       "        """For optimization problems, each state has a value.  Hill-climbing\n",
    +       "        and related algorithms try to maximize this value."""\n",
    +       "        raise NotImplementedError\n",
    +       "
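The listing above shows the full interface; as a stand-alone illustration (not part of the notebook), here is a minimal sketch of subclassing `Problem` for a hypothetical toy problem. The base class is condensed from the code above so the sketch runs on its own, and `CountToProblem` with integer states is purely illustrative:

```python
class Problem:
    """Condensed form of the Problem base class shown above."""

    def __init__(self, initial, goal=None):
        self.initial = initial
        self.goal = goal

    def actions(self, state):
        raise NotImplementedError

    def result(self, state, action):
        raise NotImplementedError

    def goal_test(self, state):
        return state == self.goal

    def path_cost(self, c, state1, action, state2):
        # Default: every step costs 1.
        return c + 1


class CountToProblem(Problem):
    """Toy subclass: states are integers, actions add 1 or 2."""

    def actions(self, state):
        return [1, 2]

    def result(self, state, action):
        return state + action


problem = CountToProblem(0, goal=5)
state = problem.result(problem.initial, 2)   # 0 + 2 -> 2
print(problem.actions(state), problem.goal_test(state))  # prints: [1, 2] False
```

Only `actions` and `result` had to be overridden here; `goal_test` and `path_cost` come from the defaults, which is the usual pattern for simple problems.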
\n", + "\n", + "\n" ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "%psource Problem" + "psource(Problem)" ] }, { @@ -95,7 +278,7 @@ "source": [ "The `Problem` class has six methods.\n", "\n", - "* `__init__(self, initial, goal)` : This is what is called a `constructor` and is the first method called when you create an instance of the class. `initial` specifies the initial state of our search problem. It represents the start state from where our agent begins its task of exploration to find the goal state(s) which is given in the `goal` parameter.\n", + "* `__init__(self, initial, goal)` : This is what is called a `constructor`. It is the first method called when you create an instance of the class as `Problem(initial, goal)`. The variable `initial` specifies the initial state $s_0$ of the search problem. It represents the beginning state. From here, our agent begins its task of exploration to find the goal state(s) which is given in the `goal` parameter.\n", "\n", "\n", "* `actions(self, state)` : This method returns all the possible actions the agent can execute in the given state `state`.\n", "\n", "\n", "* `result(self, state, action)` : This returns the resulting state if action `action` is taken in the state `state`. This `Problem` class only deals with deterministic outcomes. So we know for sure what every action in a state would result in.\n", "\n", "\n", - "* `goal_test(self, state)` : Given a graph state, it checks if it is a terminal state. If the state is indeed a goal state, value of `True` is returned.
Else, of course, `False` is returned.\n", + "* `goal_test(self, state)` : Return a boolean for a given state: `True` if it is a goal state, else `False`.\n", "\n", "\n", "* `path_cost(self, c, state1, action, state2)` : Return the cost of the path that arrives at `state2` as a result of taking `action` from `state1`, assuming total cost of `c` to get up to `state1`.\n", "\n", "\n", "* `value(self, state)` : This provides extra information for optimisation problems, where we try to maximise a value rather than test for a goal." ] }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## NODE\n", + "\n", + "Let's see how we define a Node. Run the next cell to see how the `Node` class is defined in the search module." + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    class Node:\n",
    +       "\n",
    +       "    """A node in a search tree. Contains a pointer to the parent (the node\n",
    +       "    that this is a successor of) and to the actual state for this node. Note\n",
    +       "    that if a state is arrived at by two paths, then there are two nodes with\n",
    +       "    the same state.  Also includes the action that got us to this state, and\n",
    +       "    the total path_cost (also known as g) to reach the node.  Other functions\n",
    +       "    may add an f and h value; see best_first_graph_search and astar_search for\n",
    +       "    an explanation of how the f and h values are handled. You will not need to\n",
    +       "    subclass this class."""\n",
    +       "\n",
    +       "    def __init__(self, state, parent=None, action=None, path_cost=0):\n",
    +       "        """Create a search tree Node, derived from a parent by an action."""\n",
    +       "        self.state = state\n",
    +       "        self.parent = parent\n",
    +       "        self.action = action\n",
    +       "        self.path_cost = path_cost\n",
    +       "        self.depth = 0\n",
    +       "        if parent:\n",
    +       "            self.depth = parent.depth + 1\n",
    +       "\n",
    +       "    def __repr__(self):\n",
    +       "        return "<Node {}>".format(self.state)\n",
    +       "\n",
    +       "    def __lt__(self, node):\n",
    +       "        return self.state < node.state\n",
    +       "\n",
    +       "    def expand(self, problem):\n",
    +       "        """List the nodes reachable in one step from this node."""\n",
    +       "        return [self.child_node(problem, action)\n",
    +       "                for action in problem.actions(self.state)]\n",
    +       "\n",
    +       "    def child_node(self, problem, action):\n",
    +       "        """[Figure 3.10]"""\n",
    +       "        next_state = problem.result(self.state, action)\n",
    +       "        next_node = Node(next_state, self, action,\n",
    +       "                    problem.path_cost(self.path_cost, self.state,\n",
    +       "                                      action, next_state))\n",
    +       "        return next_node\n",
    +       "    \n",
    +       "    def solution(self):\n",
    +       "        """Return the sequence of actions to go from the root to this node."""\n",
    +       "        return [node.action for node in self.path()[1:]]\n",
    +       "\n",
    +       "    def path(self):\n",
    +       "        """Return a list of nodes forming the path from the root to this node."""\n",
    +       "        node, path_back = self, []\n",
    +       "        while node:\n",
    +       "            path_back.append(node)\n",
    +       "            node = node.parent\n",
    +       "        return list(reversed(path_back))\n",
    +       "\n",
    +       "    # We want for a queue of nodes in breadth_first_graph_search or\n",
    +       "    # astar_search to have no duplicated states, so we treat nodes\n",
    +       "    # with the same state as equal. [Problem: this may not be what you\n",
    +       "    # want in other contexts.]\n",
    +       "\n",
    +       "    def __eq__(self, other):\n",
    +       "        return isinstance(other, Node) and self.state == other.state\n",
    +       "\n",
    +       "    def __hash__(self):\n",
    +       "        return hash(self.state)\n",
    +       "
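As a stand-alone illustration of how `Node` ties into `Problem` (condensed from the listing above; the two-action `StepProblem` is a hypothetical toy problem, not part of the notebook):

```python
class Node:
    """Condensed form of the Node class shown above."""

    def __init__(self, state, parent=None, action=None, path_cost=0):
        self.state = state
        self.parent = parent
        self.action = action
        self.path_cost = path_cost
        self.depth = parent.depth + 1 if parent else 0

    def expand(self, problem):
        # Child nodes reachable in one step from this node.
        return [self.child_node(problem, a) for a in problem.actions(self.state)]

    def child_node(self, problem, action):
        next_state = problem.result(self.state, action)
        return Node(next_state, self, action,
                    problem.path_cost(self.path_cost, self.state, action, next_state))

    def path(self):
        # Walk parent pointers back to the root, then reverse.
        node, back = self, []
        while node:
            back.append(node)
            node = node.parent
        return list(reversed(back))

    def solution(self):
        # Actions along the path, skipping the root (which has no action).
        return [n.action for n in self.path()[1:]]


class StepProblem:
    """Hypothetical problem: integer states, actions add 1 or 2, unit step cost."""

    def actions(self, state):
        return [1, 2]

    def result(self, state, action):
        return state + action

    def path_cost(self, c, state1, action, state2):
        return c + 1


root = Node(0)
child = root.child_node(StepProblem(), 2)        # state 2, depth 1
grandchild = child.child_node(StepProblem(), 1)  # state 3, depth 2
print(grandchild.solution(), grandchild.path_cost)  # prints: [2, 1] 2
```

Note how the node never stores the whole path: `path()` and `solution()` reconstruct it on demand from the parent pointers, which is what the search algorithms later rely on.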
\n", + "\n", + "\n" ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(Node)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The `Node` class has nine methods. The first is the `__init__` method.\n", + "\n", + "* `__init__(self, state, parent, action, path_cost)` : This method creates a node. `parent` represents the node that this is a successor of and `action` is the action required to get from the parent node to this node. `path_cost` is the cost to reach the current node from the parent node.\n", + "\n", + "The next four methods are specific `Node`-related functions.\n", + "\n", + "* `expand(self, problem)` : This method lists all the neighbouring (reachable in one step) nodes of the current node.\n", + "\n", + "* `child_node(self, problem, action)` : Given an `action`, this method returns the immediate neighbour that can be reached with that `action`.\n", + "\n", + "* `solution(self)` : This returns the sequence of actions required to reach this node from the root node.\n", + "\n", + "* `path(self)` : This returns a list of all the nodes that lie on the path from the root to this node.\n", + "\n", + "The remaining four methods override standard Python functionality for representing an object as a string, the less-than ($<$) operator, the equality (`==`) operator, and the `hash` function.\n", + "\n", + "* `__repr__(self)` : This returns a string representation of the node, showing its state.\n", + "\n", + "* `__lt__(self, node)` : Given a `node`, this method returns `True` if the state of the current node is less than the state of the given `node`. Otherwise it returns `False`.\n", + "\n", + "* `__eq__(self, other)` : This method returns `True` if the state of the current node equals the state of the `other` node. Else it returns `False`.\n", + "\n", + "* `__hash__(self)` : This returns the hash of the state of the current node."
+ ] + }, { "cell_type": "markdown", "metadata": {}, @@ -122,28 +515,163 @@ }, { "cell_type": "code", - "execution_count": 3, - "metadata": { - "collapsed": true - }, - "outputs": [], + "execution_count": 5, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    class GraphProblem(Problem):\n",
    +       "\n",
    +       "    """The problem of searching a graph from one node to another."""\n",
    +       "\n",
    +       "    def __init__(self, initial, goal, graph):\n",
    +       "        Problem.__init__(self, initial, goal)\n",
    +       "        self.graph = graph\n",
    +       "\n",
    +       "    def actions(self, A):\n",
    +       "        """The actions at a graph node are just its neighbors."""\n",
    +       "        return list(self.graph.get(A).keys())\n",
    +       "\n",
    +       "    def result(self, state, action):\n",
    +       "        """The result of going to a neighbor is just that neighbor."""\n",
    +       "        return action\n",
    +       "\n",
    +       "    def path_cost(self, cost_so_far, A, action, B):\n",
    +       "        return cost_so_far + (self.graph.get(A, B) or infinity)\n",
    +       "\n",
    +       "    def find_min_edge(self):\n",
    +       "        """Find minimum value of edges."""\n",
    +       "        m = infinity\n",
    +       "        for d in self.graph.graph_dict.values():\n",
    +       "            local_min = min(d.values())\n",
    +       "            m = min(m, local_min)\n",
    +       "\n",
    +       "        return m\n",
    +       "\n",
    +       "    def h(self, node):\n",
    +       "        """h function is straight-line distance from a node's state to goal."""\n",
    +       "        locs = getattr(self.graph, 'locations', None)\n",
    +       "        if locs:\n",
    +       "            if type(node) is str:\n",
    +       "                return int(distance(locs[node], locs[self.goal]))\n",
    +       "\n",
    +       "            return int(distance(locs[node.state], locs[self.goal]))\n",
    +       "        else:\n",
    +       "            return infinity\n",
    +       "
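To see what these methods return, here is a condensed, self-contained variant using a hypothetical three-node graph. The dict-of-dicts shape mirrors `romania_map.graph_dict`, and `math.hypot` stands in for the module's `distance` helper; the node names and coordinates are illustrative only:

```python
import math

# Hypothetical toy graph in the same dict-of-dicts shape as romania_map.graph_dict.
graph_dict = {'A': {'B': 5}, 'B': {'A': 5, 'C': 7}, 'C': {'B': 7}}
locations = {'A': (0, 0), 'B': (3, 4), 'C': (6, 8)}


class ToyGraphProblem:
    """Condensed form of GraphProblem, working on the toy graph above."""

    def __init__(self, initial, goal):
        self.initial = initial
        self.goal = goal

    def actions(self, state):
        # The actions at a graph node are just its neighbours.
        return list(graph_dict[state].keys())

    def result(self, state, action):
        # Going to a neighbour lands on that neighbour.
        return action

    def path_cost(self, cost_so_far, A, action, B):
        # Edge weight, or infinity if A and B are not connected.
        return cost_so_far + graph_dict[A].get(B, math.inf)

    def h(self, state):
        # Straight-line distance from a state to the goal.
        (x1, y1), (x2, y2) = locations[state], locations[self.goal]
        return int(math.hypot(x2 - x1, y2 - y1))


prob = ToyGraphProblem('A', 'C')
print(prob.actions('B'))                 # ['A', 'C']
print(prob.path_cost(5, 'B', 'C', 'C'))  # 5 + 7 = 12
print(prob.h('A'))                       # distance (0,0)->(6,8) = 10
```

The same calls on `romania_problem` below behave analogously, with the heuristic `h` giving the straight-line distance used later by A\*-search.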
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "%psource GraphProblem" + "psource(GraphProblem)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Now it's time to define our problem. We will define it by passing `initial`, `goal`, `graph` to `GraphProblem`. So, our problem is to find the goal state starting from the given initial state on the provided graph. Have a look at our romania_map, which is an Undirected Graph containing a dict of nodes as keys and neighbours as values." + "Have a look at our romania_map, which is an Undirected Graph containing a dict of nodes as keys and neighbours as values." ] }, { "cell_type": "code", - "execution_count": 4, - "metadata": { - "collapsed": true - }, + "execution_count": 6, + "metadata": {}, "outputs": [], "source": [ "romania_map = UndirectedGraph(dict(\n", @@ -182,15 +710,15 @@ "And `romania_map.locations` contains the positions of each of the nodes. We will use the straight line distance (which is different from the one provided in `romania_map`) between two cities in algorithms like A\\*-search and Recursive Best First Search.\n", "\n", "**Define a problem:**\n", - "Hmm... say we want to start exploring from **Arad** and try to find **Bucharest** in our romania_map. So, this is how we do it." + "Now it's time to define our problem. We will define it by passing `initial`, `goal`, `graph` to `GraphProblem`. So, our problem is to find the goal state starting from the given initial state on the provided graph. \n", + "\n", + "Say we want to start exploring from **Arad** and try to find **Bucharest** in our romania_map. So, this is how we do it." 
] }, { "cell_type": "code", - "execution_count": 5, - "metadata": { - "collapsed": true - }, + "execution_count": 7, + "metadata": {}, "outputs": [], "source": [ "romania_problem = GraphProblem('Arad', 'Bucharest', romania_map)" @@ -214,14 +742,14 @@ }, { "cell_type": "code", - "execution_count": 6, + "execution_count": 8, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ - "{'Vaslui': (509, 444), 'Sibiu': (207, 457), 'Arad': (91, 492), 'Giurgiu': (375, 270), 'Mehadia': (168, 339), 'Eforie': (562, 293), 'Iasi': (473, 506), 'Oradea': (131, 571), 'Craiova': (253, 288), 'Urziceni': (456, 350), 'Fagaras': (305, 449), 'Pitesti': (320, 368), 'Neamt': (406, 537), 'Rimnicu': (233, 410), 'Zerind': (108, 531), 'Timisoara': (94, 410), 'Hirsova': (534, 350), 'Lugoj': (165, 379), 'Bucharest': (400, 327), 'Drobeta': (165, 299)}\n" + "{'Arad': (91, 492), 'Bucharest': (400, 327), 'Craiova': (253, 288), 'Drobeta': (165, 299), 'Eforie': (562, 293), 'Fagaras': (305, 449), 'Giurgiu': (375, 270), 'Hirsova': (534, 350), 'Iasi': (473, 506), 'Lugoj': (165, 379), 'Mehadia': (168, 339), 'Neamt': (406, 537), 'Oradea': (131, 571), 'Pitesti': (320, 368), 'Rimnicu': (233, 410), 'Sibiu': (207, 457), 'Timisoara': (94, 410), 'Urziceni': (456, 350), 'Vaslui': (509, 444), 'Zerind': (108, 531)}\n" ] } ], @@ -234,202 +762,324 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Let's start the visualisations by importing necessary modules. We use networkx and matplotlib to show the map in the notebook and we use ipywidgets to interact with the map to see how the searching algorithm works." + "Let's get started by initializing an empty graph. We will add nodes, place the nodes in their location as shown in the book, add edges to the graph." 
] }, { "cell_type": "code", - "execution_count": 7, - "metadata": { - "collapsed": true - }, + "execution_count": 9, + "metadata": {}, "outputs": [], "source": [ - "%matplotlib inline\n", - "import networkx as nx\n", - "import matplotlib.pyplot as plt\n", - "from matplotlib import lines\n", + "# node colors, node positions and node label positions\n", + "node_colors = {node: 'white' for node in romania_map.locations.keys()}\n", + "node_positions = romania_map.locations\n", + "node_label_pos = { k:[v[0],v[1]-10] for k,v in romania_map.locations.items() }\n", + "edge_weights = {(k, k2) : v2 for k, v in romania_map.graph_dict.items() for k2, v2 in v.items()}\n", "\n", - "from ipywidgets import interact\n", - "import ipywidgets as widgets\n", - "from IPython.display import display\n", - "import time" + "romania_graph_data = { 'graph_dict' : romania_map.graph_dict,\n", + " 'node_colors': node_colors,\n", + " 'node_positions': node_positions,\n", + " 'node_label_positions': node_label_pos,\n", + " 'edge_weights': edge_weights\n", + " }" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Let's get started by initializing an empty graph. We will add nodes, place the nodes in their location as shown in the book, add edges to the graph." + "We have completed building our graph based on romania_map and its locations. It's time to display it here in the notebook. The function `show_map(node_colors)` helps us do that. We will be calling this function later on to display the map at each step of the search, for a variety of algorithms from the book." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can simply call the function with the `node_colors` dictionary to display the map."
] }, { "cell_type": "code", - "execution_count": 8, + "execution_count": 10, "metadata": { - "collapsed": true + "scrolled": false }, - "outputs": [], + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAABTsAAAPKCAYAAABbVI7QAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvNQv5yAAAIABJREFUeJzs3Xdc1eX///HnQYaCg8SFmCPcouLG1BQX5Uj9OHKVfBLtY0qOzJELREXNcFbmKC0zS1Nz5RZHoqklOTBH7r1yJvP8/uALv06gggJvODzut9u5+Tnv93Vd7+f7KPThxXVdb5PZbDYLAAAAAAAAALI4G6MDAAAAAAAAAEBaoNgJAAAAAAAAwCpQ7AQAAAAAAABgFSh2AgAAAAAAALAKFDsBAAAAAAAAWAWKnQAAAAAAAACsAsVOAAAAAAAAAFaBYicAAAAAAAAAq0CxEwAAAAAAAIBVoNgJAAAAAAAAwCpQ7AQAAAAAAABgFSh2AgAAAAAAALAKFDsBAAAAAAAAWAWKnQAAAAAAAACsAsVOAAAAAAAAAFaBYicAAAAAAAAAq0CxEwAAAAAAAIBVoNgJAAAAAAAAwCpQ7AQAAAAAAABgFSh2AgAAAAAAALAKFDsBAAAAAAAAWAWKnQAAAAAAAACsAsVOAAAAAAAAAFaBYicAAAAAAAAAq0CxEwAAAAAAAIBVoNgJAAAAAAAAwCpQ7AQAAAAAAABgFSh2AgAAAAAAALAKFDsBAAAAAAAAWAWKnQAAAAAAAACsAsVOAAAAAAAAAFaBYicAAAAAAAAAq0CxEwAAAAAAAIBVoNgJAAAAAAAAwCpQ7AQAAAAAAABgFSh2AgAAAAAAALAKFDsBAAAAAAAAWAWKnQAAAAAAAACsAsVOAAAAAAAAAFaBYicAAAAAAAAAq0CxEwAAAAAAAIBVoNgJAAAAAAAAwCrYGh0AyGiRkZHavn27Hj16lHisRo0acnNzMzAVAAAAAAAAnpfJbDabjQ4BZISzZ89q3759cnBwUOPGjeXk5CRJMpvN2rNnjy5duqTChQurXr16MplMBqcFAAAAAABAalHsRLawefNm5cqVSy+//PITC5lXrlzR+vXr1aVLFzk4OGRgQgAAAAAAADwvip2wehs3btRLL72k0qVLp6h9dHS0vv76a7311luytWWnBwAAAAAAgKyCYiesWnh4uMxmszw9PVPV7++//9aaNWvUsWPHdEoGAAAAAACAtMbT2GHVTpw4kepCpyTlypVLefPm1b1799IhFQAAAAAAANIDxU5YrevXr6tgwYLP3L9x48baunVrGiYCAAAAAABAeqLYCav1888/q0GDBs/c387OTrGxsWmYCAAAAAAAAOmJYiesVo4cOWRj83z/xO3s7NIoDQAAAAAAANIbxU5YrbR49hbP7wIAAAAAAMg6KHbCaplMpkwxBgAAAAAAADIGxU5YLVtbWz18+PC5xoiKikqjNAAAAAAAAEhvFDthtRo3bqwtW7Y8c//bt2/L2dk5DRMBAAAAAAAgPVHshNVycHBQZGTkM++7uX37djVq1ChtQwEAAAAAACDdUOyEVatXr55++umnVPc7e/ascufOrRw5cqRDKgAAAAAAAKQHip2waq6uripevLi2bt2a4j4XLlzQgQMH1LRp03RMBgAAAAAAgLRmMj/rGl8gCzl+/Lj27NmjJk2ayM3NLdk20dHRWrhwoV544QW1b98+gxMCAAAAAADgedkaHQDICGXLltWCBQu0fv16tW/fXs7OzipSpIjs7e1169YtXbhwQb
NmTXXp0kU9e/bUu+++q99++01FixaVr6+vJOnll1/WtGnT5OjoqMuXL+vy5csaPHhwun8etra28vX11R9//KGWLVuqVatW6tSpk44dO5bu1wYSmc3S7t3StGlSUFD8n7t3xx8HYBVMZjZNyfYiIiLUuHFjnTt3TnZ2dkbHQSZy8OBB+fj4KCIiQvnz5zc6DgAAQJbh6+urGzduaM2aNfL29lbhwoW1ZMkShYaGytvbW9evX9eMGTP0888/a8uWLYn9bt++rfz582vv3r2qXbt2knHNZrOKFi2qjz76SN27d5cUP7Nz2LBhCg4OliQdPnxYlStX1scff6xBgwZJksV1CxQooAULFqhfv366f/9+BnwayXvw4IFmzZqlKVOmqHXr1hozZgzPB0CyIiIiVKFChecbJDpamj9fmjxZunYt/n10tGRnF/8qVEgaMkTq2TP+PWDl0uTrKpNiZif05Zdf6q233qLQiSQ8PT3Vvn17jR492ugoAAAAWdbkyZO1dOlS7d+/3+L4gQMHtGPHDuXOnTvxlbBHfsJM0GvXrumdd95R2bJllS9fPuXJk0fXrl3TuXPnLMaqUqVK4v8uXLiwJKly5cpJjl27di3tb/AZOTk5aejQoTpx4oTc3NxUvXp1+fv7WyzjB9LE/ftS48bS++9Lp09LDx5IUVHxszmjouLfnz4df75Jk/j2GSAsLEydOnVS0aJFZW9vLxcXFzVr1kwLFy7MstuJrVy5UiEhIUmOJzycLTQ0NE2uk7AFR3KvlStXpsk1/i2t7yG9xgTFzmwvOjpaX331ld5++22joyCTCgoK0tKlSxUeHm50FAAAgCypVq1aat++vYYOHWpxPC4uTi1bttTBgwctXidOnFCrVq0kST169NC+ffs0depU7d69WwcPHlSxYsUUFRVlMdY/Jy4k7LWf3LGE5fGZibOzs4KCghQRESE7OztVqlRJw4cP161bt4yOBmsQHS299pq0b5/08OGT2z58KP3yi9SiRXy/dDRt2jTVq1dPt27d0qRJk7R582Z98cUXKlu2rPr06aM1a9ak6/XTy+OKnenB19dXYWFhSV4NGzbMkOunherVqyssLEzVq1c3OopVsTU6AIy1du1alSlTRuXKlTM6CjIpFxcXBQYGyt/fX9u3b+dBVQAAAM9gwoQJqlixotavX594rHr16vr+++9VokSJx66y2rVrl2bMmKGWLVtKkq5evarLly8/dx57e/tMN3OsUKFCCgkJ0cCBAxUUFKSyZctq4MCB6t+/v3Lnzm10PGRV8+dLv/4qRUamrH1kpHTggPTFF9I776RLpB07dmjQoEHq16+fZsyYYXGuTZs2GjRokB48ePDc14mOjpatrW2yP8NFRkbKwcHhua9hJDc3N3l5eRkd45nExsbKbDYrb968WfYeMjNmdmZz8+fPZ1YnnqpXr166f/++lixZYnQUAACALKl06dLq3bu3pk+fnnisb9++unPnjt544w3t3btXf/75pzZv3qzevXvr3r17kqSyZctq0aJFOnr0qPbt26fOnTvL3t7+ufOULFlSjx490qZNm3Tjxg09fNqMtwz04osvas6cOQoLC9ORI0dUunRpTZ8+XY8ePTI6GrIaszl+j87U/vt++DC+Xzo94mTixInKnz+/Jk+enOx5d3f3xK0pAgICki1W+vr6qmTJkonvz5w5I5PJpE8//VRDhgxR0aJF5eDgoL/++ksLFiyQyWTSjh071LFjRzk7O6tOnTqJfbdv364mTZooT548cnJyko+Pjw4fPmxxvUaNGql+/fravHmzqlevLkdHR3l4eFgsGff19dXChQt18eLFxCXl/8z4T/369VPhwoUV/a8ZtPfv31eePHk0fPjwJ36GKTFv3rwky9pjY2P1yiuvyN3dPfH7bMJnfOjQIXl7e8vR0VGurq4aPXr0U2fDm81mTZ06VeXKlZO9vb1cXV3Vr18/3b1716KdyWTSiBEjNHHiRJUqVUr29vY6dO
hQssvYU/JZJ/j2229Vvnx55cyZU5UrV9aqVavUqFEjNWrU6Nk/OCtAsTMbu3Tpknbt2qWOHTsaHQWZXI4cOTRz5kx98MEHhm5iDwAAkJWNHj1atrb/f3Fd0aJF9fPPP8vGxkavvvqqKlWqpL59+8rBwSFxxtUXX3yh+/fvq0aNGurcubPefvvtxxYPUuPll1/W//73P3Xp0kUFCxZ8bNHFSGXKlNHixYu1YcMGbdmyRWXLltW8efMUExNjdDRkFWFh8Q8jehZXr8b3T2OxsbEKDQ1V8+bNlTNnzjQff/z48Tp+/LjmzJmjFStWWFyjW7duKlWqlJYtW6aJEydKil/t2aRJE+XOnVuLFi3S4sWLde/ePTVo0EDnz5+3GPvUqVPq37+/Bg0apOXLl8vV1VUdOnTQyZMnJUmjRo1SixYtVLBgwcQl5StWrEg257vvvqtr164lOf/NN9/owYMH6tWr11Pv1Ww2KyYmJskrgZ+fnzp27Cg/Pz9dvHhRUvw2bWFhYVq8eLHy5MljMV7btm3VtGlTrVy5Ul27dlVQUJDGjh37xAwjRozQoEGD1KxZM61evVpDhgzRggUL1LJlyySF0gULFmjt2rWaMmWK1q5dq6JFiz523Kd91pK0adMmdevWTeXLl9cPP/ygwYMHa8CAATp+/PhTPzurZ0a2FRwcbPbz8zM6BrKQ7t27m4cNG2Z0DAAAAGRDYWFhZm9vb3OZMmXM3377rTk2NtboSMggR48eTXqwf3+zuWHDJ7/c3c1mk8lsjp+jmbqXyRTf/0nj9++f6nu5cuWKWVKKf64aM2aMObnSTY8ePcwlSpRIfH/69GmzJHO1atXMcXFxFm2//PJLsyTzgAEDkozj7u5ubty4scWxO3fumF1cXMz9/3F/DRs2NNva2pqPHz+eeOzq1atmGxsb8/jx4y1yubm5JbnOtm3bzJLM27Ztsxjz39euVq2a2cfHJ0n/f5P02Nf169cT292+fdtcvHhxc6NGjcyhoaHmHDlymCdMmGAxVsJnHBwcbHHcz8/PnDt3bvPt27eTvYebN2+aHRwczD169LDo9/XXX5slmX/88UeLvK6uruaHDx+m6HNJyWddt25dc6VKlSz+vg8cOGCWZG7YsOFTP8Nkv66sBDM7s7Fhw4Zp7ty5RsdAFjJ58mTNnTtXJ06cMDoKAAAAshkvLy9t3bpVn332maZOnapq1appzZo1MqfTUmNYgdjYZ1+KbjbH989i2rZt+9jnLLRr187i/YkTJ3Tq1Cl169bNYmako6Oj6tatqx07dli0L1OmjMqUKZP4vlChQipUqJDOnTv3TFnfffddbdu2LfHny3379um3337TOyncK/Xtt9/Wvn37krycnZ0T2zg7O2vx4sXauXOnfHx81KBBgyQPi0vQqVMni/edO3fW/fv3kyzpT7Bnzx5FRkaqe/fuSfrZ2tpq+/btFsdfffVV5cqVK0X39rTPOjY2Vvv371f79u0t/r6rV6+uUqVKpega1owHFAFIMVdXVw0dOlQDBgzQ2rVrjY4DAACAbKhJkybas2ePVq1apeHDh2v8+PGaMGGCvL29U9Q/Li5ONjbM+8nypk1LWZuhQ6WoqNSP7+AgDRgg9e+f+r5P4OLioly5cuns2bNpOm4CV1fXFJ+79n9L/Hv27KmePXsmaV+8eHGL9/nz50/SxsHB4Zn3023Xrp2KFCmizz//XFOmTNHs2bNVtGhRtW7dOkX9XV1dVbNmzae28/LyUrly5XT06FH179//sV//hQsXTvZ9whL4f7t161Zijn+ytbWVi4tL4vl/5k2pp33WN27cUHR0tAoVKpSk3b/vIzviOzyAVOnfv79OnTqlNWvWGB0FAAAA2ZTJZFKbNm108OBB9evXT35+furSpcsTZ3leuXJFU6dOla+vr0aPHp3kwSiwQrVrS3Z2z9bX1laqVStt8yi+ENaoUSNt2rRJkSl4QnzCnptR/yrY3rx5M9n2j5vVmdw5FxcXSV
JwcHCyMyRXr1791HzPw87OTn5+flqwYIGuXbumJUuWqGfPnhZ7G6eFwMBAnThxQlWqVNHAgQN1586dZNtdvXo12fdubm7Jtk8oSF65csXieExMjG7evJn4+SZ40t9NahUoUEB2dnaJBet/+vd9ZEcUOwGkir29vaZPn64BAwbwREwAAAAYKkeOHOrWrZuOHTumkJCQx7aLi4vTu+++q2nTpqlIkSLaunWr3NzctHTpUkliKby1qltXSmbmW4oULhzfPx0MGzZMN2/e1AcffJDs+dOnT+v333+XJJUoUUKSLJZS//XXX9q9e/dz5yhXrpxKliypI0eOqGbNmkleCU+ETw0HBwf9/fffKW7/zjvv6M6dO+rYsaMiIyNT9GCi1Ni5c6cmTJig8ePHa/Xq1frrr7/Up0+fZNt+//33Fu+XLFmi3Llzy8PDI9n2Xl5ecnBw0JIlSyyOf/fdd4qJiVHDhg3T5iaSkSNHDtWsWVM//PCDxfevAwcO6PTp0+l23ayCZewAUs3Hx0ceHh4KCQnRhx9+aHQcAAAAZHN2dnZPXCJ66dIlHT16VCNHjkwspkyaNEmzZs1Sy5Yt5ejomFFRkZFMJmnIEOn996WHD1Pez9Exvl8azsT7p1deeUUhISEaNGiQIiIi5Ovrq+LFi+v27dvasmWL5s2bp8WLF6tKlSp67bXXlC9fPvXq1UuBgYGKjIzU5MmTlTt37ufOYTKZ9Mknn6hNmzaKiopSp06dVKBAAV29elW7d+9W8eLFNWjQoFSNWbFiRd26dUufffaZatasqZw5c6py5cqPbe/m5qbWrVtrxYoVat26tV588cUUX+vixYvas2dPkuMlSpSQq6urbt++rW7dusnb21uDBw+WyWTSnDlz1KlTJ/n4+KhHjx4W/ebOnau4uDjVqlVLGzZs0Lx58xQQEGCxB+g/5c+fX4MGDVJwcLCcnJzUokULRUREaOTIkapfv75atmyZ4nt5FoGBgWrevLnatWun3r1768aNGwoICFCRIkWy/VYd2fvu8VS+vr5q1arVc4/j4eGhgICA5w+ETCMkJEQhISE6f/680VEAAACAJ0rY2++fRYvixYvr1KlTCg8PlxS/9HT+/PlGRUR66dlTql49fg/OlHBwkGrUkN5+O11jDRgwQLt27ZKzs7MGDx6sxo0by9fXVxEREfr8888T9610dnbWmjVrZGNjo06dOmn48OHy9/dP8R61T9OiRQvt2LFDDx48kJ+fn3x8fDRkyBBduXJFdZ9hZqufn586d+6sDz/8ULVr107R/psdO3aUpBQ/mCjBggULVLdu3SSvb775RpLUu3dv/f333/rqq68Sl5B37NhRPXv2VL9+/XTy5EmL8X788Udt2rRJr7/+uhYtWqSRI0dq1KhRT8wwfvx4hYSE6KefflKrVq00ceJEvfXWW1q7dm26FxybNWumb775RhEREWrXrp0mTZqkjz/+WEWKFFG+fPnS9dqZncnMfP0sLTQ09Inf5Bo1aqRt27Y98/h37tyR2Wx+7G8yUsrD4/+xd99RUV3v18D30JsNsSAIRpAiiNhFbGAhNqyUBAtqopGIGlRUYhQLqFHsmq9KswPW2INgB4wNOwYlNkZEiQ0QYRjm/cOf84bYEbgMsz9rzVLunHvvHpYIPPOcc2wxaNAgFjwrmRkzZiA1NfWttn0iIiIioorizz//xNKlS5Gamork5GSMHTsW7u7umDp1KlRUVLBu3TpYWloiOTkZrVu3Rr169RAUFPTWDssknJSUFFhbW5f8Ajk5QM+ewPnzH+7w1NF5Xeg8cAAohc5J+jReXl5ISEjA33//LUhHYmBgIGbNmgWJRFLq64WWt/T0dJibm+Pnn3/+aKH2i7+uKjB2diq4du3aISMj463HmjVrIBKJ4OPjU6LrFhYWQiaToVq1al9c6KTKa+rUqUhKSsKxY8eEjkJERERE9Ja8vDw4OzujXr16WLp0Kfbs2YM//vgDkyZNQteuXTFv3j
xYWloCAJo1awaJRILJkyfDz88PZmZmOHDggMCvgEqFnh4QHw8sXgw0bAjo6r7u4BSJXv+pq/v6+OLFr8ex0FkuTp8+jf/973+Ijo6Gn5+f0k+9/lx5eXkYM2YMduzYgePHjyMiIgLdunWDjo4OvvvuO6HjCYr/khSchoYG6tatW+zx9OlTTJ48GQEBAfJ2cLFYDE9PT9SoUQM1atRAr169cPPmTfl1AgMDYWtri8jISJiZmUFTUxO5ublvTWPv3C9L3UQAACAASURBVLkzfHx8EBAQAAMDA9SuXRuTJk1CUVGRfMyjR4/Qt29faGtrw9TUFOHh4eX3CaFypaOjg5CQEPj6+qKwsFDoOERERERExWzduhW2trYICAhAhw4d0Lt3b6xatQoPHjzA6NGj4ejoCOD1BkVvHmPHjkV6ejr69OmD3r1746effsLLz1nvkSomdXVg9Gjg1i0gNhZYsACYPfv1n4cPvz4+enTJd2+nz+bg4IDJkydj2LBhJW7UUmaqqqp4+PAhxo4di27dusHPzw+NGjXCiRMnPriGsTJgsbOSefbsGfr164dOnTphzpw5AICXL1/CyckJWlpaOH78OJKSkmBoaIiuXbsW+6Z9+/ZtbNmyBdu2bcOlS5egpaX1znts3rwZampqSExMxMqVK7F06VJER0fLn/f29satW7cQFxeH3bt3Y8OGDbhz506Zvm4SzsCBA1G7dm2sXr1a6ChERERERMVIJBJkZGTgxYsX8mNGRkaoXr06zp8/Lz8mEokgEonkuxrHx8fj1q1bsLS0hJOTEzcwqkxEIqBdO2D8eGD69Nd/OjiU2WZE9H4ymQzZ2dkICwsTdPp4YGAgZDKZwk1h19DQwK5du5CRkYGCggI8ffoUe/bsee/u8cqExc5KpKioCN9++y1UVVWxadMm+QK8UVFRkMlkiIiIgJ2dHaysrLBmzRrk5ORg37598vMLCgqwceNGNG/eHLa2tu/9Qm/cuDFmz54NCwsLuLu7w8nJCfHx8QCA1NRUHDx4EGvXroWjoyOaNWuG9evXIy8vr+w/ASQIkUiE5cuXY86cOXj06JHQcYiIiIiI5Dp16oS6deti4cKFEIvFuHr1KrZu3Yr09HQ0atQIwOuCy5uZalKpFCdPnsTQoUPx/Plz7NixA66urkK+BCIi+kyKVbamDwoICEBSUhLOnDmDqlWryo+fP38et2/fRpUqVYqNf/nyJdLS0uQfGxsbo06dOh+9j52dXbGP69WrJy9ypaSkQEVFBa1bt5Y/b2pqinr16pXoNZFisLGxweDBgxEQEIDQ0FCh4xARERERAQCsrKwQERGBMWPGoGXLlqhZsyZevXoFf39/WFpaoqioCCoqKvJGkSVLlmDFihXo2LEjlixZAhMTE8hkMvnzRERU8bHYWUlER0dj0aJF2L9/v/wdyjeKiopgb2//zh2z9fX15X/X1dX9pHup/2cNE5FIJH8n9M20D1I+gYGBsLKywtmzZ9GqVSuh4xARERERAXj9xvyJEydw8eJF3Lt3Dy1atEDt2rUBvN6YVUNDA0+ePEFERARmz54Nb29vLFy4ENra2gDAQicRkYJhsbMSuHjxIkaMGIH58+fDxcXlreebN2+OrVu3wsDAoMx3Vre2tkZRURHOnj2Ldu3aAQDu3buHBw8elOl9SXjVqlVDcHAwxo4di6SkJO6kR0REREQVir29Pezt7QFA3qyhoaEBAJgwYQL279+P6dOnY9y4cdDW1pZ3fRIRkWLh/9wKLisrC/369UPnzp0xePBgPHz48K2Hl5cX6tSpg759++L48eO4ffs2Tpw4gYkTJxbbkb00WFpa4uuvv8bo0aORlJSEixcvwtvbW/6uKFVuw4YNg0gkwoULF4SOQkRERET0Xm+KmHfv3kXHjh2xa9cuzJ49G1OnTpVvRvTfQidnsRERKQZ2diq4/fv34+7du7h79y4MDQ3fOUYmk+HEiROYOnUq3Nzc8Pz5c9SrVw9OTk6oUa
NGqWeKjIzE999/D2dnZxgYGGDmzJncuEZJqKio4OTJkwq3ix0RERERKSdTU1OMGTMGJiYmcHR0BIAPdnT6+vpi7NixsLS0LM+YVIpkMhnS09MhFouRn58PTU1NGBkZwdjYmEsWEFUSIhnfniIiIiIiIiL6oMLCQixcuBCLFy+Gq6srZsyYAVNTU6FjKYWUlBRYW1t/0TWkUimSk5ORkJCA3NxcFBUVQSqVQlVVFSoqKtDV1YWjoyOaNWsGVVXVUkpOVHGVxtdVRcVp7EQkmPz8fKEjEBERERF9EjU1NUybNg03b96EoaEhmjdvjvHjxyMzM1PoaPQRBQUF2LBhA2JjY/Hs2TNIJBJIpVIAr4ugEokEz549Q2xsLDZs2ICCgoIyzxQZGQmRSPTOR1ntteHt7Y0GDRqUybVLSiQSITAwUOgYVMmw2ElE5a6oqAjx8fFYvnw5Hj58KHQcIiIiIqJPVr16dcydOxfXr1+HSCRC48aN8fPPP+Pp06dCR6N3kEql2Lx5M8RiMSQSyQfHSiQSiMVibN68WV4MLWvbtm1DUlJSsUdcXFy53JuosmKxk4jKnYqKCl6+fIljx45hwoQJQschIiIiIvpsderUwdKlS5GcnIzMzExYWFhg3rx5yM3NFToa/UtycjIyMjI+uXgplUqRkZGB5OTkMk72mr29Pdq2bVvs0bJly3K595fgLD2qyFjsJKJy9WZKSJ8+fTBw4EDExMTg8OHDAqciIiIiIioZExMThIaG4tSpU7h06RLMzc2xfPlyFoMqAJlMhoSEhI92dP6XRCJBQkIChNzipKioCJ07d0aDBg3w/Plz+fErV65AW1sbkydPlh9r0KABBg8ejHXr1sHc3BxaWlpo3rw5jh49+tH7ZGRkYOjQoTAwMICmpibs7OywadOmYmPeTLk/ceIE3NzcUL16dbRp00b+/PHjx9GlSxdUqVIFurq6cHFxwdWrV4tdQyqVYvr06TA0NISOjg46d+6Ma9eulfTTQ/RBLHYSUbkoLCwEAGhoaKCwsBATJ06En58fHB0dP/uHDyIiIiKiisbS0hJRUVE4ePAgDh8+DAsLC4SHh8t/Dqbyl56eXuJO29zcXKSnp5dyordJpVIUFhYWexQVFUFFRQWbNm1CdnY2Ro8eDQDIy8uDp6cnbGxsEBQUVOw6x48fx+LFixEUFISoqChoamqiR48e+Ouvv95779zcXHTq1AkHDx5EcHAwdu/ejSZNmmDIkCFYu3btW+O9vLzw1VdfYfv27Zg/fz4AYP/+/ejSpQv09PSwadMmbNmyBdnZ2ejQoQPu378vPzcwMBDBwcHw8vLC7t270b17d7i6upbGp5DoLWpCB6CyER0djXXr1nGtDxJUWloaioqK0KhRI6ipvf7vZv369QgICICWlhZ++eUXuLq6wszMTOCkRERERESlw97eHnv37kViYiICAgKwYMECzJkzB4MGDYKKCvuNSsuhQ4c+uv7/ixcvStxYIZFIsGvXLlStWvW9Y+rWrYuvv/66RNd/w8rK6q1jvXr1wr59+2BsbIzQ0FAMGDAALi4uSEpKwt27d3HhwgVoaGgUOyczMxMJCQkwMTEBAHTp0gWmpqaYO3cuNm7c+M57R0RE4ObNmzh69Cg6d+4MAOjRowcyMzMxffp0jBw5stjO9IMGDcKvv/5a7Brjx49Hp06d8Pvvv8uPOTk5oWHDhggJCcHSpUvx9OlTLFmyBKNGjcKiRYsAAN27d4eqqiqmTp36+Z80oo9gsbOSCgsLw8iRI4WOQUpu8+bN2Lp1K1JSUpCcnAxfX19cvXoV3377LYYNG4amTZtCS0tL6JhERERERKWuXbt2OHr0KOLi4hAQEIDg4GAEBQWhZ8+eEIlEQsdTCkVFRYKe/yl27doFY2PjYsf+vRt7//79MXr0aIwZMwb5+fkIDw+HhYXFW9dp27atvNAJAFWqVEGvXr2QlJT03nufOHECRkZG8kLnG4MHD8bw4cNx/fp1NG
nSpFiWf7t58ybS0tIQEBBQrINZR0cHDg4OOHHiBIDXU+9zc3Ph7u5e7HxPT08WO6lMsNhZCb18+RIFBQXo16+f0FFIyU2bNg0hISFo0aIFbt68iXbt2mHDhg1o37499PX1i4199uwZLl26hE6dOgmUloiIiIiodIlEInTr1g1du3bF7t27MWXKFAQHByM4OJg/936hT+moPH36NOLi4kq0s7qqqqp8w6CyZGtrC3Nz8w+OGTZsGNasWYPatWvj22+/feeYOnXqvPOYWCx+73WfPHkCQ0PDt47XrVtX/vy//Xfso0ePAAAjR458Z7PVm+JrRkbGOzO+KzNRaWAPfSWkra2No0ePQltbW+gopOTU1dWxevVqJCcnY8qUKVizZg1cXV3fKnQeOnQIP/30EwYMGID4+HiB0hIRERERlQ2RSIT+/fvj0qVLGDNmDIYPHw4XFxecO3dO6GiVmpGRUYmXDlBRUYGRkVEpJ/p8L1++xIgRI2Bra4vnz5+/txMyMzPzncc+9Br09fXfuRTAm2M1a9Ysdvy/Hclvnp83bx7Onj371mPv3r0A/n+R9L8Z35WZqDSw2FkJiUQiTougCsPLywuNGzdGamoqTE1NAUC+q+HDhw8xe/Zs/Pzzz/jnn39ga2uLoUOHChmXiIiIiKjMqKqqYvDgwbhx4wb69++Pvn37YuDAgbh+/brQ0SolY2Nj6OrqluhcPT29t6aXC2H8+PEQi8X4/fff8euvv2LZsmU4dOjQW+NOnz5dbEOg7Oxs7N+/Hw4ODu+9dqdOnZCeno6EhIRix7ds2YLatWvD2tr6g9ksLS3RoEEDXLt2DS1btnzrYWdnBwCws7ODrq4uYmJiip0fFRX10ddPVBKcxk5EZS48PByjR4+GWCyGkZGRvBhfVFQEqVSK1NRUREZGokmTJrC0tERgYCACAwOFDU1EREREVEY0NDTwww8/YNiwYVi1ahWcnJzg4uKCwMBANGzYUOh4lYZIJIKjoyNiY2M/a6MidXV1tGvXrlyaiC5evIisrKy3jrds2RK///47QkNDsXHjRjRs2BDjxo1DbGwsvL29cfnyZdSuXVs+vk6dOujevTsCAwOhqamJBQsWIDc3F7/88st77+3t7Y1ly5ZhwIABCAoKgrGxMTZv3ozDhw9jzZo1xTYneheRSIRVq1ahb9++KCgogLu7OwwMDJCZmYnExESYmJjAz88P1atXx08//YSgoCBUqVIF3bt3x9mzZxEWFlbyTxzRB7Czk4jKXOvWrbF9+3ZUrVpVvkg1ANSrVw9jx45Fq1atEB0dDQBYtGgRgoKC8PTpU6HiEhERERGVC21tbUyaNAk3b96EmZkZWrVqBR8fHzx48EDoaJVGs2bNYGho+NHC3RuqqqowNDREs2bNyjjZa25ubnBwcHjrkZGRge+//x5eXl4YPHiwfHxERAREIhG8vb3lM+aA112aEydOREBAADw8PPDq1SscPHjwnZsZvaGrq4vjx4+je/fumDp1Kvr27YtLly5h48aNGDVq1Cfl79mzJ06cOIHc3Fx89913cHFxgb+/Px4+fFisqzQwMBABAQHYuHEjXF1dERsbK5/mTlTaRLJ/f3UQEZURmUyG7777DlKpFKGhoVBVVZW/UxoVFYWQkBAcOHAAtWrVgp+fH3r27ImuXbsKnJqIiIiIqPxkZWVhwYIFCA8Px8iRIzFlypS31k1URikpKR+dUv0hBQUF2Lx5MzIyMj7Y4amurg5DQ0N4eXlBQ0OjxPcrbw0aNED79u2xadMmoaOQAvnSr6uKjJ2dCkomk4F1alIkIpEILVu2xJkzZ1BYWAiRSCTfFfHRo0eQyWTQ09MDAISEhLDQSURERERKx8DAAAsXLsTly5eRnZ0NS0tLzJo1Cy9evBA6mkLT0NDA0KFD0b17d1SvXh3q6uryTk9VVVWoq6ujRo0a6N69O4YOHapQhU4iehs7OysJmUwGkUgk/5OoojI3N8eQIUPg6+sLfX19iM
Vi9OnTB/r6+jh06BDU1LiUMBERERERAKSlpSEwMBCxsbHw9/eHj48PtLW1hY5V7kqzA00mkyE9PR1isRgFBQXQ0NCAkZERjI2NFfZ3aXZ2UklU5s5OFjsV0Lx58/Ds2TMsWLBA6ChEny0hIQFjxoyBrq4u6tevj9OnT8PIyAiRkZGwtLSUj5NKpUhMTESdOnU+uM4MEREREVFld/XqVcyYMQNnzpzBL7/8ghEjRkBdXV3oWOWmMhdliIRSmb+uOI1dAa1cuRLm5ubyj/fv34/ffvsNS5YswdGjR1FYWChgOqIPc3R0RGhoKBwcHPD48WOMGDECixcvhoWFRbGlGW7fvo3Nmzdj6tSpKCgoEDAxEREREZGwbG1tsXPnTuzatQs7duyAtbU1Nm3aJF8WioiI/j92diqYpKQkdOnSBU+ePIGamhomTZqEDRs2QFtbGwYGBlBTU8PMmTPh6uoqdFSiT1JUVAQVlXe/73Ls2DH4+fmhZcuWWLt2bTknIyIiIiKqmI4ePYqff/4ZL168wNy5c9G3b1+FnYL9KSpzBxqRUCrz1xU7OxXMwoUL4enpCS0tLcTExODo0aNYtWoVxGIxNm/ejEaNGsHLywsPHz4UOirRBxUVFQGAvND53/ddpFIpHj58iNu3b2Pv3r1clJ2IiIiI6P84OTkhISEBCxYsQGBgINq2bYu4uDhuYktEBBY7FU5iYiIuXbqEPXv2YMWKFRg6dCi++eYbAK+nNsyfPx9fffUVLly4IHBSog97U+TMzMwEgGLvRJ8/fx59+vSBl5cXPDw8cO7cOVStWlWQnEREREREFZFIJEKvXr1w4cIF+Pn5wcfHB126dEFSUpLQ0YiIBMVipwLJycmBn58fLC0t4e/vj1u3bsHe3l7+vFQqRd26daGiosJ1O0kh3LlzBz4+Prh58yYAQCwWY+LEiXB0dMTz589x6tQp/O9//4ORkZHASYmIiIiIKiYVFRV4eHjg+vXr8mYBV1dXXL58WehoRESC4JqdCuT69eto3LgxxGIxzpw5gzt37qBbt26wtbWVjzlx4gR69uyJnJwcAZMSfbrWrVvDwMAAgwYNQmBgICQSCebOnYuRI0cKHY2IiIiISOG8evUKa9euRXBwMJycnDBr1ixYWFgIHeuLlObagjKZDEnpSTgjPoPs/GxU0ayC1kat4WDsUKnXPSX6r8q8ZieLnQri/v37aNWqFVasWAE3NzcAgEQiAQCoq6sDAC5evIjAwEBUr14dkZGRQkUl+ixpaWnyndj9/Pwwffp0VK9eXehYREREREQKLScnB8uXL8eSJUvQr18/zJgxA/Xr1xc6VomURlFGIpUgLDkMvyb8ike5jyApkkAilUBdVR3qKuqorVsb/o7+GNlsJNRV1UspOVHFVZmLnZzGriAWLlyIR48ewdvbG3PmzEF2djbU1dWL7WJ948YNiEQiTJs2TcCkRJ/HzMwM06ZNg4mJCYKDg1noJCIiIiIqBXp6eggICEBqaipq1aoFe3t7/PTTT3j06JHQ0cpdTkEOnDc4Y2LsRNx+dhu5klwUSAsggwwF0gLkSnJx+9ltTIydiC4buiCnoGxnSkZGRkIkEr3zERcXBwCIi4uDSCTCqVOnyizH4MGDYW5u/tFxDx8+hK+vLywsLKCtrQ0DAwO0aNEC48ePlzdhfapbt25BJBJh06ZNn533yJEjCAwMLNVrUuXEYqeCiIiIQHx8PAIDA7Fu3Tps2LABAKCqqiof4+npiR07dsDS0lKomEQlMnfuXKSnp8v/XRMRERERUemoUaMGgoODce3aNUilUlhbW+OXX37Bs2fPhI5WLiRSCXps7oGz4rN4KXn5wbEvJS9xRnwGPTf3hET6eUW8kti2bRuSkpKKPVq3bg3g9XJfSUlJaNq0aZnn+JBnz56hdevWOHjwIPz8/HDgwAGsWbMGPXr0wJ49e5Cfn19uWY4cOYJZs2a9dbx+/fpISkrC119/XW5ZqGJTEzoAfdzOnT
uhq6sLJycnNG3aFJmZmRg3bhwuX76MOXPmoHbt2igsLIRIJCpW/CRSJMeOHUN+fj5kMhnXyiEiIiIiKmV169bF8uXLMXHiRMyePRsWFhbw8/ODr68vdHV1hY5XZsKSw3Ah4wLypZ9WlMuX5uN8xnmEJ4djdMvRZZrN3t7+vZ2VVatWRdu2bcv0/p8iJiYG9+/fx9WrV2FjYyM/PnDgQMyZM6dC/O6mqalZIT5XVHGws1MBLF68GN7e3gAAfX19LFq0CKtXr8Yff/yBhQsXAgDU1NRY6CSF1r59e3Tp0qVCfLMkIiIiIqqsTE1NERYWhhMnTiA5ORmNGjXCypUry7VDr7zIZDL8mvDrRzs6/+ul5CV+TfgVQm5x8q5p7O3bt0fnzp0RGxuLZs2aQUdHB7a2ttizZ0+xc1NTUzF48GA0aNAA2traMDMzw48//liibt4nT54AeF0s/6///u5WUFCAgIAAmJqaQkNDAw0aNMCMGTM+OtW9ffv26Nq161vHjY2N8d133wEApk+fjqCgIPl9RSIR1NRe9++9bxr7+vXrYWdnB01NTdSqVQvDhg1DZmbmW/fw9vbG5s2bYWVlBV1dXbRq1QqJiYkfzEwVG4udFdyLFy+QlJSEUaNGAQCkUikAYOTIkfD398eqVavQp08f3LlzR8CUREREREREpEisrKwQHR2N/fv34+DBg7C0tERkZCQKCws/+RovXrzA7t27sWfPHvlj586dSEtLK8Pkny4pPQmPcku2RmlmbiaS0pNKOVFxUqkUhYWF8seb3/c/JDU1FX5+fpg0aRJ27tyJOnXqYODAgbh9+7Z8jFgshqmpKZYtW4Y//vgDP//8M/744w/07t37szO+mVbv7u6O2NhY5Obmvnfs4MGDsXDhQgwfPhz79u3D0KFDERwcjJEjR372ff/rhx9+kDeBvZnyn5CQ8N7xq1evhre3N5o0aYLdu3cjKCgI+/fvR+fOnfHyZfHi99GjR7F8+XIEBQUhKioKBQUF6N27N168ePHFuUkYnMZewVWtWhWPHz+Gvr4+gP+/Rqeamhp8fHxQq1Yt+Pv7Y9y4cYiKioKOjo6QcYlKzZt3UdnpSURERERUdpo1a4b9+/cjISEBAQEBWLBgAWbPno2BAwcW2xD33+7cuYNz586hSpUq6NWrF9TVi+9efuHCBWzfvh1GRkZwcHAok9wTDk3AxYcXPzgm/UX6Z3d1vvFS8hJDdw2FcVXj946xr2uPpV8vLdH1gdcF539zdHT86IZEWVlZOHXqFBo2bAgAaNq0KerVq4dt27bB398fAODk5AQnJyf5Oe3atUPDhg3h5OSEK1euoEmTJp+c0dnZGTNmzEBwcDCOHDkCVVVVNGvWDH369MGECRNQtWpVAMClS5ewbds2zJkzB9OnTwcAdO/eHSoqKpg1axamTp2Kxo0bf/J9/8vY2BhGRkYA8NEp64WFhZg5cya6dOmCzZs3y49bWFjAyckJkZGR8PHxkR/PyclBbGwsqlWrBgCoVasWHBwccOjQIbi7u5c4MwmHnZ0K4E2h813c3NywePFiZGVlsdBJlUpRURFatWqFI0eOCB2FiIiIiKjSc3R0xLFjx7Bs2TIsWLAALVu2xMGDB9+ayn3hwgWkpaVh0KBBcHFxeavQCQDNmzfHoEGDYGBggF27dpXXS3iLtEgKGUo2FV0GGaRFH++0/BK7du3C2bNn5Y+wsLCPnmNlZSUvdAKAoaEhDAwMcO/ePfmx/Px8zJ07F1ZWVtDW1oa6urq8+PnXX399ds5Zs2bh7t27WLduHQYPHozHjx9j5syZsLW1xePHjwEAx48fB/C6u/Pf3nz85vnycP36dWRlZb2VpXPnzjAyMnori6Ojo7zQCUBeDP7355QUCzs7K4H+/fujc+fOQscgKlWqqqoICAjAuHHjkJyc/M4fooiIiIiIqPSIRCJ0794d3bp1w65duzBx4kQEBwcjODgYHTp0wLVr15Cbm4suXbp80vUaNW
oEXV1d7N27F3369CnVrJ/SUbn09FJMiZuCAmnBZ19fU1UTE9pOwPi240sS75PY2tq+d4Oi93lXM5SmpiZevXol/9jf3x+//fYbAgMD0bZtW1SpUgV3796Fm5tbsXGfo169evjuu+/ka2guW7YMEyZMQEhICObPny9f29PQ0LDYeW/W+nzzfHl4X5Y3ef6b5b+fU01NTQAo8eeKhMfOzkqiRo0aQkcgKnX9+/eHoaEhVq9eLXQUIiIiIiKlIRKJMGDAAFy5cgXff/89hg4diq+//hqnT59Ghw4dPuta9erVg7GxMVJSUsoo7fu1NmoNdZWSNU2oqaihlVGrUk5UPqKiojBixAgEBATA2dkZrVq1Kta5WBrGjx+PqlWr4vr16wD+f8Hw4cOHxca9+bhmzZrvvZaWlhYKCooXpGUyGZ4+fVqibO/L8ubYh7JQ5cBip4IRcjc4ovImEomwfPlyzJ07F48elWxhcSIiIiIiKhlVVVUMHToUf/31F5o3b46ePXuW6DrNmjWTF8XKk4OxA2rr1i7RuXX06sDBuGzWGy1reXl5b82Mi4iIKNG1MjIy3rlxUnp6OrKzs+Xdk506dQLwutD6b2/WzOzYseN772Fqaoq//vqr2OZYR48efWsjoTcdl3l5eR/M3LhxYxgYGLyV5fjx4xCLxfKsVHmx2KlAbt68iZCQEBY8SalYW1tj6NChmDZtmtBRiIiIiIiUkoaGBlq0aPHOacGfSldXFzk5OaWY6uNEIhH8Hf2ho/55+1voqOvAv52/wm6W6uLigvDwcPz222+IjY3F999/jzNnzpToWuvXr0fDhg0xa9YsHDx4EMeOHcPatWvh7OwMLS0t+UY/TZs2hZubG3755RfMmTMHhw8fRmBgIObOnYshQ4Z8cHMiT09PPHr0CCNGjEBcXBzWrFmDH3/8EVWqVCk27s01Fi1ahD///BPnz59/5/XU1NQwa9YsHDp0CMOGDcOhQ4cQGhoKNzc3WFlZYdiwYSX6XJDiYLFTgYSHhyMjI0Nh/8MlKqmZM2fi4MGDJf4GTUREREREJZebmyvfdbuknJ2dceLEiVJK9OlGNhuJ5obNoamq+UnjNVU10cKwBUY0G1HGycrO6tWr0atXL0ybNg0eHh549epVsV3JP0efPn3Qv39/7Nq1C15eXujWrRsCAwNhb2+PxMRENG3aVD522Ua2lgAAIABJREFU06ZNmDRpEkJDQ9GzZ09ERkZi2rRpH914qVu3bli1ahUSExPRp08fbNy4EVu2bHnr31zfvn0xevRoLF++HA4ODmjTps17r+nj44PIyEgkJyejb9++mDp1Knr06IFjx45xc2clIJKxTVAhFBYWwsTEBHFxcR98R4Soslq/fj1WrVqF06dPQ0WF79MQEREREZWXu3fv4vnz57Czs/ui65R0o6KUlBRYW1uX+L45BTnoubknzmecx0vJy/eO01HXQQvDFjjgdQB6Gnolvh+RIvjSr6uKjBUDBXHo0CGYmpqy0ElKa8iQIVBVVUVkZKTQUYiIiIiIlEphYSFUVVW/+DpC9Vrpaeghfmg8FndfjIbVG0JXXReaqpoQQQRNVU3oquuiYY2GWNx9MeKHxrPQSaTg1IQOQJ8mLCwMI0eOFDoGkWBUVFSwcuVK9O7dGwMGDED16tWFjkREREREpBT09fVx5cqVL7qG0JNK1VXVMbrlaIxqMQpJ6Uk4Kz6L7IJsVNGogtZGrdHWuC2XjCOqJDiNXQFkZmbC0tIS9+7d++J1UogU3ahRo6Cjo4OlS5cKHYWIiIiISGns2LEDAwcOLPH5iYmJaNCgAerVq/fZ51bm6bZEQqnMX1ecxq4ANm7ciP79+7PQSQQgKCgIW7ZswdWrV4WOQkRERESkNLS0tJCXl1fi8x88eFCiQicR0edisbOCk8lknMJO9C+1atXCjBkzMG7cOMGnwhARERERKYsuXbogLi6uROeKxWIYGhqWciIiondjsbOCS0pKQlFRERwdHY
WOQlRh/PDDD8jKysL27duFjkJEREREpBS0tLSgp6eH1NTUzzrv1atXiIuLQ7t27b7o/mx0ICo9lf3ricXOCi4sLAwjRozgQslE/6KmpoYVK1Zg4sSJyM3NFToOEREREZFScHJyQlpaGlJSUj5pfHZ2NrZu3Ypvv/32i36nVVdX/6Ip9ERUXF5eHtTV1YWOUWa4QVEFlpOTg/r16yMlJQV169YVOg5RhfPNN9/AzMwMc+fOFToKEREREZHSSExMhFgsRps2bWBiYvLW87m5uVi9ejWMjIzg6ekJFZUv67N68eIFMjMzYWRkBG1tbTYDEZWQTCZDXl4exGIx6tSpU2n3hlETOgC9X0xMDDp27MhCJ9F7LFy4EE2bNsXw4cNhZmYmdBwiIiIiIqXQrl07yGQynD17FmfOnIGGhob8ucLCQmhra+PGjRt4+vTpFxc6AcgLMg8ePIBEIvni6xEpM3V19Upd6ATY2VmhOTo6YsqUKXB1dRU6ClGFNW/ePCQlJWHPnj1CRyEiIiIiov9z7949NGvWDCkpKahdu7bQcYhIibDYWUGlpKTA2dkZ9+7dq9TrKBB9qfz8fNja2mL58uXo0aOH0HGIiIiIiOj/+Pr6QkNDAyEhIUJHISIlwmJnBeXv7w+RSIQFCxYIHYWowtu/fz9++uknXLlyBZqamkLHISIiIiIiABkZGbCxscHVq1dRr149oeMQkZJgsbMCkkgkqF+/Po4fPw5LS0uh4xAphN69e6NDhw6YMmWK0FGIiIiIiOj/TJo0Ca9evcLKlSuFjkJESoLFzgpo9+7dCAkJwcmTJ4WOQqQwbt26hbZt2+LSpUswMjISOg4REREREQF4/PgxrKyscOHCBZiamgodh4iUwJdvi0alLiwsDCNGjBA6BpFCMTc3x6hRo+Dv7y90FCIiIiIi+j+1atXCDz/8gLlz5wodhYiUBDs7K5gHDx7AxsYG9+/fh56entBxiBRKTk4OrK2tsWXLFnTo0EHoOEREREREBODJkyewsLDA6dOnYW5uLnQcIqrk2NlZwWzYsAGDBg1ioZOoBPT09LBw4UL4+vpCKpUKHYeIiIiIiADo6+tj3LhxmD17ttBRiEgJsLOzApHJZLC0tMSGDRvQtm1boeMQKSSZTAYnJye4u7vDx8dH6DhEREREREREVI7Y2VmBnDx5EmpqamjTpo3QUYgUlkgkwvLlyxEYGIisrCyh4xARERERERFROWKxswIJDw/HyJEjIRKJhI5CpNDs7Ozg4eGB6dOnCx2FiIiIiIiIiMoRp7FXEC9evICJiQlSU1NRu3ZtoeMQKbynT5/C2toaBw4cQPPmzYWOQ0RERERERETlgJ2dFURUVBS6dOnCQidRKalRowbmzJkDX19f8D0dIiIiIiIiIuXAYmcFER4ejhEjRggdg6hSGTFiBPLz87Fp0yahoxARERERKb3AwEDY2toKHYOIKjlOY68Arl27hu7du+Pu3btQU1MTOg5RpXL69GkMHDgQKSkpqFq1qtBxiIiIiIgUire3N7KysrBv374vvlZOTg7y8/NRs2bNUkhGRPRu7OysAMLCwuDt7c1CJ1EZaNu2Lbp164Y5c+YIHYWIiIiISKnp6emx0ElEZY7FToEVFBRg06ZNGD58uNBRiCqt+fPnIyIiAjdu3BA6ChERERGRwjp79iy6d+8OAwMDVK1aFe3bt0dSUlKxMWvWrIGFhQW0tLRQq1YtuLi4oLCwEACnsRNR+WCxU2B79+5F48aNYW5uLnQUokqrbt26CAgIwPjx47lZERERERFRCWVnZ2PIkCE4efIkzpw5A3t7e/Ts2RNZWVkAgHPnzuHHH3/EzJkz8ddffyEuLg5ff/21wKmJSNmw2CmwsLAwjBw5UugYRJWer68v7t+/j99//13oKERERERECsnZ2RlDhgyBtbU1rKyssGLFCmhpaeHQoUMAgHv37kFXVxeurq4wNTVF06ZN8dNPP3HJNiIqVyx2Cig9PV2+eQ
oRlS11dXUsX74cfn5+yMvLEzoOEREREZHCefToEUaPHg0LCwtUq1YNVapUwaNHj3Dv3j0AQLdu3WBqaoqvvvoKXl5eWL9+PbKzswVOTUTKhsVOAUVGRsLd3R06OjpCRyFSCl27dkXz5s2xcOFCoaMQERERESmcYcOG4ezZs1iyZAkSExNx8eJFGBsbo6CgAABQpUoVXLhwATExMTAxMcG8efNgZWWFBw8eCJyciJQJi53lRCKR4NGjR3jw4AHy8vJQVFSEiIgITmEnKmchISFYvnw57t69K3QUIiIiIiKFcurUKfj6+qJXr16wsbFBlSpVkJGRUWyMmpoanJ2dMW/ePFy+fBm5ubnYt2/fJ12/qKioLGITkZLhwhllSCaT4fTp0xCLxdDW1kbNmjWhpqaGq1ev4vbt26hbty7s7OyEjkmkVExNTTFu3DhMnDgR27dvFzoOEREREZHCsLCwwKZNm9CmTRvk5ubC398fGhoa8uf37duHtLQ0dOzYEfr6+jh69Ciys7NhbW39Sdfftm0bPDw8yio+ESkJFjvLyM2bN3Hu3Dm0b98eDg4O7xzz7bff4uDBg9DX10fHjh3LOSGR8po8eTJsbGwQHx+PLl26CB2HiIiIiEghhIeHY9SoUWjRogXq1auHwMBAPH78WP589erVsXv3bsyePRsvX76EmZkZQkND0aFDh0+6/syZMzFw4EBuaEREX0Qkk8lkQoeobK5evYrMzMxPLqLcuHED9+7dQ/fu3cs4GRG9sXv3bgQEBODSpUtQV1cXOg4RERERkdLr2LEjvvvuOwwdOlToKESkwLhmZykTi8W4f//+Z3WLWVlZwcjICElJSWWYjIj+rW/fvqhfvz5WrlwpdBQiIiIiIgIwd+5cBAYGQiKRCB2FiBQYi52l7PTp0+jRo8dnn2djY4MHDx6AjbZE5UMkEmHZsmUIDg5GZmam0HGIiIiIiJRex44dYWZmhoiICKGjEJECY7GzFOXm5kJbW7vE57ds2RJnz54txURE9CFWVlbw9vbG1KlThY5CREREREQA5syZg7lz5+LVq1dCRyEiBcViZyk6cuTIF212Ympqirt375ZiIiL6mF9++QWxsbE4ffq00FGIiIiIiJRe27ZtYWdnh3Xr1gkdhYgUFIudpUgmk0FTU/OLrqGlpVVKaYjoU1StWhXz58+Hr68vioqKhI5DRERERKT0Zs+ejXnz5uHly5dCRyEiBcRiZwXDNTuJyt/gwYOhoaGB8PBwoaMQERERESm95s2bw8HBAatXrxY6ChEpIBY7S5FIJKoQ1yCizyMSibBixQpMnz4dT58+FToOEREREZHSmzVrFhYuXIjs7GyhoxCRgmGxsxQVFhZ+8TW4CDORMJo3b45+/fph5syZQkchIiIiIlJ6tra26NKlC5YvXy50FCJSMCIZ502XmrS0NLx48QLNmjUr0fmvXr1CmzZtYGNjA09PT7i4uHzxGqBE9On++ecfWFtbIz4+Hk2aNBE6DhERERGRUktNTYWjoyNu3ryJ6tWrCx2HiBQEOztLkZmZGdLS0kp8fnx8PPbs2YMOHTogJCQEhoaG8Pb2xqFDhyCRSEoxKRG9S82aNREYGAhfX1+un0tEREREJDALCwv07t0bixcvFjoKESkQFjtLmaGhYYkKnnl5ecjLy4OpqSnGjBmD48eP48qVK2jWrBlmzZqFevXqYdSoUYiPj4dUKi2D5EQEAKNHj8azZ88QExMjdBQiIiIiIqU3Y8YMrFq1CllZWUJHISIFwWnsZWDHjh1o37496tSp80njJRIJNm3ahCFDhkBNTe2dY+7evYuYmBhER0cjPT0dgwYNgoeHBxwdHaGiwpo1UWk6efIkvLy8kJKSAl1dXaHjEBEREREptTFjxqBq1apYsGCB0FGISAGw2FkGZDIZfv/9dzRq1Ag2NjYfHJuVlYW9e/fim2++gZaW1idd/9atW4iOjkZ0dDSePHkCd3d3eHh4oHXr1tzNnaiUeH
l5oUGDBggKChI6ChERERGRUktPT0fTpk1x7do11K1bV+g4RFTBsdhZhi5fvozU1FRUr14dnTt3Lta1ef78edy5cwf6+vro1KlTibszr1+/Li985ufnw8PDAx4eHrC3t2fhk+gLiMViNG3aFKdPn4a5ubnQcYiIiIiIlNqECRMAAEuXLhU4CRFVdCx2loNnz57h5MmTyM7ORmhoKCZMmIAmTZrgq6++KrV7yGQyXL58GVFRUYiOjoaamho8PT3h4eHx0e5SInq3BQsW4NSpU9i7d6/QUYiIiIiIlNrDhw9hY2ODS5cuwdjYWOg4RFSBsdhZjp4/fw4TExM8f/68TO8jk8lw7tw5REVFISYmBtWqVZN3fFpYWJTpvYkqk/z8fDRp0gRLly5Fz549hY5DRERERKTUpkyZghcvXuC3334TOgoRVWAsdpaj/Px8VK1aFfn5+eV2z6KiIiQlJSE6Ohrbtm2DoaGhvPDZoEGDcstBpKgOHjyIcePG4erVq9DU1BQ6DhERERGR0srKyoKlpSXOnTtXqjMliahyYbGzHMlkMqiqqkIikUBVVbXc7y+VSnHixAlER0djx44dMDMzg4eHB9zc3DgNgOgDXF1d0a5dO0ydOlXoKERERERESm3GjBlIT09HeHi40FGIqIJisbOcaWtr459//oGOjo6gOSQSCY4cOYLo6Gjs3r0btra28PDwwKBBg1CnTh1BsxFVNGlpaWjTpg0uXboEIyMjoeMQERERESmtZ8+eoVGjRkhISOAybUT0Tix2ljN9fX3cunUL+vr6QkeRy8/PR2xsLKKjo7Fv3z60bNkSHh4eGDBgAGrWrCl0PKIKYfr06fj777+xZcsWoaMQERERESm1oKAgXL9+HZs3bxY6ChFVQCx2lrN69erh7NmzFbY7LC8vDwcOHEB0dDT++OMPtGvXDp6enujXrx+qVasmdDwiweTm5sLa2hqbNm1Cx44dhY5DRERERKS0srOzYW5ujvj4eNja2godh4gqGBWhAygbLS0tvHr1SugY76WtrY2BAwciJiYGYrEYw4YNw65du2BiYoK+ffti69atyMnJETomUbnT1dXFokWL4Ovri8LCQqHjEBEREREprSpVqmDy5MkIDAwUOgoRVUAsdpYzbW3tCl3s/Dc9PT14enpi9+7duHfvHgYOHIiNGzfCyMgIbm5u2L59O/Ly8oSOSVRu3NzcULNmTaxZs0boKERERERESs3HxweJiYlITk4WOgoRVTCcxk6f7Z9//sGuXbsQFRWFc+fOoVevXvDw8ICLiws0NTWFjkdUpq5evQpnZ2dcv34dBgYGQschIiIiIlJaK1asQGxsLPbu3St0FCKqQFjspC+SmZmJHTt2IDo6GleuXEHfvn3h4eGBLl26QF1dXeh4RGVi/PjxePXqFTs8iYiIiIgElJ+fj0aNGiEmJgZt27YVOg4RVRAsdlKpEYvF2LZtG6Kjo3Hr1i0MGDAAHh4e6NSpE1RVVYWOR1Rqnj17BisrK+zbtw8tW7YUOg4RERERkdJau3Yttm/fjtjYWKGjEFEFwWInlYk7d+4gJiYG0dHREIvFcHNzg4eHB9q1awcVFS4VS4ovLCwMoaGhSEhI4L9pIiIiIiKBSCQSWFlZISIiAh07dhQ6DhFVACx2Upm7efMmoqOjER0djWfPnsHNzQ2enp5o1aoVRCKR0PGISqSoqAht27bFjz/+iGHDhgkdh4iIiIhIaa1fvx5hYWE4fvw4f8ckIhY7FUHv3r1hYGCAyMhIoaN8sWvXrskLnxKJBO7u7vDw8IC9vT2/KZHC+fPPP9G/f3+kpKSgWrVqQschIiIiIlJKhYWFsLW1xYoVK9CtWzeh4xCRwDj38gskJydDVVUVjo6OQkdRGDY2Npg9ezZu3LiBnTt3AgAGDBgAKysrzJgxA9evXxc4IdGna9OmDb7++mvMnj1b6ChEREREREpLTU0NgYGB+OWXX8B+LiJisfMLrFu3Dj4+Prh69S
pSUlI+OFYikZRTKsUgEolgb2+P+fPn4++//8bGjRuRm5uL7t27o0mTJpg7dy5u3rwpdEyij5o3bx42bNjw0f8DiIiIiIio7Li7uyM3Nxf79+8XOgoRCYzFzhLKy8vDli1b8P3332PQoEEICwuTP3fnzh2IRCJs3boVzs7O0NbWxpo1a/DPP//gm2++gbGxMbS1tWFjY4OIiIhi13358iW8vb2hp6eHOnXqIDg4uLxfWrkTiURo3bo1QkJCcO/ePfz222/IzMxEhw4d0KJFC/z666+4c+eO0DGJ3qlOnTr4+eefMW7cOL6LTEREREQkEBUVFcyePRszZsxAUVGR0HGISEAsdpbQ9u3bYWpqCjs7OwwZMgQbNmx4q3tz2rRp8PHxwfXr19GvXz+8evUKzZs3x759+3Dt2jWMHz8eo0ePRnx8vPycSZMm4fDhw9ixYwfi4+ORnJyMEydOlPfLE4yKigrat2+PFStWQCwWY+HChUhLS0OrVq3Qtm1bLF26FGKxWOiYRMX8+OOPePDgAXbt2iV0FCIiIiIipdWvXz+IRCL+XE6k5LhBUQl16tQJffr0waRJkyCTyfDVV18hJCQEAwcOxJ07d/DVV19h0aJFmDhx4gev4+npCT09PYSGhiInJwc1a9ZEeHg4vLy8AAA5OTkwNjZGv379KsUGRSUlkUhw5MgRREVF4ffff4etrS08PDwwaNAg1KlTR+h4RDhy5AhGjBiB69evQ0dHR+g4RERERERK6cCBA5g8eTIuX74MVVVVoeMQkQDY2VkCt27dQkJCAr799lsAr6dhe3l5ITQ0tNi4li1bFvtYKpUiKCgIdnZ2qFmzJvT09LBz507cu3cPAJCWloaCggI4ODjIz9HT00OTJk3K+BVVfOrq6nBxcUFERAQyMjIwadIkJCYmwtLSEl27dkVoaCiePHkidExSYs7OzmjVqhV+/fVXoaMQERERESmtHj16oFq1aoiOjhY6ChEJRE3oAIooNDQUUqkUJiYm8mNvGmTv378vP6arq1vsvEWLFiEkJATLli1DkyZNoKenh4CAADx69KjYNejDNDU14erqCldXV+Tl5eHAgQOIiorCxIkT4ejoCA8PD/Tr1w/VqlUTOiopmZCQEDRr1gze3t5o0KCB0HGIiIiIiJSOSCTCnDlzMGbMGLi7u0NNjWUPImXDzs7PVFhYiPXr12PevHm4ePGi/HHp0iXY2dm9teHQv506dQp9+vTBkCFDYG9vDzMzM6SmpsqfNzc3h7q6Ok6fPi0/lpubi6tXr5bpa1Jk2traGDhwILZt2waxWIwhQ4Zg165dMDExQb9+/bB161bk5OQIHZOUhImJCSZMmAA/Pz+hoxARERERKS1nZ2cYGRlh48aNQkchIgGw2PmZ9u/fj6ysLHz//fewtbUt9vD09ER4ePh7d36zsLBAfHw8Tp06hRs3bmDs2LG4ffu2/Hk9PT2MHDkSU6ZMweHDh3Ht2jWMGDECUqm0vF6eQtPT08M333yD3bt34+7du+jfvz82btwIIyMjuLu7Y8eOHcjLyxM6JlVykydPxsWLF3H48GGhoxARERERKaU33Z2zZ89GQUGB0HGIqJyx2PmZwsLC4OTkhJo1a771nJubG+7evYu4uLh3njt9+nS0bt0aPXr0QMeOHaGrqyvfiOiNRYsWwcnJCf3794eTkxNsbW3RsWPHMnktlVn16tUxbNgwHDhwAH///Te6deuG3377DYaGhhg8eDD27t2L/Px8oWNSJaSlpYUlS5Zg3Lhx/MGKiIiIiEgg7du3h6WlJcLDw4WOQkTljLuxk1LJzMzE9u3bER0djatXr6Jv377w9PSEs7Mz1NXVhY5HlYRMJkOPHj3QrVs3TJw4Ueg4RERERERK6ezZs+jfvz9u3boFLS0toeMQUTlhsZOUVnp6OrZt24bo6GikpaVhwIAB8PT0RMeOHaGqqip0PFJwf/31FxwdHXHlyhUYGhoKHYeIiIiISCn17dsXzs
7OGD9+vNBRiKicsNhJBODOnTuIiYlBVFQUMjIyMGjQIHh6esLBwQEqKlztgUrG398fmZmZWL9+vdBRiIiIiIiU0qVLl3D+/HkMHz4cIpFI6DhEVA5Y7CT6j9TUVHnh8/nz53B3d4eHhwdatWrFb470WbKzs2FtbY2YmBi0a9dO6DhEREREREpJJpPxdzkiJcJiJ9EHXLt2DdHR0YiKikJhYSE8PDzg4eGBpk2b8pslfZLNmzdj8eLFOHPmDJdHICIiIiIiIipjLHYSfQKZTIaLFy8iOjoa0dHR0NDQgKenJzw8PNC4cWOh41EFJpPJ0LFjRwwZMgSjRo0SOg4RERERERFRpcZiZznLzMxEkyZN8OjRI6GjUAnJZDKcOXMG0dHRiImJQY0aNeSFT3Nzc6HjUQV08eJFuLi4ICUlBfr6+kLHISIiIiIiIqq0WOwsZ8+fP0f9+vXx4sULoaNQKSgqKkJCQgKio6Oxfft2GBkZwdPTE+7u7jA1NS3R9SQSCTQ1NcsgLQnJx8cHKioqWLlypdBRiIiIiIjoX86fPw8tLS3Y2NgIHYWISgGLneWsoKAAenp6KCgoEDoKlTKpVIrjx48jKioKO3fuRKNGjeDh4QE3NzcYGRl90jVSU1OxbNkyPHz4EM7Ozhg+fDh0dHTKODmVh3/++QeNGzdGbGwsmjZtKnQcIiIiIiKll5iYiJEjR+LevXuoW7cunJ2dMX/+fNSsWVPoaET0BVSEDqBs1NXVUVhYCKlUKnQUKmWqqqpwdnbG2rVrkZGRgZkzZ+LixYto0qQJOnXqhNWrVyM/P/+D13j69Cn09fVhZGQEX19fLF26FBKJpJxeAZWlmjVrYtasWfD19QXfYyIiIiIiEtbz58/xww8/wMLCAn/++SfmzJmDzMxMjBs3TuhoRPSF2NkpAB0dHTx+/Bi6urpCR6FykJ+fjz/++ANRUVHYsGED1NTUPnrO/v37MWLECGzduhXOzs7lkJLKg1QqRatWrTB58mR88803QschIiIiIlIqL1++hIaGBtTU1HDkyBH571wODg4AgGvXrsHBwQHXrl1D/fr1BU5LRCXFzk4BaGtr49WrV0LHoHKiqakJV1dXbNmyBaqqqh8c+2Z5g61bt6Jx48awtLR857hnz55h8eLF2LlzJ7sEFYiqqipWrFiByZMnIycnR+g4RERERERK4+HDh9i4cSNSU1MBAKampkhPT4e9vb18jK6uLuzs7PD06VOhYhJRKWCxUwBaWlosdiopkUj0wec1NDQAAIcOHYKLiwtq164N4PXGRUVFRQCAuLg4zJw5E5MmTYKPjw8SEhLKNjSVKkdHRzg5OSEoKEjoKERERERESkNdXR2LFi3CgwcPAABmZmZo06YNfH19kZ+fj5ycHAQFBeHevXvs6iRScCx2/j/27jsqqrN7G/A9BRiqgnTBjr1GFBsqYgkajEoUG/beTTCvHQsSe2yJvhqFiAUUeRU0BjWKgp3YOxAbiqiggiB15vsjP/kklqACzwxzX2u5hMM5Z+5jlgb27Gc/AigUCrx69Up0DFIzr+e47tu3D0qlEi1atICOjg4AQCqVQiqVYuXKlRg+fDjc3NzQpEkTdOvWDVWqVClwn8ePH+PPP/8s8fxUeIsXL8aGDRsQGxsrOgoRERERkVYoV64cGjdujLVr1+Y3H+3Zswfx8fFwdnZG48aNERMTg40bN8LU1FRwWiL6HCx2CsDOTvoQf39/ODo6olq1avnHzp07h+HDh2Pr1q3Yt28fmjZtivv376NevXqwtbXNP+/nn39Gly5d0LNnTxgaGmLKlClIT08X8Rj0ATY2NvjPf/6DSZMmiY5CRERERKQ1fvzxR1y6dAk9e/bE//73P+zZswc1a9ZEfHw8VCoVRo4cidatW2Pfvn1YtGgRkpKSREcmok/AYqcAnNlJ/6RSqfLneR4+fBhffvklzM3NAQBRUVHw8vJCo0aNcPz4cdSuXRubNm1C2bJlUb9+/fx7HD
46FlGZrVy5EhYWFoiOjn7tcQMGDEDPnj2xfPnyNx5LRCRadHQ0srKyMHDgQNFRiEjDsdhJpOLq1q2LH3/8ERcuXEBeXh7s7OwwYcIE3Lp1S3Q0Io1mY2ODtLQ00THoLTk5OeHAgQP4+eefsWjRIrRq1QpRUVGiYxGVWfES9lfJzs7G0qVLERoaii5dusDKykqJ6YiIyi8oKIhdnUSkFCx2EqmJ2rVrY9myZbh06RIAwMHBAePHj0dWVpbgZESaiZ2dmqFTp06IiYnBpEmT4OPjg08//RQXL14UHYvojWrUqAFzc3Pk5ubi2bNnpR5LSEhA7969ERISgtmzZyMiIoLD1IhIpbGrk4iUicVOIjXz4YcfYsmSJUhKSoKenh6cnJzw9ddf4/r166KjEWkUa2trXLt2jYNuNICWlhY8PT2RnJyMTz75BG5ubhg2bBhu3LghOhrRG23atAmzZ8+GJEl49uwZli9fjvbt2yMvLw8xMTHw9fUVHZGI6I2Cg4MxY8YMdnUSkVKw2EmkpmrWrIlFixYhJSUFJiYmcHFxwahRo3Dt2jXR0Yg0gqGhIWrUqMEvEjSIvr4+fH19kZaWhpo1a6Jx48bw9/dHdna26GhEr9SpUyfMmTMHixYtwqBBgzBhwgRMnDgRx44dg6Ojo+h4RERvFB0djczMTAwaNEh0FCKqJFjsJFJz5ubmmDdvHlJTU1G9enU0a9YMw4cPx5UrV0RHI1J7XMqumczMzDBnzhwkJCTg77//hq2tLZYuXYr8/HzR0YheYGtri0WLFmHq1KlISkrC8ePHERgYCG1tbdHRiIjKhBPYiUjZWOwk0hDVq1dHaGgo0tPTYWFhgZYtW2Lo0KEs1BC9AxY7NVvt2rWxbt06/PnnnyWT27dv387J7aRyJk6ciM6dO6Nu3bpo1aqV6DhERGV25MgRdnUSkdKx2EmkYapVq4bg4GBcvnwZDRo0QNu2beHl5YXU1FTR0YjUDoudlUPx5Pa1a9di4cKFnNxOKmn9+vWIjIzEgQMHREchIioz7tVJRCKw2EmkoapWrYqAgABkZGSgUaNGaNeuHQYOHIikpCTR0YjUho2NDdLS0kTHICXh5HZSZRYWFjh16hTq1asnOgoRUZkcOXIE169fx5dffik6ChFVMix2Emk4U1NT+Pv7IyMjA40bN0anTp3g4eGBxMRE0dGIVB47Oyuf5ye3d+nSBa6urvDx8eHkdlIJLVq0eOlQIkmSBKQhInq94OBgTJ8+nV2dRKR0LHYSVRImJiaYOnUqMjIy0KJFC3Tp0gX9+vVDfHy86GhEKsvS0hKZmZkoKCgQHYWUTF9fH9988w3S0tJgbm7Oye2ksiRJwpEjR/DXX3+JjkJEVOLo0aP466+/2NVJREKw2ElUyRgbG+Pbb7/FlStX8PHHH8Pd3R29e/fGuXPnREcjUjn6+vqoVasWrl27JjoKCVK1alXMnTuXk9tJZclkMpw5cwbe3t4crkVEKqN4r05dXV3RUYioEpJJXPdCVKk9ffoUa9euxfz58+Hi4oKZM2eiZcuW5bpGYmIiMjIyoK2tXbKUTltbG25ubjAwMKiI2ERK07VrV/j6+sLd3V10FFIBiYmJ8PPzQ0pKCubMmYPPP/8cWlr87pjEKioqQocOHdC/f3988803ouMQUSV39OhRDB06FCkpKSx2EpEQLHYSEQDg2bNnWLduHebNmwcHBwcEBASgTZs2rz0nMjIS//zzDxwdHdGwYcNSjz19+hSHDx/G06dP0b59e5ibm1dkfKIKM3bsWNjY2MDX11d0FFIhhw8fxpQpUyCTybBw4UJ07NhRdCSq5DIyMtC6dWscOXIE9vb2ouMQUSXm5uaGQYMGYdiwYaKjEFElxWInEZWSl5eHDRs2YM6cObC1tUVAQAA+/vjjUsfI5XJs3boVbm5uqFmz5muvJ0kS9uzZAwcHB9jY2FRkdKIKsXTpUqSnp2P58uWio5CKkcvl2L59O6ZPnw57e3vMmz
fvpcNjiJRl9erVWLVqFU6fPs1uKiIS4tixYxgyZAhSU1P5PkREwnDdFRGVoq+vj5EjRyItLQ0eHh7w8vKCq6srjhw5UnLMtm3b8Nlnn72x0An8u5dY7969kZaWxmnGpJY4kZ1eRUtLCwMGDEBycjI6d+4MNzc3Tm4noUaMGIGaNWti1qxZoqMQUSXFvTqJSBWw2ElEL6WnpwcfHx+kpqbCy8sLw4cPR4cOHbBixQq0a9cOJiYm5brep59+imPHjlVQWqKKY2Njg7S0NNExSIUVT25PTU3l5HYSSiaTYe3atVi1ahXOnDkjOg4RVTLHjx/HlStXMHjwYNFRiKiSY7GTiF5LV1cX3t7eSE5OxogRI5CYmIg6deq81bUcHByQmpqq4IREFat+/fq4efMm8vLyREchFVc8uT0+Pr5kcvuyZcs4uZ2U6sMPP8Ty5cvh5eWF3Nxc0XGIqBIJDg7G9OnT2dVJRMKx2ElEZaKjo4OPP/74nTYad3Z2RmJiogJTEVU8XV1d1KtXD1euXBEdhdREnTp1sG7dOvzxxx84dOgQ7OzssH37dnCbdFKWzz//HC1atMDUqVNFRyGiSuL48eO4fPkyvLy8REchImKxk4jKLj4+Hi1atHina+jo6CgoDZHycN9OehvOzs747bffsGbNGixcuBCtWrVCdHS06FhUSfzwww/49ddf8ccff4iOQkSVAPfqJCJVwmInEZWZtrY2ZDLZO11DR0cHcrlcQYmIlIPFTnoXrq6uiImJwYQJEzBs2DD06NEDFy9eFB2LNNx7772HdevWwcfHBw8fPhQdh4g02IkTJ9jVSUQqhcVOIiozRSzB1NLSYrGT1A6LnfSu/ju53dXVFT4+PsjKyhIdjTRYly5d0KtXL4wbN050FCLSYNyrk4hUDYudRKRUBQUFXMpOaofFTlKU4sntaWlpMDc3h7OzM6ZPn87J7VRh5s+fj9jYWOzcuVN0FCLSQCdOnEB6ejq7OolIpbDYSURlVrt27Xce0lJQUKCgNETKY2Njg7S0NNExSIM8P7n91q1bnNxOFaZKlSrYtGkTxo0bh1u3bomOQ0QaprirU09PT3QUIqISLHYSUZk1bdoUcXFxb31+VlYWLCwsFJiISDnq1q2Lu3fvIjc3V3QU0jCc3E7K0LJlS4wcORLDhw/nf1tEpDAnT55EWloauzqJSOWw2ElE5WJgYPDWBZ9Tp06hdevWCk5EVPG0tbVhaWmJjIwM0VFIQz0/uX3BggWc3E4KN3PmTPz9999Ys2aN6ChEpCHY1UlEqorFTiIql65du2L79u3lHjIUGxuLBg0avPM0dyJRuG8nKYOrqytiY2MxYcIEDB06FD169MClS5dExyINoKuri02bNsHf359f3BDROzt58iRSU1MxZMgQ0VGIiF7AYicRlYuuri769euHjRs3lnn/zZiYGOTl5aFZs2YVnI6o4rDYScpSPLk9JSUFnTt3RqdOnTi5nRTC3t4e06dPx5AhQ1BUVCQ6DhGpMXZ1EpEqY7GTiMrN1NQUAwYMQHh4OA4ePPjKgRrJycnYtWsX9PT08PHHHys5JZFisdhJyvb85PYaNWpwcjsphK+vL3R1dbFo0SLRUYhITZ06dYpdnUSk0mQSdyknonfw+PFjHD58GEVFRdDW1sbVq1dhZmYGY2NjNGrUCI6OjqIjEinE4cOHERwcjCNHjoiOQpVUZmYmAgIC8Ntvv2H69On46quv2FFDb+Wvv/5C8+bNERkZCWdnZ9FxiEjNdOvWDX379sXIkSNFRyEieikWO4lIoQYMGICePXti4MCBoqMQKVRmZiZatmyJW7duiY5CldyFCxfg5+eH1NRUzJ07F59//jn3Q6ZyCwsLw+LFixEbGwt9fX3RcYhITZw6dQqenp5IT0/nF25EpLK4jJ2IFOq9997Dw4cPRccgUjgLCwtkZ2cjJydHdBSq5J6f3D5//nxObqe3MmTIEFhZWSEwMFB0FCJSI8HBwfD392ehk4
hUGoudRKRQLHaSptLS0oK1tTUuX74sOgoRAE5up3cjk8mwatUqbNiwAcePHxcdh4jUwOnTp5GcnIyhQ4eKjkJE9FosdhKRQrHYSZqMQ4pI1Tw/ud3NzQ2dOnXC8OHDObmdysTc3BwrV67EkCFD2LVORG/Erk4iUhcsdhKRQrHYSZqMxU5SVfr6+pgwYQLS0tJQvXp1Tm6nMuvVqxc6dOiAb7/9VnQUIlJhp0+fRlJSErs6iUgtsNhJRArFYidpMhY7SdVVrVoV8+bNQ3x8PG7evAlbW1ssW7YM+fn5oqORCvv+++/x+++/48CBA6KjEJGKCg4OxrRp09jVSURqgcVOIlIoFjtJk7HYSeqiTp06WL9+Pf744w8cOnQIdnZ22LFjByRJEh2NVJCpqSnCwsIwcuRI3Lt3T3QcIlIxZ86cwaVLl9jVSURqg8VOIlIoFjtJk7HYSeqmeHL76tWrSya3HzlyRHQsUkEdOnSAp6cnRo8ezaI4EZVSvFenvr6+6ChERGUik/jbDBERUZlIkgRTU1NkZmaiatWqouMQlYtcLsf27dvh7+8PR0dHzJs3Dw4ODqJjkQp59uwZmjVrBn9/fwwaNEh0HCJSATExMejfvz/S09NZ7CQitcHOTiIiojKSyWTs7iS19fzkdldXV05upxcYGBhg06ZNmDBhAm7cuCE6DhGpgOK9OlnoJCJ1wmInERFRObDYSeqOk9vpdZo2bYrx48dj6NChkMvlouMQkUAxMTFITEzEsGHDREchIioXFjuJiIjKgcVO0hQvm9z+ww8/cHI7wc/PDzk5Ofjxxx9FRyEigdjVSUTqisVOIiKicmCxkzTN85PbDx48CHt7e05ur+R0dHSwceNGBAUFITU1VXQcIhIgJiYGFy5cYFcnEaklDigiIpUSFBSEXbt24eLFi6KjEL3UyZMnMWHCBJw5c0Z0FKIKERkZiSlTpkBHRwcLFixAhw4dynxuXFwcrl+/Di2tf79Pl8vlaNSoERo1alRRcakCrVixAhs3bsSJEyego6MjOg4RKVGPHj3g7u6OMWPGiI5CRFRuLHYSUQlvb2/cu3cP+/fvF5bh8ePHyMvLw/vvvy8sA9Hr3L17F7a2tnjw4AFkMpnoOEQVQi6XY9u2bZg+ffobJ7cXFhbi0KFDyMvLg4uLCywtLUs9fvHiRaSkpMDU1BRdunTh/zdqRJIkdO3aFe3atcPMmTNFxyEiJYmNjUXfvn1x+fJlLmEnIrXEZexEpFKMjY1Z6CSVVr16dUiShPv374uOQlRhtLS0MHDgwDdObn/8+DE2bdoEV1dX9OvX74VCJwA4Ojqif//+aNasGTZu3IiCggJlvQx6RzKZDOvXr8cPP/yAc+fOiY5DRErCvTqJSN2x2ElEZSKTybBr165S99WvXx+LFi0quZ2WloYOHTrAwMAADRs2xG+//QZjY2OEhYWVHJOYmIjOnTvD0NAQ1apVg7e3d6kJwEFBQXB0dKzw10P0tmQyGfftpErjZZPbZ8yYgUePHiE/Px87d+7EkCFDUKVKlTde6/3334eHhwd++eUX7geqRiwsLLB06VIMHjwYT58+FR2HiCpYbGwsEhIS4OPjIzoKEdFbY7GTiBRCLpejT58+0NHRwenTpxEWFobg4GDk5eWVHJObm4tu3brB2NgYMTExCA8Px8mTJ7nxOakdW1tbFjupUime3H7+/HncuHEDtra2CA4OxsCBA0v25ywLAwMD9OrVCwcPHqzAtKRonp6ecHJywvTp00VHIaIKFhISAj8/P3Z1EpFa407jRKQQf/zxB1JTU/H777/DwsICALBkyRJ89NFHJcds2bKlZMmjiYkJAGD16tXo1KkTLl++DGtrayHZicqLnZ1UWdWtWxdhYWE4e/YsYmNj3+rDcNWqVfH06VNIksT9O9WETCbDjz/+CGdnZ/Ts2ROdOnUSHYmIKsDZs2dx/vx57Ny5U3QUIqJ3ws5OIlKIlJQU1KpVq6TQCQAtWrQo1f
GTnJwMZ2fnkkInALRt2xZaWlpISkpSal6id8FiJ1V2d+/exZAhQ976/NatW+PMmTMKTEQV7f3338fatWtf2H6GiDRH8V6dBgYGoqMQEb0TFjuJqExkMtkLe6w9P2SiLB06rzuG3T2kTljspMouLy+vTPt0voqFhQX+/vtvBSYiZejevTu6d+8OX19f0VGISMHOnTuH8+fPc69OItIILHYSUZnUqFEDt27dKrl9+/btUrft7OyQlZWFmzdvltx39uxZyOXyktv29vZISEhATk5OyX0nT56EXC6HnZ1dBb8CIsUpLnZyyApVVjo6774Tkra2tgKSkLItWrQIx48fR3h4uOgoRKRAwcHB8PPzY1cnEWkEFjuJqJRHjx4hPj6+1J9r167B1dUVK1asKNnLx9vbu9QvQ126dEHDhg0xZMgQJCQk4PTp05g4cSJ0dHRKujYHDRoEIyMjeHl5ITExEUePHsWoUaPQt29f7tdJauW9996Dnp4ebt++LToKkRCKKPTzywL1ZGxsjA0bNmDMmDG4c+eO6DhEpADnzp1DXFwchg8fLjoKEZFCsNhJRKUcO3YMLi4upf58++23+O6772BpaYmOHTuif//+GD58OMzNzUvO09LSQnh4OPLy8tCyZUsMGTIE06dPh0wmKymKVqlSBREREXj06BFatmyJXr16oU2bNli3bp2ol0v01riUnYgqq48++gje3t4YMWIEi9ZEGiA4OBhTp05lVycRaQxOYyeiEmFhYQgLC3vl4wcPHix1u1+/fqVu29ra4ujRoyW3ExISUFBQUKpr08nJCZGRka98jry8PBgbG5czOZHy2draIj09He3atRMdhUjp8vLy3mmaekFBAYtkai44OBgtW7ZEWFgYhg4dKjoOEb2luLg4nDt3Djt27BAdhYhIYVjsJCKFCQ8Ph5GREWxsbHDt2jVMnDgRjRs3RtOmTd94riRJuHLlCiIjI+Hs7KyEtETvhp2dVJk1b94c586dQ/Pmzd/q/D/++AOurq4KTkXKpKenh02bNsHV1RWdOnVC/fr1RUciorfAvTqJSBNxGTsRKUxOTg7Gjh0Le3t7DBo0CHZ2doiIiChT5092djbs7e2hp6eHmTNnKiEt0bthsZMqs/r16+PatWtvff6aNWuwceNGFBYWKi4UKZ2TkxOmTJmCIUOGlBpISETqIS4uDmfPnsWIESNERyEiUiiZxDVERERE5RYXF4ehQ4ciISFBdBQiIVJSUnDnzh20b9++XOft27cPRkZGmD17Nu7evYulS5eyy1ONFRUVoWPHjujTpw8mTpwoOg4RlUOvXr3g5uaG8ePHi45CRKRQLHYSERG9hZycHNSsWROPHz9+630LidRdbGwsHjx4gK5du5bp+EOHDqFevXqws7ODJEn49ddfMWnSJDRp0gSLFi2CpaVlBSeminDlyhW0atUK0dHRcHBwEB2HiMrg/Pnz6NGjBy5fvgxDQ0PRcYiIFIrL2ImIiN6CiYkJTExMcPPmTdFRiISpWrUqRo4ciZ9//hkZGRmvPC4xMRHbtm2DnZ0d7OzsAAAymQx9+vRBUlISmjdvjpYtW2L69Ol4/PixsuKTglhaWmLu3LkYPHgw8vPzRcchojIonsDOQicRaSJ2dhJRhfDw8ECfPn3g6ekpOgpRhWnXrh1CQkLQqVMn0VGIlO7Zs2do06YNhg8fjq+//hrnz59HRkYGdHR0oK2tDUmSIJfLUVhYCCcnJzRs2PC118vKysK0adNw+PBhzJ07F4MG/T/27jssqmt9G/AzQy82MEKiiKggorGXoEiJvYVERQREQewNlWLDaFQ02BCNorGAYsVekRg02LCggAIiKIIlGktQpEnb3x/+5DscTY5lZvYAz31dc504uz3jwZ3uekcAACAASURBVGHm3Wu9ywVSKe/LVxSCIOC7775Dy5YtsXDhQrHjENG/4KhOIqrsWOwkIrkYO3YsWrZsiXHjxokdhUhuPDw80LFjR4wePVrsKE
QKN2nSJPz555/Yu3fvO60c3n68/JQWDzExMfD09ISKigqCgoLQoUMHmeQl+Xv8+DFatWqFgwcP4ptvvhE7DhH9gx9++AG2trbw9PQUOwoRkVzwdjkRyUWtWrWQlZUldgwiueKK7FRVHThwAEePHsWmTZveW9CUSCSf3MvW0tISFy9exNixY/H999/Dzc0Njx49+tzIpACGhoZYs2YNhg0bhtzcXLHjENF7xMXF4dKlS7xRS0SVGoudRCQXLHZSVcBiJ1VFGRkZGDNmDHbt2oWaNWvK5RpSqRTDhw/HrVu3YGhoiK+//hoBAQF4/fq1XK5HsjNw4EB07NgRvr6+YkchoveYP38+e3USUaXHaexEJBefM4WRqKK4fv06nJyckJSUJHYUIoUoKipCly5dMGjQIHh7eyvsurdv34a3tzcSExOxfPlyfPfdd/z9osRevHiBFi1aYMOGDejZs6fYcYjo/8THx6NPnz64c+cOi51EVKmx2ElERPSJ8vLyoK+vj9zcXC6kQlWCr68vkpKScOTIEVF+5k+ePIkpU6agbt26CAwMRLNmzRSegT5MVFQU3NzckJCQAD09PbHjEBGAAQMGwNraGlOmTBE7ChGRXPGbGRER0SfS1taGvr4+7t+/L3YUIrmLiIjAzp07sWXLFtGK+927d0d8fDz69+8POzs7TJ48GX///bcoWejfde3aFQMGDMDEiRPFjkJEeDOq8+LFixgzZozYUYiI5I7FTiIios9gamqK1NRUsWMQydXDhw/h7u6O7du3o3bt2qJmUVNTw6RJk5CcnIzi4mI0bdoUwcHBKC4uFjUXvWvx4sW4du0adu/eLXYUoipv/vz58PX15fR1IqoSWOwkIiL6DFykiCq74uJiODs7Y8KECbC2thY7TpnatWtj7dq1OHnyJMLDw9GmTRucPn1a7Fj0H7S1tREWFobJkyfjzz//FDsOUZWVkJCAmJgYjuokoiqDPTuJiIg+w7Jly/Dw4UMEBgaKHYWoyhIEAQcOHICXlxfatGmDZcuWwcTEROxY9H/mzZuHS5cu4fjx41xYikgEAwcOhJWVFaZOnSp2FCIiheDITiISRUFBAVauXCl2DKLPxpGdROKTSCQYMGAAkpOT0aZNG7Rv3x5+fn7IyckROxoBmD17Np49e4b169eLHYWoyklISMCFCxc4qpOIqhQWO4lIIf57EHlRURGmTZuGV69eiZSISDZY7CRSHlpaWpg9ezYSEhKQkZEBc3NzbNu27Z3fQaRYampq2Lp1K/z8/HD79m2x4xBVKW97dWpra4sdhYhIYTiNnYjkYv/+/WjWrBkMDAxQs2bNsudLSkoAvCl+VqtWDWlpaahXr55YMYk+W0FBAWrWrImcnByoqqqKHYeI/sOFCxfg6ekJNTU1BAUFoX379mJHqtKCgoKwe/dunD17FioqKmLHIar0rl+/jp49e+LOnTssdhJRlcKRnUQkF7Nnz0br1q0xbNgwBAcH49y5c8jKyoKKigpUVFSgqqoKDQ0NPH/+XOyoRJ9FU1MThoaGyMzMFDsKEf2XTp064dKlSxg9ejTs7e3h7u6Ox48fix2rypo0aRK0tLSwZMkSsaMQVQnz58+Hj48PC51EVOWw2ElEchEdHY3Vq1cjLy8Pc+fOhaurK4YMGQI/Pz8cP34cAKCnp4cnT56InJTo85mamiI1NVXsGERyk5GRAYlEgtjY2Ap3balUCjc3N6SkpKBOnTpo3rw5lixZgtevX8s4Kf0vUqkUISEhWLFiBeLj48WOQ1SpXb9+HefPn8fYsWPFjkJEpHAsdhKRXNSpUwceHh74/fffkZCQAF9fX9SoUQOHDh3CqFGjYGVlhYyMDOTn54sdleizsW8nVQZubm6QSCSQSCRQU1NDw4YN4e3tjdzcXBgZGeHRo0do1aoVAOCPP/6ARCLBs2fPZJrB1tYWEydOLPfcf1/7U1WvXh0BAQGIiYnB+fPn0axZMxw+fJj9PBWsfv36WL58OV
xdXVFQUCB2HKJKa/78+fD29uaoTiKqkljsJCK5Ki4uxpdffolx48YhPDwc+/btg7+/P9q2bYu6deuiuLhY7IhEn83MzIzFTqoUunXrhkePHiE9PR0LFy7E2rVr4e3tDRUVFRgaGorSl1bW1zY1NcWhQ4ewZs0azJgxA7169UJycrJMzk0fxtXVFWZmZvjxxx/FjkJUKd24cQPnzp3jqE4iqrJY7CQiufrvL6dmZmZwc3NDUFAQoqKiYGtrK04wIhniyE6qLDQ0NGBoaAgjIyM4OzvDxcUFBw8eLDeVPCMjA3Z2dgCAL774AhKJBG5ubgDeLD63ZMkSNGrUCFpaWvj666+xbdu2cteYP38+jI2Ny641bNgwAG9GlkZHR2PNmjVlI0wzMjLkNoW+Z8+eSEhIQN++fWFjYwNPT09kZWXJ9Br0fhKJBOvWrcO2bdtw9uxZseMQVTpve3Xq6OiIHYWISBRcNpaI5OrZs2e4ceMGkpKScO/ePbx69QpqamqwsbHBwIEDAbz5ciyRSEROSvTpWOykykpLSwtFRUXlnjMyMsK+ffswcOBAJCUlQU9PD1paWgAAPz8/7N27F2vWrEGTJk0QExODUaNGoVatWujbty/27duHZcuWYefOnfj666/x5MkTXLx4EcCblbpTU1Nhbm6ORYsWAXhTTL1//77cXp+amhomT54MJycn/PjjjzA3N8dPP/2EUaNGcbVwOfviiy+wfv16DB8+HAkJCahWrZrYkYgqhRs3buDs2bMIDQ0VOwoRkWhY7CQiublx4wbmzp2LmJgYaGhooE6dOtDU1ERpaSmOHj2K8PBwrFy5El9++aXYUYk+i4mJCR4+fIjCwkKoq6uLHYdIJi5fvowdO3aga9eu5Z5XUVGBnp4egDf9mWvXrg0AyM3NxYoVK/Dbb7+hS5cuAN7827h8+TLWrFmDvn37IjMzE19++SV69OgBNTU11K9fH+3atQMA1KhRA+rq6tDW1oahoaECX+mbwltwcDDGjh0LT09PBAcHIygoiLMP5Kx///44dOgQpk2bhg0bNogdh6hSeNurk6M6iagq4zR2IpKLhw8fwsvLC7dv38aWLVtw8eJFREdH48SJE9i/fz/8/f1x//59rFy5UuyoRJ9NTU0N9erVw927d8WOQvRZTpw4AV1dXWhqasLS0hLW1tZYvXr1Bx2bnJyMgoIC9OrVC7q6umWP4OBg3LlzBwDg4OCAgoICmJiYwMPDA3v27FGqVdFbtmyJ06dPY86cOXBzc4ODgwMyMjLEjlWprVixAlFRUThy5IjYUYgqvMTERJw9exbjxo0TOwoRkahY7CQiubh58ybu3LmDyMhI9OjRA4aGhtDS0oK2tjbq1KkDJycnDB06FL/99pvYUYlkglPZqTKwtrZGfHw8bt26hYKCAuzfvx916tT5oGNLS0sBAEeOHEF8fHzZIykpqey93sjICLdu3cL69etRvXp1eHl5oW3btsjNzZXba/pYEokEgwYNws2bN9GyZUu0a9cOc+bMUaqMlUn16tURGhqKMWPG4OnTp2LHIarQOKqTiOgNFjuJSC50dHSQk5MDbW3tf9zn9u3b7NFFlYapqSlSU1PFjkH0WbS1tdG4cWMYGxtDTU3tH/d7266hpKSk7DkLCwtoaGggMzMTjRs3LvcwNjYu209TUxN9+/ZFYGAgrly5gqSkJJw/f77svP95TjFpaWnBz88P8fHxSE9Ph7m5OXbs2AFBEMSOVulYW1vDxcUFY8eO5d8v0SdKTEzEmTNnOKqTiAjs2UlEcmJiYgJjY2N4enpi+vTpUFFRgVQqRV5eHu7fv4+9e/fiyJEjCAsLEzsqkUyYmZkhKSlJ7BhECmFsbAyJRIJjx46hf//+0NLSQrVq1eDt7Q1vb28IggBra2vk5OTg4sWLkEqlGD16NEJDQ1FcXIyOHTtCV1cXu3fvhpqaGkxNTQEADRo0wOXLl5GRkQFdXd2y3qBiqlevHrZv347z58/D09
MTa9asQVBQUFmvUZKNBQsWoH379ti2bRtcXV3FjkNU4SxYsABeXl4c1UlEBBY7iUhODA0NERgYCBcXF0RHR6NRo0YoLi5GQUEBCgsLoauri8DAQPTs2VPsqEQyYWpqioMHD4odg0gh6tati59++gmzZ8/GyJEjMWzYMISGhmLBggUwMDDAsmXLMG7cOFSvXh2tWrWCr68vAKBmzZoICAiAt7c3ioqKYGFhgf3798PExAQA4O3tjeHDh8PCwgL5+flK1Qe3c+fOuHz5MkJDQ9G/f3/07t0bixYtUvhiSpWVpqYmwsLC0L17d9ja2sLIyEjsSEQVRmJiIqKjo7F582axoxARKQWJwLkiRCRHhYWF2LNnD5KSklBcXIyaNWuiYcOGaNOmDczMzMSORyQz6enpsLOzQ2ZmpthRiEjOsrOzsXDhQmzevBnTp0/H5MmToaGhIXasSmHRokWIiorCyZMnIZWy4xbRh3B0dES7du3g4+MjdhQiIqXAYicREZEMFBcXQ1dXFy9evICmpqbYcYje69atW2jSpInYMSqNtLQ0TJs2DSkpKVixYgX69esHiUQidqwKrbi4GNbW1hgyZAgmT54sdhwipZeUlIRvv/0W6enpnMJORPR/WOwkIrl7+zbz9n8lEgm/DFKlZG5ujgMHDqBp06ZiRyF6R0FBAb755hvEx8eLHaXSOXHiBKZOnQpjY2MEBgbyPeAzpaWlwdLSEufOnYO5ubnYcYiU2pAhQ9CmTZuydiFERMTV2IlIAd4WN6VSKaRSKQudVGklJyfzizkpLS8vL7YPkZNevXrh+vXr6N27N6ytrTFlyhRkZWWJHavCMjU1xYIFC+Dq6oqioiKx4xApraSkJJw+fRrjx48XOwoRkVJhsZOIiEhGWMwnZbV3715ERERgw4YNYkeptNTU1ODp6Ynk5GQUFBSgadOmWL9+PUpKSsSOViGNHTsW+vr6WLRokdhRiJTW2xXYdXV1xY5CRKRUOI2diOTqP6euExGR4t29excdO3bEsWPH0L59e7HjVBnx8fHw9PTEy5cvERQUBBsbG7EjVTh//vknWrdujaNHj/Jnl+i/JCcnw87ODnfu3GGxk4jov3BkJxHJ1ZYtW3D8+HGxYxARVUmFhYUYMmQIZs6cyWKRgrVq1Qp//PEHZs+ejeHDh2Pw4MHIzMwUO1aF8tVXX2HVqlVwdXVFfn6+2HGIlMqCBQswbdo0FjqJiN6DxU4ikqvk5GQkJiaKHYOIqEqaNWsW6tSpgylTpogdpUqSSCRwcHDAzZs38fXXX6Nt27b48ccfkZubK3a0CsPR0RGtW7fGzJkzxY5CpDSSk5Nx6tQpTJgwQewoRERKicVOIpKrWrVqcZEGov9TUFCAvLw8sWNQFXH06FGEh4cjNDSUrUREpqWlhTlz5iAuLg63b99G06ZNsXPnTrCb1IdZs2YN9u7di6ioKLGjECkFjuokIvp37NlJRHK1bt06xMXFYf369WJHIRLd2rVr8ezZM8yePRsqKipix6FK7MGDB2jbti327dsHKysrsePQfzl37hw8PT2hpaWFoKAgtG3bVuxISi8yMhKjRo3C9evXUbNmTbHjEMmVIAiIiYnBkydPIJX+//FJqqqqqFu3Lnr06MFenVRlxMXFITMzEyoqKuVuEnbt2hU6OjoiJiNlpip2ACKq3Diyk6qSTZs2wcrKCqampigtLYVEIilX1DQyMkJwcDCcnJxgamoqYlKqzIqLi+Hs7AxPT08WOpWUlZUVLl++jNDQUPTr1w99+/aFv78/DAwMxI6mtHr27Il+/fph8uTJ2Lp1q9hxiOSitLQUx44dQ2FhISwtLdGpU6dy23Nzc7F161a4ubmhuLhYpJRE8icIAk6ePIns7Gy0bt0a33//fbntr1+/xqlTp5CTkwMrKyt8+eWXIiUlZcVp7EQkVyx2UlUyY8YMnD59GlKpFKqqqmWFzlevXiE5ORn37t1DUlISEhISRE5KldlPP/0EDQ
0NzJgxQ+wo9C9UVFTg4eGBlJQU1KpVC82aNcOyZctQWFgodjSltXTpUsTExGDfvn1iRyGSuYKCAmzZsgW2trYYOHAgvvrqq3f20dHRwbhx4/Dzzz/jt99+w71790RISiRfJSUl2L59O1q1aoVBgwahUaNG7+yjoaGB3r17w8HBAVevXsXNmzdFSErKjNPYiUiurly5gnHjxiE2NlbsKERyZ29vj5ycHNjZ2eH69etIS0vDn3/+iZycHEilUtSpUwfa2tr4+eef0bdvX7HjUiX0+++/Y9iwYbh27RoMDQ3FjkMfITU1FdOmTUNqaioCAwPRp08f9lp9j5iYGPzwww+Ij4/nzzhVGqWlpdiyZQuGDh0KNTW1Dz5u7969sLOzg76+vhzTESnW9u3bYW9v/1FtGiIjI2Fubg5jY2M5JqOKhCM7iUiuOLKTqpJOnTrh9OnTOHToEPLz82FlZQVfX1+EhITgyJEjOHToEA4dOgRra2uxo1Il9Ndff2H48OHYunUri0AVkJmZGY4ePYqgoCB4eXmhT58+SElJETuW0rG0tISHhwdGjRrFBZ6o0oiIiMCgQYM+qtAJAAMHDsTJkyfllKpqevXqFaZMmQJjY2NoaWmhU6dOuHLlStn2nJwcTJo0CfXq1YOWlhaaNGmCwMBAERNXLtHR0bCzs/vofrQ9e/bEhQsX5JSKKiL27CQiuWKxk6qS+vXro1atWtixYwf09PSgoaEBLS0tLkZEcldaWoqhQ4dixIgR6Natm9hx6DP07t0b3bp1wy+//IIuXbpg6NChmDt37gctylNcXAxV1cr/8X7u3Lno2LEjNm/eDA8PD7HjEH0WQRCQn5+PatWqffSxEokEX331FZ48eYI6derIIV3VM3LkSFy/fh1btmxBvXr1sG3bNnTr1g3JycmoW7cupk2bht9//x1hYWEwMTHBmTNnMGrUKNSuXRuurq5ix6/wnj59Chsbm086tmXLlkhKSkKzZs1knIoqIo7sJCK5qlmzJrKzs1FaWip2FCK5a968OTQ1NfHVV19BX18furq6ZYVOQRDKHkSy9vPPP+P169eYO3eu2FFIBtTU1DB16lQkJSUhLy8P5ubmiIyM/Nf3D0EQcOLECYwfPx67du1SYFrFU1dXR1hYGGbMmIH09HSx4xB9ltjYWLRv3/6Tj7eyssK5c+dkmKjqys/Px759+/Dzzz/D1tYWjRs3xrx589C4cWMEBwcDAC5cuABXV1fY2dmhQYMGGDZsGL755htcunRJ5PQVX0ZGBho0aPDJx1tYWLB3J5VhsZOI5EpFRQU6OjrIzs4WOwqR3DVt2hSzZs1CSUkJcnJysHfvXiQlJQF4M/ri7YNIls6dO4dVq1Zhx44dVWJUX1VSp04drF+/HhEREf+z/UVxcTGys7OhoqKCMWPGwNbWFs+ePVNQUsVr3rw5ZsyYATc3N5SUlIgdh+iTPXz48LP6DEqlUkil/FovC8XFxSgpKYGmpma557W0tMoKylZWVjhy5Aju378P4E3xMz4+Hr169VJ43somISEBbdu2/axz8HMQvcV3RSKSO05lp6pCVVUVEyZMQPXq1ZGfn48FCxbAysoK48aNw40bN8r240hnkpXnz5/D2dkZmzZtQr169cSOQ3LSunVraGpq/uvNEjU1NTg7O2P16tVo0KAB1NXV8fLlSwWmVLwpU6ZAIpGwXx5VaLJodcN2ObJRrVo1WFpaYuHChXj48CFKSkqwbds2xMTE4NGjRwCAVatWoVWrVqhfvz7U1NRgY2ODgIAA9OvXT+T0FZ9UKv3sQQFqamq8AUYAWOwkIgVgsZOqkreFTF1dXWRlZWHJkiUwMzPDgAEDMH36dFy8eJEjMEgmBEGAm5sbHBwc0LdvX7HjkJz9ry+AhYWFAN6sYpuZmYnJkyejUaNGACrvDRYVFRWEhoYiICCg3A0loopEFu1tEhMTy80g4ePfH//2nhgWFgapVIp69epBQ0MDq1atgpOTU1lBef
Xq1Th//jwOHz6Mq1evIjAwEN7e3jhx4sQ75yotLYWXl5for7eiPFavXv3Z/xZUVFRY7CQALHYSkQKw2ElVydsP0RoaGjAyMsKzZ88wdepUnD9/HiUlJfjll1+waNEipKamih2VKriVK1fir7/+wuLFi8WOQiITBAHq6uoAgBkzZsDJyQmWlpZl2wsLC5GWlobt27cjMjJSrJhyYWJigoCAALi6upYVfIkqElkUOy0sLMr1Bufj3x//dtO5UaNGiI6ORk5ODu7fv4/Lly+jqKgIJiYmyM/Px8yZM7FkyRL0798fLVq0wMSJEzFkyBAsW7bsnXNJpVIsX75c9NdbUR4TJkz47H8Lr1+/Lvt9SFUbi51EJHcsdlJVIpFIyvpntW3bFomJiQCAkpISjBkzBnXq1IGfnx8WLFggclKqyK5cuYLFixdj9+7d/FBPZaNYZsyYARUVFQwbNgz6+vpl26dOnYpvv/0WixcvxvDhw9G5c+eyfnOVgbu7O+rXr4+ffvpJ7ChEH6169eqf3V+3uLhYRmnoLR0dHXz55ZfIyspCZGQk7O3tUVRUhKKionfaBqioqFTaEfSKZGJi8tmDAYqKimSUhio6dm8lIrljsZOqkuzsbOzbtw+PHj3C+fPnkZqaiqZNmyI7OxuCIMDAwAB2dnaoU6eO2FGpgnr58iUcHR2xdu1amJiYiB2HRFZaWgpVVVXcu3cPa9aswaxZs9CyZcuy7YsWLUJYWBhWrlyJfv36QU1NDd9//z3CwsIwa9YsEZPLjkQiwYYNG9CyZUv07dsXnTp1EjsS0Qd5+fIlLl68iLNnz+LHH3/8pHPExcWhVatWMk5WdUVGRqK0tBTm5ua4ffs2fHx80KRJE7i7u5f16JwxYwZ0dXVhbGyM6OhobN26FUuWLBE7eoXXokUL7Nu3D2ZmZp90/IMHD1C3bl0Zp6KKisVOIpI7FjupKsnKysKMGTNgZmYGdXV1lJaWYtSoUahevToMDAxQu3Zt1KhRA1988YXYUakCEgQBI0eORK9evTBo0CCx45DIbty4AQ0NDZiZmcHT0xPNmjXD999/D21tbQDApUuXsHDhQixevBgjR44sO+7bb7/F1q1b4ePjAzU1NbHiy5SBgQGCg4MxbNgwxMfHQ1dXV+xIRP/o0aNHWLlyJTZu3IjevXujc+fOKCkp+aSFhm7fvg0HBwc5pKyaXr58iZkzZ+LBgwfQ09PDwIED4e/vX/ZeuWvXLsycORMuLi74+++/YWxsjAULFmDixIkiJ68ctLS0kJOT80nv4TExMfxsRGUkgiB8fpMQIqJ/sWjRIrx69Yp95ajKOH/+PPT19fHo0SP06NEDubm5nGpMMrFu3ToEBwfj0qVL0NTUFDsOiai0tBQzZszAsmXL4OzsjMOHD2P9+vVwdHQs60c3aNAgZGZm4sqVKwDeFMslEglGjBiBjIwMnDp1CgCQm5uL8PBwtGjRAm3bthXtNcnC8OHDoa2tjeDgYLGjEL3j1q1bWLp0Kfbv3w9XV1dMnToVDRo0QF5eHvbv3w8XFxdIJB++GvWpU6dQv359NG7cWI6piRSnuLgYYWFhGDZs2EcV/y9fvgw1NTW0bt1ajumoImHPTiKSO47spKqmc+fOMDc3h7W1NRITE99b6GRvJ/pY169fx5w5cxAeHs5CJ0EqlWLJkiXYuXMnrly5gpycHDx58qSsUJKZmYmDBw+WTY0tKSmBRCJBSkoKMjIy0Lp167I+f9HR0Th+/DicnZ3RvXv3Ct3Pc9WqVTh+/DgiIiLEjkJU5tKlSxgwYAC6dOkCIyMjpKamIigoCA0aNAAAaGtro2fPntixY8cHfz6IioqCnp4eC51UqaiqqmLw4MHYunUrXr9+/UHHXLx4EcXFxSx0Ujmcxk5EcsdiJ1U1paWlkEqlUFFRQZMmTZCamoqMjAzk5eWhsLAQ7du3Z69F+ig5OTkYPHgwAgMD0aRJE7HjkBJxdHSEo6Mj5s+fDx8fH/z111
9YtGgRIiIiYGZmhjZt2gBA2QiZvXv34sWLF7C2toaq6puvAn369EHDhg0REREBLy8vnDhxAqNGjRLtNX2OGjVqICQkBMOGDcP169ehp6cndiSqogRBQEREBJYsWYKMjAx4eXkhLCwMOjo6793/iy++gL29Pfbs2YNatWrBzs7unTYTgiAgNjYWmZmZaNWqFQudVCnp6OjAxcUFhw8fhqamJrp27QotLa139ouJiUFmZiYsLCzQokULEZKSMuM0diKSu8jISCxfvhy//fab2FGIFCY/Px9r167FunXrcP/+fRQWFgIAzMzMYGBgAAcHB/Z3og82fPhwSKVShISEiB2FlNiLFy+QkJAAGxsbHDp0CG5uboiNjUWjRo0AABEREfj555/RuHFjbNq0CcCbKYOqqqrIycmBh4cHEhMTkZSUJObLkImpU6fi0aNH2LVrl9hRqIopKirC7t27sWTJEkgkEvj6+mLw4MEf1R83Ozsbp0+fhiAIUFFRwduv7G9vmBobG8srPpFSyc/PR1RUFIqKispNay8sLMS2bdtga2uLKVOmiJiQlBVHdhKR3HFkJ1VFv/76K4KCgtCnTx+Ympri1KlTKCoqwpQpU3Dnzh3s2LED6urqGD16tNhRSclt2bIFly9fRmxsrNhRSMnVrFkTNjY2AABzc3MYGxsjIiICgwYNQnp6OiZNmoTmzZtj8uTJAP5/obO0tBSRkZHYs2dP2Y3Jt9sqqkWLFqFNmzbYtWsXhgwZInYcqgJyc3OxadMmrFixAiYmJliyZAl69uz5UT0436pevTrs7e3lkJKoYtHS0kK/fv3eu61e8kej9wAAIABJREFUvXpwdnbGpEmTPmlxL6rcOLKTiOQuLS0NvXv3xu3bt8WOQqQQaWlpcHJywsCBAzF16lRoamoiLy8PK1aswIULF3D8+HEEBQVh48aNuHHjhthxSYmlpKSgS5cuOHXqFL7++mux41AFs3v3bkyYMAE1atRAXl4e2rZti4CAADRr1gzA/1+w6N69e3BwcICenh4iIiLKnq/oYmNj0adPH8TFxaFu3bpix6FK6tmzZ1i9ejWCg4PRpUsXTJ8+HR06dBA7FlGV0LFjR8yaNYs3B+gdXKCIiOSOIzupqpFKpUhPT4enp2fZQjLa2tpo164dkpOTAQBdu3bFvXv3xIxJSi4/Px+DBw+Gv78/C530SRwdHcsKMefPn8fhw4fLCp2lpaWQSCQoLCzEvn37EBsbi19//bVsW2XQrl07TJw4ESNGjADHd5CsZWRkYNKkSTAzM8OjR49w9uxZ7Nu3j4VOIgXy9PREUFCQ2DFICbHYSURyV7NmTbx8+bLSfHki+l9MTEwglUoRExNT7vn9+/fD0tISJSUlyMnJQY0aNfDixQuRUpKymzp1KiwsLCrsQjGkPN4uQPRWXl4eXr16BQC4desWli1bBk9PTxgZGaGkpKRSTQecOXMmsrKysG7dOrGjUCWRkJAAFxcXtG3bFjo6OkhKSsKvv/7KxeOIRDBo0CDcunUL169fFzsKKZmK24iHiCoMVVVVaGtr49WrV6hRo4bYcYjkTiqVwtPTEx4eHrCyskL9+vURFxeH06dP48iRI1BRUYGBgQG2bt363tUlicLDw/H777/j2rVrlWI6MSkHqfTNOIdDhw5h2bJlGDp0KNLT01FUVIQVK1YAQKX7eVNTU0NYWBisrKzQrVs3mJqaih2JKiBBEPDHH38gICAA169fx5QpU7B27Vp+riUSmbq6OsaPH4+goKCyhfeIAPbsJCIFMTY2RnR0NBo0aCB2FCKFKC4uRnBwMKKjo/H06VMYGBhg6tSpsLS0FDsaKbk7d+7A0tISERERaNu2rdhxqJJaunQp5s2bh/z8fHh5eWHp0qWVblTnf1q9ejV27NiBs2fPVuiFl0ixSkpKcPDgQQQEBCA7Oxs+Pj4YOnQoNDQ0xI5GRP/n6dOnMDMzQ2pqKr744gux45CSYLGTiBSiVatWCAkJQe
vWrcWOQqRQL168QFFREWrXrl3pRkyR7BUWFqJz584YOnQoPD09xY5Dldzr168xc+ZMrFy5EkOGDMH69etRrVq1d/YTBAFFRUVQV1cXIaVslJaWokePHrCzs8Ps2bPFjkNKrqCgAGFhYVi6dCn09PQwffp02Nvbl42OJiLl4uHhgYYNG/L9ncrw3ZqIFIKLFFFVVbNmTXzxxRcsdNIHmTFjBr766itMnjxZ7ChUBWhoaGDFihW4du0azMzMUFhY+M4+giBg3759aNGiBSIiIkRIKRtSqRQhISEICgpCXFyc2HFISb148QI///wzGjZsiIMHD2Ljxo2IiYnBDz/8wEInkRLz9PTE2rVr3/t7jKomzuEgIoVgsZOI6N8dPnwY+/btQ1xcHIvjpFCtWrVCq1at3rtNIpFg0KBB0NbWxpQpU/DLL78gMDAQZmZmCk75+YyMjLBixQq4uroiNjYWmpqaYkciJfHnn39i5cqV2LRpE/r06YPIyEh8/fXXYsciog/UokULPHz4UOwYpER4e4qIFILFTiKif3bv3j2MGjUKO3fuhJ6enthxiN7Rp08f3LhxA127dkXnzp3h7e2Nly9fih3ro7m4uKBp06bw8/MTOwopgZSUFHh4eKB58+Z4/fo1rl27hrCwMBY6iYgqOBY7iUghWOwkInq/4uJiODs7Y+rUqejUqZPYcYj+kbq6OqZNm4bExES8fPkS5ubm2LhxI0pKSsSO9sEkEgmCg4OxY8cOREdHix2HRHLx4kX88MMPsLGxgbGxMdLS0hAUFARjY2OxoxERkQyw2ElECsFiJ1VVxcXFyM/PFzsGKbG5c+dCR0cHvr6+Ykch+iAGBgbYsGEDjh07hi1btqBDhw44d+6c2LE+WO3atbFhwwa4ubkhOztb7DikIIIg4NixY7CxsYGTkxO6du2Ku3fv4scff4S+vr7Y8YiISIZY7CQihWCxk6qqJUuWYN68eWLHICX122+/ITQ0FGFhYVz8giqcNm3a4MyZM/Dx8YGzszOcnJxw//59sWN9kL59+6J79+6YOnWq2FFIzoqKihAWFoYWLVpg9uzZGDNmDNLS0jBx4kRoa2uLHY+IiOSAn6qJSK6Ki4tx8uRJ5OXlQUtLC0eOHMGBAwfw4MEDsaMRKYSpqSnS0tLEjkFK6NGjRxg+fDjCwsJQp04dseMQfRKJRIIhQ4YgJSUFTZo0QevWrTF//nzk5eWJHe1/Wr58Of744w8cPnxY7CgkBzk5OQgKCkLjxo0REhKCZcuWIS4uDs7OzlBVVd51ekNDQ6Grq6vQa/7xxx+QSCR49uyZQq9LVU9GRgYkEgliY2PFjkKVnEQQBEHsEERU+WRlZeHUqVNQUVGBnZ0datSoUbZNEARcvHgRDx8+hJGRETp27ChiUiL5io+Px9ChQ5GYmCh2FFIiJSUl6NGjB6ysrPDTTz+JHYdIZjIzM+Hr64uLFy9i6dKlcHBwgEQiETvWPzp37hwGDx6MhIQEfPHFF2LHIRl4+vQpVq9ejeDgYNja2sLX1xft27eX+XVsbW3RvHlz/PLLL+WeDw0NxcSJE5GTk/NJ583Pz8erV68UehOssLAQf//9NwwMDJT63yspNzc3Nzx79gxHjx4t93xsbCzat2+Pu3fvwsjICE+fPkXt2rWV+qYDVXwc2UlEMpeeno6oqCgMGDAA33//fblCJ/BmFIilpSUGDRoEPT09HDhwQKSkRPLXuHFjpKeno7S0VOwopEQWL16MkpIS/Pjjj2JHIZIpY2Nj7N69G2FhYVi8eDFsbW0RHx8vdqx/ZGVlBVdXV4wZMwYcA6J8Pub/k7t372LixIlo0qQJ/vrrL1y4cAF79uyRS6HzUxUWFv7PfbS0tBQ+2l9dXR2GhoYsdJLcqaiowNDQ8F8LnUVFRQpMRJUVi51EJFN//vknEhMTMWjQoA/6wGRqagpLS0scOnRIAemIFE9XVxe1atVi6wYqc+bMGfzyyy/Yvn07VFRUxI5DJB
fW1taIjY2Fi4sLevXqhTFjxuDp06dix3qv+fPn4/bt29i6davYUeg/vHjx4oM+S8bHx8PZ2Rnt27dHtWrVkJycjPXr18PU1FQBKf+dm5sb+vXrh4CAANSrVw/16tVDaGgoJBLJOw83NzcA75/GfuzYMXTs2BFaWlrQ19dH//79UVBQAOBNAXX69OmoV68edHR00L59e0RGRpYd+3aKelRUFDp27AhtbW20a9cO165de2cfTmMnefvvaexvf/aOHz+ODh06QF1dHZGRkbh//z7s7e2hp6cHbW1tmJubY9euXWXnuXHjBrp16wYtLS3o6enBzc0NL1++BABERkZCXV0dz58/L3ftWbNmoWXLlgCA58+fw8nJCfXq1YOWlhaaNWuGkJAQBf0tkCKw2ElEMnX69Gl89913H3WMoaEhTE1Ny33oIqpM2LeT3nr27BlcXFwQEhKCunXrih2HSK5UVFQwevRopKSkQEdHBxYWFli5cqXSjdrR0NBAWFgYvL29kZmZKXacKi8xMRF9+/ZF06ZNkZSU9I/7CYKAoKAg9O3bF61bt0Z6ejoWL14MQ0NDBab936Kjo3H9+nWcOHECUVFRcHR0xKNHj8oebwszNjY27z3+xIkTsLe3R/fu3XH16lWcPn0aNjY2ZTNG3N3dER0djR07duDGjRsYPnw4+vfvj4SEhHLnmTlzJn7++Wdcu3YN+vr6cHFx4WhmUhrTp0/HwoULkZKSgo4dO2L8+PHIy8vD6dOnkZSUhJUrV6JmzZoAgLy8PPTq1Qu6urq4fPkyDhw4gAsXLmDEiBEAgG7dukFfXx979uwpO78gCNi5cyeGDh0KACgoKECbNm1w9OhRJCUlwdPTE2PGjEFUVJTiXzzJh0BEJCNJSUlCUlLSJx+/Z88eGaYhUh4jR44UgoODxY5BIispKRH69u0r+Pj4iB2FSBQ3b94UevXqJZibmwsRERFix3nH4sWLBTs7O6GkpETsKFVSbGys0KlTJ0FDQ0NwcHAQbt269a/7l5aWCvn5+UJBQYGCEpZnY2MjTJgw4Z3nQ0JCBB0dHUEQBGH48OFC7dq1/zHjkydPBGNjY8HT0/O9xwuCIHTq1ElwdHR87/G3b98WJBKJkJmZWe55e3t7Ydy4cYIgCMLp06cFAMKJEyfKtp87d04AINy/f7/cPk+fPv2Ql070XsOHDxdUVFQEHR2dcg8tLS0BgHD37l3h7t27AgDhypUrgiD8/5+9vXv3ljvX119/LcybN++91/n111+F6tWrC9nZ2WXPvT1PWlqaIAiCMGXKFMHKyqps+9mzZwWpVCo8ePDgH/M7OjoKHh4en/z6SblwZCcRyczNmzdhYWHxycfr6em9M92AqDLgyE4CgMDAQDx//hz+/v5iRyEShbm5OY4fP45ly5Zh8uTJ6NevH1JTU8WOVcbHxwevX7/GqlWrxI5S5aSnp8Pd3R2ZmZl4/PgxwsPDYWZm9q/HSCQSaGpqQkNDQ0EpP03z5s3fm7GwsBA//PADmjZtiuXLl//j8XFxcejatet7t127dg2CIMDCwgK6urplj2PHjuHOnTvl9m3RokXZf3/11VcAgCdPnnzKSyL6R9bW1oiPjy/32LFjx/88rl27duX+7OnpiYULF8LS0hJ+fn64evVq2babN2+iRYsWqFatWtlznTp1glQqRXJyMgBg6NChOH/+fNlo/e3bt8PW1rZsVk1JSQn8/f3RokUL6OvrQ1dXF/v378e9e/c++++AlAOLnUQkE4IgfHbvORsbG5w/f15GiYiUB4uddOnSJQQEBGDnzp1QU1MTOw6RaCQSCfr27YvExETY2dmhc+fO8PHxKeu1JiYVFRVs3boVCxcuLPvCTPLz119/lf13w4YNy6auP378GL///jvc3d0xZ86ccn36lEn16tXf+3P74sWLcotz6ujovPf4sWPHIisrC7t37/7kz9ClpaWQSCS4cuVKueLSzZs3sXnz5nL7/ufvnre9ULl4Ismatr
Y2GjduXO5Rr169/3ncf/878fDwwN27d+Hu7o7U1FR06tQJ8+bNA/Dme+c/9fN9+3zbtm1hbm6OHTt2oKioCHv27Cmbwg4Ay5Ytw/Lly+Hj44OoqCjEx8fj+++//6BFxKhiYLGTiGQiPz//nWbqH0tFRYWrQFKlZGpqqlSjl0ixXrx4gSFDhmDdunVo0KCB2HGIlIK6ujq8vLyQmJiIrKwsmJubY9OmTaIXXxo1agR/f38MGzZM6XqLVgalpaVYuHAhmjVrBgcHB0yfPr2sL2evXr3w4sULfPPNNxg/fjy0tbURHR0NZ2dnLFiwQCkK4v+pSZMmZSMr/9O1a9fQpEmTfz122bJlOHLkCI4ePYrq1av/676tW7f+xz6CrVu3hiAIePz48TsFJvaFpoquXr16GD16NMLDwzF//nz8+uuvAAALCwskJCTg1atXZfteuHABpaWlaNq0adlzLi4u2L59O06cOIHc3FwMHDiwbNu5c+fQv39/uLq6olWrVmjUqBE/q1cyLHYSkUwUFRXJZLTSf39gJKoMGjVqhIyMDBQXF4sdhRRMEASMHDkS/fr1w4ABA8SOQ6R0DAwMsHHjRhw9ehQhISHo0KGD6LM8Ro8ejTp16mDhwoWi5qhsMjIy0K1bNxw6dAh+fn7o1asXIiIisGbNGgBvZvj06NEDEydORFRUFNasWYMzZ84gMDAQoaGhOHPmjMivoLxx48YhPT0dkyZNQkJCAm7duoXAwEDs3LkT3t7e/3jc77//jlmzZmHt2rXQ0tLC48eP8fjx438s5s6ePRt79uyBn58fkpOTkZSUhMDAQOTl5cHMzAwuLi5wc3PD3r17kZ6ejtjYWCxbtgz79++X10snkjtPT0+cOHEC6enpiI+Px4kTJ8rapbm4uEBHRwfDhg3DjRs3cObMGYwZMwYDBgxA48aNy84xdOhQJCcnY86cOfjuu+/K3VgwMzNDVFQUzp07h5SUFEycOBF3795V+Osk+WGxk4hkolq1asjOzhY7BpFS0tLSgoGBAfsAVUHBwcFIT0/H0qVLxY5CpNTatm2Ls2fPwsvLC0OGDIGzszMePHggShaJRIJNmzZh3bp1uHz5sigZKqOzZ88iMzMTx44dg5OTE2bNmoWGDRuiuLgYr1+/BgCMHDkSEydOhJGRUdlxnp6eyMvLw61bt8SK/l4NGzbEmTNnkJaWhh49eqBDhw7YtWsX9uzZgz59+vzjcefOnUNRUREGDx6ML7/8suzh6en53v379OmDAwcOICIiAq1bt4aNjQ1Onz4NqfTNV/mQkBC4u7vD19cX5ubm6NevH86cOQNjY2O5vG4iRSgtLcWkSZNgYWGB7t27w8DAAFu2bAHwZqp8ZGQksrOz0aFDB9jb28PS0vKd1g3GxsawsrJCQkJCuSnsAODn54cOHTqgd+/esLa2ho6ODlxcXBT2+kj+JAKHURGRjOzbt6/c9ICPlZaWhry8PLRs2VKGqYiUQ7du3eDj44OePXuKHYUUJD4+Ht27d8eFCxdgamoqdhyiCiM3NxdLlizBmjVr4OnpCW9vb2hpaSk8x549ezBnzhxcu3YN2traCr9+ZTN//nxERUVhy5YtaNCgAQRBgL29Pdzd3fHDDz+8s78gCBAEAa9fv4aJiQk8PDy4wBsREX0QjuwkIpn5p0btH+r69essdFKlxUWKqpZXr17B0dERQUFBLHQSfSQdHR389NNPiI2NxY0bN9C0aVPs2bNH4a1uHBwc0LZtW8yYMUOh162sBg8ejBcvXmDkyJEYOXIkqlWrhsuXL8PLywtjx45953ekRCKBVCpFSEgIvvrqK4wcOVKk5EREVNGw2ElEMmNnZ4dTp0590rF5eXmijNogUhQWO6sOQRAwbtw4dOnSBc7OzmLHIaqwGjRogPDwcGzZsgX+/v6ws7NDQkKCQjP88ssvOHDgAE6ePKnQ61ZG5ubmOHDgQNk0682bNyMlJQULFixAamoqvLy8ALz5TLh+/Xps2L
ABVlZWWLBgAUaOHAljY2P2diciog/CYicRyYyqqir09fWRkpLyUccJgoDw8HB069ZNTsmIxMdiZ9URGhqKuLg4rFq1SuwoRJWCjY0Nrl69CicnJ/Ts2RNjx47F06dPFXLtWrVqYfPmzRgxYgSysrIUcs3KrGHDhkhOTkbnzp0xePBg1KxZEy4uLujduzcyMzPx9OlTaGtr4/79+1i5ciW6dOmCtLQ0jB8/HlKpFBKJROyXQEREFQCLnUQkU9bW1sjIyEBycvIH7V9cXIywsDD88MMPUFdXl3M6IvGYmpoiNTVV7BgkZ8nJyfDx8UF4eDh7/BHJkIqKCsaMGYObN29CS0sLzZo1Q1BQEIqKiuR+7e7du8Pe3h6TJ0+W+7Uqk6KiondGYgqCgGvXrsHS0rLc85cvX0b9+vVRrVo1AMD06dORlJSExYsXQ1dXV2GZiYiocmCxk4hkrlevXvj777+xb98+/PXXX+/dp6SkBKdOncKePXswaNAg1KhRQ8EpiRSrYcOGuH//vkK+mJM48vLy4OjoiICAADRr1kzsOESVUq1atRAYGIjo6GgcP34cLVq0QGRkpNyvu2TJEly+fBl79+6V+7Uquri4ODg5OcHJyemdbRKJBG5ubli3bh1WrVqFO3fuwM/PDzdu3ICLiws0NTUBoKzoSURE9Cm4GjsRyY0gCDh37hz++usv5Ofno6CgAIaGhmXFHhsbG+jr64uckkhxGjVqhIiICJiZmYkdheRg9OjRyM3NxbZt2zjVkkgBBEHAsWPHMHXqVDRt2hTLly+X64Jgly5dwnfffYf4+Hh8+eWXcrtORSQIAk6dOoWAgAAkJydj6tSpGDVqFKpXr/7OvkVFRXByckJiYiIKCwuhr68Pf39/9OjRQ4TkRFSVXL9+Hb1790ZGRgbU1NTEjkNyxGInESnExo0bERMTg02bNokdhUg0vXr1wqRJk9C3b1+xo5CM7dq1C3PmzMG1a9c4IolIwV6/fo1Vq1YhICAAI0aMgJ+f33uLbLLw9t/50aNHeVMDb2bq7N+/HwEBAcjNzYWvry9cXFw+qDXRrVu3oKKigsaNGysgKRHRG3Z2dhg9evR7R59T5cFp7ESkEFlZWahVq5bYMYhExUWKKqfbt29j0qRJ2L17NwudRCLQ0NCAj48PEhMT8fz5c5ibmyMkJASlpaUyv9acOXPw+PFjbNy4Uebnrkjy8/Oxbt06NGnSBIGBgZgzZw6SkpLg7u7+wT3YmzRpwkInESnclClTsHLlSrFjkJyx2ElECsFiJxGLnZXR69ev4ejoiLlz56JNmzZixyGq0gwNDbFp0yYcPnwYGzduRIcOHXDhwgWZXkNdXR1hYWGYNWsW0tPTZXruiiArKwuLFi1Cw4YNcezYMYSGhuLChQuwt7eHVMqvlkSk/Pr164enT5/i4sWLYkchOeJvJCJSCBY7iVjsrIx8fX1hbGyMCRMmiB2FiP5Pu3btcO7cOUybNg2Ojo5wcXHBgwcPZHZ+CwsLzJo1C8OGDUNJSYnMzqvMHjx4AG9vbzRu3Bi3bt3CyZMnceTIEVhZWYkdjYjoo6ioqGDSpEkICgoSOwrJEYudRKQQLHYSsdhZ2Rw8eBCHDh3Cpk2b2LuPSMlIJBI4OzsjJSUFDRs2RKtWrbBw4ULk5+fL5Pyenp5QVVXF8uXLZXI+ZXXz5k24u7ujRYsWKCkpQVxcHLZs2YLmzZuLHY2I6JONGDECkZGRMr0RRsqFxU4iUggWO4mABg0a4NGjRygoKBA7Cn2mzMxMjBkzBrt27eJ7G5ES09HRwYIFCxAbG4uEhARYWFhg3759+Nw1WqVSKbZs2YKlS5fi+vXrMkqrPN5OTbe1tUWjRo1w+/ZtBAYGon79+mJHIyL6bDVq1MDQoUOxdu1asaOQnLDYSUQKwWInEaCqqgpjY+Mq2eetMikqKoKTkxO8vb3xzTffiB2HiD5AgwYNsGfPHoSEhGD+/Pn49ttvP7tIaWxsjKVLl8
LV1RWvX7+WUVLxlJaWlk1NHzp0KHr27ImMjAz4+flBT09P7HhERDI1adIkbNy4UWYj/km5sNhJRArBYifRG5zKXvHdvXsXenp68PLyEjsKEX0kW1tbXL16FY6OjujevTvGjRuHZ8+effL5hg8fDhMTE8ybN092IRWssLAQW7ZsQYsWLTB37lxMnDgRqampGD9+PLS0tMSOR0QkF6ampujQoQO2b98udhSSAxY7iUgh0tLSYGZmJnYMItGx2FnxmZqa4vDhw1x5mKiCUlVVxdixY5GSkgINDQ1YWFhg1apVKCoq+uhzSSQS/PrrrwgNDcX58+flkFZ+cnJyEBgYiMaNGyMsLAyBgYG4evUqhgwZAlVVVbHjERHJnaenJ1auXPnZrU1I+fBTOhERkQKx2FnxSSQSFjqJKoFatWph5cqV+OOPP3D06FG0bNkSv/3220efp06dOli3bh2GDRuGnJwcOSSVrSdPnsDPzw8mJiaIiYnBgQMH8Pvvv6N79+5cbI2IqpRu3bpBEAScOnVK7CgkY/ykTkREpEAsdhIRKRcLCwtERkYiICAAEyZMgL29PW7fvv1R57C3t4e1tbVSt7e4c+cOxo8fD3Nzczx//hwxMTEIDw9H27ZtxY5GRCQKiUQCT09PBAUFiR2FZIzFTiIiIgVisZOISPlIJBL0798fiYmJ6Ny5M7755htMnz4dr169+uBzBAUFITIyEsePH5dj0o937do1ODo6omPHjqhVqxZu3ryJ4OBgNG7cWOxoRESiGzp0KGJiYj76JhcpNxY7iYiIFMjIyAjPnj1DXl6e2FHoPW7evIm9e/fizJkzePTokdhxiEjBNDQ04Ovri8TERDx9+hRNmjRBaGgoSktL/+ex1atXR2hoKEaNGoXnz58rIO0/EwShbGq6vb09OnbsiLt378Lf3x8GBgaiZiMiUiba2toYOXIkVq9eLXYUkiEWO4lIZiQSCfbu3Svz8y5btgwNGjQo+/O8efPQvHlzmV+HSBFUVFRgYmLCu8dK6ODBgxg8eDDGjx8PBwcHbNmypdx2Nq8nqjoMDQ2xefNmHDp0COvXr0fHjh0RExPzP4+ztbXFkCFDMG7cOFHeM0pKShAeHo527dph8uTJcHFxwZ07dzBt2jRUq1ZN4XmIiCqC8ePHIywsDNnZ2WJHIRlhsZOoCnNzc4NEIsHIkSPf2ebr6wuJRIJ+/fqJkOzfeXt7Izo6WuwYRJ/MzMyMU9mVzJMnT+Du7o6RI0ciLS0NPj4++PXXX5GdnQ1BEFBQUMCFO4iqoPbt2+PChQuYMmUKHBwc4OrqiocPH/7rMf7+/khKSsLOnTsVlBLIz89HcHAwzMzMEBQUhLlz5yIxMRFubm5QV1dXWA4ioorIyMgI3bt3R0hIiNhRSEZY7CSq4oyMjLB7927k5uaWPVdcXIywsDDUr19fxGT/TFdXF/r6+mLHIPpk7NupfJYsWQJbW1t4enqiRo0a8PDwQJ06dTBixAh88803GDduHK5evSp2TCISgUQigYuLC1JSUmBsbIyWLVvC398fBQUF791fU1MTYWFhmDJlCh48eCDXbFlZWfD394eJiQkiIiKwdetWnD9/Ht999x2kUn7VIyL6UJ6enli1ahVKSkrEjkIywN+ARFVcixYtYGpqivDw8LLnjh07Bk1NTdja2pbbNyQkBBYWFtDU1ISZmRkCAwPf6WH1999/w8HBATo6OmjYsCG2bdtWbvuMGTPQpEkTaGlpoUGDBvD19X3ny8KSJUtoTZfdAAAgAElEQVRgaGgIXV1dDBs2DDk5OeW2//c09itXrqBHjx6oXbs2qlevDisrqw+aakYkFhY7lY+Wlhby8/ORlZUFAPDz80NGRgasra3Rq1cv3L59Gxs3bkRhYaHISYlILLq6uli4cCGuXLmCuLg4WFhYYP/+/e+drt6mTRtMnjwZ7u7uKC0thSAIOHv2LA4dOoQjR47g8OHDOHToEKKioj7pi/X9+/fh5eWFRo
0aIS0tDVFRUTh8+DA6d+4si5dKRFTlWFpaQl9fH8eOHRM7CskAi51EBA8PD2zevLnsz5s3b4a7u3u5KZsbNmzArFmzMH/+fNy8eRPLly9HQEAA1q5dW+5c8+fPh729PRISEuDo6IgRI0YgMzOzbLuOjg42b96MmzdvYu3atdi1axf8/f3LtoeHh8PPzw8//fQTrl27hiZNmmDFihX/mv/Vq1dwdXXF2bNncfnyZbRq1Qp9+vTBs2fPPvevhkgu/h979x3W1NmwAfwOGxFBtoCKksSBq7j3tra4aRU3gqN1oRarfbV1t1ZtFbW2LkRRaxW0zmrrqgP3qgNlCagoU5G9cr4//MxbXhyMwEnI/bsurjY5Izf8EXPuPOd5WHaqHxsbG4SEhGDGjBnw9vbG+vXrcejQIUydOhULFiyAu7s7duzYwUWLiAh16tRBUFAQNm3ahPnz56N79+74559/iuw3e/ZspKamYs6cOdi7dy/kcjn69++Pvn37ol+/fujfvz9cXV1x4MABBAcHIysr672vfe/ePXh6eqJp06YAgFu3biEgIAAuLi4q/z2JiLSJRCKBj48P/Pz8xI5CqiAQkdYaPXq04ObmJqSkpAhGRkZCWFiY8PTpU8HAwECIiYlRbhcEQahZs6awbdu2QsevXLlSaNCggfIxAGH27NnKx3l5eYKxsbEQGBj41gw///yz4OzsrHzctm1bYezYsYX26d69u1C7dm3l43nz5gkuLi5vPadCoRDs7Oze+bpEYnr06JFgZ2cndgz6H8uWLRMGDx4sfPfdd4Krq6sQHx8v5OfnC4IgCJcuXRJcXV2F0NBQkVMSkTrJy8sT1q1bJ9jY2AgTJ04UkpKSlNvS0tKE1atXC5mZmcU6z9atW4XExMQ3bj937pzQt29fwdbWVli8eLGQkpKist+BiIheycnJEWrUqCH8888/YkehMuLITiJC9erVMXDgQPj7+2Pr1q3o0qVLofk6ExMT8ejRI0yYMAFVq1ZV/syePRuRkZGFztWkSRPl/+vp6cHa2hoJCQnK54KCgtChQwflberTp09HbGyscntoaCjatm1b6Jz/+/h/JSQkYMKECZDL5TAzM4OpqSkSEhIKnZdIndjb2+Ply5dc8VFkeXl5SE5OVj6eOXMmdu3ahcGDByMvLw95eXnQ1dWFIAj44YcfYGVlhfr164uYmIjUjZ6eHj7//HOEhoZCV1cXDRo0wJo1a5CZmYk9e/Zg4sSJMDY2LtZ5Ro4ciaNHjyrnUVcoFMpb00eNGoWPPvoIDx8+xJw5c1C9evXy/tWIiLSOgYEBJk6cyNGdlYCe2AGISD14eXlh9OjRqFq1KhYuXFho2+t5OX/55Re0a9funefR19cv9FgikSiPv3jxIjw8PDBv3jysXLkS5ubmOHDgAHx9fcuUffTo0YiPj8fKlSvh5OQEQ0NDdO/enXPrkdrS0dGBs7MzIiIi4OrqKnYcrRQQEIDDhw/j2LFjGDp0KFatWgVjY2NIJBLUqlUL1apVQ/PmzdG3b1/ExcUhNDQU169fFzs2EakpCwsLrF69GhMmTMC0adNw6NAh7N+/H7q6usU+h0QiwdChQ7Fnzx5kZ2dj+fLlMDIywqxZs+Du7l6icxERUem8HkSzdOlSWFlZiR2HSokjO4kIANC9e3cYGBggKSkJAwYMKLTN1tYWDg4OiIyMhFQqLfJTXOfPn4eDgwO+/vprtGzZEjKZrNB8ngDQoEEDXLx4sdBz//v4f507dw5TpkyBm5sbXFxcYGpqynn1SO3J5XLO2ymS48eP44svvkD9+vWxfPlybNy4sdC8xXp6ejhy5AiGDRuG69evo1mzZti7dy/Mzc1FTE1EmsDFxQV//PEHPDw8YGRkVOLjdXV18eLFC2zbtg1+fn64evUqBg8ezKKTiKiCWFtbY+DAgdiwYYPYUagMOLKTiAC8Gk3wzz//QBAEGBoaFtk+f/58TJkyBebm5vj444+Rl5eH69ev48mTJ/
jqq6+K9RpyuRxPnjzBjh070LZtWxw7dgy//vproX18fHwwatQotGzZEl26dEFQUBAuXboECwuLd553+/btaN26NTIyMvDll1/CwMCgZH8AogrGRYrEkZWVBW9vb8ydOxfTp08HAERHRyM9PR0LFy6ElZUVZDIZevbsiR9//BHZ2dmlKiyISHudPXsW/fr1K/XxY8aMgYODA3r06KHCVEREVFw+Pj5wc3PDzJkzi9y5SJqBZScRKZmamr5129ixY2FiYoLly5fjq6++grGxMVxcXDB58uRin79v376YOXMmpk2bhqysLPTq1QsLFy7ExIkTlfsMGTIEUVFRmDNnDjIzM9GvXz/MmDEDAQEBbz2vv78/xo8fj+bNm8Pe3h7z589HYmJisXMRiUEmk+Hvv/8WO4bW+eWXX+Dq6govLy/lc3/99RdevHiBmjVr4smTJ7CysoKjoyMaNGjwxi9/iIjeJTU1FZaWlqU+3tDQEAUFBSpMREREJdG0aVPIZDIEBQVh6NChYsehUpAIgiCIHYKIiEjbnD17FrNmzUJISIjYUbTKxYsXERMTA3d3d+jp6WHp0qVYtmwZzpw5g0aNGiElJQXOzs74/PPP8e2334odl4g00MGDB9G3b1/Rz0FERKX3+++/Y+nSpe+dUo3UE+fsJCIiEgFvYxdHmzZtMGjQIOjp6SEvLw/16tXDX3/9hUaNGkGhUMDCwgK9evVC1apVxY5KRBqKY0mIiDRf3759kZCQwLJTQ7HsJCIiEoGtrS2ys7Px/PlzsaNohZcvXyr/X0/v1Sw++vr66N+/P5o3bw4A0NHRQVpaGqKiolC9enVRchIRASxMiYjEpquriylTpsDPz0/sKFQKLDuJiIhEIJFIOLqzgkyfPh3ff/89YmJiALz6278uEnR0/vtRSKFQYMaMGcjPz8fnn38uSlYi0nw6OjrIzs4u9fEKhQJ5eXkqTERERKXh5eWFY8eOIT4+XuwoVEIsO4mIiEQil8tZdpazzZs3w8/PD35+fvjyyy9x6dIl5OfnQyKRFNrv1q1b8PLywp9//on9+/eLlJaIKoPu3bvjxIkTpT7+3Llz6NixowoTERFRaZiZmSE6Oho2NjZiR6ESYtlJREQkEo7sLF8pKSkICgrC0qVLsX//fly+fBne3t4IDg7GixcvCu1bp04dtGrVClu2bEGtWrVESkxElYGxsTGysrJKfSt6QkICL6yJiNSEqalpkS/JSf2x7CQiIhIJy87ypaOjg169esHFxQXdu3dHaGgoZDIZJkyYgB9//BFRUVEAgLS0NAQFBWHMmDHo1q2byKmJqDLo1q0bgoODS3zckSNH0Lp163JIREREpcGiUzNJBM5+TUTl6IcffsDjx4+xcuVKsaMQqZ0LFy7Ax8cHly9fFjtKpZWVlQVjY+NCz61cuRJff/01evTogS+++AJr165FdHQ0Ll26JFJKIqqMYmJicPXqVQwaNKhYF8t//PEHnJyc0KBBgwpIR0REVHnpiR2AiCq358+fc1Vjord4PbJTEAR+a1xO/l10FhQUQFdXF9OnT0enTp0wcuRI9OnTB5mZmbh9+7aIKYmoMqpduzZMTEywe/duVKtWDR9++GGhRdGAV6uuX7x4EY8fP0br1q05jQYRkQbJyMjAhQsXUL16ddSvXx8mJiZiR6L/x7KTiMrV8+fPUb9+fbFjEKklS0tLAEBycjKsrKxETlP56erqQhAECIKA5s2bY+vWrWjdujV27NjB9ykiKhdWVlYYMmQIOnTogBs3bqBhw4aF3ovy8/PRunVrtG3bVuyoRERUAsnJyfDw8EBiYiLi4+Ph5uaGTZs2iR2L/h9vYyeicvX6LYaj1ojerFWrVli1ahXatWsndhStkpKSgjZt2qBevXo4ePCg2HGIqBKLiIhA+/bt8ejRIxgYGIgdh4iISkGhUODIkSPYsGEDWrVqBalUioULF2LVqlUwMjLCuHHj8NVXX8HT01PsqAQuUERE5Uwikb
DoJHoHLlJUvt72na4gCBg2bBiLTiIqd/7+/hgxYgSLTiIiDebp6YkvvvgCzZs3x5kzZ/DNN9+gV69e6NWrFzp16oTx48djzZo1Ysek/8eyk4iISERyuZxlZzlJTExEbm7uGwtPS0tLzJs3T4RURKRN8vPzERAQAG9vb7GjEBFRKT148ACXLl3CuHHjMG/ePBw7dgwTJ07E7t27lfvUqFEDhoaGSExMFDEpvcayk4iISEQc2Vk+8vPz8cknn2DlypVvHV3OUedEVN5er7DesGFDsaMQEVEp5ebmQqFQwMPDA8Crz5AeHh5ITk6Gj48PlixZgmXLlsHFxQXW1tZvvbOIKg7LTiIiIhGx7CwfixYtgr6+PmbOnCl2FCLSYps3b+aoTiIiDde4cWMIgoBDhw4pnztz5gxkMhlsbGxw+PBh2NvbY/To0QD4hbo64AJFREREInrx4gVq1qyJly9f8oORipw8eRIjRozA9evXYWdnJ3YcItJSz549Q4MGDRAbGwtTU1Ox4xARURls3LgRa9euRffu3dGiRQvs3LkTdnZ22LRpE548eYJq1arxvV6N6IkdgIiISJuZm5vDyMgI8fHxLOZUID4+HiNHjsTWrVv59yQiUW3duhXu7u68+CUiqgTGjRuHtLQ0bN++Hfv374elpSXmz58PAHBwcADwar54a2trEVPSaxzZSUREJLJ27dph6dKl6NSpk9hRNJpCocBHH32EFi1aYMmSJWLHISItJggC6tevj4CAALRt21bsOEREpCLx8fFITU2FXC4HAKSmpmL//v346aefYGhoCGtrawwaNAj9+vXjl10i4pydRKQyBQUFhR7zuxSi4uG8naqxbNkyZGRkYMGCBWJHISItJ5FI8ODBAxadRESVjI2NDeRyOXJzc7F48WLIZDJ4enoiMTER7u7uqFOnDrZs2YKxY8eKHVWr8TZ2IlIZXV3dQo8lEgkSExORnZ0Nc3NzfrNF9BZyuZxlZxmdP38eK1euxNWrV6Gnx483RERERKR6EokECoUCCxcuxJYtW9ChQweYm5sjOTkZZ8+eRVBQEMLCwtChQwccPXoUvXv3FjuyVuLITiJSiezsbIwfPx55eXkAgNzcXKxbtw7e3t4YN24cpk2bhps3b4qckkg9cWRn2aSkpGDYsGHYtGkTatasKXYcIiIiIqrErl69ih9++AG+vr5Yv349/P39sW7dOsTExGDFihWQy+Xw8PDAjz/+KHZUrcWyk4hUIj4+Hps2bYK+vj5yc3Oxdu1aTJs2DSYmJpDJZLh48SJ69OiBmJgYsaMSqR2WnaUnCALGjBkDd3d39O3bV+w4RERERFTJXbp0Cd26dYOPj49yQSIHBwd069YN9+7dAwD07t0bDRs2RHZ2tphRtRbv8yIilUhJSYGZmRkA4OHDh9i4cSNWrVqFiRMnAng18rN///74/vvvsW7dOjGjEqkdqVSKyMhIKBQK6Ojwe8iSWL16NeLi4rBnzx6xoxARERGRFrC0tERoaCjy8/NhYGAAAAgLC8O2bdvg6+sLAGjTpg3atWsHIyMjMaNqLV5REZFKJCQkoHr16gCgfNMfNWoUFAoFCgoKYGRkhE8//RS3bt0SOSmR+jE1NUW1atUQFxcndhSNcvXqVSxevBi//fab8oMmEZHY5s+fj0aNGokdg4iIysmwYcOgq6uL2bNnw9/fH/7+/pg7dy5kMhkGDRoEALCwsIC5ubnISbUXy04iUonU1FRER0fDz88PS5YsAQDk5ORAR0dHuXBRWlpakRXbiegV3speMqmpqfDw8MBPP/2EunXrih2HiDSEp6cnJBKJ8sfKygp9+vTB/fv3xY5WIU6fPg2JRIKkpCSxoxARabSAgADExcVhwYIFWLVqFZKSkjB79mzUqVNH7GgE3sZORCpiZWWFZs2a4eDBg0hOToZcLsfTp09haWkJ4FXRGRoaCrlcLnJSIvUkk8kQFhaGrl27ih1F7QmCgPHjx6Nnz54YPHiw2H
GISMP06NEDgYGBAIC4uDjMnDkTAwcORGhoqMjJ3i03N5ej2ImI1ET79u3RunVrPHv2DM+fP0fjxo3FjkT/wpGdRKQSXbp0wV9//YV169Zh/fr1mDlzJmxtbZXbw8PDkZ6ejt69e4uYkkh9yeVyjuwspo0bN+L+/ftc4ZKISsXQ0BB2dnaws7ODq6srpk+fjvv37yMrKwvR0dGQSCS4evVqoWMkEgmCgoKUj+Pi4jB8+HBYWlqiSpUqaNasGU6dOlXomF27dsHZ2RmmpqYYMGBAodGUV65cQa9evWBlZYVq1aqhQ4cOuHDhQpHX/OmnnzBo0CCYmJjgP//5DwDg3r17cHNzg6mpKWxsbDB06FA8e/ZMedzt27fRvXt3VKtWDaampmjatClOnTqF6Oho5Rdq1tbWkEgk8PT0VMnflIhIG+np6cHR0ZFFpxriyE4iUokTJ04gLS1NOUfJa4IgQCKRwNXVFTt37hQpHZH6k8lkCAkJETuG2rt9+zbmzJmDs2fPwtjYWOw4RKTh0tLS8Ntvv6Fx48bFfk/JyMhA586dYWNjg3379sHBwaHInOTR0dH47bffsG/fPmRkZMDDwwNz5szB+vXrla87cuRI+Pn5QSKRYO3atfj4448RHh4OKysr5XkWLFiAb7/9FitWrIBEIsHTp0/RqVMneHt7Y8WKFcjLy8OcOXPQr18/XLx4ETo6Ohg2bBiaNm2Ky5cvQ09PD7dv34aRkRFq1qyJ4OBguLu74+7du7CwsOD7KBERVUosO4lIJfbu3Yv169ejd+/eGDJkCPr27QsLCwtIJBIAr0pPAMrHRFQY5+x8v4yMDAwePBg//PAD6tevL3YcItJQR48eRdWqVQG8el+pWbMmjhw5Uuzjd+7ciWfPnuHChQvKYtLZ2bnQPvn5+QgICICZmRkAYPz48diyZYtye7du3Qrtv2bNGgQHB+Po0aMYMWKE8vkhQ4Zg7NixysfffPMNmjZtiu+//1753LZt22BhYYGrV6+iVatWiImJga+vr/J9UiqVKve1sLAAANjY2BQqVYmIqGxeX+8CvOZVB7yNnYhU4t69e/jwww9hYmKCuXPnYvTo0dixY4dydenXCwEQ0Zs5Ozvj4cOHXMTrHSZPnozWrVtj1KhRYkchIg3WqVMn3Lx5Ezdv3sSlS5fQrVs39OrVC48ePSrW8Tdu3ECTJk3eWRbWrl1bWXQCgL29PRISEpSPExISMGHCBMjlcpiZmcHU1BQJCQmIjY0tdJ4WLVoUenzt2jWcOXMGVatWVf7UrFkTABAZGQkAmDFjBsaOHYtu3bphyZIlWrP4EhGRmCQSCZYsWQJ/f3+xoxBYdhKRisTHx8PLywuBgYFYsmQJcnNzMWvWLHh6emL37t2FPuATUVFVqlSBlZVVsS+2tU1gYCAuXLiAtWvXih2FiDRclSpVIJVKIZVK0apVK2zevBkvX77Ehg0boKPz6vLo3yN08vLyCh3/721vo6+vX+ixRCKBQqFQPh49ejSuXLmClStXIiQkBDdv3oSjoyNyc3MLHWdiYlLosUKhgJubm7Ksff0THh6OPn36AADmz5+Pe/fuYcCAAQgJCUGTJk148U1EVAFatWoFPz+/Yv07QeWLZScRqURaWhqMjIxgZGSEUaNG4ciRI1i1ahUkEgnGjBmDfv36ISAgoMiHeCL6L97K/mYPHjzAjBkzsHv3buWtp0REqiKRSKCjo4PMzExYW1sDAJ4+farcfvPmzUL7u7q64p9//im04FBJnTt3DlOmTIGbmxtcXFxgampa6DXfxtXVFXfv3kXt2rWVhe3rH1NTU+V+MpkMU6dOxeHDh+Ht7Y1NmzYBgHI1d95FQESkej179kR+fn6RBeuo4rHsJCKVyMjIUF4g5OfnQ1dXF5988gmOHTuGP/74A/b29vDy8lLe1k5ERclkMoSFhYkdQ61kZWVh8ODBWLx4MZo0aSJ2HCKqBHJycvDs2TM8e/YMoaGhmDJlCtLT09G3b1
8YGxujTZs2+P7773H37l2EhITA19e30PHDhg2DjY0NBgwYgLNnz+Lhw4c4cOBAiS5u5XI5tm/fjnv37uHKlSvw8PBQFpHvMmnSJKSmpmLIkCG4dOkSoqKicPz4cYwfPx5paWnIysrCpEmTcPr0aURHR+PSpUs4d+4cGjZsCODV7fUSiQSHDx9GYmIi0tPTS/bHIyKit5JIJPDx8YGfn5/YUbQey04iUonMzEzl3FR6eq/WPlMoFBAEAZ06dcLevXtx69YtODo6ihmTSK1xZGdRX3zxBerXr4/x48eLHYWIKonjx4+jRo0aqFGjBlq3bo0rV65gz5496NKlCwAob/lu2bIlJkyYgMWLFxc63sTEBH///TccHBzQt29fuLi4YN68eSWam9zf3x/p6elo3rw5PDw84OXlBScnp/ceZ29vj/Pnz0NHRwe9e/eGi4sLJk2aBENDQxgaGkJXVxfPnz/H6NGjUa9ePQwcOBBt27bFjz/+CABwcHDAggULMGfOHNja2mLy5MnFzkxERO83cuRIhISEKOdRJnFIBE4mQEQqkJKSAnNzc+VcV/8mCAIEQXjjNiL6rwMHDmD9+vU4fPiw2FHUQlBQEGbNmoXr168XWuiDiIiIiEhdzZo1Czk5OVi1apXYUbQWy04iIiI1ERoaiv79+/NWdgBRUVFo06YNDh8+jJYtW4odh4iIiIioWGJjY9GsWTNER0ejWrVqYsfRShxmRUTl4vVoTiIqvrp16yI2Nhb5+fliRxFVbm4uPDw88J///IdFJxERERFplFq1aqFHjx4ICAgQO4rWYtlJROXiwoULOHfunNgxiDSKoaEhatSogejoaLGjiOqrr76CnZ0dfHx8xI5CRERERFRiPj4+WL16NRQKhdhRtBLLTiIqF8eOHcOJEyfEjkGkcbR9kaJDhw5hz5492LJlS4kW+yAiIiIiUhft2rVD9erVORe/SFh2ElG5eP78OapXry52DCKNI5PJtHbOzsePH2Ps2LHYuXMnLC0txY5DRERERFQqEokEPj4+8PPzEzuKVmLZSUTlgmUnUelo68jO/Px8DB06FD4+PujQoYPYcYiI3qlt27Y4dOiQ2DGIiEiNDR48GPfu3cOdO3fEjqJ1WHYSUblg2UlUOnK5XCvLzvnz58PY2BizZs0SOwoR0TvdvXsXsbGx6N27t9hRiIhIjRkYGOCzzz7j6E4RsOwkonLBspOodLRxZOfx48exZcsWBAYGQkeHH02ISL1t3rwZnp6e0NPTEzsKERGpuc8++wxBQUFISkoSO4pW4RUFEZULlp1EpePk5IS4uDjk5uaKHaVCPHv2DKNGjcK2bdtga2srdhwionfKycnB9u3b4eXlJXYUIiLSADY2NhgwYAA2btwodhStwrKTiMoFy06i0tHX10fNmjURFRUldpRyp1AoMHLkSIwdOxbdu3cXOw4R0XsdOHAAjRo1grOzs9hRiIhIQ/j4+OCnn35CXl6e2FG0BstOIioXLDuJSk9bbmVfunQpcnJy8M0334gdhYioWDZv3gxvb2+xYxARkQZp1qwZpFIpgoODxY6iNVh2EpHKZWVlAQCMjY1FTkKkmbSh7Dx79ixWr16NnTt3ct47ItIIsbGxuHLlCgYNGiR2FCIi0jA+Pj5cqKgCsewkIpXjqE6ispHJZAgLCxM7RrlJSkrC8OHDsXnzZjg6Ooodh4ioWLZs2YKhQ4fyy1wiIiqxfv364dmzZ7h8+bLYUbQCy04iUjmWnURlI5fLK+3ITkEQMGbMGAwePBhubm5ixyEiKhaFQoEtW7bwFnYiIioVXV1dTJ48maM7KwjLTiJSOZadRGVTmW9jX7VqFRISEvDtt9+KHYWIqNhOnDgBCwsLfPDBB2JHISIiDeXt7Y0//vgDT548ETtKpceyk4hUjmUnUdnUqlULiYmJyvlvK4vLly/ju+++w65du2BgYCB2HCKiYtu0aRPGjh0rdgwiItJg5ubmGDZsGH7++Wexo1R6LD
uJSOVYdhKVja6uLpycnBAZGSl2FJVJTU2Fh4cHfv75Z9SpU0fsOERExZaUlIRjx45h2LBhYkchIiINN2XKFGzYsKHSDWpQNyw7iUjlWHYSlV1lupVdEASMHTsWH330Edzd3cWOQ0RUItu3b0efPn1gbm4udhQiItJw9erVQ8uWLbFz506xo1RqLDuJSOVYdhKVXWUqO9evX4/w8HD88MMPYkchIioRQRCwefNm3sJOREQq4+PjAz8/PwiCIHaUSotlJxGpHMtOorKTyWQICwsTO0aZ3bp1C19//TV2794NIyMjseMQEZXIlStXkJWVhc6dO4sdhYiIKomePXsiPz8fp0+fFjtKpcWyk4hUjmUnUdlVhpGd6enpGDx4MFauXAm5XC52HCKiEtu0aRO8vLwgkUjEjkJERJWERCLB1KlT4efnJ3aUSotlJxGpHMtOorKTy+UaX3ZOmjQJ7du3x4gRI8SOQkRUYhkZGQgKCoKnp6fYUYiIqJIZOXIkzp07V6kWJFUnLDuJSOVYdhKVnYODA168eIH09HSxo5TK1q1bceXKFaxZs0bsKEREpbJnzx60b98e9vb2YkchIqJKxsTEBN7e3li7dq3YUSollp1EpHIsO4nKTkdHB87OzoiIiBA7SomFhobC19cXu3fvhomJidhxiIhKZdOmTVyYiIiIys2kSZOwbds2vHz5UuwolTlX4AAAACAASURBVA7LTiJSOZadRKqhifN2ZmVlYciQIfj222/RqFEjseMQEZXK/fv3ERkZiY8//ljsKEREVEnVqlUL3bp1Q0BAgNhRKh2WnUSkciw7iVRDE8vO6dOnw8XFhaOhiEij+fv7Y9SoUdDX1xc7ChERVWLTpk3DmjVroFAoxI5SqbDsJCKVys7OhkKhgLGxsdhRiDSeTCZDWFiY2DGK7bfffsPx48exfv16rlxMRBorLy8P27Ztg7e3t9hRiIiokmvXrh3MzMxw5MgRsaNUKiw7iUilXo/qZNFBVHaaNLIzMjISU6ZMwe7du1GtWjWx4xARldqhQ4cgl8shl8vFjkJERJWcRCKBj48P/Pz8xI5SqbDsJCKV4i3sRKojl8s1ouzMycnBkCFDMHfuXLi6uoodh4ioTDZv3sxRnUREVGEGDx6MO3fu4M6dO2JHqTRYdhKRSrHsJFIdOzs7ZGVlITU1Vewo7zR79mw4OjpiypQpYkchIiqTJ0+eICQkBJ988onYUYiISEsYGhri888/x+rVq8WOUmmw7CQilWLZSaQ6EokEUqlUrUd3HjhwAPv27YO/vz+nryAijRcQEIDBgwfDxMRE7ChERKRFJkyYgD179iA5OVnsKJUCy04iUimWnUSqpc7zdsbGxmLcuHHYuXMnLCwsxI5DRFQmCoWCt7ATEZEobG1t0b9/f2zYsEHsKJUCy04iUimWnUSqpa5lZ15eHoYOHYoZM2agXbt2YschIiqz06dPw9TUFC1atBA7ChERaSEfHx+sW7cOeXl5YkfReCw7iUilWHYSqZa6lp3z5s2DqakpZs6cKXYUIiKVCA4Ohre3N6fkICIiUXzwwQeoW7cu9u7dK3YUjceyk4hUimUnkWrJZDKEhYWJHaOQP//8E9u2bcO2bdugo8OPEkSk+QRBwNq1azFp0iSxoxARkRbz8fGBn5+f2DE0Hq9QiEilWHYSqZZcLlerkZ1Pnz6Fp6cnAgMDYWNjI3YcIiKVkEgkkEgk0NXVFTsKERFpsf79++Pp06e4fPmy2FE0GstOIiqz5ORk7N+/HwcOHICBgQESExNx6dIlCIIgdjQijWdlZQWFQqEWKzMWFBRgxIgRGD9+PLp27Sp2HCIiIiKiSkVXVxeTJ0/m6M4ykghsI4iolG7cuIGoqChYWFigU6dOhUZDxMbG4vLly9DX10evXr1gbGwsYlIizdayZUusWbMGbdq0ETXHokWLcPLkSRw/fpyjn4iIiIiIysGLFy9Qt25d3LlzB/b29mLH0UgsO4moVA4ePI
i6devCxcXlnfvl5ubit99+Q+/evWFtbV1B6Ygql2HDhuGjjz7CyJEjRcvw999/Y8iQIbh+/To/dBERERERlaNJkybBwsICixYtEjuKRuJt7ERUYgcPHsQHH3zw3qITAAwMDDBixAj89ddfSE1NrYB0RJWP2CuyJyYmYsSIEdiyZQuLTiIiIiKicjZ16lRs2LAB2dnZYkfRSCw7iahErl+/DmdnZzg6Ohb7GIlEAg8PDxw+fLgckxFVXmKWnQqFAqNHj1aOLiUi0lSJiYnYtGkTfvnlF/z88884f/682JGIiIjeqF69emjevDl27twpdhSNpCd2ACLSLA8fPoS7u3uJj9PR0UHdunXx+PHjEhWlRPSq7AwLCxPltX/88Uc8f/4cixcvFuX1iYhUYf/+/Vi+fDnu3r0LExMTODg4ID8/H7Vr18ann36Kfv36wcTEROyYRERESj4+Pvjyyy8xZswYSCQSseNoFI7sJKJiS0xMhJWVVamPb926NS5duqTCRETa4fXIzoqeZvvSpUtYtmwZdu3aBX19/Qp9bSIiVZo1axZat26NqKgoPH78GCtWrMDgwYORn5+PZcuWYfPmzWJHJCIiKqRXr17Iy8vD6dOnxY6icVh2ElGxhYSEoGPHjqU+XiKRQEeHbztEJWVhYQEDAwMkJCRU2Gs+f/4cHh4eWL9+PWrXrl1hr0tEpGpRUVF48eIFZsyYgerVqwMAOnbsiFmzZmHdunUYMGAApk2bhl9//VXkpERERP8lkUgwdepU+Pn5iR1F47B1IKJi09HRKXNZqaenV+Gj04gqg4qct1MQBIwdOxZ9+/bFwIEDK+Q1iYjKi0QigaWlJdavXw/g1XtcQUEBBEGAo6Mj5s2bB09PTxw/fhx5eXkipyUiIvqvkSNH4ty5c4iKihI7ikZh2UlExaaKklIikfBCgqgUKrLsXLduHaKjo7F8+fIKeT0iovJUp04dfPrpp9i1axd27doFANDV1S00/1ndunVx7949TtlBRERqxcTEBF5eXli7dq3YUTQKFygiogoVGRkJKysrSKVSyGQySKXSQj92dnacfJnoDSqq7Lx58ybmz5+PkJAQGBoalvvrERGVJ0EQIJFIMGnSJCQmJmLkyJFYuHAhPvvsM3z44YeQSCS4ceMGduzYgYkTJ4odl4iIqIjJkyfjgw8+wIIFC2Bqaip2HI0gEXg/KREV09mzZyGXy2Fra1vqcwQFBaF79+6IiIgo8hMeHo7MzMwiBejrH3t7e875SVpr165dCA4Oxp49e8rtNdLS0tC8eXMsWLAAQ4cOLbfXISKqSKmpqUhLS4MgCEhOTkZQUBB27tyJmJgY1KlTB6mpqfDw8MCqVaugq6srdlwiIqIiPv30U3Tq1AlTpkwRO4pGYNlJRMUmCAL27t0Ld3f3Uh3//PlzXL9+Hd27d3/rPqmpqYiMjHxjEZqamgpnZ+c3FqE1a9ZkEUqV2rVr1+Dl5YVbt26Vy/kFQcDIkSNhbGyMjRs3lstrEBFVpNTUVPj7+2PhwoWoUaMGCgoKYGtrix49emDAgAHQ19fHjRs38MEHH6BBgwZixyUiInqrc+fOYcyYMXjw4AGve4uBt7ETUbG9Xk09Pz8fenolf/s4ffo0+vXr9859zMzM4OrqCldX1yLb0tPTCxWhV69exa+//oqIiAgkJyejTp06RUpQmUyGmjVrliovkTqRyWSIiIhQ3pKpagEBAbh58yYuX76s8nMTEYlhyZIlOHfuHH755RdYWFhg7dq1OHjwILKysnDy5EmsWLECw4YNEzsmERHRe7Vv3x7VqlXDkSNH0KdPH7HjqD2O7CSiEklPT8eBAwdKfHEQFhaGuLg4dOnSpVxyZWZmIioqqtBI0Nf/Hx8fj9q1axcpQaVSKWrXrs3FCEhj2NnZ4dq1a3BwcFDpee/du4fOnTvj9OnTcHFxUem5iYjE4uDggA0bNsDNzQ0AkJiYiBEjRqBz5844fvw4Hj9+jMOHD0Mmk4
mclIiI6P0CAwOxbds2/PXXX2JHUXssO4moxJ48eYKQkBB88sknxRphFhYWhvDwcOXFRkXLzs7Gw4cPi5SgERERiIuLg6OjY5ESVCqVok6dOjAwMBAlM9GbdOzYEYsWLVLplwaZmZlo1aoVZsyYAS8vL5Wdl4hITBEREfj000+xevVqdOzYUfm8jY0Nrly5gtq1a6N+/fr47LPPMG3atHIbNU9ERKQqOTk5cHJywvHjxzlA4T1YdhJRqSQnJ+Po0aNo0KDBG285B4AXL17g1KlTMDc3R9euXSs4YfHk5uYiOjq6SAkaERGBR48eoUaNGm9cOb5u3bowMjISOz5pGS8vL7Rt2xbjxo1T2TnHjRuHrKwsBAYG8kKfiCoFQRBQUFCAQYMGwczMDBs3bkRmZiYCAwPx7bffIj4+HgDg6+uL6Oho7Nq1i9PdEBGRRliwYAHi4uKwfv16saOoNf6rTkSlYmlpieHDhyMyMhJBQUHQ1dWFoaEhDA0NkZ6ejry8PJiZmaFv375qfQFhYGAAuVwOuVxeZFteXh5iY2MLFaEnT55EREQEoqOjYWNjU6QElUqlcHZ2RpUqVUT4baiyk8lkCA8PV9n5fv31V/z999+4du0ai04iqjQkEgn09PTwySef4PPPP0dISAhMTEyQmpqKZcuWFdo3NzdXrT+nEBER/dtnn32G+vXrY/r06bh//36hxYpMTU3RuXNnLmAEjuwkIhXKy8tDbm4uqlSpUumLk4KCAsTGxhYZDRoREYGoqChYWlq+cdV4qVSKqlWrVkjGrKws7NmzB7du3YKpqSk+/PBDtGzZkhd1GiwoKAg7duzAvn37ynyu8PBwtGvXDn/++Sc++OADFaQjIlI/iYmJ8Pf3R0JCAkaPHo0mTZoAAO7fv4/OnTtj48aN7108kYiISF1cv34dO3fuRNeuXfHRRx8VKjaTkpJw5swZCIKAHj16wMzMTMSk4mLZSUSkYgUFBXjy5EmREjQ8PByRkZEwMzN7axGqyn+QHj16hKVLlyI9PR2BgYHo3bs3AgICYGNjAwC4cuUKjh8/jqysLMjlcrRp0wbOzs6FimrOYaZebt26heHDh+POnTtlOk9OTg7atWsHLy8vTJo0SUXpiIg0Q1paGn777TecPHkSO3fuFDsOERFRsRw8eBDOzs5o2LDhO/dTKBTYs2cP2rRpg9q1a1dQOvXCspOIqAIpFAo8ffq0SAn6+v+rVKlSpAB9fat89erVS/RaBQUFiIuLQ82aNdG8eXN07twZixcvVt5i7+npiaSkJBgYGODx48fIzs7G4sWLlSNcFAoFdHR08OLFCzx79gx2dnYwNzdX+d+Eii8jIwNWVlbIyMgo0+0pPj4+ePToEYKDg1lmE5FWio+PhyAIsLOzEzsKERHRex06dAjNmjWDo6NjsY/Zt28f2rVrB1tb23JMpp5YdhIRqQlBEBAfH//GEjQ8PBz6+vpFStBevXrB2tr6vYWVnZ0dZs6cienTpytLsgcPHsDExASOjo5QKBTw9fXF1q1bce3aNTg5OQF4dZvfggULEBISgvj4eLRo0QIBAQGQSqXl/eegt3B0dMT58+dL/S3t77//junTp+P69eslLtCJiIiIiKhi/fPPPwCgnIqluARBwK+//ophw4aVRyy1xrKTiEgDCIKApKSkIiXoV199hUaNGr2z7MzIyICNjQ38/f0xZMiQt+6XkpICGxsbXLhwAS1btgQAtG/fHpmZmfjll1/g6OgIb29v5OXl4dChQzA2Nlb570nv17VrV8yZMwc9evQo8bExMTFo2bIlDhw4gDZt2pRDOiIi9fP6cocj2YmISBMFBwfD3d29VMfeuXMH+vr6qFevnopTqTeuUkFEpAEkEgmsra1hbW2Ntm3bFuuY1/NtPnz4EBKJRDlX57+3vz43AOzfvx/6+vqQyWQAgJCQEFy4cAE3b95Ufou4cuVKuLi44OHDh++dK4bKx+sV2Utadubl5cHDwwNffvkli0
4i0ipTp07F119/XeTfQSIiInX34sWLMk0l1qhRI+zdu1fryk6uR09EVEkpFAoAQGhoKKpVqwYLC4tC2/+9+ND27dsxb948TJ8+Hebm5sjJycGxY8fg6OiIJk2aID8/HwBgZmYGOzs73L59u2J/GVJ6XXaW1Ndff43q1atjxowZ5ZCKiEg9RUVFYdeuXVq9Ii0REWmus2fPokuXLmU6R1nm+tdUHNlJRFTJ3bt3DzY2Nsr5GQVBgEKhgK6uLjIyMjB//nwEBwdj4sSJmD17NoBXq3WHhoZCLpcD+G9xGh8fD2tra6SmpirPxdsCK5ZMJsOZM2dKdMzRo0exY8cOXL9+XSs/7BCR9tqyZQuGDx8OQ0NDsaMQERGViq6ubpmOr1q1KrKysrRqGjKWnURElZAgCHjx4gUsLS0RFhYGJycn5aiW10XnrVu34OPjgxcvXmDdunXo3bt3ofIyPj5eeav661veY2NjoaurW2SU6Ot94uPjYWVlBT09/vNSXko6sjMuLg5jxozBrl27YG1tXY7JiIjUS0FBAbZs2YI//vhD7ChERESloopldgwNDZGdnc2yk4iINNuTJ0/Qq1cvZGdnIzo6GnXq1MH69evRuXNntG7dGoGBgfjhhx/Qvn17fPfdd6hWrRqAV/N3CoKAatWqITMzE1WrVgXw328Tb926BWNjY+Vq7f87qrN37964f/8+atWqVWTleKlUCicnJ+jr61fcH6IScnZ2RnR0NPLz899bKhcUFGD48OGYOHEiOnfuXEEJiYjUw7Fjx+Dg4IDGjRuLHYWIiEg0qampWjedC8tOIqJKyMHBAbt27cKNGzcQFxeHa9eu4eeff8alS5ewevVqTJ8+HSkpKbC3t8eKFStQr149yGQyNG7cGIaGhpBIJKhXrx4uXryIuLg42NvbA3i1iJGrq6vy9vZ/k0gkuHnzJnJycvDw4UPlivEPHjzA4cOHERERgSdPnsDBwaFICSqVSlGnTh3eZlgMRkZGsLW1RUxMDJydnd+57+LFi6Gjo4P//Oc/FZSOiEh9bN68Gd7e3mLHICIiKrVatWohMjLyvZ/73yU3N1frprKSCKoYE0tERBrl/v37CA8Px99//43bt28jKioKMTEx8PPzw4QJE6Cjo4MbN25g2LBhcHNzw8cff4xffvkFx48fx6lTp9C0adNSvW5ubi5iYmIQERGB8PBwZSEaERGB2NhY2NnZvbEIrVu3rlbddvE+PXv2xBdffIHevXu/dZ9Tp05h2LBhuH79OmrUqFGB6YiIxBcfH4969eohNjZWefcCERGRJgoODoa7u3upjk1LS8OFCxfQq1cvFadSbyw7iYhISaFQFPrWb9++fVi2bBmioqLQsmVLzJ8/Hy1atCiX187Pz0dsbGyREjQiIgIPHz6EtbV1kRJUKpXC2dkZJiYm5ZJJXU2cOBENGjTAlClT3rg9ISEBrq6u8Pf317oPNkREALBixQrcvXsXW7ZsETsKERFRmRw+fBjdunUr1eCPAwcO4KOPPtK6qcRYdhJRmXl6eiIpKQmHDh0SOwqVIzFXXi8oKMCjR4+KlKARERGIioqCubl5kRL09Y+pqakomctLfn4+Zs+ejZcvX6JPnz6QSCRwcnJSzkmnUCjg5uaGZs2a4bvvvhM5LRFRxRMEAQ0bNsTGjRvRoUMHseMQERGVSW5uLn799VeMGjWqRNdj4eHhePToEbp161aO6dQTy04iLeDp6YmtW7cCAPT09FC9enW4uLjgk08+wfjx48v8LY8qys7Xi+hcuXKl3EYOUuWkUCjw5MmTIiVoeHg4IiMjYWpq+sYSVCqVwtzcXOz4xRYfH4/z589DR0cHnTt3RvXq1ZXbHjx4gDt37sDY2Bg3b97E4cOHcfr0aa37BpeICADOnz8Pb29vhIaGivYlHRERkSqlpKTg8OHDGD58eLHm3wwPD0dYWBjc3NwqIJ364QJFRFqiR48eCAwMREFBARITE3Hy5EnMmzcPgYGBOH
HixBtvA87NzYWBgYEIaYmKT0dHBzVr1kTNmjXRtWvXQtsEQcDTp08LlaB79+5V3ipvZGT0xhJUJpPBwsJCpN+oqMuXL+PFixcYOHDgGy/c69Wrh3r16iEjIwOHDh3C6tWrWXQSkdZ6vTARi04iIqosLCwsMHDgQOzatQu1atVC+/bt3/jvXEpKCk6fPg0LCwutLToBjuwk0gpvG3l5584duLq64quvvsKCBQvg5OQET09PxMbGYu/evejZsyf27NmD27dvY/r06Th//jyMjY3Rr18/+Pn5wczMrND527RpgzVr1iAjIwOffvop1q1bp5xXRBAELF++HOvXr0dcXBykUilmzZqFESNGAECRN+rOnTvj9OnTuHLlCubMmYPr168jNzcXTZo0wfLly9G2bdsK+MtRZSYIAhISEoqMBn39X11d3TeWoFKpFFZWVhV2EX358mXo6OgUe8SzIAjYvXs3evToAUtLy3JOR0SkXl6+fInatWvj/v37sLW1FTsOERGRyj179gznz5+HRCKBnp4edHR0oFAokJOTA0tLS3Tu3Bm6urpixxQVy04iLfCu28z79euHqKgo3LlzB05OTkhJScHcuXMxaNAgCIIABwcHyGQytGzZEosWLUJKSgrGjRuHxo0bIzg4WHn+4OBg9O7dG/PmzcOTJ0/g5eUFd3d3rF69GgAwZ84cBAUFwc/PD/Xq1cOFCxcwbtw47N69G25ubrhy5QpatWqFo0ePomnTpjAwMICFhQVOnjyJJ0+eoEWLFpBIJFi7di127NiB8PBwWFlZVejfkbSHIAhITk4uUoK+/snPz39jCSqVSmFra6uyIjQ+Ph43b97Ehx9+WOL8O3bsUH6ZQESkLTZu3IgjR45g3759YkchIiIqd4IgQKFQaH25+b9YdhJpgXeVnbNnz8bq1auRmZmpXOTk4MGDyu0bN26Er68vHj9+rFzo5fTp0+jatSvCw8MhlUrh6emJ33//HY8fP0bVqlUBANu3b4e3tzdSUlIAAFZWVvjzzz/RsWNH5bmnTZuGsLAwHDlypNhzdgqCAHt7eyxfvpxFDokmJSUFkZGRb1w5PjMz840lqFQqRY0aNYo1x85re/fufeut6+9z//595Ofno1GjRiU+lohIU7Vp0wZff/21Vt+6R0REpO04ZyeRlvvfFbb/t2gMDQ1FkyZNCq1o3a5dO+jo6ODevXuQSqUAgCZNmiiLTgBo27YtcnNzERkZiZycHGRnZ6N3796FXisvLw9OTk7vzJeQkICvv/4ap06dQnx8PAoKCpCVlYXY2Niy/NpEZWJhYQELCwu0bNmyyLbU1NRCRei5c+cQEBCAiIgIpKamwtnZ+Y0rxzs6OhYqQgsKCiCRSEo9SrR+/foICgpi2UlEWuPOnTt49OhRiUfDExERUeXCspNIy927dw9169ZVPv7fhYr+twz9t+KWMAqFAgBw8OBB1KpVq9C29y2iMnr0aMTHx2PlypVwcnKCoaEhunfvjtzc3GK9NlFFMzMzg6urK1xdXYtsS0tLQ2RkpHIU6OXLl7Fz505EREQgOTkZdevWVZafhoaGmDlzZpmyGBkZIScnB4aGhmU6DxGRJti8eTM8PT2hp8dLHCIiIm3GTwJEWuzOnTs4evQo5s6d+9Z9GjZsCH9/f6SlpSlHd4aEhEChUKBBgwbK/W7fvo2MjAxlWXrx4kUYGBjA2dkZCoUChoaGiImJQbdu3d74Oq9XfS8oKCj0/Llz57B69Wrl7Wjx8fF4+vRp6X9pIhGZmpqiWbNmaNasWZFtGRkZiIqKUhah9+/fR/Xq1cv0enZ2dkhOToa9vX2ZzkNEpO5ycnKwfft2XLx4UewoREREJDKWnURaIicnB8+ePYNCoUBiYiJOnDiBb7/9Fs2bN4evr+9bjxs+fDjmzZuHUaNGYeHChXj+/DkmTJiAQYMGKW9hB4D8/Hx4eXnhm2++QVxcHGbPno1x48Ypy09fX1/4+vpCEAR06tQJ6enpuH
jxInR0dDB+/HjY2NjA2NgYx44dg5OTE4yMjGBmZga5XI7t27ejdevWyMjIwJdffqksRokqExMTEzRu3BiNGzcGABw4cKDM56xSpQoyMjLKfB4iInW3f/9+NG7cGM7OzmJHISIiIpEVf5UEItJox48fR40aNVCrVi10794dBw4cwLx583DmzJkit67/W5UqVXDs2DG8fPkSrVq1Qv/+/dG2bVv4+/sX2q9z585wcXFB165dMXDgQHTr1g3Lli1Tbl+0aBHmz5+PFStWwMXFBT179kRwcDDq1KkDANDT08Pq1auxadMm2Nvbo3///gAAf39/pKeno3nz5vDw8ICXl9d75/kkqgxUsaJ7amoqzM3NVZCGiEi9bd68GWPHjhU7BhEREakBrsZORESkhm7fvg0DAwPUq1ev1OfYu3cvBgwYUKIV4ImINE1MTAyaN2+OR48ewdjYWOw4REREJDJe/RAREamhxo0b486dO6U+/vXCYCw6iaiy27JlCzw8PFh0EhEREQDO2UlERKS2jI2NCy38VRJnzpxBp06dyiEVEZH6KCgowJYtW7B//36xoxAREZGa4HAPIiIiNdW9e3fs3bsXJZ1xJjU1FUlJSbCysiqnZERE6uHEiROwsrJCs2bNxI5CREREaoJlJxERkZoyNDTEhx9+iF27dhW78ExNTcXvv/8Od3f3ck5HRCS+TZs2wdvbW+wYREREpEa4QBEREZGaS0lJweHDh9GiRQs0aNDgjfsoFAr8/fffSE5Ohru7u0pWcyciUmdJSUmQSqWIjo6Gubm52HGIiIhITbDsJCIi0hB37tzBgwcPYGRkBFtbW1SpUgWpqal4+vQpAKBTp068dZ2ItMaqVatw7do1BAYGih2FiIhIpZ49e4ZRo0bh/PnzyMzMLPG0Vv/m6emJpKQkHDp0SIUJ1RvLTiIiIg2Tm5uLpKQkZGZmwszMDJaWllx1nYi0iiAIaNy4MdauXYsuXbqIHYeIiKhEPD09sXXr1iLPt27dGhcvXoSvry+OHj2Kffv2wdTUFHZ2dqV+rdTUVAiCoFV3QXA1diIiIg1jYGAAe3t7sWMQEYnm8uXLyMnJQefOncWOQkREVCo9evQocneCgYEBACAiIgLNmzeHTCYr9fnz8/Ohq6sLMzOzMuXURBwGQkREREREGmXTpk3w8vLi/MRERKSxDA0NYWdnV+jHwsICTk5O2L9/P7Zt2waJRAJPT08AQGxsLAYOHAhTU1OYmppi0KBBePz4sfJ88+fPR6NGjRAQEABnZ2cYGhoiIyMDnp6e6NOnj3I/QRCwbNkyODs7w9jYGI0bN8b27dsr+tcvVxzZSUREREREGiM9PR1BQUG4e/eu2FGIiIhU7sqVKxg2bBgsLCzg5+cHY2NjCIKAAQMGwMjICCdPnoREIsHkyZMxYMAAXLlyRfnl38OHD7Fz507s2bMHBgYGMDIyKnL+uXPnIigoCD/99BPq1auHCxcuYNy4cahevTrc3Nwq+tctFyw7iYiIiIhIY+zZswcdO3bkdB5ERKTRjh49iqpVqxZ6btKkSfj+++9haGgIY2Nj5Vydf/31F27duoXIyEg4OTkBAHbu3AmpVIoTJ06gR48eAF7N7R8YGAhbW9s3vmZGRgZ+/PFH/PnnEHzM9wAAELRJREFUn+jYsSMAoE6dOrh8+TJ++uknlp1EREREREQVbdOmTfjyyy/FjkFERFQmnTp1woYNGwo997ZFhEJDQ2Fvb68sOgGgbt26sLe3x71795Rlp6Oj41uLTgC4d+8esrOz0bt370JTweTl5RU6t6Zj2UlERERERBohNDQUUVFR+Pjjj8WOQkREVCZVqlSBVCot1r6CILx1nup/P29iYvLO8ygUCgDAwYMHUatWrULb9PX1i5VFE7DsJCIiIiIijeDv74/Ro0dXqgsyIiKi92nYsCGePHmC6Oho5QjMqKgoxMXFoWHDhiU6j6GhIWJiYtCtW7dySis+lp1ERERERKT2cnNzsW
3bNpw9e1bsKERERGWWk5ODZ8+eFXpOV1cX1tbWRfbt0aMHmjZtiuHDh2P16tUQBAFTpkyBq6triUpLU1NT+Pr6wtfXF4IgoFOnTkhPT8fFixeho6OD8ePHl/n3UgcsO4mIiIiISO0dOnQI9evXh1wuFzsKERFRmR0/fhw1atQo9JyDgwMeP35cZF+JRILff/8dU6dORZcuXQC8KkDXrFnz1tvb32bRokWwtbXFihUr8Pnnn6NatWpo1qxZpZoPWyIIgiB2CCIiIiIiondxc3PDkCFDMGrUKLGjEBERkRpj2UlERERERGrt8ePHaNKkCR4/fowqVaqIHYeIiIjUmI7YAYiIiIiIiN4lICAAQ4YMYdFJRERE78WRnUREREREpLYUCgWkUil2796NFi1aiB2HiIiI1BxHdhIREWmY+fPno1GjRmLHICKqEKdOnYKpqSmaN28udhQiIiLSACw7/6+9+4/Vuqz/B/68ETkczoFNzrAfgMQRISg4SSAWzjlxobDmPFGK0YaDTQJmbZoZmzSiWBlqLsBsUpow1MCs4a9Vp0z/MGQHiMLDDx2K6CjAgiO/jp3780f7su8JEPCc0+HcPB5/8b7u68frvv86e3Jd7wsA2smuXbvyta99LRdeeGHKysrSt2/fXHPNNXn66adbNe9tt92W559/vo2qBDizLV26NNOnTz/t22YBgLOTY+wA0A62b9+esWPHpmfPnvnOd76TmpqaNDc35/e//33uuuuuvPHGG8eMOXLkSLp169YB1QKcmfbu3Zvq6uq89tpr6d27d0eXAwB0AnZ2AkA7mDlzZorFYtauXZsvfelLGTJkSIYOHZrZs2dnw4YNSZJCoZDFixentrY2FRUVmTNnTv79739n2rRpGThwYMrLy3PRRRflrrvuSnNz89G5//sYe3Nzc+bPn5/+/funrKwsw4cPz69//eujn3/mM5/Jrbfe2qK+ffv2pby8PL/61a+SJMuWLcvo0aPTs2fPnH/++fniF7+YnTt3tudPBHBSy5cvzzXXXCPoBABOmbATANrY3r178+yzz2b27NmprKw85vPzzjvv6L/nzZuXCRMmZOPGjZk1a1aam5vTt2/fPP7443nllVfyve99LwsWLMjPf/7zE65333335Yc//GF+8IMfZOPGjbnuuutSW1ub9evXJ0mmTJmSRx99tEVgumrVqpSXl2fixIlJ/rOrdN68edmwYUNWr16d3bt3Z/LkyW31kwCctmKxmAcffDDTp0/v6FIAgE7EMXYAaGNr1qzJmDFj8sQTT+S66647Yb9CoZDZs2fnxz/+8fvOd8cdd2Tt2rX53e9+l+Q/OztXrlyZv/71r0mSvn375uabb87cuXOPjrniiivSr1+/LFu2LHv27MlHPvKRPPPMMxk3blyS5KqrrsqFF16YBx544LhrNjQ0ZOjQodmxY0f69et3Wt8foC38v53x27ZtS5cu9mgAAKfGXw0A0MZO5/8RR40adUzbT37yk4waNSp9+vRJZWVl7r333uO+4zP5z3H0t956K2PHjm3Rftlll2XTpk1JkqqqqowfPz7Lly9Pkrz99tv5wx/+kClTphztX19fn2uvvTYDBgxIz549j9Z1onUB2tvSpUtz0003CToBgNPiLwcAaGMXXXRRCoVCXnnllZP2raioaPH82GOP5etf/3qmTp2a5557LuvXr8/MmTNz5MiR953neLcU//9tU6ZMyapVq3Lo0KGsWLEi/fv3z2WXXZYkeffddzN+/Pj06NEjjzzySF5++eU8++yzSXLSdQHaw4EDB/LYY49l6tSpHV0KANDJCDsBoI317t0748ePz6JFi9LY2HjM5//85z9POPbFF1/MmDFjMnv27IwcOTKDBg3Kq6++esL+vXr1ykc/+tG8+OKLx8wzbNiwo8/XXnttkmT16tVZvnx5vvzlLx8NQxsaGrJ79+4sWLAgl19+eT7+8Y/n73//+2l9Z4C2tHLlylx66aXp379/R5cCAH
Qywk4AaAdLlixJsVjMqFGj8stf/jKbN29OQ0ND7r///owYMeKE4wYPHpz6+vo888wz2bp1a+bPn5/nn3/+fdf6xje+kYULF2bFihXZsmVL5s6dmxdeeKHFDezdu3dPbW1tvvvd76a+vr7FEfYLLrggZWVlWbRoUV577bU89dRTufPOO1v/IwB8QEuXLs20adM6ugwAoBPq2tEFAEApGjhwYOrr67NgwYJ885vfzM6dO1NVVZWampoTXgqUJDfffHPWr1+fG2+8McViMV/4whdy66235mc/+9kJx9xyyy3Zv39/br/99uzatStDhgzJqlWr8qlPfapFv6985St56KGHMnLkyAwdOvRoe58+ffLwww9nzpw5Wbx4cUaMGJF77rknV199det/CIDTtGXLljQ0NOTzn/98R5cCAHRCbmMHAADOGHfccUfee++9LFy4sKNLAQA6IWEnAABwRnjvvffSv3//1NXVtdiBDgBwqryzEwAAOCM8/fTTqa6uFnQCAB+YsBMAADgjPPjggy4mAgBaxTF2AACgw7311lv5xCc+kR07dqSysrKjywEAOik7OwEAgA738MMPZ9KkSYJOAKBV7OwEAAA6VLFYzODBg/PII4/k0ksv7ehyAIBOzM5OAACgQ/3pT39KWVlZxowZ09GlAACdXNeOLgAAADg7HD58OHV1dWlqajrads4552TZsmWZNm1aCoVCB1YHAJQCYScAANCu3nzzzbz00kspKyvLuHHj0qNHj6OfHTx4MFu3bk1VVVVef/31DBgwoAMrBQA6O+/sBAAA2k19fX327NmTq6666qQ7N+vq6tKzZ8+MHj36f1QdAFBqhJ0AAEC7+Mtf/pLGxsZ89rOfPeUxa9asSdeuXTNy5Mh2rAwAKFUuKAIAANrcoUOHsnnz5tMKOpPkkksuyeuvv5533323nSoDAEqZsBMAAGhzdXV1mThx4gcaO2HChNTV1bVxRQDA2UDYCQAAtLmDBw+2uIjodJSVleXw4cPxxi0A4HQJOwEAgDa1bdu2DB48uFVz1NTU5G9/+1sbVQQAnC2EnQAAQJt68803M2DAgFbNccEFF2Tnzp1tVBEAcLYQdgIAAG3q8OHDKSsra9Uc5557bpqamtqoIgDgbCHsBAAA2tR5552Xd955p1Vz7Nu3L7169WqjigCAs4WwEwAAaFPDhw9PfX19q+b485//nIsvvriNKgIAzhbCTgAAoE2Vl5fn4MGDrZqjsbExPXv2bKOKAICzhbATAABoczU1NVm3bt0HGrtp06YMHTq0jSsCAM4Gwk4AAKDNDRo0KA0NDWlsbDytcQcOHEh9fX2GDRvWTpUBAKVM2AkAALSL66+/PitXrsy//vWvU+q/f//+PP7447nhhhvauTIAoFQVisVisaOLAAAASlNzc3OefPLJlJeXZ9y4cenWrdsxfZqamlJXV5f9+/entrY2XbrYkwEAfDDCTgAAoN01Njamrq4uTU1NOffcc9OtW7ccOXIkTU1N6dq1a6688koXEgEArSbsBAAA/qeKxeLR0LNQKHR0OQBACRF2AgAAAAAlwctwAAAAAICSIOwEAAAAAEqCsBMAAAAAKAnCTgAAAACgJAg7AQAAAICSIOwEAAAAAEqCsBMAAAAAKAnCTgAAAACgJAg7AQAAAICSIOwEAAAAAEqCsBMAAAAAKAnCTgAAoFU+9rGPZeHChf+Ttf74xz+mUChk9+7d/5P1AIDOpVAsFosdXQQAAHBm2rVrV77//e9n9erV2bFjR3r16pVBgwZl8uTJuemmm1JZWZl//OMfqaioSI8ePdq9niNHjmTv3r350Ic+lEKh0O7rAQCdS9eOLgAAADgzbd++PWPHjk2vXr0yf/78jBgxIs3NzdmyZUt+8YtfpKqqKjfeeGP69OnT6rWOHDmSbt26nbRft27d8uEPf7jV6wEApckxdgAA4Li++tWvpkuXLlm7dm1uuOGGDBs2LJ/85CdTW1ubJ598MpMnT05y7D
H2QqGQlStXtpjreH0WL16c2traVFRUZM6cOUmSp556KkOGDEn37t1z+eWX59FHH02hUMj27duTHHuM/aGHHkplZWWLtRx1B4Czl7ATAAA4xt69e/Pcc89l1qxZqaioOG6f1h4jnzdvXiZMmJCNGzdm1qxZeeONN1JbW5uJEydmw4YNueWWW3L77be3ag0A4Owi7AQAAI6xdevWFIvFDBkypEV7v379UllZmcrKysyYMaNVa1x//fWZPn16qqurM3DgwNx///2prq7O3XffnSFDhmTSpEmtXgMAOLsIOwEAgFP2wgsvZP369bnkkkty6NChVs01atSoFs8NDQ0ZPXp0ix2jY8aMadUaAMDZxQVFAADAMQYNGpRCoZCGhoYW7QMHDkyS9715vVAopFgstmhramo6pt9/H48vFounfTS+S5cup7QWAHB2sLMTAAA4RlVVVT73uc9l0aJFaWxsPK2xffr0ydtvv330edeuXS2eT2To0KF5+eWXW7StWbPmpGsdOHAg+/btO9q2fv3606oXACgdwk4AAOC4lixZkubm5nz605/OihUrsmnTpmzZsiUrVqzIhg0bcs455xx33JVXXpnFixdn7dq1WbduXaZOnZru3bufdL0ZM2bk1VdfzW233ZbNmzfniSeeyAMPPJDkxJchjRkzJhUVFfnWt76Vbdu2ZdWqVVmyZMkH/9IAQKcm7AQAAI6ruro669aty9VXX50777wzF198cUaOHJl77rknM2fOzI9+9KPjjrv77rtTXV2dK664IpMmTcr06dNz/vnnn3S9AQMGZNWqVfnNb36Tmpqa3Hvvvfn2t7+dJCcMS3v37p3ly5fnt7/9bYYPH56f/vSnmT9//gf/0gBAp1Yo/vcLbgAAAM4Q9913X+bOnZt33nknXbrYqwEAvD8XFAEAAGeMxYsXZ/To0enTp09eeumlzJ8/P1OnThV0AgCnRNgJAACcMbZt25YFCxZkz5496devX2bMmJG5c+d2dFkAQCfhGDsAAAAAUBKcBQEAAAAASoKwEwAAAAAoCcJOAAAAAKAkCDsBAAAAgJIg7AQAAAAASoKwEwAAAAAoCcJOAAAAAKAkCDsBAAAAgJIg7AQAAAAASoKwEwAAAAAoCcJOAAAAAKAkCDsBAAAAgJIg7AQAAAAASoKwEwAAAAAoCcJOAAAAAKAkCDsBAAAAgJIg7AQAAAAASoKwEwAAAAAoCcJOAAAAAKAkCDsBAAAAgJIg7AQAAAAASoKwEwAAAAAoCcJOAAAAAKAkCDsBAAAAgJIg7AQAAAAASoKwEwAAAAAoCcJOAAAAAKAkCDsBAAAAgJIg7AQAAAAASoKwEwAAAAAoCf8HebVl/k0i9zQAAAAASUVORK5CYII=", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "# initialise a graph\n", - "G = nx.Graph()\n", - "\n", - "# use this while labeling nodes in the map\n", - "node_labels = dict()\n", - "# use this to modify colors of nodes while exploring the graph.\n", - "# This is the only dict we send to `show_map(node_colors)` while drawing the map\n", - "node_colors = dict()\n", - "\n", - "for n, p in romania_locations.items():\n", - " # add nodes from romania_locations\n", - " G.add_node(n)\n", - " # add nodes to node_labels\n", - " node_labels[n] = n\n", - " # 
node_colors to color nodes while exploring romania map\n", - " node_colors[n] = \"white\"\n", - "\n", - "# we'll save the initial node colors to a dict to use later\n", - "initial_node_colors = dict(node_colors)\n", - " \n", - "# positions for node labels\n", - "node_label_pos = { k:[v[0],v[1]-10] for k,v in romania_locations.items() }\n", - "\n", - "# use this while labeling edges\n", - "edge_labels = dict()\n", - "\n", - "# add edges between cities in romania map - UndirectedGraph defined in search.py\n", - "for node in romania_map.nodes():\n", - " connections = romania_map.get(node)\n", - " for connection in connections.keys():\n", - " distance = connections[connection]\n", + "show_map(romania_graph_data)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Voila! You can see the Romania map, as shown in Figure 3.2 of the book. Now, let's see how the different search algorithms perform on our problem statements." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## SIMPLE PROBLEM SOLVING AGENT PROGRAM\n", "\n", "Let us now define a Simple Problem Solving Agent Program. Run the next cell to see how the abstract class `SimpleProblemSolvingAgentProgram` is defined in the search module." ] }, { "cell_type": "code", "execution_count": 11, "metadata": {}, "outputs": [ { "data": { "text/html": [ "\n", "\n", "\n", "\n", " Codestin Search App\n", " \n", " \n", "\n", "\n", "

    \n", + "\n", + "
    class SimpleProblemSolvingAgentProgram:\n",
    +       "\n",
    +       "    """Abstract framework for a problem-solving agent. [Figure 3.1]"""\n",
    +       "\n",
    +       "    def __init__(self, initial_state=None):\n",
    +       "        """State is an abstract representation of the state\n",
    +       "        of the world, and seq is the list of actions required\n",
    +       "        to get to a particular state from the initial state(root)."""\n",
    +       "        self.state = initial_state\n",
    +       "        self.seq = []\n",
    +       "\n",
    +       "    def __call__(self, percept):\n",
    +       "        """[Figure 3.1] Formulate a goal and problem, then\n",
    +       "        search for a sequence of actions to solve it."""\n",
    +       "        self.state = self.update_state(self.state, percept)\n",
    +       "        if not self.seq:\n",
    +       "            goal = self.formulate_goal(self.state)\n",
    +       "            problem = self.formulate_problem(self.state, goal)\n",
    +       "            self.seq = self.search(problem)\n",
    +       "            if not self.seq:\n",
    +       "                return None\n",
    +       "        return self.seq.pop(0)\n",
    +       "\n",
    +       "    def update_state(self, state, percept):\n",
    +       "        raise NotImplementedError\n",
    +       "\n",
    +       "    def formulate_goal(self, state):\n",
    +       "        raise NotImplementedError\n",
    +       "\n",
    +       "    def formulate_problem(self, state, goal):\n",
    +       "        raise NotImplementedError\n",
    +       "\n",
    +       "    def search(self, problem):\n",
    +       "        raise NotImplementedError\n",
    +       "
    \n", + "\n", + "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "psource(SimpleProblemSolvingAgentProgram)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The SimpleProblemSolvingAgentProgram class has six methods: \n", "\n", "* `__init__(self, initial_state=None)`: This is the `constructor` of the class and the first method to be called when the class is instantiated. It takes a keyword argument, `initial_state`, which defaults to `None` and represents the state from which the agent starts.\n", "\n", "* `__call__(self, percept)`: This method updates the `state` of the agent based on its `percept` using the `update_state` method. 
It then formulates a `goal` with the help of the `formulate_goal` method and a `problem` using the `formulate_problem` method, and returns a sequence of actions to solve it (using the `search` method).\n", "\n", - "# we'll save the initial node colors to a dict to use later\n", - "initial_node_colors = dict(node_colors)\n", - " \n", - "# positions for node labels\n", - "node_label_pos = { k:[v[0],v[1]-10] for k,v in romania_locations.items() }\n", "\n", - "# use this while labeling edges\n", - "edge_labels = dict()\n", "\n", + "* `update_state(self, state, percept)`: This method updates the `state` of the agent based on its `percept`.\n", "\n", + "* `formulate_goal(self, state)`: Given a `state` of the agent, this method formulates the `goal` for it.\n", "\n", - "# add edges between cities in romania map - UndirectedGraph defined in search.py\n", - "for node in romania_map.nodes():\n", - " connections = romania_map.get(node)\n", - " for connection in connections.keys():\n", - " distance = connections[connection]\n", "\n", + "* `formulate_problem(self, state, goal)`: Given a `state` and a `goal`, this method formulates a `problem` for the agent to solve.\n", "\n", - " # add edges to the graph\n", - " G.add_edge(node, connection)\n", - " # add distances to edge_labels\n", - " edge_labels[(node, connection)] = distance" + "* `search(self, problem)`: This method searches for a sequence of `actions` that solves the given `problem`." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "We have completed building our graph based on romania_map and its locations. It's time to display it here in the notebook. This function `show_map(node_colors)` helps us do that. We will be calling this function later on to display the map at each and every interval step while searching, using variety of algorithms from the book." + "Let us now define a Simple Problem Solving Agent Program. 
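(As an aside, the replanning loop that `__call__` implements — update the state, formulate a goal and a problem whenever the current plan is empty, search, then execute one action per call — can be exercised on its own. The base class below repeats the listing shown by `psource` above; the tiny `CountToThree` subclass is purely illustrative and not part of the repository.)

```python
class SimpleProblemSolvingAgentProgram:
    def __init__(self, initial_state=None):
        self.state = initial_state
        self.seq = []  # the remaining planned actions

    def __call__(self, percept):
        # 1. update the internal state from the percept
        self.state = self.update_state(self.state, percept)
        if not self.seq:
            # 2-4. no plan left: formulate a goal and a problem, then search
            goal = self.formulate_goal(self.state)
            problem = self.formulate_problem(self.state, goal)
            self.seq = self.search(problem)
            if not self.seq:
                return None  # search produced no plan
        # 5. execute the next action of the current plan
        return self.seq.pop(0)

    def update_state(self, state, percept):
        raise NotImplementedError

    def formulate_goal(self, state):
        raise NotImplementedError

    def formulate_problem(self, state, goal):
        raise NotImplementedError

    def search(self, problem):
        raise NotImplementedError


# Illustrative toy subclass: the "state" is an integer, and the plan is
# however many "inc" actions are needed to reach the goal value of 3.
class CountToThree(SimpleProblemSolvingAgentProgram):
    def update_state(self, state, percept):
        return percept

    def formulate_goal(self, state):
        return 3

    def formulate_problem(self, state, goal):
        return (state, goal)

    def search(self, problem):
        state, goal = problem
        return ["inc"] * (goal - state)


agent = CountToThree(0)
print(agent(0), agent(1), agent(2), agent(3))  # inc inc inc None
```

Each call returns one action; once the plan is exhausted and the goal is already reached, `search` returns an empty plan and the agent answers `None`.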
We will create a simple `vacuumAgent` class that inherits from the abstract class `SimpleProblemSolvingAgentProgram` and overrides its methods. This simple intelligent vacuum agent can be in any one of the following states, and it moves to another state depending upon the current state, as shown by the arrows in the picture:\n", + "\n", + "![simple problem solving agent](images/simple_problem_solving_agent.jpg)" ] }, { "cell_type": "code", - "execution_count": 9, - "metadata": { - "collapsed": true - }, + "execution_count": 12, + "metadata": {}, "outputs": [], "source": [ - "def show_map(node_colors):\n", - " # set the size of the plot\n", - " plt.figure(figsize=(18,13))\n", - " # draw the graph (both nodes and edges) with locations from romania_locations\n", - " nx.draw(G, pos = romania_locations, node_color = [node_colors[node] for node in G.nodes()])\n", + "class vacuumAgent(SimpleProblemSolvingAgentProgram):\n", + " def update_state(self, state, percept):\n", + " return percept\n", "\n", - " # draw labels for nodes\n", - " node_label_handles = nx.draw_networkx_labels(G, pos = node_label_pos, labels = node_labels, font_size = 14)\n", - " # add a white bounding box behind the node labels\n", - " [label.set_bbox(dict(facecolor='white', edgecolor='none')) for label in node_label_handles.values()]\n", "\n", + " def formulate_goal(self, state):\n", + " goal = [state7, state8]\n", + " return goal \n", "\n", - " # add edge lables to the graph\n", - " nx.draw_networkx_edge_labels(G, pos = romania_locations, edge_labels=edge_labels, font_size = 14)\n", + " def formulate_problem(self, state, goal):\n", + " problem = state\n", + " return problem \n", " \n", - " # add a legend\n", - " white_circle = lines.Line2D([], [], color=\"white\", marker='o', markersize=15, markerfacecolor=\"white\")\n", - " orange_circle = lines.Line2D([], [], color=\"orange\", marker='o', markersize=15, markerfacecolor=\"orange\")\n", - " red_circle = lines.Line2D([], [], 
color=\"red\", marker='o', markersize=15, markerfacecolor=\"red\")\n", - " gray_circle = lines.Line2D([], [], color=\"gray\", marker='o', markersize=15, markerfacecolor=\"gray\")\n", - " green_circle = lines.Line2D([], [], color=\"green\", marker='o', markersize=15, markerfacecolor=\"green\")\n", - " plt.legend((white_circle, orange_circle, red_circle, gray_circle, green_circle),\n", - " ('Un-explored', 'Frontier', 'Currently Exploring', 'Explored', 'Final Solution'),\n", - " numpoints=1,prop={'size':16}, loc=(.8,.75))\n", - " \n", - " # show the plot. No need to use in notebooks. nx.draw will show the graph itself.\n", - " plt.show()" + " def search(self, problem):\n", + " if problem == state1:\n", + " seq = [\"Suck\", \"Right\", \"Suck\"]\n", + " elif problem == state2:\n", + " seq = [\"Suck\", \"Left\", \"Suck\"]\n", + " elif problem == state3:\n", + " seq = [\"Right\", \"Suck\"]\n", + " elif problem == state4:\n", + " seq = [\"Suck\"]\n", + " elif problem == state5:\n", + " seq = [\"Suck\"]\n", + " elif problem == state6:\n", + " seq = [\"Left\", \"Suck\"]\n", + " return seq" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "We can simply call the function with node_colors dictionary object to display it." + "Now, we will define all the 8 states and create an object of the above class. 
Then, we will pass it different states and check the output:" ] }, { "cell_type": "code", - "execution_count": 10, + "execution_count": 13, "metadata": {}, "outputs": [ { - "data": { - "image/png": "<base64-encoded PNG output omitted>"
2MVOMePH8fb21sj/0Syiclk4sSJExrGLpITpk2bRmhoKPb29kRHR7N161Zmz55NYmIiS5Ys\noXLlynTr1o2LFy8aHVUKsHsra94rdP71O6D09HQuXrzIqVOnWLt2Lb///nuuZxQRERGRvMfR0ZFh\nw4bx008/4e7uTu3atXnttde4cOGC0dEKFBsbG27fvm10DJEC4/bt29jY2BgdI8eo2CmG2rVrF4cO\nHWLNmjXMnDmTHj160KVLFwD8/PyYOnUqFStW5MCBAwYnlYLsXpHz0qVLQNa5Yvfv309QUBDdunUj\nJCSEffv2UbhwYUNyioiIiEjeVKRIESZMmMCJEydwcHDAz8+PESNGcPXqVaOjFQglS5YkMTGRW7du\n3deYICIPz2QycevWLRITEylZsqTRcXKMlhoWwyQlJTF06FAOHjzI8OHDuXr1KjVq1Mh8PT09ndKl\nS2Npaal5OyXHnT59mjfeeIOpU6dSuXJlEhMTef/995k9eza1atUiLi6O+vXrGx1TRERERPKwp556\niunTpzN48GAmT55MlSpVGDRoEIMHD8bZ2dnoePnWvWaD8+fPk5qaanAakfzNxsaGUqVKFegmHs3Z\nKYY5duwYVatW5dy5c+zdu5fTp0/TokUL/Pz8MvfZsWMHbdq0ISkpycCkYi7q1KmDi4sLnTp1IjIy\nktTUVCZPnkyfPn2MjiYiIiIi+dDJkyeJjIxk8+bNjBgxgldffRUHBwejY4mIFGgqdoohzp49y9NP\nP83MmTMJDg4GyPyG7t68EQcPHiQyMpKiRYuycOFCo6KKGTl58iReXl4ADB06lDFjxlC0aFGDU4mI\niIhIfnf06FHGjh3Lvn37GDt2LL169SrQ8+WJiBhJc3aKIaZNm8bly5cJCwtj8uTJ3Lx5Exsbmywr\nYZ84cQILCwtGjRplYFIxJ56enowePRo3NzfeeustFTpFREREJFv4+fmxcuVKvvzyS5YvX46Pjw9f\nfPFF5kKZIiKSfdTZKYZwdnZmzZo17Nu3j5kzZzJy5EgGDBhw334ZGRlZCqAiucHa2pr//Oc/vPzy\ny0ZHEREREZECaMuWLbz55pskJyczefJkgoKCsiySKSIij09VJMl1K1asoFChQjRr1ow+ffrQuXNn\nwsPD6d+/P5cvXwYgLS2N9PR0FTrFENu2baNixYpa6VFEREREckRgYCC7du3irbfeYuzYsdSvX58t\nW7YYHUtEpEBQZ6fkukaNGtGoUSOmTp2auW3OnDm8/fbbBAcHM23aNAPTiYiIiIiI5J6MjAyWLVvG\n2LFjcXNzY8qUKdSrV8/oWCIi+ZaKnZKrfv/9d4oVK8ZPP/2Eh4cH6enpWFlZkZaWxqeffkpERATN\nmzdn5syZVKhQwei4IiIiIiIiuSI1NZVFixYxYcIEatasyaRJk/D39zc6lohIvqMxwpKrChcuzJUr\nV/Dw8ADAysoK+GOOxAEDBrB48WJ++OEHBg0axK1bt4yMKpKFyWQiPT3d6BgiIiIiUkDZ2Njw8ssv\n89NPP9GsWTNatmxJt27dOHnypNHRRETyFRU7JdcVL178ga916tSJ9957jytXruDo6JiLqUT+WXJy\nMuXLl+f8+fNGRxERERGRAsze3p7Bgwdz8uRJqlatSr169di2bZvmkxcReUgaxi550vXr1ylWrJjR\nMUSyGD16NGfOnOHzzz83OoqIiIiImIlr167h5OSEra2t0VFERPIFFTvFMCaTCQsLC6NjiDy0pKQk\nfHx8WLp0KY0aNTI6joiIiIiIiIj8hYaxi2FOnz5NWlqa0TFEHpqTkxPTpk0jPDxc83eKiIiIiIiI\n5EEqdophunTpwsaNG42OIfJIQkJCKFKkCJ9++qnRUURERERERETkLzSMXQzxww8/0LJlS3755Res\nra2NjiPySA4fPsyzzz7L8ePHKVGihNFxREREREREROR/1Nkphpg/fz4
9e/ZUoVPyJX9/f0JCQhgz\nZozRUURERERERETkT9TZKbkuJSUFV1dXdu3ahaenp9FxRB7L9evX8fHxYcOGDQQEBBgdR0RERERE\nRERQZ6cYYO3atfj4+KjQKflasWLFmDRpEuHh4eg7IxEREREREZG8QcVOyXXz58+nT58+RscQeWK9\ne/fmzp07LFmyxOgoIiIiIiIiIoKGsUsuS0xMpFq1apw7dw5HR0ej44g8se+++44XX3yREydO4Ozs\nbHQcEREREREREbOmzk7JVQsXLiQ4OFiFTikw6tWrR4sWLZg0aZLRUURERERERETMnjo7JddkZGRQ\nuXJlli5dSp06dYyOI5JtLl68iJ+fH99++y1VqlQxOo6IiIiImLH09HTS0tKws7MzOoqIiCHU2Sm5\nZseOHTg6OvL0008bHUUkW5UuXZrRo0czaNAgLVYkIiIiIoZr06YNO3bsMDqGiIghVOyUXDNv3jz6\n9OmDhYWF0VFEsl14eDhnzpxhzZo1RkcRERERETNmZWVFjx49GDNmjL6IFxGzpGHskitu3LhBhQoV\nOHnyJC4uLkbHEckR33zzDf369eOHH37AwcHB6DgiIiIiYqbS0tLw9fVl1qxZtGjRwug4IiK5Sp2d\nkiuWLl1KixYtVOiUAu3ZZ58lICCA6dOnGx1FRERERMyYtbU1EyZMYOzYseruFBGzo2Kn5Ir58+fT\np08fo2OI5Lj33nuPGTNm8MsvvxgdRURERETMWOfOnUlOTmb9+vVGRxERyVUqdkqOO3z4MBcvXtTw\nCTELFSpU4PXXXyciIsLoKCIiIiJixiwtLZk4cSLjxo0jIyPD6DgiIrlGxU7JcfPmzSMsLAwrKyuj\no4jkiuHDh7Nv3z5iY2ONjiIiIiIiZqxDhw5YWFiwcuVKo6OIiOQaLVAkOeru3bu4urqyZ88ePDw8\njI4jkmtWrlzJmDFjOHjwIDY2NkbHERERERERETEL6uyUHLV69Wr8/f1V6BSz06FDB8qVK8esWbOM\njiIiIiIiIiJiNtTZKTmqVatW9OzZk65duxodRSTXnThxgkaNGvHDDz9QqlQpo+OIiIiIiIiIFHgq\ndkqO+eWXX6hZsybnzp3DwcHB6DgihoiIiODq1assWLDA6CgiIiIiIiIiBZ6GsUuOWbhwIaGhoSp0\nilkbN24cmzZt4rvvvjM6ioiIiIiIiEiBp2Kn5IiMjAwWLFhAnz59jI4iYqjChQszdepUwsPDycjI\nMDqOiIiIiJipyMhI/Pz8jI4hIpLjVOyUHLFlyxaKFStGzZo1jY4iYrju3btjY2PD/PnzjY4iIiIi\nIvlIWFgYzz//fLacKyIigu3bt2fLuURE8jIVOyVHzJs3j969exsdQyRPsLS0ZNasWYwZM4br168b\nHUdEREREzJCTkxMlSpQwOoaISI5TsVOy3bVr19iwYQPdunUzOopInlGzZk3at2/P+PHjjY4iIiIi\nIvnQ3r17admyJS4uLhQuXJhGjRqxe/fuLPvMmTMHLy8v7O3tcXFxoVWrVqSlpQEaxi4i5kPFTsl2\nX3zxBc899xzFixc3OopInjJlyhSioqI4cuSI0VFEREREJJ+5efMmL730Ejt37uT777+nRo0atGnT\nhqtXrwKwb98+XnvtNcaPH8+PP/5IbGwsrVu3Nji1iEjuszY6gBQ88+bNY9q0aUbHEMlzXFxcGD9+\nPOHh4WzduhULCwujI4mIiIhIPhEYGJjl+cyZM/nqq6/YsGED3bt358yZMxQqVIh27drh7OyMu7s7\n1atXNyitiIhx1Nkp2erAgQNcv379vj/EIvKH/v37c/36dZYtW2Z0FBERERHJRy5fvkz//v3x8vKi\nSJEiODs7c/nyZc6cOQNAixYtcHd3p2LFinTr1o1FixZx8+ZNg1OLiOQ+FTslW926dYthw4ZhewDK\nkwAAIABJREFUaakfLZG/Y21tzcy
ZM4mIiCA5OdnoOCIiIiKST/Ts2ZO9e/fywQcfsGvXLg4ePIir\nqyspKSkAODs7c+DAAZYtW4abmxtvv/023t7enD9/3uDkIiK5SxUpyVZ169bl1VdfNTqGSJ7WpEkT\nGjduzFtvvWV0FBERERHJJ+Li4ggPD6dt27b4+vri7OzMhQsXsuxjbW1NYGAgb7/9NocPHyY5OZl1\n69YZlFhExBias1OylY2NjdERRPKFadOm4e/vT69evfD09DQ6joiIiIjkcV5eXnz++efUrVuX5ORk\nhg8fjq2tbebr69at4+eff6ZJkyYUL16crVu3cvPmTXx8fP713FeuXOGpp57KyfgiIrlGnZ0iIgYo\nV64cw4YNY8iQIUZHEREREZF8YP78+SQlJVGrVi1CQ0Pp3bs3FSpUyHy9aNGirFq1imeffRZvb2+m\nT5/O3Llzady48b+e+913383B5CIiucvCZDKZjA4hImKO7t69S7Vq1ZgxYwZt2rQxOo6IiIiImKni\nxYvzww8/UKZMGaOjiIg8MXV2iogYxM7OjhkzZjBo0CDu3r1rdBwRERERMVNhYWG8/fbbRscQEckW\n6uwUETFYUFAQDRs2ZOTIkUZHEREREREzdPnyZby9vTl48CBubm5GxxEReSIqdoqIGOzkyZPUrVuX\nw4cPU65cOaPjiIiIiIgZGjVqFNeuXWPOnDlGRxEReSIqdoqI5AFvvvkmp06d4osvvjA6ioiIiIiY\noWvXruHl5cX333+Ph4eH0XFERB6bip0iInlAcnIyPj4+fP755zRp0sToOCIiIiJihiIjIzl9+jQL\nFy40OoqIyGNTsVNEJI9YtmwZU6ZMYf/+/VhbWxsdR0RERETMzG+//Yanpyc7d+7E29vb6DgiIo9F\nq7FLjrt9+zaxsbGcOnXK6CgieVpwcDAlSpTQPEkiIiIiYogiRYowdOhQJkyYYHQUEZHHps5OyXHp\n6ekMGzaMzz77jIoVKxIaGkpwcDDly5c3OppInnP06FECAwM5duwYLi4uRscRERERETOTlJSEp6cn\nMTEx+Pv7Gx1HROSRqdgpuSYtLY0tW7YQFRXFqlWrqFq1KiEhIQQHB1O6dGmj44nkGYMGDeLOnTvq\n8BQRERERQ7z//vvs3LmTlStXGh1FROSRqdgphkhJSSEmJobo6GjWrl1LzZo1CQkJ4cUXX1Q3m5i9\nGzdu4O3tzfr166lVq5bRcURERETEzNy+fRtPT0/WrFmjz6Miku+o2CmGu337Nhs2bCA6OpqNGzdS\nv359QkJCeOGFFyhatKjR8UQMMW/ePObNm0dcXByWlppeWURERERy1+zZs1m/fj1ff/210VFERB6J\nip2SpyQlJbFu3Tqio6PZsmULzzzzDCEhIbRr1w5nZ2ej44nkmoyMDOrVq8fAgQPp0aOH0XFERERE\nxMzcvXsXLy8vli5dSoMGDYyOIyLy0FTslCd2+/ZtrKyssLW1zdbz/vbbb6xevZro6Gji4uJo0aIF\nISEhtG3bFkdHx2y9lkhetGfPHl544QVOnDhB4cKFjY4jIiIiImZm7ty5LF26lNjYWKOjiIg8NBU7\n5Yl99NFH2Nvb069fvxy7xrVr11i5ciVRUVHs3buX5557jtDQUFq3bo2dnV2OXVfEaL1796Z48eJM\nnz7d6CgiIiIiYmZSU1Px8fHhv//9L82aNTM6jojIQ9FEcPLErl27xvnz53P0GsWLF6dPnz5s3ryZ\nH3/8kcaNG/P+++9TunRpevbsyYYNG0hNTc3RDCJGePvtt1m0aBHHjx83OoqIiIiImBkbGxvGjx/P\n2LFjUZ+UiOQXKnbKE7O3t+f27du5dr1SpUoxYMAAtm/fztGjR6lZsyYTJ06kTJky9O3bl9jYWNLS\n0nItj0hOKlWqFG+++SaDBg3SB0wRERERyXVdu3bl6tWrxMTEGB1FROShqNgpT8ze3p47d+4Ycu1y\
n5coxaNAgdu/ezf79+/Hy8mLEiBGUK1eO1157jR07dpCRkWFINpHs8tprr5GYmMiqVauMjiIiIiIi\nZsbKyooJEyYwZswYffkuIvmCip3yxBwcHAwrdv6Zu7s7w4YNY9++fXz77beULVuWgQMH4ubmxpAh\nQ/juu+/0x1nyJRsbG2bOnMnQoUNztYtaRERERASgU6dOpKSksHbtWqOjiIj8KxU75Ynl9jD2h+Hp\n6cmbb77J4cOHiYmJoXDhwoSFheHh4cGIESM4cOCACp+SrwQGBlK7dm3effddo6OIiIiIiJmxtLRk\n4sSJjB07ViPnRCTP02rsYjZMJhOHDh0iOjqa6OhorKysCA0NJSQkBD8/P6PjifyrM2fOEBAQwP79\n+6lQoYLRcURERETEjJhMJurUqcPw4cMJDg42Oo6IyAOp2ClmyWQysW/fPqKioli2bBmFCxfOLHx6\neXkZHU/kgSZNmsTBgwf56quvjI4iIiIiImZm06ZNDBkyhCNHjmBlZWV0HBGRv6Vip5i9jIwMdu/e\nTXR0NMuXL6d06dKEhobSuXNnKlasaHQ8kSzu3LlD1apV+fTTT3n22WeNjiMiIiIiZsRkMtG4cWNe\neeUVunfvbnQcEZG/pWKnyJ+kp6ezY8cOoqOj+eqrr/Dw8CAkJITOnTvj6upqdDwRAFavXs2oUaM4\ndOgQNjY2RscRERERETOybds2Xn75ZY4fP67PoiKSJ6nYKfIAqampbNmyhejoaFatWoWvry8hISF0\n6tSJ0qVLGx1PzJjJZOK5556jZcuWDB061Og4IiIiImJmmjdvTteuXenTp4/RUURE7qNipxji+eef\nx8XFhYULFxod5aHcvXuXmJgYoqOjWbduHbVq1SIkJISOHTvi4uJidDwxQz/++CMNGzbk6NGjKr6L\niIiISK7atWsXXbp0ISEhATs7O6PjiIhkYWl0AMlbDhw4gJWVFQ0bNjQ6Sp5iZ2dHUFAQn3/+ORcu\nXGDAgAF88803VKpUieeee46FCxdy48YNo2OKGalSpQq9e/dm5MiRRkcRERERETPToEEDfH19mTdv\nntFRRETuo85OyWLAgAFYWVmxePFivvvuO3x8fB64b2pq6mPP0ZLfOjsfJCkpiXXr1hEVFcWWLVto\n1qwZISEhBAUF4ezsbHQ8KeBu3ryJt7c3X375JfXr1zc6joiIiIiYkf3799OuXTtOnjyJg4OD0XFE\nRDKps1My3b59my+++IJ+/frRqVOnLN/SnT59GgsLC5YuXUpgYCAODg7MmTOHq1ev0qVLF1xdXXFw\ncMDX15cFCxZkOe+tW7cICwvDycmJUqVK8dZbb+X2reUYJycnQkNDWbVqFWfPnuXFF1/k888/x9XV\nleDgYL788ktu3bpldEwpoJydnXnnnXcIDw8nPT3d6DgiIiIiYkZq1apFnTp1+M9//mN0FBGRLFTs\nlExffvkl7u7uVKtWjZdeeonFixeTmpqaZZ9Ro0YxYMAAjh07RocOHbhz5w41a9Zk3bp1/PDDDwwa\nNIj+/fsTGxubeUxERASbN2/mq6++IjY2lvj4eHbs2JHbt5fjihQpQo8ePfj666/5v//7P1q1asV/\n/vMfypYtS9euXVmzZg137941OqYUMN26dcPe3p758+cbHUVEREREzMzEiRN55513SEpKMjqKiEgm\nDWOXTE2bNuX5558nIiICk8lExYoVmT59Op06deL06dOZz994441/PE9oaChOTk7MnTuXpKQkSpQo\nwfz58+nWrRvwx9BvV1dXOnTokO+HsT+MS5cu8dVXXxEdHc2RI0do164doaGhNG/e/LGnARD5s/j4\neJ577jmOHz9OsWLFjI4jIiIiImYkNDSU6tWrM2rUKKOjiIgA6uyU/zl58iRxcXF07doVAAsLC7p1\n63bfhNO1a9fO8jw9PZ0pU6bg7+9PiRIlcHJyYsWKFZw5cwaAn
3/+mZSUlCzzCTo5OVGtWrUcvqO8\no1SpUgwYMIDt27dz5MgRatSowYQJEyhbtiz9+vUjNjZWQ5DliQQEBPDCCy8wbtw4o6OIiIiIiJmJ\njIzk/fff57fffjM6iogIoGKn/M/cuXNJT0/Hzc0Na2trrK2tmTp1KjExMZw9ezZzv0KFCmU5bvr0\n6bz33nsMGzaM2NhYDh48SIcOHUhJScntW8gXypUrx+DBg9m9ezd79+7F09OT4cOHU65cOQYOHMjO\nnTvJyMgwOqbkQ5MnTyY6OprDhw8bHUVEREREzIi3tzdt2rThgw8+MDqKiAigYqcAaWlpLFq0iLff\nfpuDBw9mPg4dOoS/v/99Cw79WVxcHEFBQbz00kvUqFGDSpUqkZCQkPl6pUqVsLGx4bvvvsvclpyc\nzNGjR3P0nvKDChUqMHz4cPbv38/OnTspXbo0AwYMwM3NjaFDh7Jnzx40y4Q8rBIlSjBhwgTCw8P1\ncyMiIiIiuWrcuHHMmjWLq1evGh1FRETFToH169fz66+/0rdvX/z8/LI8QkNDWbBgwQOLJ15eXsTG\nxhIXF8eJEycYOHAgp06dynzdycmJPn36MGLECDZv3swPP/xA7969NWz7LypXrsyYMWM4cuQImzZt\nwsnJiR49euDh4cHIkSOJj49XAUv+Vb9+/fj999+Jjo42OoqIiIiImJFKlSrRsWNHpk+fbnQUEREt\nUCTQrl077ty5Q0xMzH2v/d///R+VKlVizpw59O/fn71792aZt/P69ev06dOHzZs34+DgQFhYGElJ\nSRw7doxt27YBf3Ryvvrqq6xYsQJHR0fCw8PZs2cPLi4uZrFA0eMymUwcOnSIqKgooqOjsbGxITQ0\nlJCQEHx9fY2OJ3lUXFwcXbp04fjx4zg5ORkdR0RERETMxJkzZwgICOD48eOULFnS6DgiYsZU7BTJ\nB0wmE3v37iU6Opro6GiKFi2aWfisXLmy0fEkj+nevTtubm689dZbRkcRERERETPy1ltvERYWRtmy\nZY2OIiJmTMVOkXwmIyODXbt2ER0dzfLlyylbtiyhoaF07tyZChUqGB1P8oDz58/j7+/Pd999h6en\np9FxRERERMRM3CsvWFhYGJxERMyZip0i+Vh6ejrbt28nOjqaFStWUKlSJUJCQujcuTPlypUzOp4Y\n6N1332XHjh2sW7fO6CgiIiIiIiIiuUbFTpECIjU1ldjYWKKjo1m9ejV+fn6EhITQqVMnSpUqZXQ8\nyWUpKSlUq1aN999/n7Zt2xodR0RERERERCRXqNgpUgDdvXuXTZs2ER0dzfr166lduzYhISF07NiR\nEiVKPPZ5MzIySE1Nxc7OLhvTSk7ZuHEj4eHhHD16VP9mIiIiIiIiYhZU7BQp4G7fvs3XX39NVFQU\nMTExNGzYkJCQEDp06ECRIkUe6VwJCQl8+OGHXLx4kcDAQHr16oWjo2MOJZfs0L59e+rVq8eoUaOM\njiIiIiIiwv79+7G3t8fX19foKCJSQFkaHUAKhrCwMBYuXGh0DPkbDg4OvPjiiyxfvpzExEReeukl\nVq5cSfny5enQoQNLly4lKSnpoc51/fp1ihcvTrly5QgPD2fGjBmkpqbm8B3Ik/jggw+YPn06Z8+e\nNTqKiIiIiJixXbt24ePjQ5MmTWjXrh19+/bl6tWrRscSkQJIxU7JFvb29ty5c8foGPIvnJyc6NKl\nC6tWreLMmTO88MILfPbZZ5QrV47g4GC+++47/qnZu27dukyaNIlWrVrx1FNPUa9ePWxsbHLxDuRR\neXh4MGDAAIYNG2Z0FBERERExU7/99huvvPIKXl5e7Nmzh0mTJnHp0iVef/11o6OJSAFkbXQAKRjs\n7e25ffu20THkERQtWpSePXvSs2dPrl69yooVKyhatOg/HpOSkoKtrS1Lly6latWqVKlS5W/3u3Hj\nBgsWLMDd3Z0XXngBCwuLn
LgFeUijRo3Cx8eHbdu20bRpU6PjiIiIiIgZuHXrFra2tlhbW7N//35+\n//13Ro4ciZ+fH35+flSvXp369etz9uxZypcvb3RcESlA1Nkp2UKdnflbiRIl6Nu3L97e3v9YmLS1\ntQX+WPimVatWlCxZEvhj4aKMjAwAvvnmG8aPH88bb7zBq6++yrfffpvzNyD/yNHRkenTp/P666+T\nlpZmdBwRERERKeAuXrzIZ599RkJCAgDu7u6cO3eOgICAzH0KFSqEv78/N27cMCqmiBRQKnZKtnBw\ncFCxs4BLT08HYP369WRkZNCgQYPMIeyWlpZYWlry4Ycf0rdvX5577jmefvppXnjhBTw8PLKc5/Ll\ny+zfvz/X85u7Tp064eLiwieffGJ0FBEREREp4GxsbJg+fTrnz58HoFKlStStW5eBAwdy9+5dkpKS\nmDJlCmfOnMHV1dXgtCJS0KjYKdlCw9jNx4IFC6hduzaenp6Z2w4cOEDfvn1ZsmQJ69evp06dOpw9\ne5Zq1apRtmzZzP0+/vhj2rZtS3BwMIUKFWLYsGEkJycbcRtmx8LCgpkzZzJx4kSuXLlidBwRERER\nKcBKlChBrVq1+OSTTzKbYlavXs3PP/9M48aNqVWrFvv27WPevHkUK1bM4LQiUtCo2CnZQsPYCzaT\nyYSVlRUAW7ZsoXXr1ri4uACwc+dOunfvTkBAAN9++y1Vq1Zl/vz5FC1aFH9//8xzxMTEMGzYMGrV\nqsXWrVtZvnw5a9asYcuWLYbckzny9fWlW7dujB492ugoIiIiIlLAffDBBxw+fJjg4GBWrlzJ6tWr\n8fb25ueffwagf//+NGnShPXr1/POO+9w6dIlgxOLSEGhBYokW2gYe8GVmprKO++8g5OTE9bW1tjZ\n2dGwYUNsbW1JS0vj0KFD/PTTTyxatAhra2v69etHTEwMjRs3xtfXF4ALFy4wYcIE2rZty3/+8x/g\nj3l7lixZwrRp0wgKCjLyFs1KZGQkPj4+7Nu3j9q1axsdR0REREQKqDJlyjB//ny++OILXnnlFUqU\nKMFTTz1Fr169GDZsGKVKlQLgzJkzbNq0iWPHjrFo0SKDU4tIQaBip2QLdXYWXJaWljg7OzN58mSu\nXr0KwIYNG3Bzc6N06dL069eP+vXrExUVxXvvvcdrr72GlZUVZcqUoUiRIsAfw9z37NnD999/D/xR\nQLWxsaFQoULY2tqSnp6e2TkqOato0aJMmTKFgQMHsmvXLiwt1eAvIiIiIjmjcePGNG7cmPfee48b\nN25ga2ubOUIsLS0Na2trXnnlFRo2bEjjxo3Zs2cPdevWNTi1iOR3+l+uZAvN2VlwWVlZMWjQIK5c\nucIvv/zC2LFjmTNnDr169eLq1avY2tpSq1Ytpk2bxo8//kj//v0pUqQIa9asITw8HIAdO3ZQtmxZ\natasiclkylzY6PTp03h4eOhnJ5eFhYVhMplYvHix0VFERERExAw4Ojpib29/X6EzPT0dCwsL/P39\neemll5g1a5bBSUWkIFCxU7KFOjvNQ/ny5ZkwYQIXLlxg8eLFmR9W/uzw4cN06NCBI0eO8M477wAQ\nFxdHq1atAEhJSQHg0KFDXLt2DTc3N5ycnHLvJgRLS0tmzpzJqFGj+O2334yOIyIiIiIFWHp6Os2b\nN6dGjRoMGzaM2NjYzGaHP4/uunnzJo6OjqSnpxsVVUQKCBU7JVtozk7zU7Jkyfu2nTp1in379uHr\n64urqyvOzs4AXLp0iSpVqgBgbf3H7BmrV6/G2tqaevXqAX8sgiS5p06dOrRp04YJEyYYHUVERERE\nCjArKytq167NuXPnuHr1Kl26dOHpp5+mX79+fPnll+zdu5e1a9eyYsUKKlWqpOmtROSJWZhUYZBs\nsHPnTkaPHs3OnTuNjiIGMZlMWFhY8NNPP2Fvb0/58uUxmUykpqYyYMAAjh07xs6dO7GysiI
5OZnK\nlSvTtWtXxo8fn1kUldx1+fJlfH192b59O1WrVjU6joiIiIgUUHfu3KFw4cLs3r2batWq8cUXX7B9\n+3Z27tzJnTt3uHz5Mn379mX27NlGRxWRAkDFTskWe/fu5dVXX2Xfvn1GR5E8aM+ePYSFhVG/fn08\nPT354osvSEtLY8uWLZQtW/a+/a9du8aKFSvo2LEjxYsXNyCx+fjwww9Zu3YtmzdvxsLCwug4IiIi\nIlJADRkyhLi4OPbu3Ztl+759+6hcuXLm4qb3mihERB6XhrFLttAwdnkQk8lE3bp1WbBgAb///jtr\n166lZ8+erF69mrJly5KRkXHf/pcvX2bTpk1UrFiRNm3asHjxYs0tmUMGDBjAxYsXWbFihdFRRERE\nRKQAmz59OvHx8axduxb4Y5EigNq1a2cWOgEVOkXkiamzU7LFyZMnad26NSdPnjQ6ihQgN2/eZO3a\ntURHR7N161YCAwMJDQ0lKCiIQoUKGR2vwNi6dSu9evXi2LFjODo6Gh1HRERERAqocePG8euvv/Lx\nxx8bHUVECjAVOyVbnDt3jrp165KYmGh0FCmgbty4wapVq4iOjmbXrl20atWK0NBQnnvuORwcHIyO\nl+917twZHx8fLVgkIiIiIjnqxIkTVKlSRR2cIpJjVOyUbPHrr79SpUoVrl69anQUMQO//vorK1as\nIDo6mgMHDtC2bVtCQkJo2bIldnZ2RsfLl86cOUNAQAD79u2jYsWKRscREREREREReSwqdkq2SE5O\npmTJkiQnJxsdRczMxYsX+fLLL4mOjubYsWO0b9+ekJAQAgMDsbGxMTpevjJ58mT279/PypUrjY4i\nIiIiImbAZDKRmpqKlZUVVlZWRscRkQJCxU7JFmlpadjZ2ZGWlqbhCGKYc+fOsXz5cqKiojh16hQd\nO3YkJCSEJk2a6MPTQ7hz5w6+vr588skntGzZ0ug4IiIiImIGWrZsSadOnejXr5/RUUSkgFCxU7KN\njY0NycnJ2NraGh1FhFOnTrFs2TKioqK4ePEiwcHBhISEUL9+fSwtLY2Ol2etWbOG4cOHc/jwYf0u\ni4iIiEiO27NnD8HBwSQkJGBvb290HBEpAFTslGzj7OxMYmIihQsXNjqKSBYJCQlER0cTFRXFzZs3\n6dy5MyEhIdSuXVudyH9hMplo06YNzZs3JyIiwug4IiIiImIGgoKCaNmyJeHh4UZHEZECQMVOyTYl\nS5bk6NGjlCxZ0ugoIg909OhRoqOjiY6OJj09nZCQEEJCQvD391fh838SEhJo0KABR44coUyZMkbH\nEREREZECLj4+nrZt23Ly5EkcHR2NjiMi+ZyKnZJt3Nzc2LlzJ+7u7kZHEflXJpOJ+Pj4zMKnvb09\noaGhhISE4OPjY3Q8w40YMYILFy6wePFio6OIiIiIiBno1KkT9erV0+giEXliKnZKtvHy8mLt2rVU\nqVLF6Cgij8RkMvH9998TFRXFsmXLKFGiRGbHp6enp9HxDHHz5k18fHxY9v/Yu+/4ms/+j+Pvkx0Z\nZoyipYhRFI3ZofaqURRVW42qVaVGhITEKKUtOmyldmmb1uhNaYtatYnaO3YViQzJ9/dHb/k1N1rj\nnFwZr+fjcR7J+Z7veJ/cd7+Sz/lc17V4sapUqWI6DgAAANK5/fv3q3r16jpy5Ih8fHxMxwGQhrFK\nB+zG09NTMTExpmMAD81ms6lixYqaOHGiTp8+rcmTJ+vcuXN6/vnnFRAQoHHjxunkyZOmY6YoHx8f\njR07Vj179lRCQoLpOAAAAEjnnnnmGdWsWVMff/yx6SgA0jiKnbAbDw8Pip1I85ycnPTSSy9pypQp\nOnv2rMaOHatDhw7pueeeU5UqVfTRRx/p3LlzpmOmiNatW8vLy0vTp083HQUAAAAZwPDhw/Xhhx/q\n2rVrpqMASMModsJuPDw8dOvWLdMxALtxcXFRjRo1NG3
aNEVGRiooKEg7d+7UM888o5dfflmffvqp\nLl68aDqmw9hsNk2aNEnDhg3T1atXTccBAABAOufv76+GDRtqwoQJpqMASMOYsxN2U6dOHb3zzjuq\nW7eu6SiAQ8XExGj16tVatGiRVqxYoQoVKqhly5Z69dVXlS1bNtPx7K5Hjx6y2WyaMmWK6SgAAABI\n506cOKGAgAAdPHhQOXLkMB0HQBpEZyfshjk7kVF4eHiocePGmj9/vs6dO6cuXbpo5cqVKliwoBo0\naKC5c+fq+vXrpmPazciRI7V06VLt3r3bdBQAAACkcwUKFNBrr72mcePGmY4CII2i2Am7YRg7MqJM\nmTLptdde09KlS3XmzBm1bt1aS5YsUf78+fXqq69q0aJFioqKMh3zsWTPnl0hISHq1auXGAwAAAAA\nRwsMDNT06dN1/vx501EApEEUO2E3LFCEjM7Hx0dvvPGGvv32W504cUKNGjXSrFmz9MQTT6hly5Za\nvnx5mv1vpEuXLrp586YWLFhgOgoAAADSuXz58qlt27YaM2aM6SgA0iDm7ITdvPXWWypdurTeeust\n01GAVOXy5ctatmyZFi5cqJ07d+qVV15Ry5YtVbt2bbm5uZmO98A2btyoli1b6uDBg/L29jYdBwAA\nAOnY+fPn9cwzz2j37t3Kly+f6TgA0hA6O2E3dHYC95YjRw517dpVP/74oyIiIlSxYkWNGTNGefLk\nUefOnfXDDz/o9u3bpmP+q+eff17VqlVTaGio6SgAAABI53Lnzq0333xTYWFhpqMASGPo7ITdDB48\nWD4+PhoyZIjpKECacPr0aS1ZskQLFy7UiRMn1KxZM7Vs2VIvvviinJ2dTce7p8jISJUqVUqbNm2S\nv7+/6TgAAABIx65cuSJ/f39t375dBQsWNB0HQBpBZyfshs5O4OHkz59f/fr109atW7V582Y99dRT\neuedd5Q/f3716dNHmzZtUmJioumYyeTJk0eDBg1S3759WawIAAAADpU9e3a9/fbbGjlypOkoANIQ\nip2wG09PT4qdwCN6+umnNWjQIO3cuVPr1q1T9uzZ9eabb6pAgQIaMGCAtm/fnmqKi71799axY8f0\n3XffmY4CAACAdK5fv34KDw/XoUOHTEcBkEZQ7ITdeHh46NatW6ZjAGle0aJFNWzYMO3PQE1aAAAg\nAElEQVTfv1/ff/+93N3d9frrr6tIkSIKDAzUnj17jBY+3dzc9PHHH6tv3758wAEAAACHypIli/r2\n7auQkBDTUQCkERQ7YTcMYwfsy2azqVSpUgoNDdWhQ4e0ePFixcfHq1GjRipRooSCg4MVERFhJFvt\n2rVVunRpffDBB0auDwAAgIyjd+/eWrNmjfbt22c6CoA0gGIn7IZh7IDj2Gw2lStXTu+//76OHz+u\nWbNm6dq1a6pZs6aeffZZjRo1SkePHk3RTBMmTNDEiRN1+vTpFL0uAAAAMhYfHx8NGDBAwcHBpqMA\nSAModsJu6OwEUobNZlOlSpX04Ycf6vTp05o0aZLOnDmjKlWqqHz58ho/frxOnTrl8BwFCxbU22+/\nrf79+zv8WgAAAMjYevTooU2bNmnnzp2mowBI5Sh2wm6YsxNIeU5OTnrppZf0ySef6OzZsxo9erR+\n//13lStXTs8//7w+/vhjRUZGOuz6AwcO1JYtW7Ru3TqHXQMAAADIlCmTBg8erGHDhpmOAiCVo9gJ\nu6GzEzDLxcVFNWvW1LRp03Tu3DkFBgbqt99+U4kSJVStWjV99tlnunTpkl2vmSlTJn3wwQfq3bu3\nbt++bddzAwAAAH/XtWtX7d69W5s3bzYdBUAqRrETdsOcnUDq4ebmpvr162vOnDmKjIxUnz599NNP\nP6lIkSKqU6eOZs6cqT/++MMu12ratKly5cqlTz75xC7nAwAAAO7F3d1dQ4cOpbsTwD+yWZZlmQ6B\n9GH79u3q1q2bfvv
tN9NRANxHVFSUvv/+ey1atEhr1qzRSy+9pJYtW6pRo0by9fV95PMeOHBAVatW\n1cGDB5U9e3Y7JgYAAAD+X3x8vIoVK6ZZs2bppZdeMh0HQCpEZyfshmHsQOrn5eWlFi1a6KuvvtLp\n06fVsmVLLVq0SPnz51fTpk21ePFiRUVFPfR5S5Qooa1bt8rHx8cBqQEAAIC/uLq6avjw4Ro6dKjo\n3QJwLxQ7YTcMYwfSFl9fX7Vp00bh4eE6ceKEGjZsqBkzZihv3rxq1aqVli9f/lD/TRcoUEBubm4O\nTAwAAABIb7zxhi5evKg1a9aYjgIgFWIYO+zm7NmzqlChgs6ePWs6CoDHcOnSJS1btkyLFi3Szp07\n1bBhQ7Vs2VK1atWimAkAAIBUYdGiRZo4caJ+/fVX2Ww203EApCJ0dsJuPDw8dOvWLdMxADwmPz8/\ndevWTT/++KMOHDig8uXLa/To0XriiSf05ptv6j//+Q8rrwMAAMCo1157TdHR0fr+++9NRwGQytDZ\nCbuJioqSn5+foqOjTUcB4ACnTp3SkiVLtGjRIp08eVKvvfaaJk6cKFdXV9PRAAAAkAF9/fXXGjFi\nhLZv3y4nJ3q5APyFYifsxrIsHTlyRIULF2YYAZDOHT16VDt37lTdunXl7e1tOg4AAAAyIMuyVL58\neQ0ePFjNmjUzHQdAKkGxEwAAAAAApEkrV65U//79tWfPHjk7O5uOAyAVoM8bAAAAAACkSXXr1lXm\nzJm1aNEi01EApBJ0dgIAjFqzZo2+/vpr5cqVS7lz5076eud7d3d30xEBAACQiv3444/q3r27Dhw4\nIBcXF9NxABhGsRMAYIxlWYqIiNDatWt1/vx5XbhwQefPn0/6/sKFC/Ly8kpWBP3fYuidrzlz5mSx\nJAAAgAyqWrVqateunTp27Gg6CgDDKHYCAFIty7L0xx9/JCuA/u/3d75evnxZWbJkuW8x9O/bcuTI\nwZxOAAAA6ciGDRvUtm1b/f7773JzczMdB4BBFDuRYuLj4+Xk5ESBAYBDJCQk6MqVK/ctiv79+2vX\nril79ux3FUXvVSDNli2bbDab6bcHAACAf1G3bl01adJE3bt3Nx0FgEEUO2E3q1evVqVKlZQ5c+ak\nbXf+72Wz2TR9+nQlJiaqa9eupiICgKS/Pny5dOnSPTtE//f7qKgo5cyZ875F0b9/7+vrm2YLo9Om\nTdNPP/0kT09PVatWTa+//nqafS8AACBj2rZtm1599VUdOXJEHh4epuMAMIRiJ+zGyclJGzduVOXK\nle/5+tSpUzVt2jRt2LCBBUcApBmxsbFJ84febwj9ne/j4uL+dQj9na/e3t6m35okKSoqSn369NGm\nTZvUqFEjnT9/XocPH1arVq3Uq1cvSVJERIRGjBihzZs3y9nZWe3atdOwYcMMJwcAALhb48aNVb16\ndfXp08d0FACGUOyE3Xh5eWnBggWqXLmyoqOjFRMTo5iYGN26dUsxMTHasmWLBg8erKtXrypLliym\n4wKA3UVFRSUrjN6vQBoZGSlnZ+d/HUJ/53tHdib8+uuvql27tmbNmqXmzZtLkj777DMFBQXp6NGj\nunDhgqpXr66AgAD1799fhw8f1rRp0/Tyyy8rLCzMYbkAAAAexe7du1W3bl0dOXJEXl5epuMAMIBi\nJ+wmT548unDhgjw9PSX9NXT9zhydzs7O8vLykmVZ2r17t7JmzWo4LYCUdvv2bSUmJjJhvP6a4uPG\njRsP1C165776oCvSP+zPd+7cuRo4cKCOHj0qNzc3OTs76+TJk2rYsKF69uwpV1dXBQUF6eDBg0nd\nqDNnzlRISIh27typbNmyOeJHBAAA8MhatGihgIAAvffee6ajADDAxXQApB8JCQl69913Vb16dbm4\nuMjFxUWurq5JX52dnZWYmCgfHx/TUQEYYFmWnn/+ec2YMUOlS5c2Hccom80mX19f+
fr6qkiRIv+4\nr2VZunbt2j3nEz18+HCybZcuXVLmzJnvKoYGBQXd90MmHx8fxcbG6ttvv1XLli0lSStXrlRERISu\nX78uV1dXZc2aVd7e3oqNjZW7u7uKFSum2NhY/fLLL2rcuLHdfz4AAACPIyQkRFWrVlX37t3l6+tr\nOg6AFEaxE3bj4uKi5557TvXq1TMdBUAq5OrqqhYtWigsLEyLFi0yHSfNsNlsypo1q7JmzarixYv/\n476JiYlJK9L/vQj6T/Mk161bV506dVLv3r01c+ZM5cyZU2fOnFFCQoL8/PyUN29enT59WvPnz1fr\n1q118+ZNTZo0SZcuXVJUVJS93y4AAMBjK168uOrWrauPPvpIQUFBpuMASGEMY4fdBAYGqmHDhqpU\nqdJdr1mWxaq+AHTz5k0VKlRI69ev/9fCHVLOtWvXtGHDBv3yyy/y9vaWzWbT119/rZ49e6pDhw4K\nCgrS+PHjZVmWihcvLh8fH50/f16jRo1KmudT+uteL4n7PQAAMO7IkSOqVKmSDh8+zDRqQAZDsRMp\n5o8//lB8fLxy5MghJycn03EAGDJq1CgdOHBA8+bNMx0F9zFy5Eh9++23mjp1qsqWLStJ+vPPP3Xg\nwAHlzp1bM2fO1Nq1a/X+++/rhRdeSDrOsiwtWLBAgwcPfqDFl1LLivQAACB96tKli3LlyqXQ0FDT\nUQCkIIqdsJslS5aoUKFCKleuXLLtiYmJcnJy0tKlS7V9+3b17NlT+fLlM5QSgGnXr19XoUKFtGnT\npn+drxKOt3PnTiUkJKhs2bKyLEvLly/XW2+9pf79+2vAgAFJXZp//5CqatWqypcvnyZNmnTXAkXx\n8fE6c+bMP65If+dhs9nuWxT93wLpncXvAAAAHtTJkydVrlw5HTx4UH5+fqbjAEghFDthN88995wa\nNmyo4ODge77+66+/qlevXvrggw9UtWrVlA0HIFUJDg7WqVOnNHPmTNNRMrxVq1YpKChIN27cUM6c\nOXX16lXVrFlTYWFh8vLy0ldffSVnZ2dVqFBB0dHRGjx4sH755Rd9/fXX95y25EFZlqWbN28+0Ir0\n58+fl4eHx7+uSJ87d+5HWpEeAACkXz179pSnp6fGjRtnOgqAFMICRbCbzJkz6+zZs/r999918+ZN\n3bp1SzExMYqOjlZsbKzOnTunXbt26dy5c6ajAjCsT58+Kly4sI4fP66CBQuajpOhVatWTTNmzNCh\nQ4d0+fJlFS5cWDVr1kx6/fbt2woMDNTx48fl5+ensmXLavHixY9V6JT+mtfTx8dHPj4+Kly48D/u\ne2dF+nsVQzdu3JisMHrx4kX5+vr+6xD6XLlyyc/PTy4u/CoEAEB6NmTIEJUqVUr9+vVTnjx5TMcB\nkALo7ITdtG3bVl9++aXc3NyUmJgoZ2dnubi4yMXFRa6urvL29lZ8fLxmz56tGjVqmI4LALiPey0q\nFx0drStXrihTpkzKnj27oWT/LjExUVevXn2gbtGrV68qW7Zs/9gteudr9uzZmW8aAIA06t1331V8\nfLw+/vhj01EApACKnbCbFi1aKDo6WuPGjZOzs3OyYqeLi4ucnJyUkJCgrFmzyt3d3XRcAEAGd/v2\nbV2+fPm+xdC/b7tx44Zy5MjxQHOMZsmShRXpAQBIRS5evKjixYtr586devLJJ03HAeBgFDthN+3a\ntZOTk5Nmz55tOgoAAHYVFxenixcv3nfBpb8XSG/dunVXZ+j9CqTe3t4URgEASAFDhgzRlStX9Pnn\nn5uOAsDBKHbCblatWqW4uDg1atRI0v8Pg7QsK+nh5OTEH3UAgHTt1q1bunDhwgOtSG9Z1gOvSJ8p\nUybTbw0AgDTr6tWr8vf315YtW1SoUCHTcQA4EMVOAAAAQx5mRXo3Nzflzp1ba9asYQgeAACPICQk\nRMeOHdOcOXNMRwHgQBQ7YVcJCQmKiIjQkSNHV
KBAAZUpU0YxMTHasWOHbt26pZIlSypXrlymYwKw\no5dfflklS5bU5MmTJUkFChRQz5491b9///se8yD7APh/lmXpzz//1IULF1SgQAHmvgYA4BH8+eef\nKlKkiH7++WcVK1bMdBwADuJiOgDSl7Fjx2ro0KFyc3OTn5+fRo4cKZvNpj59+shms6lJkyYaM2YM\nBU8gDbl06ZKGDx+uFStWKDIyUlmyZFHJkiU1aNAg1apVS8uWLZOrq+tDnXPbtm3y8vJyUGIg/bHZ\nbMqSJYuyZMliOgoAAGlW5syZ1a9fPwUHB2vhwoWm4wBwECfTAZB+/PTTT/ryyy81ZswYxcTEaOLE\niRo/frymTZumTz75RLNnz9b+/fs1depU01EBPIRmzZpp69atmjFjhg4dOqTvvvtO9erV05UrVyRJ\n2bJlk4+Pz0Od08/Pj/kHAQAAkOJ69uyp9evXa8+ePaajAHAQip2wm9OnTytz5sx69913JUnNmzdX\nrVq15O7urtatW6tx48Zq0qSJtmzZYjgpgAd17do1/fLLLxozZoxq1Kihp556SuXLl1f//v3VqlUr\nSX8NY+/Zs2ey427evKk2bdrI29tbuXPn1vjx45O9XqBAgWTbbDabli5d+o/7AAAAAI/L29tbAwcO\n1PDhw01HAeAgFDthN66uroqOjpazs3OybVFRUUnPY2NjFR8fbyIegEfg7e0tb29vffvtt4qJiXng\n4yZMmKDixYtrx44dCgkJ0ZAhQ7Rs2TIHJgUAAAAeTPfu3bVt2zb99ttvpqMAcACKnbCb/Pnzy7Is\nffnll5KkzZs3a8uWLbLZbJo+fbqWLl2q1atX6+WXXzYbFMADc3Fx0ezZszVv3jxlyZJFlStXVv/+\n/f+1Q7tixYoKDAyUv7+/unXrpnbt2mnChAkplBoAAAC4P09PTy1atEgFChQwHQWAA1DshN2UKVNG\n9evXV8eOHVW7dm21bdtWuXLlUkhIiAYOHKg+ffooT5486tKli+moAB5Cs2bNdO7cOYWHh6tevXra\ntGmTKlWqpFGjRt33mMqVK9/1/MCBA46OCgAAADyQKlWqKHv27KZjAHAAVmOH3WTKlEkjRoxQxYoV\ntXbtWjVu3FjdunWTi4uLdu3apSNHjqhy5cry8PAwHRXAQ/Lw8FCtWrVUq1YtDRs2TG+++aaCg4PV\nv39/u5zfZrPJsqxk25jyArCfhIQExcfHy93dXTabzXQcAACM499DIP2i2Am7cnV1VZMmTdSkSZNk\n2/Pnz6/8+fMbSgXA3kqUKKHbt2/fdx7PzZs33/W8ePHi9z2fn5+fIiMjk55fuHAh2XMAj++NN95Q\n/fr11blzZ9NRAAAAAIeh2AmHuNOh9fdPyyzL4tMzII25cuWKXnvtNXXq1EmlS5eWj4+Ptm/frvff\nf181atSQr6/vPY/bvHmzRo8erebNm2v9+vX64osvkubzvZfq1atrypQpqlKlipydnTVkyBC6wAE7\ncnZ2VkhIiKpVq6bq1aurYMGCpiMBAAAADkGxEw5xr6ImhU4g7fH29lalSpX00Ucf6ciRI4qNjVXe\nvHnVunVrDR069L7H9evXT3v27FFYWJi8vLw0YsQINW/e/L77f/DBB+rcubNefvll5cqVS++//74i\nIiIc8ZaADKtkyZIaOHCg2rdvr3Xr1snZ2dl0JAAAAMDubNb/TpIGAACAdCkhIUHVq1dXw4YN7Tbn\nLgAAAJCaUOyE3d1rCDsAAEgdjh8/rgoVKmjdunUqWbKk6TgAAACAXTmZDoD0Z9WqVfrzzz9NxwAA\nAPdQsGBBjRkzRm3atFFcXJzpOAAAAIBdUeyE3Q0ePFjHjx83HQMAANxHp06d9OSTTyokJMR0FAAA\nAMCuWKAIdufp6amYmBjTMQAAwH3YbDZ9++23pmMAAAAAdkdnJ+zOw8ODYicAAAAAAABSHMVO2J2H\nh4du3bplO
gaAdOTll1/WF198YToGAAAAACCVo9gJu6OzE4C9BQUFKSwsTAkJCaajAAAAAABSMYqd\nsDvm7ARgb9WrV1eOHDm0ZMkS01EAAAAAAKkYxU7YHcPYAdibzWZTUFCQQkNDlZiYaDoOAAAA0jjL\nsvi9EkinKHbC7hjGDsAR6tSpI09PTy1fvtx0FOCRdejQQTab7a7Hrl27TEcDACBDWbFihbZt22Y6\nBgAHoNgJu2MYOwBHsNlsGjZsmEaOHCnLskzHAR5ZzZo1FRkZmexRsmRJY3ni4uKMXRsAABPi4+PV\nq1cvxcfHm44CwAEodsLu6OwE4CivvPKKbDabwsPDTUcBHpm7u7ty586d7OHi4qIVK1bohRdeUJYs\nWZQtWzbVq1dPv//+e7JjN23apDJlysjDw0PlypXTd999J5vNpg0bNkj664+3Tp06qWDBgvL09JS/\nv7/Gjx+f7AOCNm3aqEmTJho1apTy5s2rp556SpI0Z84cBQQEyMfHR7ly5VLLli0VGRmZdFxcXJx6\n9uypPHnyyN3dXfnz51dgYGAK/MQAALCvuXPn6umnn9YLL7xgOgoAB3AxHQDpD3N2AnAUm82moUOH\nauTIkWrYsKFsNpvpSIDdREVF6d1331XJkiUVHR2tESNGqFGjRtq3b59cXV11/fp1NWzYUPXr19f8\n+fN1+vRp9e3bN9k5EhIS9OSTT2rx4sXy8/PT5s2b1bVrV/n5+al9+/ZJ+61du1a+vr764Ycfkgqh\n8fHxGjlypIoWLapLly7pvffeU+vWrbVu3TpJ0sSJExUeHq7FixfrySef1JkzZ3T48OGU+wEBAGAH\n8fHxCg0N1Zw5c0xHAeAgNouxgLCzcePG6cKFCxo/frzpKADSocTERJUuXVrjx49X3bp1TccBHkqH\nDh00b948eXh4JG178cUXtXLlyrv2vX79urJkyaJNmzapUqVKmjJlioYPH64zZ84kHf/FF1+offv2\n+uWXX+7bndK/f3/t27dPq1atkvRXZ+eaNWt06tQpubm53Tfrvn37VKpUKUVGRip37tzq0aOHjhw5\notWrV/NBAwAgzZo5c6bmz5+vNWvWmI4CwEEYxg67Y85OAI7k5OSkoUOHasSIEczdiTTppZde0q5d\nu5Ie06dPlyQdPnxYr7/+up5++mn5+vrqiSeekGVZOnXqlCTp4MGDKl26dLJCacWKFe86/5QpUxQQ\nECA/Pz95e3tr0qRJSee4o1SpUncVOrdv365GjRrpqaeeko+PT9K57xzbsWNHbd++XUWLFlWvXr20\ncuVKVrEFAKQp8fHxCgsL0/Dhw01HAeBAFDthdwxjB+Bor732mq5evaqff/7ZdBTgoWXKlEmFCxdO\neuTNm1eS1KBBA129elXTpk3Tli1b9Ntvv8nJyemhFhD68ssv1b9/f3Xq1EmrV6/Wrl271K1bt7vO\n4eXllez5jRs3VKdOHfn4+GjevHnatm2bVqxYIen/FzAqX768Tpw4odDQUMXHx6tNmzaqV68eHzoA\nANKMefPmqUCBAnrxxRdNRwHgQMzZCbtjgSIAjubs7Kwff/xRefLkMR0FsIsLFy7o8OHDmjFjRtIf\nYFu3bk3WOVmsWDEtXLhQsbGxcnd3T9rn7zZs2KAqVaqoR48eSduOHDnyr9c/cOCArl69qjFjxih/\n/vySpD179ty1n6+vr1q0aKEWLVqobdu2euGFF3T8+HE9/fTTD/+mAQBIYR07dlTHjh1NxwDgYHR2\nwu4Yxg4gJeTJk4d5A5Fu5MiRQ9myZdPUqVN15MgRrV+/Xm+//bacnP7/V7W2bdsqMTFRXbt2VURE\nhP7zn/9ozJgxkpT034K/v7+2b9+u1atX6/DhwwoODtbGjRv/9foFChSQm5ubJk2apOPHj+u77767\na4jf+PHjtXDhQh08eFCHDx/WggULlDlzZj3xxBN2/EkAAAAAj4diJ+yOzk4
AKYFCJ9ITZ2dnLVq0\nSDt27FDJkiXVq1cvjR49Wq6urkn7+Pr6Kjw8XLt371aZMmU0cOBAhYSESFLSPJ49evRQ06ZN1bJl\nS1WoUEFnz569a8X2e8mVK5dmz56tpUuXqnjx4goNDdWECROS7ePt7a2xY8cqICBAAQEBSYse/X0O\nUQAAAMA0VmOH3a1du1ZhYWH68ccfTUcBkMElJiYm64wD0puvvvpKLVq00OXLl5U1a1bTcQAAAADj\nmLMTdkdnJwDTEhMTFR4ergULFqhw4cJq2LDhPVetBtKaWbNmqUiRIsqXL5/27t2rfv36qUmTJhQ6\nAQAAgP+i3QV2x5ydAEyJj4+XJO3atUv9+vVTQkKCfv75Z3Xu3FnXr183nA54fOfPn9cbb7yhokWL\nqlevXmrYsKHmzJljOhYAAOnS7du3ZbPZ9PXXXzv0GAD2RbETdufh4aFbt26ZjgEgA4mOjtaAAQNU\nunRpNWrUSEuXLlWVKlW0YMECrV+/Xrlz59aQIUNMxwQe2+DBg3Xy5EnFxsbqxIkTmjx5sry9vU3H\nAgAgxTVq1Eg1atS452sRERGy2Wz64YcfUjiV5OLiosjISNWrVy/Frw3gLxQ7YXcMYweQkizL0uuv\nv65NmzYpNDRUpUqVUnh4uOLj4+Xi4iInJyf16dNHP/30k+Li4kzHBQAAgB107txZ69at04kTJ+56\nbcaMGXrqqadUs2bNlA8mKXfu3HJ3dzdybQAUO+EADGMHkJJ+//13HTp0SG3btlWzZs0UFhamCRMm\naOnSpTp79qxiYmK0YsUK5ciRQ1FRUabjAgAAwA4aNGigXLlyadasWcm2x8fHa+7cuerUqZOcnJzU\nv39/+fv7y9PTUwULFtSgQYMUGxubtP/JkyfVqFEjZcuWTZkyZVLx4sW1ZMmSe17zyJEjstls2rVr\nV9K2/x22zjB2wDyKnbA7OjsBpCRvb2/dunVLL730UtK2ihUr6umnn1aHDh1UoUIFbdy4UfXq1WMR\nF8BOYmNjVapUKX3xxRemowAAMigXFxe1b99es2fPVmJiYtL28PBwXb58WR07dpQk+fr6avbs2YqI\niNDkyZM1b948jRkzJmn/7t27Ky4uTuvXr9f+/fs1YcIEZc6cOcXfDwD7odgJu2POTgApKV++fCpW\nrJg+/PDDpF90w8PDFRUVpdDQUHXt2lXt27dXhw4dJCnZL8MAHo27u7vmzZun/v3769SpU6bjAAAy\nqM6dO+vUqVNas2ZN0rYZM2aodu3ayp8/vyRp2LBhqlKligoUKKAGDRpo0KBBWrBgQdL+J0+e1Isv\nvqjSpUurYMGCqlevnmrXrp3i7wWA/biYDoD0x93dXbGxsbIsSzabzXQcABnAuHHj1KJFC9WoUUNl\ny5bVL7/8okaNGqlixYqqWLFi0n5xcXFyc3MzmBRIP5599ln169dPHTp00Jo1a+TkxGfoAICUVaRI\nEVWtWlUzZ85U7dq1de7cOa1evVoLFy5M2mfRokX6+OOPdfToUd28eVO3b99O9m9Wnz591LNnT33/\n/feqUaOGmjZtqrJly5p4OwDshN9KYXdOTk5JBU8ASAmlSpXSpEmTVLRoUe3YsUOlSpVScHCwJOnK\nlStatWqV2rRpo27duumTTz7R4cOHzQYG0okBAwYoNjZWkyZNMh0FAJBBde7cWV9//bWuXr2q2bNn\nK1u2bGrcuLEkacOGDXrjjTdUv359hYeHa+fOnRoxYkSyRSu7deumY8eOqX379jp48KAqVaqk0NDQ\ne17rTpHUsqykbfHx8Q58dwAeBcVOOARD2QGktJo1a+qzzz7Td999p5kzZypXrlyaPXu2qlatqlde\neUVnz57V1atXNXnyZLVu3dp0XCBdcHZ21pw5cxQaGqqIiAjTcQAAGVDz5s3l4eGhefPmaebMmWrX\nrp1cXV0lSRs3btRTTz2lwMBAlS9fXkW
KFLnn6u358+dXt27dtGTJEg0bNkxTp06957X8/PwkSZGR\nkUnb/r5YEYDUgWInHIJFigCYkJCQIG9vb509e1a1atVSly5dVKlSJUVEROiHH37QsmXLtGXLFsXF\nxWns2LGm4wLpQuHChRUaGqq2bdvS3QIASHGenp5q3bq1goODdfToUXXu3DnpNX9/f506dUoLFizQ\n0aNHNXnyZC1evDjZ8b169dLq1at17Ngx7dy5U6tXr1aJEiXueS0fHx8FBARozJgxOnDggDZs2KD3\n3nvPoe8PwMOj2AmH8PT0pNgJIMU5OztLkiZMmKDLly9r7dq1mj59uooUKSInJyc5OzvLx8dH5cuX\n1969ew2nBdKPrl27KmfOnPcd9gcAgCO9+eab+uOPP1SlShUVL148afurr76qd0n+/PkAACAASURB\nVN55R71791aZMmW0fv16hYSEJDs2ISFBb7/9tkqUKKE6deoob968mjVr1n2vNXv2bN2+fVsBAQHq\n0aMH//YBqZDN+vtkE4CdFC9eXMuWLUv2Dw0ApIQzZ86oevXqat++vQIDA5NWX78zx9LNmzdVrFgx\nDR06VN27dzcZFUhXIiMjVaZMGYWHh6tChQqm4wAAACCDorMTDsGcnQBMiY6OVkxMjN544w1JfxU5\nnZycFBMTo6+++krVqlVTjhw59OqrrxpOCqQvefLk0aRJk9SuXTtFR0ebjgMAAIAMimInHII5OwGY\n4u/vr2zZsmnUqFE6efKk4uLiNH/+fPXp00fjxo1T3rx5NXnyZOXKlct0VCDdadGihcqVK6dBgwaZ\njgIAAIAMysV0AKRPzNkJwKRPP/1U7733nsqWLav4+HgVKVJEvr6+qlOnjjp27KgCBQqYjgikW1Om\nTFHp0qXVqFEj1axZ03QcAAAAZDAUO+EQDGMHYFLlypW1cuVKrV69Wu7u7pKkMmXKKF++fIaTAelf\n1qxZNWPGDHXq1El79uxRlixZTEcCAABABkKxEw7BMHYApnl7e6tZs2amYwAZUu3atdWoUSP16tVL\nc+fONR0HAAAAGQhzdsIhGMYOAEDGNnbsWG3ZskVLly41HQUAkE4lJCSoWLFiWrt2rekoAFIRip1w\nCDo7AaRGlmWZjgBkGF5eXvriiy/Us2dPRUZGmo4DAEiHFi1apBw5cqh69eqmowBIRSh2wiGYsxNA\nahMbG6sffvjBdAwgQ6lUqZK6dOmiLl268GEDAMCuEhISNGLECAUHB8tms5mOAyAVodgJh6CzE0Bq\nc/r0abVp00bXr183HQXIUIKCgnTu3DlNnz7ddBQAQDpyp6uzRo0apqMASGUodsIhmLMTQGpTuHBh\n1a1bV5MnTzYdBchQ3NzcNHfuXA0ZMkTHjh0zHQcAkA7c6eocPnw4XZ0A7kKxEw7BMHYAqVFgYKA+\n/PBD3bx503QUIEN55plnNHjwYLVv314JCQmm4wAA0rjFixcre/bsqlmzpukoAFIhip1wCIaxA0iN\nihUrpmrVqunTTz81HQXIcPr27StnZ2d98MEHpqMAANIw5uoE8G8odsIhGMYOILUaOnSoJkyYoOjo\naNNRgAzFyclJs2fP1rhx47Rnzx7TcQAAadTixYuVLVs2ujoB3BfFTjgEnZ0AUqtSpUqpcuXKmjp1\nqukoQIZToEABvf/++2rbtq1iY2NNxwEApDEJCQkaOXIkc3UC+EcUO+EQzNkJIDUbOnSoxo0bx4cy\ngAEdOnRQgQIFFBwcbDoKACCNWbJkibJkyaJatWqZjgIgFaPYCYegsxNAalauXDmVLVtWM2fONB0F\nyHBsNpumTZum2bNna+PGjabjAADSCObqBPCgKHbCIZizE0BqFxQUpDFjxiguLs50FCDDyZkzpz79\n9FO1b99eN2/eNB0HAJAGLFmyRJkzZ6arE8C/otgJh2AYO4DUrmLFiipevLjmzJljOgqQITVp0kQv\nvvi
cOHw9XVFXv27EFqaiquXLkCf39//Prr\nr2X11ok+SCKRICIiAqmpqYiOjkZERAQaN24MABg+fDi0tbUxcuRIxMbG4tSpU/D29sbAgQNhYWEh\nP8fw4cMRFxeHefPmwcHBocQXApaWljh27BjOnDmDhIQEjB8/XpDR/ETvw7KTiIhIgXBkp3JITEzE\n1KlTER4e/s4mBETKxNjYGKGhoQgPD0dAQAC+/vprXL58udyuv3btWuzevRvHjx8vt2sqsrS0NBgb\nGyMvLw/Am92XpVIpevfuLV/r0szMDHXq1CnxfH5+PmQyGZ4+fSpIbnNzc5w6dQrJycno0aMHWrdu\njZ07d2L37t3o3bv3B487c+YMCgsLMXToUNStW1d+k0gk7319nz598Ntvv+Hw4cNo0aIFOnXqhBMn\nTkBN7c3H6uDgYLi5uWHGjBmwtrZGv379cOrUKdSvX79M3jfRx0ilUkyYMAGNGzfGN998g9q1a+OX\nX34B8Gaq/JEjR/DixQu0bt0a/fv3R7t27d5ZcqF+/fpo3749YmJiSkxhBwAfHx+0bt0avXv3RseO\nHaGjo4Phw4eX2/sj+hiRrDy/XiUiIqIvUlRUBF1dXTx79qxC7XBLny4/P1++3p23t7fQcYjKjVQq\nRUhICObOnYtevXrhxx9/lJdmZenw4cP4/vvvcf369RLrN9K7EhIS4OjoCAMDAzRo0AA7d+6Erq4u\ntLW10aNHD0ydOhVisfid49atW4fNmzdj7969JXZ8JiIiEgJHdhIRESmQSpUqoX79+khNTRU6Cn2m\nqVOnwtraGl5eXkJHISpXampqcHd3R2JiIgwMDPDVV19h6dKl8unRZaV3797o06cPJk6cWKbXUQbW\n1tb47bff5CMSg4KCkJCQgB9++AFJSUmYOnUqACAvLw8bNmzApk2b0L59e/zwww/w9PRE/fr1y3Wp\nAiIiovdh2UlERKRgOJVdce3evRtHjhzBxo0b5budEqmaqlWrYunSpTh//jxOnz4NGxsb/P7772Va\nki1btgxnz57l2omfwNzcHHFxcfj6668xdOhQVK9eHcOHD0fv3r2RkZGBrKwsaGtr486dOwgICECH\nDh2QnJyMsWPHQk1NjT/biIhIcCw7iYiIFIxYLOZulwooNTUV48aNQ3h4OKfSEuHNz7I//vgDa9as\nwcyZM9GrVy/ExcWVybV0dXWxdetWjB07Fg8fPiyTayiigoKCd0pmmUyGqKgotGvXrsTjly5dgqmp\nKfT09AAAM2fOxM2bN/Hjjz9y7WEiIqpQWHYSEREpGI7sVDwFBQVwcnLCnDlz0LJlS6HjEFUovXr1\nwvXr19GnTx906tQJEomkTDa6sbe3h7u7O0aPHq3SU61lMhkiIiLQpUsXTJky5Z3nRSIRXF1dsX79\neqxatQq3bt2Cj48PYmNjMXz4cPl60W9LTyIiooqGZScRqaTCwkLk5+cLHYPos1haWrLsVDCzZ8/+\n6A6/RKpOQ0MDEokEcXFxeP36NaytrbF+/XoUFxeX6nV8fX1x+/ZtBAcHl+p5FUFRURFCQ0PRvHlz\nzJgxA56enli5cuV7p517e3vD3Nwc69atwzfffIMjR45g1apVcHJyEiA5ERHRf8Pd2IlIJZ06dQoJ\nCQncIIQUUkZGBr7++mvcvXtX6Cj0CQ4cOICxY8fi2rVr0NfXFzoOkUKIjo6GRCLBs2fPEBgYiM6d\nO5fauWNjY9G1a1dcunRJJXYOz83NRVBQEJYvX44GDRrIlwz4lLU1ExMToa6uDgsLi3JISkQVXWxs\nLHr16oW0tDRoamoKHYfogziyk4hU0vXr1xETEyN0DKLPYmJiguzsbOTl5Qkdhf7F3bt34enpibCw\nMBadRP9B8+bNcfLkSfj4+MDV1RVDhgxBenp6qZy7SZMmmDFjBkaNGlXqI0crkuzsbCxcuBBmZmY4\nceIEwsPDcfLkSfTu3
fuTNxGysrJi0UlEck2aNIGVlRX27NkjdBSij2LZSUQq6enTp6hevbrQMYg+\ni5qaGszNzZGSkiJ0FPqIoqIiDBs2DBKJBO3btxc6DpHCEYlEGDJkCOLj49G0aVPY2dlh3rx5yM3N\n/eJzv12rMiAg4IvPVdFkZGRg4sSJEIvFuHv3Lk6fPo1ff/0Vbdq0EToaESkBiUSCgIAAlV77mCo+\nlp1EpJKePn2KGjVqCB2D6LNxk6KKz9fXF1paWpg5c6bQUYgUmpaWFubNm4fo6GjcunUL1tbWCAsL\n+6IP2urq6ggJCcFPP/2EGzdulGJa4Vy/fh0jRoyAra0ttLS0cOPGDWzatAlWVlZCRyMiJdKvXz9k\nZ2fjwoULQkch+iCWnUSkklh2kqJj2VmxpaamIjg4GNu2bYOaGn/dIioNJiYmCAsLw44dO7B8+XK0\nb98eV65c+ezzmZub48cff4SLiwsKCgpKMWn5kclkiIyMRJ8+fdCrVy80adIEqamp8PPzQ7169YSO\nR0RKSF1dHRMmTEBgYKDQUYg+iL99E5FKYtlJik4sFiMpKUnoGPQBZmZmSEhIQO3atYWOQqR02rdv\nj0uXLsHd3R0ODg5wd3fHgwcPPutcHh4eMDY2xsKFC0s5ZdkqLi7Gr7/+irZt28LLywsDBw5EWloa\nZs6ciWrVqgkdj4iUnJubG/78809ulkkVFstOIlJJ+/btw8CBA4WOQfTZLC0tObKzAhOJRNDT0xM6\nBpHSUldXh4eHBxISEqCvr4+vvvoKy5Ytw+vXr//TeUQiETZt2oQtW7bg/PnzZZS29Lx+/RqbN29G\n48aN4efnh5kzZyIuLg6enp6oXLmy0PGISEVUq1YNI0aMwNq1a4WOQvReIhlXlSUiIlI49+7dg52d\n3WePZiIiUiZJSUmYMmUKEhMTsWLFCvTr1++TdxwHgL1792LWrFmIjo6Gjo5OGSb9PM+fP8f69esR\nGBiI5s2bY+bMmejYseN/eo9ERKUpOTkZ9vb2yMjIgLa2ttBxiEpg2UlERKSAZDIZdHV1kZmZiapV\nqwodh4ioQjh8+DAmT56MBg0aYOXKlWjUqNEnHzty5Ejo6upi3bp1ZZjwv8nMzERAQAA2b96M3r17\nY8aMGWjatKnQsYiIAAAODg749ttvMXr0aKGjEJXAaexEREQKSCQSwcLCAikpKUJHUTnx8fHYs2cP\nTp06hczMTKHjENHf9O7dG7GxsejZsyc6duyISZMm4enTp5907KpVq3DgwAEcOXKkjFP+u8TERIwe\nPRo2NjZ49eoVrl69iu3bt7PoJKIKRSKRIDAwEBxDRxUNy04iIiIFxR3Zy99vv/2GoUOHYuzYsRgy\nZAh++eWXEs/zl30i4WloaGDy5Mm4efMm8vPzYW1tjQ0bNqC4uPijx1WvXh3BwcHw8PDAkydPyilt\nSRcvXsTAgQPRoUMHGBsbIykpCYGBgWjQoIEgeYiIPqZbt24AgGPHjgmchKgklp1EpLREIhH27NlT\n6uf19/cv8aHD19cXX331Valfh+jfsOwsX48ePYKbmxs8PT2RnJyM6dOnY+PGjXjx4gVkMhlevXrF\n9fOIKhBDQ0Ns2LABERERCA0NhZ2dHSIjIz96TLdu3TBo0CCMGzeunFK++ZLk8OHD6Ny5MxwdHdGl\nSxekpaVhwYIFqFWrVrnlICL6r0QikXx0J1FFwrKTiCoMV1dXiEQieHh4vPPczJkzIRKJ0K9fPwGS\nfdy0adP+9cMTUVkQi8VISkoSOobKWLp0Kbp06QKJRIJq1arBw8MDhoaGcHNzQ9u2bTFmzBhcvXpV\n6JhE9A8tWrRAZGQk5syZg5EjR2Lo0KHIyMj44Ot//PFHXLt2DTt37izTXIWFhdi+fTuaNWuGWbNm\nYfTo0UhOTsaECRMq5CZJRETvM3z4cFy4cIFLK1GFwrKTiCoUExMT7Nq1C7m5ufLHioq
KsHXrVpia\nmgqY7MN0dXWhr68vdAxSQRzZWb60tLSQn58vX//Px8cH6enp6NSpE3r16oWUlBRs3rwZBQUFAicl\non8SiUQYOnQo4uPj8dVXX8HW1hbz588v8fvGW9ra2ti2bRskEgnu3btX6llyc3OxatUqiMVibNmy\nBUuXLkV0dDSGDx8ODQ2NUr8eEVFZ0tbWhqenJ1avXi10FCI5lp1EVKE0bdoUYrEYu3btkj928OBB\nVKlSBZ07dy7x2uDgYDRu3BhVqlSBpaUlVq5cCalUWuI1T548wZAhQ6CjowNzc3Ns3769xPOzZs2C\nlZUVtLS00KBBA8yYMQOvXr0q8ZqlS5eiTp060NXVxciRI/Hy5csSz/9zGvvly5fRo0cP1KpVC1Wr\nVkX79u1x/vz5L/lrIXovS0tLlp3lyNDQEOfOncOUKVPg4eGBDRs24MCBA5g4cSIWLlyIQYMGITQ0\nlJsWEVVg2tramD9/Pq5du4bk5GRYW1tjx44d76y326pVK0ybNg0PHz4stbV4Hz9+DF9fX5iZmSEy\nMhK7du3CiRMn0KtXLy6BQUQKbdy4cdi2bRueP38udBQiACw7iagC8vDwQFBQkPx+UFAQ3NzcSnwQ\n2LRpE+bMmYNFixYhPj4ey5cvh5+fH9atW1fiXIsWLUL//v0RExMDR0dHuLu74/bt2/LndXR0EBQU\nhPj4eKxbtw47d+7EkiVL5M/v2rULPj4+WLhwIaKiomBlZYUVK1Z8NH9OTg5cXFxw+vRpXLp0Cc2b\nN0efPn2QnZ39pX81RCUYGhqioKDgk3capi8zYcIEzJs3D3l5eRCLxWjWrBlMTU3lm57Y29tDLBYj\nPz9f4KRE9G9MTU2xY8cOhIWFYdmyZejQocM7y1BMmzYNTZo0+eIiMj09HRMnToSlpSXu37+P06dP\nY+/evWjduvUXnZeIqKIwNjZGjx49EBwcLHQUIgCASMZtQ4mognB1dcXjx4+xbds21KtXD9evX4ee\nnh7q16+P5ORkzJ8/H48fP8aBAwdgamqKJUuWwMXFRX58QEAANm7ciLi4OABvpqzNmjULP/74I4A3\n0+GrVq2KjRs3YsSIEe/NsH79evj7+8vXnLG3t4eNjQ02bdokf0337t2RkpKC9PR0AG9Gdu7Zswc3\nbtx47zllMhnq1auHZcuWffC6RJ/Lzs4OP//8Mz80l5HCwkK8ePGixFIVMpkMaWlpGDBgAA4fPgwj\nIyPIZDI4OTnh2bNnOHLkiICJiei/Ki4uRnBwMHx8fNCvXz/873//g6Gh4RefNyYmBkuXLkVERARG\njx4NiUSCunXrlkJiIqKK5/z58xgxYgSSkpKgrq4udBxScRzZSUQVTo0aNfDdd98hKCgIv/zyCzp3\n7lxivc6srCzcuXMH3t7e0NXVld9mzZqFW7dulThX06ZN5X+uVKkSDAwM8OjRI/lje/bsQfv27eXT\n1CdPnlxi5Gd8fDzatWtX4pz/vP9Pjx49gre3NywtLVGtWjXo6enh0aNHJc5LVFq4bmfZCQ4OhrOz\nM8zMzODt7S0fsSkSiWBqaoqqVavCzs4Oo0ePRr9+/XD58mWEh4cLnJqI/it1dXV4enoiMTER1atX\nx++//46ioqLPOpdMJsO1a9fQu3dv9OnTB82aNUNqaip++uknFp1EpNTatm0LfX19HDhwQOgoRKgk\ndAAiovdxd3fHqFGjoKuri0WLFpV47u26nOvXr4e9vf1Hz/PPhf5FIpH8+AsXLsDJyQkLFizAypUr\n5R9wpk2b9kXZR40ahYcPH2LlypVo0KABKleujG7dunHTEioTLDvLxtGjRzFt2jSMHTsW3bt3x5gx\nY9C0aVOMGzcOwJsvTw4dOgRfX19ERkaiV69eWLJkCapXry5wciL6XNWqVYO/vz+kUinU1D5vTIhU\nKsWTJ08wePBg7Nu3D5UrVy7llEREFZNIJMKkSZM
QGBiI/v37Cx2HVBzLTiKqkLp16wZNTU08fvwY\nAwYMKPFc7dq1Ua9ePdy6dQsjR4787GucPXsWRkZGmDdvnvyxjIyMEq9p1KgRLly4AHd3d/ljFy5c\n+Oh5z5w5g1WrVqFv374AgIcPH3LDEiozYrGY06ZLWX5+Pjw8PODj44PJkycDeLPmXm5uLhYtWoRa\ntWpBLBbjm2++wYoVK/Dq1StUqVJF4NREVFo+t+gE3owS7dq1KzccIiKVNHjwYEyfPh3Xr18vMcOO\nqLyx7CSiCkkkEuH69euQyWTvHRWxcOFCTJgwAdWrV0efPn1QWFiIqKgo3Lt3D7Nnz/6ka1haWuLe\nvXsIDQ1Fu3btcOTIEezYsaPEayQSCUaOHIlWrVqhc+fO2LNnDy5evIiaNWt+9Lzbt29HmzZtkJub\nixkzZkBTU/O//QUQfSKxWIzVq1cLHUOprF+/Hra2tiW+5Pjrr7/w7NkzmJiY4N69e6hVqxaMjY3R\nqFEjjtwiohJYdBKRqtLU1MSYMWOwatUqbN68Weg4pMK4ZicRVVh6enqoWrXqe5/z9PREUFAQtm3b\nhmbNmqFDhw7YuHEjzMzMPvn8Dg4OmD59OiZNmoSmTZvir7/+emfKvKOjI3x9fTF37ly0aNECsbGx\nmDJlykfPGxQUhJcvX8LOzg5OTk5wd3dHgwYNPjkX0X9haWmJ5ORkcL/B0tOuXTs4OTlBR0cHAPDT\nTz8hNTUV+/btw4kTJ3DhwgXEx8dj27ZtAFhsEBEREb3l7e2NvXv3IisrS+gopMK4GzsREZGCq1mz\nJhITE2FgYCB0FKVRWFgIDQ0NFBYW4sCBAzA1NYWdnZ18LT9HR0c0a9YMc+bMEToqERERUYXi4eEB\nc3NzzJ07V+gopKI4spOIiEjBcZOi0vHixQv5nytVerPSj4aGBvr37w87OzsAb9byy8nJQWpqKmrU\nqCFITiIiIqKKTCKR4OXLl5x5RILhmp1EREQK7m3ZaW9vL3QUhTV58mRoa2vDy8sL9evXh0gkgkwm\ng0gkKrFZiVQqxZQpU1BUVIQxY8YImJiIiIioYmratCmaNGkidAxSYSw7iYiIFBxHdn6ZLVu2IDAw\nENra2khJScGUKVNgZ2cnH935VkxMDFauXIkTJ07g9OnTAqUlIiIiqvi4pjkJidPYiYiIFBzLzs/3\n5MkT7NmzBz/99BP279+PS5cuwcPDA3v37sWzZ89KvNbMzAytW7dGcHAwTE1NBUpMREREREQfw7KT\niIhIwYnFYiQlJQkdQyGpqamhR48esLGxQbdu3RAfByECrAAAIABJREFUHw+xWAxvb2+sWLECqamp\nAICcnBzs2bMHbm5u6Nq1q8CpiYiIiIjoQ7gbOxGplIsXL2L8+PG4fPmy0FGISs2zZ89gYmKCFy9e\ncMrQZ8jPz4eWllaJx1auXIl58+ahe/fumDp1KtasWYP09HRcvHhRoJREREREyiE3Nxfnz59HjRo1\nYG1tDR0dHaEjkZJh2UlEKuXtjzwWQqRsDA0NERMTg7p16wodRaEVFxdDXV0dAHD16lW4uLjg3r17\nyMvLQ2xsLKytrQVOSETlTSqVltiojIiIPl92djacnJyQlZWFhw8fom/fvti8ebPQsUjJ8P/aRKRS\nRCIRi05SSly3s3Soq6tDJpNBKpXCzs4Ov/zyC3JycrB161YWnUQq6tdff0ViYqLQMYiIFJJUKsWB\nAwfw7bffYvHixfjrr79w7949LF26FOHh4Th9+jRCQkKEjklKhmUnERGREmDZWXpEIhHU1NTw5MkT\nDB8+HH379sWwYcOEjkVEApDJZJg7dy6ys7OFjkJEpJBcXV0xdepU2NnZ4dSpU5g/fz569OiBHj16\noGPHjvDy8sLq1auFjklKhmUnERGREmDZWfpkMhmcnZ3xxx9/CB2FiARy5swZqKuro127dkJHISJS\nOImJibh48SJ
Gjx6NBQsW4MiRIxgzZgx27dolf02dOnVQuXJlZGVlCZiUlA3LTiIiIiXAsvPzFBcX\nQyaT4X1LmOvr62PBggUCpCKiimLLli3w8PDgEjhERJ+hoKAAUqkUTk5OAN7Mnhk2bBiys7MhkUiw\nZMkSLFu2DDY2NjAwMHjv72NEn4NlJxERkRIQi8VISkoSOobC+d///gc3N7cPPs+Cg0h1PX/+HPv2\n7YOLi4vQUYiIFFKTJk0gk8lw4MAB+WOnTp2CWCyGoaEhDh48iHr16mHUqFEA+HsXlR7uxk5ERKQE\ncnJyULt2bbx8+ZK7Bn+iyMhIODo6IioqCvXq1RM6DhFVMBs2bMBff/2FPXv2CB2FiEhhbdq0CWvW\nrEG3bt3QsmVLhIWFoU6dOti8eTPu3buHqlWrQk9PT+iYpGQqCR2AiIiIvpyenh6qV6+Oe/fuwcTE\nROg4FV5WVhZGjBiB4OBgFp1E9F5btmzBwoULhY5BRKTQRo8ejZycHGzfvh379++Hvr4+fH19AQBG\nRkYA3vxeZmBgIGBKUjYc2UlESqu4uBjq6ury+zKZjFMjSKl16tQJCxYsQNeuXYWOUqFJpVL069cP\nTZo0gZ+fn9BxiIiIiJTew4cP8fz5c1haWgJ4s1TI/v37sXbtWlSuXBkGBgYYOHAgvv32W470pC/G\neW5EpLT+XnQCb9aAycrKwp07d5CTkyNQKqKyw02KPs2KFSvw9OlTLF68WOgoRERERCrB0NAQlpaW\nKCgowOLFiyEWi+Hq6oqsrCwMGjQIZmZmCA4Ohqenp9BRSQlwGjsRKaVXr15h4sSJWLt2LTQ0NFBQ\nUIDNmzcjIiICBQUFMDIywoQJE9C8eXOhoxKVGpad/+7ChQtYunQpLl26BA0NDaHjEBEREakEkUgE\nqVSKRYsWITg4GO3bt0f16tWRnZ2N06dPY8+ePUhKSkL79u0RERGBXr16CR2ZFBhHdhKRUnr48CE2\nb94sLzrXrFmDSZMmQUdHB2KxGBcuXED37t2RkZEhdFSiUsOy8+OePn2KYcOGYcOGDWjQoIHQcYiI\niIhUypUrV7B8+XJMmzYNGzZsQFBQENatW4eMjAz4+/vD0tISTk5OWLFihdBRScFxZCcRKaUnT56g\nWrVqAIC0tDRs2rQJAQEBGDt2LIA3Iz/79+8PPz8/rFu3TsioRKWGZeeHyWQyeHp6wsHBAd99953Q\ncYiIiIhUzsWLF9G1a1dIJBKoqb0Ze2dkZISuXbsiLi4OANCrVy+oqanh1atXqFKlipBxSYFxZCcR\nKaVHjx6hRo0aAICioiJoampi5MiRkEqlKC4uRpUqVTBkyBDExMQInJSo9DRs2BCpqakoLi4WOkqF\ns27dOqSlpWHZsmVCRyGiCszX1xdfffWV0DGIiJSSvr4+4uPjUVRUJH8sKSkJW7duhY2NDQCgbdu2\n8PX1ZdFJX4RlJxEppefPnyM9PR2BgYFYsmQJZDIZXr9+DTU1NfnGRTk5OSyFSKloa2vDwMAAt2/f\nFjpKhRIdHQ1fX1+Eh4ejcuXKQschos/k6uoKkUgkv9WqVQv9+vVDQkKC0NHKxcmTJyESifD48WOh\noxARfRZnZ2eoq6tj1qxZCAoKQlBQEHx8fCAWizFw4EAAQM2aNVG9enWBk5KiY9lJREqpVq1aaN68\nOf744w/Ex8fDysoKmZmZ8udzcnIQHx8PS0tLAVMSlT5LS0tOZf+bnJwcDB06FKtWrYJYLBY6DhF9\noe7duyMzMxOZmZn4888/kZ+frxBLUxQUFAgdgYioQggJCcH9+/excOFCBAQE4PHjx5g1axbMzMyE\njkZKhGUnESmlzp0746+//sK6deuwYcMGTJ8+HbVr15Y/n5ycjJcvX3KXP1I6XLfz/8hkMnz//ffo\n2LEjhg0bJnQcIioFlStXRp06dVCnTh3Y2tpi8uTJSEhIQH5+PtLT0yESiXDly
pUSx4hEIuzZs0d+\n//79+xg+fDj09fWhra2N5s2b48SJEyWO2blzJxo2bAg9PT0MGDCgxGjKy5cvo0ePHqhVqxaqVq2K\n9u3b4/z58+9cc+3atRg4cCB0dHQwZ84cAEBcXBz69u0LPT09GBoaYtiwYXjw4IH8uNjYWHTr1g1V\nq1aFrq4umjVrhhMnTiA9PR1dunQBABgYGEAkEsHV1bVU/k6JiMrT119/je3bt+Ps2bMIDQ3F8ePH\n0adPH6FjkZLhBkVEpJSOHTuGnJwc+XSIt2QyGUQiEWxtbREWFiZQOqKyw7Lz/wQHByM6OhqXL18W\nOgoRlYGcnByEh4ejSZMm0NLS+qRjcnNz0alTJxgaGmLfvn2oV6/eO+t3p6enIzw8HL/99htyc3Ph\n5OSEuXPnYsOGDfLruri4IDAwECKRCGvWrEGfPn2QkpICfX19+XkWLlyI//3vf/D394dIJEJmZiY6\nduwIDw8P+Pv7o7CwEHPnzkX//v1x/vx5qKmpwdnZGc2aNcOlS5dQqVIlxMbGokqVKjAxMcHevXsx\naNAg3Lx5EzVr1vzk90xEVNFUqlQJxsbGMDY2FjoKKSmWnUSklH799Vds2LABvXv3xtChQ+Hg4ICa\nNWtCJBIBeFN6ApDfJ1IWYrEYx48fFzqG4OLi4jBz5kycPHkS2traQscholISEREBXV1dAG+KSxMT\nExw6dOiTjw8LC8ODBw9w/vx51KpVC8Cbzd3+rqioCCEhIahWrRoAwMvLC8HBwfLnu3btWuL1q1ev\nxt69e3H48GGMGDFC/rijoyM8PT3l9+fPn49mzZrBz89P/tjWrVtRs2ZNXLlyBa1bt0ZGRgamTZsG\na2trAICFhYX8tTVr1gQAGBoayrMTESmDtwNSiEoLp7ETkVKKi4tDz549oa2tDR8fH7i6uiIsLAz3\n798HAPnmBkTKhiM7gby8PAwdOhR+fn7ynT2JSDl07NgR0dHRiI6OxqVLl9CtWzf06NEDd+7c+aTj\nr127hqZNm360LKxfv7686ASAevXq4dGjR/L7jx49gre3NywtLVGtWjXo6enh0aNH72wO17JlyxL3\nr169ilOnTkFXV1d+MzExAQDcunULADBlyhR4enqia9euWLJkicpsvkREqksmk33yz3CiT8Wyk4iU\n0sOHD+Hu7o5t27ZhyZIleP36NWbMmAFXV1fs3r0bWVlZQkckKhPm5ubIyMhAYWGh0FEEI5FI0KxZ\nM7i5uQkdhYhKmba2NiwsLGBhYYFWrVph8+bNePHiBTZu3Ag1tTcfbd7O3gDwWT8LNTQ0StwXiUSQ\nSqXy+6NGjcLly5excuVKnDt3DtHR0TA2Nn5nEyIdHZ0S96VSKfr27Ssva9/ekpOT0a9fPwCAr68v\n4uLiMGDAAJw7dw5NmzZFUFDQf34PRESKQiqVonPnzrh48aLQUUiJsOwkIqWUk5ODKlWqoEqVKhg5\nciQOHz6MgIAA+YL+Dg4OCAkJ4e6opHQqV66MevXqIT09XegogtixYwciIyOxfv16jt4mUgEikQhq\namrIy8uDgYEBACAzM1P+fHR0dInXt2jRAtevXy+x4dB/debMGUyYMAF9+/aFjY0N9PT0SlzzQ2xt\nbXHz5k3Ur19fXti+venp6clfJxaLMXHiRBw8eBAeHh7YvHkzAEBTUxMAUFxc/NnZiYgqGnV1dYwf\nPx6BgYFCRyElwrKTiJRSbm6u/ENPUVER1NTUMHjwYBw5cgQREREwMjKCu7u7fFo7kTKxtLRUyans\nycnJmDhxIsLDw0sUB0SkPF6/fo0HDx7gwYMHiI+Px4QJE/Dy5Us4ODhAS0sLbdu2hZ+fH27evIlz\n585h2rRpJY53dnaGoaEh+vfvj9OnTyM1NRW///77O7uxf4ylpSW2b9+OuLg4XL58GU5OTvIi8mPG\njRuH58+fw9HRERcvXkRqaiqOHj0KLy8v5
OTkID8/H+PGjcPJkyeRnp6Oixcv4syZM2jcuDGAN9Pr\nRSIRDh48iKysLLx8+fK//eUREVVQHh4eiIiIwL1794SOQkqCZScRKaW8vDz5eluVKr3Zi00qlUIm\nk6FDhw7Yu3cvYmJiuAMgKSVVXLfz9evXcHR0xIIFC9CiRQuh4xBRGTl69Cjq1q2LunXrok2bNrh8\n+TJ2796Nzp07A4B8ynerVq3g7e2NxYsXlzheR0cHkZGRMDY2hoODA7766issWLDgP40EDwoKwsuX\nL2FnZwcnJye4u7ujQYMG/3pcvXr1cPbsWaipqaFXr16wsbHBuHHjULlyZVSuXBnq6up4+vQpXF1d\nYWVlhe+++w7t2rXDihUrAABGRkZYuHAh5s6di9q1a2P8+PGfnJmIqCKrVq0ahg8fjnXr1gkdhZSE\nSPb3RW2IiJTEkydPUL16dfn6XX8nk8kgk8ne+xyRMggMDERycjLWrFkjdJRyM3HiRNy9exd79+7l\n9HUiIiIiBZOUlIT27dsjIyMDWlpaQschBcdP+kSklGrWrPnBMvPt+l5EykrVRnbu27cPf/zxB7Zs\n2cKik4iIiEgBWVpaonXr1ggNDRU6CikBftonIpUgk8nk09iJlJ0qlZ0ZGRnw8vLCjh07UKNGDaHj\nEBEREdFnkkgkCAwM5Gc2+mIsO4lIJbx8+RLz58/nqC9SCQ0aNMD9+/fx+vVroaOUqcLCQjg5OWH6\n9Olo27at0HGIiIiI6At0794dUqn0P20aR/Q+LDuJSCU8evQIYWFhQscgKhcaGhowMTFBamqq0FHK\n1Lx581CjRg1MnTpV6ChERERE9IVEIhEmTpyIwMBAoaOQgmPZSUQq4enTp5ziSirF0tJSqaeyR0RE\nIDQ0FL/88gvX4CUiIiJSEi4uLjh37hxu3boldBRSYPx0QEQqgWUnqRplXrfz/v37cHV1xfbt22Fg\nYCB0HCJSQL169cL27duFjkFERP+gra0NDw8PrF69WugopMBYdhKRSmDZSapGWcvO4uJiDB8+HGPH\njkWnTp2EjkNECuj27du4fPkyBg0aJHQUIiJ6j3HjxmHr1q148eKF0FFIQbHsJCKVwLKTVI2ylp2L\nFy+GSCTC3LlzhY5CRAoqJCQETk5O0NLSEjoKERG9h4mJCbp3746QkBCho5CCYtlJRCqBZSepGmUs\nO0+cOIH169cjNDQU6urqQschIgUklUoRFBQEDw8PoaMQEdFHTJo0CatWrUJxcbHQUUgBsewkIpXA\nspNUjampKbKyspCfny90lFLx6NEjuLi4ICQkBHXr1hU6DhEpqGPHjqFmzZqwtbUVOgoREX1Eu3bt\nUKNGDRw6dEjoKKSAWHYSkUpg2UmqRl1dHQ0aNEBKSorQUb6YVCrFqFGj4OLigp49ewodh4gU2JYt\nWziqk4hIAYhEIkgkEgQGBgodhRQQy04iUgksO0kVKctUdn9/f7x48QKLFi0SOgoRKbDs7GxERETA\n2dlZ6ChERPQJhg4dips3byI2NlboKKRgWHYSkUpg2UmqyNLSUuHLznPnzmH58uXYsWMHNDQ0hI5D\nRAps+/bt6NevH38fICJSEJqamhg7dixWrVoldBRSMCw7iUglsOwkVaToIzufPHkCZ2dnbNy4Eaam\npkLHISIFJpPJsHnzZk5hJyJSMN7e3tizZw8eP34sdBRSICw7iUglPH36FNWrVxc6BlG5UuSyUyaT\nwcPDAwMGDED//v2FjkNECu7y5cvIy8tDp06dhI5CRET/gaGhIQYMGIBNmzYJHYUUCMtOIlIJHNlJ\nqkiRy841a9bg9u3b8PPzEzoKESmBtxsTqanx4w8RkaKRSCRYu3YtCgsLhY5CCkIkk8lkQocgIipL\nUqkUGhoaKCgogLq6utBxiMqNVCqFrq4uHj16BF1dXaHjfLKoqCj07NkT58+fh4WFhdBxiEjB5ebm\nwsTEB
LGxsTAyMhI6DhERfYbOnTvj+++/h5OTk9BRSAHwq00iUnrPnz+Hrq4ui05SOWpqamjYsCFS\nUlKEjvLJXrx4AUdHR6xevZpFJxGVit27d8Pe3p5FJxGRApNIJAgMDBQ6BikIlp1EpPQ4hZ1UmVgs\nRlJSktAxPolMJoO3tze6du3Kb+2JqNRs2bIFnp6eQscgIqIv8O233+LBgwe4ePGi0FFIAbDsJCKl\nx7KTVJmlpaXCrNu5ZcsW3LhxAwEBAUJHISIlkZCQgOTkZPTt21foKERE9AXU1dUxYcIEju6kT8Ky\nk4iUHstOUmWKsknRjRs3MGvWLISHh0NLS0voOESkJIKCgjBy5EhoaGgIHYWIiL6Qu7s7IiIicO/e\nPaGjUAXHspOIlB7LTlJlilB25ubmwtHREf7+/mjcuLHQcYhISRQWFmLr1q3w8PAQOgoREZWC6tWr\nw9nZGT///LPQUaiCY9lJREqPZSepMkUoOydOnAhbW1uMGjVK6ChEpEQOHDgAsVgMKysroaMQEVEp\nmTBhAjZu3Ij8/Hyho1AFxrKTiJQey05SZXXq1EF+fj6eP38udJT3Cg0NxZkzZ7Bu3TqIRCKh4xCR\nEtmyZQtHdRIRKRkrKyu0atUKYWFhQkehCoxlJxEpPZadpMpEIhEsLCwq5OjOpKQkTJo0CeHh4dDT\n0xM6DhEpkXv37uHcuXMYMmSI0FGIiKiUSSQSBAYGQiaTCR2FKiiWnUSk9Fh2kqoTi8VISkoSOkYJ\nr169gqOjIxYtWoTmzZsLHYeIlExISAiGDBkCHR0doaMQEVEp++abb1BUVISTJ08KHYUqKJadRKT0\nWHaSqquI63ZOmzYNDRs2xPfffy90FCJSMlKpFEFBQfD09BQ6ChERlQGRSASJRIKAgACho1AFxbKT\niJQey05SdZaWlhWq7Ny7dy8OHTqEzZs3c51OIip1kZGR0NHRQcuWLYWOQkREZcTFxQXnzp3DrVu3\nhI5CFRDLTiJSeiw7SdVVpJGdaWlpGDNmDHbu3Inq1asLHYeIlJCamhrGjx/PL1OIiJSYtrY23N3d\nsWbNGqGjUAUkknFFVyJScg0bNkRERATEYrHQUYgEkZWVBSsrKzx58kTQHAUFBejQoQOGDh2KqVOn\nCpqFiJTX2483LDuJiJTb7du30aJFC6SlpaFq1apCx6EKhCM7iUjpiUQijuwklVarVi1IpVJkZ2cL\nmmPu3LkwMDDA5MmTBc1BRMpNJBKx6CQiUgGmpqbo1q0bQkJChI5CFQzLTiJSajKZDDdu3IC+vr7Q\nUYgEIxKJBJ/KfujQIezcuRMhISFQU+OvH0RERET05SQSCVavXg2pVCp0FKpA+GmDiJSaSCRClSpV\nOMKDVJ5YLEZSUpIg17579y7c3d0RFhaGWrVqCZKBiIiIiJSPvb09qlWrhkOHDgkdhSoQlp1EREQq\nQKiRnUVFRXB2dsb48ePRoUOHcr8+ERERESkvkUgEiUSCgIAAoaNQBcKyk4iISAVYWloKUnYuWrQI\nmpqamD17drlfm4iIiIiU39ChQ3Hz5k3cuHFD6ChUQVQSOgARERGVPSFGdh4/fhybN29GVFQU1NXV\ny/XaRKS8srKysH//fhQVFUEmk6Fp06b4+uuvhY5FREQCqVy5MsaMGYNVq1Zh48aNQsehCkAkk8lk\nQocgIiKisvX06VPUr18fz58/L5c1bB8+fAhbW1uEhITgm2++KfPrEZFq2L9/P5YtW4abN29CR0cH\nRkZGKCoqgqmpKYYOHYpvv/0WOjo6QsckIqJy9vDhQ1hbWyMlJYWb0xKnsRMREamCGjVqQFNTE48e\nPSrza0mlUowcORKurq4sOomoVM2cORNt2rRBamoq7t69C39/fzg6OkIqlWLp0qXYsmWL0BGJiEgA\ntWvXxoABAziykwBwZCcREZHKaNeuHZYtW4b27duX6XV++uknHDhwACd
PnkSlSlwxh4hKR2pqKuzt\n7XH16lUYGRmVeO7u3bvYsmULFi5ciNDQUAwbNkyglEREJJTo6Gg4ODggNTUVGhoaQschAXFkJxER\nkYooj3U7z549i5UrV2LHjh0sOomoVIlEIujr62PDhg0AAJlMhuLiYgCAsbExFixYAFdXVxw9ehSF\nhYVCRiUiIgE0b94c5ubm+PXXX4WOQgJj2UlEKk8qlSIzMxNSqVToKERlSiwWIykpqczOn52dDWdn\nZ2zevBkmJiZldh0iUk1mZmYYMmQIdu7ciZ07dwLAO5ufmZubIy4ujiN6iIhUlEQiQWBgoNAxSGAs\nO4mIALRq1Qq6urpo0qQJvvvuO0yfPh0bNmzA8ePHcfv2bRahpBTKcmSnTCaDu7s7Bg0aBAcHhzK5\nBhGprrcrb40bNw7ffPMNXFxcYGNjg8DAQCQmJiIpKQnh4eEIDQ2Fs7OzwGmJiEgo/fv3R2ZmJi5d\nuiR0FBIQ1+wkIvr/Xr58iVu3biElJQXJyclISUmR37Kzs2FmZgYLCwtYWFhALBbL/2xqavrOyBKi\niigqKgpubm6IiYkp9XMHBgZi+/btOHv2LDQ1NUv9/EREz58/R05ODmQyGbKzs7Fnzx6EhYUhIyMD\nZmZmePHiBRwdHREQEMD/LxMRqbDly5cjKioKoaGhQkchgbDsJCL6BHl5eUhNTX2nBE1JScHDhw9R\nv379d0pQCwsL1K9fn1PpqMLIyclBnTp18PLlS4hEolI775UrV9C7d29cvHgR5ubmpXZeIiLgTckZ\nFBSERYsWoW7duiguLkbt2rXRrVs3fPfdd9DQ0MC1a9fQokULNGrUSOi4REQksGfPnsHMzAw3b95E\nvXr1hI5DAmDZSUT0hV69eoXU1NR3StCUlBTcv38fxsbG75SgFhYWMDMz4wg4Knd16tR5707Gn+v5\n8+ewtbXFjz/+iKFDh5bKOYmI/m7GjBk4c+YMJBIJatasiTVr1uCPP/6AnZ0ddHR04O/vj5YtWwod\nk4iIKpBx48ahRo0aWLx4sdBRSAAsO4mIylBBQQHS0tLeW4TeuXMH9erVe6cEtbCwgLm5OapUqSJ0\nfFJCHTp0wA8//IDOnTt/8blkMhmcnJxQs2ZN/Pzzz18ejojoPYyMjLBx40b07dsXAJCVlYURI0ag\nU6dOOHr0KO7evYuDBw9CLBYLnJSIiCqKxMREdOzYERkZGfxcpYIqCR2AiEiZaWpqwsrKClZWVu88\nV1hYiIyMjBIF6PHjx5GcnIyMjAzUrl37vUVow4YNoa2tLcC7IWXwdpOi0ig7N23ahISEBFy4cOHL\ngxERvUdKSgoMDQ1RtWpV+WMGBga4du0aNm7ciDlz5sDa2hoHDx7EpEmTIJPJSnWZDiIiUkxWVlaw\ns7PDrl27MHLkSKHjUDlj2UlEJBANDQ15gflPRUVFuHPnToki9PTp00hJSUFaWhr09fXfKUHFYjEa\nNmwIXV3dcn8v+fn52L17N2JiYqCn9//au/Ooquv8j+OviwYiiwqBqGCskhuagFaaW6aknhzNMbcp\nQk1Tp2XEpvFnLkfHJnMZTcxMiAIrR6k0LS1JzZLCFUkkwQ0VRdExFUSIe39/dLwT4Q568cvzcY7n\nyPf7vd/P+3s9srz4fD5vF/Xo0UPh4eGqWZMvM1VNUFCQ9u3bV+H77N69W//3f/+nzZs3y9HRsRIq\nA4CyLBaLfH195ePjo8WLFys8PFyFhYVKSEiQyWTSfffdJ0nq3bu3vvvuO40dO5avOwAAq3feeUf3\n3nsvvwirhvhuAACqoJo1a8rPz09+fn567LHHypwrLS3VsWPHrCFoVlaWfvzxR2VnZ2v//v2qU6dO\nuRD08t9/PzOmMuXn5+vHH3/UhQsXNHfuXKWmpio+Pl6enp6SpK1bt2r9+vW6ePGimjRpogcffFAB\nAQFlvungm5A7IygoSImJiRW6R0F
BgZ566inNnj1b999/fyVVBgBlmUwm1axZU/3799fzzz+vLVu2\nyMnJSb/88otmzpxZ5tri4mKCTgBAGd7e3vx8UU2xZycAGIjZbNbx48etIegf9wmtXbv2FUPQwMBA\n1atX75bHLS0tVW5urnx8fBQaGqpOnTpp+vTp1uX2kZGRys/Pl729vY4ePaqioiJNnz5dTzzxhLVu\nOzs7nT17VidOnJCXl5fq1q1bKe8Jytq9e7cGDRqkPXv23PI9nn32WVksFsXHx1deYQBwDadOnVJc\nXJxOnjypZ555RiEhIZKkzMxMderUSe+++671awoAAKjeCDsBoJqwWCzKy8u7YhCalZVlXVZ/pc7x\n7u7uN/xbUS8vL40fP14vv/yy7OzsJP22QbiTk5O8vb1lNpsVHR2t999/X9u3b5evr6+k335gnTp1\nqrZs2aK8vDyFhYUpPj7+isv8cesKCwvl7u6ugoIC67/Pzfjggw80Y8YMbdu2zSZbJgDAZefPn9ey\nZcv0zTff6MMPP7R1OQAAoIog7AQAyGKxKD8CGnabAAAeCUlEQVQ//4qzQbOysmSxWHTixInrdjIs\nKCiQp6en4uLi9NRTT131ujNnzsjT01MpKSkKDw+XJLVv316FhYVatGiRvL29NWzYMJWUlGj16tXs\nCVnJvL299f3331v3u7tRP//8szp06KDk5GTrrCoAsKW8vDxZLBZ5eXnZuhQAAFBFsLENAEAmk0ke\nHh7y8PDQww8/XO786dOn5eDgcNXXX95v8+DBgzKZTNa9On9//vI4krRy5Urdc889CgoKkiRt2bJF\nKSkp2rVrlzVEmzt3rpo3b66DBw+qWbNmlfKc+M3ljuw3E3ZevHhRAwYM0PTp0wk6AVQZ9evXt3UJ\nAACgirn59WsAgGrnesvYzWazJGnv3r1ydXWVm5tbmfO/bz6UmJioyZMn6+WXX1bdunV16dIlrVu3\nTt7e3goJCdGvv/4qSapTp468vLyUnp5+m56q+rocdt6McePGKTg4WM8999xtqgoArq2kpEQsSgMA\nANdD2AkAqDQZGRny9PS0NjuyWCwqLS2VnZ2dCgoKNH78eE2aNEmjR4/WjBkzJEmXLl3S3r171aRJ\nE0n/C07z8vLk4eGhX375xXovVI6bDTuXL1+udevW6d1336WjJQCbefzxx5WcnGzrMgAAQBXHMnYA\nQIVYLBadPXtW7u7u2rdvn3x9fVWnTh1JvwWXNWrUUFpaml588UWdPXtWCxcuVERERJnZnnl5edal\n6pdDzZycHNWoUaNCXeJxZUFBQdq0adMNXXvgwAGNGTNGa9assf67AsCddvDgQaWlpalDhw62LgUA\nAFRxhJ0AgAo5duyYunfvrqKiIh06dEh+fn5655131KlTJ7Vr104JCQmaPXu22rdvr9dff12urq6S\nftu/02KxyNXVVYWFhdbO3jVq1JAkpaWlydHRUX5+ftbrLyspKVGfPn3KdY739fXVPffcc4ffgbtP\nkyZNbmhmZ3FxsQYOHKgJEyZYG0kBgC3ExcVp8ODB122UBwAAQDd2AECFWCwWpaena+fOncrNzdX2\n7du1fft2tWnTRvPnz1erVq105swZRUREKCwsTMHBwQoKClLLli3l4OAgOzs7DR06VIcPH9ayZcvU\nsGFDSVJoaKjatGmj2bNnWwPSy0pKSrR27dpyneOPHTumRo0alQtBAwMD5efnd80mS9VJUVGR6tat\nqwsXLqhmzav/3nPcuHHKysrSypUrWb4OwGZKS0vl6+urNWvW0CANAABcF2EnAOC2yszMVFZWljZt\n2qT09HQdOHBAhw8f1rx58zRy5EjZ2dlp586dGjJkiHr27KmePXtq0aJFWr9+vTZs2KBWrVrd8FjF\nxcU6dOhQuRA0KytLR44cUYMGDcqFoIGBgQoICKh2s4V8fX2VnJysgICAK55fvXq1Ro8erZ07d8rd\
n3f0OVwcA//Pll19q8uTJSk1NtXUpAADgLkDYCQCwCbPZLDu7//XJ+/TTTzVz5kwdOHBA4eHhmjJl\nisLCwiptvJKSEuXk5FwxCD106JA8PT3LhaBBQUEKCAhQ7dq1K62OqiIzM1ONGze+4rMdPXpUYWFh\nWrFiBfvjAbC5J598Ut27d9fIkSNtXQoAALgLEHYCMKTIyEjl5+dr9erVti4Ft+D3zYvuhNLSUh05\ncqRcCJqdna0DBw7Izc2tXAh6eUaoi4vLHavzTjCbzRo8eLBCQkI0YcIEW5cDoJo7efKkmjRpopyc\nnHJbmgAAAFwJYScAm4iMjNT7778vSapZs6bq1aun5s2bq3///nruuecq3GSmMsLOy812tm7dWqkz\nDHF3MZvNOnbsWLkQNDs7W/v375eLi0u5EPTyn7uxe7nZbNbFixfl6OhYZuYtANjC7NmzlZ6ervj4\neFuXAgAA7hJ0YwdgM926dVNCQoJKS0t16tQpffPNN5o8ebISEhKUnJwsJyencq8pLi6Wvb29DapF\ndWVnZycfHx/5+PioS5cuZc5ZLBYdP368TAi6YsUKaxhaq1atK4aggYGBcnNzs9ETXZudnd0V/+8B\nwJ1msVi0ZMkSLV682NalAACAuwhTNgDYjIODg7y8vNSoUSO1bt1af/vb37Rx40bt2LFDM2fOlPRb\nE5UpU6YoKipKdevW1ZAhQyRJ6enp6tatmxwdHeXm5qbIyEj98ssv5caYPn266tevL2dnZz377LO6\nePGi9ZzFYtHMmTMVEBAgR0dHtWzZUomJidbzfn5+kqTw8HCZTCZ17txZkrR161Z1795d9957r1xd\nXdWhQwelpKTcrrcJVZjJZFLDhg3VsWNHDRs2TK+//rqWL1+unTt36ty5c/rpp5/05ptvqmvXriou\nLtaqVas0evRo+fn5yc3NTe3atdOQIUOsIX9KSopOnTolFl0AgJSSkiKz2czewQAA4KYwsxNAldKi\nRQtFREQoKSlJU6dOlSTNmTNHEydO1LZt22SxWFRQUKAePXqobdu2Sk1N1ZkzZzRixAhFRUUpKSnJ\neq9NmzbJ0dFRycnJOnbsmKKiovT3v/9d8+fPlyRNnDhRK1asUExMjIKDg5WSkqIRI0aoXr166tWr\nl1JTU9W2bVutXbtWrVq1ss4oPX/+vP7yl79o3rx5MplMWrBggXr27Kns7Gy6VsPKZDKpfv36ql+/\nfrkf1C0Wi/Lz88vsEbp27VrrDFGz2XzFrvFBQUHy9PS8o/uZAoCtLFmyRMOGDeNzHgAAuCns2QnA\nJq61p+arr76q+fPnq7CwUL6+vmrZsqU+//xz6/l3331X0dHROnr0qLU5zMaNG9WlSxdlZWUpMDBQ\nkZGR+uyzz3T06FE5OztLkhITEzVs2DCdOXNGknTvvffqq6++0iOPPGK990svvaR9+/bpiy++uOE9\nOy0Wixo2bKg333xTQ4cOrZT3B9XbmTNnrtg1Pjs7W0VFRVcNQhs0aEAoAMAQzp8/Lx8fH2VmZsrL\ny8vW5QAAgLsIMzsBVDl/7MT9x6Bx7969CgkJKdMF++GHH5adnZ0yMjIUGBgoSQoJCbEGnZL00EMP\nqbi4WPv379elS5dUVFSkiIiIMmOVlJTI19f3mvWdPHlSr732mjZs2KC8vDyVlpbq4sWLysnJqchj\nA1Zubm5q27at2rZtW+7c2bNntX//fmsIunnzZr333nvKzs7W+fPnFRAQYA1AZ8yYoZo1+VIP4O6z\nbNkydenShaATAADcNH4CAlDlZGRkyN/f3/rxzTRLudFZbWazWZL0+eefq3HjxmXOXa8T/DPPPKO8\nvDzNnTtXvr6+cnBw0KOPPqri4uIbrhO4VXXr1lVoaKhCQ0PLnTt//rw1CD18+LANqgOAyrFkyRJN\nnDjR1mUAAIC7EGEngCrlp59+0tq1a6/5A07Tpk0VFxen8+fPW
2d3btmyRWazWU2bNrVel56eroKC\nAmtY+sMPP8je3l4BAQEym81ycHDQ4cOH1bVr1yuOc3mPztLS0jLHv/vuO82fP1+9evWSJOXl5en4\n8eO3/tBAJXFxcVHr1q3VunVrW5cCALdsz549OnLkiCIiImxdCgAAuAvRjR2AzVy6dEknTpxQbm6u\n0tLSNGfOHHXu3FmhoaGKjo6+6uuGDBmi2rVr6+mnn1Z6erq+/fZbjRw5Uv369bMuYZekX3/9VVFR\nUdqzZ4++/vprvfrqqxoxYoScnJzk4uKi6OhoRUdHKy4uTtnZ2dq1a5cWLVqkxYsXS5I8PT3l6Oio\ndevWKS8vz9rtvUmTJkpMTFRGRoa2bt2qgQMHWoNRAABQMbGxsYqMjGQbDgAAcEsIOwHYzPr169Wg\nQQM1btxYjz76qFatWqUpU6bo22+/vebS9dq1a2vdunU6d+6c2rZtqz59+uihhx5SXFxcmes6deqk\n5s2bq0uXLurbt6+6du2qmTNnWs9PmzZNU6ZM0axZs9S8eXM99thjSkpKkp+fnySpZs2amj9/vpYs\nWaKGDRuqT58+kqS4uDhduHBBoaGhGjhwoKKioq67zycAALi+S5cuKSEhQVFRUbYuBQAA3KXoxg4A\nAACgSli+fLkWLlyoDRs22LoUAABwl2JmJwAAAIAqITY2VsOHD7d1GQAA4C7GzE4AAAAANnf48GG1\nadNGR48elaOjo63LAQAAdylmdgIAAACwufj4eA0cOJCgEwAAVAhhJwAAAACbKi0tVVxcHEvYAQA3\n7cSJE+revbucnJxkMpkqdK/IyEj17t27kiqDrRB2AgAAALCp5ORkubu764EHHrB1KQCAKiYyMlIm\nk6ncnwcffFCSNGvWLOXm5mrXrl06fvx4hcaaN2+eEhMTK6Ns2FBNWxcAAAAAoHqjMREA4Fq6deum\nhISEMsfs7e0lSdnZ2QoNDVVQUNAt3//XX39VjRo1VKdOnQrViaqBmZ0AAAAAbCY/P1/r1q3T4MGD\nbV0KAKCKcnBwkJeXV5k/bm5u8vX11cqVK/XBBx/IZDIpMjJSkpSTk6O+ffvKxcVFLi4u6tevn44e\nPWq935QpU9SiRQvFx8crICBADg4OKigoKLeM3WKxaObMmQoICJCjo6NatmzJzM+7ADM7AQAAANhM\nYmKievfurbp169q6FADAXWbr1q0aPHiw3NzcNG/ePDk6OspsNqtPnz5ydHTUhg0bJEljx47Vn/70\nJ23dutW6r+fBgwf14Ycfavny5bK3t1etWrXK3X/ixIlasWKFYmJiFBwcrJSUFI0YMUL16tVTr169\n7uiz4sYRdgIAAACwCYvFotjYWL311lu2LgUAUIWtXbtWzs7OZY6NGTNGb7zxhhwcHOTo6CgvLy9J\n0tdff63du3dr//798vX1lSR9+OGHCgwMVHJysrp16yZJKi4uVkJCgurXr3/FMQsKCjRnzhx99dVX\neuSRRyRJfn5+Sk1NVUxMDGFnFUbYCQAAAMAmUlNTdfHiRXXq1MnWpQAAqrCOHTtq8eLFZY5dbUXA\n3r171bBhQ2vQKUn+/v5q2LChMjIyrGGnt7f3VYNOScrIyFBRUZEiIiLKdHkvKSkpc29UPYSdAAAA\nAGwiNjZWUVFRZX6IBADgj2rXrq3AwMAK3+f3X2+cnJyuea3ZbJYkff7552rcuHGZc/fcc0+Fa8Ht\nQ9gJAAAA4I67cOGCli9frj179ti6FACAgTRt2lS5ubk6dOiQdQbmgQMHlJubq2bNmt3wfZo1ayYH\nBwcdPnxYXbt2vU3V4nYg7AQAAABwxy1fvlwdOnRQw4YNbV0KAKCKu3Tpkk6cOFHmWI0aNeTh4VHu\n2m7duikkJERDhgzRvHnzJEl//etf1aZNm5sKLV1cXBQdHa3o6GhZLBZ17NhRFy5c0A8//CA7Ozs9\n99xzFXso3DaEnQAAAADuu
NjYWEVHR9u6DADAXWD9+vVq0KBBmWONGjXS0aNHy11rMpm0cuVKvfDC\nC+rSpYuk3wLQt95666a3TZk2bZrq16+vWbNm6fnnn5erq6tat26tV1555dYfBredyWKxWGxdBAAA\nAIDqIzMzU126dFFOTg77ngEAgEplZ+sCAAAAAFQvsbGxevrppwk6AQBApSPsBACgGpoyZYpatGhh\n6zIAVEMlJSX64IMPFBUVZetSAACAARF2AgBQheXl5enFF19UQECAHBwc1KhRIz3++OP64osvKnTf\n6Ohobdq0qZKqBIAbt3r1agUHBys4ONjWpQAAAAOiQREAAFXUoUOH1L59e7m4uOj1119Xq1atZDab\nlZycrFGjRiknJ6fca4qLi2Vvb3/dezs7O8vZ2fl2lA0A17RkyRINGzbM1mUAAACDYmYnAABV1OjR\noyVJ27Zt04ABAxQcHKymTZtq7Nix2r17t6Tfuk3GxMSoX79+cnJy0oQJE1RaWqphw4bJz89Pjo6O\nCgoK0syZM2U2m633/uMydrPZrGnTpsnHx0cODg5q2bKlVq5caT3/8MMPa9y4cWXqO3funBwdHfXJ\nJ59IkhITExUeHi4XFxd5enrqz3/+s44dO3bb3h8Ad59jx44pJSVF/fv3t3UpAADAoAg7AQCogs6c\nOaO1a9dqzJgxV5yBWbduXevfp06dqp49eyo9PV1jxoyR2WxWo0aN9J///Ed79+7VP//5T82YMUPv\nvffeVcebN2+e3nzzTb3xxhtKT09X37591a9fP+3atUuSNHToUH388cdlAtOkpCTVqlVLvXr1kvTb\nrNKpU6cqLS1Nq1evVn5+vgYNGlRZbwkAA4iPj9eAAQPk5ORk61IAAIBBmSwWi8XWRQAAgLJSU1PV\nrl07ffLJJ+rbt+9VrzOZTBo7dqzeeuuta97v1Vdf1bZt27R+/XpJv83sXLFihX766SdJUqNGjTRy\n5EhNmjTJ+prOnTvL29tbiYmJOn36tBo0aKAvv/xSjz76qCSpW7du8vf31+LFi684ZmZmppo2baoj\nR47I29v7pp4fgPGYzWYFBgZq2bJlCg8Pt3U5AADAoJjZCQBAFXQzv4sMCwsrd2zRokUKCwuTh4eH\nnJ2dNXfu3Cvu8Sn9thw9NzdX7du3L3O8Q4cOysjIkCS5u7srIiJCS5culSTl5uZqw4YNGjp0qPX6\nHTt2qE+fPrrvvvvk4uJiretq4wKoXjZu3FjmcwMAAMDtQNgJAEAVFBQUJJPJpL1791732j8uB122\nbJleeuklRUZGat26ddq1a5dGjx6t4uLim67DZDJZ/z506FAlJSWpqKhIH3/8sXx8fPTII49IkgoK\nCtSjRw/Vrl1bCQkJ2rp1q9auXStJtzQuAOO53Jjo959XAAAAKhthJwAAVZCbm5t69OihBQsW6MKF\nC+XOnz179qqv/e6779SuXTuNHTtWbdq0UWBgoPbv33/V611dXdWwYUN9//335e7TrFkz68dPPPGE\nJGn16tVaunSpBg8ebA0tMjMzlZ+frxkzZqhjx466//77dfLkyZt6ZgDG9d///ldffPGFhgwZYutS\nAACAwRF2AgBQRcXExMhisSgsLEzLly/Xzz//rMzMTL399tsKCQm56uuaNGmiHTt26Msvv1RWVpam\nTZumTZs2XXOs8ePHa9asWfroo4+0b98+TZo0SZs3b1Z0dLT1mlq1aunJJ5/U9OnTtWPHjjJL2Bs3\nbiwHBwctWLBABw4c0Jo1a/Taa69V/E0AYAhLly7V448/Lnd3d1uXAgAADI6wEwCAKsrf3187duzQ\nY489pr///e8KCQlR165dtWrVqqs2BZKkkSNHasCAARo8eLDCw8N16NAhjRs37ppjvfDCCxo/frxe\neeUVtWjRQp9++qmSkpLUqlWrMtcNHTpUaWlpeuCBB8rM+vTw8ND777+vzz77TM2aNdPUqVM
1Z86c\nir0BAAzBYrFYl7ADAADcbnRjBwAAAHDbbN++Xf3799f+/ftlZ8dcCwAAcHvx3QYAAACA2yY2NlZR\nUVEEnQAA4I5gZicAAACA26KwsFDe3t5KS0uTj4+PrcsBAADVAL9eBQAAAHBbJCUlqV27dgSdAADg\njiHsBAAAAHBbxMbGavjw4bYuAwAAVCMsYwcAAABQ6bKystShQwcdOXJE9vb2ti4HAABUE8zsBAAA\nAFDpEhISNHToUIJOAABwRzGzEwAAAEClslgsKiws1KVLl+Tm5mbrcgAAQDVC2AkAAAAAAADAEFjG\nDgAAAAAAAMAQCDsBAAAAAAAAGAJhJwAAAAAAAABDIOwEAAAAAAAAYAiEnQAAAAAAAAAMgbATAAAA\nAAAAgCEQdgIAAAAAAAAwBMJOAAAAAAAAAIZA2AkAAAAAAADAEAg7AQAAAAAAABgCYScAAAAAAAAA\nQyDsBAAAAAAAAGAIhJ0AAAAAAAAADIGwEwAAAAAAAIAhEHYCAAAAAAAAMATCTgAAAAAAAACGQNgJ\nAAAAAAAAwBAIOwEAAAAAAAAYAmEnAAAAAAAAAEMg7AQAAAAAAABgCISdAAAAAAAAAAyBsBMAAAAA\nAACAIRB2AgAAAAAAADAEwk4AAAAAAAAAhkDYCQAAAAAAAMAQCDsBAAAAAAAAGAJhJwAAAIByfH19\nNWvWrDsy1saNG2UymZSfn39HxgMAAMZlslgsFlsXAQAAAODOycvL07/+9S+tXr1aR44ckaurqwID\nAzVo0CA9++yzcnZ21qlTp+Tk5KTatWvf9nqKi4t15swZ1a9fXyaT6baPBwAAjKumrQsAAAAAcOcc\nOnRI7du3l6urq6ZNm6aQkBA5Ojpqz549WrJkidzd3TV48GB5eHhUeKzi4mLZ29tf9zp7e3t5eXlV\neDwAAACWsQMAAADVyPPPPy87Oztt27ZNAwcOVLNmzeTn56fevXvrs88+06BBgySVX8ZuMpm0YsWK\nMve60jUxMTHq16+fnJycNGHCBEnSmjVrFBwcrFq1aqljx476+OOPZTKZdOjQIUnll7HHx8fL2dm5\nzFgsdQcAADeCsBMAAACoJk6fPq1169ZpzJgxcnJyuuI1FV1GPnXqVPXs2VPp6ekaM2aMcnJy1K9f\nP/Xq1UtpaWl64YUX9Morr1RoDAAAgKsh7AQAAACqiezsbFksFgUHB5c57u3tLWdnZzk7O2vUqFEV\nGuOpp57S8OHD5e/vLz8/P7399tvy9/fXnDlzFBwcrP79+1d4DAAAgKsh7AQAAACquc2bN2vXrl1q\n27atioqKKnSvsLCwMh9nZmYqPDy8zLF27dpVaAwAAICroUERAAAAUE0EBgbKZDIpMzOzzHE/Pz9J\numbndZPJJIvFUuZYSUlJueuutjz+ZtjZ2d3QWAAAAH/EzE4AAACgmnB3d1f37t21YMECXbhw4aZe\n6+HhoePHj1s/zsvLK/Px1dx///3atm1bmWOpqanXHauwsFDnzp2zHtu1a9dN1QsAAKonwk4AAACg\nGlm4cKHMZrNCQ0P10UcfKSMjQ/v27dNHH32ktLQ01ahR44qv69q1q2JiYrRt2zbt3LlTkZGRqlWr\n1nXHGzVqlPbv36/o6Gj9/PPP+uSTT/TOO+9IunozpHbt2snJyUn/+Mc/lJ2draSkJC1cuPDWHxoA\nAFQbhJ0AAABANeLv76+dO3cqIiJCr732mh544AG1adNGc+bM0ejRo/Xvf//7iq+bPXu2/P391blz\nZ/Xv31/Dhw+Xp6fndce77777lJSUpFWrVqlVq1aaO3euJk+eLElXDUvd3Ny0dOlSff3112rZsqUW\nL16sadOm3fpDAwCAasNk+eNmOAAAAABwG82bN0+TJk3S2bNnrzq7EwAA4FbQoAgAAADAbRUTE6Pw\n8HB5eHjohx9+0LRp0xQZGUnQCQAAKh1hJwAAAIDbKjs
7WzNmzNDp06fl7e2tUaNGadKkSbYuCwAA\nGBDL2AEAAAAAAAAYAg2KAAAAAAAAABgCYScAAAAAAAAAQyDsBAAAAAAAAGAIhJ0AAAAAAAAADIGw\nEwAAAAAAAIAhEHYCAAAAAAAAMATCTgAAAAAAAACGQNgJAAAAAAAAwBAIOwEAAAAAAAAYAmEnAAAA\nAAAAAEMg7AQAAAAAAABgCISdAAAAAAAAAAyBsBMAAAAAAACAIRB2AgAAAAAAADAEwk4AAAAAAAAA\nhkDYCQAAAAAAAMAQCDsBAAAAAAAAGAJhJwAAAAAAAABDIOwEAAAAAAAAYAiEnQAAAAAAAAAMgbAT\nAAAAAAAAgCEQdgIAAAAAAAAwBMJOAAAAAAAAAIZA2AkAAAAAAADAEAg7AQAAAAAAABgCYScAAAAA\nAAAAQyDsBAAAAAAAAGAIhJ0AAAAAAAAADIGwEwAAAAAAAIAhEHYCAAAAAAAAMATCTgAAAAAAAACG\nQNgJAAAAAAAAwBAIOwEAAAAAAAAYAmEnAAAAAAAAAEMg7AQAAAAAAABgCISdAAAAAAAAAAyBsBMA\nAAAAAACAIRB2AgAAAAAAADAEwk4AAAAAAAAAhkDYCQAAAAAAAMAQCDsBAAAAAAAAGAJhJwAAAAAA\nAABDIOwEAAAAAAAAYAiEnQAAAAAAAAAMgbATAAAAAAAAgCEQdgIAAAAAAAAwBMJOAAAAAAAAAIZA\n2AkAAAAAAADAEAg7AQAAAAAAABgCYScAAAAAAAAAQyDsBAAAAAAAAGAIhJ0AAAAAAAAADIGwEwAA\nAAAAAIAhEHYCAAAAAAAAMATCTgAAAAAAAACGQNgJAAAAAAAAwBAIOwEAAAAAAAAYAmEnAAAAAAAA\nAEMg7AQAAAAAAABgCISdAAAAAAAAAAyBsBMAAAAAAACAIRB2AgAAAAAAADAEwk4AAAAAAAAAhkDY\nCQAAAAAAAMAQCDsBAAAAAAAAGAJhJwAAAAAAAABDIOwEAAAAAAAAYAiEnQAAAAAAAAAMgbATAAAA\nAAAAgCEQdgIAAAAAAAAwBMJOAAAAAAAAAIbw/w8Gv+6fOvtiAAAAAElFTkSuQmCC\n", - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" + "name": "stdout", + "output_type": "stream", + "text": [ + "Left\n", + "Suck\n", + "Right\n" + ] } ], "source": [ - "show_map(node_colors)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Voila! You see, the romania map as shown in the Figure[3.2] in the book. Now, see how different searching algorithms perform with our problem statements." 
+ "state1 = [(0, 0), [(0, 0), \"Dirty\"], [(1, 0), [\"Dirty\"]]]\n", + "state2 = [(1, 0), [(0, 0), \"Dirty\"], [(1, 0), [\"Dirty\"]]]\n", + "state3 = [(0, 0), [(0, 0), \"Clean\"], [(1, 0), [\"Dirty\"]]]\n", + "state4 = [(1, 0), [(0, 0), \"Clean\"], [(1, 0), [\"Dirty\"]]]\n", + "state5 = [(0, 0), [(0, 0), \"Dirty\"], [(1, 0), [\"Clean\"]]]\n", + "state6 = [(1, 0), [(0, 0), \"Dirty\"], [(1, 0), [\"Clean\"]]]\n", + "state7 = [(0, 0), [(0, 0), \"Clean\"], [(1, 0), [\"Clean\"]]]\n", + "state8 = [(1, 0), [(0, 0), \"Clean\"], [(1, 0), [\"Clean\"]]]\n", + "\n", + "a = vacuumAgent(state1)\n", + "\n", + "print(a(state6)) \n", + "print(a(state1))\n", + "print(a(state3))" ] }, { @@ -440,155 +1090,41 @@ "\n", "In this section, we have visualizations of the following searching algorithms:\n", "\n", - "1. Breadth First Tree Search - Implemented\n", + "1. Breadth First Tree Search\n", "2. Depth First Tree Search\n", - "3. Depth First Graph Search\n", - "4. Breadth First Search - Implemented\n", + "3. Breadth First Search\n", + "4. Depth First Graph Search\n", "5. Best First Graph Search\n", - "6. Uniform Cost Search - Implemented\n", + "6. Uniform Cost Search\n", "7. Depth Limited Search\n", "8. Iterative Deepening Search\n", - "9. A\\*-Search - Implemented\n", + "9. Greedy Best First Search\n", + "9. A\\*-Search\n", "10. Recursive Best First Search\n", "\n", "We add the colors to the nodes to have a nice visualisation when displaying. So, these are the different colors we are using in these visuals:\n", "* Un-explored nodes - white\n", "* Frontier nodes - orange\n", "* Currently exploring node - red\n", - "* Already explored nodes - gray\n", - "\n", - "Now, we will define some helper methods to display interactive buttons and sliders when visualising search algorithms." 
- ] - }, - { - "cell_type": "code", - "execution_count": 11, - "metadata": { - "collapsed": true - }, - "outputs": [], - "source": [ - "def final_path_colors(problem, solution):\n", - " \"returns a node_colors dict of the final path provided the problem and solution\"\n", - " \n", - " # get initial node colors\n", - " final_colors = dict(initial_node_colors)\n", - " # color all the nodes in solution and starting node to green\n", - " final_colors[problem.initial] = \"green\"\n", - " for node in solution:\n", - " final_colors[node] = \"green\" \n", - " return final_colors\n", - "\n", - "\n", - "def display_visual(user_input, algorithm=None, problem=None):\n", - " if user_input == False:\n", - " def slider_callback(iteration):\n", - " # don't show graph for the first time running the cell calling this function\n", - " try:\n", - " show_map(all_node_colors[iteration])\n", - " except:\n", - " pass\n", - " def visualize_callback(Visualize):\n", - " if Visualize is True:\n", - " button.value = False\n", - " \n", - " global all_node_colors\n", - " \n", - " iterations, all_node_colors, node = algorithm(problem)\n", - " solution = node.solution()\n", - " all_node_colors.append(final_path_colors(problem, solution))\n", - " \n", - " slider.max = len(all_node_colors) - 1\n", - " \n", - " for i in range(slider.max + 1):\n", - " slider.value = i\n", - " #time.sleep(.5)\n", - " \n", - " slider = widgets.IntSlider(min=0, max=1, step=1, value=0)\n", - " slider_visual = widgets.interactive(slider_callback, iteration = slider)\n", - " display(slider_visual)\n", - "\n", - " button = widgets.ToggleButton(value = False)\n", - " button_visual = widgets.interactive(visualize_callback, Visualize = button)\n", - " display(button_visual)\n", - " \n", - " if user_input == True:\n", - " node_colors = dict(initial_node_colors)\n", - " if algorithm == None:\n", - " algorithms = {\"Breadth First Tree Search\": breadth_first_tree_search,\n", - " \"Breadth First Search\": breadth_first_search,\n", 
- " \"Uniform Cost Search\": uniform_cost_search,\n", - " \"A-star Search\": astar_search}\n", - " algo_dropdown = widgets.Dropdown(description = \"Search algorithm: \",\n", - " options = sorted(list(algorithms.keys())),\n", - " value = \"Breadth First Tree Search\")\n", - " display(algo_dropdown)\n", - " \n", - " def slider_callback(iteration):\n", - " # don't show graph for the first time running the cell calling this function\n", - " try:\n", - " show_map(all_node_colors[iteration])\n", - " except:\n", - " pass\n", - " \n", - " def visualize_callback(Visualize):\n", - " if Visualize is True:\n", - " button.value = False\n", - " \n", - " problem = GraphProblem(start_dropdown.value, end_dropdown.value, romania_map)\n", - " global all_node_colors\n", - " \n", - " if algorithm == None:\n", - " user_algorithm = algorithms[algo_dropdown.value]\n", - " \n", - "# print(user_algorithm)\n", - "# print(problem)\n", - " \n", - " iterations, all_node_colors, node = user_algorithm(problem)\n", - " solution = node.solution()\n", - " all_node_colors.append(final_path_colors(problem, solution))\n", - "\n", - " slider.max = len(all_node_colors) - 1\n", - " \n", - " for i in range(slider.max + 1):\n", - " slider.value = i\n", - "# time.sleep(.5)\n", - " \n", - " start_dropdown = widgets.Dropdown(description = \"Start city: \",\n", - " options = sorted(list(node_colors.keys())), value = \"Arad\")\n", - " display(start_dropdown)\n", - "\n", - " end_dropdown = widgets.Dropdown(description = \"Goal city: \",\n", - " options = sorted(list(node_colors.keys())), value = \"Fagaras\")\n", - " display(end_dropdown)\n", - " \n", - " button = widgets.ToggleButton(value = False)\n", - " button_visual = widgets.interactive(visualize_callback, Visualize = button)\n", - " display(button_visual)\n", - " \n", - " slider = widgets.IntSlider(min=0, max=1, step=1, value=0)\n", - " slider_visual = widgets.interactive(slider_callback, iteration = slider)\n", - " display(slider_visual)" + "* Already 
explored nodes - gray" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## BREADTH-FIRST TREE SEARCH\n", + "## 1. BREADTH-FIRST TREE SEARCH\n", "\n", "We have a working implementation in search module. But as we want to interact with the graph while it is searching, we need to modify the implementation. Here's the modified breadth first tree search." ] }, { "cell_type": "code", - "execution_count": 12, - "metadata": { - "collapsed": true - }, + "execution_count": 14, + "metadata": {}, "outputs": [], "source": [ - "def tree_search(problem, frontier):\n", + "def tree_breadth_search_for_vis(problem):\n", " \"\"\"Search through the successors of a problem to find a goal.\n", " The argument frontier should be an empty queue.\n", " Don't worry about repeated paths to a state. [Figure 3.7]\"\"\"\n", @@ -596,10 +1132,10 @@ " # we use these two variables at the time of visualisations\n", " iterations = 0\n", " all_node_colors = []\n", - " node_colors = dict(initial_node_colors)\n", + " node_colors = {k : 'white' for k in problem.graph.nodes()}\n", " \n", " #Adding first node to the queue\n", - " frontier.append(Node(problem.initial))\n", + " frontier = deque([Node(problem.initial)])\n", " \n", " node_colors[Node(problem.initial).state] = \"orange\"\n", " iterations += 1\n", @@ -607,7 +1143,7 @@ " \n", " while frontier:\n", " #Popping first node of queue\n", - " node = frontier.pop()\n", + " node = frontier.popleft()\n", " \n", " # modify the currently searching node to red\n", " node_colors[node.state] = \"red\"\n", @@ -637,7 +1173,7 @@ "\n", "def breadth_first_tree_search(problem):\n", " \"Search the shallowest nodes in the search tree first.\"\n", - " iterations, all_node_colors, node = tree_search(problem, FIFOQueue())\n", + " iterations, all_node_colors, node = tree_breadth_search_for_vis(problem)\n", " return(iterations, all_node_colors, node)" ] }, @@ -645,49 +1181,127 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Now, we use ipywidgets to 
display a slider, a button and our romania map. By sliding the slider we can have a look at all the intermediate steps of a particular search algorithm. By pressing the button **Visualize**, you can see all the steps without interacting with the slider. These two helper functions are the callback functions which are called when we interact with the slider and the button.\n", - "\n" + "Now, we use `ipywidgets` to display a slider, a button and our romania map. By sliding the slider we can have a look at all the intermediate steps of a particular search algorithm. By pressing the button **Visualize**, you can see all the steps without interacting with the slider. These two helper functions are the callback functions which are called when we interact with the slider and the button." ] }, { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "all_node_colors = []\n", - "romania_problem = GraphProblem('Arad', 'Fagaras', romania_map)\n", - "display_visual(user_input = False, algorithm = breadth_first_tree_search, problem = romania_problem)" + "romania_problem = GraphProblem('Arad', 'Bucharest', romania_map)\n", + "a, b, c = breadth_first_tree_search(romania_problem)\n", + "display_visual(romania_graph_data, user_input=False, \n", + " algorithm=breadth_first_tree_search, \n", + " problem=romania_problem)" ] }, { "cell_type": "markdown", - "metadata": { - "collapsed": true - }, + "metadata": {}, "source": [ - "## BREADTH-FIRST SEARCH\n", + "## 2. DEPTH-FIRST TREE SEARCH\n", + "Now let's discuss another searching algorithm, Depth-First Tree Search." + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": {}, + "outputs": [], + "source": [ + "def tree_depth_search_for_vis(problem):\n", + " \"\"\"Search through the successors of a problem to find a goal.\n", + " The argument frontier should be an empty queue.\n", + " Don't worry about repeated paths to a state. 
[Figure 3.7]\"\"\"\n", + " \n", + " # we use these two variables at the time of visualisations\n", + " iterations = 0\n", + " all_node_colors = []\n", + " node_colors = {k : 'white' for k in problem.graph.nodes()}\n", + " \n", + " #Adding first node to the stack\n", + " frontier = [Node(problem.initial)]\n", + " \n", + " node_colors[Node(problem.initial).state] = \"orange\"\n", + " iterations += 1\n", + " all_node_colors.append(dict(node_colors))\n", + " \n", + " while frontier:\n", + " #Popping first node of stack\n", + " node = frontier.pop()\n", + " \n", + " # modify the currently searching node to red\n", + " node_colors[node.state] = \"red\"\n", + " iterations += 1\n", + " all_node_colors.append(dict(node_colors))\n", + " \n", + " if problem.goal_test(node.state):\n", + " # modify goal node to green after reaching the goal\n", + " node_colors[node.state] = \"green\"\n", + " iterations += 1\n", + " all_node_colors.append(dict(node_colors))\n", + " return(iterations, all_node_colors, node)\n", + " \n", + " frontier.extend(node.expand(problem))\n", + " \n", + " for n in node.expand(problem):\n", + " node_colors[n.state] = \"orange\"\n", + " iterations += 1\n", + " all_node_colors.append(dict(node_colors))\n", "\n", - "Let's change all the node_colors to starting position and define a different problem statement." 
+ " # modify the color of explored nodes to gray\n", + " node_colors[node.state] = \"gray\"\n", + " iterations += 1\n", + " all_node_colors.append(dict(node_colors))\n", + " \n", + " return None\n", + "\n", + "def depth_first_tree_search(problem):\n", + " \"Search the deepest nodes in the search tree first.\"\n", + " iterations, all_node_colors, node = tree_depth_search_for_vis(problem)\n", + " return(iterations, all_node_colors, node)" ] }, { "cell_type": "code", - "execution_count": 13, + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "all_node_colors = []\n", + "romania_problem = GraphProblem('Arad', 'Bucharest', romania_map)\n", + "display_visual(romania_graph_data, user_input=False, \n", + " algorithm=depth_first_tree_search, \n", + " problem=romania_problem)" + ] + }, + { + "cell_type": "markdown", "metadata": { "collapsed": true }, + "source": [ + "## 3. BREADTH-FIRST GRAPH SEARCH\n", + "\n", + "Let's change all the `node_colors` to starting position and define a different problem statement." 
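A recurring change in this PR, visible in the next hunk, replaces the old `FIFOQueue` with `collections.deque`: children are appended on the right and the oldest node is taken with `popleft()`, keeping the frontier FIFO. A minimal sketch of that pattern (the toy graph here is made up for illustration, not the Romania map):

```python
from collections import deque

# Hypothetical toy graph for illustration: node -> list of neighbors.
graph = {'A': ['B', 'C'], 'B': ['D'], 'C': ['D'], 'D': []}

def bfs_path(graph, start, goal):
    """FIFO frontier via deque: append new paths, popleft the oldest."""
    frontier = deque([[start]])
    explored = set()
    while frontier:
        path = frontier.popleft()
        node = path[-1]
        if node == goal:
            return path
        if node not in explored:
            explored.add(node)
            for child in graph[node]:
                frontier.append(path + [child])
    return None

print(bfs_path(graph, 'A', 'D'))  # ['A', 'B', 'D']
```

Because `popleft()` always takes the oldest entry, nodes leave the frontier in order of depth, which is exactly the property breadth-first search needs.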
+ ] + }, + { + "cell_type": "code", + "execution_count": 18, + "metadata": {}, "outputs": [], "source": [ - "def breadth_first_search(problem):\n", + "def breadth_first_search_graph(problem):\n", " \"[Figure 3.11]\"\n", " \n", " # we use these two variables at the time of visualisations\n", " iterations = 0\n", " all_node_colors = []\n", - " node_colors = dict(initial_node_colors)\n", + " node_colors = {k : 'white' for k in problem.graph.nodes()}\n", " \n", " node = Node(problem.initial)\n", " \n", @@ -701,8 +1315,7 @@ " all_node_colors.append(dict(node_colors))\n", " return(iterations, all_node_colors, node)\n", " \n", - " frontier = FIFOQueue()\n", - " frontier.append(node)\n", + " frontier = deque([node])\n", " \n", " # modify the color of frontier nodes to blue\n", " node_colors[node.state] = \"orange\"\n", @@ -711,7 +1324,7 @@ " \n", " explored = set()\n", " while frontier:\n", - " node = frontier.pop()\n", + " node = frontier.popleft()\n", " node_colors[node.state] = \"red\"\n", " iterations += 1\n", " all_node_colors.append(dict(node_colors))\n", @@ -740,139 +1353,117 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "all_node_colors = []\n", "romania_problem = GraphProblem('Arad', 'Bucharest', romania_map)\n", - "display_visual(user_input = False, algorithm = breadth_first_search, problem = romania_problem)" + "display_visual(romania_graph_data, user_input=False, \n", + " algorithm=breadth_first_search_graph, \n", + " problem=romania_problem)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## UNIFORM COST SEARCH\n", - "\n", - "Let's change all the node_colors to starting position and define a different problem statement." + "## 4. DEPTH-FIRST GRAPH SEARCH \n", + "Although we have a working implementation in search module, we have to make a few changes in the algorithm to make it suitable for visualization." 
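The `graph_search_for_vis` function in the next hunk keeps an `explored` set and uses a plain Python list as a LIFO stack (`append`/`pop`), which is what makes it depth-first. The same skeleton, stripped of the coloring bookkeeping, as a sketch on a made-up toy graph:

```python
# Hypothetical toy graph; a list used as a stack (append/pop) gives LIFO order.
graph = {'A': ['B', 'C'], 'B': ['D'], 'C': ['D'], 'D': []}

def dfs_graph(graph, start, goal):
    frontier = [start]          # stack: pop() takes the most recently added node
    explored = set()
    order = []                  # visit order, kept only for inspection
    while frontier:
        node = frontier.pop()
        if node in explored:
            continue
        explored.add(node)
        order.append(node)
        if node == goal:
            return order
        frontier.extend(child for child in graph[node] if child not in explored)
    return None

print(dfs_graph(graph, 'A', 'D'))  # ['A', 'C', 'D']
```

The `explored` set is the only difference from the tree-search version above: it prevents the search from re-expanding a state reached by a second path.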
] }, { "cell_type": "code", - "execution_count": 14, - "metadata": { - "collapsed": true - }, + "execution_count": 20, + "metadata": {}, "outputs": [], "source": [ - "def best_first_graph_search(problem, f):\n", - " \"\"\"Search the nodes with the lowest f scores first.\n", - " You specify the function f(node) that you want to minimize; for example,\n", - " if f is a heuristic estimate to the goal, then we have greedy best\n", - " first search; if f is node.depth then we have breadth-first search.\n", - " There is a subtlety: the line \"f = memoize(f, 'f')\" means that the f\n", - " values will be cached on the nodes as they are computed. So after doing\n", - " a best first search you can examine the f values of the path returned.\"\"\"\n", - " \n", + "def graph_search_for_vis(problem):\n", + " \"\"\"Search through the successors of a problem to find a goal.\n", + " The argument frontier should be an empty queue.\n", + " If two paths reach a state, only use the first one. [Figure 3.7]\"\"\"\n", " # we use these two variables at the time of visualisations\n", " iterations = 0\n", " all_node_colors = []\n", - " node_colors = dict(initial_node_colors)\n", - " \n", - " f = memoize(f, 'f')\n", - " node = Node(problem.initial)\n", - " \n", - " node_colors[node.state] = \"red\"\n", - " iterations += 1\n", - " all_node_colors.append(dict(node_colors))\n", - " \n", - " if problem.goal_test(node.state):\n", - " node_colors[node.state] = \"green\"\n", - " iterations += 1\n", - " all_node_colors.append(dict(node_colors))\n", - " return(iterations, all_node_colors, node)\n", + " node_colors = {k : 'white' for k in problem.graph.nodes()}\n", " \n", - " frontier = PriorityQueue(min, f)\n", - " frontier.append(node)\n", + " frontier = [(Node(problem.initial))]\n", + " explored = set()\n", " \n", - " node_colors[node.state] = \"orange\"\n", + " # modify the color of frontier nodes to orange\n", + " node_colors[Node(problem.initial).state] = \"orange\"\n", " iterations += 1\n", " 
all_node_colors.append(dict(node_colors))\n", - " \n", - " explored = set()\n", + " \n", " while frontier:\n", + " # Popping first node of stack\n", " node = frontier.pop()\n", " \n", + " # modify the currently searching node to red\n", " node_colors[node.state] = \"red\"\n", " iterations += 1\n", " all_node_colors.append(dict(node_colors))\n", " \n", " if problem.goal_test(node.state):\n", + " # modify goal node to green after reaching the goal\n", " node_colors[node.state] = \"green\"\n", " iterations += 1\n", " all_node_colors.append(dict(node_colors))\n", " return(iterations, all_node_colors, node)\n", " \n", " explored.add(node.state)\n", - " for child in node.expand(problem):\n", - " if child.state not in explored and child not in frontier:\n", - " frontier.append(child)\n", - " node_colors[child.state] = \"orange\"\n", - " iterations += 1\n", - " all_node_colors.append(dict(node_colors))\n", - " elif child in frontier:\n", - " incumbent = frontier[child]\n", - " if f(child) < f(incumbent):\n", - " del frontier[incumbent]\n", - " frontier.append(child)\n", - " node_colors[child.state] = \"orange\"\n", - " iterations += 1\n", - " all_node_colors.append(dict(node_colors))\n", + " frontier.extend(child for child in node.expand(problem)\n", + " if child.state not in explored and\n", + " child not in frontier)\n", + " \n", + " for n in frontier:\n", + " # modify the color of frontier nodes to orange\n", + " node_colors[n.state] = \"orange\"\n", + " iterations += 1\n", + " all_node_colors.append(dict(node_colors))\n", "\n", + " # modify the color of explored nodes to gray\n", " node_colors[node.state] = \"gray\"\n", " iterations += 1\n", " all_node_colors.append(dict(node_colors))\n", + " \n", " return None\n", "\n", - "def uniform_cost_search(problem):\n", - " \"[Figure 3.14]\"\n", - " iterations, all_node_colors, node = best_first_graph_search(problem, lambda node: node.path_cost)\n", - " return(iterations, all_node_colors, node)" - ] - }, - { - "cell_type": 
"markdown", - "metadata": {}, - "source": [ - "## A\\* SEARCH\n", "\n", - "Let's change all the node_colors to starting position and define a different problem statement." + "def depth_first_graph_search(problem):\n", + " \"\"\"Search the deepest nodes in the search tree first.\"\"\"\n", + " iterations, all_node_colors, node = graph_search_for_vis(problem)\n", + " return(iterations, all_node_colors, node)" ] }, { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "all_node_colors = []\n", "romania_problem = GraphProblem('Arad', 'Bucharest', romania_map)\n", - "display_visual(user_input = False, algorithm = uniform_cost_search, problem = romania_problem)" + "display_visual(romania_graph_data, user_input=False, \n", + " algorithm=depth_first_graph_search, \n", + " problem=romania_problem)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 5. BEST FIRST SEARCH\n", + "\n", + "Let's change all the `node_colors` to starting position and define a different problem statement." 
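Best-first graph search pops whichever frontier node minimizes an evaluation function f(n). The notebook uses the module's `PriorityQueue`; the same idea can be sketched with the standard library's `heapq` (the weighted toy graph and names are illustrative only):

```python
import heapq

# Hypothetical weighted graph: node -> [(neighbor, step_cost), ...]
graph = {'A': [('B', 1), ('C', 5)], 'B': [('C', 1)], 'C': []}

def best_first(graph, start, goal, f):
    """Pop the frontier entry minimizing f(cost_so_far, node)."""
    frontier = [(f(0, start), 0, start, [start])]
    explored = set()
    while frontier:
        _, cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path, cost
        if node in explored:
            continue
        explored.add(node)
        for child, step in graph[node]:
            g = cost + step
            heapq.heappush(frontier, (f(g, child), g, child, path + [child]))
    return None

# With f = path cost g(n), this behaves as uniform-cost search.
print(best_first(graph, 'A', 'C', lambda g, n: g))  # (['A', 'B', 'C'], 2)
```

Choosing f is what distinguishes the family: f = g gives uniform-cost search (section 6), f = h gives greedy best-first search (section 9), and f = g + h gives A* (section 10).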
] }, { "cell_type": "code", - "execution_count": 15, - "metadata": { - "collapsed": true - }, + "execution_count": 22, + "metadata": {}, "outputs": [], "source": [ - "def best_first_graph_search(problem, f):\n", + "def best_first_graph_search_for_vis(problem, f):\n", " \"\"\"Search the nodes with the lowest f scores first.\n", " You specify the function f(node) that you want to minimize; for example,\n", " if f is a heuristic estimate to the goal, then we have greedy best\n", @@ -884,7 +1475,7 @@ " # we use these two variables at the time of visualisations\n", " iterations = 0\n", " all_node_colors = []\n", - " node_colors = dict(initial_node_colors)\n", + " node_colors = {k : 'white' for k in problem.graph.nodes()}\n", " \n", " f = memoize(f, 'f')\n", " node = Node(problem.initial)\n", @@ -899,7 +1490,7 @@ " all_node_colors.append(dict(node_colors))\n", " return(iterations, all_node_colors, node)\n", " \n", - " frontier = PriorityQueue(min, f)\n", + " frontier = PriorityQueue('min', f)\n", " frontier.append(node)\n", " \n", " node_colors[node.state] = \"orange\"\n", @@ -929,8 +1520,8 @@ " all_node_colors.append(dict(node_colors))\n", " elif child in frontier:\n", " incumbent = frontier[child]\n", - " if f(child) < f(incumbent):\n", - " del frontier[incumbent]\n", + " if f(child) < incumbent:\n", + " del frontier[child]\n", " frontier.append(child)\n", " node_colors[child.state] = \"orange\"\n", " iterations += 1\n", @@ -939,512 +1530,4986 @@ " node_colors[node.state] = \"gray\"\n", " iterations += 1\n", " all_node_colors.append(dict(node_colors))\n", - " return None\n", + " return None" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 6. 
UNIFORM COST SEARCH\n", "\n", - "def astar_search(problem, h=None):\n", - " \"\"\"A* search is best-first graph search with f(n) = g(n)+h(n).\n", - " You need to specify the h function when you call astar_search, or\n", - " else in your Problem subclass.\"\"\"\n", - " h = memoize(h or problem.h, 'h')\n", - " iterations, all_node_colors, node = best_first_graph_search(problem, lambda n: n.path_cost + h(n))\n", - " return(iterations, all_node_colors, node)" + "Let's change all the `node_colors` to starting position and define a different problem statement." ] }, { "cell_type": "code", - "execution_count": null, - "metadata": { - "collapsed": true - }, + "execution_count": 23, + "metadata": {}, "outputs": [], "source": [ - "all_node_colors = []\n", - "romania_problem = GraphProblem('Arad', 'Bucharest', romania_map)\n", - "display_visual(user_input = False, algorithm = astar_search, problem = romania_problem)" + "def uniform_cost_search_graph(problem):\n", + " \"[Figure 3.14]\"\n", + " #Uniform Cost Search uses Best First Search algorithm with f(n) = g(n)\n", + " iterations, all_node_colors, node = best_first_graph_search_for_vis(problem, lambda node: node.path_cost)\n", + " return(iterations, all_node_colors, node)\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { - "collapsed": true, "scrolled": false }, "outputs": [], "source": [ "all_node_colors = []\n", - "# display_visual(user_input = True, algorithm = breadth_first_tree_search)\n", - "display_visual(user_input = True)" + "romania_problem = GraphProblem('Arad', 'Bucharest', romania_map)\n", + "display_visual(romania_graph_data, user_input=False, \n", + " algorithm=uniform_cost_search_graph, \n", + " problem=romania_problem)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## GENETIC ALGORITHM\n", - "\n", - "Genetic algorithms (or GA) are inspired by natural evolution and are particularly useful in optimization and search problems with large state spaces.\n", + "## 7. 
DEPTH LIMITED SEARCH\n", "\n", - "Given a problem, algorithms in the domain make use of a *population* of solutions (also called *states*), where each solution/state represents a feasible solution. At each iteration (often called *generation*), the population gets updated using methods inspired by biology and evolution, like *crossover*, *mutation* and *natural selection*." + "Let's change all the `node_colors` to the starting position and define a different problem statement.\n", + "Although we have a working implementation, we need to make a few changes to it for visualization." ] }, { - "cell_type": "markdown", + "cell_type": "code", + "execution_count": 25, "metadata": {}, + "outputs": [], "source": [ - "### Overview\n", - "\n", - "A genetic algorithm works in the following way:\n", - "\n", - "1) Initialize random population.\n", + "def depth_limited_search_graph(problem, limit = -1):\n", + " '''\n", + " Perform depth first search of graph g.\n", + " if limit >= 0, that is the maximum depth of the search.\n", + " '''\n", + " # we use these two variables at the time of visualisations\n", + " iterations = 0\n", + " all_node_colors = []\n", + " node_colors = {k : 'white' for k in problem.graph.nodes()}\n", + " \n", + " frontier = [Node(problem.initial)]\n", + " explored = set()\n", + " \n", + " cutoff_occurred = False\n", + " node_colors[Node(problem.initial).state] = \"orange\"\n", + " iterations += 1\n", + " all_node_colors.append(dict(node_colors))\n", + " \n", + " while frontier:\n", + " # Popping last node added to the stack\n", + " node = frontier.pop()\n", + " \n", + " # modify the currently searching node to red\n", + " node_colors[node.state] = \"red\"\n", + " iterations += 1\n", + " all_node_colors.append(dict(node_colors))\n", + " \n", + " if problem.goal_test(node.state):\n", + " # modify goal node to green after reaching the goal\n", + " node_colors[node.state] = \"green\"\n", + " iterations += 1\n", + " all_node_colors.append(dict(node_colors))\n", + " return(iterations, all_node_colors, 
node)\n", "\n", - "2) Calculate population fitness.\n", + " elif limit >= 0:\n", + " cutoff_occurred = True\n", + " limit += 1\n", + " all_node_colors.pop()\n", + " iterations -= 1\n", + " node_colors[node.state] = \"gray\"\n", "\n", - "3) Select individuals for mating.\n", + " \n", + " explored.add(node.state)\n", + " frontier.extend(child for child in node.expand(problem)\n", + " if child.state not in explored and\n", + " child not in frontier)\n", + " \n", + " for n in frontier:\n", + " limit -= 1\n", + " # modify the color of frontier nodes to orange\n", + " node_colors[n.state] = \"orange\"\n", + " iterations += 1\n", + " all_node_colors.append(dict(node_colors))\n", "\n", - "4) Mate selected individuals to produce new population.\n", + " # modify the color of explored nodes to gray\n", + " node_colors[node.state] = \"gray\"\n", + " iterations += 1\n", + " all_node_colors.append(dict(node_colors))\n", + " \n", + " return 'cutoff' if cutoff_occurred else None\n", "\n", - " * Random chance to mutate individuals.\n", "\n", - "5) Repeat from step 2) until an individual is fit enough or the maximum number of iterations was reached." 
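The surrounding `depth_limited_search_graph` threads `limit` through an explicit stack; the idea is easier to see recursively. Below is a sketch of depth-limited DFS that distinguishes a genuine failure (`None`) from hitting the cutoff (`'cutoff'`), on a made-up toy graph (names and graph are illustrative, not the notebook's API):

```python
# Hypothetical toy graph, chosen only for illustration.
graph = {'A': ['B', 'C'], 'B': ['D'], 'C': [], 'D': []}

def depth_limited_dfs(node, goal, limit):
    """Recursive DFS that reports 'cutoff' when the depth limit blocked it."""
    if node == goal:
        return [node]
    if limit == 0:
        return 'cutoff'
    cutoff = False
    for child in graph[node]:
        result = depth_limited_dfs(child, goal, limit - 1)
        if result == 'cutoff':
            cutoff = True
        elif result is not None:
            return [node] + result
    return 'cutoff' if cutoff else None

print(depth_limited_dfs('A', 'D', 1))  # cutoff
print(depth_limited_dfs('A', 'D', 2))  # ['A', 'B', 'D']
```

Iterative deepening (section 8) simply retries this with limit 0, 1, 2, … until the result is no longer `'cutoff'`.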
+ "def depth_limited_search_for_vis(problem):\n", + " \"\"\"Search the deepest nodes in the search tree first.\"\"\"\n", + " iterations, all_node_colors, node = depth_limited_search_graph(problem)\n", + " return(iterations, all_node_colors, node) " ] }, { - "cell_type": "markdown", + "cell_type": "code", + "execution_count": null, "metadata": {}, + "outputs": [], "source": [ - "### Glossary\n", - "\n", - "Before we continue, we will lay the basic terminology of the algorithm.\n", - "\n", - "* Individual/State: A list of elements (called *genes*) that represent possible solutions.\n", - "\n", - "* Population: The list of all the individuals/states.\n", - "\n", - "* Gene pool: The alphabet of possible values for an individual's genes.\n", - "\n", - "* Generation/Iteration: The number of times the population will be updated.\n", - "\n", - "* Fitness: An individual's score, calculated by a function specific to the problem." + "all_node_colors = []\n", + "romania_problem = GraphProblem('Arad', 'Bucharest', romania_map)\n", + "display_visual(romania_graph_data, user_input=False, \n", + " algorithm=depth_limited_search_for_vis, \n", + " problem=romania_problem)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Crossover\n", - "\n", - "Two individuals/states can \"mate\" and produce one child. This offspring bears characteristics from both of its parents. There are many ways we can implement this crossover. Here we will take a look at the most common ones. Most other methods are variations of those below.\n", - "\n", - "* Point Crossover: The crossover occurs around one (or more) point. The parents get \"split\" at the chosen point or points and then get merged. In the example below we see two parents get split and merged at the 3rd digit, producing the following offspring after the crossover.\n", - "\n", - "![point crossover](images/point_crossover.png)\n", - "\n", - "* Uniform Crossover: This type of crossover chooses randomly the genes to get merged. 
Here the genes 1, 2 and 5 were chosen from the first parent, so the genes 3, 4 were added by the second parent.\n", + "## 8. ITERATIVE DEEPENING SEARCH\n", "\n", - "![uniform crossover](images/uniform_crossover.png)" + "Let's change all the 'node_colors' to starting position and define a different problem statement. " ] }, { - "cell_type": "markdown", + "cell_type": "code", + "execution_count": 27, "metadata": {}, + "outputs": [], "source": [ - "### Mutation\n", - "\n", - "When an offspring is produced, there is a chance it will mutate, having one (or more, depending on the implementation) of its genes altered.\n", - "\n", - "For example, let's say the new individual to undergo mutation is \"abcde\". Randomly we pick to change its third gene to 'z'. The individual now becomes \"abzde\" and is added to the population." + "def iterative_deepening_search_for_vis(problem):\n", + " for depth in range(sys.maxsize):\n", + " iterations, all_node_colors, node=depth_limited_search_for_vis(problem)\n", + " if iterations:\n", + " return (iterations, all_node_colors, node)" ] }, { - "cell_type": "markdown", + "cell_type": "code", + "execution_count": null, "metadata": {}, + "outputs": [], "source": [ - "### Selection\n", - "\n", - "At each iteration, the fittest individuals are picked randomly to mate and produce offsprings. We measure an individual's fitness with a *fitness function*. That function depends on the given problem and it is used to score an individual. Usually the higher the better.\n", - "\n", - "The selection process is this:\n", - "\n", - "1) Individuals are scored by the fitness function.\n", - "\n", - "2) Individuals are picked randomly, according to their score (higher score means higher chance to get picked). 
Usually the formula to calculate the chance to pick an individual is the following (for population *P* and individual *i*):\n", - "\n", - "$$ chance(i) = \\dfrac{fitness(i)}{\\sum_{k \\, in \\, P}{fitness(k)}} $$" + "all_node_colors = []\n", + "romania_problem = GraphProblem('Arad', 'Bucharest', romania_map)\n", + "display_visual(romania_graph_data, user_input=False, \n", + " algorithm=iterative_deepening_search_for_vis, \n", + " problem=romania_problem)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Implementation\n", - "\n", - "Below we look over the implementation of the algorithm in the `search` module.\n", - "\n", - "First the implementation of the main core of the algorithm:" + "## 9. GREEDY BEST FIRST SEARCH\n", + "Let's change all the node_colors to starting position and define a different problem statement." ] }, { "cell_type": "code", - "execution_count": 2, - "metadata": { - "collapsed": true - }, + "execution_count": 29, + "metadata": {}, "outputs": [], "source": [ - "%psource genetic_algorithm" + "def greedy_best_first_search(problem, h=None):\n", + " \"\"\"Greedy Best-first graph search is an informative searching algorithm with f(n) = h(n).\n", + " You need to specify the h function when you call best_first_search, or\n", + " else in your Problem subclass.\"\"\"\n", + " h = memoize(h or problem.h, 'h')\n", + " iterations, all_node_colors, node = best_first_graph_search_for_vis(problem, lambda n: h(n))\n", + " return(iterations, all_node_colors, node)\n" ] }, { - "cell_type": "markdown", + "cell_type": "code", + "execution_count": null, "metadata": {}, + "outputs": [], "source": [ - "The algorithm takes the following input:\n", - "\n", - "* `population`: The initial population.\n", - "\n", - "* `fitness_fn`: The problem's fitness function.\n", - "\n", - "* `gene_pool`: The gene pool of the states/individuals. By default 0 and 1.\n", - "\n", - "* `f_thres`: The fitness threshold. 
If an individual reaches that score, iteration stops. By default 'None', which means the algorithm will not halt until the generations are ran.\n", - "\n", - "* `ngen`: The number of iterations/generations.\n", - "\n", - "* `pmut`: The probability of mutation.\n", - "\n", - "The algorithm gives as output the state with the largest score." + "all_node_colors = []\n", + "romania_problem = GraphProblem('Arad', 'Bucharest', romania_map)\n", + "display_visual(romania_graph_data, user_input=False, \n", + " algorithm=greedy_best_first_search, \n", + " problem=romania_problem)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "For each generation, the algorithm updates the population. First it calculates the fitnesses of the individuals, then it selects the most fit ones and finally crosses them over to produce offsprings. There is a chance that the offspring will be mutated, given by `pmut`. If at the end of the generation an individual meets the fitness threshold, the algorithm halts and returns that individual.\n", + "## 10. A\\* SEARCH\n", "\n", - "The function of mating is accomplished by the method `reproduce`:" + "Let's change all the `node_colors` to starting position and define a different problem statement." 
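A* orders the frontier by f(n) = g(n) + h(n); with an admissible heuristic (one that never overestimates the remaining cost) the first goal node popped is optimal. A self-contained `heapq` sketch on a hypothetical graph with made-up heuristic values (not the Romania data):

```python
import heapq

# Hypothetical weighted graph and admissible heuristic, for illustration only.
graph = {'S': [('A', 2), ('B', 1)], 'A': [('G', 2)], 'B': [('G', 5)], 'G': []}
h = {'S': 3, 'A': 2, 'B': 4, 'G': 0}   # estimates of distance to G

def astar(graph, h, start, goal):
    frontier = [(h[start], 0, start, [start])]   # entries: (g + h, g, node, path)
    explored = set()
    while frontier:
        _, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path, g
        if node in explored:
            continue
        explored.add(node)
        for child, step in graph[node]:
            heapq.heappush(frontier, (g + step + h[child], g + step, child, path + [child]))
    return None

print(astar(graph, h, 'S', 'G'))  # (['S', 'A', 'G'], 4)
```

Note how the cheap-looking first step through B (cost 1) loses: its f-value 1 + 4 = 5 exceeds the full path through A (2 + 2 = 4), so A* never needs to expand B.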
] }, { "cell_type": "code", - "execution_count": 3, - "metadata": { - "collapsed": true - }, + "execution_count": 31, + "metadata": {}, + "outputs": [], + "source": [ + "def astar_search_graph(problem, h=None):\n", + " \"\"\"A* search is best-first graph search with f(n) = g(n)+h(n).\n", + " You need to specify the h function when you call astar_search, or\n", + " else in your Problem subclass.\"\"\"\n", + " h = memoize(h or problem.h, 'h')\n", + " iterations, all_node_colors, node = best_first_graph_search_for_vis(problem, \n", + " lambda n: n.path_cost + h(n))\n", + " return(iterations, all_node_colors, node)\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, "outputs": [], "source": [ - "%psource reproduce" + "all_node_colors = []\n", + "romania_problem = GraphProblem('Arad', 'Bucharest', romania_map)\n", + "display_visual(romania_graph_data, user_input=False, \n", + " algorithm=astar_search_graph, \n", + " problem=romania_problem)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "The method picks at random a point and merges the parents (`x` and `y`) around it.\n", - "\n", - "The mutation is done in the method `mutate`:" + "## 11. RECURSIVE BEST FIRST SEARCH\n", + "Let's change all the `node_colors` to starting position and define a different problem statement." 
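Recursive best-first search, implemented in the next hunk, mimics best-first search in linear space: it recurses into the best successor while that successor's f-value stays within `f_limit` (the f-value of the best alternative path), and backs up the cheapest leaf value when it unwinds. A compact sketch of the same control flow on a made-up graph and heuristic (`rbfs` here is a simplified stand-in, not the notebook's function):

```python
import math

# Hypothetical weighted graph and heuristic, chosen only for illustration.
graph = {'S': [('A', 2), ('B', 1)], 'A': [('G', 2)], 'B': [('G', 5)], 'G': []}
h = {'S': 3, 'A': 2, 'B': 4, 'G': 0}

def rbfs(node, g, f_limit, path):
    """Returns ((path, cost), f) on success, (None, backed_up_f) on failure."""
    if node == 'G':
        return (path, g), g
    succ = []
    for child, step in graph[node]:
        # path-max: a child's f never drops below its parent's
        succ.append([max(g + step + h[child], g + h[node]), g + step, child])
    if not succ:
        return None, math.inf
    while True:
        succ.sort()                       # cheapest f first
        best = succ[0]
        if best[0] > f_limit:
            return None, best[0]          # back up the best leaf f-value
        alternative = succ[1][0] if len(succ) > 1 else math.inf
        result, best[0] = rbfs(best[2], best[1], min(f_limit, alternative), path + [best[2]])
        if result is not None:
            return result, best[0]

solution, _ = rbfs('S', 0, math.inf, ['S'])
print(solution)  # (['S', 'A', 'G'], 4)
```

Updating `best[0]` with the backed-up value is the key trick: when the search later returns to this node, it remembers how expensive the abandoned subtree really was without keeping the subtree in memory.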
] }, { "cell_type": "code", - "execution_count": 4, - "metadata": { - "collapsed": true - }, + "execution_count": 33, + "metadata": {}, "outputs": [], "source": [ - "%psource mutate" + "def recursive_best_first_search_for_vis(problem, h=None):\n", + " \"\"\"[Figure 3.26] Recursive best-first search\"\"\"\n", + " # we use these two variables at the time of visualizations\n", + " iterations = 0\n", + " all_node_colors = []\n", + " node_colors = {k : 'white' for k in problem.graph.nodes()}\n", + " \n", + " h = memoize(h or problem.h, 'h')\n", + " \n", + " def RBFS(problem, node, flimit):\n", + " nonlocal iterations\n", + " def color_city_and_update_map(node, color):\n", + " node_colors[node.state] = color\n", + " nonlocal iterations\n", + " iterations += 1\n", + " all_node_colors.append(dict(node_colors))\n", + " \n", + " if problem.goal_test(node.state):\n", + " color_city_and_update_map(node, 'green')\n", + " return (iterations, all_node_colors, node), 0 # the second value is immaterial\n", + " \n", + " successors = node.expand(problem)\n", + " if len(successors) == 0:\n", + " color_city_and_update_map(node, 'gray')\n", + " return (iterations, all_node_colors, None), infinity\n", + " \n", + " for s in successors:\n", + " color_city_and_update_map(s, 'orange')\n", + " s.f = max(s.path_cost + h(s), node.f)\n", + " \n", + " while True:\n", + " # Order by lowest f value\n", + " successors.sort(key=lambda x: x.f)\n", + " best = successors[0]\n", + " if best.f > flimit:\n", + " color_city_and_update_map(node, 'gray')\n", + " return (iterations, all_node_colors, None), best.f\n", + " \n", + " if len(successors) > 1:\n", + " alternative = successors[1].f\n", + " else:\n", + " alternative = infinity\n", + " \n", + " node_colors[node.state] = 'gray'\n", + " node_colors[best.state] = 'red'\n", + " iterations += 1\n", + " all_node_colors.append(dict(node_colors))\n", + " result, best.f = RBFS(problem, best, min(flimit, alternative))\n", + " if result[2] is not None:\n", + " 
color_city_and_update_map(node, 'green')\n", + " return result, best.f\n", + " else:\n", + " color_city_and_update_map(node, 'red')\n", + " \n", + " node = Node(problem.initial)\n", + " node.f = h(node)\n", + " \n", + " node_colors[node.state] = 'red'\n", + " iterations += 1\n", + " all_node_colors.append(dict(node_colors))\n", + " result, bestf = RBFS(problem, node, infinity)\n", + " return result" ] }, { - "cell_type": "markdown", + "cell_type": "code", + "execution_count": null, "metadata": {}, + "outputs": [], "source": [ - "We pick a gene in `x` to mutate and a gene from the gene pool to replace it with.\n", - "\n", - "To help initializing the population we have the helper function `init_population`\":" + "all_node_colors = []\n", + "romania_problem = GraphProblem('Arad', 'Bucharest', romania_map)\n", + "display_visual(romania_graph_data, user_input=False,\n", + " algorithm=recursive_best_first_search_for_vis,\n", + " problem=romania_problem)" ] }, { "cell_type": "code", - "execution_count": 5, + "execution_count": null, "metadata": { - "collapsed": true + "scrolled": false }, "outputs": [], "source": [ - "%psource init_population" + "all_node_colors = []\n", + "# display_visual(romania_graph_data, user_input=True, algorithm=breadth_first_tree_search)\n", + "algorithms = { \"Breadth First Tree Search\": tree_breadth_search_for_vis,\n", + " \"Depth First Tree Search\": tree_depth_search_for_vis,\n", + " \"Breadth First Search\": breadth_first_search_graph,\n", + " \"Depth First Graph Search\": graph_search_for_vis,\n", + " \"Best First Graph Search\": best_first_graph_search_for_vis,\n", + " \"Uniform Cost Search\": uniform_cost_search_graph,\n", + " \"Depth Limited Search\": depth_limited_search_for_vis,\n", + " \"Iterative Deepening Search\": iterative_deepening_search_for_vis,\n", + " \"Greedy Best First Search\": greedy_best_first_search,\n", + " \"A-star Search\": astar_search_graph,\n", + " \"Recursive Best First Search\": 
recursive_best_first_search_for_vis}\n", + "display_visual(romania_graph_data, algorithm=algorithms, user_input=True)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "The function takes as input the number of individuals in the population, the gene pool and the length of each individual/state. It creates individuals with random genes and returns the population when done." + "## RECURSIVE BEST-FIRST SEARCH\n", + "Recursive best-first search is a simple recursive algorithm that improves upon heuristic search by reducing the memory requirement.\n", + "RBFS uses only linear space and attempts to mimic the operation of standard best-first search.\n", + "Its structure is similar to that of recursive depth-first search, but it doesn't continue indefinitely down the current path; the `f_limit` variable keeps track of the f-value of the best _alternative_ path available from any ancestor of the current node.\n", + "RBFS remembers the f-value of the best leaf in the forgotten subtree and can decide whether it is worth re-expanding the tree later.\n", + "
    \n", + "However, RBFS still suffers from excessive node regeneration.\n", + "
    \n", + "Let's have a look at the implementation." ] }, { - "cell_type": "markdown", + "cell_type": "code", + "execution_count": 36, "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def recursive_best_first_search(problem, h=None):\n",
    +       "    """[Figure 3.26] Recursive best-first search (RBFS) is an\n",
    +       "    informative search algorithm. Like A*, it uses the heuristic\n",
    +       "    f(n) = g(n) + h(n) to determine the next node to expand, making\n",
    +       "    it both optimal and complete (iff the heuristic is consistent).\n",
    +       "    To reduce memory consumption, RBFS uses a depth first search\n",
    +       "    and only retains the best f values of its ancestors."""\n",
    +       "    h = memoize(h or problem.h, 'h')\n",
    +       "\n",
    +       "    def RBFS(problem, node, flimit):\n",
    +       "        if problem.goal_test(node.state):\n",
    +       "            return node, 0   # (The second value is immaterial)\n",
    +       "        successors = node.expand(problem)\n",
    +       "        if len(successors) == 0:\n",
    +       "            return None, infinity\n",
    +       "        for s in successors:\n",
    +       "            s.f = max(s.path_cost + h(s), node.f)\n",
    +       "        while True:\n",
    +       "            # Order by lowest f value\n",
    +       "            successors.sort(key=lambda x: x.f)\n",
    +       "            best = successors[0]\n",
    +       "            if best.f > flimit:\n",
    +       "                return None, best.f\n",
    +       "            if len(successors) > 1:\n",
    +       "                alternative = successors[1].f\n",
    +       "            else:\n",
    +       "                alternative = infinity\n",
    +       "            result, best.f = RBFS(problem, best, min(flimit, alternative))\n",
    +       "            if result is not None:\n",
    +       "                return result, best.f\n",
    +       "\n",
    +       "    node = Node(problem.initial)\n",
    +       "    node.f = h(node)\n",
    +       "    result, bestf = RBFS(problem, node, infinity)\n",
    +       "    return result\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], "source": [ - "### Usage\n", - "\n", - "Below we give two example usages for the genetic algorithm, for a graph coloring problem and the 8 queens problem.\n", - "\n", - "#### Graph Coloring\n", - "\n", - "First we will take on the simpler problem of coloring a small graph with two colors. Before we do anything, let's imagine how a solution might look. First, we have to represent our colors. Say, 'R' for red and 'G' for green. These make up our gene pool. What of the individual solutions though? For that, we will look at our problem. We stated we have a graph. A graph has nodes and edges, and we want to color the nodes. Naturally, we want to store each node's color. If we have four nodes, we can store their colors in a list of genes, one for each node. A possible solution will then look like this: ['R', 'R', 'G', 'R']. In the general case, we will represent each solution with a list of chars ('R' and 'G'), with length the number of nodes.\n", - "\n", - "Next we need to come up with a fitness function that appropriately scores individuals. Again, we will look at the problem definition at hand. We want to color a graph. For a solution to be optimal, no edge should connect two nodes of the same color. How can we use this information to score a solution? A naive (and ineffective) approach would be to count the different colors in the string. So ['R', 'R', 'R', 'R'] has a score of 1 and ['R', 'R', 'G', 'G'] has a score of 2. Why that fitness function is not ideal though? Why, we forgot the information about the edges! The edges are pivotal to the problem and the above function only deals with node colors. We didn't use all the information at hand and ended up with an ineffective answer. How, then, can we use that information to our advantage?\n", - "\n", - "We said that the optimal solution will have all the edges connecting nodes of different color. 
So, to score a solution we can count how many edges are valid (aka connecting nodes of different color). That is a great fitness function!\n", - "\n", - "Let's jump into solving this problem using the `genetic_algorithm` function." + "psource(recursive_best_first_search)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "First we need to represent the graph. Since we mostly need information about edges, we will just store the edges. We will denote edges with capital letters and nodes with integers:" + "This is how `recursive_best_first_search` can solve the `romania_problem`" ] }, { "cell_type": "code", - "execution_count": 6, - "metadata": { - "collapsed": true - }, - "outputs": [], + "execution_count": 37, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "['Sibiu', 'Rimnicu', 'Pitesti', 'Bucharest']" + ] + }, + "execution_count": 37, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ - "edges = {\n", - " 'A': [0, 1],\n", - " 'B': [0, 3],\n", - " 'C': [1, 2],\n", - " 'D': [2, 3]\n", - "}" + "recursive_best_first_search(romania_problem).solution()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Edge 'A' connects nodes 0 and 1, edge 'B' connects nodes 0 and 3 etc.\n", - "\n", - "We already said our gene pool is 'R' and 'G', so we can jump right into initializing our population. Since we have only four nodes, `state_length` should be 4. For the number of individuals, we will try 8. We can increase this number if we need higher accuracy, but be careful! Larger populations need more computating power and take longer. You need to strike that sweet balance between accuracy and cost (the ultimate dilemma of the programmer!)." + "`recursive_best_first_search` can be used to solve the 8 puzzle problem too, as discussed later." 
] }, { "cell_type": "code", - "execution_count": 7, + "execution_count": 38, "metadata": {}, "outputs": [ { - "name": "stdout", - "output_type": "stream", - "text": [ - "[['R', 'G', 'G', 'R'], ['R', 'G', 'R', 'R'], ['G', 'R', 'G', 'R'], ['R', 'G', 'R', 'G'], ['G', 'R', 'R', 'G'], ['G', 'R', 'G', 'R'], ['G', 'R', 'R', 'R'], ['R', 'G', 'G', 'G']]\n" - ] + "data": { + "text/plain": [ + "['UP', 'LEFT', 'UP', 'LEFT', 'DOWN', 'RIGHT', 'RIGHT', 'DOWN']" + ] + }, + "execution_count": 38, + "metadata": {}, + "output_type": "execute_result" } ], "source": [ - "population = init_population(8, ['R', 'G'], 4)\n", - "print(population)" + "puzzle = EightPuzzle((2, 4, 3, 1, 5, 6, 7, 8, 0))\n", + "assert puzzle.check_solvability((2, 4, 3, 1, 5, 6, 7, 8, 0))\n", + "recursive_best_first_search(puzzle).solution()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "We created and printed the population. You can see that the genes in the individuals are random and there are 8 individuals each with 4 genes.\n", "\n", - "Next we need to write our fitness function. We previously said we want the function to count how many edges are valid. So, given a coloring/individual `c`, we will do just that:" + "## A* HEURISTICS\n", "\n", + "Different heuristics give A* different efficiency, which is generally measured by the number of explored nodes as well as the effective branching factor. With the classic 8 puzzle we can show the efficiency of different heuristics through the number of explored nodes.\n", + "\n", + "### 8 Puzzle Problem\n", + "\n", + "The *8 Puzzle Problem* consists of a 3x3 board on which the goal is to reach the goal state from the initial configuration by sliding the numbered tiles into the blank space.\n", + "\n", + "Example:\n", + "\n", + " Initial State Goal State\n", + " | 7 | 2 | 4 | | 1 | 2 | 3 |\n", + " | 5 | 0 | 6 | | 4 | 5 | 6 |\n", + " | 8 | 3 | 1 | | 7 | 8 | 0 |\n", + " \n", + "The board has 9 positions (8 numbered tiles plus the blank), giving us a total of 9!
initial configurations, but not all of these are solvable. Solvability can be checked by counting the inversions in the configuration: if the number of inversions is even, the configuration is solvable; otherwise it is not. This means that only 9!/2 of the initial states lead to a solution.\n", + "
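As an aside, the inversion-count rule is easy to implement. The sketch below is a standalone illustration (in the repository this check is provided by `EightPuzzle.check_solvability`):

```python
# Standalone sketch of the inversion-parity solvability test; the notebook
# itself relies on EightPuzzle.check_solvability for this.
def is_solvable(state):
    """An 8-puzzle state is solvable iff its number of inversions is even.
    An inversion is a pair of numbered tiles (a, b) where a appears before b
    in the flattened state but a > b; the blank (0) is ignored."""
    tiles = [t for t in state if t != 0]
    inversions = sum(1
                     for i in range(len(tiles))
                     for j in range(i + 1, len(tiles))
                     if tiles[i] > tiles[j])
    return inversions % 2 == 0

print(is_solvable((2, 4, 3, 1, 5, 6, 7, 8, 0)))  # True: 4 inversions
print(is_solvable((1, 2, 3, 4, 5, 6, 8, 7, 0)))  # False: 1 inversion
```

Swapping any two numbered tiles flips the parity, which is why exactly half of the 9! arrangements are solvable.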
    \n", + "Let's define our goal state." ] }, { "cell_type": "code", - "execution_count": 8, - "metadata": { - "collapsed": true - }, + "execution_count": 39, + "metadata": {}, "outputs": [], "source": [ - "def fitness(c):\n", - " return sum(c[n1] != c[n2] for (n1, n2) in edges.values())" + "goal = [1, 2, 3, 4, 5, 6, 7, 8, 0]" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Great! Now we will run the genetic algorithm and see what solution it gives." + "#### Heuristics\n", + "\n", + "1) Manhattan Distance: For the 8 puzzle, the Manhattan distance of a tile is the sum of its horizontal and vertical displacements from its goal position (the tile numbered '1' in the initial configuration above has a Manhattan distance of 4: 2 to the left and 2 upward).\n", + "\n", + "2) No. of Misplaced Tiles: This heuristic counts the tiles that are not in their goal position.\n", + "\n", + "3) Sqrt of Manhattan Distance: This heuristic takes the square root of the Manhattan distance.\n", + "\n", + "4) Max Heuristic: This heuristic assigns the maximum of the Manhattan distance and the number of misplaced tiles."
] }, { "cell_type": "code", - "execution_count": 9, + "execution_count": 40, "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "['R', 'G', 'R', 'G']\n" - ] - } - ], + "outputs": [], "source": [ - "solution = genetic_algorithm(population, fitness, gene_pool=['R', 'G'])\n", - "print(solution)" + "# Heuristics for 8 Puzzle Problem\n", + "import math\n", + "\n", + "def linear(node):\n", + " return sum([1 if node.state[i] != goal[i] else 0 for i in range(8)])\n", + "\n", + "def manhattan(node):\n", + " state = node.state\n", + " index_goal = {0:[2,2], 1:[0,0], 2:[0,1], 3:[0,2], 4:[1,0], 5:[1,1], 6:[1,2], 7:[2,0], 8:[2,1]}\n", + " index_state = {}\n", + " index = [[0,0], [0,1], [0,2], [1,0], [1,1], [1,2], [2,0], [2,1], [2,2]]\n", + " x, y = 0, 0\n", + " \n", + " for i in range(len(state)):\n", + " index_state[state[i]] = index[i]\n", + " \n", + " mhd = 0\n", + " \n", + " for i in range(8):\n", + " for j in range(2):\n", + " mhd = abs(index_goal[i][j] - index_state[i][j]) + mhd\n", + " \n", + " return mhd\n", + "\n", + "def sqrt_manhattan(node):\n", + " state = node.state\n", + " index_goal = {0:[2,2], 1:[0,0], 2:[0,1], 3:[0,2], 4:[1,0], 5:[1,1], 6:[1,2], 7:[2,0], 8:[2,1]}\n", + " index_state = {}\n", + " index = [[0,0], [0,1], [0,2], [1,0], [1,1], [1,2], [2,0], [2,1], [2,2]]\n", + " x, y = 0, 0\n", + " \n", + " for i in range(len(state)):\n", + " index_state[state[i]] = index[i]\n", + " \n", + " mhd = 0\n", + " \n", + " for i in range(8):\n", + " for j in range(2):\n", + " mhd = (index_goal[i][j] - index_state[i][j])**2 + mhd\n", + " \n", + " return math.sqrt(mhd)\n", + "\n", + "def max_heuristic(node):\n", + " score1 = manhattan(node)\n", + " score2 = linear(node)\n", + " return max(score1, score2)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "The algorithm converged to a solution. Let's check its score:" + "We can solve the puzzle using the `astar_search` method." 
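As a quick cross-check of the Manhattan-distance figure quoted above, here is a standalone version of the heuristic (illustrative only; note that the notebook's `manhattan` iterates over tiles 0 to 7, so it counts the blank and skips tile 8, while this sketch follows the textbook convention of skipping the blank):

```python
# Compact, standalone Manhattan-distance heuristic for the 8 puzzle,
# following the textbook convention of not counting the blank tile (0).
def manhattan_distance(state, goal=(1, 2, 3, 4, 5, 6, 7, 8, 0)):
    goal_pos = {tile: divmod(i, 3) for i, tile in enumerate(goal)}
    total = 0
    for i, tile in enumerate(state):
        if tile == 0:                        # don't count the blank
            continue
        row, col = divmod(i, 3)              # position on the 3x3 board
        goal_row, goal_col = goal_pos[tile]
        total += abs(row - goal_row) + abs(col - goal_col)
    return total

# The example board from the text: tile 1 alone contributes 4 (2 left + 2 up).
print(manhattan_distance((7, 2, 4, 5, 0, 6, 8, 3, 1)))  # 14
```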
] }, { "cell_type": "code", - "execution_count": 10, + "execution_count": 41, "metadata": {}, "outputs": [ { - "name": "stdout", - "output_type": "stream", - "text": [ - "4\n" - ] - } - ], + "data": { + "text/plain": [ + "True" + ] + }, + "execution_count": 41, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ - "print(fitness(solution))" + "# Solving the puzzle \n", + "puzzle = EightPuzzle((2, 4, 3, 1, 5, 6, 7, 8, 0))\n", + "puzzle.check_solvability((2, 4, 3, 1, 5, 6, 7, 8, 0)) # checks whether the initialized configuration is solvable or not" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "The solution has a score of 4. Which means it is optimal, since we have exactly 4 edges in our graph, meaning all are valid!\n", - "\n", - "*NOTE: Because the algorithm is non-deterministic, there is a chance a different solution is given. It might even be wrong, if we are very unlucky!*" + "This case is solvable; let's proceed.\n", + "
    \n", + "The default heuristic function returns the number of misplaced tiles." + ] + }, + { + "cell_type": "code", + "execution_count": 42, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "['UP', 'LEFT', 'UP', 'LEFT', 'DOWN', 'RIGHT', 'RIGHT', 'DOWN']" + ] + }, + "execution_count": 42, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "astar_search(puzzle).solution()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "#### Eight Queens\n", - "\n", - "Let's take a look at a more complicated problem.\n", - "\n", - "In the *Eight Queens* problem, we are tasked with placing eight queens on an 8x8 chessboard without any queen threatening the others (aka queens should not be in the same row, column or diagonal). In its general form the problem is defined as placing *N* queens in an NxN chessboard without any conflicts.\n", - "\n", - "First we need to think about the representation of each solution. We can go the naive route of representing the whole chessboard with the queens' placements on it. That is definitely one way to go about it, but for the purpose of this tutorial we will do something different. We have eight queens, so we will have a gene for each of them. The gene pool will be numbers from 0 to 7, for the different columns. The *position* of the gene in the state will denote the row the particular queen is placed in.\n", - "\n", - "For example, we can have the state \"03304577\". Here the first gene with a value of 0 means \"the queen at row 0 is placed at column 0\", for the second gene \"the queen at row 1 is placed at column 3\" and so forth.\n", - "\n", - "We now need to think about the fitness function. On the graph coloring problem we counted the valid edges. The same thought process can be applied here. Instead of edges though, we have positioning between queens. If two queens are not threatening each other, we say they are at a \"non-attacking\" positioning. 
We can, therefore, count how many such positionings are there.\n", + "In the following cells, we use different heuristic functions.\n", + "
    " + ] + }, + { + "cell_type": "code", + "execution_count": 43, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "['UP', 'LEFT', 'UP', 'LEFT', 'DOWN', 'RIGHT', 'RIGHT', 'DOWN']" + ] + }, + "execution_count": 43, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "astar_search(puzzle, linear).solution()" + ] + }, + { + "cell_type": "code", + "execution_count": 44, + "metadata": { + "scrolled": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "['LEFT', 'UP', 'UP', 'LEFT', 'DOWN', 'RIGHT', 'DOWN', 'RIGHT']" + ] + }, + "execution_count": 44, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "astar_search(puzzle, manhattan).solution()" + ] + }, + { + "cell_type": "code", + "execution_count": 45, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "['LEFT', 'UP', 'UP', 'LEFT', 'DOWN', 'RIGHT', 'DOWN', 'RIGHT']" + ] + }, + "execution_count": 45, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "astar_search(puzzle, sqrt_manhattan).solution()" + ] + }, + { + "cell_type": "code", + "execution_count": 46, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "['LEFT', 'UP', 'UP', 'LEFT', 'DOWN', 'RIGHT', 'DOWN', 'RIGHT']" + ] + }, + "execution_count": 46, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "astar_search(puzzle, max_heuristic).solution()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "And here's how `recursive_best_first_search` can be used to solve this problem too." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 47, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "['LEFT', 'UP', 'UP', 'LEFT', 'DOWN', 'RIGHT', 'DOWN', 'UP', 'DOWN', 'RIGHT']" + ] + }, + "execution_count": 47, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "recursive_best_first_search(puzzle, manhattan).solution()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Even though all the heuristic functions give the same solution, the difference lies in the computation time.\n", + "
    \n", + "This might make all the difference in a scenario where high computational efficiency is required.\n", + "
    \n", + "Let's define a few puzzle states and time `astar_search` for every heuristic function.\n", + "We will use the %%timeit magic for this." + ] + }, + { + "cell_type": "code", + "execution_count": 48, + "metadata": {}, + "outputs": [], + "source": [ + "puzzle_1 = EightPuzzle((2, 4, 3, 1, 5, 6, 7, 8, 0))\n", + "puzzle_2 = EightPuzzle((1, 2, 3, 4, 5, 6, 0, 7, 8))\n", + "puzzle_3 = EightPuzzle((1, 2, 3, 4, 5, 7, 8, 6, 0))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The default heuristic function is the same as the `linear` heuristic function, but we'll still check both." + ] + }, + { + "cell_type": "code", + "execution_count": 49, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "3.24 ms ± 190 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)\n" + ] + } + ], + "source": [ + "%%timeit\n", + "astar_search(puzzle_1)\n", + "astar_search(puzzle_2)\n", + "astar_search(puzzle_3)" + ] + }, + { + "cell_type": "code", + "execution_count": 50, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "3.68 ms ± 368 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)\n" + ] + } + ], + "source": [ + "%%timeit\n", + "astar_search(puzzle_1, linear)\n", + "astar_search(puzzle_2, linear)\n", + "astar_search(puzzle_3, linear)" + ] + }, + { + "cell_type": "code", + "execution_count": 51, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "3.12 ms ± 88.7 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)\n" + ] + } + ], + "source": [ + "%%timeit\n", + "astar_search(puzzle_1, manhattan)\n", + "astar_search(puzzle_2, manhattan)\n", + "astar_search(puzzle_3, manhattan)" + ] + }, + { + "cell_type": "code", + "execution_count": 52, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "22.7 ms ± 1.69 ms per loop (mean ± std. dev. 
of 7 runs, 10 loops each)\n" + ] + } + ], + "source": [ + "%%timeit\n", + "astar_search(puzzle_1, sqrt_manhattan)\n", + "astar_search(puzzle_2, sqrt_manhattan)\n", + "astar_search(puzzle_3, sqrt_manhattan)" + ] + }, + { + "cell_type": "code", + "execution_count": 53, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "3.91 ms ± 434 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)\n" + ] + } + ], + "source": [ + "%%timeit\n", + "astar_search(puzzle_1, max_heuristic)\n", + "astar_search(puzzle_2, max_heuristic)\n", + "astar_search(puzzle_3, max_heuristic)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can infer that the `manhattan` heuristic function works the fastest.\n", + "
    \n", + "`sqrt_manhattan` is quite a lot slower than the others: besides the extra `sqrt` computation, taking the square root shrinks the heuristic values, making the heuristic less informed, so A* expands many more nodes.\n", + "
    \n", + "`max_heuristic` is only a bit slower even though it computes two heuristics per node; being at least as informed as either one alone, it also tends to expand fewer nodes, which offsets part of the extra work.\n", + "Feel free to play around with these functions." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "For comparison, this is how RBFS performs on this problem." + ] + }, + { + "cell_type": "code", + "execution_count": 54, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "140 ms ± 9.89 ms per loop (mean ± std. dev. of 7 runs, 10 loops each)\n" + ] + } + ], + "source": [ + "%%timeit\n", + "recursive_best_first_search(puzzle_1, linear)\n", + "recursive_best_first_search(puzzle_2, linear)\n", + "recursive_best_first_search(puzzle_3, linear)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "It is quite a lot slower than `astar_search` as we can see." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## HILL CLIMBING\n", "\n", - "Let's dive right in and initialize our population:" + "Hill Climbing is a heuristic search used for optimization problems.\n", + "Given a large set of inputs and a good heuristic function, it tries to find a sufficiently good solution to the problem. \n", + "This solution may or may not be the global optimum.\n", + "The algorithm is a variant of the generate-and-test algorithm. \n", + "
    \n", + "As a whole, the algorithm works as follows:\n", + "- Evaluate the initial state.\n", + "- If it is equal to the goal state, return.\n", + "- Find a neighboring state (one which is heuristically similar to the current state).\n", + "- Evaluate this state. If it is closer to the goal state than before, replace the current state with it and repeat these steps.\n", + "
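The loop described by these steps can be sketched in a few lines (a toy, standalone example, not the repository's `hill_climbing`): maximize a one-dimensional function where each integer's neighbors are the integers on either side.

```python
# Toy hill climbing: maximize f(x) = -(x - 3)**2 over the integers,
# where the neighbors of x are x - 1 and x + 1.
def f(x):
    return -(x - 3) ** 2

def hill_climb(x, value, neighbors):
    while True:
        best = max(neighbors(x), key=value)  # pick the best neighbor
        if value(best) <= value(x):          # no neighbor improves: stop
            return x
        x = best                             # move uphill and repeat

peak = hill_climb(0, f, lambda x: [x - 1, x + 1])
print(peak)  # 3
```

Because this toy landscape has a single peak, the climb always reaches the global maximum; on multi-peaked landscapes the same loop stops at whichever local optimum it climbs first.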
    " + ] + }, + { + "cell_type": "code", + "execution_count": 55, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def hill_climbing(problem):\n",
    +       "    """From the initial node, keep choosing the neighbor with highest value,\n",
    +       "    stopping when no neighbor is better. [Figure 4.2]"""\n",
    +       "    current = Node(problem.initial)\n",
    +       "    while True:\n",
    +       "        neighbors = current.expand(problem)\n",
    +       "        if not neighbors:\n",
    +       "            break\n",
    +       "        neighbor = argmax_random_tie(neighbors,\n",
    +       "                                     key=lambda node: problem.value(node.state))\n",
    +       "        if problem.value(neighbor.state) <= problem.value(current.state):\n",
    +       "            break\n",
    +       "        current = neighbor\n",
    +       "    return current.state\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(hill_climbing)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We will find an approximate solution to the traveling salespersons problem using this algorithm.\n", + "
    \n", + "We need to define a class for this problem.\n", + "
    \n", + "`Problem` will be used as a base class." + ] + }, + { + "cell_type": "code", + "execution_count": 56, + "metadata": {}, + "outputs": [], + "source": [ + "class TSP_problem(Problem):\n", + "\n", + " \"\"\" subclass of Problem to define various functions \"\"\"\n", + "\n", + " def two_opt(self, state):\n", + " \"\"\" Neighbour generating function for Traveling Salesman Problem \"\"\"\n", + " neighbour_state = state[:]\n", + " left = random.randint(0, len(neighbour_state) - 1)\n", + " right = random.randint(0, len(neighbour_state) - 1)\n", + " if left > right:\n", + " left, right = right, left\n", + " neighbour_state[left: right + 1] = reversed(neighbour_state[left: right + 1])\n", + " return neighbour_state\n", + "\n", + " def actions(self, state):\n", + " \"\"\" action that can be excuted in given state \"\"\"\n", + " return [self.two_opt]\n", + "\n", + " def result(self, state, action):\n", + " \"\"\" result after applying the given action on the given state \"\"\"\n", + " return action(state)\n", + "\n", + " def path_cost(self, c, state1, action, state2):\n", + " \"\"\" total distance for the Traveling Salesman to be covered if in state2 \"\"\"\n", + " cost = 0\n", + " for i in range(len(state2) - 1):\n", + " cost += distances[state2[i]][state2[i + 1]]\n", + " cost += distances[state2[0]][state2[-1]]\n", + " return cost\n", + "\n", + " def value(self, state):\n", + " \"\"\" value of path cost given negative for the given state \"\"\"\n", + " return -1 * self.path_cost(None, None, None, state)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We will use cities from the Romania map as our cities for this problem.\n", + "
    \n", + "A list of all cities and a dictionary storing distances between them will be populated." + ] + }, + { + "cell_type": "code", + "execution_count": 57, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "['Arad', 'Bucharest', 'Craiova', 'Drobeta', 'Eforie', 'Fagaras', 'Giurgiu', 'Hirsova', 'Iasi', 'Lugoj', 'Mehadia', 'Neamt', 'Oradea', 'Pitesti', 'Rimnicu', 'Sibiu', 'Timisoara', 'Urziceni', 'Vaslui', 'Zerind']\n" + ] + } + ], + "source": [ + "distances = {}\n", + "all_cities = []\n", + "\n", + "for city in romania_map.locations.keys():\n", + " distances[city] = {}\n", + " all_cities.append(city)\n", + " \n", + "all_cities.sort()\n", + "print(all_cities)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, we need to populate the individual lists inside the dictionary with the euclidean (straight-line) distance between the cities' coordinates." + ] + }, + { + "cell_type": "code", + "execution_count": 58, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "for name_1, coordinates_1 in romania_map.locations.items():\n", + " for name_2, coordinates_2 in romania_map.locations.items():\n", + " distances[name_1][name_2] = np.linalg.norm(\n", + " [coordinates_1[0] - coordinates_2[0], coordinates_1[1] - coordinates_2[1]])\n", + " distances[name_2][name_1] = np.linalg.norm(\n", + " [coordinates_1[0] - coordinates_2[0], coordinates_1[1] - coordinates_2[1]])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The way neighbours are chosen currently isn't suitable for the travelling salespersons problem.\n", + "We need a neighboring state that is similar in total path distance to the current state.\n", + "
    \n", + "We need to change the function that finds neighbors." + ] + }, + { + "cell_type": "code", + "execution_count": 59, + "metadata": {}, + "outputs": [], + "source": [ + "def hill_climbing(problem):\n", + " \n", + " \"\"\"From the initial node, keep choosing the neighbor with highest value,\n", + " stopping when no neighbor is better. [Figure 4.2]\"\"\"\n", + " \n", + " def find_neighbors(state, number_of_neighbors=100):\n", + " \"\"\" finds neighbors using two_opt method \"\"\"\n", + " \n", + " neighbors = []\n", + " \n", + " for i in range(number_of_neighbors):\n", + " new_state = problem.two_opt(state)\n", + " neighbors.append(Node(new_state))\n", + " state = new_state\n", + " \n", + " return neighbors\n", + "\n", + " # as this is a stochastic algorithm, we will set a cap on the number of iterations\n", + " iterations = 10000\n", + " \n", + " current = Node(problem.initial)\n", + " while iterations:\n", + " neighbors = find_neighbors(current.state)\n", + " if not neighbors:\n", + " break\n", + " neighbor = argmax_random_tie(neighbors,\n", + " key=lambda node: problem.value(node.state))\n", + " if problem.value(neighbor.state) <= problem.value(current.state):\n", + " \"\"\"Note that it is based on negative path cost method\"\"\"\n", + " current.state = neighbor.state\n", + " iterations -= 1\n", + " \n", + " return current.state" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "An instance of the TSP_problem class will be created." + ] + }, + { + "cell_type": "code", + "execution_count": 60, + "metadata": {}, + "outputs": [], + "source": [ + "tsp = TSP_problem(all_cities)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can now generate an approximate solution to the problem by calling `hill_climbing`.\n", + "The results will vary a bit each time you run it." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 50, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "['Arad',\n", + " 'Timisoara',\n", + " 'Lugoj',\n", + " 'Mehadia',\n", + " 'Drobeta',\n", + " 'Craiova',\n", + " 'Pitesti',\n", + " 'Giurgiu',\n", + " 'Bucharest',\n", + " 'Urziceni',\n", + " 'Eforie',\n", + " 'Hirsova',\n", + " 'Vaslui',\n", + " 'Iasi',\n", + " 'Neamt',\n", + " 'Fagaras',\n", + " 'Rimnicu',\n", + " 'Sibiu',\n", + " 'Oradea',\n", + " 'Zerind']" + ] + }, + "execution_count": 50, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "hill_climbing(tsp)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The solution looks like this.\n", + "It is not difficult to see why this might be a good solution.\n", + "
    \n", + "![title](images/hillclimb-tsp.png)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## SIMULATED ANNEALING\n", + "\n", + "The intuition behind Hill Climbing was developed from the metaphor of climbing up the graph of a function to find its peak. \n", + "However, there is a fundamental problem with the algorithm itself.\n", + "To find the highest hill, we take one step at a time, always uphill, hoping to find the highest point, \n", + "but if we are unlucky enough to start from the shoulder of the second-highest hill, there is no way we can find the highest one. \n", + "The algorithm always converges to a local optimum.\n", + "Hill Climbing is also bad at dealing with functions that plateau in certain regions.\n", + "If all neighboring states have the same value, we cannot find the global optimum using this algorithm.\n", + "
    \n", + "
    \n", + "Let's now look at an algorithm that can deal with these situations.\n", + "
    \n", + "Simulated Annealing is quite similar to Hill Climbing, \n", + "but instead of picking the _best_ move every iteration, it picks a _random_ move. \n", + "If this random move improves the current state, it is always accepted, \n", + "but if it doesn't, the algorithm may accept or reject the move based on a probability dictated by the _temperature_. \n", + "When the `temperature` is high, the algorithm is more likely to accept a random move even if it is bad.\n", + "At low temperatures, only good moves are accepted, with the occasional exception.\n", + "This allows exploration of the state space and prevents the algorithm from getting stuck at a local optimum.\n" + ] + }, + { + "cell_type": "code", + "execution_count": 62, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def simulated_annealing(problem, schedule=exp_schedule()):\n",
    +       "    """[Figure 4.5] CAUTION: This differs from the pseudocode as it\n",
    +       "    returns a state instead of a Node."""\n",
    +       "    current = Node(problem.initial)\n",
    +       "    for t in range(sys.maxsize):\n",
    +       "        T = schedule(t)\n",
    +       "        if T == 0:\n",
    +       "            return current.state\n",
    +       "        neighbors = current.expand(problem)\n",
    +       "        if not neighbors:\n",
    +       "            return current.state\n",
    +       "        next_choice = random.choice(neighbors)\n",
    +       "        delta_e = problem.value(next_choice.state) - problem.value(current.state)\n",
    +       "        if delta_e > 0 or probability(math.exp(delta_e / T)):\n",
    +       "            current = next_choice\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(simulated_annealing)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The temperature is gradually decreased as the iterations proceed.\n", + "This is done by a scheduling routine.\n", + "The current implementation uses exponential decay of the temperature, but we can use a different scheduling routine instead.\n" + ] + }, + { + "cell_type": "code", + "execution_count": 63, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def exp_schedule(k=20, lam=0.005, limit=100):\n",
    +       "    """One possible schedule function for simulated annealing"""\n",
    +       "    return lambda t: (k * math.exp(-lam * t) if t < limit else 0)\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(exp_schedule)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, we'll define a peak-finding problem and try to solve it using Simulated Annealing.\n", + "Let's define the grid and the initial state first.\n" + ] + }, + { + "cell_type": "code", + "execution_count": 64, + "metadata": {}, + "outputs": [], + "source": [ + "initial = (0, 0)\n", + "grid = [[3, 7, 2, 8], [5, 2, 9, 1], [5, 3, 3, 1]]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We want to allow only four directions, namely `N`, `S`, `E` and `W`.\n", + "Let's use the predefined `directions4` dictionary." + ] + }, + { + "cell_type": "code", + "execution_count": 65, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{'E': (1, 0), 'N': (0, 1), 'S': (0, -1), 'W': (-1, 0)}" + ] + }, + "execution_count": 65, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "directions4" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Define a problem with these parameters." + ] + }, + { + "cell_type": "code", + "execution_count": 66, + "metadata": {}, + "outputs": [], + "source": [ + "problem = PeakFindingProblem(initial, grid, directions4)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We'll run `simulated_annealing` a few times and store the solutions in a set." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 67, + "metadata": {}, + "outputs": [], + "source": [ + "solutions = {problem.value(simulated_annealing(problem)) for i in range(100)}" + ] + }, + { + "cell_type": "code", + "execution_count": 68, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "9" + ] + }, + "execution_count": 68, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "max(solutions)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Hence, the maximum value is 9." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's find the peak of a two-dimensional Gaussian distribution.\n", + "We'll use the `gaussian_kernel` function from notebook.py to get the distribution." + ] + }, + { + "cell_type": "code", + "execution_count": 69, + "metadata": {}, + "outputs": [], + "source": [ + "grid = gaussian_kernel()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's use the `heatmap` function from notebook.py to plot this."
+ ] + }, + { + "cell_type": "code", + "execution_count": 70, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAeAAAAHwCAYAAAB+ArwOAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvNQv5yAAAIABJREFUeJzsvW3ofW2b13Wse+//dd3JqFP4ImYaNUmxEHqazIhqyKIaCpUgSwos4oYsdMIeyBcW9CYIhMAI7hypIErC6IGoQAhMCPMBfWETIY4x04hmMYyS13Vfe/93L/Y+9z7Wsb7H0/mw1vr9fuuA//+31vlwnOd+Wp/1Pc6HNd1uNzrssMMOO+yww9a1b23dgcMOO+ywww77iHYA+LDDDjvssMM2sAPAhx122GGHHbaBHQA+7LDDDjvssA3sAPBhhx122GGHbWAHgA877LDDDjtsAzsAfNhhhx122GEb2AHgww5byaZp+rPTNP0DIu03T9P0hzr4vk3T9De0+jnssMPWswPAhx122GGHHbaBHQA+7LCd2DRNPzBN0++fpun/nqbpJ6dp+q0s71dP0/S/TNP0s9M0/blpmn73NE1fPPL+4KPYn5ym6S9P0/Qbp2n6kWmafnqapn9tmqa/8Kjz66dp+tFpmv6PaZr+32mafkfE/yP/Nk3Tb52m6c9M0/QXp2n6d6dpOq4fhx3WYMcP6LDDdmAPmP23RPQniegHiejXEtGPTdP0Dz2KXInoXyaiX0REf9cj/7cQEd1ut7/3UeZvvt1u33e73X7f4/yvJaJvP/z9TiL6D4nonyaiv52I/h4i+p3TNP0yzz+z30BEP0xEfxsR/Toi+ud6vPbDDvuoNh17QR922Do2TdOfpTvgLiz5CyL640T024nov7jdbr+Ylf83iOhX3G63fxb4+jEi+vtut9tveJzfiOiX3263P/04/xEi+u+J6Ptut9t1mqafT0Q/R0S/5na7/eFHmT9GRP/27Xb7r4L+/5Hb7fY/PM5/CxH947fb7dc2vCWHHfah7bx1Bw477IPZr7/dbn+gnEzT9JuJ6J8nol9CRD8wTdPPsrInIvqfH+V+BRH9Lror0J9H99/uH3Pa+n9ut9v1cfxXHn//PMv/K0T0fQn/P8WO/08i+gGn/cMOO8ywIwR92GH7sJ8iop+83W7fz/79/Nvt9qOP/P+AiP53uqvcX0BEv4OIpo7tR/z/EDv+xUT0Mx3bP+ywD2cHgA87bB/2vxLRz03T9K9P0/RXTdN0mqbpV03T9Hc88ksI+S9P0/QriehfEPX/PBH9Mqo3zz8R0b86TdNfPU3TDxHRbyOi3wfKHHbYYUE7AHzYYTuwR6j4HyOiv4WIfpKI/iIR/R4i+oWPIv8KEf0mIvpLdJ9MJeH3bxHRf/yYxfxPVHTB809E9F/TPSz9J4jovyOiH69o57DDDnvYMQnrsMMOc01O8jrssMPa7VDAhx122GGHHbaBHQA+7LDDDjvssA3sCEEfdthhhx122AZ2KODDDjvssMMO28CGbMQxTT/vRvT9I1wfNrPoMtCjXFu5FhvVRovfkVGv0RG1jP9I2ai/3uWiZbdq97A2+1m63f4/90c6aCes7yei74xxfRizT8Fy0Y854i/zlXkv/cv6rW1jTZ/fdPDRy+/FL1LVRtRvxF9PXxmfvV9rxudh9fbdUKkjBH3YB7URYBzdxqeOPnv6arW3sCNuz5vEjPW+iT1sT3YA+N3be7lwbAmLmotbz/6OhGVv37W+RgDkgNJh+7YDwG/W9qJe9mQjYL4lfNdUqZ+oX3uj+/1ebhbfws3nYSPtAPBh9D4uQHuAbw/w9ARhax96+MnY3ucX7N0+0mt9H3
YA+E3anseFtlInvX29h/eu1T4ahHv62vNv9LC92PHpf3jb6kIR8dezb6OVb63tDbrSZP9qZtAWH9G6Z4rP6v2U8Lumr97+evftsD3YoYDfnH2Uu/63AN/aUO0ewsy11tLvTN0z9R+i6PVd2eJmlGi7oZjDRtkB4A9tW4R3e14Ee4M8c/HKvndvGbrIWl5P9oZojzdsW/Qrakdg863Y8Um9KdviIhOxPfoaceEbDRvPRvxcs5tgIOOvMRomHRGWjvqM+or07SP4OmyUHQB+d9Y7JNvD1x7hOxK8Pd7ftX6aWju1YM7COFO+9LUHiCO+PgI4DwhvaQeA34ytPTbl+dqjGt9yacqeNqDoYahfWSjXwrgniFvVcG9V3dNXDz+HbWnHGPCbsL0q1h5+1la9UV/R97x2HDQz5rwXa+nziJuZPX6/IvYWf4OHjbC3dgU4TLW1LiJrXhz3eGHMlKvpQ9ayfekRbqxVxxlF3FMN9wxJ9/BTfO1NnR+h6LXtAPDubU938B+1L5lymbZ7ttfTb82FmL/enjCOlusJ0LXCv2v6idgB4bXtAPCubS3IrOWjh581wTtq/XBtG2uZ1qfMjGVu0bFRr421lOzaqnoNCEfHgw8Ir2kHgHdre4HvXvrRw0dvtbvmjlkZy6rRqKH+R2f2couAzfIdKbM3EO9BmfcMjx/Www4A79L2EqLdAzR7+OipdkevG65tq6ePDLTla8wCuReMPYhabfUC8V6U+TE7+q3YAeDd2VozJN8CfNfow5ZLl2r8r2EtS5CyS496qbJeSvStKNke65db+nBYD9vTr/6wXSjfPYSsR/dhi5nTWb97sxooZ9RxRqlavlpV5Bph6V6KemT9iI/DWu2tXg3eob0FcO25/lrQXWNiVrad3lY72YooHlK22ukBYy/fa2MPID4g/N7tAPAubGv4fmTw9oTuXidlZc3qV3RdLrcIRDXfPcZwW0AYBfkoEO5F0R8QHmEHgDe30fDdqu6Wbe8hjJ7117O9GqsZ4+UWDS9bbfVSrVr9VphaMIvWHdF2qb9lSPuwGjsAvKntWTlupZhHqd29LLeq9TvavD5kx3uJYiFmzbenjnvBeBSIvfojwtIHhN+a7eGX/0Ftr/DdAtqj1O4eZllnfPVqK2s1a3m5tYSYpe8aBdkyntsC0x6h7RqYjgK4126k/mEZOwC8ibVeSLcIOe+tzb2GzjN+Wnz3tJqwMree63pbx3xrodcKxJa6veuNrBupf1jUDgCvbnsM374H8G451h3xkfUXsZqfb8smG9yi4WWtzR4TsGpVsVVvlNKuvTHYQkV7dSP1D4vYAeBV7S3Bd+0w9Qi1u0WoPeOn1m+LZduIhpeLeSDV/GbGfL36Wt0R4em9g3gUwEv9A8ItdgB4NdvbmG1NvTXbWrPvLe1F6md99WgnYiPGfqOTsTJART5qFO5IVbwmUNcMZ7eOKR9m2QHgVWxP8N0DeHuHmNe8YfDqRn3U+OxtmTajypbIhyny10PhWvV61BnhzwPxHuAdqau1eZhlB4CH24gL/VpAXAt6W/e5pZ5XN+OnxXcPy+xixc2DKfLdEnK26mogrAF4RBVnAFmjpPcC71L3CEn3tAPAQ61F1dTUXQOkWyverW8WInWjPrL+RlvNUiOiPrObPR814KiFWqaNEYDspWxrQ9KW1dY7DNkefvXv1EaEndcA6Rrg3ePrs+p49by6UR+1fntaZnZzMUudIr+eQo4qXKteiyquCSe3KOLsuPIa4NbqWPWs/h2G7ADwEBsxE3h0nTVAvdVNwl5memf9jPDhXRw9f7Vjwd44cA1YZb01YayV7xlm3hLcVp1IPdS/w6QdAO5uIy7qI4EzGmZ7gu6an02kbkvZWmsFdstYcGYcWKvbc/x3NBBHg3VLcJd6x5hwix0A7mq9L/B7gtRI8O7pRqK2jlUvmp9tb4RFZjBz09QfKtMr9Nxbufb2vQWIe4wN9wxje20dRnQAuKPtEb5rg20vr2Xr8e
JIvue/1bdnNWO9xbJjvlb4uWa2c0Qd1wIzC1cExWx4ugbEa0O7to5V77ADwM229XjvHsqOAv3eAe3lWT6zfnpbSyi6x5hvFq6yjgfNPZTl5SOqOAPiHmDtPZas1bHqfWw7ADzUat7eNYHaA1p77Ve0bG/oRj7zDPzW/Ilm1gG37m4V3W2qx7hujXptVZ2ZceJI/UzZTL80v5bVwvSAsLQDwE22Rth5axi1Kt61+jRyvHjUmPDeQtCZ8LPVh8iYr6WQR43r1ijdNULZa6rhHmW18qXOEY6O2gHgKusddt5a9baAd8v+rKnqNR9Wea9exkfUvFBgbXsWVItF4JqB8jdKGvJjQbYGxtlytWVbQDxCpfeazEVKHavex7MQgKdp+oeJ6N8johMR/Z7b7fbvDO3Vrm1v8N0TeHsDei049wpNt0zaivjIWA8/kclWRO2KNwLTUj4C5B4wbinHy44E8dbA1vx6dbx6H8fcX+k0TSci+veJ6B8kop8moj8yTdN/c7vd/rfRndufvYWQc225NcA7ur9bh9Ct8l49aVOw3Ai7Pf56ffXC0d8oPrw1vRl1uyZkewF2KxCvGb7mdQ4Iaxa5GvxqIvrTt9vtzxARTdP0nxPRryOiDwbg2hDhiIv9aODtBby9yrT0wSprlbfqEG0LWM+8vnmA7gFYrfxoGFvA2grsGjhbVG7vkPQB4RqLAPgHiein2PlPE9HfKQtN0/QdIvrO/ewXdujaW7IeYee9qMi1Q8Mj4dzSh0wbVvliewZu1tBrubHjFsBq5WthHFGFPRVmDdiivmrrjYAwsgPCWYsA2Pu13RNut+8S0XeJiKbpBxb5b9tqLrp7CiX3Ur2jYVnTXgS6a4xXF3tPoM2YdZkYpXgj5SL1sr6zk7ZqoV6rhrdSzFpZr45X7/1aBMA/TUQ/xM7/OiL6mTHd2ZvVjueNuLiPAtZIEI4KM4+GbrTcR4Vt1DQoZ4CMynrlamCcAeiIEHYWqDVquFUxo35Kf1pZy7fVl/dtEQD/ESL65dM0/fVE9H8R0T9JRL9paK92YbWhxjXguxa01oRzzWscOWa8E+BGfqE9zVu91GzyPURALh2pBW0NjCNw7FWGl/P8WCDupYazcG1VwweEi7k/79vtdpmm6V8iov+R7suQfu/tdvtTw3u2qe0FvnsG16gwc1axj1Trg4EbhetaEL4E2uoO6IhKbgFtBIge/DKqOBtS9mAYUa1bqOEDwj1sut36D9fex4C/093vOjYaviNDziNUb482e/e7VwRAlhkAXA9oe4BwBqpW2SHqGV2fZEPyYo064pWR+dnyNf0a4WPEexH1o5XTylrlrTpvwb5Lt9vPuBeUtQNc79S8sWLL9qIca3z2UMW9FW9NnzqDV/tV9YJxbT3tWmfVl3W04Vsvr9r4ZyMndWXDwBl1GlXFnprtoYgz+V75Gp9RP1o5yz62Ej4APLMa9avVGTVJqQbOa6vetcPM2fyOwEUf4R4BHAkv9zStvSYoSxhL0Fpjxt5s514gtvqU9VGbXxPGHg1hC6YfF8IHgJ/WC749w857BtlotbtD6PaA7QgIR+pkhckIQ1Cu7pP8PL2xXg+MXn62PO9TC2hHquEeCh35yZSzykfqvW07ALyLMd/RIec1wdvzJmFj6EaBu6UK3vIXbKnc2n51AbLVeAbGvVVxTf5oNRyBZ01Y2ypHoKxW3vL/9u2DA3iP8PX89FS9ewFvz340QFe62juAa2C9lQrW1K/3Gqr6q4WqM8q3RRX3BHEvtWxBtMdYNCqjldPKWuWtOm/XPjCA14bv2iHnUXBtAfqodgZCtyeUrXQvr6UsqtcDxFsAPd2epow95ZsBNVEM3DUg7gnp3iFu9GU6IJyxDwrgtw7fXmHcEXle/zaGbitwR6jiaH60TMSi8FwTslIRR85TFoWx7FQG1DVQj6jabJ70q6nhVtUu+xItg/rC7WNA+IMCuMZaws5emZqwdKTs1nkj2hgA3hrI7kUJR8uja1
k0/IvKaXnRUHMNbLXz6jC1taypVnFaoNOUopdX276EtBWSbrXIndoWIZN92wcEcI36XXPMtzak20NZ9gDmiDBzB+j2Vr61QLbSvbyacqh89Bq41+ulNXac6m/NWHGtIq4db65Rw9Gx3p5jzLVltHKoT7IO79/btA8G4PcC3x6gXBO8NX0iqgJvFLpbAbgVvj1+sU3KkfnYI5ylhfsYCU97YWMrLwJtD7ZRNZyBZ2bctzYcfcyO1uwDAViDbya8q5VvhW8GzCMB2wrejdRuDXRbABsBa2/4jvilRkBslUHXxWjaGlYdpi7fQUsVy4YyalaDag3oe8A7q3Zb81EZq6xV3qqzf/sgAI6My0bqRODrgVeWqR1T7QnYKBwj4K250egI3R4AXlv9tqriVouCOArhNS2ylAnVCVmNKs4o3YiC9VRvzbh1NCR9QHi0fQAArxl2zqheL78Gvj0BOwq8ndRuDWjXAnArkL08yyQnaupq9aKwjZSTZXqAfBUYt6pimY7UJy+XAXR0bNhTwzWTvGryURmrrFXeqrNfe+cA7gXflnJae3uCb2u4uaYfSfi2AHbPELbSvTzPamGk1es1w1mmyXYzeZk6XnrINBAT5dWpB+IsVDPpVl60nMyryUdlPo69YwD3hK8HT69MLXx7hpZHq94B4B0N3VoA18B2tAK2ymdmPVvl0XUykuZdX0defzVVbKWHLKOItdB0RIG2wFZri4w6e4Lw+1fB7xTAa4adZblM2NnKy4Z6W9JHg7ez2l0bwHtRw5kyXvm1J15lICvV6lbiKNSup4iRQ5SuqWFNMUfSS1u1Y8l7gTAFy3p19mfvEMAfBb490rU+bADeHiB9CzDW0qx0L6/GuL/smG8Uwl77vceAW80LaZuW3WWLj/FqMK0Bce9QdVQxy7yafFTGK0tK+bdh7wzAW8LXy4sq2pGQzaher552s9FB7W4N4FEwzqRF8nqZBWPtGofSPajWQjdzI5A1b+JWGsREfnhaLlWSaR6ILYBG4ZxJ19oq6aTkyXqRfFSm1vYfjn5HAF4Tvl6ZteHbo6wHWa+NSvD2BOwIAG+hhL28SL5lNeFnK70XdHneSOj2sKbwNAJqRO16ijkD55p0DcKZvEg+KqOV08p6dfZh7wTAveBbW64Vvi3jtBmlisrW1OFpg8A7AsCjlHAmz0qz0r28jFmKV5bpMd5rnfcCasYPUr1a+Nmr3wXEBNI0oHoKtxbO0fQDwj3tHQA4CtKIWfDMlEF50bfaK9dTDa8I317A3KNKrjnPpGXyM5YBTcZP5rz3caSs1q9W/65JEKOJUtbkKQ+EHkDXgDC37N1VFJTvB8JvHMAefLWXlx3P1cpYoemoKm4N/fL0DJCzdRrB2wu2o6EdSbeOa861tEheiyH4oPyI4uXl0HlEAWePkWWv+aWf2RuRZhDXqmEUkiaR70GYgmWt9EiZTB4ZZShQzvLp1dnO3jCAR8PXG/ftCd8MZHl6TVqLn5XBuyWUa457nHvp2TLIasLOMl1Law01ZwBaA9uoRRWzlu/2ywpLR8PL2gSt7Ljw3iGcKaeV9epsY28UwDVjvlq9teAr69TAtwXILeAlCsG3Fwh7ALgVujXAzeRl0iJ5GeN+0HUqC+IaCEfgXANwrX6t9VDHpiEQF7NmS2tqOApuEn7eAoSRvW0Iv0EA1475rg1fq04LUL36KM3y7fkYDN63BGWtnlU+cq6lWenZcp7qReU0UGegK89bgKz5j9TpabXq2DQ0PmzBEkE1qoZbVbNMj0KYWwTC0rQybxfCbwzAbyXsHOnL2vC1yjeEm0cAckSel5bJjx57eZm0SJ5lqF6L+o2cR+bmeODN5O/B0NhwWhFbajgzNoxg2JJGRnpWRcu8SD4q49m+IfyGANwTvi3lvPZknjb+itK8q3EWpgPhuxZUe/pqhW0rgCPnWlomf5TVhGO9ctqxVdZLs/ojy1u+vDxKlJdpphUI8waisCQjzyrfA8KeP5mO6tdCuB
am20L4jQC4N3wjL1uWsUAZGfcdqXIj8I2EoRvBuzZIe/qO1o0eZ/KsNCu91bjf6HgvSouo4Oyx5l+DWDRNWgiIScvcDKimhaSlAwnLrBL2Qs8ZCEdDzt6b3hvCXnvbQfgNALgWvtHykfD03uBr+YuAtlH17gWuvfx5aZn8bB4619Iy+Z5pwJV5HnS9857gRTYCoBmzVH9EgZc01crvkhfWIEqgjDVmHAWuBmFkUdBmfEbLWOX2CeGdAzgbFvbqRl5uBNAobwR8o3Vrxnt3Dt6asiP65aVZx5k8K81KrzUNuDzPAnHkvOa4B3gjUO4N7giIPSCbxseGJUQJpCEIk1K/FsKobSIdwla4mStxmUesTGTiVguEeR/G244BHIGv1v0ofGU5D74aZDPwRf6yoEXteaCtUL29gZnxs1adXmnWcSYvku7leWapX56PID0KvLI/Wlo2T1oNeC2AamVr/blqGIWkEXRlGQqUpYo0Av60fJku86RtAWGrbn/bKYBbws57h2+ryl0ZvqNhl60zCtAt5aPHNedeeo15F30JVZS2NniRZYBbA17NeoFYlg31s2dIOqqkyUkj4E/Ll+nSMoDOlMmWXwfCOwNwi+rV6kfg6+VrkNXqZMLOkXIWUKPwrQQvShsJ3DXbyJbJ5FvHkXMtLZLnWasCjo73lnoehFFZrdxIpTvCMuoZ1VXNCklr0PQgTIo/AmkWTLMQlh+WB2HUJvrALYjuA8I7AnDLeG9r/dq3wavn5SNVK+taZTrDVzY9AqQ9fY3oQzav5jhyrqVF8jxbC041KtDKa1WV8oYik59pw/pLlXWgRSAsLQJhWRYZasMDs9YX5K8Gwl4bNTYWwjsBcBSe2e5a0NLK9Aw9W/VqJkzxtMHK9z2Atkefo+VrjiPnXrpWNqIQi0VD0Fb4uZxr5RCAon0OAylRp8YntxpFK+sOhzBqmMiHMAIpArWVhtqR+dJa7wy1+i3jwaU+KT7abGMAZ1Sr1dU9j/v2hm8mRN0AXw1AHqDWArBsZ0S/onmZ/Oi5lmalZ8pIuKI8C7TauQwpy/RMmgXqLLCy13atfAt4W03tvxwXLqaNC0fC0b0gTMKHzLeUbkQFk1MGtc8t88Xor4Y3+iplw8VZ+EbK9YCvVj8CX1Q3Cl8N3MnJViMBOervKN+1Zbw06zhy7qVH8jUgaWVqwRs5roVwFqBZq/XvhbJluWj9ISHp7JhwDwi/l0lZvD9EvUC8MoBrxmmz8EXlewIf+Y1OsEJpmbCzln6mZb0OIWcJji1A27PtDGwjZb206HHk3Eu3TKujqWAJ2pJmgdg6zkIY9b8GUKOhbZkGZA/UUd+qbQVhEmW0tD1PyiJQR7M+IB4M4NaJVVn41vq1ABoJPVv5EqwozTsvx5qvBvjuBY577lc2L3ocOdfSMvnFUDgX5aNrkQbYkteieknJqwWoBuJWv6PMg7QFc2gjISwtAmqeJo+RLwJteuHfHhDW6lgmOZAD8iAAT7QNfFGdbOhZy9Pga4WUkWmwtSzyeivgq6X1BmNPnyP91/Tfq28d15xLG/QLftMWBRqq4ylXq73av8VHsWhd07IQLhaFtQdc+WI063WXlbH+Y7lz30TRFSc7/fnWdCsCXy9fg6xWp2XSleZX1omo5Ar47gGMvdsZcXNg/fXSao6zaRlD9a0wtDwvZSIh6IwKjlxLtbKRa36Lf2Qt4eOMr2j4ulkJE81DqhJQGWWMfGvtbRWK1sp6dfrbDgHsdaln6LlHHs+Pwjcaeu4E3zUBNwrAW/QL/Y3mRfK9PCKi8w0kEtG54QJxYQ3JNi/TPN2DbclDcOXlrbQatedB1mJDr+trBsQ1ihqdpxVxBMLFMZEPxgjMSdTR0nqEokdCmJR6/WxnAK6FL6q3VugZXjWNNnrANzHmOwJqa8B1jb7JOpqPiB8vzTwWkEVwPV+XacK+Bcp8vpx8X7zM87pzFucAzNZxjQKu/Tt7bSLNglMvEGcsEv6uzauGMNH8Ay
SljGaWX288WPqw/G9hY9veCYAj3RgJXw2yHnyRzwiQe8C3WAf4toJ0DwDuDeXM31Qag60ErQAjAioR0SkA40jZawEvK/NZpj3PHwU4mDmUowrYu8ZqZTN1a8ug8q2WUboteU0QJsIh54zqtSBMogy3SCg6k4fytba1stLGQXgHAB4N30z7Gnw1/xKQqFxP+Mp2g/DdEn6j2xl5Y5D5q6Y9gGvAVoJWQvNkhJvPCRhfhBrmfq8PsPK2r5fTEs4czBzKz+NJB7KnYiPXbA/UqIxn2fKt1kMFe2nQNAgTza9hPK8m9FzMKpMdD5a2BYRJqV9vGwN4jeZlGxHISkOARXUtIGfMg3TSDT8OgwP8HQ3AaPk9gRjmAZWrADcCWw2yGSVcyl9RSPrRhgS0ZZ/VnPImBPcd51YzploLM698tK7mIwNLqz8E/KK0aggTqCDB6YWjM+PBUcAiy94hZfxHy/a9S9sIwC3gs3xkZz1reWclPTOhCqV551Zb/DwZdt4qrbff3tCWeZGyBMoT0ULpBoDLYSshi+AaAe7pxBTsdQlU6YMDueSVtNKnAubS3+vlPC97vs5D10gdc2Xc+peSedz6Xj/zNwCWj5obgGYIE+WWJ5EoQ4Qb2kMoGplVZn0IrwzgXs1F4auBNNqX6Lgvslr4oqu7Nb4Mio+A2kgAr1Un0ldZRq2jq1wEXA22SwUszk8AxMEf/+mkl7s+Xoj0f72eFmFoBGYO5TCQa2BcLAtYVB6Vi+T3sCiUs+DVfIchTGQr2HLtsZYUIR+eX62tLUPRXlvSBxl+YrYCgGubyIz7trabCUuj/ChstfKonuZDUb9bQK4XBFvr9L45kHnPczCeK6BrAfekKWEBQQTYM+nq92TkcbvSS+1Kf5dHXoG2BDQHs4QyV8omkLMwJnGMzrW0GusNXc9aFK92nIawBCE/L06KRceDs4DTIKyVyX5QoyBc/BTLf3kGAbh8sLWWHUNda9Yzyvdgq/XDm3SF8gLw3QJma4K+Z1krbfaXgTegcjPA5bCVUERgjcIWGapboMzzrnR69gWBWUL5+lTD1yeQkUJOwZgoBmYL1jItWkeztQAdBWumbBWECeRlIRxRxppF3/CaULRXLjs+XfwVe7M7YfUe97V8aHVbQs+yjAVoq0wQvsXO4HgkfEcBeA0wh9LwmO63zlc4jiuhi4CrwfakHMs63CxF7NmFKWHu/6l6H76vdIJg5lDWgCwVslTHJowpqIo1kLaJA0fHAAAgAElEQVSCtqdlYFrjK9OG+folMIl0mGrhZ8+ndYeUDUVLXzWhaM9q6uRsZwDOwjdSLgpoD+4Skigte26V4cfOpKsWQPWA4xoAXg3CNwhdIl/pPv8awNVgK0EbUcORvGLXGXSvMI+3eaHTs08czBzKUil7QEYwfr1gPm5MvirmlgFtpGwtrDMw9er3OA5DGC1PIsLwtCCIymiWCUVHIexZNhSt1elnOwJwVsmiOt7L4fmR0HNPy9wYBPuAWB05PgfSM6CLtLMmfKsArIeZa8GLVK4GXQ+4GmTDE7Lo8gRp1rjqTVlFleXypvMLwpplFGWkbCS9Bn7SF7GypNTLpmvAPYO0mVnLkyxYamUy48O1dzzSRoSix9oOAByBDepmJgys+cjWbVG/Eb9nUD857rvnYy1fKxNJ7wFeI8xcC10LuC3h55bQM6rLlS4RDkOX84XiZQqZq2OuqKUyluPGcv3xIjx9dzRXxM800o+1a+p219q7RVRy9MZAgz0p6e5rRzOjuWmw1T4MzSJgl8fcb0YFy3wvXI5MzgTvZxsDuCd8Zbmowjwr6Vb7WdjKfG+C1w7g2wK/iJ+sz57QJgqBtxW6NUpYKy+tZTKWFo7mgCXCkC3lZBi6lEXlZmU4jM8nPzwtx4kpCGJue4DxCLXd6geanJTFzdtIIzohi4w6yK9n2bo1EK7pl28bArgWvjWmQVZLtyZeRfqEYIpmPVsADy43KseR9Bq4tQKxZ3vNNxXLiV
UozIzAWwvdTOjZVsPWGLBPER5+1saAkeot51L5onHhxSQtBmRY5kQLVTyfVT2ftPWZKA/iKHwjdbLWAkt5Hj0mevU/mg4NvQlInfaYFR1tMwraSCh6HxDeAMAtk6K0+rJs79CzVU7CE8FUmgXoM1U911ee7xGAI9t3fcwVLwcvDzNbarcWupnQs6WMpY+MaeFn2Y5UsLxPEqLFBwJtKY/UsReiJqIZiJ/vRwuI0flezFK4WQjzc/4+aOnueDCRvj5YmxXtWUYZy+PoHVXteHDE+oWkVwRwZnJTL/ha9c9KupWvwTUKY01lB9V1BkI1dboCb4N2O4EXqd1W6CLI1k7I0tKipq395W2jsWEEWARkTR3z13p9vKuWKibS1hYnQSzPe+VlrQa0UR8SttE6LoSJsGpF8JRp2rllvIwH9lY1WquCeX1q6sMKAB41qzhikZfnhZ5b/Uvfko7SnNCz5wJBCNXx8nrDN3pzMAK+gTFerngRPCPgtZYd1Yai0bmsmzU0I1pCssWqZ03POxS258zpy/n+mdOUA1wmj5Sy2THbSF5pC7Xr5XHTbiJCnEGgrZkVnfGPrGZCVosKzpStB/EgAE9UD16tS6PUr9eHXupXS0uGnrsCarAfLU+mZyAeamOperPgRYo2onYtpZsLRc9//FroOaOEta0oZShaC0NboedSVhvv5dBfhKBpOaZMRG5omk/WMtXwvbHtQtMtynct1eyGoi2CW/CUZaw6GaXcyyJ9z/gq9iZ3wsrAN+NLqx+deOX518yb9VwRepbFI3k94VsL4F4+zfJ2uDkL3qwSxvC1Q9W8DC8n/WjnyCQ8tbqRbSj9WdD2WDAKQZc2EahL/dPjfy00vXjNPCz9tDPNdtUqSdY1tTW/xqIg9fJbVLMLYSKsgnnFaOjZy0dlZPuoXZku81A+KoP6McZ2BOBsV2T51sldVn5Ji6pdBH+LkkRm6NmDGT+38r2yI+Hcs10EXiJC4eZW8HpAfuXlgdsyAzoSgtbKyLW+9zQ84aocZ2dBIxifHm1cgfItNp9FvZyw9QTxbAkTGB+Wa4it39caYqtF6Xq+SJSvCam7EEZ3MZFZ0dxqlwh54W9pNXdR/ZcYRWwnALa6URN6RgC0/GbVr+VPwlhLSyw5KscRqGn1miGXrD+67WcaHueVE6ws8GqhZCvEXBuiLobBLMHbNwyNtpwk0idclTo1s6C18DPvM1K+C+CSA2KmiOWmHrPNPOj8uO4qYWl0rqVploFo1hfR3J81Hu3V18qGxoOJYhOyivValoR8e/myD55Zk7II9K/dNgaw13wEvpl8lOeBUqZp58i8mc5GXQQgeR7Ji5RvAmCwfPbcLTNXvWicV85sjipeD7wxMGMlPC/TZ0KWNG3bSS8Ebc2EjsyCjsC4+JKhaVnfAzF7Ufc/YtmSOT4sw9K1lvWRBbQFVa080bwNeZ7Jm1mPULRn0fJbLUvq8aVZetzIauCbLVe77Eim1U7E0vx+YunOgxb4scZuD1qyfgR0Xp3ewI208TzXVa8MN8t1vBK82vguTvfC0Hb4uS0MHVPDkfzM+l8ESF4+MvEKjwHrYWye/0qfg7gEtYlek7WevhcwFuPDZ7LD0rNyRn5Plav5Q2WIpXkhaFkmOjasvm5U0ApFo7oZQCOwa5aFLMr32ugL4Y0AXNusVy/rtzX0rMFV+glOtkJuZRULylr5SJ0o0GtvAqw2IjcBQfhq63mz8EXw9MLMHnhrn46EYBoNO9eYNit5TZvDXj4W8Yxf/6Ob/HGI9/Pz8/wz0UMJEz2XLOmdeJQTabWws8rLOrxeNAQd8eOVN/kSXRvMLQssBOVe0GsNRRfrB+ENABxpcm31i0zrp6d+NR/lOKF+PXjJ8xTIOqVly6DyIT/+0iI01hsBbyTU7M2E9qBrh6Lbws/ZMWAUipYTsrQtJXn5SOjZGuvlylZCf6ZwFzcERlg6MDYMQ9IjLRtKtupRwFfNTQHKD6ngYp4Kjo4FRy0yIz
pitTDtA+EVARxtSgOkVz/7UryxX1lO+kftWepXHgdmPctmMqDV6si03vDtCeBn+kv1oqVF3lhvNtwcAW/LTOjarSl5Xcv4GK5V13vyEU+vmXSFQsvu2l8S4WQjT274MYOzMzY8X7JkQLjPdbbeasLSkXoZhTyzlh2ySDkvZkG5FtSyrV4quPguVvclGQjgGtcZ+K459ivry3zUhkVRxTSQ1oA2CtRM2R6QRnXUdDzRCs1wRuFmK6yM07wwtB2evr+EnBKOhKB5PVRWM60MmmzF28zOgi51IjDWxno11ftKfeWV1zY/B3A+nQhNRuOWHheutZrxXs8fAZ88rSacbfl4mnxiUmSHLJRvzYCOWFQFexD2bggiVgfjQQAeHNIxlx3V1NfSMv49nwH1W5qyeO2BtzYtW3Y4gHHI2VK9NeHmjOL11G4Wur1C0NIHMu1pSFr4mcieBS3Vsh5iluHs+bpgNL5sjTujELTmh1WyTS5XImU/6VaLKtdImubTSuM+PNUcet2S2BEV7EE3q4JHh6Jr1gfHb+QGAbjGakPPlp+zkm75Lmk91W9gEpYFWguIWh2ZVgNTKz0K10iZRbo/0Uob60XQ1MB7LzduXLjk4b+ZMLSthiOG1v5K39o4b6kf234y8PQjwuO9HOIlzxrvlSFo/n4t1h2fLnQ9nRc7aS2WKz08p0RERrlG61sg1tKjZa1ylmqeWc2yJGQRFVwDQWnSf8Znj/ax7QTALaHn0S8h4j86nkyUUr+RMhHAeumaHyu9pi+oD4s8f7zXCjlr478y7V5Xh29kGVIuBF03KYuXleU0sxRkpkym3J5MbixS0ojo+b15prPx4W+dr3MI8ycrzZ0t0yMK00tHv7GMHw/QJV1yTutzekKWWZDlZ9cFI/NC31Z6TXtjbAcAziz7qfXlLQmKbrJh5XsK23irLSXrgRGl1aTXADyqesMAnsNXm+WcmWiF1XE8PM3L4zwrBL1Uy1r+6y2OTcaS9TJ51uSrcm7Ngn5tKYl3u9JC0NHdrlCeG2Z+vm5cdqamhZsyS7pZCSOzwBqFs5UXha7lxxs7Tk3I0o5b1wVHx4szELX6FCnbbhsDODvumlG/0ZcWuQHITvhKjP1KMKE8EmVqoBhWoAPrqL5yk63ioJXquC48jfP8EHXJl3la/isvH362VLGEkTX5qvjiY8NoXNga643OgLZAbMFWhqpf9ZYhaVT2UeH+R8ySLlY9OSszuSoL1do8L6ws02RdVGdmHFrasSy7hQrOwNlrr49tCODWSU9RfxoMtTZb1a82Dpzc8QoBS9ZB6dE6EZCvAmB7shURQfh6IWcExvhELRu8LbtkafnF0NiwLIPOLZNlpeItadbkq1JGwjYz1iuBbO12VepoM6P5+yXTrPRikVnS8MlKtWaNE2cnWbXkZSZolTSTWR9JBWvl620jANeEnVvVb2ac1mpXq++1abgpxxrzZRkPrNE8lL8qmGOTrazxXh/G8XFepJBf+flxYe6Lp8n8UneZ50M3si64WO1TkFAo2oJxJPwsQ87c0GQsPaycG5+GYFZcLB9xGIRwBn4Ri4SmCZTxxnqtdJmm1Z1V2FoFa3VqVLDXJint5mxlAEeAVwGy1MznaJuofnSpkSRYctmRBUAvHeVFoRn1YeWHARyfbOWt442q3kxoWrYz94nD0K3h59YwNK+LwMTrZWZBIyBbm21EVPEcxDjPfo34gRPW63cczgzunvV4JautFUb53CwgZ/IiE7q6qWBkEsgRQKNO1QK9VtW2q+GVANwKwtpyVl0Lpp4U1ep7/g2XGkRRvQgwtTZb4ez50MrM0nOTrVonWnmqNzqhS/pd5uX2itbKvd7GiALOjQlr4efSXmT7SSsMLZcUWVD1FHEkLI3KeptvqBYdF26BcBaypbmsr1ZljMp3UcG1m29YUI6AsHY9r9fHNjW8AoB7znKutZ7jzRr9KnxHoFsL3IjKjfqrVc0wj435MrNmOiPTw9B2uHpe1wc6T5
d+X3l2CFrL08rJPP28ZUJJf0MPSUBLguL+Tgt/KK2kF2t6X04E1woXCy1TIopPluLWA6bRNi21G0lbGFfB3LyxYE31RlQwsmi4uZcK5vUp7WMQgCfKg1frijf22xp+luWlgkWK9pORJo8DjzyzgIfSRingaFlPNas+4mO+0c010HjvvWm7nOZP1r2X08PgvHxJ575fZedgtSZlST/zj6VtFrQVhs5uxLFUxb4iRttOIkOzl9UZzcQVu+03ZE71hRKOqtpeV1sL7qgMKqvla8O1XtqikRFPSiqW3ewj65dbto/l2v+mdsKKwrfGrMlXvdWvdgyqZJWulh6Gn1E2Wj6jjDvCNzvZKluOaAnje5qukLlfXhaly3xZZl4Oh6A9NWwZUpE8PRuCtmZAyzI8TbbxyrOXIekznQv8O0BXWm8IE9WFn0sTPep4NwLWWHBEdatjwZ55qtcKW3uAtCAdhWvtjULMs2nTNP1eIvpHiegv3G63X7VBF4yyGfVrtZNRv6iNRvUbUboRhYvK9VC5NXUQfB/WCt/IZKusOp77yoMXLz3SJ2XN/+J8eYzOX+n2BQLtBY1AzI8zM6CLH00583yerm07yftaC1fu88o+9YQD06ogXGMRxZup40G7WQUjp5ElSahui0Kuscis6n4W+Tj/IyL63UT0n3RvPQTFvZk3+Uoeg2KekI6Wj0IzUqcFuCaA+yrfzGSrCFAz4WZP8ebC0GNnQkvzQs+8PQ5RBGQMY/3pR6V9bdcrNKPZW9/rqV5rlnTKaiBcTpF5Ys2zyExoq05ETWvADqtgTcmidAk2DXSZdcLyWCtjtYesP4Tdj+92u/3BaZp+addWXYso2Ij6jcx0js5mRm1Yfeqofr08y3cGmrV11b7WLTXKQjWqZjPlXnl99oq+/11CNxOG5mW8NGRS8Za00oamdksdpHw56JBa9UCMNtrQHj2Y2Y6yVjUbTlVbZYmSZq0TulCdyCzpkh6aEU2UB2ZmtrRXJgvO7PKneusWMJmm6TtE9J372V+zZtOd2vKArOUlNgixVG4mveRlFbBlrb4WdefKl6gOvkRostQLdMgs+EbKvfLqniOM0u9vkR6mRuV4WXks67xFk7OmtRnTaAMR2y+eKV1r6IlLRMbsaK6EpUUUaMZqQKvVzUzMiraxcKI5XPN73H9LyVrrRsHb7fZdIvouEdE0/RI0Hz3YbFb9aukRWHpmTdaSsjFoGtiiyne0AkZ9SfsbF3ZGCvQL+noB2Yyv+0usUcL60iSezuvI/NJ2MRyKrp8JzZVgzQxoro4zM6B5fmmHp5fcyAMXrNnP87rz8d7ulg1Hr2E1oNfqojpe+Bmys3ZjjuiSJHmOfMhjC/Iobx0VvKYMHdRkhEZamjf5Clnl5CsPrFqeZq2QRX3L3AB48H3YKPhaZYiQarZ93etg//e3wQY3r4/T4xOuEJhlWXSOTIOyFYaWY8MSyN4M6PIa0JIjOTbL4Y3SLJiiusOtF4S9ZUKt1ntGdVQdP20SBbTjHutvazb2iIwFexCmZJu6lxUs0lSL+q3xXVtfU7/Oj84CYUQZZ2BqtYF81QDdgi94sMIa8O01cetern7Lylc6VsPzvzElzMvO03wQc4h5k6/KsT3pSt8D+vW4Qn+8dw5vrHTnY8NtKvdEF/oefankXenrSJ0WCNdM0tLUp+VHs5pJYJHJWSaHIhOiNJVLwXzNb1QF11obiN2Pb5qm/4yIfoSIftE0TT9NRP/m7Xb78Y5NPCy7dEjme+FnWSe69EhrgwLpzIXkdET9bgrT7L91ws6jyhDh2dF+er8JWfwvLyPTrRnQUtlqdS60VMEyHD0H8kshRx9FWPrjhZp5efR66kA7B/r36AuW+/UT6BK4X9L36EpX+noB6a8fNwZflAZMqwpHa8uYapYUaVargLV8l2mygLWJRlaFWm3VbE/Z0n4diF063m63fyrlkYhe4YfeFlGx1sYbWUP1NZ+BH1pW/fK8zD+truWzti0B3z
VmO39JX0M4fknfmwEuD+j8dpWyLG+b55X0+9tsK2ENtks1bCtflM/ByNPmIegCV+1xgvFHEfogRuuI50pXMw59r94Xj9cpYa4B90v6GkCY6CTqWyDedHa0Zi0KGJUzVbDcnlKqWM20GdHR8la+d8fQ4yaA6I3thNVT/fawqL/A26cpXK1sVAF7baK6XtudbLTyfbYDYHhP7w9fpHrRTcD9rdSVcEmTeTwdjQ/L89rZz3IM1i+PFShKr1WrPe0F/ItIf91IyDRefq7MUdpyIhuy8oCRxd7RlsJFYOPpGQVqpWsWHTeO9m1m0XW5suHaMHTUonAdM3N6BwCOwi4LWY1A3Jd27rUvj523EYEvCtqmsPAa/5ZPNloj7PwFfY+IXrD9gr4HIYpnSLdv3CHT7x8VBnU5n/+9mMCVqvj5dbgKKF/iy22u55dKLXY5LUPP93bLOC4f650rYx5OLmVK36Uibnm27zy8bavj+/cAKd2vF3VK2nxM+Gt1jFg1LxxdnqykgUxTlFYaz8umR0y7GUhbdjJWLVQz+0P3AHcf2xjAkV2lMvk9lh7JNiKTr4hCIYca0PYy7wbgjcD3i0eYuc+YsD/W66nj+1uJAc/Ll/SS5oWhieaw5aA9XV66itsJXFOuZ57/maV/a+b3emZh6FNk4tUcxq8yOAStg9h+6EJ5Z8fa1zSf1DWH8JnQmLAw0MXr5fGanmvhH+faU5Q0aGYtC2XNh7SIeg9PxmpdksQ7FYGppWCju2P1V8EbAjgDxuzmGMii6hcp3cjSI6VJCdIsaHspVa0/pJzvAL5c5X75BG9/+Fpl7m/HspxML/3Uws8R6BbgarDlgD0HhS8qdzlJGL/Or+dv0elyfQJZKmQLxnK8WIK0vGdIvaKynnHgS9V6oquYeJW1OYS1MWHR6NOulxPeqONy7gdZIswKjR81ws9TwiHwysZ7qODWdbpemXUgvBGAI7OUs/maUq0xVF9Tv8G2Ii8rCmate1HYR+Bs+loHvuXC3wO+rRO07vW0dJxWjs1Z0NcrBK4G20leEzIX1Mfn+elR5yYEWgEzB3IZy+RARsq4vLYCYp5eXq8GYk/lzlU2Vs18ZnPJLxOv+Gzn+/sfBfMcwqFx996Tsrhq1cZeR17Fa5ZFuZOxohbZeIOcfE9po3LRfrXbBgDOwrdF/Xpju7J+1J+0wMYbJS2qfL06lh9P5UahLH2tDN8sJMt4LxEpPtpUbyt4kdJFCrcAcQZbfqypX28SDa93Ev7PdzDfzjqQX+r4ughTz8eC5VjxC7ASxK/u4BnQvR41WGY7FzCjJUjYyvKju49QnR4Q1sRixDRoS39Rk+1GZkWr5oWhI3DzgMzLtJjlow+EVwRwzVhsYp/lUBkNyBb4tT4kJl95ajYCQlk+Wj+saqM+5HKjbeD75eNSeFLr9Fs3fH+rcLnShxO99NwrjYFahJdPl886cDXYakD2zLsIn1/tTOxiz4EsYfxSxnMYS4VL9IKu/uAFezerF+DnY7ElBO2GhhULA5XZmbS7H2FZCBdoZmBrqWPNT41ironULkw+JYk7RkBFX9Lasd7oZCxUzoMwKb5jtoEC1qylKz2XIWnU02zw+r6IMs368OqqEJ6HkU5igLHA93kOYWWDcF5Xn7Usy2wBX6l6LcWbAu9VnMtjXmawlW/38usCJoINnCvFwS3Ta5ZkFVXrAbWo+uwDIR6F574u59fypIvILNd5C5jWjZRUoSlV6ph1rdDUcLjN1jXB2dnTI5YT1ftcAcC91+hafs8g/yzOo/kyrWLylZaegaHmP+q3RnnLdgp8ndDzveh8rFPCKjNea4GyTNCSy4wyS5dk2DoD6OInAl4+rhuGrkwnka6de2aoXzrRUomJ/n46I1W8DE+jfaDvTeghaLTV5Kv+coz3Sme27CinhPESpKWV5VihSVjYARHNJ2VdLyeaz4xWbuKzqlibmNV6lfcUcLTss4I1g4woF4auKWMp7awK5vUI1PV7OMAmyo
EXdcMKP7dAPfKSI8ujAn4ykE2HghW/KD/iwypD5MK3qF8UlkWqNaZCW5citc+evvd5WQ+Fm2WoWapdCF2kcj0FjK4FnjArvLNClbJ9CeQHrKfH8SxE/QhPE9EMxHc3LxDLhy5EQtBR42t941D21v2+xoBrlPaj4ixadEWbdKA9oyPKMwpoWaaXKrbmHZiTsbSlPwiOPVRuq/LNtE30hnbCisDXspa1v1L9Wgoa+Q5OvuJ5WZWq+dD8oXoppYv+3X80kUlXRGi2sL2pRQaukXwiem5X2Tcs3QBeBFdP/cpjbTw4YkgRSdCWPADdBYwfaQXGZ5qDuIwTcxDfm7o+3rH4rlmljpw4dSL88ARuMaV7h7DnLzz+K+xKr9CzvjzpUwymCKRIhWoqODPGbBn6/p2d/FlBT3VmZixbgK5RymN2vdJa3tCizUfKabQiioWbCeRp5Z3+RFWvVseCrNaFDNxNpSv/zR+ykJ90hScsaROqauH7pZq3hDPyd3/pywldvM883KyFmk3wcsBq0EWh515jwREVjGBb8ng+0QzQCxCfiApeZGh63iW+dGkeXkbqmM9mvtLZeHiCNBuy5WETFmS/YP9H7X7jcLEnZV1O99+ZtlVlsawq9q4/rSq4tKX5C4lGLQwj82smSNUuSYr0tY9tCGCt6cxSoB47X0W2oYzkEwZeDYy1JmsUdBTsMG857ktEqRnPRP6EqpJnzWZugW8EzhHV64F3Nr4rwauBmGgJXaSEPfUbGafTVLAE6sXI4wqYpz/K8/C0DE3fm+VjxKewCvZMKl0M2q+f0F4+Bele34Ps4mEMwK702lVsVh7uluWMB8tohJZHIJ+XQ9bj6p8KQRPZYWhy0pFj7zxqURXcF8IbK+CIjeqi5VeDuBN+jjSZgaJMy/qW+VSR9zAeeob5Bsx4GVSe59nLfawtInW48vxX2/XwReHmxRivpXij4I2qYO96YF285TGHrlY32Mb8mWhi1vSAGdP8EYscgvP8C8wraXw/7JJeVPq927pCRnthL8qzSVlEr7HhxUMb2Onis4rkkShHwbKtFvYfAZ4VWvZmR1ttRepY1g/CGwG4h/r1/JW0lcPPpUhW9UbLegrYg3k4b/mIQTv0nB1jxflafWu28zJPKud+ytcNN0vQaoo3MhZsHWscQCpXlpehaKmiAmO/C2UmL/KP84l0NTwXh/OxYf68Xr5m2Bqj5eHoEy23orzn32dUo60ridBEq7tqvg9f6MoYQV0avwGQS/g+FyVsbVVp3Qx5eUS2WrbgbEEV+ZH5i3raYwrVCg/LAjM7thxVwahsnW0A4F5U2jL8rEy+0u4DNNWpAdRTyl4bnj8vT1lyZIWeiSxlGQemVh+FnYnQMqQ4fOUTlMq5BV4iWqpeTfHycwldMtL4XyI7/GxdA7QLpoS0NetZgzFPI5FOy3pFDb9APJ+kpT0i8QVNfi73jZagfYWT9f2g5xAu21Zam3NY6jcSmi7fs+tJqO/Lmb51vj4gXN44sEEHf4s0SMo6Mo9EOV7fgrPmJ6K6TfN2xtKcjwhDe/3rbysC2GtqhPqNnmvtJNb+RlxbII0o5oj6zShjqw5YckREqUlXGpCzwJzPaNbHdL01wV88Lq36mHBM9aoTrCLglXUI/PWUMOKAd70pnyuHbamnQNOEMVLF3AfwySdq3Y0HXjmIcxOdylpgBE4Lwlf4Rr5AKc2CrObrlc/3z77M1T+aFU1nek7KQvC1YBgBNoH8MDQVH6gvbmXvLnFEGNoCfQbeNW8Y9jDYaprJqt9IOWvzjsjDHIKvI6NekdsW9WvlWRB+pj/UL73Gp9BmG7XwnYeZ52pWA2akLTtEbW9VyduZ3QhYY70ctlHwjhwH5ibHcWV5qXa05UiyX/LCLsFrgZj1bbrcN/Tgarg8HpFO9NyrWQtJa6apVw3C2uYaWrjZgqy1RvgJ3WUlul5Oi3kVn4n0ULQFXw/KWl7J9xQzMut7aN4AWGFoy6KKtPjroWA9H/
WKeyCAM6577JYVDSdH29T8GeHnFgUbBXVW/Ubbe/59hZ6JCIaeNbgS0exvZJyVT7Lyws6WSq6BbyTknAo3W+BFateDrheOlnleOlLBUjWV48j4r6aAlaVKEhLqJC0jmivHeZfh5+89x41liPl79AWdaD47WoMwCjdbkLXGfzXlzNcHc4OhaCJb1TOhcfUAACAASURBVGpiDKlcBNpaBaxZikk1D2iQgI002KKCIxDO2yAAZ2YIe1tKyjIRpcrLSYVbo6wTNwg93tEaBSzb9wAN28EbbhAR2GzjoZIVmGbgu/S1DDtnQtTasiUUgq6Gbw14tfCzVLkIuJ76jSiRjArmalcb/9XK8dcl2XOdl0dquIwNf33iS4r0cV6pjLXJVmWcV8ISgRVBsway95eshbQvi/fnOR5cEvgGHfJzlNeEiArWQJtRwAj8VhnV+AMaUCfQ3YTneMQuWDX2JnbCGrVPdNQknHmadRxwGwGoBsSMb5mOui19qtB+0IaWG24U42qXX5DkxhXz8kulrPmSQJa+UYgbtXNPW7bLbyC6w1dTxTyNRDpSx5nZ0MXsIcjdm1TD1/MrbIugyU3L58uS7uXOi+8gqivr8bq8XhnTJSLYzitv/jrkgyBKOaLXePDzYQ18gw4JSE3RRrmFgB4Bp6bCvfI9lHXKUjI8YP1BvjGAkdV0KRt+rrXE7GdkFpA1QMt6XvsI5iFoz9Uv0WsMOLLkSB/3zS5HyreDlLEcE5Yq2YLvF199joGXjHwy0jQQkyjH//IyBPIsQxdkKwxtqV8CaRII8nvG3wPjt8Ih/MVX34BZ0rEZzcXkmHBRzXO1uqyLYC5VclG1WqhaAvuefnqA++XrQic6abOiWY8Wu2TJz6mkES3f44gyRuWyYOaWhXQqDJ2ZfIU6VhuGzrYV681GVjORKhp+9tpCPjvMfs7AWNbV/GjqNaKALfU7K4O3m7wfeypUbnBhLU+yASuhiZYeefBFMLbg+wV93a56W8GrQTc6ASuqfr31v2jcV8JYu/Aj6HKfnj3eH2tc+BVuRiCOQRiVlSFitBsWClsjyEbV+KIseGoSEVfCM2c2ZC1lHFHMBPJRe7JfKD9kslPRMLTML+e1kFw/VL0RgKNLimpD1JoijsR3veOASUhqsEQqFXXHUs2e0vXSiEiu+dWecmSFfGvOvYlUlh80czoDXz7TWYVvuXJzmH5FGKwczEQ2nBGIPSXM82S6laYpGu4vMu7L/3oKODD2O6srypdx4bstJ2fdoXqfUDWHLoYwAiKHbmQMWM6M1kLimaVNvOyVzvffGn9qEtqmUgOtl0ZOujQPoB68w+Vbt6aMNCyhHJX041XwBgBuhSrRvNu14edI+cCTjzLNRspbCjiifrU0FcKvpQBl4tXzGEy8siZOZWEslTFWu7kQtQXf1+YdDny/ppzqbQWvBt3IBCxP/fJ8pIDLuQVdFIaOKGAJ2qR9IppNzqJv0wzCV0K7Wi0fuoCWFfHZz0glI5ByBYsmXcnw8jx9qX552aeifgD3cnntlrWYkEVkw5dbVBlLa1W2mmIOW/RhCprqjUKydolRHwivDGALvr3Vrzw/g3wv/Bw0DYzZep4CjqR7cJ5BWN9ukmg5iQlNaOqvjPExV7eRMV8d6NcZfMt4LxHRxNVtFr5ybNgLTxNIL2lkpPM0ctKKoQttASuRvv5Xpkv1aoFXCztLMPPXy88fJseFOYRftoTwlc4z1amN1RbLzoz2AM0Nl+W+5qFouDa4PDFJquBiSNmSSOuhgKPXtzRw5Z0BOs76qbEMWNshvCKA157x3PLSEO2S1TW4RkGNFLCW7ilgDcwP48uOiF7ql5u8W+djwnd318UF6ATSZFltLTA/1vJkGxrQeT4RzZQvEXuIQg/4SpUrwawp4RYVrKW9BZPXS3D95BAu24FKCF9ZJf5QhGJSsXLoSYWK8uSDHpaTsvB48L3ty+y89O0K+ilV8NPKPtFEmDFI2ZJIyyrgWgZaBn1lw9
O+0HICYmtjlnY7EhD5zl87FV3GK/H0nJZN5bn/L5EUNAd2BLilnfYISw5w0yWzW4qAAHgBqPf1Qjleo27N/loBK7U3xSHLj3kZj9cGiThoKZ7Xr5V6zlwIs2n2hzZMQZdJtjjUVhfce0MW75fXW4DWdkg7vxINPhqgM23QWJc53qIOwH19+PoEPF0Wu6Uv16f7Bp6F2jK81eI7teVNYcuvVYz1Lb8oDbi8TWbtl8fwPsUU9PMcuuYGKwBp10uPt4QvyDmSdV4chHOpjQZoUZKDkxp6rpWeZ6Cm9X00tc1xhzXgXmniqLL9RF1wdD2YHnup6OCtSfRYgmMUWN0g55RHFP3QEK2blRM3rKwP0x3TjzLdEUduOdJuV8o8epKuDxdZu6Gj0lyzBlt6Hln7pXGhFLSSap7KjA1WAEKbrLTjaF20Ta/jNdpIr8UUhawEXJ561j4NcGXWj7V6Xq6NYc2lxf32+XCwA4CBnBuW2u7hgrMbsHgfHMJam3LspKKLNDi2ulXpXFJLTM1nrDR86TlZH3664rPrE8qDPPitSyU1Te8jfsZ8w1YNjIHYNx9ZaWfJxVpu2Nqs5aWjM6loDlVaJ6agBbcLYJFqBpSdzdOgj5+9odkC0bXm1LO9Ku9xk9p/Zikmsys6Aj4+F6nty4AvsBuAizw37MHOitVAm52X1Ta7Icvb1AWEUtEahKS/nYhbjawDZ91tr/Xk6K9Oi5PeK2XHdAGx9UUP958VMJ5GtHdBT+XyYye1NWBe5ymShs5uzErdnuS43VI3SzMDNnjp8dYglI6z7daYM5xyUdHHTVobryL/8TVA8z69teHoeBF5/fRPie8MYKAvhC3RfiIuuCbtrI3HZfXD6wLfG8yPNffbmkqOgLqHIjDNltPXfy8TQEx2TfP0NHfFGRgDgJeKfsSsm34u6vFYShpjpqizbhfAYnPV1Hkb6LY6XrO/KGSfhBjebibtliNAhqLnbrMbs7jWSD3XOt911qMPAGDAT0lHIczjrPreG7L4OZ8Dr7dS0YH14DIFIJdetlLWGRhrMWuCmcqDrgRfrX0BMds1TdPTLTAGHs7XSkVP01o65RIL5NPPRdk0dOZ+YBo/izWgO50/1nYBY313GqAP7LT6KOx6ATjTV+8PAqKsW44sl6rFRODrud+IrLbRC9E+8AUOA+CItCvumrJSxlKcB2WpDy2G1hkQpmH02IMwnPJMf/wYRn1tX9GYaKzannzRw03Sc6bN3dP3oGWR9h3F3q5m67uNpYd0SPVW3/P4XPqZtuHQpcdSmhkION7yMwuZVmAdDb5IHEfrxWBr3TcCox4xNbcdtY7Zq01cBwNwNq0steGgbnXBtanoyGuxwMzrEl/a0Ao6SWu7XO0DQDYmGkvfIy01zXdOs/Q0ANcV32Pw+H5iyR1P5ctd0eW8tKPi34YUTUtHdkDzc9URPy9BS+Ebgi4A9T5e/rMVvF59z357jkXLWuYlStp0JTXwUs+8TaRPCa41kF3D/a4LX2A1AJdfaE33FriifXoQjrTL7oqO7oLm55bzDa4594DwHmnkHpLeNg2+FnR5f/efenq6HGspagAmkIH5VyV6j6GcYrZ7FOUUw9yw4XCBJXBpvQndafDHzyyErbq9YNur715zEmVtuuqZeqaqTT1HwWx9ePDiMvOxxj/Ek7DoC8wMlYVwdN1Wqss4Va/fllR06RtOG2M9uBbCMOoz4Jb66aUMaKNteJwWcz9n9xMDM2cs7aLm7ngqo/UylIH45qvej6IE5s6WzpMfP83KFZcLxKDr/TwiBLcYu0cbUZFNVxJINbhGnK4FZG0cq1zrM1PfGl9/sdswBZ2FcQ8IW/UahLXybCo6CuHSD4SxJFA7m7Jq3G0vYB7BOXsg5mUWfDUQ33dPA1qausCHu2MAC4cM8JQ2cbxXefNVgfRUl/tvzOF6L3/SoQvIsOVxC5cLtEGX/2yB35ptesxvrTmLkjZdFWnwze4gjkBcirf6tNpk2nlzbu
kjrg0BTCVdESW1pqN5jNVf7a5oz9VK7SywW+DG7bgRwmB1Wtke6WkJgBYUvTKrP+tPyAP07P0Q1owBE8gAFlAGoIK5xBdxONZK6odvLtNgCwSBC7RBl/+kf38RYFl1RwRwz7mJ4vD9lB3zOqvcO5dEJ+fBnddJbaU6LUaL8+YQaZ/XTgAG5lcwS5mNWZFUdDSdLPXrtbXgKb0GXh9JWQPNEO4N2T2cbxS6NXXRn2IfujsGCLAUKANzME/nczhTLb7RyZC1a5uO9yhTYAvIwAWwcLn02PtZjlug3Bt82ba1Y6/RhygLvhDq6M8a+GoxPE4aX+srqmPDF9gVwEURCHoAi8RGFIGzlYqW5EE5CmGpz0YI87je8TwGLFZSD/eerVtV869HlCTd5kQlPZnrUfd4hrXdh/1/THLCn/EyD7qADtXIT15WA+a1QFwbL/3s0adXJkqCL5W3PkvPI/+Jov/Rog5Xa9N7Pmu1X+oAAAbWhzCPsUBbm4rmZZE1Yw/CtB2UNgkIF0UBuLcifxZSvPVr0WKi5VY/5oeS4VEGgO6qpilrYO6SiygMP7ouH5OZ1QKuRQvosjdSgm3kuAXMLbDbA7JrzClTJ0qDr3cOVs615q7nyPiaMv17setdEA8CYCB/td1qbA/wEnA1wFr9av0Ayyt8JYRr09KtjrIV6hGgWu2yv15eHoEwhLG0+vuxAGVgCeYiDmhJFM5RMEvu+In9PWmA1eqyZT1+rgnlzBj8dW45Z1G18O2ZeqbK7nq22vN2WlvvA4Sm9eALHArAwBI4XK0umPYd3RWt9VkLYcndSm2AJYgt91wB4aItYdqiDEgj0I7+tMaLAJfXQ4gBPR/m5/cx2d/kVbjX0Ek1TzHOo02tMg3E3nELfMvxWnBeC5iZNj3mIGor+FJFgRyBrwfXlwtf4HAALrKufD3Xg6MQ1sqj68HR1LM0P5BYfkzPAxAuigBXBIMSG20T6cP61UdiNRB7cK2BslXG6yEca/Pl9WBl91jj8aRRRaDLyyJQ7lHWAjt+3guqR52bqC3hmwXy23a+RSsBeETbZihgPQjzGGueGQhL9RYhMlC2nDcQhjBtUuSlmy3HHD3vLQ2e2nmmLyumBcLSMSC/dwiUQaj3FPnw48VrIPaOI6C16tb4uRY0a+dRO4aoveEb3axl1dfWeVoTvqXvQzwJi77QFhhrfffuE8hdvSNtJaAWZZ2xVp5IR/NpanCxfgI6SDQYe+UZV17zoeGtK/J+7AVgqcyLiZavDegeP1vbLtQbvpa8NV6pTOs3k3qOjCm9SWvAN7uh7KENU9A1MOZXe6nPyM7krAum9WumoqV5axCOwjoBYUtZkHrte4i/bS2flaz++U8rxioD5u+D9wFGOqdlEOqk+qi0dhnwWnUWaKT6HkBqAS0/P+JP93e9Bny9tpmymtQzV+QPvuY/RbZNPXiLNgQwFd9k5Mm60q4FYa2tB2Gp3irLbNxaAcJbu8ke/Wkgbv0pjZEBrgZYLwVtnYOVS3VSjBcbibNAy89rj1ugq9Vp9UcBbev4qlrhK6kVvlRR+EbrMjFSXKRNtp+cdgJwUSaNbEE4Gp+BcLTO+wAQhTCEsloIU10RhjDXlu52b2Vg7JV5xwic0zKrnM8zKivWg3EvEGcgbNUd9edafYrq5Xwjbam8siyssheWLeHbD7xFOwMYyLthrY/opqxMX1kIe/VeGZ+3BWFLUlvA3SHdax33CJLm5bnfTB0vQ+IYgXNaZpX3kNZfbxDX1O8BuSOANgxeYHv4Zsv4sRbDX6wH1ygQW+HbH7xFBwBwUcQNW0CNumnJBcPo1xpTqrMAzsta0tFWe+28IiVdoy1BbEEz29aK0eJrjhE4p2W8nNdl5f1ueoGXnnvwlWJbAFfTZosxatuI8sArldWs+VIdGb7SG3Zc+AKHAjCwDoQ9SEb6kaBptbFAqs1JiuN9WzFcnSAslUV+orJNVi19Zsf34o6UAbBkzVGrq4Fu5LgVwFbdln
BeM1aU91xnWqbBV1ItkLWymk1XNdDL/sfbH77A4QAMtEM4Gr/Vpiyvned6IZTxdDR/OAeU2ASEqVrgtpesPxEN2FKdV8br4cRo8+NtaRkv53WtikJXKmsBcY+ynnBdu002VlU05SyVZVxyaxmE+h51rannY8AXOCSAgTYIa2vKEWhbEI7WWSlirV0NmDmEodTRc6oynw63Klmw7eFyLVkwreknU1ZzDOOcllnlvC4r7/eQBS8/73EcgVZt3VZtatubOjJ8uTSIZVLPkT4zkD4OfIHDAhjQQUqVvdpaEI3EtEIYqAOuVMbB6u2Q5mOXGMcN06GjsK3VVpCW6rQyqW1v8Eagu6YDtvrzwMvLIsf0/Egg7hXbUqfKSjlrMKZla7lcaywrhpd77aR6KUaL02KjbdfRgQFc5LlhDcJauwiErb4s6HsQ9vrskaKuTUkDVWlpSUdJP1twjcJYq+8JXi8VTculuhpZv59aGG8J32hZb1ivVaeKPtIwAloJklvCl+ulwHdb8Ba9AAAD/SHsxWWuxNG1Yq9tLYSlMi8lLSnhhnkzCjWpziuTprIlwCWA1sDWAu2W67+175/UJlK2Noil2FoQb1UXjTfFXS9Ql3KOxktlEYBKsLdiuDz4SsrA19M+8AVeDICBvhCWYj0IW/UchhDqJChKbS248vZemZWu1sAcWBumL5OrJ4i98VtWH3hZpj5yHKlDoIyWS3WSou+rFafV9YavdlxTvzWcW+JN1bpeWpaNj5ZZ5ZEY/gas7Xy1+Ei79fWCAAzEHW1Ea0I4O34Uwt78rdckjWeNFXDDNJz+RGMZOtVlyjL1XjtrzEz9FspCOANgft7juEfZ3vGmesIXQp1UpsFSUgt899De/8FsvTAAAzboJABZbbR4KybjhL314AiEgdjDOqS4UmatGVt9BNaGyzSoekExoujnEK+t10/kWKpD4hxOOa2jiv4vjryvWkwrjI8I4payaDw/VmWBN1PW2+W2OF8uXue1e53rvlQvEMBAPYRhtLP69iCste8BYVpuwZLHAXPQRtPP/I+yAcRHkgfbSGwNhKPnRdF1YD7fFmWumZGynmB+SVC2jk1F1nq9shp4rg3f2jpen43TYiPtttULBTBQB2EtFvABa0E4WtcTwrysvAbPNfM2YPGADudEWro04cNstUa8hVohDCz/TL1yKPVZee+tVp+BKz9vATE9XqO+F2xTf7MWeGl5i+v1+mmJjaadLfhKWhO+xwBv0QsGMLBuOvroEAbstWLLNXspa62sjAk0g7hGNf14n8U0iEaOM3XlHE6MFSvV95TVp1TXC8Za3JaA7tmnq5Z0My3fwuGuDd8InF8nfIEXD2Dg9UEYiIGVlmug9ByytzYstSljBtPS9GVRvQSHS9ULwlpMkeV81/jf6v0OIuCVynrA+EgAjsS6ksAL1K3RrpFyjpb3WPON1Gv9vw74Aq8CwEBfCNfGaeNlIayNnXHImT4iQLcceALEdAjujjnApHpelqnXFP0gYPWZGc+Ksdqu+YElcw21yqOQra1bA7w9+gop6nppuQfADDi3gK8FuV7wfV16JQAG+kFYiuVxnlPWxuN10XQ0ELt/mJZn3LTloKkst13hiHvBsUXWryrahn+AgFGvxVjltE6rzyryvvaCr3ceOabnkfKeMLbGdhVd59XKs2nfKNxryqPwXcv5arFWvNVmf70iAHvaE8LROg22NE4CpARnXu5BNvKwEGvsUt4I4qOmprOgtiDt9WmBmNZTeevcGWXAK5XXwteqW/O45sOAKS/VXFPe28me8D2CXhmAvXTx3hAGfJdL++FX4ozr1cqjIK5p1wjiiNYGNHezLWu9Vt+0DEpbD8RSbIusPiKuVyp7aTCmx9VulzdeA3xrut7oHDJ1kXopxoq14q02x9ErAzCwhBnXnhDm9VZdy/ovB2TNJq8sKxPMhwAAEv9JREFUcDuAmE/Rc8V7p6g9CGddb8Tx9v4f671/GTfslb1EGIe0N3
jXKs/EZeEr6W3BF3iVAC468ppwDwgDNgRpm0i55Ya9GKmci77Gzq64VRH32rNvC8Jw5tIK4+j7+tLdMD2PlDdBl3dwBPD2nIcV540r1UdjpDgv3mpzPL1iAAMvC8LAHLDaurAWF3XJUl+9Qez1W+mKqfZaK+7lejPO2poLl7TZK6st4OvF9ARwpo0rze0C6wAvCsIad7sHfM+0M9UrB3BPrQ1hXh/98JCFcE0by5UDtsu22iRBTLvlrKfDPbFy6zhTV3OeKYvUSbFUXru1HLBU/lIgHFY21WzVZV1vdMzWvrgyfwg1aWdNrx++wJsAcC8XrMVvBeEawGXcs9dGcsNenFQu1TU44r2ccKt6OOAS21s94CuV9TzfxO0CMejW1q2VPu4xn0wfkXopJhOnxXptjq03AGDg5UMY0B2oFpcFdG2bmrVgK21d1JCeLl1tAWb+q5T+PLQ/MQ/CMOrX0lrwlcq2BnBKreC1ANUK3tpxrf5O+O6hNwJgYAkOKgvCUptaCNO+WiDd6mx5/zWuNQri2v7pBXBFGK8B6l4QLvVwYnrIew+yYD4KgFOqWd/N1K0J3pa66Lxq6qWYTJwW67V5GXpDAPZkXQklN1wDYR5jgUqrp/3XuuFSFwWlVRcBsRSbqatcK34p8iAcjWkZvzamFr5SWU8ghxXdzdxS1wPeawC7Fa418NV+UW8PvsCbBHBNOlpr1wPCUkx0XZifR93wGnXczUbcc7auwhXTaZZujgjnPSAcfR/WcL9SWet5Smu73V79tIx/wvfoeoMABuohHI3fAsLA3HVGgZ2po2N4dTD6bYF9GYfXVbhiaQgNzl69FcPLpLY8XmqTiemhPdLQUtlq4LXcLrAOeNcaZ6v51dRLMZreLnyBNwtgoA7CWps9IMzra4FpOU7e1qrz+q1xvbzeSk8DTSnqvZzx3unn0n9LXO/0dFenC2yTZm7pi9e3OFur7Vau9lzzjeoNAxjYB8JAPWQj9dk1Zq2vCIhLfcQRg8RE+uL1EVBXwphOWdPeqes1IXw0AFt9hhVNMXv1e7rJI82zpo9MX1as1+bl6o0DGKiHMIR2EQhLcRHI0vE8sPYCbbbeAyRvX+O+o2M1wBjwAbA3kHsp8xqyKWitvLvDLap1utl6z+FFHGCt462p791fNEaKy8Z6bV62TgADqF8TjsA1GueBT+qntxves74FtivDGGiDxJFhfQQAV8tb0wX6gm7ttdO9wVszZrQfK9aKt9q8fJ0AvusIEJbiIm4ZyLlhK36t+hLjwbMVthvAGJCvFzWgPQKcW1PQVl3315bZSKVNwIIyr+8Bqd7p3y3mXBujxWmxVrzV5nXoBPBMR4YwoENP6kdyw7TNViBu6SOaoub1Jcaqp/0UNQAZQvdliB4QWhPUa7vgJnHgSgOtsX65hXtco4+11nJP+K6hE8AL1UI4Gt8Ca66sW65p0+rAtT5oTKQPKZ1utYnOi8Z0csdUGpSj9VbcmoqMt+qcjgremj4jMT366AXWSMwJ3x46ASyqBsJamzWdcCTGg1mkjec8M320jJMZt8RYrliKKXEcAJ2ADGE4PvTeKemiTecRAS6wHiB7p4l79pMFb6RNbUwmTou14q02r08ngFXtBWHAd7Geo4zE9HCqkZge0LTmL7Up7WpioMStCGSqVujxP7OjwHyhWuBG49aAbm3MmmNtvZZ7wrenwgAehuEC4KsA/vs4jj+63pSOJA/CgAxWCO0i8KKxEcBK/WVjom3o+JkYGtcjRgJmDeS9vr15bgTkrA4JXAm2QPwCL8XWAmNvt9grZg9He8K3tzIO+CcAfA3AN680l4Mqshabadd7XbinG6ZlvUAsxbUC04rJuGmpbxobSVeXWA0yBwHzZsrAFshdqPeGrhS3pmvX+tra9faIteKtNq9bIQAPw/B5AD8C4K8B+POrzuiQqnHCVru1U9JSXBSWLY6YxkUcpxSXjSlxEixr2vWIpW00IAEvE87W6wHqLrKtF/0IpK
S2PUHZu79e4F0jLhurxUfavW59FIz7KQA/CeAzLWAYhi8Nw/DVYRi+Cvy/LpM7lrb4I4mO8akQu7Vj2CLuSYhreU2R/mtjtXlp7YAJZvTfERWZo/c6vfcnEq/9LdT+nqOAjv4/q53va4GvpRO+mlwHPAzDjwL4+jiOvzwMwx/V4sZx/DKAL09tfu9RryaN8pyw5oIhtLPWkGtja1PCUlxtWprG0disI65NYXtxkbSy1F6LteK1+Wnta/7bZFx0y3/LyEW3xgH1cMV7fGDcInXe2rb3nLXYmnirzdtRJAX9AwB+bBiGHwbwHsA3D8PwM+M4/vi6UzuqaiBstYumozOxWhywBFukv9qUM43tuU7cGkdjvfZWH6WNlYK2LjIROEt9UrV+1s26mchFc+00tFa+lyvcK4W+RpwW2zPeavO2NIxj/D/wzQH/RW8X9OSAv9Q4taPL25ilgVhrl4lvjW2J69G+95zWeD+scu9za+3fRk1fPZW5KG7tiLdynNF5rbVZac8PAj1ivTZeu9eiL2Mcf8tNT533AVdLSy0XWSnpqLvV4lvT19m0NI9tbR+NbXG8tX3S2Izj1cblffI5SPL62kIZZ+zNseZi3MOFvSRwrbEuu0fK2WrjtXt7SgF4HMevAPjKKjN5laqBMIQ21jpy79jMHKLrvzSWxnuxrevKPWL5PLQ6Kw0ttePKAHoPtaaevT7WTHseOXarlHa235p4q43X7m3qdMDNstaEa9tZa8nR2B7OOdpvBvqZufX64JHptyji5Hmd1yfv1+pH629NZS+QrWnoninqI8C01fFm+zjh+9J1AriLPJgCOSdc2rVAxovPuGHet9ZvqyOm8a3uOTIPC57ZjVjS+JF+ubTfyx7qmYbufeHeGtC94o8+v5p4q43X7m3rBHA3eU7YcoNQ2u61jizNJ+Mkax3mWqlsq2/apgbIUjve1roAeY45I+l33zOdHb2QrrEmbLVby0326rtnP3uA2mrT0u7UCeCuqoWw1bYHKHvHWy5NA08G6Dy+FsY0PgpXC6yek41unorCWWujqRa2tRfJtdeFezqxPSDda1wtfgsHWwter+0p4ATwCrIcLVAHYatdjRuW5pcBsQYsq/8ecI3Ms1f/VNH0s7cRSxrb6i/aZi31XhN+ic64Jr6nw9zrtVltvHZe21NFJ4BXUw1Ma9tloWqN0xvcUputPgRE21hOl7fz2kbaS/1Y/UUl/T57Xwh7rgtvmfLcYh30iODt3cZr57U9RXUCeFW1QBhK2z3T2L3b1K4V8zatMLbG8ebnjZnpR1Lmv2jrhW/NFPYe7niLVOzeHxR6t2lp57U9xXUCeHXVQthq23Nnde9xeoLY6q/nB4FoO6+t1l7qR+vL6vsIylxgX0pKurbNVs7yKJumTvj21gngTbQGhK22NQ66JwStNla77KYtq02NK+btvLaR9lo/Ul+StrwPGKi7iEY/JLQA12u/VQp2S7Ad4fVG2kban5J0AngzeSAF+q4LW+1aXGItvKXxrHbZ9DRtw9ttnW723geujGveUz3XfyP9Hckhb+0oj+J6vbaR9qc0nQDeVBZIgbbNWVDabrWeTMeyxpPaRjZsSXU9XG4Ph1u7Bmz16Y3RU61p7ugFODLOWinQLd3uGnM5Wjuv7amITgBvrlYIw2i/tRsuWmO9NeOKo31KbaOvQ2vP+9D6kfqy+vTG2FrZi+1LTUuv1WdL2z0AesJ3C50A3kUtEPbar7kJaY8xpba9gJpNN/P2Uh9SP1pfWp+a1loP3mr3dI+0tNfPGtBt6bel7R5jem0j7U9FdQJ4N60JYa997zXlaFso7VvWwFv71dpG2tM+rH5oX1TZ/35HufD1XguO9rmWYzuii2zJeOzV9lRWJ4B31ZEhDKVtK7TWcMTRfmv7bkkxZ6HsjbGFWi7CWwE30ofX/qiblo7qfCN9nMroBPDuikAU2N6VemNvAeKasTMwXXvNV+pP69Mb42h6iWvCrf3vCV6v/dpjR/
o4ldUJ4EPIgyiwHkij7fccW2sfdcVavQfjSB+8H68/3ifX1vf9Wqq94G59u1Kknz3h9dLbR/o4VaMTwIdRK4Qjfay1uau0RaC91sfaIG+dH+2jKJNejv5Xi1zoekC65wU169aPAN3IGC+9/RZzONWiE8CH0tEhHG2Phj56gVjro8bRrgFkr29Le14Ua1LjW64LR2L2BudrmcOpVp0APpy2gjCMPnruDl4LpBnH6rliKyY6Fu/P61Pqm2uv/55bbMLKjtXqdiNjHQF6R5nHCd8tdAL4xaoVwhG1uuGj9BFx5ZGYMhac8XifkX6tsSS1/tftvclrDfBG+3wpwHop8D21lU4AH1JReK7thHv2AaOfHm62xzxojBcXdcVSv5H+Pe15Ea1xR703Zr0maPaay1av51QvnQA+rLaCcLQPOP1EnWxrP703U/WGsTWu1X9knK3VeiFe44Edrw2aL20up3rqBPCh1SONHO2nVyoYG/UThXmPfjJxdNyizH8z7wLYE9C9L7Zr7IbOxL1G2B2tn1M9dQL48OoJTzh99YBw6QdOXz0ButX7Q+MisXR8qpb/dke6SK65IzoTezRIvdZ+TvXWCeAXoV6QifTVC3rROW2ZIi/q8WznbGzNXI6mrXZHv2bw9uzrhO9L10v633/KVRTCPZTZNbwFhKP9RPvK9FdikYinc6E60n/JHhu99gJvtL/XDN9TR9eR/refMtVrPTjaV09I9UyRw+mr9+1EWbCucevRFv9Ne+6qzsIhE98TTq8dvifIj64TwC9KRwRnpi8E+juyG0awz5Y2kiIXUu81rH3bUs2FvDd4M32+VPhGdd7r+xJ0AvhV6qgQjva39caqmodr1IA42y6jPS64te7pNYA32le0v6P2dWpNfbT3BE5l1fvisHVf0f72uNA8JfusdX21bY+glvln270V+Eb1Uv9mTmk6AfzmdeT1pCiEe88t+wCJFif4UmDc43VGlf0gFO2zl97S3E6tqRPAL1J7/MeO6rU4iLUeKmG1P9pFsccHhGzbNZ6edeT07gnMt6wTwKewj3PtrTUuZFtDuPSx98W21xzWfB2v4W+2t06YvzSdAD6V0B5rwcC+bqIGwj1BvOXFsud4Nf2skbE58rrvCcy3rhPAL1bnf/LttPYjFyN9rfX7WQP0a8P3yEswp07FdQL41Eo6+geEl/iB4wigXKPPE5Sn3qZOAJ9K6rxYxrWma21t/xI/gAD7zfvoHxTP/5cvUSeAT70yrXUhOtoF7ogAPeKcPL3EOZ96LRrGcezf6TD8TwD/tXvH6+r3APhfe0/ilet8j7fR+T5vo/N93kYv8X3+znEcv9ULWgXAL1HDMHx1HMc/svc8XrPO93gbne/zNjrf5230mt/nMwV96tSpU6dO7aATwKdOnTp16tQOOgH80Jf3nsAb0Pkeb6Pzfd5G5/u8jV7t+3yuAZ86derUqVM76HTAp06dOnXq1A46AXzq1KlTp07toDcP4GEYvjgMw38ahuE3hmH4S3vP5zVqGIa/PQzD14dh+LW95/KaNQzDdwzD8AvDMHxtGIZfH4bhJ/ae02vUMAzvh2H4N8Mw/Ifb+/xX957Ta9UwDJdhGP79MAw/t/dc1tCbBvAwDBcAPw3gjwH4bgB/chiG7953Vq9SfwfAF/eexBvQE4C/MI7jHwLw/QD+zPn3vIo+APjBcRz/MIAvAPjiMAzfv/OcXqt+AsDX9p7EWnrTAAbwfQB+YxzH/zyO4ycA/gGAP77znF6dxnH8RQD/e+95vHaN4/g/xnH8d7fj/4vpwvXt+87q9Wmc9Du308/d/p27WTtrGIbPA/gRAH9z77mspbcO4G8H8N/I+W/ivGCdegUahuG7AHwvgF/adyavU7fU6K8A+DqAnx/H8Xyf++unAPwkgM/2nshaeusAHoSy85PsqRetYRh+F4B/BODPjeP4f/aez2vUOI7P4zh+AcDnAXzfMAzfs/ecXpOGYfhRAF8fx/GX957LmnrrAP
5NAN9Bzj8P4Ld2msupU80ahuFzmOD798Zx/Md7z+e1axzH3wbwFZx7HHrrBwD82DAM/wXT0uAPDsPwM/tOqb/eOoD/LYA/MAzD7xuG4WMAfwLAP915TqdOVWkYhgHA3wLwtXEc/8be83mtGobhW4dh+Jbb8TcB+CEA/3HfWb0ujeP4l8dx/Pw4jt+F6br8L8Zx/PGdp9VdbxrA4zg+AfizAP45pg0rPzuO46/vO6vXp2EY/j6AfwXgDw7D8JvDMPzpvef0SvUDAP4UJrfwK7d/P7z3pF6hvg3ALwzD8KuYPsT//DiOr/I2mVPr6nwU5alTp06dOrWD3rQDPnXq1KlTp/bSCeBTp06dOnVqB50APnXq1KlTp3bQCeBTp06dOnVqB50APnXq1KlTp3bQCeBTp06dOnVqB50APnXq1KlTp3bQ/we5egeI3ld27AAAAABJRU5ErkJggg==", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "heatmap(grid, cmap='jet', interpolation='spline16')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's define the problem.\n", + "This time, we will allow movement in eight directions as defined in `directions8`." + ] + }, + { + "cell_type": "code", + "execution_count": 71, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{'E': (1, 0),\n", + " 'N': (0, 1),\n", + " 'NE': (1, 1),\n", + " 'NW': (-1, 1),\n", + " 'S': (0, -1),\n", + " 'SE': (1, -1),\n", + " 'SW': (-1, -1),\n", + " 'W': (-1, 0)}" + ] + }, + "execution_count": 71, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "directions8" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We'll solve the problem just like we did last time.\n", + "
    \n", + "Let's also time it." + ] + }, + { + "cell_type": "code", + "execution_count": 72, + "metadata": {}, + "outputs": [], + "source": [ + "problem = PeakFindingProblem(initial, grid, directions8)" + ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "533 ms ± 51 ms per loop (mean ± std. dev. of 7 runs, 1 loop each)\n" + ] + } + ], + "source": [ + "%%timeit\n", + "solutions = {problem.value(simulated_annealing(problem)) for i in range(100)}" + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "9" + ] + }, + "execution_count": 14, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "max(solutions)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The peak is at 1.0 which is how gaussian distributions are defined.\n", + "
    \n", + "This could also be solved by Hill Climbing as follows." + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "206 µs ± 21.6 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)\n" + ] + } + ], + "source": [ + "%%timeit\n", + "solution = problem.value(hill_climbing(problem))" + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "1.0" + ] + }, + "execution_count": 16, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "solution = problem.value(hill_climbing(problem))\n", + "solution" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As you can see, Hill-Climbing is about 24 times faster than Simulated Annealing.\n", + "(Notice that we ran Simulated Annealing for 100 iterations whereas we ran Hill Climbing only once.)\n", + "
    \n", + "Simulated Annealing makes up for its tardiness by its ability to be applicable in a larger number of scenarios than Hill Climbing as illustrated by the example below.\n", + "
    " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's define a 2D surface as a matrix." + ] + }, + { + "cell_type": "code", + "execution_count": 73, + "metadata": {}, + "outputs": [], + "source": [ + "grid = [[0, 0, 0, 1, 4], \n", + " [0, 0, 2, 8, 10], \n", + " [0, 0, 2, 4, 12], \n", + " [0, 2, 4, 8, 16], \n", + " [1, 4, 8, 16, 32]]" + ] + }, + { + "cell_type": "code", + "execution_count": 74, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAeAAAAHwCAYAAAB+ArwOAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvNQv5yAAAIABJREFUeJztvX/MdV1a13et93ned4AMZZpIUueHjka0NSRCO1Ia0taMNB2RiqZJiwYTf2WSWuPQ0FLxD9umfzVNiH+UNHkLRBONaIttLdUaGiGUhCIzCAYcNRMYwxTiSA2BSWfeee9ndv84977vfda5fq9rrb32Ptc3eXKfvda11tr3eZ7nfM73Wj92WZYFUqlUKpVKjdVre99AKpVKpVL3qARwKpVKpVI7KAGcSqVSqdQOSgCnUqlUKrWDEsCpVCqVSu2gBHAqlUqlUjsoAZxKpVKp1A5KAKdSg1RK+WQp5eursj9SSvnRgL6XUspvae0nlUqNUwI4lUqlUqkdlABOpSZRKeXdpZTvL6X8s1LKz5dS/vSm7mtKKT9WSvmVUsovlVL+u1LKG491P/IY9tOllM+UUv7DUsrvKqV8qpTy7aWUTz+2+f2llG8opfzjUso/L6X8WU3/j/VLKeVPl1J+rpTyy6WU/7aUkp8fqVSD8j9QKjWBHmH2vwHATwPAewDgdwPAt5ZS/t3HkFcA8J8AwK8DgH/jsf5PAgAsy/JvPcb8jmVZ3rksy199vP6XAOCLHvv7cwDwPwDAtwDAvwYA/yYA/LlSym+W+t/oDwDABwDgXwWAbwKAPxbxu6dS96qSZ0GnUmNUSvkkXAD3sCl+AwB+EgC+DQD+x2VZfsMm/jsA4Lcuy/JHkb6+FQD+7WVZ/sDj9QIAX7Esyycer38XAPwtAHjnsiyvSilfCgC/CgBfuyzLjz/GfAwA/utlWf4XZf+/Z1mW/+Px+k8CwL+/LMvvbnhLUqm71su9byCVujP9/mVZ/s/1opTyRwDgTwDAbwSAd5dSfmUT+wIA/q/HuN8KAN8JFwf6JXD5v/sxYaz/d1mWV4+vP/v4859u6j8LAO809P8Lm9f/BADeLYyfSqUYZQo6lZpDvwAAP78sy7s2f750WZZveKz/7wHgH8LF5f4LAPBnAaAEjq/p/32b178BAH4xcPxU6u6UAE6l5tDfBYBfLaX856WULy6lvCilfGUp5Xc+1q8p5M+UUv5lAPiPqvb/FAB+M/gl9Q8A8J+VUv7FUsr7AOAjAPBXkZhUKqVUAjiVmkCPqeJ/DwC+CgB+HgB+GQC+GwC+7DHkPwWAPwQAvwaXxVQ1/P5LAPiLj6uY/wPHLUj9AwD8r3BJS/8UAPzvAPA9jnFSqdSjchFWKpUSVS/ySqVS7UoHnEqlUqnUDkoAp1KpVCq1gzIFnUqlUqnUDkoHnEqlUqnUDupyEEcpX7IAvKtH16lUKnUA1VuoLdfaOq4N1r61X889cvfDxWvLvGNGx
lEWXUXBd0bwUvEd4Kt1txqX3DsFrZXm/rRjMOnn14S5Xqyccr/efb9c6lm/CEsP3vrLQS7COpVmdL0j0s1S+6h0ptSmF3xboOr5rxvpekd/qdK+B53hi7XZOwVtSTVbvgxYwEw8fEF69q8WctLTizSPJ9Rca+/LOhdN95OaWGd2vS3g5dr3dr3WfkbCt3WB2IzgpdoEud66K9HtCT/rMq1bjkhBv0RiW+FscdzC6udt+Tb9TG0T4tyvZ95XA99R4H3uL3VSWf5qjzTXq70HbZue8KXU+sXK0ucR0s1c253ga3GP2vrW/jz3tI2L6k+Rfq6lcb+adLJFLY8njLoHud/UpPJ+SM/sejX9zDrf6+mnh6PV9sn1q2lridkLvADd4atJM2vipdR0HYfVczEWp2rtT8wKVCB9mgMm0tLI4qtbd3q78Mrifq0PVKCcby/wPvefmlAj4Hsm15vw1fcrtbPGjPp7CXS9dXceF9ySgpbSxFoAasDpTWfXZeh4G/erWP3MLb56ijGcSuWZ97WknXXnTbc669Rk8sD3zK5Xat97sRXXV/SeXE+/1ri9HS/XfoDrrbtscb6aWCyeKueAisGU6lsDbK+TvrnW7/0FuH3wwuW17H6pWKneA1/P/mGvEsBT6UjwnRW8XDuuzR7wbXHJVkjv7Xi5tpbfJSjlXF974Outt7rdWl6Ycn1Z7uklkO5XWnx13Y3N/VoXXnHwtbreXg9lSABPoZlSziNc7yxzilJdLzdL9R2519cSv9fcveW9CXS99bUFpJo4rcO21GmcrhamFjhz9xDofuuVz/W+4K00qWcrfKPAu8bnPuDTa1b49nS9Uvs94Wv5++jx3877ZWGmdDMVPxi+UqymP+0YotNkrrE2lDzpZW3MozTuV3uq1U3fwlORtCueo+Gbc8CHV++081ng22N1dG/49lhIFeWS91hgxbXrAN6628gUdE8IS/+NeqSgpb6urunFV5L7xeRxv1r4ck9J8oA3OhWdAN5VM8B3dvBK7T0f9r3TpSNje8DXC16urfXLyyD4asqpWGkMrp5LQdf1VNo4MgVtdd8vH57g+9rLVy73a32EoHRIRit8W58L7FECeDclfMeDl6uz9jcDqK3wne2L0ADXW197X2vdcA1N7rVUV/cttbe00/bFud9KGvcrPfFIcyY0DnD9nG/L1idKt18Ocg74ZNL+VY1a5dyzfY+2vV2vNX4kfO8EvFjXo+Bbg9QyDlanAaPkbDXtNOMFul9MlscRtsBX63rzecCnl9X9RsK3t+ud7cPe2+c9wHcG8AJ0gW8LiLX1GkCvr6Vy6p6tDpXrx+x26zb0sZOt7pdzpdp9wRb4zvAsYIAE8AEU7Xxb+2n5J5Pwtd1Ly4EcVHttPwdyvXX3rS6Yq8dAKsVqyjEnTckLZqovSzvieEkAm/ul5nU1W4Z0q5Nj4OsFr/ZfdgJ4uCygPIvzbfmwbxk3ar6Xa7O386Vie3yZmdz11tfRLtjqbuv6+l64txODstaxcu2ofiRnXq183j7z1+N+r4ePm/e1zvfm4wjvSrPCd8a0cUvbURAZuThL22+C9+Za40StEObqLUDWALO+d6y9Fs5cO/H6ee4XwLbv9/KaBqwmfdwC39ZHEebTkA6vM8J3Rsc8m+u1xrfM+Ub+7t52ncGLDRHlerevW4G8LZOcM9a2h5P1tjPM/UrP+6X2/Nb1dVscuDjora434klIdZ95EtbpNTN87831RsUfAb4Tgbcua4FvC5AxWHJtJAcMVX19rYEs1sbV7hEkiPtd4fuyWgm9PXKyJfUcDd8I8OZBHIdUtPvtDd8ebXumuaPv5+zwPZnrra81r7fXWvh6HTD1WqqTwKxpY4Uz2ub6zOeI1HNv+Fpd70joXo+bmkizw3d0O6ntyPsZPT/cAt/oL0EHAW997X3d6oCxsggH3ORkkXHEsfCFVwCX1PPLzUKsp/IXr9jjIr0PYvDC1wve1nOgMwU9jaK2B2nV0/l62iV84
2I1GuH4uTYHhy/X/7YMA3PLeFp3zMVIYMYkjVVtO8IeK7iWY+c96/fzao6i7A/fkedAX+4hNYki3O9ZnK8XvFy/e6ape8RGOd8JXS82jBZeXGwP51v/pNpJ5dK1xrVyfXjc7wpfwf1KTzvCDtywH0WpHyMSvB7oWtokgLtK637vCb735Hqt8Vhsr7RzFKwHgle6toK4Bcga+FLu0upePVBtgfVVDL7war1+ek3s+W2Z942Gb9QZ0JFOOAG8u44I35nAK/V9lG1J0fDt/aVjMHjrsijXS732QJgro0BN1Wmu63G4NhZYr+6XOe/5xdYBC6uePfO+rfCNOwmr3wIsgARwR42a+z06fEeDl2vXG1qtsa3wjfoCcWDXu329lwP21PV0wFf98Ht+rauet9D0wtd2KIf+MA5L2XW9fDxlLsI6hCLcr7ftCAfoHadXn7O5Xio+Gr4nBG99PaMDll5b6rbX0bC++R0u7pc7bnJ74AYH2bX8qe0g+FrBu8eDGAASwJNrpoMyPP9UEr72+ISv+ZpzxNr4CAdsuTfp95F+JyuIqTGv2mwWXj2K2vMLgM/7AvBbjq6HHw/fqLOg63FqpQPeVb336krtZ04731PK2Ro/I3w7gxcr04KKi/W4XazMCmGqHeeGLe7YDFVFPy8BLHt+63nfp7hq0dWl62tXbHW+ONjtc715FvTdaMTc75lAJNVxfUptR71PR4HvSV0vV6d1qBEOmCvD+rPUuaAK16JgrThu8vkaXxRFLZaithtty6LhawFvyznQEQu0EsC7qMX93hN8vb+rt21vp2+J9UL1AK63F4j3dsB1mfTaUre99sKaBfbtcZPcWc9ah7uWX35yW5H4uWEshhqnfo1f51GUJ1SE++3hoD1/zb3hO5Pr7Q1eKt7y3tTtD+B6sSF6uFyubk8HrC2n6iwOV9OmHgcAsAM3uLOeuXlfDJ4R8G3bB0yvkK77ofqw1gPkHPDEannLjwQXLn6WRVbediPhq23bCt87c73b1xEO2ApiCpQRoFW3ked9uf2+1IpnDKyt8PU8H/j5euxJWBYlgIdKervP4uyoeO/v19J2RIqdatP63kTDdwfwYmU9XHAUcLVlGuDW9VjMaBd8c41vOeL2+0rpZaocg69mvtd2AAfteFsO5MD645UOeLBGHbwRMe4Mzk4zbmvb6Dlxzzia2JZ/OwnfrvD11FkcsFRX9yO1V7e5Pu0K4Bq06zW23xcAX/EslWM/a/kXZPUBb889wAAJ4IHqBRlrmyPBd5Z2o7MCJ3C+kddWKHuA3OKApf5ncMBXZbenXT2nmh/QRVfPQ8grnqlyW2ral3JuOYyjbl/LshVJ+78rATyFjgQMa9+R6fHR7Xq/973hS93LwEcGSjFncMCUg9W8puq8oKVinsqutxxh8F0lzftqVkK3wNfrem0HcfTZhpSLsIaqZUVv9JgJ35h2vTMIkfDdwfVaQWu9bgFxtAOWYinIamK0YMbirTEIfFdJh21wK54t8NUutuK3IdnAq4EuBdxchHV6zXYgBBXfCzBSPzO1S/ii3fe8jn7tdcCan1SZBGWsfQucWWeMH7ahge86v2uFLwdXTcq5F3itW5BsME4HPEi93G/vvxorNLSxkYd0eAHqbdvbKbd+sZHg2xm82BBRcPXWeRxwDwhHu2APnFlAX5/zvD1so4bvqtHw5ff/0qCmYuv4uk0dqynX1ucc8CG05wrm1r4t/Y6E78h2rfC1jKdxvrUmhi/Xd0/4Yv1GOGBuDO6epN+bi1e34Q/bAADypCuAa1heutTvAb68bn8kIRZ7W0873lmfCZwAnlJRfy0RQNXGzuB8Z3S9VLm2D+37X8ftuNCqhwuOeK0BLVfXwwFTdRKo1a7YftgGt7KZ2+srzfl64RvxVKQ6DruWyuv+OeUirCHqkQb1jGf5a9wbvrOA19uuF3wj44LgK4HWeu2NjXTAvSC8vqbgzMFaE1vXm4BNH7YhwZdatWxNO9tWQceBV14FT
UG47XGEOQd8WFk/4C3xCV+5XZRTbn1fNHFYzCD4WuHcCtvtdZTrxcq8UKbiuPvzulx1m+fDNqSTrrC9vq3wxQDrfRyhNB9MxWqusb7qPjXa9ptzwN012v1aZF2BrInV3nNvsO3RrpfrtcRK8D3IKucernf7ugeE19eacovT5eq8cDbAd7viOdr5ahdbtYI38jGEe2xFSgBPpQh4RS/2oeIitipZ47k2PdqNdr1Y7I4p50iXy8Va46LqPU5Y44Kp19iYFhijoMXK8GMmsWf7RsK35WlImm1Iaz/bmNt63aKsug+qjbauVs4Bd9XILTAjF1KNgu9IiHLtItuMeu93gm/LtQfE3tfWsigYe143u1ysnX67kXTEpLTVyLLYCk9H6xZjcTHc6227bdvrehuE6z5pJYBPLMtf255/xQlfXWzCtxt8sbG5f0peByy95hwxV68tewmg3W6E7fWtYUo5zFtQ43VyOlq7Ejr2TOg6jiur+9FoHSvngHfTCPfbIzba/UakqLk2Pdp52hwYvhykPNd7gngv57v+1Dhdiwvm6g3MUpMAACAASURBVNG42+1G0gMWJPhiKWQKol74jjgPuo7Dr2MewmBVAtis3guoJLW637PDd5RTHrUqHYsZAF9rfStsqbojQFh6TdVpXLEKyPIZz9ijBTXw9aSdW07Fouq3P29f38Kai9+22SpyEVbYHHAp5YsA4EcA4B2P8f/Tsiz/hfmO7kIzud/Z4XvPrlcbOwi+kde9nS71OhrC2nItiDWwxa7VMfTTjVrgSzlYySljbdb6y69gT0djsdt4Kvb52r8Iq9dKaI2degsAPrgsy2dKKa8DwI+WUv7Wsiz/d5c7mlozul/tPZ0ZvmdzvVRcI3wlV2u9jnS61Os9IOz5aYWvxuVKMFY8WpCC76oe8OVWOUedBd3rHGh+FbR+PjjMAS/LsgDAZx4vX3/8o+v9rhQJjh77eL0aBd9oiEaPtSd8B7jeumwkiC2Qldp5wdwLvsDUUfVYTA3sRvjWoJXdMA1Z76Eccp2clqbint9GzhFjAI6ZCw7dhlRKeQEAHwOA3wIA37Usy48jMR8GgA9frr5MeZupNo10pj37PxN8oxdlnTDlvL2m/jqOAt9tn1I5d611vVtVe30BAOpTrgBk+D7FbWC6jXu+boevfvWzdTGW9yhK/Vww1UeLVABeluUVAHxVKeVdAPA/l1K+clmWn6li3gSANwEASnn3CR3ynqcrRTpaqj/LwRJSTAR8jwZeKr4ldjL4RoO41e1i9Xs5YK0jphytKeb6oI16ry92vjPlcuWyW+fLwVWa65UXYN3GrOXXP+d+HnCXgziWZfmVUsoPA8CHAOBnhPATadTc76iFV5oY7xeBFofMxUvtotvsfQBK8Hzv0UDc4oAtdVIbKX5H+FIPV6AWXGlgqpnvleaHL7eqB69tLljniLexdTweG7P4ah2zwBdU8eInainlywHg7Uf4fjEAfD0A/Demuzq1ot2YJjYy9eyFb48tSVT8yDYHX2iFddsLtlxs9OuIsh4/LfDVwJZtcw1fas53PeGqhiq2pcgz36txyWvdtvw2/rYe63OVdv7Xsh2JLuMXXElp6siDOH49APzFx3ng1wDgry3L8gPK/k8gr/u1ttsr9Rx1H63w3dv1Wvu6M9dbX2tee9pY4YrV94axFcQt8H0qv4XverZzDV89aHGYavcDa+Z5ow7iwCAqwZlq93zdby9w5Crovw8AX20a/TSSIOoBpOXDHFPkgQ5Ri65aoLM3eLk6S18tsQdyvVGA1by2Aper88LYGtMCW6yMWO3MPdXIstKZc728E8ahvMbj5Zo54Fsob1/bUtD4Iq26PdaWKsOEgTwfxrCrIuaMW/5qerr2yIVZnM4IXyzuQPDlxu3xmqvn4jwxLW3X19L7iMVLZQA38N0Kc74ANvg+t7mO3ZZR/W7L6njt/PDabq27Ltc74tvXEXPAur2/df8J4GYd1f16nWdU33s4X+7vo
udcLxW/03zvWVxwRJnWAUswtThf6rXVCT+V6+d8JedLOVxubrgue+7bcoa0DF4rdCPngK3bkKypaEkJYFQt8N174ZVGHvhiioZv5Pt64hXOWLezwnZ73RPCHhh7fmpATIFVE8PAl5vzlVPM+GIry0Ir7dzwtp9tHVa+7evyuu04Si1sex5D+QJexa2CTlkUCYmo8TWQlNpg7TyA5sa3vke9wUvFt8Z2hu8eYI54HQnhKBh7QKyFLdamEb60y+WBKjlkrLwuq/u9/Ho+8GqgG3Uwh6UM65tSPo7QrR6pZ4u8zsrrYj1zuh5AjziIY5Z0MxY74Nm9vcBsHYOK8QA5CsJUmygwd4Yvt+BKcqhSGplOQ9PpaWnrUn0P2z7WOiy+/sm5XCtwpXlfCqyWIyhXpQN2aWTqudc+Wk4R874e+FKKAKDU5g5cb30dFeuBsgeylnoPhFt+auta4SusduZOt2p1s1pIY22x2G3Z5de1zAnbFmVhMfxrec6Xd786GKcDnlotqWcvpD2p517umIrjxrTGn2SRFdZ1D9hysVEgbgEuVtYLwtY6C2yxstX1ApDw5U63srhZ7aIsy57h27Fo8GJAtUJ35Epo/yrodMBGHWXhlbeviL49/1wSvk2SoKmN7V3XCl9sjL0csKWuLudgXJdtnS8ACd9VUtp5Kw6qT/0xK6Iv9T74tmxPWsuuf/pWQ2/bcPF1m/p9lISBOx2wSdGLoaxjaf8aNEDQxESknkdsSZoJvFR8J/ha3Gp9HRXb87W2vgW0UozXDXMuWOuEt/Bl0s5W5yutdLY4ZMu88GXM2O1Ja911OT3/q4GtdCBHHa8px5T7gNXyrgqW2re+tVFfCnqlnq19pOtVqwdALbF7OOBoCK+vPS7YA2QOtliZEr7cuc6eeVzLPmDPyul63Ou28opnaVFW3X4VB+Y6to7Druv2t3U8jBPAYfLA1xLf8oHvWfwUsZ3Iu+hKuheuP+v2ol7gpWI7wHcEiKMAq3kdAWEPjHtBd/uauq7LjPCV4EhBFUADbQq0Nkg/t5PBq0lP4z/1Z0NrD+LAAIuB1bcKOgGsUK8tR1aAeORZIBWxqKpXahqLk+JPssIZ6zrCvVrqtH1Gwnc0hDEoSrEcdKnXXJkRvtK2IgmqFjdribv8apq0NJeGjlmUhcXcvn5Ay7dtqXpt3VYJYFGt8N3zxCtPTK95X0k94Huifb1Y1xHQ9NZp2mjKI4HL1UltPIBuAXEwfD3p5MjUtG5fMe/A6/aXt+YW9GubtX6tu/5pe0rStp47kMOaggYAePmKhnECuKsiUs97K+Kv3gpw671Y3reR8A2UBFht7BHhq+kfq6PeI+09RsEXqvJG+FLywBfvIzo1LT+iUOeEr/tay65/6t0wBl0JuOj+YAawLx6EOeAlAcxoZOrZEjvS/Upjt7rjFud7hynnumwkbLVx3tdWSEc4YC90ubrOzhfAttrZ4ny51LFvNTTvrJ+v9eDFoFq7Vg641lXQNWAxqL540O3nfVHxu+ia3SOAR6eeo7cdSTERq56j4Gq5B6mvXk5254VW0rU3djSIPfVRdRbY1tcW6G7LpL6N8PWudo6MsY5/iZcd83WcvCDr8jbiQN5em1ZBb2C7BS0G2Bqmz33g5ZiKzgDfG4BbU8E9U88el+rptxWe0j8ZDeQi4HtC11tfRzvdPVxvSxkVI7XRwJark8q41zWgAVTw9a52HhEDQK+OvtTZU9PPfV7Hb8vX+LWsjtn+3MZqYFtDtoZr4aZ/NYdjJYBr9drvK7XTxHpdcoQTbe3TWm9xoXfqeqXrVsB6+tvbAff4GQ3iGr7MgxU0q53fAZ9nIPcK3gFv3bR5Ca/gDfj8pjwC0PqFXJe3g4f0tv02vi6XoFsDV4KtCFkMrJzr5UCcKeitWuHraaeFYetpUZa+JbiOrsdiuHLLe90aO2iVs3TdE8RU3BkhbK3zvH76077aeYUvBrU34K2bNu+At1DobfvZguwN+DwB1pgFW
nw5nX7moKsB7tbZrsC9Am0NzRqwGFR1x0Ff95UOeFUEfKNWMXvf7oiFV0eBb6859hO5Xq4uEsqt8PVCuBXK3pgWEG/hq3iwAvcsXws0+6Wl7XuHKfDWZXV/6/Xl7SRccgXdGrhbd4sC9xVSRl1TrlcLYoAEsF4tb0EPKGju5wx/ba1bjA6ecra2p+q0wG1p73G72HhaByy11wBbM04kfG/GtW01Arhd9ftc/gylOv42rl453DflrJkTrsu2/UWB9wm627eEgi4VU9dpyqnYBDBAjHPde89vxIlXs7nf1rRzK3w7ghfrfhbna+2Lio92w1EOOLJOA9yb1/KcL0D7VqOtEwaAx+t459vijp/LcRi/gFdQu+G1TgNd1uVqYMulpSnYGlZBJ4C7p54tb11P9+vZ82vpr1YP+J5kvpdzYJprD2y1cdGvR0O45ae2TgPlBvh6tho9L6i6hW9k2vmNxwVd2oVeXL/Y77iWYeDdul0VdCngSrDlQIvBtcUN3/cirL3g29v9ev66rIDl2h8FvpY+JzhYQ7rW9DMSypHA5epaYFtfc1DlylSv9audL0308N22aYHvukgLAFu01edgj3WsLXhv3DHidl9c3koAMECXc78aF4xd1/G1OBDfrwOeDb5eQMyYeo7smyprBWqmnE39asq1cMXqLaDVxGjdrKZOUy/G4/ClpHWgeJpZB8p3bLYhbVdMx6WptWlpHXhX6AKA7HQtwI2aA5YcL1Z/nw54phXPmrEscaP/qjxfCCglfNHrvUHsfR1Rpo3hHK32Z4jLrV4LW40AgF3xXF9jaVkrBOutSNpU9xvVfmNsFTY1z7y2qfslfycEvKjbfQWyy7W4YA60rU4Ya3t/Djg6/YvJ8sFPKerEq97ul1OLc6bGOhh8sa6tMPbU7f26B3yl9tg1VhfhfKXXV9cbelSywHe7FWcLtrVs+1rqY9vGts2oTidf168x8sEb1673CtKPc7zb+d0bx6txu9s6qn5bvi3jXtdtqBhN/X0BWAuPFvdrfat6ut/Rc82RcI6Grzb139H1YmVaN2uJnRm4WP1eDri7C35OOwOAaa+vZ7UzAGxcqexkrc43avU0l24mU81bWFIgXn+2zAPXryPngrF29wPgEfCNahflfqU2rQ7V2/cM8O3oerHuI697g5jqs6cD9jjhlp/R8L26vk47A/SDrxWEEkytMF+vb+emr+eUAa5XN9fpZtLx1uClQMxBl1vpzDlkqh6Ieq6M0vkB3HqQg6Uvqn2P/ahSjPVeW+Ac6YwTvs2A3V5HwFTz+ggQ1tZZQczA97Wred54+N4unrpts24R0sI3ymlz6WZunpcEL5Z2plywdh4Yq6Pq69d13FZaCJ97EdYM8LXEe6GiGTtycZTld7XAOfr333m+FxvKC1sudtTrnkDuBWFrHVWmKScWXAFAJ+dbQw2HLwdvALhZTFXHS/Xciu31HgFuF1mR4K1BW19TLpgq06aksbq6vq7DrqXyrc7rgCPh29J+xKIvacw9U8+WcT3xB4WvF7ZcXTRkqfII4GJlURD2xjS9Hg9faTWzBF/pUA9pDzBfL6ebb1Y2b1c1U+ClFl9h0PXMAXtPwfI64fM54NYTnqx9euCrcXXauOi/mhY4WwDb+iWBupeDp5zr6x4g7gltC0S5Om2bKDC3whd5sAKADF8A+5OFtoCLhK8uRa378nC5h+pJTZLrrV8DU+ZJR9flgMRzr7kFWVQZ1narczngHm5zDwdLacS2I04tgLV8UYn853YS+HLjWl9H9ucpo+q4WM0YUr91mQbSYvktfFdt9/pSeoF8Mr/clHHAe46Rtithe3Jv66i+rNfXzvd2rvcdbwnzvBhUI+eBe5yG5XXA5wHwaOcr9bHHwqtWRcKZkzWjEJkdOCh8R7jWiD68btgb63XA2r6MzhfgdsHVU5lrr6+8+tgKSGpOt80Z61POrOv1LMDSzAN7FmVR9XUMdr0V9x3s+AD2ONSIXycKvlrN7H6lf
l8ydVw/WLxmPKqvSeE7GsSRsdp6D4QjfnrKuNdP1zr4tj7ZyFtPPRO4Fb7U/DB2L+946/O3e3o/B/SWIu08cN1mew1CmWbhlXcFtOR2sfrjzgF7U8PaX2VU6nkv99sC5yhnLPWjcbSTL7aqr71gjgJxbzjPBGELaKV6Ab7UnC9A22MF/fUyfFeY1vCl5o8pqK/zvU99eV2vZgGWNA/csgraOvdrnQfG+jueA24BYxR8e7pfLWSkmMh5Va4tB26uLmJl9uTwtcCWa9sC397AlepHQdgacwD4Pm/fsW0P4oFKp5WpPcUm17yB781c71ugn+eVwCuloS1zwBhsKdBq5n41EF51DABHuNEZ4Rvl4jUnSXH1Un+963rBd9KUs3RtBbGmfLTbxcp6Qdha1xm+q8bs9eXT0tHwJV3zq8fDPj73dpvrresAqQfiWjsHrElD168tC7LqeE5zAzgqDTwCvla1nmFskeW+W7Ydaeui0tva8RplhWlLf9q+qDisvJfz1ZRp/om0/pTqsPvUwvcpXr/auV5wRelF9clNAfq57rr+cpu30Nz2tb7exq5j0yujOcg/IOM+PD2n9/HtiYEvNefbejAH54Bb54GxGEoPMNMccK8511Hw7bHwStOf9X2LPHQjok4aX/OlBIsZlHbu5Xy5uh7O1VKPxba64b0csOq1f7UzAL8dKGLLjzxXK6eOsaMl6362J1ttnS8731svumoFb+14tfPA2lXQEozrGOyaKsO0rwMu0Hex0yjjHpF69oImoo2n7Qj3q/mycVD4jgBxD/h6QcvVzQ7fp/uPg699xbPlmk8tS/DF0s7YSmcSvutcrwTYtx7fVwnMFHip+V4OxBHzwFR8LQ2E505Be2W93R7zvlHxLX1JAJvJ/SZ8TXWj4DuzAx4FX+Z4yVVb+D6VGeBbp469cK6BW/eznRNeH9Jgge+TK6YWW9Xg/BxRrnHE2pXQEogx6HLApcBrnQeu22CaJwUdpVnga7mPXu7XMqZl7663Ttrzq6njYgLgi3XbC7aW2FHO1VPfWtcTvm4Q4081AriGb33E5KU5BVYavtyiKmq7EbYtSNpOxKWvw+C7dcAUYNfUNFR1WLmUmt5eg1C+lgFSbnXB2n3AnBM+lwOOcnSe/jTtZnK/o6V1yd4vIyeCLxWnKef6sbTxOlpr/CjHi5UZ4btNOz9f8/CtF2FJK54BtIuh8GstYLm08/Y1Bd83PveF6/neNZ1cwxfbeqRJN2vAS7ldbsUzBmmqvq6ryzGwco4Xiz+HA97j9nqu0Pb03XI/PeZ+o7IDmt8rcLvRVi3wtfTtebulsSQgauM0X0qwuh7OVxoHuz/LlxQEvqs4+D43vwbpU9vKDT+X4+74+le57u9FNQbW9nYO2l631l/Bf+N8ARDnC4DDMwK+1i1IGvC2zgOnAwbw3Vqr+/XARQtIz9GLkedgc2N5XbXX/Q7c69sK22i3a30d7WhbynrAd4QDRuC7zvu+ePn8Kbo9aAPgeksPVkalli9D49uCuNSzdtEVtopZV3e92nl7utWN86VWOnPOdxsPQKesAenDuyBLgm6UA+Zgi7nj4wLYe0s9U89Wacbx3IsFlD2cfIT7PQl8I+oSvreKhu9TvzR8pVOuAPBFVJdheqWeuUM7dHVu+GKQxVZAf+7xvcXiPe4YiDLOFXNlgJRvy7jXGFQ5CNcxx0tBt9xKBHyj3C8WG3HkpKQe7pfrs2XuF5j6hK+7/ojw5Rywpkx8vaDwfepKccQkAP1owFXauVztvC+3L3gL0eex6ZXSbvhSK525cit4gagHJgaQOAAaupj7xcBMxWwlrX42amcARww/E3y16u1+e8i77Yjro9M/v3uGbzS494Kv1+0q4as533kVBV/t4RtYH5cyDXCv221XQFNbmqg9wutxkyh8a4eLQRaLAei7IIuCMQbXiFXQyvnfRXLDczvgqGFng+8o92vZW9vb/XJxHJixdh1WPB8NvlIfUllL2zPC91GW8
52fyhCQrore76t1yZQrpsB8DePrBVcq+NaQpVLU27S0dzEWIPXbMoBbEFNlGhdMpKVruD7UUH7U2wyEvzDHHHCv7qPcX6SL1P6uEe6XUw9nzD3tKGrshO/VawucNW1bQcvVeX9SdRb4Alauf7iC5qCN63J+v+/lVm5XMtvT0rb0NQbim/lgbJ+vBF/NIizqhCwqdQ1MHFbPXQPyU5uShmvYbkGLwfWBAW4d/4V9HXDp13XToRQWWd2vN7YVmFFwtszbamRxv532+lrqLfHauj1ccJRrpupGQFgqq+uvyq+PmNxqu+L5cv3KvN1IuyWpTj1jsdsxsTrMQdf3WW83em537ZAB4Op4yavHCW4hCJvXlGOlnKvG+daulhoDmPjtNcDtfddl22t4hi4H3Bq2lNul3LHSAPd2wNGKhG/UftZoRT5SUEuKiCcVafsYBF8LkHs4YQmsmngtnK1uWBungaclXut2sX5VDljeblTP+wLotxvVwgCIbUnCYRmReqaOqqzPd36Ad7z1+auznUmXW28dqh3sKyIOK9fOCUtgBriFKwZYAroccLegvYFwBde3QdYaozTARwLwKPh62mHjefb99lTE7xwx9ztAUUC1xHohq+nH6mCtfXjgq/l9te01YObaXJXL8F1l2W50GYpOPWOuVZNSvrRp35pErXh+nvfdPFhhC1Jq7vatzU8AGb7c3G+dugakvQW8igVYNXRr4G5hewVmuBYGXSYTbdZBADwq7expr43XxFndrxaC3jrPgRyWcSeY942ItYLY0o+1zFvHxffsRypT1+u2GwHcPt1Igq98zjPubum0MQ5PyjFvX79RHSeJpZvred83Pvf29VONapBqXDAGX8rtYtDWOGIg6qAqA6QNXKArAZeC7fZ1DVjK+XIgPokDjtwLq+mTaz/b3K+lv4iFU545Yq7NBPD1wlgLXwm6s8BXWwdIXB1rccCa+5DaAAC24nmVZtEVwG06+bmchi8d87Dp8xaQmpQyB2ntwq3nRVeI86VAyK12luDLPaxhWwdAg5lbkAVVDFw7XQm6GHAfkLK6HKvfCos9AYCPAF+sjfZwDqnd2dzvVpPBtyWWksUhW+rrMk2dBPKWdhI8rf1KDvimXF7xvIpbdMWtgq4XXNVx122eHe+lXD6j2TPve/vEJATy9XYjKsXMAVMDX8xBAxLLwZZKNTPg3bpdCbo1cCnYeiFc6+CLsHpvpTmboue0I/qPXlXd0J3F+VrGsrpiC8hb4cn1KZVJfUs/sb6tcLfe60sckgC6M54v3V9D9KoPBpScG962rftaX2NjUwCn2nDzvmULQCR1S8IQc8sYTKEqr6+18KXuB56vKcdrBa8Gut7537fhsA44YjVudPvoQzewuFHHTnr7pOJ2cr+1IgGrdcY9X3tdsBd4mrrWn9a6ugx9jR+2IS262sqTeqaAWgPS4n7reWhN6plMSVepZxSqq2vd/sTiuMcQWh/WULtjLq76osA5Xg66rc7XM/8LcDgH3PtYxxHzvp7+tW16PiIRU+t4A92v5ToqttWBa/tsdbpR0NY6XW18XYfFU/eHwbdKPQMAmnrGjpC8hSQN2lo8YLUPUKAXb1GLq9gY7KQrzNFyZdRWJC98LQ9rgOtrDLyY29VAFwOrNv3MpZ6xugM44FFw65F6nsn9Riy+0n4JodyvVo3utwW+1r6pOo8rtsS2uERrDHZvmlisXANhz++Glt/Cd1XEs33xgzP4hyfUfdcLseoxqPljzZwwt98XPWyD+sPt2aXqNfDFjq/UwNcA3hq68HhNQdcD4d5bkACGA3iPox+9fVhSz57+tW2itmB53K8nfU0Be2f49nC7FqBa+vC6YatjtjhXyf1q+8FiOVjflOMnXWmf7buV7mQrft4XBy73IAbu4A5r6pnf73sFRmoRFVVXQ7FOWT8A/jCGliclAVzBVwNei/PlgHuybUgF+i568sDE0of13nsuZurRnydF3Op+d1QUfK3A1bpcbCypz
ANWbXttf5rfwxKL1aPtVsLw+30Bbh+ygEHtKfYGiLdzxlthgL3c6isUktt2FGTX/p7jArYcaRdY1S4XA3I9h1xDtu7b8KQkK3gxsGoh7FkFLa2A3sYfIAXtVQR8e4+tje259ajjPCw5bicwt7rd6DptWlbbVgtnL3SlGIuLjYKwywG37fel5nQpID8Pz7nhGqz1eDRM1/J6DGqh17af+ktC87yvpo5LKXNPUKqBDNf9UenmFbwAz8DlwKuFbl2+LavL6zoqptbBFmFpFXW7ke5XqyO4Rk/6mdL2PQ5c+Sz9ExgBY02cFrTe/rd1FuhqxpLaSIDF4rwO+On1der5tat53uuPyK0DvnTDO91a3HzwNgbrn5oX5l5fQ5sCM/7Epcvb8djPA/KQhfon5Xg1dVwf29fA9A3VawB0SxHneqm5Xyt4W5wvl34GOCWA93S+3PhRB29YdSfpZwtwRzjjltfavr0OVxMbkYKuy7l+re6eff2cegYAVep5KyqlzLna53bY3O6tw63jsXHrOO7Eq9vFWrz7JeFJpYIpmGJ11FYl7VhVOZZy1rjeGrbYa20qelvGldd1terYk6WgI29zVvd7pvRzkPttAaqlL22dNQ0tvda6QQ9srRCPSD17xqnL0Nf4fl8A32lXVEp5K2nVM+VKNc7YcuIVNw564IbWzVpSz9KCKmkRF5K65uDbCl4JuhYI13VUTK2TOGDr7R3F/bYq4mxnrp3nC8NgjXa7lnux9ql1qNbUsKYNN2Y0jLF40QHrnu9rPe0KEwVbPPahase7X6x8vZ/t/W7r+BXQxIEblJuVAFvP4WKglRZiOeD72c/xrpeCcA1R76KsbRlX3ksTA7gHfPdwv5qxWtyvd1zKvdbSxHVw3L3crhSrifM6ZEs9VacBcg+n622P9SHd0/p6+3zfR2lXPW+FzedqQXuJxVPU3JwuNl/r2XZ0fb/XK6rRhVcA/J5b7o8mhS3t7RXg+/bjIqzPfs7mejHHq90HrNkDPAK2mCYF8EzwbXW/vcEeQZetImHqTD9b3a3lLeDA4AWrJ56rj3CtUW3qtpoY7ZcFqv6q/PEjEjlwYyvPgRu18FQz/zQkzP3W/VErmuttR2sdte3o6h65s545KFKQtYAag2u9P7gac1md7oOcctakoKX0NBDXox2upAkBPOEtTaXeD144uCJSzxFjcbER7lc7ngeyLS65BcJX94LPonHuF0A+cONqCMTVWoRBddvv+horr1PRVIqactwvHvftPP36mPuF6vWrqryOrfvZxtVldR/1WEh9BHw1h3BY5n731kS0897KLO5XK2v6uaVvSpHOODj93NP9euqiXnPjaAEmxbS4YCm+pR9LClrpfrmFVwDynl9M2hOv+JiHq/625VvYUwdqrPGXt+M2VX15O57dLwDI7reOkeaL1z9bJ0sdS4k54nWv72bO97Nv2eFbQ1i7EItLSc+mCQDccgszLRbqlX723r+2XS9n3OGpIel29gAAFONJREFUR7V6wNgCUK0kF6gZL9LRemJb+9CmoAHActzkVprTrZ6Hu00rU5LcMeZSMZhu72lbh9/XA12+PfEKAyKAnI7GoKuJwR7UwKS41wVXEnw/+3jbGISlvcAa9zurdgTwBOwf7n5b5V2c5TkqU7I+HRTpdj3A5RThfq1lkWljT5ueKWjq9VPsA2iOm8ROvAKgockdrmHZdiTPBVPzt5TLvYau5H4BqrlfgGsIAshOdxujATMG4hq+G9e8wpdbcPVZwCG6hS9XD8zrI2gHCkYM2TPV6pHW/fZc/dzzi4F19fMO7jcitsX99gBxXWcBs7YPawraC2HJAT+V8cdNbkWdeIWtRqYXY92ukNaoXji1HZ8qx8Bc3+v6egtqyv2anS0QMRo3LI1FON+3Gee7ha/W9VLp5aOBd9UAQs3gdDHtPfe7l7a/2yz3BMd3v56xrA5Sau9xw9pxLW2pdtrUNAC0nHgFAGQZB1qL+73c8u2iqnqcOt28f
f0Sheyrq7ZXC7pq90stgOKgu42VQKyFb/WH2ue7QhdAhi8FXA7CR1QnOpZ+XU/nfveSljKR6Wcqfqud3S8Hjkgwex2v95+lFqYtaWuq3nIPaqeLtbGdeHXdnTTPe+1YMbBaVC/Oql+v19c/6ScePbe5TVev5Vcrn7ffLziHS0G1XqSFtQEgIUv1WZ9wVTvfbbca+GpTz0fVa3vfgE1Rjs37mEJtfMTiMCk9bR3PqwHzwi3uN2pMTVzkfWjhpe1HC1HrPVHlmr8j75eRlzQMa/eLxiDu9zKcZjGVbu53jcf6xg/hoMH8PDY1n/zq9oELANcQBLgGJ2xioKp/qOrqfjCIA/ITAzrAzeMEsT28ALd19wZfgNPaxB6/Vu90bVT/Fnt4QPVIN1tTzJpyD5Couoi/NmtaWhuvSSdTcVdtrvf9ag/dsB45KaWVPdK42fqe6vQzdt83bnl94AIAvc1o+5pKFwNTL80XMylrasUz5Xa3aWguDoiyM+hAn8hRqedI96vtPxre0fd5oLlgS9vWOE0bayJDKusx5+tJS2tiqWuqP/L15shJ5dwvJmw7kSXNTLnf535wh7zWX34tvLwGM97/A9Judb6vbvf9wuYaSyXXIIUqhuurLpcWeQGQi64ouGrgu97CGeELYEhBl1JelFL+XinlB3reEK69AUGN3zqhJ/WvrY9sp5k/Dpz/bQFshNmPyqq3uN+odLQkq6sd5X4raVc+U4uqNAutLJJS19iRktfl13PEWDnWzxPwscVXAPSpV1gMVk+5X+oaNj+3b8mj++Xmfan5Xs1hHGeFL4BtDvgjAPDxXjdCywKRlk+rmZIBlnsZtSit0/yvZ0isbhSMLaC1ppw5tbpfj7O13AcXw76+feACwK37vap7Uad6uYVWOKS5eV2qreSQsXvh4CqV36TVKZiu1/UcMAZSyv1SKW0ulf1YVqeeuf25np9nhS+AEsCllPcCwO8FgO/uezu1op3vHvO4I9PP2k/3vTMKj2pJN1v6peqivq/1SkP3+G7TA+Ct7ldx5GT9uMFLF7j75eZ+OVFOV2qLwbTeosSlpbfbkOrxto8cvFp8haWUMZhSgKaASjloDOKPP+vUM8Ctc/XCd7sn+IzSOuA/DwDfDgBfoAJKKR8upXy0lPJRgM8E3JoVEr2cWHT6eTZNAuNaUQ7X4357mX2vI9bCcNQ/yRb32yjszGdTe2Qe13wPW0CK88o01LF6bBHXlarU781cbZ0qrl+vfTxU11vVfTA/l8fXD48/t7AF4vXZoWqRCOBSyjcCwKeXZfkYF7csy5vLsnxgWZYPALwz7Abj1HvxlVbW+d+RGjz/61VvGHvbtECn5d68c7jS2B4XbClrPPP5+brP4iv5Gb3btPV1Cnn7unbGz28F7pCpZ/4+qYYuwK27XX9iKWUsBa1xzOs4jz+3e34Bnt3v1sEC8rqOkVZAn1kaB/x1APD7SimfBIDvA4APllL+Ute7umv3GwXoyKcUdbBcPeZwveMbFwmFpZ97pKE9jlpzX9o+NGVY+hmuF18BAHvwBn5LtKtt3Wp02x8PXCzNfH0vt+X19XbvLwDgbhdzsxRMAW5Ty1g/2lT1GrKZ+926W2wRFoAewvcgEcDLsnzHsizvXZbl/QDwzQDwd5Zl+ZZ+t9TDEe7lfo+yd3hntcwFj5jv1bSXxmr9gmGdo40CuaYf1xww7X634o6d9EBV42yp/uutR5Sb3cZvf8pnRSOOmlv9DNVPzBnXLrcuoxZura9rMG9+YnO/9RA1bKX9vfcEX4DpTsI6MlBG3HuETYza/7vTHPio+d6Itj1A3KqWxVZUH1IdmUlA5jjhduvR7RB8SnpbZoG0tPqZbnftZjFI12CWti4BwPWDF1bV8Fx/Ui54206CLQX5bf3jn+22IwB8WxFUr6VDNgDuC74Axv/yy7L8MAD8cJc7cWt0+rlF1nsd+YVk4PxvD4fbMgbWxgtpT0qWqmt1ra3yThNwEL7p8/KJL6WfN
YuvsO1I13U82LWqH0O4/Sk99xdzwNghHGv6+UlY+rku4+aBOUDXq6glMMOz492+rud8LRDetr8nTeSAj+J+I4Hf8vhBS78tGux0e8/39uizpf+W9LPUhzVNTfVnTTtLqo6dXMWlnwFu3aJnTpd75i96T0zK2zunzH0RuOp/m35ehQ2JwRnglmgYtLEYaqzHuu3K5+15z3VzDLzYrd9j6nnVJAD2AkPzv3+W1c/ROslToSwuloPe6PSzJ7anY49oZ/kyoK0zLr7SPPP3qp5ID3uecERBWeoLP8GKdsZ1ObcV6WXlPAEAX1RFlWPzwnU5l36uob0CdzP3yzleawr63jQBgI/ifDmNOIDjJBo5z+ltM1P6uefiKU8/XCxVZhwbO/nq6hqFlvTwBX4e13R/V4C9TTNv4+oYbBvTWne7/agamHO6NYjrcmx1NADuchkwb0+9WiUttMJWQN+z691qZwDPCqkZ5n8t7VsXYFkt0bY8cP9vj/neqFRxa59WEFOxrYAdtQiLbGtf/QzAp23rOM1Z0NRDFbi+6nlk6jCNerw6pj71qtZ2/rdQaWMszawpX0VtNZIWYcH1vt968ZQmBU3NFd+jdgRwK+T2SD/3nP+11ke3k9p3+FLSG7jasTXfM6IWjkXE76WIRVgA3dPPlzq726XmhLk5Ynwrkm4hGHV0JQDcHr4BQLtXbDEWVc7BHCurwPx01OTDdTgAn3bevqbmg+9ROwF4BHyPJsvvNGvm4ASKnFedLf2sTXRo2kmQxcoUi69mTz/X/T6XXaeit6+p+d/69drPi6slxkCvYoaGcixtLSzQoo6dpB6akCloWTsA+GzwmHn+d/vpN8E9RS9CsvQ5Kv3ckpKWYrWrlq39WfoJex+ZtK3i1CvcYcrpZ6vqOVv8yUr1lqTtNqVbMG/LsfYAcDv/C0ADEksXa8uxhVvMIqw6/Uy5XyrdXJfduwYDeCQE9krhplTqmdK19Lf3orCW8Ufee4dFWPXe363q5/5aJM3Jco8V3I55nVJud9DYnDJ3z6UmFhDXEeWUc96GIOnnbZMaqNS8MHdL96ZBudwD70sNUc8FWB5FLcDqpN7zv5r2Fkj3cL1e9XD6lhjxvcI/erHTr7bpZ2qrUaQsJ2fRz/W9hSp33vPNSVivXj3t/wUAfLvRes0truLmfzVzwNvXr/D08zaFTC2sytOueHX+SJjVTe69AEvq2/K+jVgYFnACVk9wjurT0n/0/K3Uv3UeN2IVNFWGzg/Lq5/r+V8A/iAMqmxb3mv+F0D3BQADc90WO5YSTQ7U87ZSOcBtGpoqr1dBI2Cun3r0VL5psl5j6WmAnPut1SkFXWBe+PaW5vfu9d7c2Xs+2nnu6YSj5389as4wyO635/yv72CO23ld6n6kgzuohzFsX98swNKsUpbKt3WYk66FgLlOP1OQpcx0rnzGNcFBHBZpPwHOPP97hHsU1GP+d4btR1JfVN2ssyrRq6A30s7/XrrQp58l0HLP/q1jtn3S/fFjYTHYAqyres0CLNhcc+XaxVxUynobtoEvNhy1wrkeKt3vsw4G4Nl0Ahi6FXgAx1ajU9Wtfc84/6tNL3vS0C73rjv7+apO8fCFaFlcMrW1iFowxp1j/eSc6wM4MHHlmjS0Zv63ilk219iDF7BrQOoTvLdKAJ9Wk21BotQbfD00q2PVqueqcucY2PyvJK8zbdF2xbRusZbuiwQKbm4B1iqtK9b2R/S7XflMpZupLrC9wamLTghgCTazLcA6kga/DxFwHr3lqGWcqAVYkbKmycV58gpIzPyv5ulH1KlTa/v66UdWaZ5+hD2akLrX+v5uYrYroAH4NLMEVGoBFtcfsg0J236EQXV7q9wWpNSzDgTgo0Jwti1Imr6Dx5zxry56LljrAvdcgOUZk6uzvCfE8ZPybcjQwuKsklZXSwuwuHvSrIC+qteuXN6Kmu+t64R53qfyR2Hbj1ZRJ1xB9TrTz7QOBODemjhNS8rzCWzdahT8v
sy+5Wi0Sx6RDraUR2cdNE2ZhzLgQ/FbjzRttQuw1jJsLOyaArXmOMoX9f4eaZEVpofqj6U/wa7W8791KHf0ZP06dVECOFQeWLXsAY5oN5EiFv+MdNs9XW+rtAuv6nJNn6b70D39aKt6AZZV3vlf6wIs3xwwvgJ6XYB1swJ6uy93Kw60WPta2KrnepzH7Uea+V9u6pkrv3cdBMAz5jDvGJRRurcFWD2des/MQss2pACNWAGtFeeGr5/7K7tmqsy8AhqABu3azroAqz5sg4Aw52qpldGpZx0EwFr1WIA1k+4I3hF/VXsswPLKunWIuj6QIhZgkX3D7SP+XPco7Nm19qHuR7OlqBYH2rV++1MoX6prKp3MHTuZzpfXyQDcQwf+hEvJ6rUAK1p7p9Y1dWQb38ewbi5W7tt7JKVmBbS2PR9H/A6at80CWq7dA1EOzwuwuC6lc57TBeNKAKcc6nQIxyhFLP5ugfJe3+mk3ztySsBxApZuKP2pWBZx+3W5/rEtSLr+K1f88Op2CxIAf4SkBrSeum26eV0Fjfxa0vGS6YJlHQDA9+xAIz4RT562PlKauVbv+7UuxOL6sI6pDRcWZuldpN3dYmlqbdrae1+qLUiUWkBrXVHN9CdlsKlV0KlbHQDAI9QLUkchgmYLUsDv4nWFR3kbvSugo36/XlubIvpsdL5bcS5TAiMXoz2oQ3PsJAZX9UEgEY6W29+rXGlNPfVIuo2UXglgl07uKiN1FHhqdKbfRVLEFwflGdDba+0WJM1DEHoIByu/B5iKr19fqQXC3EIsqg/CIT880CdgbRdbcSdfpQumNTmAj3jCwlE06BSsaE1+e6T2OrVq1HhHngroLAnM24cwXEkDUOm7hmfrEjHfi71OtWlyAFu0tyvde3yNJrrH3innMwPhoL+PdgGWa9vOAFmdtfVhEehjCAFkyEpvj3deWanchuTXiQDcQzN/0s18bwfX3quUPWctR99DZ71wbk0aLcupV5zEuWlsBTSA7ulGVL22H0T1HmBD05RBCeBdlRDdVSHznBE3otTewB24B7iWdQ9w9CEcdMztKVgtIk/BApCdrlfEHuC3mXvJ/b4xSgBPpYlSxCN1dIj1HiMCkr2+SHR6PyP29gLQ242sfUSp9XAPVpIjZg7b4BzydjW05QzohLGsBPBQ9QJsgntoX0dKXPRaazfJQyfiQG2bd9ZvWXJkAFrmbLXDKeM8LjilVwI4lRqpUZA6wJcEz2MIe4g7BStSve7/Rq0rpJFyyx7gBLNeEwN41CfInbrH1H3K4/R3gnmvU7Ci2nJA7XoOtOWWB66cyj3Adk0M4JRPB7A+PXSnv3bKL+5QDU5SzDCnCzDF0uSErF8J4FQqQp3PTD60Bh1DaetnP3LtOfaNqIO45th6fXolgFNjNNkq29T9ioK117lKT0LqIgvDg3ifTjdeCeBUqpfOtr3qBNrjVC33mNy2odQplABOpVLxIh7EkLrVi9Z87w4Z7YmS6IdWAvgwytXaqVQqdSaVZYn/plpK+WcA8E/CO+6rXwcAv7z3TZxc+R6PUb7PY5Tv8xgd8X3+jcuyfLkU1AXAR1Qp5aPLsnxg7/s4s/I9HqN8n8co3+cxOvP7nCnoVCqVSqV2UAI4lUqlUqkdlAB+1pt738AdKN/jMcr3eYzyfR6j077POQecSqVSqdQOSgecSqVSqdQOSgCnUqlUKrWD7h7ApZQPlVL+USnlE6WUP7P3/ZxRpZTvLaV8upTyM3vfy5lVSnlfKeWHSikfL6X8bCnlI3vf0xlVSvmiUsrfLaX89OP7/F/tfU9nVSnlRSnl75VSfmDve+mhuwZwKeUFAHwXAPweAPjtAPAHSym/fd+7OqX+AgB8aO+buAM9AMC3LcvyrwDA1wLAf5z/nrvoLQD44LIsvwMAvgoAPlRK+dqd7+ms+ggAfHzvm+iluwYwAHwNAHxiWZafW5bl8wDwfQDwTTvf0+m0LMuPAMA/3/s+zq5lWX5pWZaff
Hz9a3D54HrPvnd1Pi0Xfebx8vXHP7maNVillPcCwO8FgO/e+1566d4B/B4A+IXN9acgP7BSJ1Ap5f0A8NUA8OP73sk59Zga/SkA+DQA/OCyLPk+x+vPA8C3A8AX9r6RXrp3ABekLL/Jpg6tUso7AeD7AeBbl2X51b3v54xaluXVsixfBQDvBYCvKaV85d73dCaVUr4RAD69LMvH9r6Xnrp3AH8KAN63uX4vAPziTveSSjWrlPI6XOD7l5dl+et738/ZtSzLrwDAD0OucYjW1wHA7yulfBIuU4MfLKX8pX1vKV73DuCfAICvKKX8plLKGwDwzQDwN3a+p1TKpVJKAYDvAYCPL8vynXvfz1lVSvnyUsq7Hl9/MQB8PQD8w33v6lxaluU7lmV577Is74fL5/LfWZblW3a+rXDdNYCXZXkAgD8FAH8bLgtW/tqyLD+7712dT6WUvwIAPwYAv62U8qlSyh/f+55Oqq8DgD8MF7fwU49/vmHvmzqhfj0A/FAp5e/D5Uv8Dy7LcsptMqm+yqMoU6lUKpXaQXftgFOpVCqV2ksJ4FQqlUqldlACOJVKpVKpHZQATqVSqVRqByWAU6lUKpXaQQngVCqVSqV2UAI4lUqlUqkd9P8DnGSSkMm/7/MAAAAASUVORK5CYII=", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "heatmap(grid, cmap='jet', interpolation='spline16')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The peak value is 32 at the lower right corner.\n", + "
    \n", + "The region at the upper left corner is planar." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's instantiate `PeakFindingProblem` one last time." + ] + }, + { + "cell_type": "code", + "execution_count": 75, + "metadata": {}, + "outputs": [], + "source": [ + "problem = PeakFindingProblem(initial, grid, directions8)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Solution by Hill Climbing" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "solution = problem.value(hill_climbing(problem))" + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "0" + ] + }, + "execution_count": 21, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "solution" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Solution by Simulated Annealing" + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "32" + ] + }, + "execution_count": 22, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "solutions = {problem.value(simulated_annealing(problem)) for i in range(100)}\n", + "max(solutions)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice that even though both algorithms started at the same initial state, \n", + "Hill Climbing could never escape from the planar region and gave a locally optimal solution of **0**,\n", + "whereas Simulated Annealing could reach the peak at **32**.\n", + "
    \n", + "A very similar situation arises when there are two peaks of different heights.\n", + "One should carefully consider the possible search space before choosing the algorithm for the task." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## GENETIC ALGORITHM\n", + "\n", + "Genetic algorithms (or GAs) are inspired by natural evolution and are particularly useful in optimization and search problems with large state spaces.\n", + "\n", + "Given a problem, algorithms in the domain make use of a *population* of candidate solutions (also called *states*), each of which represents a feasible solution. At each iteration (often called *generation*), the population gets updated using methods inspired by biology and evolution, like *crossover*, *mutation* and *natural selection*." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Overview\n", + "\n", + "A genetic algorithm works in the following way:\n", + "\n", + "1) Initialize random population.\n", + "\n", + "2) Calculate population fitness.\n", + "\n", + "3) Select individuals for mating.\n", + "\n", + "4) Mate selected individuals to produce a new population.\n", + "\n", + " * Random chance to mutate individuals.\n", + "\n", + "5) Repeat from step 2) until an individual is fit enough or the maximum number of iterations is reached." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Glossary\n", + "\n", + "Before we continue, let's lay out the basic terminology of the algorithm.\n", + "\n", + "* Individual/State: A list of elements (called *genes*) that represents a possible solution.\n", + "\n", + "* Population: The list of all the individuals/states.\n", + "\n", + "* Gene pool: The alphabet of possible values for an individual's genes.\n", + "\n", + "* Generation/Iteration: The number of times the population will be updated.\n", + "\n", + "* Fitness: An individual's score, calculated by a function specific to the problem."
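The overview steps above can be sketched in plain Python. This is a rough illustration only, not the implementation from the `search` module (shown further below); the function name `genetic_algorithm_sketch` and its exact parameters are invented for this sketch:

```python
import random

def genetic_algorithm_sketch(population, fitness_fn, gene_pool, ngen=1000, pmut=0.1, f_thres=None):
    # Step 1 (initializing a random population) happens outside, by the caller.
    for _ in range(ngen):                                    # step 5: repeat
        # Step 2: calculate population fitness.
        weights = [fitness_fn(ind) for ind in population]
        new_population = []
        for _ in range(len(population)):
            # Step 3: select two parents, with probability proportional to fitness.
            x, y = random.choices(population, weights=weights, k=2)
            # Step 4: mate the selected parents with one-point crossover.
            c = random.randrange(len(x))
            child = x[:c] + y[c:]
            # Random chance to mutate one gene of the child.
            if random.random() < pmut:
                g = random.randrange(len(child))
                child = child[:g] + [random.choice(gene_pool)] + child[g + 1:]
            new_population.append(child)
        population = new_population
        # Stop early if some individual is fit enough.
        best = max(population, key=fitness_fn)
        if f_thres is not None and fitness_fn(best) >= f_thres:
            return best
    return max(population, key=fitness_fn)
```

For example, with individuals that are lists of bits and `sum` as the fitness function, the sketch quickly evolves an all-ones individual.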
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Crossover\n", + "\n", + "Two individuals/states can \"mate\" and produce one child. This offspring bears characteristics from both of its parents. There are many ways we can implement this crossover. Here we will take a look at the most common ones. Most other methods are variations of those below.\n", + "\n", + "* Point Crossover: The crossover occurs at one (or more) points. The parents get \"split\" at the chosen point or points and then get merged. In the example below we see two parents get split and merged at the 3rd digit, producing the following offspring after the crossover.\n", + "\n", + "![point crossover](images/point_crossover.png)\n", + "\n", + "* Uniform Crossover: This type of crossover randomly chooses which genes to merge. Here the genes 1, 2 and 5 were chosen from the first parent, while the genes 3 and 4 were taken from the second parent.\n", + "\n", + "![uniform crossover](images/uniform_crossover.png)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Mutation\n", + "\n", + "When an offspring is produced, there is a chance it will mutate, having one (or more, depending on the implementation) of its genes altered.\n", + "\n", + "For example, let's say the new individual to undergo mutation is \"abcde\". We randomly pick its third gene and change it to 'z'. The individual now becomes \"abzde\" and is added to the population." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Selection\n", + "\n", + "At each iteration, the fittest individuals are picked randomly to mate and produce offspring. We measure an individual's fitness with a *fitness function*. That function depends on the given problem and is used to score an individual. 
Usually the higher the better.\n", + "\n", + "The selection process is this:\n", + "\n", + "1) Individuals are scored by the fitness function.\n", + "\n", + "2) Individuals are picked randomly, according to their score (higher score means higher chance to get picked). Usually the formula to calculate the chance to pick an individual is the following (for population *P* and individual *i*):\n", + "\n", + "$$ chance(i) = \\dfrac{fitness(i)}{\\sum_{k \\in P}{fitness(k)}} $$" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Implementation\n", + "\n", + "Below we look over the implementation of the algorithm in the `search` module.\n", + "\n", + "First the implementation of the main core of the algorithm:" + ] + }, + { + "cell_type": "code", + "execution_count": 51, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "
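The chance formula above is what is often called fitness-proportionate (or "roulette-wheel") selection: each individual covers a slice of a wheel sized by its fitness, and we spin the wheel once per pick. As a sketch only (the helper name `roulette_select` is invented here; it is not the `select` used by the module):

```python
import random

def roulette_select(population, fitness_fn, k=2):
    # Step 1 of the selection process: score every individual.
    fitnesses = [fitness_fn(ind) for ind in population]
    total = sum(fitnesses)
    picks = []
    for _ in range(k):
        # Spin the wheel: a higher fitness covers a larger slice of [0, total].
        r = random.uniform(0, total)
        acc = 0.0
        for ind, f in zip(population, fitnesses):
            acc += f
            if acc >= r:
                picks.append(ind)
                break
        else:
            # Guard against floating-point rounding at the wheel's edge.
            picks.append(population[-1])
    return picks
```

Over many spins, an individual with 90% of the total fitness is picked roughly 90% of the time, matching the formula.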

    \n", + "\n", + "
    def genetic_algorithm(population, fitness_fn, gene_pool=[0, 1], f_thres=None, ngen=1000, pmut=0.1):\n",
    +       "    """[Figure 4.8]"""\n",
    +       "    for i in range(ngen):\n",
    +       "        population = [mutate(recombine(*select(2, population, fitness_fn)), gene_pool, pmut)\n",
    +       "                      for i in range(len(population))]\n",
    +       "\n",
    +       "        fittest_individual = fitness_threshold(fitness_fn, f_thres, population)\n",
    +       "        if fittest_individual:\n",
    +       "            return fittest_individual\n",
    +       "\n",
    +       "\n",
    +       "    return argmax(population, key=fitness_fn)\n",
    +       "
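The listing above relies on two helpers from `search.py` that are not shown in this notebook: `select` and `fitness_threshold`. As a rough sketch (the actual implementations in `search.py` may differ), fitness-proportional selection and the threshold check could look like this:

```python
import random

def fitness_threshold(fitness_fn, f_thres, population):
    """Return the fittest individual if it meets the threshold, else None."""
    if not f_thres:
        return None
    fittest = max(population, key=fitness_fn)
    if fitness_fn(fittest) >= f_thres:
        return fittest
    return None

def select(r, population, fitness_fn):
    """Pick r individuals, each with probability proportional to its fitness."""
    fitnesses = [fitness_fn(x) for x in population]
    return random.choices(population, weights=fitnesses, k=r)

# Tiny demo: fitness of an individual is just the sum of its genes.
demo_population = [[0, 0], [0, 1], [1, 1]]
fit = sum
parents = select(2, demo_population, fit)
```

Note that `random.choices` raises `ValueError` if every weight is zero, so a robust implementation would also need to handle an all-zero-fitness population.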
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(genetic_algorithm)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The algorithm takes the following input:\n", + "\n", + "* `population`: The initial population.\n", + "\n", + "* `fitness_fn`: The problem's fitness function.\n", + "\n", + "* `gene_pool`: The gene pool of the states/individuals. By default 0 and 1.\n", + "\n", + "* `f_thres`: The fitness threshold. If an individual reaches that score, iteration stops. By default 'None', which means the algorithm will not halt until the generations are ran.\n", + "\n", + "* `ngen`: The number of iterations/generations.\n", + "\n", + "* `pmut`: The probability of mutation.\n", + "\n", + "The algorithm gives as output the state with the largest score." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "For each generation, the algorithm updates the population. First it calculates the fitnesses of the individuals, then it selects the most fit ones and finally crosses them over to produce offsprings. There is a chance that the offspring will be mutated, given by `pmut`. If at the end of the generation an individual meets the fitness threshold, the algorithm halts and returns that individual.\n", + "\n", + "The function of mating is accomplished by the method `recombine`:" + ] + }, + { + "cell_type": "code", + "execution_count": 52, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def recombine(x, y):\n",
    +       "    n = len(x)\n",
    +       "    c = random.randrange(0, n)\n",
    +       "    return x[:c] + y[c:]\n",
    +       "
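`recombine` implements single-point crossover. The uniform crossover described earlier is not part of this listing, but an illustrative sketch of it could look like this (each gene is inherited from either parent with equal probability):

```python
import random

def recombine_uniform(x, y):
    """For each position, take the gene from x or y at random."""
    return [random.choice(genes) for genes in zip(x, y)]

child = recombine_uniform(list('AAAA'), list('BBBB'))
```

Every gene of `child` comes from one of the two parents, and the child always has the parents' length.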
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(recombine)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The method picks at random a point and merges the parents (`x` and `y`) around it.\n", + "\n", + "The mutation is done in the method `mutate`:" + ] + }, + { + "cell_type": "code", + "execution_count": 53, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def mutate(x, gene_pool, pmut):\n",
    +       "    if random.uniform(0, 1) >= pmut:\n",
    +       "        return x\n",
    +       "\n",
    +       "    n = len(x)\n",
    +       "    g = len(gene_pool)\n",
    +       "    c = random.randrange(0, n)\n",
    +       "    r = random.randrange(0, g)\n",
    +       "\n",
    +       "    new_gene = gene_pool[r]\n",
    +       "    return x[:c] + [new_gene] + x[c+1:]\n",
    +       "
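Two properties of `mutate` are worth checking: with `pmut = 0` the individual is returned unchanged, and a mutated individual keeps its length and differs from the original in at most one position. Re-stating the function from the listing above:

```python
import random

def mutate(x, gene_pool, pmut):
    if random.uniform(0, 1) >= pmut:
        return x
    n = len(x)
    g = len(gene_pool)
    c = random.randrange(0, n)
    r = random.randrange(0, g)
    new_gene = gene_pool[r]
    return x[:c] + [new_gene] + x[c + 1:]

original = [0, 1, 0, 1]
unchanged = mutate(original, [0, 1], 0)    # pmut = 0: mutation never triggers
mutated = mutate(original, [0, 1], 1.1)    # pmut > 1: mutation always triggers
```

Even when mutation triggers, the new gene may happen to equal the old one, so at most one position changes.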
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(mutate)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We pick a gene in `x` to mutate and a gene from the gene pool to replace it with.\n", + "\n", + "To help initializing the population we have the helper function `init_population`:" + ] + }, + { + "cell_type": "code", + "execution_count": 54, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def init_population(pop_number, gene_pool, state_length):\n",
    +       "    """Initializes population for genetic algorithm\n",
    +       "    pop_number  :  Number of individuals in population\n",
    +       "    gene_pool   :  List of possible values for individuals\n",
    +       "    state_length:  The length of each individual"""\n",
    +       "    g = len(gene_pool)\n",
    +       "    population = []\n",
    +       "    for i in range(pop_number):\n",
    +       "        new_individual = [gene_pool[random.randrange(0, g)] for j in range(state_length)]\n",
    +       "        population.append(new_individual)\n",
    +       "\n",
    +       "    return population\n",
    +       "
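A quick sanity check of `init_population` (a compact re-statement of the listing above): the returned population has the requested size, every individual has the requested length, and every gene is drawn from the pool.

```python
import random

def init_population(pop_number, gene_pool, state_length):
    """Compact equivalent of the loop-based version shown above."""
    g = len(gene_pool)
    return [[gene_pool[random.randrange(0, g)] for _ in range(state_length)]
            for _ in range(pop_number)]

demo_pop = init_population(20, ['R', 'G'], 4)
```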
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(init_population)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The function takes as input the number of individuals in the population, the gene pool and the length of each individual/state. It creates individuals with random genes and returns the population when done." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Explanation\n", + "\n", + "Before we solve problems using the genetic algorithm, we will explain how to intuitively understand the algorithm using a trivial example.\n", + "\n", + "#### Generating Phrases\n", + "\n", + "In this problem, we use a genetic algorithm to generate a particular target phrase from a population of random strings. This is a classic example that helps build intuition about how to use this algorithm in other problems as well. Before we break the problem down, let us try to brute force the solution. Let us say that we want to generate the phrase \"genetic algorithm\". The phrase is 17 characters long. We can use any character from the 26 lowercase characters and the space character. To generate a random phrase of length 17, each space can be filled in 27 ways. So the total number of possible phrases is\n", + "\n", + "$$ 27^{17} = 2153693963075557766310747 $$\n", + "\n", + "which is a massive number. If we wanted to generate the phrase \"Genetic Algorithm\", we would also have to include all the 26 uppercase characters into consideration thereby increasing the sample space from 27 characters to 53 characters and the total number of possible phrases then would be\n", + "\n", + "$$ 53^{17} = 205442259656281392806087233013 $$\n", + "\n", + "If we wanted to include punctuations and numerals into the sample space, we would have further complicated an already impossible problem. Hence, brute forcing is not an option. 
Now we'll apply the genetic algorithm and see how it significantly reduces the search space. We essentially want to *evolve* our population of random strings so that they better approximate the target phrase as the number of generations increase. Genetic algorithms work on the principle of Darwinian Natural Selection according to which, there are three key concepts that need to be in place for evolution to happen. They are:\n", + "\n", + "* **Heredity**: There must be a process in place by which children receive the properties of their parents.
    \n", + "For this particular problem, two strings from the population will be chosen as parents and will be split at a random index and recombined as described in the `recombine` function to create a child. This child string will then be added to the new generation.\n", + "\n", + "\n", + "* **Variation**: There must be a variety of traits present in the population or a means with which to introduce variation.
    If there is no variation in the sample space, we might never reach the global optimum. To ensure that there is enough variation, we can initialize a large population, but this gets computationally expensive as the population gets larger. Hence, we often use another method called mutation. In this method, we randomly change one or more characters of some strings in the population based on a predefined probability value called the mutation rate or mutation probability as described in the `mutate` function. The mutation rate is usually kept quite low. A mutation rate of zero fails to introduce variation in the population and a high mutation rate (say 50%) is as good as a coin flip and the population fails to benefit from the previous recombinations. An optimum balance has to be maintained between population size and mutation rate so as to reduce the computational cost as well as have sufficient variation in the population.\n", + "\n", + "\n", + "* **Selection**: There must be some mechanism by which some members of the population have the opportunity to be parents and pass down their genetic information and some do not. This is typically referred to as \"survival of the fittest\".
    \n", + "There has to be some way of determining which phrases in our population have a better chance of eventually evolving into the target phrase. This is done by introducing a fitness function that calculates how close the generated phrase is to the target phrase. The function will simply return a scalar value corresponding to the number of matching characters between the generated phrase and the target phrase." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Before solving the problem, we first need to define our target phrase." + ] + }, + { + "cell_type": "code", + "execution_count": 55, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "target = 'Genetic Algorithm'" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "We then need to define our gene pool, i.e the elements which an individual from the population might comprise of. Here, the gene pool contains all uppercase and lowercase letters of the English alphabet and the space character." + ] + }, + { + "cell_type": "code", + "execution_count": 56, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "# The ASCII values of uppercase characters ranges from 65 to 91\n", + "u_case = [chr(x) for x in range(65, 91)]\n", + "# The ASCII values of lowercase characters ranges from 97 to 123\n", + "l_case = [chr(x) for x in range(97, 123)]\n", + "\n", + "gene_pool = []\n", + "gene_pool.extend(u_case) # adds the uppercase list to the gene pool\n", + "gene_pool.extend(l_case) # adds the lowercase list to the gene pool\n", + "gene_pool.append(' ') # adds the space character to the gene pool" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We now need to define the maximum size of each population. Larger populations have more variation but are computationally more expensive to run algorithms on." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 57, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "max_population = 100" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As our population is not very large, we can afford to keep a relatively large mutation rate." + ] + }, + { + "cell_type": "code", + "execution_count": 58, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mutation_rate = 0.07 # 7%" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Great! Now, we need to define the most important metric for the genetic algorithm, i.e the fitness function. This will simply return the number of matching characters between the generated sample and the target phrase." + ] + }, + { + "cell_type": "code", + "execution_count": 59, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def fitness_fn(sample):\n", + " # initialize fitness to 0\n", + " fitness = 0\n", + " for i in range(len(sample)):\n", + " # increment fitness by 1 for every matching character\n", + " if sample[i] == target[i]:\n", + " fitness += 1\n", + " return fitness" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Before we run our genetic algorithm, we need to initialize a random population. We will use the `init_population` function to do this. We need to pass in the maximum population size, the gene pool and the length of each individual, which in this case will be the same as the length of the target phrase." + ] + }, + { + "cell_type": "code", + "execution_count": 60, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "population = init_population(max_population, gene_pool, len(target))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We will now define how the individuals in the population should change as the number of generations increases. 
First, the `select` function will be run on the population to select *two* individuals with high fitness values. These will be the parents which will then be recombined using the `recombine` function to generate the child." + ] + }, + { + "cell_type": "code", + "execution_count": 61, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "parents = select(2, population, fitness_fn) " + ] + }, + { + "cell_type": "code", + "execution_count": 62, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "# The recombine function takes two parents as arguments, so we need to unpack the previous variable\n", + "child = recombine(*parents)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, we need to apply a mutation according to the mutation rate. We call the `mutate` function on the child with the gene pool and mutation rate as the additional arguments." + ] + }, + { + "cell_type": "code", + "execution_count": 63, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "child = mutate(child, gene_pool, mutation_rate)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The above lines can be condensed into\n", + "\n", + "`child = mutate(recombine(*select(2, population, fitness_fn)), gene_pool, mutation_rate)`\n", + "\n", + "And, we need to do this `for` every individual in the current population to generate the new population." + ] + }, + { + "cell_type": "code", + "execution_count": 64, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "population = [mutate(recombine(*select(2, population, fitness_fn)), gene_pool, mutation_rate) for i in range(len(population))]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The individual with the highest fitness can then be found using the `max` function." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 65, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "current_best = max(population, key=fitness_fn)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's print this out" + ] + }, + { + "cell_type": "code", + "execution_count": 66, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "['J', 'y', 'O', 'e', ' ', 'h', 'c', 'r', 'C', 'W', 'H', 'o', 'r', 'R', 'y', 'P', 'U']\n" + ] + } + ], + "source": [ + "print(current_best)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We see that this is a list of characters. This can be converted to a string using the join function" + ] + }, + { + "cell_type": "code", + "execution_count": 67, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "JyOe hcrCWHorRyPU\n" + ] + } + ], + "source": [ + "current_best_string = ''.join(current_best)\n", + "print(current_best_string)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We now need to define the conditions to terminate the algorithm. This can happen in two ways\n", + "1. Termination after a predefined number of generations\n", + "2. 
Termination when the fitness of the best individual of the current generation reaches a predefined threshold value.\n", + "\n", + "We define these variables below:" + ] + }, + { + "cell_type": "code", + "execution_count": 68, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "ngen = 1200 # maximum number of generations\n", + "# we set the threshold fitness equal to the length of the target phrase\n", + "# i.e. the algorithm only terminates when it has got all the characters correct \n", + "# or it has completed 'ngen' number of generations\n", + "f_thres = len(target)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "To generate `ngen` number of generations, we run a `for` loop `ngen` number of times. After each generation, we calculate the fitness of the best individual of the generation and compare it to the value of `f_thres` using the `fitness_threshold` function. After every generation, we print out the best individual of the generation and the corresponding fitness value. Let's now write a function to do this."
+ ] + }, + { + "cell_type": "code", + "execution_count": 69, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def genetic_algorithm_stepwise(population, fitness_fn, gene_pool=[0, 1], f_thres=None, ngen=1200, pmut=0.1):\n", + " for generation in range(ngen):\n", + " population = [mutate(recombine(*select(2, population, fitness_fn)), gene_pool, pmut) for i in range(len(population))]\n", + " # stores the individual genome with the highest fitness in the current population\n", + " current_best = ''.join(max(population, key=fitness_fn))\n", + " print(f'Current best: {current_best}\\t\\tGeneration: {str(generation)}\\t\\tFitness: {fitness_fn(current_best)}\\r', end='')\n", + " \n", + " # compare the fitness of the current best individual to f_thres\n", + " fittest_individual = fitness_threshold(fitness_fn, f_thres, population)\n", + " \n", + " # if fitness is greater than or equal to f_thres, we terminate the algorithm\n", + " if fittest_individual:\n", + " return fittest_individual, generation\n", + " return max(population, key=fitness_fn) , generation " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The function defined above is essentially the same as the one defined in `search.py` with the added functionality of printing out the data of each generation." + ] + }, + { + "cell_type": "code", + "execution_count": 70, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def genetic_algorithm(population, fitness_fn, gene_pool=[0, 1], f_thres=None, ngen=1000, pmut=0.1):\n",
    +       "    """[Figure 4.8]"""\n",
    +       "    for i in range(ngen):\n",
    +       "        population = [mutate(recombine(*select(2, population, fitness_fn)), gene_pool, pmut)\n",
    +       "                      for i in range(len(population))]\n",
    +       "\n",
    +       "        fittest_individual = fitness_threshold(fitness_fn, f_thres, population)\n",
    +       "        if fittest_individual:\n",
    +       "            return fittest_individual\n",
    +       "\n",
    +       "\n",
    +       "    return argmax(population, key=fitness_fn)\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(genetic_algorithm)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We have defined all the required functions and variables. Let's now create a new population and test the function we wrote above." + ] + }, + { + "cell_type": "code", + "execution_count": 71, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Current best: Genetic Algorithm\t\tGeneration: 985\t\tFitness: 17\r" + ] + } + ], + "source": [ + "population = init_population(max_population, gene_pool, len(target))\n", + "solution, generations = genetic_algorithm_stepwise(population, fitness_fn, gene_pool, f_thres, ngen, mutation_rate)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The genetic algorithm was able to converge!\n", + "We implore you to rerun the above cell and play around with `target, max_population, f_thres, ngen` etc parameters to get a better intuition of how the algorithm works. To summarize, if we can define the problem states in simple array format and if we can create a fitness function to gauge how good or bad our approximate solutions are, there is a high chance that we can get a satisfactory solution using a genetic algorithm. \n", + "- There is also a better GUI version of this program `genetic_algorithm_example.py` in the GUI folder for you to play around with." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Usage\n", + "\n", + "Below we give two example usages for the genetic algorithm, for a graph coloring problem and the 8 queens problem.\n", + "\n", + "#### Graph Coloring\n", + "\n", + "First we will take on the simpler problem of coloring a small graph with two colors. Before we do anything, let's imagine how a solution might look. First, we have to represent our colors. 
Say, 'R' for red and 'G' for green. These make up our gene pool. What of the individual solutions though? For that, we will look at our problem. We stated we have a graph. A graph has nodes and edges, and we want to color the nodes. Naturally, we want to store each node's color. If we have four nodes, we can store their colors in a list of genes, one for each node. A possible solution will then look like this: ['R', 'R', 'G', 'R']. In the general case, we will represent each solution with a list of chars ('R' and 'G'), whose length equals the number of nodes.\n", + "\n", + "Next we need to come up with a fitness function that appropriately scores individuals. Again, we will look at the problem definition at hand. We want to color a graph. For a solution to be optimal, no edge should connect two nodes of the same color. How can we use this information to score a solution? A naive (and ineffective) approach would be to count the different colors in the string. So ['R', 'R', 'R', 'R'] has a score of 1 and ['R', 'R', 'G', 'G'] has a score of 2. Why is that fitness function not ideal, though? Because we forgot the information about the edges! The edges are pivotal to the problem and the above function only deals with node colors. We didn't use all the information at hand and ended up with an ineffective answer. How, then, can we use that information to our advantage?\n", + "\n", + "We said that the optimal solution will have all the edges connecting nodes of different color. So, to score a solution we can count how many edges are valid (aka connecting nodes of different color). That is a great fitness function!\n", + "\n", + "Let's jump into solving this problem using the `genetic_algorithm` function." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "First we need to represent the graph. Since we mostly need information about edges, we will just store the edges. 
We will denote edges with capital letters and nodes with integers:" + ] + }, + { + "cell_type": "code", + "execution_count": 72, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "edges = {\n", + " 'A': [0, 1],\n", + " 'B': [0, 3],\n", + " 'C': [1, 2],\n", + " 'D': [2, 3]\n", + "}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Edge 'A' connects nodes 0 and 1, edge 'B' connects nodes 0 and 3 etc.\n", + "\n", + "We already said our gene pool is 'R' and 'G', so we can jump right into initializing our population. Since we have only four nodes, `state_length` should be 4. For the number of individuals, we will try 8. We can increase this number if we need higher accuracy, but be careful! Larger populations need more computing power and take longer. You need to strike that sweet balance between accuracy and cost (the ultimate dilemma of the programmer!)." + ] + }, + { + "cell_type": "code", + "execution_count": 73, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[['R', 'G', 'G', 'G'], ['G', 'R', 'R', 'G'], ['G', 'G', 'G', 'G'], ['G', 'R', 'G', 'G'], ['G', 'G', 'G', 'R'], ['G', 'R', 'R', 'G'], ['G', 'R', 'G', 'G'], ['G', 'G', 'R', 'G']]\n" + ] + } + ], + "source": [ + "population = init_population(8, ['R', 'G'], 4)\n", + "print(population)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We created and printed the population. You can see that the genes in the individuals are random and there are 8 individuals each with 4 genes.\n", + "\n", + "Next we need to write our fitness function. We previously said we want the function to count how many edges are valid. 
So, given a coloring/individual `c`, we will do just that:" + ] + }, + { + "cell_type": "code", + "execution_count": 74, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def fitness(c):\n", + " return sum(c[n1] != c[n2] for (n1, n2) in edges.values())" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Great! Now we will run the genetic algorithm and see what solution it gives." + ] + }, + { + "cell_type": "code", + "execution_count": 75, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "['R', 'G', 'R', 'G']\n" + ] + } + ], + "source": [ + "solution = genetic_algorithm(population, fitness, gene_pool=['R', 'G'])\n", + "print(solution)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The algorithm converged to a solution. Let's check its score:" + ] + }, + { + "cell_type": "code", + "execution_count": 76, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "4\n" + ] + } + ], + "source": [ + "print(fitness(solution))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The solution has a score of 4. Which means it is optimal, since we have exactly 4 edges in our graph, meaning all are valid!\n", + "\n", + "*NOTE: Because the algorithm is non-deterministic, there is a chance a different solution is given. It might even be wrong, if we are very unlucky!*" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Eight Queens\n", + "\n", + "Let's take a look at a more complicated problem.\n", + "\n", + "In the *Eight Queens* problem, we are tasked with placing eight queens on an 8x8 chessboard without any queen threatening the others (aka queens should not be in the same row, column or diagonal). 
In its general form the problem is defined as placing *N* queens on an NxN chessboard without any conflicts.\n", + "\n", + "First we need to think about the representation of each solution. We can go the naive route of representing the whole chessboard with the queens' placements on it. That is definitely one way to go about it, but for the purpose of this tutorial we will do something different. We have eight queens, so we will have a gene for each of them. The gene pool will be numbers from 0 to 7, for the different columns. The *position* of the gene in the state will denote the row the particular queen is placed in.\n", + "\n", + "For example, we can have the state \"03304577\". Here the first gene with a value of 0 means \"the queen at row 0 is placed at column 0\", for the second gene \"the queen at row 1 is placed at column 3\" and so forth.\n", + "\n", + "We now need to think about the fitness function. On the graph coloring problem we counted the valid edges. The same thought process can be applied here. Instead of edges though, we have positioning between queens. If two queens are not threatening each other, we say they are at a \"non-attacking\" positioning. We can, therefore, count how many such positionings there are.\n", + "\n", + "Let's dive right in and initialize our population:" + ] + }, + { + "cell_type": "code", + "execution_count": 77, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[[2, 6, 2, 0, 2, 3, 4, 7], [7, 2, 0, 6, 3, 3, 0, 6], [2, 3, 0, 6, 6, 2, 5, 5], [2, 6, 4, 2, 3, 5, 5, 5], [3, 1, 5, 1, 5, 1, 0, 3]]\n" + ] + } + ], + "source": [ + "population = init_population(100, range(8), 8)\n", + "print(population[:5])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We have a population of 100 and each individual has 8 genes. The gene pool is the integers from 0 to 7. 
Above you can see the first five individuals.\n", + "\n", + "Next we need to write our fitness function. Remember, queens threaten each other if they are at the same row, column or diagonal.\n", + "\n", + "Since positionings are mutual, we must take care not to count them twice. Therefore for each queen, we will only check for conflicts for the queens after her.\n", + "\n", + "A gene's value in an individual `q` denotes the queen's column, and the position of the gene denotes its row. We can check if the aforementioned values between two genes are the same. We also need to check for diagonals. A queen *a* is in the diagonal of another queen, *b*, if the difference of the rows between them is equal to either their difference in columns (for the diagonal on the right of *a*) or equal to the negative difference of their columns (for the left diagonal of *a*). Below is given the fitness function." + ] + }, + { + "cell_type": "code", + "execution_count": 78, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def fitness(q):\n", + " non_attacking = 0\n", + " for row1 in range(len(q)):\n", + " for row2 in range(row1+1, len(q)):\n", + " col1 = int(q[row1])\n", + " col2 = int(q[row2])\n", + " row_diff = row1 - row2\n", + " col_diff = col1 - col2\n", + "\n", + " if col1 != col2 and row_diff != col_diff and row_diff != -col_diff:\n", + " non_attacking += 1\n", + "\n", + " return non_attacking" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Note that the best score achievable is 28. That is because for each queen we only check for the queens after her. For the first queen we check 7 other queens, for the second queen 6 others and so on. In short, the number of checks we make is the sum 7+6+5+...+1. Which is equal to 7\\*(7+1)/2 = 28.\n", + "\n", + "Because it is very hard and will take long to find a perfect solution, we will set the fitness threshold at 25. 
If we find an individual with a score greater than or equal to that, we will halt. Let's see how the genetic algorithm will fare." + ] + }, + { + "cell_type": "code", + "execution_count": 79, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[2, 5, 7, 1, 3, 6, 4, 6]\n", + "25\n" + ] + } + ], + "source": [ + "solution = genetic_algorithm(population, fitness, f_thres=25, gene_pool=range(8))\n", + "print(solution)\n", + "print(fitness(solution))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Above you can see the solution and its fitness score, which should be no less than 25." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This concludes our discussion of genetic algorithms." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### N-Queens Problem\n", + "Here, we will look at the generalized case of the Eight Queens problem.\n", + "
    \n", + "We are given a `N` x `N` chessboard, with `N` queens, and we need to place them in such a way that no two queens can attack each other.\n", + "
    \n", + "We will solve this problem using search algorithms.\n", + "To do this, we already have a `NQueensProblem` class in `search.py`." + ] + }, + { + "cell_type": "code", + "execution_count": 80, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    class NQueensProblem(Problem):\n",
    +       "\n",
    +       "    """The problem of placing N queens on an NxN board with none attacking\n",
    +       "    each other.  A state is represented as an N-element array, where\n",
    +       "    a value of r in the c-th entry means there is a queen at column c,\n",
    +       "    row r, and a value of -1 means that the c-th column has not been\n",
    +       "    filled in yet.  We fill in columns left to right.\n",
    +       "    >>> depth_first_tree_search(NQueensProblem(8))\n",
    +       "    <Node (7, 3, 0, 2, 5, 1, 6, 4)>\n",
    +       "    """\n",
    +       "\n",
    +       "    def __init__(self, N):\n",
    +       "        self.N = N\n",
    +       "        self.initial = tuple([-1] * N)\n",
    +       "        Problem.__init__(self, self.initial)\n",
    +       "\n",
    +       "    def actions(self, state):\n",
    +       "        """In the leftmost empty column, try all non-conflicting rows."""\n",
    +       "        if state[-1] != -1:\n",
    +       "            return []  # All columns filled; no successors\n",
    +       "        else:\n",
    +       "            col = state.index(-1)\n",
    +       "            return [row for row in range(self.N)\n",
    +       "                    if not self.conflicted(state, row, col)]\n",
    +       "\n",
    +       "    def result(self, state, row):\n",
    +       "        """Place the next queen at the given row."""\n",
    +       "        col = state.index(-1)\n",
    +       "        new = list(state[:])\n",
    +       "        new[col] = row\n",
    +       "        return tuple(new)\n",
    +       "\n",
    +       "    def conflicted(self, state, row, col):\n",
    +       "        """Would placing a queen at (row, col) conflict with anything?"""\n",
    +       "        return any(self.conflict(row, col, state[c], c)\n",
    +       "                   for c in range(col))\n",
    +       "\n",
    +       "    def conflict(self, row1, col1, row2, col2):\n",
    +       "        """Would putting two queens in (row1, col1) and (row2, col2) conflict?"""\n",
    +       "        return (row1 == row2 or  # same row\n",
    +       "                col1 == col2 or  # same column\n",
    +       "                row1 - col1 == row2 - col2 or  # same \\ diagonal\n",
    +       "                row1 + col1 == row2 + col2)   # same / diagonal\n",
    +       "\n",
    +       "    def goal_test(self, state):\n",
    +       "        """Check if all columns filled, no conflicts."""\n",
    +       "        if state[-1] == -1:\n",
    +       "            return False\n",
    +       "        return not any(self.conflicted(state, state[col], col)\n",
    +       "                       for col in range(len(state)))\n",
    +       "\n",
    +       "    def h(self, node):\n",
    +       "        """Return number of conflicting queens for a given node"""\n",
    +       "        num_conflicts = 0\n",
    +       "        for (r1, c1) in enumerate(node.state):\n",
    +       "            for (r2, c2) in enumerate(node.state):\n",
    +       "                if (r1, c1) != (r2, c2):\n",
    +       "                    num_conflicts += self.conflict(r1, c1, r2, c2)\n",
    +       "\n",
    +       "        return num_conflicts\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(NQueensProblem)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In [`csp.ipynb`](https://github.com/aimacode/aima-python/blob/master/csp.ipynb) we have seen that the N-Queens problem can be formulated as a CSP and can be solved by \n", + "the `min_conflicts` algorithm in a way similar to Hill-Climbing. \n", + "Here, we want to solve it using heuristic search algorithms and even some classical search algorithms.\n", + "The `NQueensProblem` class derives from the `Problem` class and is implemented in such a way that the search algorithms we already have can solve it.\n", + "
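The heart of the class is `conflict`, which reduces the attack test to four comparisons: shared row, shared column, shared \ diagonal, shared / diagonal. As a quick standalone check (a sketch restating `conflict` rather than importing `search.py`), the state `(7, 3, 0, 2, 5, 1, 6, 4)` from the class docstring's `depth_first_tree_search` example is indeed conflict-free:

```python
def conflict(row1, col1, row2, col2):
    """Would queens at (row1, col1) and (row2, col2) attack each other?"""
    return (row1 == row2 or                # same row
            col1 == col2 or                # same column
            row1 - col1 == row2 - col2 or  # same \ diagonal
            row1 + col1 == row2 + col2)    # same / diagonal

state = (7, 3, 0, 2, 5, 1, 6, 4)  # state[col] = row, as in NQueensProblem
assert not any(conflict(state[c1], c1, state[c2], c2)
               for c1 in range(8) for c2 in range(c1 + 1, 8))
```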
    \n", + "Let's instantiate the class." + ] + }, + { + "cell_type": "code", + "execution_count": 81, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "nqp = NQueensProblem(8)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's use `depth_first_tree_search` first.\n", + "
    \n", + "We will also use the %%timeit magic with each algorithm to see how much time they take." + ] + }, + { + "cell_type": "code", + "execution_count": 82, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "4.82 ms ± 498 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)\n" + ] + } + ], + "source": [ + "%%timeit\n", + "depth_first_tree_search(nqp)" + ] + }, + { + "cell_type": "code", + "execution_count": 83, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "dfts = depth_first_tree_search(nqp).solution()" + ] + }, + { + "cell_type": "code", + "execution_count": 84, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAewAAAHwCAYAAABkPlyAAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4wLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvpW3flQAAIABJREFUeJzt3X+4FdWd7/nP93IOIIZfBw6YAGOgkyczHQO2nBa7iQwxpA0IRmd6umGMXs1kuJO5hiDY6Zbn6Scmz41mVCB07OncXGnw3jagaduI2lGiEQwYtQ+00jHpnseAiYj8OMIJ6DERuGv+qLM9e+9TVbvO3lW7dlW9X8+zn7131aq11t6Lw3evVatWmXNOAACgtf27tCsAAABqI2ADAJABBGwAADKAgA0AQAYQsAEAyAACNgAAGUDABgAgAwjYAABkAAEbaDFm9kEz+0czO2Fmh83sbjNrC0k/zsz+pj9tn5n9i5n9+2bWGUDyCNhA6/l/JR2V9H5JF0r6nyX9334JzWy4pCclnS/pDySNlfRnku4wsxVNqS2ApiBgA61nuqQHnHO/cc4dlvS4pI8GpL1W0v8g6X9zzh1wzp12zj0uaYWk/2RmoyXJzJyZfah0kJltNrP/VPZ+sZm9aGa9Zvasmc0s2/cBM3vQzI6Z2YHyHwJmdquZPWBm/9XMTpnZy2bWVbb/z83s9f59/2Zmn4znKwKKh4ANtJ4Nkpaa2SgzmyJpobyg7edTkn7gnHu7avuDkkZJuqRWYWZ2kaS/lfQfJE2Q9J8lbTOzEWb27yQ9IuklSVMkfVLSSjO7vCyLKyVtlTRO0jZJd/fn+xFJN0r6fefcaEmXS3q1Vn0A+CNgA61np7we9UlJByV1S/p+QNqJkt6o3uicOyOpR1JnhPL+T0n/2Tn3vHPurHPuXkm/lRfsf19Sp3Pua865d51z+yX9F0lLy47f5Zz7R+fcWUn/TdKs/u1nJY2Q9Ltm1u6ce9U594sI9QHgg4ANtJD+Hu0Tkv5B0rnyAvJ4Sf9PwCE98s51V+fT1n/ssQjFni9pdf9weK+Z9UqaJukD/fs+ULVvjaTJZccfLnvdJ2mkmbU5516RtFLSrZKOmtlWM/tAhPoA8EHABlpLh7xgebdz7rfOuTclbZK0KCD9k5IWmtm5Vdv/V0mnJb3Q/75P3hB5yXllr1+T9HXn3Liyxyjn3Jb+fQeq9o12zgXVp4Jz7rvOuY/LC/xOwT88ANRAwAZaiHOuR9IBSV8wszYzGyf
p38s7h+znv8kbNv9e/+Vg7f3nl/9K0h3OuV/3p3tR0v9uZsPM7NPyZp6X/BdJ/5eZzTHPuWZ2Rf+EtRckneyfPHZO//EXmNnv1/osZvYRM7vMzEZI+o2kd+QNkwOoAwEbaD3/i6RPyxvOfkXSGUk3+SV0zv1W0gJ5PeHn5QXFxyV9U9JXy5J+SdISSb2SrlHZOXHnXLe889h3SzrRX+b1/fvO9h93obwfEj2S7pF3+VgtIyR9o/+Yw5ImyRtOB1AHc86lXQcAMTGzdkk/kPS6pOsdf+BAbtDDBnLEOXda3vnrX0j6SMrVARAjetgAAGQAPWwAADIg8IYCzTJx4kT3wQ9+MO1qJGbPnj1pVyFRs2fPTrsKiaMNs432y768t6GkHudczUWOUh8S7+rqct3d3anWIUlmlnYVEhXrv589MXxXs+P/90wbZhvtl315b0NJe5xzXbUSMSSOdB250wvUcQRraSCvI2vjyQ8AWgQBG+k4/aYXWA9+OZn8D97s5X/6SDL5A0CTpX4OGwUUV286in39K3AmMFQOAM1EDxvN1cxg3QrlAkBMCNhojr0j0g+ae0w6vjXdOgBAnQjYSN4ek9y7DWdz4x0x1OXAsvR/OABAHTiHjWTtHdlwFlZ2scNfP+A9u0avBNw7Qrrotw1mAgDNQw8byXK1g2LnAum+H/jvs4ArE4O2RxZDjx8AmomAjeTUGHq2Lu/R0yt99i8bD8Kl/EqPC/6ksfoBQCshYCMZNYLht+73315v0PY77uX9EQ4kaAPICAI24nfmaM0kK+5sQj0U8QfAmZ7E6wEAjSJgI34vTY4tq6DJZQ1POiv3Us019wEgdcwSR7zeGLj2yq93Wwq0rjv68Lfrlk71SWPmSSefkUaPil6dTV8ZeB1WHx1eL513U/SMAaDJ6GEjXof+XFJwMD5YNlo+d9bg/UE951KQDgrWQcddv8R7/tVh//3v1fP1Vf4JAKBFELDRVNMWDbzetbEy0IYNc3/4au95wmXBaarzKn9//uKh1RMAWg0BG/FpcMb16yFz1V55zXs+fjI4Tdi+SJgxDqCFEbDRVIvmBu+buih4XxRhve/FlzaWNwCkjYCNRPTt9t/+2Ibm1qPkkfX+2995trn1AIB6EbARj9OVs7rOGeGdQz5nxMC2KJdibX6kvuIf3lk7TXn5o0Z670cOr0p0+lh9FQCAhBGwEY997/fd3LdbOv289zrKZVw3fHXwtjNnK9/39A5Oc9Xq2nmXyu/dIb29KyDRvkm1MwKAFBCwkbi2YY0dP/ySyvedCxrLb+z7GjseANJAwEZTRellL11T+d658PSf+1o85QJAKyNgo+Xcv31o6TdtS6YeANBKEgnYZvZpM/s3M3vFzP4iiTLQWlati5622b3doZQ3lM8BAM0Ue8A2s2GS/lrSQkm/K2mZmf1u3OWgtayLeWXPL9weLV3cd/2K+3MAQFyS6GFfLOkV59x+59y7krZK+kwC5SDDFq8M3//tB73nnXv99297xnsOuq92SfXs8euuqF03AGhFSQTsKZJeK3t/sH/be8xsuZl1m1n3sWNc91oE0z9Q+f6xoMuqqsxf7r/9MxF7wtXXZ9/rc9kYAGRBEgHbb0Hminm+zrnvOOe6nHNdnZ3ci7gIfnzP4G0LV4Qf0xGy1Kgkjf9E+P6Va8P3A0CWJBGwD0qaVvZ+qqRDCZSDVjIrfKRkis96JI/XWBb0RI2befSeCt+/YUv4fl8ze+o4CACSl0TA/idJHzaz6WY2XNJSSVx4k3dtE+s6LKkZ41ffXOeB7RNirQcAxKUt7gydc2fM7EZJT0gaJulvnXMvx10OEOb7O9KuAQDEK/aALUnOuX+U9I9J5I3smtwhHTmeXvlzLkivbABoFCudIT6zw9cQPTzEFczKfexD0oKLpd+ZWn8ez22ukaBG/QEgTYn0sIEgrjv4vPWiuY3dL/vyG6XtzwWXCwBZRsBGvKbeJR0Mn/HVu0MaN997fWS7NKmjcv/
1t0r3Phq9yLmzpF0bpSfuHth24JA040rvdaSe/bS/il4gAKSAIXHEa3LtG1OXbm/pur1gvXW71+suPYYSrCVp90uVx295wluopdSrntwRfrwkadIXh1YoADSZuVr3LkxYV1eX6+7O73ilmd86Mvnh++/n9DFpn8+F11WiXtK1ZJ50wxJp/mzpxCnpJ/uk2zZJP9sfoX5R/mnN7Am9nKuQbZgjtF/25b0NJe1xztX8H5EhccSvvf7V67at8wJ0kPFjpBlTpGsWVm7f9aJ06efrLJRrrwFkAAEbyZjtpD3hv4pLE9Da26R3qyaLDWVBFdctffzCgd50+xzpzNmIvWtmhgPICAI2khMhaEsDwbreVc/Kjzv7gnT6+Yh5EawBZAiTzpCs6bUX9C5NFvNz63LpxNNeb7n06Nvtbfcz7OKIwXr69yIkAoDWwaSzhOV9skSkfz8BvezqwHrVfOmhu+qvy7I13ozzcoHD4kPoXdOG2Ub7ZV/e21BMOkPLmO2kvaMk986gXT1PSRPGVm4bPU96qy969h1jpDd/JG25zXtI0jc2S7fc7ZN4+hapY2n0zAGgRRCw0RwX9Ufgqt522zBp+pXSqw3cgPX4ycre+i8fHdzTlsQ5awCZxjlsNFdZ0HTd0sM7GwvWfs5f7F23XTEcTrAGkHH0sNF8s510+ri0b4Kuu0K67ooEy5p5tKHrwgGgVdDDRjraO7zAPW19MvlP2+DlT7AGkBP0sJGuSSu9hxTpmu2aGPoGkFP0sNE6ZruBx6wTg3av9uuMz3yj8jgAyCl62GhNbeMGBeC1f5dSXQCgBdDDBgAgAwjYAABkAAEbAIAMIGADAJABqd/8w8xyPbU37e83aQVYlJ82zDjaL/sK0Ibc/AMAEnP2hPRiR8Wm1eultTdVpZt5SGp/f/Pqhdyih52wtL/fpPHrPvvy3oaxtl8LLu6T9/aTCvE3GKmHzTlsAAhz5E4vUMcRrKWBvI6sjSc/FAY97ISl/f0mjV/32Zf3Nqy7/U6/Ke2bGG9l/Mw8LLVPrvvwvLefVIi/Qc5hA0Bd4upNR7HvPO+ZpXVRA0PiAFCumcG6FcpFZhCwAUCS9o5IP2juMen41nTrgJZFwAaAPSa5dxvO5sY7YqjLgWXp/3BAS2LSWcLS/n6TxoSX7Mt7G9Zsv70jJffbhsown+lCrruhLCUbLl1Uu155bz+pEH+DXNYFADVFCNadC6T7fuC/zy9Yh22PLIYeP/KFHnbC0v5+k8av++zLexuGtl+NoecoPeewwFwr7UdnSD99ILQKNWeP5739pEL8DdLDBoBANYL1t+73315vz9nvuJf3RziQ89noR8AGUDxnjtZMsuLOJtRDEX8AnOlJvB5ofQRsAMXzUv0ri1ULmlzW8KSzci91xpgZsoqVzgAUyxsD116FnaN23dGHv123dKpPGjNPOvmMNHpU9Ops+srA69Bz5ofXS+dV3woMRUIPG0CxHPpzScHB+GDZaPncWYP3B/WcS0E6KFgHHXf9Eu/5V4f9979Xz9dX+SdAYRCwAaDMtEUDr3dtrAy0YcPcH77ae55wWXCa6rzK35+/eGj1RPEQsAEUR4Mzrl8Pmav2ymve8/GTwWnC9kXCjPFCI2ADQJlFc4P3TV0UvC+KsN734ksbyxv5R8AGUEh9u/23P7ahufUoeWS9//Z3nm1uPdC6CNgAiuF05ayuc0Z455DPGTGwLcqlWJsfqa/4h3fWTlNe/qiR3vuRw6sSnT5WXwWQeSxNmrC0v9+ksSxi9uW9Dd9rv5Dzv2fOSu1z+tP7BO3qGeXVacqPl6RjT0oTxw0tj/I0vTukse8LrG7FcqV5bz+pEH+DLE0KAFG0DWvs+OGXVL7vXNBYfqHBGoVFwAaAMlEWS1m6pvJ9rQ7g574WT7kottgDtpn9rZkdNbOfxp03ALSC+7cPLf2mbcnUA8WSRA97s6RPJ5AvANRt1broaZvd2x1KeUP5HMiX2AO2c+4ZScfjzhcAGrE
u5pU9v3B7tHRx3/Ur7s+B7OAcNgD4WLwyfP+3H/Sed+7137/tGe856L7aJVetrnx/3RW164ZiSiVgm9lyM+s2szhvQAcAdZv+gcr3j+2Kdtz85f7bPxOxJ1x9ffa9X412HIonlYDtnPuOc64rynVnANAMP75n8LaFK8KP6QhZalSSxn8ifP/KteH7gXIMiQMohlnhK4RNmTR42+M1lgU9UeNmHr2nwvdv2BK+39fMnjoOQh4kcVnXFkk/kfQRMztoZv9H3GUAwJC1TazrsKRmjF99c50Htk+ItR7Ijra4M3TOLYs7TwDIm+/vSLsGyBqGxAGg3+SOdMufc0G65aO1cfOPhKX9/SaNGw9kX97bcFD7hdwERKp/CPxjH/IC/oFD0i8O1pdHzbuFzR78bzHv7ScV4m8w0s0/Yh8SB4Asc93BQXvR3Mbul335jdL254LLBcIQsAEUy9S7pIPhM756d0jj5nuvj2yXJlUNlV9/q3Tvo9GLnDtL2rVReuLugW0HDkkzrvReH46yNvm0v4peIHKJIfGEpf39Jo3huOzLexv6tl+NYXHJ62WXer1bt0vL1oSnH4rvfl1advngckL5DIdL+W8/qRB/g5GGxAnYCUv7+00a/1lkX97b0Lf9Th+T9vlceF0l6vnsJfOkG5ZI82dLJ05JP9kn3bZJ+tn+CPWLEqxn9gRezpX39pMK8TfIOWwA8NXeWfeh29Z5ATrI+DHSjCnSNQsrt+96Ubr083UWyrXXED3sxKX9/SaNX/fZl/c2DG2/iEPj7W3Su88N3h65DlW96PY50pmzjQ2Fv1ePnLefVIi/QXrYABBqtosUtEvBut5LvsqPO/uCdPr5iHnVCNYoFhZOAVBs02sv6G1dwQH21uXSiae93nLp0bfb2+5n2MURg/X070VIhCJhSDxhaX+/SWM4Lvvy3oaR2i+gl10dWK+aLz10V/11WbbGm3FeLnBYPGLvOu/tJxXib5BZ4q0g7e83afxnkX15b8PI7bd3lOTeqdhkXVLPU9KEsZVJR8+T3uqLXoeOMdKbP6rc9o3N0i13+wTs6VukjqWR8857+0mF+BvkHDYARHZRfwSu6m23DZOmXym9eqj+rI+frOyt//LRwT1tSZyzRijOYQNAubKg6bqlh3c2Fqz9nL/Yu267ondNsEYNDIknLO3vN2kMx2Vf3tuw7vY7fVza14Trn2cebei68Ly3n1SIv8FIQ+L0sAHAT3uH1+udtj6Z/Kdt8PJvIFijWOhhJyzt7zdp/LrPvry3YaztF+Ga7ZpiHvrOe/tJhfgbpIcNALGa7QYes04M2r3arzM+843K44A60cNOWNrfb9L4dZ99eW9D2i/7CtCG9LABAMgLAjYAABlAwAYAIANSX+ls9uzZ6u6Oco+5bMr7+aW8n1uSaMOso/2yL+9tGBU9bAAAMiD1HjZQFIF3ZRqCeu/HDCD76GEDCbr52oF7JMehlNeqa+LJD0B2ELCBBHSM8QLrnV9KJv+1N3n5T+pIJn8ArYchcSBmcfWmozjSf4tGhsqB/KOHDcSomcG6FcoF0DwEbCAGv3k2/aDpuqU//VS6dQCQHAI20CDXLY0Y3ng+N97ReB5bb0//hwOAZHAOG2jAO7sbz6P8/PNfP+A9Nxp0f/OsNPIPG8sDQGuhhw00YOSI2mk6F0j3/cB/X9BksUYnkcXR4wfQWgjYQJ1q9YKty3v09Eqf/cvGg3Apv9Ljgj9prH4AsoWADdShVjD81v3+2+sN2n7Hvby/9nEEbSA/CNjAEHVGWKxkxZ3J10OK9gNgwtjk6wEgeQRsYIiObo8vr6AecJw9456n4ssLQHqYJQ4MwZ9dO/Dar3dbCrSuO/rwt+uWTvVJY+ZJJ5+RRo+KXp9NX4lWn5XLpG9uiZ4vgNZDDxsYgjv61wYPCsYHjw68njtr8P6gnnMpSAcF66Djrl/iPf/qsP/+Uj3Xr/bfDyA7CNhAjKYtGni9a2NloA0b5v7w1d7zhMuC01T
nVf7+/MVDqyeA7CFgAxE1el759aPB+155zXs+fjI4Tdi+KJgxDmQbARuI0aK5wfumLgreF0VY73vxpY3lDaD1EbCBOvQFLEn62Ibm1qPkkfX+2995trn1AJAcAjYQweQJle/PGeENMZ9TtjRplCHnzY/UV/7DO2unKS9/1Ejv/ciqJUonjquvfADpI2ADERx+wn97327p9PPe6yiXcd3w1cHbzpytfN/TOzjNVRFmeZfK790hvb3LP82xJ2vnA6A1EbCBBrUNa+z44ZdUvu9c0Fh+Y9/X2PEAWhMBG4hRlF720jWV750LT/+5r8VTLoBsI2ADTXb/EJc23bQtmXoAyJbYA7aZTTOzp83s52b2spl9Ke4ygGZbtS562mb3dodS3lA+B4DWkkQP+4yk1c65/0nSJZL+o5n9bgLlAE2zblW8+X3h9mjp4r7rV9yfA0DzxB6wnXNvOOf29r8+JennkqbEXQ7QyhavDN//7Qe95517/fdve8Z7Drqvdkn17PHrrqhdNwDZlOg5bDP7oKTfk/R81fblZtZtZt3Hjh1LsgpAU0z/QOX7xwIuq6o2f7n/9s9E7AlXX599r89lYwDyIbGAbWbvk/SgpJXOuYpVkJ1z33HOdTnnujo7O5OqAtA0P75n8LaFK8KP6QhZalSSxn8ifP/KteH7AeRLIgHbzNrlBev7nHP/kEQZQDNN/GT4/imTBm97vMayoCdq3Myj91T4/g113N86bD1yAK0tiVniJmmjpJ8755iTilx489f1HZfUjPGrb67vuEbv+AUgPUn0sOdKulbSZWb2Yv+jwfsUASj3/R1p1wBAs7XFnaFzbpckiztfoNVN7pCOHE+v/DkXpFc2gOSx0hkQUa3h7cNDXMGs3Mc+JC24WPqdqfXn8dzm8P0sXwpkW+w9bKDIXHdwYFw0t7H7ZV9+o7T9ueByAeQbARsYgtXrpbU3hafp3SGNm++9PrJdmtRRuf/6W6V7H41e5txZ0q6N0hN3D2w7cEiacaX3OkrP/osxr5gGoPnM1bpVUMK6urpcd3d+uwfepPn8SvvfTzNUt2GU3qx1DaTbul1atiY8/VB89+vSsssHl1OrPkHy3ob8DWZf3ttQ0h7nXM2TVgTshOX9H1ra/36aoboNJ46Tjj0Z4biI54yXzJNuWCLNny2dOCX9ZJ902ybpZ/trHxslWE+4LPxyrry3IX+D2Zf3NlTEgM2QODBEPb31H7ttnRegg4wfI82YIl2zsHL7rhelSz9fX5lcew3kAwEbqEOUoejSBLT2NundqsliQ5mx7bqlj184UF77HOnM2caHwgFkCwEbqFPU88elYF1v8Cw/7uwL0unno+VFsAbyheuwgQYsvaV2GusKDp63LpdOPO0F/tKjb7e33c+wi6MF4j/+cu00ALKFSWcJy/tkibT//TRDrTYM6mVXB9ar5ksP3VV/PZat8Wac11N2mLy3IX+D2Zf3NhSTzoDmsC7p7V3SqJGD9/U8JU0YW7lt9Dzprb7o+XeMkd78kbTlNu8hSd/YLN1y9+C0S2+R7v9h9LwBZAcBG4jBuR/3nqt7vG3DpOlXSq8eqj/v4ycre8y/fHRwT1vinDWQd5zDBmJUHjRdt/TwzsaCtZ/zF3vXbZf/OCBYA/lHDxuImXVJ40dLx5+WrrvCeySlc0Fj14UDyA562EACTpzyAvfKtcnkv+JOL3+CNVAc9LCBBG3Y4j2keO6oxdA3UFz0sIEmKV2PbV0Dd/Mqt3r94G3nXV55HIDioocNpODXb/kH4HX3Nb8uALKBHjYAABlAwAYAIAMI2AAAZAABGwCADEj95h9mluuV69P+fpNWgEX5acOMo/2yrwBtyM0/cu3sCenFjopNq9dLa2+qSjfzkNT+/ubVCwCQCHrYCYv1+90Twy/p2fF+3fy6z768tyHtl30FaMNIPWzOYbe6I3d6gTqOYC0N5HUkoTUzAQCJoIedsLq/39NvSvsmxlsZPzMPS+2T6z6cX/fZl/c
2pP2yrwBtyDnszIqrNx3FvvO855iHygEA8WJIvNU0M1i3QrkAgEgI2K1i74j0g+Yek45vTbcOAABfBOxWsMck927D2dx4Rwx1ObAs/R8OAIBBmHSWsJrf796RkvttQ2X43fWp4Xsv23Dpotr1YsJL9uW9DWm/7CtAG3JZVyZECNadC6T7fuC/L+geyQ3fOzmGHj8AID70sBMW+v3WGHqO0nMOC8y10n50hvTTB0KrUHP2OL/usy/vbUj7ZV8B2pAedkurEay/db//9np7zn7Hvbw/woGczwaAlkDATsOZozWTrLizCfVQxB8AZ3oSrwcAIBwBOw0v1b+yWLWgyWUNTzor91JnjJkBAOrBSmfN9sbAtVdh56hdd/Thb9ctneqTxsyTTj4jjR4VvTqbvjLwOvSc+eH10nnVtwIDADQLPexmO/TnkoKD8cGy0fK5swbvD+o5l4J0ULAOOu76Jd7zrw7773+vnq+v8k8AAGgKAnaLmbZo4PWujZWBNmyY+8NXe88TLgtOU51X+fvzFw+tngCA5iJgN1ODM65fD5mr9spr3vPxk8FpwvZFwoxxAEgNAbvFLJobvG/qouB9UYT1vhdf2ljeAIBkEbBT0rfbf/tjG5pbj5JH1vtvf+fZ5tYDAOCPgN0spytndZ0zwjuHfM6IgW1RLsXa/Eh9xT+8s3aa8vJHjfTejxxelej0sfoqAABoCEuTJuy97zfk/O+Zs1L7nP70PkG7ekZ5dZry4yXp2JPSxHFDy6M8Te8Oaez7AqtbsVwpyyJmX97bkPbLvgK0IUuTZkXbsMaOH35J5fvOBY3lFxqsAQCpIGC3mCiLpSxdU/m+1o/Pz30tnnIBAOmJPWCb2Ugze8HMXjKzl83sq3GXUXT3bx9a+k3bkqkHAKB5kuhh/1bSZc65WZIulPRpM7ukxjG5t2pd9LTN7u0OpbyhfA4AQHxiD9jO81b/2/b+R75nDESwLuaVPb9we7R0cd/1K+7PAQCIJpFz2GY2zMxelHRU0g+dc89X7V9uZt1mFuc9pXJl8crw/d9+0Hveudd//7ZnvOeg+2qXXLW68v11V9SuGwCg+RK9rMvMxkl6SNIXnXM/DUiT6953lMu6JGnGldKBQ1XH9v+cCRqyrnVHr7D9QXlHui0nl3XlSt7bkPbLvgK0YfqXdTnneiXtkPTpJMvJgx/fM3jbwhXhx3SELDUqSeM/Eb5/5drw/QCA1pHELPHO/p61zOwcSQsk/Wvc5WTOrPAVwqZMGrzt8RrLgp6ocTOP3lPh+zdsCd/va2ZPHQcBABrVlkCe75d0r5kNk/eD4AHn3KMJlJMtbRPrOiypGeNX31znge0TYq0HACCa2AO2c26fpN+LO1/E6/s70q4BAGAoWOmshUzuSLf8ORekWz4AIBg3/0jYoO+3xmzxeofAP/YhL+AfOCT94mB9edScIT57cFMxQzX78t6GtF/2FaANI80ST+IcNhoQdinWormN3S/78hul7c8FlwsAaF0E7Gabepd0MHzGV+8Oadx87/WR7dKkqqHy62+V7h3CNL65s6RdG6Un7h7YduCQd+23JB2Osjb5tL+KXiAAIHYMiSfM9/utMSwueb3sUq9363Zp2Zrw9EPx3a9Lyy4fXE4on+FwieG4PMh7G9J+2VeANow0JE7ATpjv93v6mLTP58LrKlHPZy+ZJ92wRJo/WzpxSvrJPum2TdLP9keoX5RgPbMn8HIu/rPIvry3Ie2XfQVoQ85ht6z2zroP3bbOC9BBxo+RZkyRrllYuX3Xi9Kln6+zUK69BoDU0cNOWOj3G3FovL1Neve5wdsj16GqF90+RzpztrGh8Pfqwa//wb/SAAAgAElEQVT7zMt7G9J+2VeANqSH3fJmu0hBuxSs673kq/y4sy9Ip5+PmFeNYA0AaB4WTknb9NoLeltXcIC9dbl04mmvt1x69O32tvsZdnHEYD39exESAQCahSHxhEX6fgN62dWB9ar50kN31V+XZWu8Gef
lAofFI/auGY7Lvry3Ie2XfQVoQ2aJt4LI3+/eUZJ7p2KTdUk9T0kTxlYmHT1Peqsveh06xkhv/qhy2zc2S7fc7ROwp2+ROpZGzpv/LLIv721I+2VfAdqQc9iZclF/BK7qbbcNk6ZfKb16qP6sj5+s7K3/8tHBPW1JnLMGgBbGOexWUxY0Xbf08M7GgrWf8xd7121X9K4J1gDQ0hgST1jd3+/p49K+Jlz/PPNoQ9eFMxyXfXlvQ9ov+wrQhpGGxOlht6r2Dq/XO219MvlP2+Dl30CwBgA0Dz3shMX6/Ua4ZrummIe++XWffXlvQ9ov+wrQhvSwc2e2G3jMOjFo92q/zvjMNyqPAwBkEj3shKX9/SaNX/fZl/c2pP2yrwBtSA8bAIC8IGADAJABBGwAADIg9ZXOZs+ere7uKPd5zKa8n1/K+7kliTbMOtov+/LehlHRwwYAIANS72EDANAsgXcoHIJItyhOAD1sAECu3XytF6jjCNbSQF6rroknv6gI2ACAXOoY4wXWO7+UTP5rb/Lyn9SRTP7VGBIHAOROXL3pKI7036446aFyetgAgFxpZrBuZrkEbABALvzm2fSCdYnrlv70U8nkTcAGAGSe65ZGDG88nxvvaDyPrbcn88OBc9gAgEx7Z3fjeZSff/7rB7znRoPub56VRv5hY3mUo4cNAMi0kSNqp+lcIN33A/99QZPFGp1EFkePvxwBGwCQWbV6wdblPXp6pc/+ZeNBuJRf6XHBnzRWv6EgYAMAMqlWMPzW/f7b6w3afse9vL/2cXEFbQI2ACBzOiMsVrLizuTrIUX7ATBhbOPlELABAJlzdHt8eQX1gOMczu55qvE8mCUOAMiUP7t24LVf77YUaF139OFv1y2d6pPGzJNOPiONHhW9Ppu+Eq0+K5dJ39wSPd9q9LABAJlyR//a4EHB+ODRgddzZw3eH9RzLgXpoGAddNz1S7znXx3231+q5/rV/vujImADAHJl2qKB17s2VgbasGHuD1/tPU+4LDhNdV7l789fPLR6DhUBGwCQGY2eV379aPC+V17zno+fDE4Tti+KRupPwAYA5MqiucH7pi4K3hdFWO978aWN5V0LARsAkEl9AUuSPrahufUoeWS9//Z3no0nfwI2ACATJk+ofH/OCG+I+ZyypUmjDDlvfqS+8h/eWTtNefmjRnrvR1YtUTpxXH3lE7ABAJlw+An/7X27pdPPe6+jXMZ1w1cHbztztvJ9T+/gNFdFmOVdKr93h/T2Lv80x56snY8fAjYAIPPahjV2/PBLKt93Lmgsv7Hva+x4PwRsAECuROllL11T+d658PSf+1o85TYikYBtZsPM7J/N7NEk8gcAoBH3D3Fp003bkqnHUCTVw/6SpJ8nlDcAoIBWrYueNunebiPlDeVzlIs9YJvZVElXSLon7rwBAMW1blW8+X3h9mjp4r7rV72fI4ke9jclfVnSfw9KYGbLzazbzLqPHTuWQBUAAEW3eGX4/m8/6D3v3Ou/f9sz3nPQfbVLqmePX3dF7brVI9aAbWaLJR11zu0JS+ec+45zrss519XZ2RlnFQAABTX9A5XvHwu4rKra/OX+2z8TsSdcfX32vT6XjcUh7h72XElXmtmrkrZKuszM/i7mMgAAGOTHPidiF64IP6YjZKlRSRr/ifD9K9eG749TrAHbOXeLc26qc+6DkpZK+pFz7rNxlgEAKKaJnwzfP2XS4G2P11gW9ESNm3n0ngrfv6GO+1uHrUcehuuwAQCZ8Oav6zsuqRnjV99c33H13vGrrb7DanPO7ZC0I6n8AQBI0/d3NLc8etgAgNyY3JFu+XMuSC5vAjYAIDNqDW8fHuIKZuU+9iFpwcXS70ytP4/nNofvb2R4PrEhcQAA0uC6gwPjormN3S/78hul7c8Fl5skAjYAIFNWr5fW3hSepneHNG6+9/rIdmlS1VD59bdK9w7hbhdzZ0m7NkpP3D2w7cAhacaV3usoPfsvNrhimrlatyhJWFd
Xl+vuTvhnSYrMLO0qJCrtfz/NQBtmG+2XfX5tGKU3a10D6bZul5atCU8/FN/9urTs8sHl1KpPgD3OuZqD5QTshPGfRfbRhtlG+2WfXxtOHCcdezLCsRHPGS+ZJ92wRJo/WzpxSvrJPum2TdLP9tc+NkqwnnBZ6OVckQI2Q+IAgMzp6a3/2G3rvAAdZPwYacYU6ZqFldt3vShd+vn6yqz32utyBGwAQCZFGYouTUBrb5PerZosNpQZ265b+viFA+W1z5HOnG14KHxICNgAgMyKev64FKzrDZ7lx519QTr9fLS84lxljeuwAQCZtvSW2mmsKzh43rpcOvG0F/hLj77d3nY/wy6OFoj/+Mu10wwFk84SxoSX7KMNs432y74obRjUy64OrFfNlx66q/66LFvjzTivp+wQTDoDABSDdUlv75JGjRy8r+cpacLYym2j50lv9UXPv2OM9OaPpC23eQ9J+sZm6Za7B6ddeot0/w+j5x0VARsAkAvnftx7ru7xtg2Tpl8pvXqo/ryPn6zsMf/y0cE9bSm5O4NJnMMGAORMedB03dLDOxsL1n7OX+xdt13+4yDJYC3RwwYA5JB1SeNHS8eflq67wnskpXNBY9eFR0UPGwCQSydOeYF75dpk8l9xp5d/M4K1RA8bAJBzG7Z4DymeO2olPfQdhB42AKAwStdjW9fA3bzKrV4/eNt5l1celxZ62ACAQvr1W/4BeN19za9LFPSwAQDIAAI2AAAZQMAGACADUl9L3MxyvRBu2t9v0vK+TrNEG2Yd7Zd9BWjDSGuJ08MGACADmCUOIDZZvsYVaHX0sAE05OZrB+4hHIdSXquuiSc/IC84h52wtL/fpHH+LPvqbcPS7QaTNvmPpKPH6z+e9su+ArQh98MGkIy4etNRHOm/hSFD5Sg6hsQBDEkzg3UrlAu0CgI2gEh+82z6QdN1S3/6qXTrAKSFgA2gJtctjRjeeD433tF4HltvT/+HA5AGJp0lLO3vN2lMeMm+Wm34zm5p5IgGy/A5/9xo0P3tu9LIP6ydrujtlwcFaEMWTgHQuCjBunOBdN8P/PcFTRZrdBJZHD1+IEvoYScs7e83afy6z76wNqzVC47Scw4LzLXSfnSG9NMHhl6HijIK3H55UYA2pIcNoH61gvW37vffXm/P2e+4l/fXPo7z2SgKAjaAQTo7aqdZcWfy9ZCi/QCYMDb5egBpI2ADGOTo9vjyCuoBx9kz7nkqvryAVsVKZwAq/Nm1A6/DzlG77ujD365bOtUnjZknnXxGGj0qen02fSVafVYuk765JXq+QNbQwwZQ4Y4vec9Bwfjg0YHXc2cN3h/Ucy4F6aBgHXTc9Uu8518d9t9fquf61f77gbwgYAMYkmmLBl7v2lgZaMOGuT98tfc84bLgNNV5lb8/f/HQ6gnkDQEbwHsaPa/8+tHgfa+85j0fPxmcJmxfFMwYR54RsAEMyaK5wfumLgreF0VY73vxpY3lDWQdARuAr77d/tsf29DcepQ8st5/+zvPNrceQFoI2AAkSZMnVL4/Z4Q3xHxO2dKkUYacNz9SX/kP76ydprz8USO99yOrliidOK6+8oFWx9KkCUv7+00ayyJmX6kNw4LxmbNS+xwFpqueUV6dpvx4STr25ODAWiuP8jS9O6Sx7wuub3leRWm/PCtAG7I0KYB4tA1r7Pjhl1S+71zQWH5hwRrIKwI2gCGJsljK0jWV72t1kD73tXjKBfIskYBtZq+a2b+Y2YtmxoUWQMHcP8SlTTdtS6YeQJ4k2cP+hHPuwijj8gDSt2pd9LTN7u0OpbyhfA4gSxgSByBJWrcq3vy+cHu0dHHf9SvuzwG0iqQCtpO03cz2mNny6p1mttzMuhkuB7Jr8crw/d9+0Hveudd//7ZnvOeg+2qXXFW1Rvh1V9SuG5BHiVzWZWYfcM4dMrNJkn4o6YvOuWcC0uZ6vn4BLkdIuwqJK0ob1rrGesaV0oFDldtKxwQNWde6o1fY/qC8o1w
LzmVd+VKANkzvsi7n3KH+56OSHpJ0cRLlAGieH98zeNvCFeHHdIQsNSpJ4z8Rvn/l2vD9QJHEHrDN7FwzG116LemPJP007nIAxGviJ8P3T5k0eNvjNZYFPVHjZh69p8L3b6jj/tZh65EDWdaWQJ6TJT3UP0zTJum7zrnHEygHQIze/HV9xyU1Y/zqm+s7rtE7fgGtKvaA7ZzbL8nntvYAEN33d6RdA6C1cFkXgMgmd6Rb/pwL0i0fSBM3/0hY2t9v0pihmn3VbVhrFna9Q+Af+5AX8A8ckn5xsL486qlb0dovjwrQhpFmiSdxDhtAjoVdirVobmP3y778Rmn7c8HlAkVGwAZQYfV6ae1N4Wl6d0jj5nuvj2yXJlUNlV9/q3Tvo9HLnDtL2rVReuLugW0HDnnXfkvS4Qhrk38x5hXTgFbDkHjC0v5+k8ZwXPb5tWHUxUlK6bZul5atCU8/FN/9urTs8sHl1KqPnyK2X94UoA0jDYkTsBOW9vebNP6zyD6/Npw4Tjr2ZIRjI57PXjJPumGJNH+2dOKU9JN90m2bpJ/tr31slGA94bLgy7mK2H55U4A25Bw2gPr09NZ/7LZ1XoAOMn6MNGOKdM3Cyu27XpQu/Xx9ZXLtNYqAHnbC0v5+k8av++wLa8OoQ9HtbdK7zw3eHlV1Oe1zpDNnGxsKfy/vArdfXhSgDelhA2hM1PPHpWBd7yVf5cedfUE6/Xy0vJp9X24gTSycAiDU0ltqp7Gu4OB563LpxNNe4C89+nZ72/0MuzhaIP7jL9dOA+QJQ+IJS/v7TRrDcdkXpQ2DetnVgfWq+dJDd9Vfl2VrvBnn9ZQdhPbLvgK0IbPEW0Ha32/S+M8i+6K24du7pFEjq47tknqekiaMrdw+ep70Vl/0OnSMkd78UeW2b2yWbrl7cMBeeot0/w+j5037ZV8B2pBz2ADic+7HvefqANo2TJp+pfTqofrzPn6yssf8y0cH97Qlzlmj2DiHDWBIyoOm65Ye3tlYsPZz/mLvuu3yHwcEaxQdQ+IJS/v7TRrDcdlXbxuOHy0dfzrmyvjoXNDYdeG0X/YVoA0jDYnTwwZQlxOnvF7vyrXJ5L/izv5z5A0EayBP6GEnLO3vN2n8us++ONswjjtqxT30TftlXwHakB42gOYqXY9tXQN38yq3ev3gbeddXnkcAH/0sBOW9vebNH7dZ1/e25D2y74CtCE9bAAA8oKADQBABhCwAQDIgNRXOps9e7a6u2OYWtqi8n5+Ke/nliTaMOtov+zLextGRQ8bAIAMIGADAJABqQ+JAwBayJ4Yhp9n53+YPg30sAGg6I7c6QXqOIK1NJDXkYTWrS0oAjYAFNXpN73AevDLyeR/8GYv/9NHksm/YBgSB4Aiiqs3HcW+87xnhsobQg8bAIqmmcG6FcrNCQI2ABTF3hHpB809Jh3fmm4dMoqADQBFsMck927D2dx4Rwx1ObAs/R8OGcQ5bADIu70jG86i/Nanf/2A99zw/c/3jpAu+m2DmRQHPWwAyDtXOyh2LpDu+4H/vqD7lDd8//IYevxFQsAGgDyrMfRsXd6jp1f67F82HoRL+ZUeF/xJY/XDAAI2AORVjWD4rfv9t9cbtP2Oe3l/hAMJ2pEQsAEgj84crZlkxZ1NqIci/gA405N4PbKOgA0AefTS5NiyCppc1vCks3IvdcaYWT4xSxwA8uaNgWuv/Hq3pUDruqMPf7tu6VSfNGaedPIZafSo6NXZ9JWB12H10eH10nk3Rc+4YOhhA0DeHPpzScHB+GDZaPncWYP3B/WcS0E6KFgHHXf9Eu/5V4f9979Xz9dX+SeAJAI2ABTOtEUDr3dtrAy0YcPcH77ae55wWXCa6rzK35+/eGj1RCUCNgDkSYMzrl8Pmav2ymve8/GTwWnC9kXCjPFABGwAKJhFc4P3TV0UvC+KsN734ksby7voCNgAkFN9u/23P7ahufUoeWS9//Z3nm1uPbKKgA0AeXG6clbXOSO8c8j
njBjYFuVSrM2P1Ff8wztrpykvf9RI7/3I4VWJTh+rrwI5R8AGgLzY937fzX27pdPPe6+jXMZ1w1cHbztztvJ9T+/gNFetrp13qfzeHdLbuwIS7ZtUO6MCImADQAG0DWvs+OGXVL7vXNBYfmPf19jxRZRIwDazcWb292b2r2b2czP7gyTKAQAMXZRe9tI1le+dC0//ua/FUy6CJdXD3iDpcefc/yhplqSfJ1QOACAB928fWvpN25KpBwbEHrDNbIykeZI2SpJz7l3nnM/ZDgBAnFati5622b3doZQ3lM9RJEn0sGdIOiZpk5n9s5ndY2bnJlAOAKDMuphX9vzC7dHSxX3Xr7g/R14kEbDbJF0k6W+cc78n6W1Jf1GewMyWm1m3mXUfO8b0fQBIw+KV4fu//aD3vHOv//5tz3jPQffVLqmePX7dFbXrhsGSCNgHJR10zvVfRKC/lxfA3+Oc+45zrss519XZyS3VAKAZpn+g8v1jQZdVVZm/3H/7ZyL2hKuvz77X57Ix1BZ7wHbOHZb0mpl9pH/TJyX9LO5yAABD8+N7Bm9buCL8mI6QpUYlafwnwvevXBu+H9EldT/sL0q6z8yGS9ov6YaEygEAlMw6Jr0UPGo5xWc9ksdrLAt6osbNPHpPhe/fsCV8v6+ZPXUclH+JBGzn3IuSuOIOAJqpbWJdhyU1Y/zqm+s8sH1CrPXIC1Y6AwAk4vs70q5BvhCwAaBAJnekW/6cC9ItP8sI2ACQJ7PD1xA9PMQVzMp97EPSgoul35lafx7Pba6RoEb9iyypSWcAgBbluoPPWy+a29j9si+/Udr+XHC5qB8BGwDyZupd0sHwGV+9O6Rx873XR7ZLk6qGyq+/Vbr30ehFzp0l7dooPXH3wLYDh6QZV3qvI/Xsp/1V9AILiCFxAMibybVvTF26vaXr9oL11u1er7v0GEqwlqTdL1Uev+UJb6GWUq860rnzSV8cWqEFY67WPdMS1tXV5bq78ztOYmZpVyFRaf/7aQbaMNsK236nj0n7fC68rhL1kq4l86QblkjzZ0snTkk/2Sfdtkn62f4IdYzyX/zMnsDLufLehpL2OOdqtgRD4gCQR+31L/u8bZ0XoIOMHyPNmCJds7By+64XpUs/X2ehXHtdEwEbAPJqtpP2hPdOSxPQ2tukd6smiw1lQRXXLX38woHedPsc6czZiL1rZoZHQsAGgDyLELSlgWBd76pn5cedfUE6/XzEvAjWkTHpDADybnrtBb1Lk8X83LpcOvG011suPfp2e9v9DLs4YrCe/r0IiVDCpLOE5X2yRNr/fpqBNsw22q9fQC+7OrBeNV966K7667NsjTfjvFzgsHjE3nXe21BMOgMAvGe2k/aOktw7g3b1PCVNGFu5bfQ86a2+6Nl3jJHe/JG05TbvIUnf2CzdcrdP4ulbpI6l0TOHJAI2ABTHRf0RuKq33TZMmn6l9Oqh+rM+frKyt/7LRwf3tCVxzroBnMMGgKIpC5quW3p4Z2PB2s/5i73rtiuGwwnWDaGHDQBFNNtJp49L+ybouiuk665IsKyZRxu6LhweetgAUFTtHV7gnrY+mfynbfDyJ1jHgh42ABTdpJXeQ4p0zXZNDH0ngh42AGDAbDfwmHVi0O7Vfp3xmW9UHodE0MMGAPhrGzcoAK/9u5TqAnrYAABkAQEbAIAMIGADAJABqa8lbma5nqGQ9vebtAKs8UsbZhztl30FaMNIa4nTwwYAIANyM0s80k3Sa6j3PrAAACQt0z3sm68duDdrHEp5rbomnvwAAIhLJs9hl27jlrTJfyQdPd5YHml/v0nj/Fn25b0Nab/sK0Ab5vN+2HH1pqM40n9rOIbKAQBpy9SQeDODdSuUCwBASSYC9m+eTT9oum7pTz+Vbh0AAMXV8gHbdUsjhjeez413NJ7H1tvT/+EAACimlp509s5uaeSIBvP3Of/caND97bvSyD+Mljbt7zdpTHjJvry3Ie2XfQVow+wvnBIlWHcukO7
7gf++oMlijU4ii6PHDwDAULRsD7tWLzhKzzksMNdK+9EZ0k8fGHodBpWT/1+GaVchcbRhttF+2VeANsxuD7tWsP7W/f7b6+05+x338v7ax3E+GwDQLC0XsDs7aqdZcWfy9ZCi/QCYMDb5egAA0HIB++j2+PIK6gHH2TPueSq+vAAACNJSK5392bUDr8POUbvu6MPfrls61SeNmSedfEYaPSp6fTZ9JVp9Vi6Tvrkler4AAAxVS/Ww7/iS9xwUjA8eHXg9d9bg/UE951KQDgrWQcddv8R7/tVh//2leq5f7b8fAIC4tFTArmXaooHXuzZWBtqwYe4PX+09T7gsOE11XuXvz188tHoCABC3lgnYjZ5Xfv1o8L5XXvOej58MThO2LwpmjAMAktQyATuKRXOD901dFLwvirDe9+JLG8sbAIBGtWTA7tvtv/2xDc2tR8kj6/23v/Nsc+sBACiulgjYkydUvj9nhDfEfE7Z0qRRhpw3P1Jf+Q/vrJ2mvPxRI733I6uWKJ04rr7yAQCopSWWJg0LxmfOSu1zvNd+6apnlFenKT9eko49OTiw1sqjPE3vDmns+4LrOyiv/C+pl3YVEkcbZhvtl30FaMPsLk1arm1YY8cPv6TyfeeCxvILC9YAACSl5QN2uSiLpSxdU/m+1g+zz30tnnIBAEhS7AHbzD5iZi+WPU6a2cq4ywly/xCXNt20LZl6AAAQp9gDtnPu35xzFzrnLpQ0W1KfpIfCjlm1Lnr+ze7tDqW8oXwOAACGIukh8U9K+oVz7pdhidatirfQL9weLV3cd/2K+3MAAFCSdMBeKmnQbTHMbLmZdZtZXeuDLa4xwP7tB73nnXv99297xnsOuq92yVVVa4Rfd0XtugEAkITELusys+GSDkn6qHPuSEi60Mu6JGnGldKBQ5XbSscEDVnXuqNX2P6gvKNcC85lXflDG2Yb7Zd9BWjD1C/rWihpb1iwjurH9/hkviL8mI6QpUYlafwnwvevXBu+HwCAZkoyYC+Tz3C4n4mfDN8/ZdLgbY/XWBb0RI2befSeCt+/oY77W4etRw4AQCMSCdhmNkrSpyT9Q5T0b/66znISmjF+9c31HdfoHb8AAAjSlkSmzrk+SRNqJmxR39+Rdg0AAKiUmZXOJnekW/6cC9ItHwBQbC1x84/S61qzsOsdAv/Yh7yAf+CQ9IuD9eVRb93S/n6TxgzV7Mt7G9J+2VeANow0SzyRIfGkhF2KtWhuY/fLvvxGaftzweUCAJCmlgrYq9dLa28KT9O7Qxo333t9ZLs0qWqo/PpbpXsfjV7m3FnSro3SE3cPbDtwyLv2W5IOR1ib/Isxr5gGAEC1lhoSl6IvTlJKt3W7tGxNePqh+O7XpWWXDy6nVn2CpP39Jo3huOzLexvSftlXgDaMNCTecgF74jjp2JMRjot4PnvJPOmGJdL82dKJU9JP9km3bZJ+tr/2sVGC9YTLwi/nSvv7TRr/WWRf3tuQ9su+ArRhNs9h9/TWf+y2dV6ADjJ+jDRjinTNwsrtu16ULv18fWVy7TUAoBlaroddEnUour1Neve5wdujqi6nfY505mzjQ+Hv5Z//X4ZpVyFxtGG20X7ZV4A2zGYPuyTq+eNSsK73kq/y486+IJ1+Plpezb4vNwCg2Fp64ZSlt9ROY13BwfPW5dKJp73AX3r07fa2+xl2cbRA/Mdfrp0GAIA4teyQeElQL7s6sF41X3rorvrrsWyNN+O8nrLDpP39Jo3huOzLexvSftlXgDbM5ixxP2/vkkaNrDquS+p5SpowtnL76HnSW33Ry+8YI735o8pt39gs3XL34IC99Bbp/h9Gz1sqxD+0tKuQONow22i/7CtAG2b7HHa5cz/uPVcH0LZh0vQrpVcP1Z/38ZOVPeZfPjq4py1xzhoAkK6WPoddrTxoum7p4Z2NBWs/5y/2rtsu/3FAsAYApC0TQ+LVxo+Wjj+dRG0qdS5o7LpwqRBDOWlXIXG0YbbRftlXgDa
MNCSeqR52yYlTXq935dpk8l9xZ/858gaDNQAAcclkD9tPHHfUSmLoO+3vN2n8us++vLch7Zd9BWjD/Paw/ZSux7augbt5lVu9fvC28y6vPA4AgFaVmx52q0r7+00av+6zL+9tSPtlXwHasFg9bAAA8oyADQBABhCwAQDIgFZY6axH0i+bWN7E/jKbIqXzS039jCnIexvSfjGi/WLX9M9XgDY8P0qi1CedNZuZdUc5uZ9lef+MfL5s4/NlW94/n9S6n5EhcQAAMoCADQBABhQxYH8n7Qo0Qd4/I58v2/h82Zb3zye16Gcs3DlsAACyqIg9bAAAMoeADQBABhQqYJvZp83s38zsFTP7i7TrEycz+1szO2pmP027Lkkws2lm9rSZ/dzMXjazL6Vdp7iZ2Ugze8HMXur/jF9Nu05xM7NhZvbPZvZo2nVJgpm9amb/YmYvmlkM9xBsLWY2zsz+3sz+tf9v8Q/SrlNczOwj/e1Wepw0s5Vp16tcYc5hm9kwSf+fpE9JOijpnyQtc879LNWKxcTM5kl6S9J/dc5dkHZ94mZm75f0fufcXjMbLWmPpKvy0n6SZN7qEOc6594ys3ZJuyR9yTn3XMpVi42ZrZLUJWmMc25x2vWJm5m9KqnLOZfLhVPM7F5JP3bO3WNmwyWNcs71pl2vuPXHi9clzXHONXNhr1BF6mFfLOkV59x+59y7krZK+kzKdYqNc+4ZScfTrkdSnHNvOOf29r8+JennkqakW6t4Oc9b/W/b+x+5+UVtZlMlXSHpnn58C9IAAAJTSURBVLTrgqEzszGS5knaKEnOuXfzGKz7fVLSL1opWEvFCthTJL1W9v6gcvYfflGY2Qcl/Z6k59OtSfz6h4xflHRU0g+dc3n6jN+U9GVJ/z3tiiTISdpuZnvMbHnalYnZDEnHJG3qP61xj5mdm3alErJU0pa0K1GtSAHbbzHa3PReisLM3ifpQUkrnXMn065P3JxzZ51zF0qaKuliM8vF6Q0zWyzpqHNuT9p1Sdhc59xFkhZK+o/9p6ryok3SRZL+xjn3e5LelpSruUCS1D/Uf6Wk76Vdl2pFCtgHJU0rez9V0qGU6oI69J/XfVDSfc65f0i7PknqH2rcIenTKVclLnMlXdl/jnerpMvM7O/SrVL8nHOH+p+PSnpI3qm4vDgo6WDZqM/fywvgebNQ0l7n3JG0K1KtSAH7nyR92Mym9/+CWippW8p1QkT9E7I2Svq5c25d2vVJgpl1mtm4/tfnSFog6V/TrVU8nHO3OOemOuc+KO9v70fOuc+mXK1Ymdm5/RMi1T9U/EeScnPVhnPusKTXzOwj/Zs+KSk3kz7LLFMLDodLrXF7zaZwzp0xsxslPSFpmKS/dc69nHK1YmNmWyTNlzTRzA5K+opzbmO6tYrVXEnXSvqX/nO8krTGOfePKdYpbu+XdG//DNV/J+kB51wuL3/KqcmSHuq/FWSbpO865x5Pt0qx+6Kk+/o7Pfsl3ZByfWJlZqPkXUn0H9Kui5/CXNYFAECWFWlIHACAzCJgAwCQAQRsAAAygIANAEAGELABAMgAAjYAABlAwAYAIAP+fzFY3dTllVswAAAAAElFTkSuQmCC", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "plot_NQueens(dfts)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "`breadth_first_tree_search`" + ] + }, + { + "cell_type": "code", + "execution_count": 85, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": 
[ + "88.6 ms ± 2.01 ms per loop (mean ± std. dev. of 7 runs, 10 loops each)\n" + ] + } + ], + "source": [ + "%%timeit\n", + "breadth_first_tree_search(nqp)" + ] + }, + { + "cell_type": "code", + "execution_count": 86, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "bfts = breadth_first_tree_search(nqp).solution()" + ] + }, + { + "cell_type": "code", + "execution_count": 87, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAewAAAHwCAYAAABkPlyAAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4wLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvpW3flQAAIABJREFUeJzt3X+4FNWd7/vPd9gbEMOvDRtMgGtgkifnTow4skecIXKJIWNAMHru3Bm4Ro/m5nJu7jEEwcmMPM88MXlONFcFQuLcycmRAc8ZA5pxjKgTJf4AA0adDaNMTGbuY8BERH5sgYBiInDW/aN2u7t7V1VXd1d1dVW9X8/TT3dXrVprdS82316rVq0y55wAAEB7+520KwAAAGojYAMAkAEEbAAAMoCADQBABhCwAQDIAAI2AAAZQMAGACADCNgAAGQAARtoM2b2QTP7RzM7amYHzOwuM+sIST/GzP6mP+1JM/sXM/sPrawzgOQRsIH28/9KOiTp/ZIukPS/SPq//RKa2VBJT0g6V9IfShot6c8l3W5mS1tSWwAtQcAG2s9USfc7537jnDsg6TFJHw1Ie42k/0nS/+ac2+ucO+Wce0zSUkn/2cxGSpKZOTP7UOkgM9tgZv+57P0CM3vRzI6Z2bNmdn7Zvg+Y2QNmdtjM9pb/EDCzW8zsfjP7b2Z2wsxeNrOesv1/YWav9+/7NzP7ZDxfEVA8BGyg/ayVtMjMRpjZJEnz5AVtP5+S9EPn3NtV2x+QNELSxbUKM7MLJf2tpP8oaZyk/yJps5kNM7PfkfSwpJckTZL0SUnLzOyysiyukLRJ0hhJmyXd1Z/vRyTdIOkPnHMjJV0m6dVa9QHgj4ANtJ9t8nrUxyXtk9Qr6QcBacdLeqN6o3PutKQ+Sd0Ryvs/Jf0X59zzzrkzzrl7JP1WXrD/A0ndzrmvOefedc7tkfRfJS0qO367c+4fnXNnJP13SdP7t5+RNEzS75lZp3PuVefcLyLUB4APAjbQRvp7tI9L+gdJZ8sLyGMl/T8Bh/TJO9ddnU9H/7GHIxR7rqQV/cPhx8zsmKQpkj7Qv+8DVftWSppYdvyBstcnJQ03sw7n3CuSlkm6RdIhM9tkZh+IUB8APgjYQHvpkhcs73LO/dY596ak9ZLmB6R/QtI8Mzu7avv/KumUpBf635+UN0Reck7Z69ckfd05N6bsMcI5t7F/396qfSOdc0H1qeCc+55z7uPyAr9T8A8PADUQsIE24pzrk7RX0hfMrMPMxkj6D/LOIfv57/KGzb/ffzlYZ//55W9Jut059+v+dC9K+t/NbIiZfVrezPOS/yrp/zKzmeY528wu75+w9oKk4/2Tx87qP/48M/uDWp/FzD5iZpea2TBJv5H0jrxhcgANIGAD7effS/q0vOHsVySdlnSjX0Ln3G8lzZXXE35eXlB8TNI3JX21LOmXJC2UdEzS1So7J+6c65V3HvsuSUf7y7yuf9+Z/uMukPdDok/S3fIuH6tlmKRv9B9zQNIEecP
pABpgzrm06wAgJmbWKemHkl6XdJ3jDxzIDXrYQI44507JO3/9C0kfSbk6AGJEDxsAgAyghw0AQAYE3lCgVcaPH+8++MEPpl2NxOzcuTPtKiRqxowZaVchcbRhttF+2Zf3NpTU55yruchR6kPiPT09rre3t/mMdlrzecyI/7swi6FebSztfz+tQBtmG+2XfXlvQ0k7nXM9tRJle0j84B1eoI4jWEsDeR1cFU9+AADEJJsB+9SbXmDd9+Vk8t93k5f/qYPJ5A8AQJ1SP4ddt7h601Hs7l+9MYGhcgAA6pGtHnYrg3U7lAsAQL9sBOxdw9IPmjtNOrIp3ToAAAqr/QP2TpPcu01nc8PtMdRl7+L0fzgAAAqpvc9h7xredBZWNlH+r+/3nl2zV5HtGiZd+NsmMwEAILr27mG72kGxe6507w/991nAVW1B2yOLoccPAEA92jdg1xh6th7v0XdM+uxfNR+ES/mVHuf9aXP1AwAgTu0ZsGsEw2/f57+90aDtd9zLeyIcSNAGALRI+wXs04dqJll6RwvqoYg/AE73JV4PAADaL2C/NDG2rIImlzU96azcSzXXawcAoGntNUv8jYFrr/x6t6VA63qjD3+7XunESWnUbOn4M9LIEdGrs/4rA6/D6qMDa6RzboyeMQAAdWqvHvb+v5AUHIz3lY2Wz5o+eH9Qz7kUpIOCddBx1y30nn91wH//e/V8fbl/AgAAYtJeAbuGKfMHXm9fVxlow4a5P3yV9zzu0uA01XmVvz93QX31BAAgbu0TsJuccf16yFy1V17zno8cD04Tti8SZowDABLUPgE7gvmzgvdNnh+8L4qw3veCS5rLGwCAZrVlwD65w3/7o2tbW4+Sh9f4b3/n2dbWAwBQXO0RsE9Vzuo6a5h3DvmsYQPbolyKteHhxop/aFvtNOXljxjuvR8+tCrRqcONVQAAgBraI2Dvfr/v5pM7pFPPe6+jXMZ1/VcHbzt9pvJ937HBaa5cUTvvUvnHtkpvbw9ItHtC7YwAAGhAewTsEB1Dmjt+6MWV77vnNpff6Pc1dzwAAI1o+4BdLkove9HKyvfOhaf/3NfiKRcAgCRlKmBHcd+W+tKv35xMPQAAiFMiAdvMPm1m/2Zmr5jZX9ZKv3x1HXm3uLdbT3n1fA4AAOoRe8A2syGS/lrSPEm/J2mxmf1e2DGrY17Z8wu3RUsX912/4v4cAACUJNHDvkjSK865Pc65dyVtkvSZOAtYsCx8/3ce8J637fLfv/kZ7znovtol1bPHr728dt0AAEhCEgF7kqTXyt7v69/2HjNbYma9ZtZ7+HDta5enfqDy/aNBl1VVmbPEf/tnIvaEq6/PvsfnsjEAAFohiYDtt6h2xVxt59x3nXM9zrme7u7a95P+8d2Dt81bGn5MV8hSo5I09hPh+5etCt8PAEArJRGw90maUvZ+sqT9oUdMD+9lT/JZj+SxGsuCHq1xM49jJ8L3r90Yvt/X+X0NHAQAQG1JBOx/kvRhM5tqZkMlLZIUfvFUx/iGCkpqxvhVNzV4YOe4WOsBAEBJR9wZOudOm9kNkh6XNETS3zrnXo67nCT9YGvaNQAAoFLsAVuSnHP/KOkf48xzYpd08EicOdZn5nnplQ0AQPusdDYjfA3RA3WuYFbuYx+S5l4k/e7kxvN4bkONBDXqDwBAMxLpYSfF9Qaft54/q7n7ZV92g7TlueByAQBIU3sF7Ml3SvvCZ3wd2yqNmeO9PrhFmtBVuf+6W6R7Hole5Kzp0vZ10uN3DWzbu1+adoX3OlLPfsq3ohcIAEAD2mdIXJIm1r4xden2lq7XC9abtni97tKjnmAtSTteqjx+4+PeQi2lXvXErvDjJUkTvlhfoQAA1MlcrftPJqynp8f19paNOZ86LO32ufC6StRLuhbOlq5fKM2ZIR09If1kt3Treulne2ofG2ko/Py+0Mu5zPzWkcmPtP/9tAJtmG20X/blvQ0l7XTO1Yxq7TUkLkmdtVc+C7J5tRe
gg4wdJU2bJF09r3L79helSz7fYKFcew0AaIH2C9iSN+N6Z/gvqtIEtM4O6d2qyWL1LKjieqWPXzDQm+6cKZ0+E7F3zcxwAECLtGfAliIFbWkgWDe66ln5cWdekE49HzEvgjUAoIXaa9JZtam1F/QuTRbzc8sS6ejTXm+59Di5w9vuZ8hFEYP11O9HSAQAQHzab9JZtYBednVgvXKO9OCdjddj8Upvxnm5wGHxOnrXeZ8skfa/n1agDbON9su+vLehMjvprNoMJ+0aIbl3Bu3qe1IaN7py28jZ0lsno2ffNUp68ylp463eQ5K+sUG6+S6fxFM3Sl2LomcOAEBM2j9gS9KF/RG4qrfdMUSaeoX0avjNO0MdOV7ZW//lI4N72pI4Zw0ASFV7n8OuVhY0Xa/00LbmgrWfcxd4121XDIcTrAEAKctGD7vcDCedOiLtHqdrL5euvTzBss4/1NR14QAAxCVbPeySzi4vcE9Zk0z+U9Z6+ROsAQBtIns97HITlnkPKdI12zUx9A0AaFPZ7GH7meEGHtOPDtq9wq8zfv4blccBANCmst3DDtIxZlAAXvV3KdUFAIAY5KeHDQBAjhGwAQDIAAI2AAAZQMAGACADUr/5h5nlenp22t9v0gqwKD9tmHG0X/YVoA0j3fyDHjYAwNeYkZW3J3a90vKrB287Z1zaNS0GetgJS/v7TRq/7rMv721I+9Un8LbCdai+/XGzCtCG9LABALXddM1AbzkO5b1xxIcedsLS/n6TlvfemUQbZh3tF6xrlPTmUzFWJsDEP5YOHWn8+AK0YaQedj5XOgMAhIqrNx3FwS3ec9xD5UXDkDgAFEwrg3U7lJsXBGwAKIjfPJt+0HS90p99Kt06ZBUBGwAKwPVKw4Y2n88Ntzefx6bb0v/hkEVMOktY2t9v0vI+YUmiDbOO9pPe2SENH9ZkOT7nn5sNur99Vxr+R7XTFaANuawLABAtWHfPle79of++oMlizU4ii6PHXyT0sBOW9vebtLz3ziTaMOuK3n61esFRes5hgblW2o9Ok356f/11qCgj/21IDxsAiqxWsP72ff7bG+05+x338p7ax3E+OxoCNgDkUHdX7TRL70i+HlK0HwDjRidfj6wjYANADh3aEl9eQT3gOHvGfU/Gl1desdIZAOTMn18z8DrsHLXrjT787XqlEyelUbOl489II0dEr8/6r0Srz7LF0jc3Rs+3aOhhA0DO3P4l7zkoGO87NPB61vTB+4N6zqUgHRSsg467bqH3/KsD/vtL9Vyzwn8/PARsACiYKfMHXm9fVxlow4a5P3yV9zzu0uA01XmVvz93QX31RCUCNgDkSLPnlV8/FLzvlde85yPHg9OE7YuCGePBCNgAUDDzZwXvmzw/eF8UYb3vBZc0l3fREbABIKdO7vDf/uja1taj5OE1/tvfeba19cgqAjYA5MTEcZXvzxrmDTGfVbY0aZQh5w0PN1b+Q9tqpykvf8Rw7/3wqiVKx49prPy8Y2nShKX9/SYt78taSrRh1hWp/cKC8ekzUufM4HTVM8qr05QfL0mHnxgcWGvlUZ7m2FZp9PuC61ueVwHakKVJAQCejiHNHT/04sr33XObyy8sWMMfARsACibKYimLVla+r9XJ/dzX4ikXwWIP2Gb2t2Z2yMx+GnfeAIDWuK/OpU3Xb06mHhiQRA97g6RPJ5AvACDE8tXR07a6t1tPefV8jiKJPWA7556RdCTufAEA4VYvjze/L9wWLV3cd/2K+3PkBeewAaCgFiwL3/+dB7znbbv8929+xnsOuq92yZVVa4Rfe3ntumGwVAK2mS0xs14zYxE6AGiRqR+ofP/o9mjHzVniv/0zEXvC1ddn3/PVaMehUioB2zn3XedcT5TrzgAA8fjx3YO3zVsafkxXyFKjkjT2E+H7l60K34/oGBIHgJwY/8nw/ZMmDN72WI1lQY/WuJnHsRPh+9c2cH/rsPXIiyyJy7o2SvqJpI+Y2T4z+z/iLgMAMNi
bv27suKRmjF91U2PHNXvHr7zqiDtD59ziuPMEAGTPD7amXYN8YUgcAApkYle65c88L93ys4ybfyQs7e83aXm/cYREG2ZdEduv1h25Gh0C/9iHvIC/d7/0i32N5dFI3QrQhpFu/hH7kDgAoL253uCgPX9Wc/fLvuwGactzweWicQRsAMiZFWukVTeGpzm2VRozx3t9cIs0oWqo/LpbpHseiV7mrOnS9nXS43cNbNu7X5p2hff6QIS1yb8Y84ppecOQeMLS/n6TlvfhVIk2zLqitl+U3qz1DKTbtEVavDI8fT2+93Vp8WWDy6lVHz8FaMNIQ+IE7ISl/f0mLe//2Uu0YdYVtf3Gj5EOPxHh+IjnsxfOlq5fKM2ZIR09If1kt3Treulne2ofGyVYj7s0+HKuArQh57ABoKj6jjV+7ObVXoAOMnaUNG2SdPW8yu3bX5Qu+XxjZXLtdW30sBOW9vebtLz3ziTaMOuK3n5Rh6I7O6R3nxu8ParqcjpnSqfPNDcU/l7e+W9DetgAUHRRzx+XgnWjl3yVH3fmBenU89HyavV9ubOMhVMAIOcW3Vw7jfUEB89blkhHn/YCf+lxcoe33c+Qi6IF4j/5cu00GMCQeMLS/n6TlvfhVIk2zDrazxPUy64OrFfOkR68s/H6LF7pzThvpOwgBWhDZom3g7S/36Tl/T97iTbMOtpvwNvbpRHDq47vkfqelMaNrtw+crb01sno9egaJb35VOW2b2yQbr5rcMBedLN034+i512ANuQcNgBgwNkf956rA2jHEGnqFdKr+xvP+8jxyh7zLx8Z3NOWOGfdDM5hA0DBlAdN1ys9tK25YO3n3AXeddvlPw4I1s1hSDxhaX+/Scv7cKpEG2Yd7Rds7EjpyNMxViZA99zmrgsvQBtGGhKnhw0ABXX0hNfrXbYqmfyX3tF/jryJYI0B9LATlvb3m7S8984k2jDraL/6xHFHrbiHvgvQhvSwAQD1KV2PbT0Dd/Mqt2LN4G3nXFZ5HJJBDzthaX+/Sct770yiDbOO9su+ArQhPWwAAPKCgA0AQAYQsAEAyIDUVzqbMWOGentjmJbYpvJ+finv55Yk2jDraL/sy3sbRkUPGwCADEi9hw0gR3bG0BOakf8eI9AIetgAmnPwDi9QxxGspYG8Dia0/BaQUQRsAI059aYXWPd9OZn8993k5X/qYDL5AxnDkDiA+sXVm45i9zneM0PlKDh62ADq08pg3Q7lAm2CgA0gml3D0g+aO006sindOgApIWADqG2nSe7dprO54fYY6rJ3cfo/HIAUcA4bQLhdw5vOovwOTn99v/fc9G0cdw2TLvxtk5kA2UEPG0A4Vzsods+V7v2h/76g2y02fRvGGHr8QJYQsAEEqzH0XLr/cd8x6bN/1XwQLr+nsvVI5/1pc/UD8oSADcBfjWD47fv8tzcatP2Oe3lPhAMJ2igIAjaAwU4fqplk6R0tqIci/gA43Zd4PYC0EbABDPbSxNiyCppc1vSks3IvdceYGdCemCUOoNIbA9de+fVuS4HW9UYf/na90omT0qjZ0vFnpJEjoldn/VcGXofVRwfWSOfcGD1jIGPoYQOotP8vJAUH431lo+Wzpg/eH9RzLgXpoGAddNx1C73nXx3w3/9ePV9f7p8AyAkCNoC6TJk/8Hr7uspAGzbM/eGrvOdxlwanqc6r/P25C+qrJ5A3BGwAA5qccf16yFy1V17zno8cD04Tti8SZowjxwjYAOoyf1bwvsnzg/dFEdb7XnBJc3kDWUfABuDr5A7/7Y+ubW09Sh5e47/9nWdbWw8gLQRsAJ5TlbO6zhrmnUM+a9jAtiiXYm14uLHiH9pWO015+SOGe++HD61KdOpwYxUA2hwBG4Bn9/t9N5/cIZ163nsd5TKu6786eNvpM5Xv+44NTnPlitp5l8o/tlV6e3tAot0TamcEZBABG0BNHUOaO37oxZXvu+c2l9/o9zV3PJBFBGwAdYnSy160svK9c+HpP/e
1eMoF8oyADSB2922pL/36zcnUA8iT2AO2mU0xs6fN7Odm9rKZfSnuMgDEb/nq6Glb3dutp7x6PgeQJUn0sE9LWuGc+58lXSzpP5nZ7yVQDoAYrY55Zc8v3BYtXdx3/Yr7cwDtIvaA7Zx7wzm3q//1CUk/lzQp7nIApGvBsvD933nAe962y3//5me856D7apdUzx6/9vLadQPyKNFz2Gb2QUm/L+n5qu1LzKzXzHoPH+aaSSALpn6g8v2jQZdVVZmzxH/7ZyL2hKuvz77H57IxoAgSC9hm9j5JD0ha5pyrWCHYOfdd51yPc66nu5v72AJZ8OO7B2+btzT8mK6QpUYlaewnwvcvWxW+HyiSRAK2mXXKC9b3Ouf+IYkyAMRsevho1ySf9Ugeq7Es6NEaN/M4diJ8/9qN4ft9nd/XwEFA+0tilrhJWifp58455msCWdExvqHDkpoxftVNDR7YOS7WegDtIoke9ixJ10i61Mxe7H80eQ8fAEXzg61p1wBoLx1xZ+ic2y6Jm9ICOTSxSzp4JL3yZ56XXtlA2ljpDMCAGeFriB6ocwWzch/7kDT3Iul3Jzeex3MbaiSoUX8gy2LvYQPIN9cbfN56/qzm7pd92Q3SlueCywWKjIANoNLkO6V94TO+jm2VxszxXh/cIk3oqtx/3S3SPY9EL3LWdGn7Ounxuwa27d0vTbvCex2pZz/lW9ELBDKIIXEAlSbWvjF16faWrtcL1pu2eL3u0qOeYC1JO16qPH7j495CLaVe9cSu8OMlSRO+WF+hQMaYq3Xfu4T19PS43t78jnV5V7nlV9r/flqhkG146rC02+fC6ypRL+laOFu6fqE0Z4Z09IT0k93Sreuln+2JUL8o/z2c3xd4OVch2y9n8t6GknY652r+NTEkDmCwzsZXINy82gvQQcaOkqZNkq6eV7l9+4vSJZ9vsFCuvUYBELAB+JvhpJ3hPZvSBLTODundqsli9Syo4nqlj18w0JvunCmdPhOxd83McBQEARtAsAhBWxoI1o2uelZ+3JkXpFPPR8yLYI0CYdIZgHBTay/oXZos5ueWJdLRp73eculxcoe33c+QiyIG66nfj5AIyA8mnSUs75Ml0v730wq0oQJ72dWB9co50oN3Nl6XxSu9GeflAofFI/auab/sy3sbiklnAGIzw0m7RkjunUG7+p6Uxo2u3DZytvTWyejZd42S3nxK2nir95Ckb2yQbr7LJ/HUjVLXouiZAzlBwAYQzYX9Ebiqt90xRJp6hfTq/sazPnK8srf+y0cG97Qlcc4ahcY5bAD1KQuarld6aFtzwdrPuQu867YrhsMJ1ig4etgA6jfDSaeOSLvH6drLpWsvT7Cs8w81dV04kBf0sAE0prPLC9xT1iST/5S1Xv4Ea0ASPWwAzZqwzHtIka7Zromhb8AXPWwA8ZnhBh7Tjw7avcKvM37+G5XHAfBFDxtAMjrGDArAq/4upboAOUAPGwCADCBgAwCQAQRsAAAygIANAEAGpH7zDzPL9bTQtL/fpBVgUX7aMONov+wrQBty8w8AAAKdOSq92FWxacUaadWNVenO3y91vr919QpADzthaX+/SePXffblvQ1pv+yLtQ3bcHGfqD1szmEDAPLt4B1eoI4jWEsDeR1cFU9+EdHDTlja32/S+HWffXlvQ9ov+xpuw1NvSrvHx1sZP+cfkDonNnw457ABAMUVV286it3neM8JL63LkDgAIF9aGaxbWC4BGwCQD7uGpResS3aadGRTIlkTsAEA2bfTJPdu09nccHsMddm7OJEfDkw6S1ja32/SmPCSfXlvQ9ov+2q24a7hkvttU2WYz5Qv19tUlpINlS6sXS8u6wIAFEOEYN09V7r3h/77/IJ12PbIYujxl6OHnbC0v9+k8es++/LehrRf9oW2YY2h5yg957DAXCvtR6dJP70/tAo1Z4/TwwYA5FuNYP3t+/y3N9pz9jvu5T0RDozpfDYBGwCQPacP1Uyy9I4W1EMRfwC
c7mu6HAI2ACB7Xmp8ZbFqQZPLmp50Vu6l7qazYKUzAEC2vDFw7VXYOWrXG3342/VKJ05Ko2ZLx5+RRo6IXp31Xxl4HXrO/MAa6ZzqW4FFRw8bAJAt+/9CUnAw3lc2Wj5r+uD9QT3nUpAOCtZBx1230Hv+1QH//e/V8/Xl/gkiImADAHJlyvyB19vXVQbasGHuD1/lPY+7NDhNdV7l789dUF8960XABgBkR5Mzrl8Pmav2ymve85HjwWnC9kXSRP0J2ACAXJk/K3jf5PnB+6II630vuKS5vGshYAMAMunkDv/tj65tbT1KHl7jv/2dZ+PJn4ANAMiGU5Wzus4a5p1DPmvYwLYol2JteLix4h/aVjtNefkjhnvvhw+tSnTqcEPlszRpwtL+fpNW+GURcyDvbUj7Zd97bRhy/vf0GalzZn96n6BdPaO8Ok358ZJ0+Alp/Jj68ihPc2yrNPp9gdWtWK6UpUkBAIXRMaS544deXPm+e25z+YUG6wYRsAEAuRJlsZRFKyvf1xqI+dzX4im3GbEHbDMbbmYvmNlLZvaymX017jIAAGjGfVvqS79+czL1qEcSPezfSrrUOTdd0gWSPm1mF9c4BgCAUMtXR0+bdG+3mfLq+RzlYg/YzvNW/9vO/ke+Z30AABK3urmVPQf5wm3R0sV9169GP0ci57DNbIiZvSjpkKQfOeeer9q/xMx6zSzOe6EAAPCeBcvC93/nAe952y7//Zuf8Z6D7qtdcuWKyvfXXl67bo1I9LIuMxsj6UFJX3TO/TQgTa5731xSkn20YbbRftkX5bIuSZp2hbR3f9Wx/d3CoCHrWnf0CtsflHek23K222VdzrljkrZK+nSS5QAA8OO7B2+btzT8mK6QpUYlaewnwvcvWxW+P05JzBLv7u9Zy8zOkjRX0r/GXQ4AoGCmh68QNmnC4G2P1VgW9GiNm3kcOxG+f+3G8P2+zu9r4CCpo6Gjwr1f0j1mNkTeD4L7nXOPJFAOAKBIOsY3dFhSM8avuqnBAzvHNXRY7AHbObdb0u/HnS8AAO3kB1tbWx4rnQEAcmNiV7rlzzwvuby5+UfC0v5+k1aoGao5lfc2pP2yb1Ab1pgt3ugQ+Mc+5AX8vfulX+xrLI+aM8RnDP73GHWWeBLnsAEASE3YpVjzZzV3v+zLbpC2PBdcbpII2ACAbJl8p7QvfMbXsa3SmDne64NbpAlVQ+XX3SLdU8d06FnTpe3rpMfvGti2d7937bckHYiyNvmUb0Uv0AdD4glL+/tNWiGH43Im721I+2WfbxvWGBaXvF52qde7aYu0eGV4+np87+vS4ssGlxPKZzhcij4kTsBOWNrfb9IK+59FjuS9DWm/7PNtw1OHpd0+F15XiXo+e+Fs6fqF0pwZ0tET0k92S7eul362J0L9ogTr8/sCL+fiHDYAIL86uxs+dPNqL0AHGTtKmjZJunpe5fbtL0qXfL7BQhu89rocPeyEpf39Jq2wv+5zJO9tSPtlX2gbRhwa7+yQ3n1u8PbIdajqRXfOlE6faW4o/L160MMGAOTeDBcpaJeCdaOXfJUfd+YF6dTzEfOqEazrwcIpAIBsm1pjBNUvAAAgAElEQVR7QW/rCQ6wtyyRjj7t9ZZLj5M7vO1+hlwUMVhP/X6ERNExJJ6wtL/fpBV+OC4H8t6GtF/2RWrDgF52dWC9co704J2N12XxSm/GebnAYfGIvWtmibeJtL/fpPGfRfblvQ1pv+yL3Ia7RkjunYpN1iP1PSmNG12ZdORs6a2T0evQNUp686nKbd/YIN18l0/AnrpR6loUOW/OYQMAiuXC/ghc1dvuGCJNvUJ6dX/jWR85Xtlb/+Ujg3vakmI9Z12Nc9gAgHwpC5quV3poW3PB2s+5C7zrtit61wkGa4kh8cSl/f0mjeG47Mt7G9J+2ddwG546Iu1u/vrnms4/1NR14VGHxOlhAwDyqbPL6/VOWZNM/lPWevk3EazrQQ87YWl/v0nj13325b0Nab/
si7UNI1yzXVPMQ9/0sAEAqDbDDTymHx20e4VfZ/z8NyqPSwk97ISl/f0mjV/32Zf3NqT9sq8AbUgPGwCAvCBgAwCQAQRsAAAyIPWVzmbMmKHe3ij3J8umvJ9fyvu5JYk2zDraL/vy3oZR0cMGACADUu9hI7pIN0qvodF7wQIA0kUPu83ddM3A/VnjUMpr+dXx5AcAaA0CdpvqGuUF1ju+lEz+q2708p/QlUz+AIB4MSTehuLqTUdxsP/2cAyVA0B7o4fdZloZrNuhXABANATsNvGbZ9MPmq5X+rNPpVsHAIA/AnYbcL3SsKHN53PD7c3nsem29H84AAAG4xx2yt7Z0Xwe5eef//p+77nZoPubZ6Xhf9RcHgCA+NDDTtnwYbXTdM+V7v2h/76gyWLNTiKLo8cPAIgPATtFtXrB1uM9+o5Jn/2r5oNwKb/S47w/ba5+AIDWIWCnpFYw/PZ9/tsbDdp+x728p/ZxBG0AaA8E7BR0R1isZOkdyddDivYDYNzo5OsBAAhHwE7BoS3x5RXUA46zZ9z3ZHx5AQAawyzxFvvzawZe+/VuS4HW9UYf/na90omT0qjZ0vFnpJEjotdn/Vei1WfZYumbG6PnCwCIFz3sFru9f23woGC879DA61nTB+8P6jmXgnRQsA467rqF3vOvDvjvL9VzzQr//QCA1iBgt5kp8wdeb19XGWjDhrk/fJX3PO7S4DTVeZW/P3dBffUEALQWAbuFmj2v/Pqh4H2vvOY9HzkenCZsXxTMGAeA9BCw28z8WcH7Js8P3hdFWO97wSXN5Q0ASBYBOyUnA5YkfXRta+tR8vAa/+3vPNvaegAA/BGwW2TiuMr3Zw3zhpjPKluaNMqQ84aHGyv/oW2105SXP2K493541RKl48c0Vj4AoDkE7BY58Lj/9pM7pFPPe6+jXMZ1/VcHbzt9pvJ937HBaa6MMMu7VP6xrdLb2/3THH6idj4AgPgRsNtAx5Dmjh96ceX77rnN5Tf6fc0dDwCIHwG7zUTpZS9aWfneufD0n/taPOUCANKTSMA2syFm9s9m9kgS+RfdfXUubbp+czL1AAC0TlI97C9J+nlCeWfS8tXR07a6t1tPefV8DgBAfGIP2GY2WdLlku6OO+8sW7083vy+cFu0dHHf9SvuzwEAiCaJHvY3JX1Z0v8ISmBmS8ys18x6Dx8+nEAVsm/BsvD933nAe962y3//5me856D7apdUzx6/9vLadQMAtF6sAdvMFkg65JzbGZbOOfdd51yPc66nu7s7zipk1tQPVL5/NOCyqmpzlvhv/0zEnnD19dn3+Fw2BgBIX9w97FmSrjCzVyVtknSpmf1dzGXk0o99TiDMWxp+TFfIUqOSNPYT4fuXrQrfDwBoH7EGbOfczc65yc65D0paJOkp59xn4ywjq8Z/Mnz/pAmDtz1WY1nQozVu5nHsRPj+tQ3c3zpsPXIAQHK4DrtF3vx1Y8clNWP8qpsaO67ZO34BABrTkVTGzrmtkrYmlT+a84OtadcAAFAPethtZGJXuuXPPC/d8gEAwQjYLVRrePtAnSuYlfvYh6S5F0m/O7nxPJ7bEL6f5UsBID2JDYmjMa43ODDOn9Xc/bIvu0Ha8lxwuQCA9kXAbrEVa6RVN4anObZVGjPHe31wizShaqj8uluke+pYpX3WdGn7Ounxuwa27d0vTbvCex2lZ//FmFdMAwDUx1ytWz0lrKenx/X25rd7Z2aDtkXpzVrPQLpNW6TFK8PT1+N7X5cWXza4nFr18ZP2v59W8GvDPMl7G9J+2Zf3NpS00zlX86QjATthfv/Qxo+RDj8R4diI54wXzpauXyjNmSEdPSH9ZLd063rpZ3tqHxslWI+7NPhyrrT//bRC3v+zyHsb0n7Zl/c2VMSAzZB4CvqONX7s5tVegA4ydpQ0bZJ09bzK7dtflC75fGNlcu01AKSPgJ2SKEPRpQlonR3Su1WTxeqZse16pY9fMFBe50zp9JnmhsIBAK1FwE5
R1PPHpWDdaPAsP+7MC9Kp56PlRbAGgPbBddgpW3Rz7TTWExw8b1kiHX3aC/ylx8kd3nY/Qy6KFoj/5Mu10wAAWodJZwmLMlkiqJddHVivnCM9eGfjdVm80ptx3kjZQdL+99MKeZ/wkvc2pP2yL+9tKCadZYf1SG9vl0YMH7yv70lp3OjKbSNnS2+djJ5/1yjpzaekjbd6D0n6xgbp5rsGp110s3Tfj6LnDQBoDQJ2mzj7495zdY+3Y4g09Qrp1f2N533keGWP+ZePDO5pS5yzBoB2xjnsNlMeNF2v9NC25oK1n3MXeNdtl/84IFgDQHujh92GrEcaO1I68rR07eXeIyndc5u7LhwA0Br0sNvU0RNe4F62Kpn8l97h5U+wBoBsoIfd5tZu9B5SPHfUYugbALKJHnaGlK7Htp6Bu3mVW7Fm8LZzLqs8DgCQTfSwM+rXb/kH4NX3tr4uAIDk0cMGACADCNgAAGQAARsAgAxIfS1xM8v1Qrhpf79JK8Aav7RhxtF+2VeANoy0ljg9bAAAMoBZ4kCr7IyhJzQj3z0NAMHoYQNJOniHF6jjCNbSQF4HE1oCD0Db4hx2wtL+fpPG+bMAp96Udo+PvzLVzj8gdU5sKou8tyF/g9lXgDbkfthAKuLqTUex+xzvmaFyIPcYEgfi1Mpg3Q7lAmgZAjYQh13D0g+aO006sindOgBIDAEbaNZOk9y7TWdzw+0x1GXv4vR/OABIBJPOEpb295u0wk942TVccr9tKn+/m7g0fStVGypdGK1eeW9D/gazrwBtyMIpQOIiBOvuudK9P/TfF3TL06ZvhRpDjx9Ae6GHnbC0v9+kFfrXfY2h5yg957DAXCvtR6dJP70/tAqRZo/nvQ35G8y+ArQhPWwgMTWC9bfv89/eaM/Z77iX90Q4kPPZQG4QsIF6nT5UM8nSO1pQD0X8AXC6L/F6AEgeARuo10vNrSxWLmhyWdOTzsq91B1jZgDSwkpnQD3eGLj2KuwcteuNPvzteqUTJ6VRs6Xjz0gjR0SvzvqvDLwOPWd+YI10zo3RMwbQduhhA/XY/xeSgoPxvrLR8lnTB+8P6jmXgnRQsA467rqF3vOvDvjvf6+ery/3TwAgMwjYQIymzB94vX1dZaANG+b+8FXe87hLg9NU51X+/twF9dUTQPYQsIGompxx/XrIXLVXXvOejxwPThO2LxJmjAOZRsAGYjR/VvC+yfOD90UR1vtecElzeQNofwRsoAEnd/hvf3Rta+tR8vAa/+3vPNvaegBIDgEbiOJU5ayus4Z555DPGjawLcqlWBsebqz4h7bVTlNe/ojh3vvhQ6sSnTrcWAUApI6lSROW9vebtMIsixhy/vf0GalzZn9an6BdPaO8Ok358ZJ0+Alp/Jj68ihPc2yrNPp9gdUdtFxp3tuQv8HsK0AbsjQp0AodQ5o7fujFle+75zaXX2iwBpBZBGwgRlEWS1m0svJ9rc7D574WT7kAsi2RgG1mr5rZv5jZi2YW5yKLQObdt6W+9Os3J1MPANmSZA/7E865C6KMywPtbvnq6Glb3dutp7x6PgeA9sKQOBDB6phX9vzCbdHSxX3Xr7g/B4DWSSpgO0lbzGynmS2p3mlmS8ysl+Fy5NWCZeH7v/OA97xtl//+zc94z0H31S65ckXl+2svr103ANmUyGVdZvYB59x+M5sg6UeSvuiceyYgba7n6xfgcoS0q5C4Wpd1SdK0K6S9+6uO6/85GjRkXeuOXmH7g/KOdFtOLuvKlby3n1SINkzvsi7n3P7+50OSHpR0URLlAO3ix3cP3jZvafgxXSFLjUrS2E+E71+2Knw/gHyJPWCb2dlmNrL0WtIfS/pp3OUALTU9fIWwSRMGb3usxrKgR2vczOPYifD9azeG7/d1fl8DBwFoBx0J5DlR0oP9wzQdkr7nnHssgXKA1ukY39BhSc0Yv+qmBg/sHBdrPQC0TuwB2zm3R9L0uPMFMOAHW9OuAYBW47IuICYTu9I
tf+Z56ZYPIFnc/CNhaX+/SSvcDNUas8UbHQL/2Ie8gL93v/SLfY3lUXOG+Az/f4t5b0P+BrOvAG0YaZZ4EuewgcIKuxRr/qzm7pd92Q3SlueCywWQbwRsoB6T75T2hc/4OrZVGjPHe31wizShaqj8ulukex6JXuSs6dL2ddLjdw1s27vfu/Zbkg5EWZt8yreiFwigLTEknrC0v9+kFXI4rsawuOT1sku93k1bpMUrw9PX43tflxZfNricUAHD4VL+25C/wewrQBtGGhInYCcs7e83aYX8z+LUYWm3z4XXVaKez144W7p+oTRnhnT0hPST3dKt66Wf7YlQtyjB+vy+0Mu58t6G/A1mXwHakHPYQCI6uxs+dPNqL0AHGTtKmjZJunpe5fbtL0qXfL7BQrn2GsgFetgJS/v7TVqhf91HHBrv7JDefW7w9sjlV/WiO2dKp880PxT+Xl1y3ob8DWZfAdqQHjaQqBm1bwoiDQTrRi/5Kj/uzAvSqecj5hUhWAPIDhZOAZoxtfaC3tYTHGBvWSIdfdrrLZceJ3d42/0MuShisJ76/QiJAGQJQ+IJS/v7TRrDcQrsZVcH1ivnSA/e2Xg9Fq/0ZpxX1C1oWLyO3nXe25C/wewrQBsyS7wdpP39Jo3/LPrtGiG5dyo2WY/U96Q0bnRl0pGzpbdORi+/a5T05lOV276xQbr5Lp+APXWj1LUoeubKfxvyN5h9BWhDzmEDLXNhfwSu6m13DJGmXiG9ur/xrI8cr+yt//KRwT1tSZyzBnKOc9hAnMqCpuuVHtrWXLD2c+4C77rtit41wRrIPYbEE5b295s0huMCnDoi7W7B9c/nH2rqunAp/23I32D2FaANIw2J08MGktDZ5fV6p6xJJv8pa738mwzWALKDHnbC0v5+k8av+zpEuGa7pgSGvvPehvwNZl8B2pAeNtBWZriBx/Sjg3av8OuMn/9G5XEACosedsLS/n6Txq/77Mt7G9J+2VeANqSHDQBAXhCwAQDIAAI2AAAZkPpKZzNmzFBvb5T7BGZT3s8v5f3ckkQbZh3tl315b8Oo6GEDAJABBGwAADIg9SFxAMiKwNuZ1iHS/cwBH/SwASDETdd4gTqOYC0N5LX86njyQ3EQsAHAR9coL7De8aVk8l91o5f/hK5k8kf+MCQOAFXi6k1HcbD/3uYMlaMWetgAUKaVwbodykV2ELABQNJvnk0/aLpe6c8+lW4d0L4I2AAKz/VKw4Y2n88Ntzefx6bb0v/hgPbEOWwAhfbOjubzKD///Nf3e8/NBt3fPCsN/6Pm8kC+0MMGUGjDh9VO0z1XuveH/vuCJos1O4ksjh4/8oWADaCwavWCrcd79B2TPvtXzQfhUn6lx3l/2lz9UCwEbACFVCsYfvs+/+2NBm2/417eU/s4gjZKCNgACqc7wmIlS+9Ivh5StB8A40YnXw+0PwI2gMI5tCW+vIJ6wHH2jPuejC8vZBezxAEUyp9fM/Dar3dbCrSuN/rwt+uVTpyURs2Wjj8jjRwRvT7rvxKtPssWS9/cGD1f5A89bACFcnv/2uBBwXjfoYHXs6YP3h/Ucy4F6aBgHXTcdQu9518d8N9fqueaFf77URwEbAAoM2X+wOvt6yoDbdgw94ev8p7HXRqcpjqv8vfnLqivnigeAjaAwmj2vPLrh4L3vfKa93zkeHCasH1RMGO82AjYAFBm/qzgfZPnB++LIqz3veCS5vJG/hGwARTSyYAlSR9d29p6lDy8xn/7O8+2th5oXwRsAIUwcVzl+7OGeUPMZ5UtTRplyHnDw42V/9C22mnKyx8x3Hs/vGqJ0vFjGisf2UfABlAIBx73335yh3Tqee91lMu4rv/q4G2nz1S+7zs2OM2VEWZ5l8o/tlV6e7t/msNP1M4H+UTABlB4HUOaO37oxZXvu+c2l9/o9zV3PPIpkYBtZmPM7O/N7F/N7Odm9odJlAMAcYvSy160svK9c+HpP/e1eMpFsSXVw14r6THn3L+
TNF3SzxMqBwBa7r46lzZdvzmZeqBYYg/YZjZK0mxJ6yTJOfeuc87njA4AtM7y1dHTtrq3W0959XwO5EsSPexpkg5LWm9m/2xmd5vZ2QmUAwCRrV4eb35fuC1aurjv+hX350B2JBGwOyRdKOlvnHO/L+ltSX9ZnsDMlphZr5n1Hj58OIEqAEBzFiwL3/+dB7znbbv8929+xnsOuq92SfXs8Wsvr103FFMSAXufpH3Ouf4LJfT38gL4e5xz33XO9Tjnerq7uxOoAgDUZ+oHKt8/GnBZVbU5S/y3fyZiT7j6+ux7fC4bA6QEArZz7oCk18zsI/2bPinpZ3GXAwBx+vHdg7fNWxp+TFfIUqOSNPYT4fuXrQrfD5RLapb4FyXda2a7JV0g6daEygGASMZ/Mnz/pAmDtz1WY1nQozVu5nHsRPj+tQ3c3zpsPXLkW0cSmTrnXpTEVYUA2sabv27suKRmjF91U2PHNXvHL2QXK50BQAp+sDXtGiBrCNgA0G9iV7rlzzwv3fLR3gjYAAqj1vD2gTpXMCv3sQ9Jcy+Sfndy43k8tyF8P8uXFlsi57ABIKtcb3BgnD+ruftlX3aDtOW54HKBMARsAIWyYo206sbwNMe2SmPmeK8PbpEmVA2VX3eLdM8j0cucNV3avk56/K6BbXv3S9Ou8F5H6dl/MeYV05A95mrdZiZhPT09rrc3vz8tzSztKiQq7X8/rUAbZptf+0XpzVrPQLpNW6TFK8PT1+N7X5cWXza4nFr18ZP39pPy/zcoaadzruYJDwJ2wvL+Dy3tfz+tQBtmm1/7jR8jHX4iwrERzxkvnC1dv1CaM0M6ekL6yW7p1vXSz/bUPjZKsB53afDlXHlvPyn/f4OKGLAZEgdQOH1N3D9w82ovQAcZO0qaNkm6el7l9u0vSpd8vrEyufYaEgEbQEFFGYouTUDr7JDerZosVs+MbdcrffyCgfI6Z0qnzzQ3FI7iIWADKKyo549LwbrR4Fl+3JkXpFPPR8uLYI1yXIcNoNAW3Vw7jfUEB89blkhHn/YCf+lxcoe33c+Qi6IF4j/5cu00KBYmnSUs75Ml0v730wq0YbZFab+gXnZ1YL1yjvTgnY3XZfFKb8Z5I2UHyXv7Sfn/GxSTzgAgGuuR3t4ujRg+eF/fk9K40ZXbRs6W3joZPf+uUdKbT0kbb/UekvSNDdLNdw1Ou+hm6b4fRc8bxUHABgBJZ3/ce67u8XYMkaZeIb26v/G8jxyv7DH/8pHBPW2Jc9YIxzlsAChTHjRdr/TQtuaCtZ9zF3jXbZf/OCBYoxZ62ABQxXqksSOlI09L117uPZLSPbe568JRHPSwAcDH0RNe4F62Kpn8l97h5U+wRlT0sAEgxNqN3kOK545aDH2jUfSwASCi0vXY1jNwN69yK9YM3nbOZZXHAY2ihw0ADfj1W/4BePW9ra8LioEeNgAAGUDABgAgAwjYAABkQOpriZtZrhfCTfv7TVoB1vilDTOO9su+ArRhpLXE6WEDAJABzBJH2+AaVwAIRg8bqbrpmoF7CMehlNfyq+PJDwDaBeewE5b295u0Rs+flW43mLSJfywdOtJcHrRhttF+2VeANuR+2GhPcfWmozjYfwtDhsoBZB1D4mipVgbrdigXAOJCwEZL/ObZ9IOm65X+7FPp1gEAGkXARuJcrzRsaPP53HB783lsui39Hw4A0AgmnSUs7e83abUmvLyzQxo+rMkyfM4/Nxt0f/uuNPyPoqUtehtmHe2XfQVoQxZOQfqiBOvuudK9P/TfFzRZrNlJZHH0+AGglehhJyzt7zdpYb/ua/WCo/ScwwJzrbQfnSb99P766zConAK3YR7QftlXgDakh4301ArW377Pf3ujPWe/417eU/s4zmcDyAoCNmLX3VU7zdI7kq+HFO0HwLjRydcDAJpFwEbsDm2JL6+gHnCcPeO+J+PLCwCSwkpniNWfXzPwOuwcteuNPvzteqUTJ6VRs6Xjz0gjR0Svz/qvRKv
PssXSNzdGzxcAWo0eNmJ1+5e856BgvO/QwOtZ0wfvD+o5l4J0ULAOOu66hd7zrw747y/Vc80K//0A0C4I2GipKfMHXm9fVxlow4a5P3yV9zzu0uA01XmVvz93QX31BIB2Q8BGbJo9r/z6oeB9r7zmPR85HpwmbF8UzBgH0M4I2Gip+bOC902eH7wvirDe94JLmssbANJGwEYiTu7w3/7o2tbWo+ThNf7b33m2tfUAgEYRsBGLieMq3581zBtiPqtsadIoQ84bHm6s/Ie21U5TXv6I4d774VVLlI4f01j5AJA0liZNWNrfb9JKyyKGBePTZ6TOmQpMVz2jvDpN+fGSdPiJwYG1Vh7laY5tlUa/L7i+g/IqSBvmFe2XfQVoQ5YmRXvoGNLc8UMvrnzfPbe5/MKCNQC0KwI2WirKYimLVla+r/Xj+nNfi6dcAGhnsQdsM/uImb1Y9jhuZsviLgf5dV+dS5uu35xMPQCgncQesJ1z/+acu8A5d4GkGZJOSnow7nLQXpavjp621b3desqr53MAQCslPST+SUm/cM79MuFykLLVy+PN7wu3RUsX912/4v4cABCXpAP2IkmDbqlgZkvMrNfMWFuqoBbUOEnynQe85227/PdvfsZ7DrqvdsmVVWuEX3t57boBQDtK7LIuMxsqab+kjzrnDoaky/V8/QJcjiCp9jXW066Q9u6v3FY6JmjIutYdvcL2B+Ud5VpwLuvKF9ov+wrQhqlf1jVP0q6wYI3i+PHdg7fNWxp+TFfIUqOSNPYT4fuXrQrfDwBZkmTAXiyf4XDk0/hPhu+fNGHwtsdqLAt6tMbNPI6dCN+/toF/fWHrkQNAmhIJ2GY2QtKnJP1DEvmj/bz568aOS2rG+FU3NXZcs3f8AoCkdCSRqXPupKRxNRMCCfnB1rRrAADxYqUztMzErnTLn3leuuUDQDO4+UfC0v5+k1Y9Q7XWLOxGh8A/9iEv4O/dL/1iX2N5NFq3orVh3tB+2VeANow0SzyRIXEgSNilWPNnNXe/7MtukLY8F1wuAGQZARuxWrFGWnVjeJpjW6Uxc7zXB7dIE6qGyq+7RbrnkehlzpoubV8nPX7XwLa9+71rvyXpQIS1yb8Y84ppABA3hsQTlvb3mzS/4bioi5OU0m3aIi1eGZ6+Ht/7urT4ssHl1KpPkCK2YZ7QftlXgDaMNCROwE5Y2t9v0vz+sxg/Rjr8RIRjI57PXjhbun6hNGeGdPSE9JPd0q3rpZ/tqX1slGA97tLwy7mK2IZ5QvtlXwHakHPYSEffscaP3bzaC9BBxo6Spk2Srp5XuX37i9Iln2+sTK69BpAF9LATlvb3m7SwX/dRh6I7O6R3nxu8ParqcjpnSqfPND8U/l7+BW7DPKD9sq8AbUgPG+mKev64FKwbveSr/LgzL0inno+WV6vvyw0AzWDhFCRq0c2101hPcPC8ZYl09Gkv8JceJ3d42/0MuShaIP6TL9dOAwDthCHxhKX9/SYtynBcUC+7OrBeOUd68M7G67J4pTfjvJGyw9CG2Ub7ZV8B2pBZ4u0g7e83aVH/s3h7uzRieNWxPVLfk9K40ZXbR86W3joZvQ5do6Q3n6rc9o0N0s13DQ7Yi26W7vtR9Lwl2jDraL/sK0Abcg4b7ePsj3vP1QG0Y4g09Qrp1f2N533keGWP+ZePDO5pS5yzBpBtnMNGS5UHTdcrPbStuWDt59wF3nXb5T8OCNYAso4h8YSl/f0mrdHhuLEjpSNPx1wZH91zm7suXKINs472y74CtGGkIXF62EjF0RNer3fZqmTyX3pH/znyJoM1ALQLetgJS/v7TVqcv+7juKNWEkPftGG20X7ZV4A2pIeNbCldj209A3fzKrdizeBt51xWeRwA5BU97ISl/f0mjV/32Zf3NqT9sq8AbUgPGwCAvCBgAwCQAQRsAAAyoB1WOuuT9MsWlje+v8yWSOn8Uks/Ywry3oa0X4xov9i1/PMVoA3PjZIo9UlnrWZmvVF
O7mdZ3j8jny/b+HzZlvfPJ7XvZ2RIHACADCBgAwCQAUUM2N9NuwItkPfPyOfLNj5ftuX980lt+hkLdw4bAIAsKmIPGwCAzCFgAwCQAYUK2Gb2aTP7NzN7xcz+Mu36xMnM/tbMDpnZT9OuSxLMbIqZPW1mPzezl83sS2nXKW5mNtzMXjCzl/o/41fTrlPczGyImf2zmT2Sdl2SYGavmtm/mNmLZhbD/efai5mNMbO/N7N/7f9b/MO06xQXM/tIf7uVHsfNbFna9SpXmHPYZjZE0v8n6VOS9kn6J0mLnXM/S7ViMTGz2ZLekvTfnHPnpV2fuJnZ+yW93zm3y8xGStop6cq8tJ8kmbc6xNnOubfMrFPSdklfcs49l3LVYmNmyyX1SBrlnFuQdn3iZmavSupxzuVy4RQzu0fSj51zd5vZUEkjnHO5u+t8f7x4XdJM51wrF/YKVaQe9kWSXnHO7XHOvStpk6TPpFyn2DjnnpF0JO16JMU594Zzblf/6xOSfi5pUrq1ipfzvNX/tl2ok/MAAAJgSURBVLP/kZtf1GY2WdLlku5Ouy6on5mNkjRb0jpJcs69m8dg3e+Tkn7RTsFaKlbAniTptbL3+5Sz//CLwsw+KOn3JT2fbk3i1z9k/KKkQ5J+5JzL02f8pqQvS/ofaVckQU7SFjPbaWZL0q5MzKZJOixpff9pjbvN7Oy0K5WQRZI2pl2JakUK2H6L0eam91IUZvY+SQ9IWuacO552feLmnDvjnLtA0mRJF5lZLk5vmNkCSYecczvTrkvCZjnnLpQ0T9J/6j9VlRcdki6U9DfOud+X9LakXM0FkqT+of4rJH0/7bpUK1LA3idpStn7yZL2p1QXNKD/vO4Dku51zv1D2vVJUv9Q41ZJn065KnGZJemK/nO8myRdamZ/l26V4uec29//fEjSg/JOxeXFPkn7ykZ9/l5eAM+beZJ2OecOpl2RakUK2P8k6cNmNrX/F9QiSZtTrhMi6p+QtU7Sz51zq9OuTxLMrNvMxvS/PkvSXEn/mm6t4uGcu9k5N9k590F5f3tPOec+m3K1YmVmZ/dPiFT/UPEfS8rNVRvOuQOSXjOzj/Rv+qSk3Ez6LLNYbTgcLrXH7TVbwjl32sxukPS4pCGS/tY593LK1YqNmW2UNEfSeDPbJ+krzrl16dYqVrMkXSPpX/rP8UrSSufcP6ZYp7i9X9I9/TNUf0fS/c65XF7+lFMTJT3YfyvIDknfc849lm6VYvdFSff2d3r2SLo+5frEysxGyLuS6D+mXRc/hbmsCwCALCvSkDgAAJlFwAYAIAMI2AAAZAABGwCADCBgAwCQAQRsAAAygIANAEAG/P+uMuaa/akHvAAAAABJRU5ErkJggg==", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "plot_NQueens(bfts)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "`uniform_cost_search`" + ] + }, + { + "cell_type": "code", + "execution_count": 88, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "1.08 s ± 154 ms per loop (mean ± std. dev. 
of 7 runs, 1 loop each)\n" + ] + } + ], + "source": [ + "%%timeit\n", + "uniform_cost_search(nqp)" + ] + }, + { + "cell_type": "code", + "execution_count": 89, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "ucs = uniform_cost_search(nqp).solution()" + ] + }, + { + "cell_type": "code", + "execution_count": 90, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAewAAAHwCAYAAABkPlyAAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4wLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvpW3flQAAIABJREFUeJzt3X+4FNWd7/vPd9gbEMOvDRtMgGtgkifnTow4skecIXKJIWNAMHru3Bm4Ro/m5nJu7jEEwcmMPM88MXlONFcFQuLcycmRAc8ZA5pxjKgTJf4AA0adDaNMTGbuY8BERH5sgYBiInDW/aN2u7t7V1VXd1d1dVW9X8/TT3dXrVprdS82316rVq0y55wAAEB7+520KwAAAGojYAMAkAEEbAAAMoCADQBABhCwAQDIAAI2AAAZQMAGACADCNgAAGQAARtoM2b2QTP7RzM7amYHzOwuM+sIST/GzP6mP+1JM/sXM/sPrawzgOQRsIH28/9KOiTp/ZIukPS/SPq//RKa2VBJT0g6V9IfShot6c8l3W5mS1tSWwAtQcAG2s9USfc7537jnDsg6TFJHw1Ie42k/0nS/+ac2+ucO+Wce0zSUkn/2cxGSpKZOTP7UOkgM9tgZv+57P0CM3vRzI6Z2bNmdn7Zvg+Y2QNmdtjM9pb/EDCzW8zsfjP7b2Z2wsxeNrOesv1/YWav9+/7NzP7ZDxfEVA8BGyg/ayVtMjMRpjZJEnz5AVtP5+S9EPn3NtV2x+QNELSxbUKM7MLJf2tpP8oaZyk/yJps5kNM7PfkfSwpJckTZL0SUnLzOyysiyukLRJ0hhJmyXd1Z/vRyTdIOkPnHMjJV0m6dVa9QHgj4ANtJ9t8nrUxyXtk9Qr6QcBacdLeqN6o3PutKQ+Sd0Ryvs/Jf0X59zzzrkzzrl7JP1WXrD/A0ndzrmvOefedc7tkfRfJS0qO367c+4fnXNnJP13SdP7t5+RNEzS75lZp3PuVefcLyLUB4APAjbQRvp7tI9L+gdJZ8sLyGMl/T8Bh/TJO9ddnU9H/7GHIxR7rqQV/cPhx8zsmKQpkj7Qv+8DVftWSppYdvyBstcnJQ03sw7n3CuSlkm6RdIhM9tkZh+IUB8APgjYQHvpkhcs73LO/dY596ak9ZLmB6R/QtI8Mzu7avv/KumUpBf635+UN0Reck7Z69ckfd05N6bsMcI5t7F/396qfSOdc0H1qeCc+55z7uPyAr9T8A8PADUQsIE24pzrk7RX0hfMrMPMxkj6D/LOIfv57/KGzb/ffzlYZ//55W9Jut059+v+dC9K+t/NbIiZfVrezPOS/yrp/zKzmeY528wu75+w9oKk4/2Tx87qP/48M/uDWp/FzD5iZpea2TBJv5H0jrxhcgANIGAD7effS/q0vOHsVySdlnSjX0Ln3G8lzZXXE35eXlB8TNI3JX21LOmXJC2UdEzS1So7J+6c65V3HvsuSUf7y7yuf9+Z/uMukPdDok/S3fIuH6tlmKRv9B9zQNIEecPpABpgzrm06wAgJmbWKemHkl6XdJ3jDxzIDXrYQI44507JO3/9C0kfSbk6AGJEDxsA
gAyghw0AQAYE3lCgVcaPH+8++MEPpl2NxOzcuTPtKiRqxowZaVchcbRhttF+2Zf3NpTU55yruchR6kPiPT09rre3t/mMdlrzecyI/7swi6FebSztfz+tQBtmG+2XfXlvQ0k7nXM9tRJle0j84B1eoI4jWEsDeR1cFU9+AADEJJsB+9SbXmDd9+Vk8t93k5f/qYPJ5A8AQJ1SP4ddt7h601Hs7l+9MYGhcgAA6pGtHnYrg3U7lAsAQL9sBOxdw9IPmjtNOrIp3ToAAAqr/QP2TpPcu01nc8PtMdRl7+L0fzgAAAqpvc9h7xredBZWNlH+r+/3nl2zV5HtGiZd+NsmMwEAILr27mG72kGxe6507w/991nAVW1B2yOLoccPAEA92jdg1xh6th7v0XdM+uxfNR+ES/mVHuf9aXP1AwAgTu0ZsGsEw2/f57+90aDtd9zLeyIcSNAGALRI+wXs04dqJll6RwvqoYg/AE73JV4PAADaL2C/NDG2rIImlzU96azcSzXXawcAoGntNUv8jYFrr/x6t6VA63qjD3+7XunESWnUbOn4M9LIEdGrs/4rA6/D6qMDa6RzboyeMQAAdWqvHvb+v5AUHIz3lY2Wz5o+eH9Qz7kUpIOCddBx1y30nn91wH//e/V8fbl/AgAAYtJeAbuGKfMHXm9fVxlow4a5P3yV9zzu0uA01XmVvz93QX31BAAgbu0TsJuccf16yFy1V17zno8cD04Tti8SZowDABLUPgE7gvmzgvdNnh+8L4qw3veCS5rLGwCAZrVlwD65w3/7o2tbW4+Sh9f4b3/n2dbWAwBQXO0RsE9Vzuo6a5h3DvmsYQPbolyKteHhxop/aFvtNOXljxjuvR8+tCrRqcONVQAAgBraI2Dvfr/v5pM7pFPPe6+jXMZ1/VcHbzt9pvJ937HBaa5cUTvvUvnHtkpvbw9ItHtC7YwAAGhAewTsEB1Dmjt+6MWV77vnNpff6Pc1dzwAAI1o+4BdLkove9HKyvfOhaf/3NfiKRcAgCRlKmBHcd+W+tKv35xMPQAAiFMiAdvMPm1m/2Zmr5jZX9ZKv3x1HXm3uLdbT3n1fA4AAOoRe8A2syGS/lrSPEm/J2mxmf1e2DGrY17Z8wu3RUsX912/4v4cAACUJNHDvkjSK865Pc65dyVtkvSZOAtYsCx8/3ce8J637fLfv/kZ7znovtol1bPHr728dt0AAEhCEgF7kqTXyt7v69/2HjNbYma9ZtZ7+HDta5enfqDy/aNBl1VVmbPEf/tnIvaEq6/PvsfnsjEAAFohiYDtt6h2xVxt59x3nXM9zrme7u7a95P+8d2Dt81bGn5MV8hSo5I09hPh+5etCt8PAEArJRGw90maUvZ+sqT9oUdMD+9lT/JZj+SxGsuCHq1xM49jJ8L3r90Yvt/X+X0NHAQAQG1JBOx/kvRhM5tqZkMlLZIUfvFUx/iGCkpqxvhVNzV4YOe4WOsBAEBJR9wZOudOm9kNkh6XNETS3zrnXo67nCT9YGvaNQAAoFLsAVuSnHP/KOkf48xzYpd08EicOdZn5nnplQ0AQPusdDYjfA3RA3WuYFbuYx+S5l4k/e7kxvN4bkONBDXqDwBAMxLpYSfF9Qaft54/q7n7ZV92g7TlueByAQBIU3sF7Ml3SvvCZ3wd2yqNmeO9PrhFmtBVuf+6W6R7Hole5Kzp0vZ10uN3DWzbu1+adoX3OlLPfsq3ohcIAEAD2mdIXJIm1r4xden2lq7XC9abtni97tKjnmAtSTteqjx+4+PeQi2lXvXErvDjJUkTvlhfoQAA1MlcrftPJqynp8f19paNOZ86LO32ufC6StRLuhbOlq5fKM2ZIR09If1kt3Treulne2ofG2ko/Py+0Mu5zPzWkcmPtP/9tAJtmG20X/blvQ0l7XTO1Yxq7TUkLkmdtVc+C7J5tRegg4wdJU2bJF09r3L79helSz7fYKFcew0AaIH2C9iSN+N6Z/gvqtIEtM4O6d2qyWL1
LKjieqWPXzDQm+6cKZ0+E7F3zcxwAECLtGfAliIFbWkgWDe66ln5cWdekE49HzEvgjUAoIXaa9JZtam1F/QuTRbzc8sS6ejTXm+59Di5w9vuZ8hFEYP11O9HSAQAQHzab9JZtYBednVgvXKO9OCdjddj8Upvxnm5wGHxOnrXeZ8skfa/n1agDbON9su+vLehMjvprNoMJ+0aIbl3Bu3qe1IaN7py28jZ0lsno2ffNUp68ylp463eQ5K+sUG6+S6fxFM3Sl2LomcOAEBM2j9gS9KF/RG4qrfdMUSaeoX0avjNO0MdOV7ZW//lI4N72pI4Zw0ASFV7n8OuVhY0Xa/00LbmgrWfcxd4121XDIcTrAEAKctGD7vcDCedOiLtHqdrL5euvTzBss4/1NR14QAAxCVbPeySzi4vcE9Zk0z+U9Z6+ROsAQBtIns97HITlnkPKdI12zUx9A0AaFPZ7GH7meEGHtOPDtq9wq8zfv4blccBANCmst3DDtIxZlAAXvV3KdUFAIAY5KeHDQBAjhGwAQDIAAI2AAAZQMAGACADUr/5h5nlenp22t9v0gqwKD9tmHG0X/YVoA0j3fyDHjYAwNeYkZW3J3a90vKrB287Z1zaNS0GetgJS/v7TRq/7rMv721I+9Un8LbCdai+/XGzCtCG9LABALXddM1AbzkO5b1xxIcedsLS/n6TlvfemUQbZh3tF6xrlPTmUzFWJsDEP5YOHWn8+AK0YaQedj5XOgMAhIqrNx3FwS3ec9xD5UXDkDgAFEwrg3U7lJsXBGwAKIjfPJt+0HS90p99Kt06ZBUBGwAKwPVKw4Y2n88Ntzefx6bb0v/hkEVMOktY2t9v0vI+YUmiDbOO9pPe2SENH9ZkOT7nn5sNur99Vxr+R7XTFaANuawLABAtWHfPle79of++oMlizU4ii6PHXyT0sBOW9vebtLz3ziTaMOuK3n61esFRes5hgblW2o9Ok356f/11qCgj/21IDxsAiqxWsP72ff7bG+05+x338p7ax3E+OxoCNgDkUHdX7TRL70i+HlK0HwDjRidfj6wjYANADh3aEl9eQT3gOHvGfU/Gl1desdIZAOTMn18z8DrsHLXrjT787XqlEyelUbOl489II0dEr8/6r0Srz7LF0jc3Rs+3aOhhA0DO3P4l7zkoGO87NPB61vTB+4N6zqUgHRSsg467bqH3/KsD/vtL9Vyzwn8/PARsACiYKfMHXm9fVxlow4a5P3yV9zzu0uA01XmVvz93QX31RCUCNgDkSLPnlV8/FLzvlde85yPHg9OE7YuCGePBCNgAUDDzZwXvmzw/eF8UYb3vBZc0l3fREbABIKdO7vDf/uja1taj5OE1/tvfeba19cgqAjYA5MTEcZXvzxrmDTGfVbY0aZQh5w0PN1b+Q9tqpykvf8Rw7/3wqiVKx49prPy8Y2nShKX9/SYt78taSrRh1hWp/cKC8ekzUufM4HTVM8qr05QfL0mHnxgcWGvlUZ7m2FZp9PuC61ueVwHakKVJAQCejiHNHT/04sr33XObyy8sWMMfARsACibKYimLVla+r9XJ/dzX4ikXwWIP2Gb2t2Z2yMx+GnfeAIDWuK/OpU3Xb06mHhiQRA97g6RPJ5AvACDE8tXR07a6t1tPefV8jiKJPWA7556RdCTufAEA4VYvjze/L9wWLV3cd/2K+3PkBeewAaCgFiwL3/+dB7znbbv8929+xnsOuq92yZVVa4Rfe3ntumGwVAK2mS0xs14zYxE6AGiRqR+ofP/o9mjHzVniv/0zEXvC1ddn3/PVaMehUioB2zn3XedcT5TrzgAA8fjx3YO3zVsafkxXyFKjkjT2E+H7l60K34/oGBIHgJwY/8nw/ZMmDN72WI1lQY/WuJnHsRPh+9c2cH/rsPXIiyyJy7o2SvqJpI+Y2T4z+z/iLgMAMNibv27suKRmjF91U2PHNXvHr7zqiDtD59ziuPMEAGTPD7amXYN8YUgcAApkYle65c88
L93ys4ybfyQs7e83aXm/cYREG2ZdEduv1h25Gh0C/9iHvIC/d7/0i32N5dFI3QrQhpFu/hH7kDgAoL253uCgPX9Wc/fLvuwGactzweWicQRsAMiZFWukVTeGpzm2VRozx3t9cIs0oWqo/LpbpHseiV7mrOnS9nXS43cNbNu7X5p2hff6QIS1yb8Y84ppecOQeMLS/n6TlvfhVIk2zLqitl+U3qz1DKTbtEVavDI8fT2+93Vp8WWDy6lVHz8FaMNIQ+IE7ISl/f0mLe//2Uu0YdYVtf3Gj5EOPxHh+IjnsxfOlq5fKM2ZIR09If1kt3Treulne2ofGyVYj7s0+HKuArQh57ABoKj6jjV+7ObVXoAOMnaUNG2SdPW8yu3bX5Qu+XxjZXLtdW30sBOW9vebtLz3ziTaMOuK3n5Rh6I7O6R3nxu8ParqcjpnSqfPNDcU/l7e+W9DetgAUHRRzx+XgnWjl3yVH3fmBenU89HyavV9ubOMhVMAIOcW3Vw7jfUEB89blkhHn/YCf+lxcoe33c+Qi6IF4j/5cu00GMCQeMLS/n6TlvfhVIk2zDrazxPUy64OrFfOkR68s/H6LF7pzThvpOwgBWhDZom3g7S/36Tl/T97iTbMOtpvwNvbpRHDq47vkfqelMaNrtw+crb01sno9egaJb35VOW2b2yQbr5rcMBedLN034+i512ANuQcNgBgwNkf956rA2jHEGnqFdKr+xvP+8jxyh7zLx8Z3NOWOGfdDM5hA0DBlAdN1ys9tK25YO3n3AXeddvlPw4I1s1hSDxhaX+/Scv7cKpEG2Yd7Rds7EjpyNMxViZA99zmrgsvQBtGGhKnhw0ABXX0hNfrXbYqmfyX3tF/jryJYI0B9LATlvb3m7S8984k2jDraL/6xHFHrbiHvgvQhvSwAQD1KV2PbT0Dd/Mqt2LN4G3nXFZ5HJJBDzthaX+/Sct770yiDbOO9su+ArQhPWwAAPKCgA0AQAYQsAEAyIDUVzqbMWOGentjmJbYpvJ+finv55Yk2jDraL/sy3sbRkUPGwCADEi9hw0gR3bG0BOakf8eI9AIetgAmnPwDi9QxxGspYG8Dia0/BaQUQRsAI059aYXWPd9OZn8993k5X/qYDL5AxnDkDiA+sXVm45i9zneM0PlKDh62ADq08pg3Q7lAm2CgA0gml3D0g+aO006sindOgApIWADqG2nSe7dprO54fYY6rJ3cfo/HIAUcA4bQLhdw5vOovwOTn99v/fc9G0cdw2TLvxtk5kA2UEPG0A4Vzsods+V7v2h/76g2y02fRvGGHr8QJYQsAEEqzH0XLr/cd8x6bN/1XwQLr+nsvVI5/1pc/UD8oSADcBfjWD47fv8tzcatP2Oe3lPhAMJ2igIAjaAwU4fqplk6R0tqIci/gA43Zd4PYC0EbABDPbSxNiyCppc1vSks3IvdceYGdCemCUOoNIbA9de+fVuS4HW9UYf/na90omT0qjZ0vFnpJEjoldn/VcGXofVRwfWSOfcGD1jIGPoYQOotP8vJAUH431lo+Wzpg/eH9RzLgXpoGAddNx1C73nXx3w3/9ePV9f7p8AyAkCNoC6TJk/8Hr7uspAGzbM/eGrvOdxlwanqc6r/P25C+qrJ5A3BGwAA5qccf16yFy1V17zno8cD04Tti8SZowjxwjYAOoyf1bwvsnzg/dFEdb7XnBJc3kDWUfABuDr5A7/7Y+ubW09Sh5e47/9nWdbWw8gLQRsAJ5TlbO6zhrmnUM+a9jAtiiXYm14uLHiH9pWO015+SOGe++HD61KdOpwYxUA2hwBG4Bn9/t9N5/cIZ163nsd5TKu6786eNvpM5Xv+44NTnPlitp5l8o/tlV6e3tAot0TamcEZBABG0BNHUOaO37oxZXvu+c2l9/o9zV3PJBFBGwAdYnSy160svK9c+HpP/e1eMoF8oyADSB2922pL/36zcnUA8iT2AO2mU0xs6fN7Odm9rKZfSnuMgDEb/nq6Glb
3dutp7x6PgeQJUn0sE9LWuGc+58lXSzpP5nZ7yVQDoAYrY55Zc8v3BYtXdx3/Yr7cwDtIvaA7Zx7wzm3q//1CUk/lzQp7nIApGvBsvD933nAe962y3//5me856D7apdUzx6/9vLadQPyKNFz2Gb2QUm/L+n5qu1LzKzXzHoPH+aaSSALpn6g8v2jQZdVVZmzxH/7ZyL2hKuvz77H57IxoAgSC9hm9j5JD0ha5pyrWCHYOfdd51yPc66nu5v72AJZ8OO7B2+btzT8mK6QpUYlaewnwvcvWxW+HyiSRAK2mXXKC9b3Ouf+IYkyAMRsevho1ySf9Ugeq7Es6NEaN/M4diJ8/9qN4ft9nd/XwEFA+0tilrhJWifp58455msCWdExvqHDkpoxftVNDR7YOS7WegDtIoke9ixJ10i61Mxe7H80eQ8fAEXzg61p1wBoLx1xZ+ic2y6Jm9ICOTSxSzp4JL3yZ56XXtlA2ljpDMCAGeFriB6ocwWzch/7kDT3Iul3Jzeex3MbaiSoUX8gy2LvYQPIN9cbfN56/qzm7pd92Q3SlueCywWKjIANoNLkO6V94TO+jm2VxszxXh/cIk3oqtx/3S3SPY9EL3LWdGn7Ounxuwa27d0vTbvCex2pZz/lW9ELBDKIIXEAlSbWvjF16faWrtcL1pu2eL3u0qOeYC1JO16qPH7j495CLaVe9cSu8OMlSRO+WF+hQMaYq3Xfu4T19PS43t78jnV5V7nlV9r/flqhkG146rC02+fC6ypRL+laOFu6fqE0Z4Z09IT0k93Sreuln+2JUL8o/z2c3xd4OVch2y9n8t6GknY652r+NTEkDmCwzsZXINy82gvQQcaOkqZNkq6eV7l9+4vSJZ9vsFCuvUYBELAB+JvhpJ3hPZvSBLTODundqsli9Syo4nqlj18w0JvunCmdPhOxd83McBQEARtAsAhBWxoI1o2uelZ+3JkXpFPPR8yLYI0CYdIZgHBTay/oXZos5ueWJdLRp73eculxcoe33c+QiyIG66nfj5AIyA8mnSUs75Ml0v730wq0oQJ72dWB9co50oN3Nl6XxSu9GeflAofFI/auab/sy3sbiklnAGIzw0m7RkjunUG7+p6Uxo2u3DZytvTWyejZd42S3nxK2nir95Ckb2yQbr7LJ/HUjVLXouiZAzlBwAYQzYX9Ebiqt90xRJp6hfTq/sazPnK8srf+y0cG97Qlcc4ahcY5bAD1KQuarld6aFtzwdrPuQu867YrhsMJ1ig4etgA6jfDSaeOSLvH6drLpWsvT7Cs8w81dV04kBf0sAE0prPLC9xT1iST/5S1Xv4Ea0ASPWwAzZqwzHtIka7Zromhb8AXPWwA8ZnhBh7Tjw7avcKvM37+G5XHAfBFDxtAMjrGDArAq/4upboAOUAPGwCADCBgAwCQAQRsAAAygIANAEAGpH7zDzPL9bTQtL/fpBVgUX7aMONov+wrQBty8w8AAAKdOSq92FWxacUaadWNVenO3y91vr919QpADzthaX+/SePXffblvQ1pv+yLtQ3bcHGfqD1szmEDAPLt4B1eoI4jWEsDeR1cFU9+EdHDTlja32/S+HWffXlvQ9ov+xpuw1NvSrvHx1sZP+cfkDonNnw457ABAMUVV286it3neM8JL63LkDgAIF9aGaxbWC4BGwCQD7uGpResS3aadGRTIlkTsAEA2bfTJPdu09nccHsMddm7OJEfDkw6S1ja32/SmPCSfXlvQ9ov+2q24a7hkvttU2WYz5Qv19tUlpINlS6sXS8u6wIAFEOEYN09V7r3h/77/IJ12PbIYujxl6OHnbC0v9+k8es++/LehrRf9oW2YY2h5yg957DAXCvtR6dJP70/tAo1Z4/TwwYA5FuNYP3t+/y3N9pz9jvu5T0RDozpfDYBGwCQPacP1Uyy9I4W1EMRfwCc7mu6HAI2ACB7Xmp8ZbFqQZPLmp50Vu6l7qazYKUzAEC2vDFw7VXYOWrXG3342/VK
J05Ko2ZLx5+RRo6IXp31Xxl4HXrO/MAa6ZzqW4FFRw8bAJAt+/9CUnAw3lc2Wj5r+uD9QT3nUpAOCtZBx1230Hv+1QH//e/V8/Xl/gkiImADAHJlyvyB19vXVQbasGHuD1/lPY+7NDhNdV7l789dUF8960XABgBkR5Mzrl8Pmav2ymve85HjwWnC9kXSRP0J2ACAXJk/K3jf5PnB+6II630vuKS5vGshYAMAMunkDv/tj65tbT1KHl7jv/2dZ+PJn4ANAMiGU5Wzus4a5p1DPmvYwLYol2JteLix4h/aVjtNefkjhnvvhw+tSnTqcEPlszRpwtL+fpNW+GURcyDvbUj7Zd97bRhy/vf0GalzZn96n6BdPaO8Ok358ZJ0+Alp/Jj68ihPc2yrNPp9gdWtWK6UpUkBAIXRMaS544deXPm+e25z+YUG6wYRsAEAuRJlsZRFKyvf1xqI+dzX4im3GbEHbDMbbmYvmNlLZvaymX017jIAAGjGfVvqS79+czL1qEcSPezfSrrUOTdd0gWSPm1mF9c4BgCAUMtXR0+bdG+3mfLq+RzlYg/YzvNW/9vO/ke+Z30AABK3urmVPQf5wm3R0sV9169GP0ci57DNbIiZvSjpkKQfOeeer9q/xMx6zSzOe6EAAPCeBcvC93/nAe952y7//Zuf8Z6D7qtdcuWKyvfXXl67bo1I9LIuMxsj6UFJX3TO/TQgTa5731xSkn20YbbRftkX5bIuSZp2hbR3f9Wx/d3CoCHrWnf0CtsflHek23K222VdzrljkrZK+nSS5QAA8OO7B2+btzT8mK6QpUYlaewnwvcvWxW+P05JzBLv7u9Zy8zOkjRX0r/GXQ4AoGCmh68QNmnC4G2P1VgW9GiNm3kcOxG+f+3G8P2+zu9r4CCpo6Gjwr1f0j1mNkTeD4L7nXOPJFAOAKBIOsY3dFhSM8avuqnBAzvHNXRY7AHbObdb0u/HnS8AAO3kB1tbWx4rnQEAcmNiV7rlzzwvuby5+UfC0v5+k1aoGao5lfc2pP2yb1Ab1pgt3ugQ+Mc+5AX8vfulX+xrLI+aM8RnDP73GHWWeBLnsAEASE3YpVjzZzV3v+zLbpC2PBdcbpII2ACAbJl8p7QvfMbXsa3SmDne64NbpAlVQ+XX3SLdU8d06FnTpe3rpMfvGti2d7937bckHYiyNvmUb0Uv0AdD4glL+/tNWiGH43Im721I+2WfbxvWGBaXvF52qde7aYu0eGV4+np87+vS4ssGlxPKZzhcij4kTsBOWNrfb9IK+59FjuS9DWm/7PNtw1OHpd0+F15XiXo+e+Fs6fqF0pwZ0tET0k92S7eul362J0L9ogTr8/sCL+fiHDYAIL86uxs+dPNqL0AHGTtKmjZJunpe5fbtL0qXfL7BQhu89rocPeyEpf39Jq2wv+5zJO9tSPtlX2gbRhwa7+yQ3n1u8PbIdajqRXfOlE6faW4o/L160MMGAOTeDBcpaJeCdaOXfJUfd+YF6dTzEfOqEazrwcIpAIBsm1pjBNUvAAAgAElEQVR7QW/rCQ6wtyyRjj7t9ZZLj5M7vO1+hlwUMVhP/X6ERNExJJ6wtL/fpBV+OC4H8t6GtF/2RWrDgF52dWC9co704J2N12XxSm/GebnAYfGIvWtmibeJtL/fpPGfRfblvQ1pv+yL3Ia7RkjunYpN1iP1PSmNG12ZdORs6a2T0evQNUp686nKbd/YIN18l0/AnrpR6loUOW/OYQMAiuXC/ghc1dvuGCJNvUJ6dX/jWR85Xtlb/+Ujg3vakmI9Z12Nc9gAgHwpC5quV3poW3PB2s+5C7zrtit61wkGa4kh8cSl/f0mjeG47Mt7G9J+2ddwG546Iu1u/vrnms4/1NR14VGHxOlhAwDyqbPL6/VOWZNM/lPWevk3EazrQQ87YWl/v0nj13325b0Nab/si7UNI1yzXVPMQ9/0sAEAqDbDDTymHx20e4VfZ/z8NyqPSwk97ISl/f0mjV/32Zf3
NqT9sq8AbUgPGwCAvCBgAwCQAQRsAAAyIPWVzmbMmKHe3ij3J8umvJ9fyvu5JYk2zDraL/vy3oZR0cMGACADUu9hI7pIN0qvodF7wQIA0kUPu83ddM3A/VnjUMpr+dXx5AcAaA0CdpvqGuUF1ju+lEz+q2708p/QlUz+AIB4MSTehuLqTUdxsP/2cAyVA0B7o4fdZloZrNuhXABANATsNvGbZ9MPmq5X+rNPpVsHAIA/AnYbcL3SsKHN53PD7c3nsem29H84AAAG4xx2yt7Z0Xwe5eef//p+77nZoPubZ6Xhf9RcHgCA+NDDTtnwYbXTdM+V7v2h/76gyWLNTiKLo8cPAIgPATtFtXrB1uM9+o5Jn/2r5oNwKb/S47w/ba5+AIDWIWCnpFYw/PZ9/tsbDdp+x728p/ZxBG0AaA8E7BR0R1isZOkdyddDivYDYNzo5OsBAAhHwE7BoS3x5RXUA46zZ9z3ZHx5AQAawyzxFvvzawZe+/VuS4HW9UYf/na90omT0qjZ0vFnpJEjotdn/Vei1WfZYumbG6PnCwCIFz3sFru9f23woGC879DA61nTB+8P6jmXgnRQsA467rqF3vOvDvjvL9VzzQr//QCA1iBgt5kp8wdeb19XGWjDhrk/fJX3PO7S4DTVeZW/P3dBffUEALQWAbuFmj2v/Pqh4H2vvOY9HzkenCZsXxTMGAeA9BCw28z8WcH7Js8P3hdFWO97wSXN5Q0ASBYBOyUnA5YkfXRta+tR8vAa/+3vPNvaegAA/BGwW2TiuMr3Zw3zhpjPKluaNMqQ84aHGyv/oW2105SXP2K493541RKl48c0Vj4AoDkE7BY58Lj/9pM7pFPPe6+jXMZ1/VcHbzt9pvJ937HBaa6MMMu7VP6xrdLb2/3THH6idj4AgPgRsNtAx5Dmjh96ceX77rnN5Tf6fc0dDwCIHwG7zUTpZS9aWfneufD0n/taPOUCANKTSMA2syFm9s9m9kgS+RfdfXUubbp+czL1AAC0TlI97C9J+nlCeWfS8tXR07a6t1tPefV8DgBAfGIP2GY2WdLlku6OO+8sW7083vy+cFu0dHHf9SvuzwEAiCaJHvY3JX1Z0v8ISmBmS8ys18x6Dx8+nEAVsm/BsvD933nAe962y3//5me856D7apdUzx6/9vLadQMAtF6sAdvMFkg65JzbGZbOOfdd51yPc66nu7s7zipk1tQPVL5/NOCyqmpzlvhv/0zEnnD19dn3+Fw2BgBIX9w97FmSrjCzVyVtknSpmf1dzGXk0o99TiDMWxp+TFfIUqOSNPYT4fuXrQrfDwBoH7EGbOfczc65yc65D0paJOkp59xn4ywjq8Z/Mnz/pAmDtz1WY1nQozVu5nHsRPj+tQ3c3zpsPXIAQHK4DrtF3vx1Y8clNWP8qpsaO67ZO34BABrTkVTGzrmtkrYmlT+a84OtadcAAFAPethtZGJXuuXPPC/d8gEAwQjYLVRrePtAnSuYlfvYh6S5F0m/O7nxPJ7bEL6f5UsBID2JDYmjMa43ODDOn9Xc/bIvu0Ha8lxwuQCA9kXAbrEVa6RVN4anObZVGjPHe31wizShaqj8uluke+pYpX3WdGn7Ounxuwa27d0vTbvCex2lZ//FmFdMAwDUx1ytWz0lrKenx/X25rd7Z2aDtkXpzVrPQLpNW6TFK8PT1+N7X5cWXza4nFr18ZP2v59W8GvDPMl7G9J+2Zf3NpS00zlX86QjATthfv/Qxo+RDj8R4diI54wXzpauXyjNmSEdPSH9ZLd063rpZ3tqHxslWI+7NPhyrrT//bRC3v+zyHsb0n7Zl/c2VMSAzZB4CvqONX7s5tVegA4ydpQ0bZJ09bzK7dtflC75fGNlcu01AKSPgJ2SKEPRpQlonR3Su1WTxeqZse16pY9fMFBe50zp9JnmhsIBAK1FwE5R1PPHpWDdaPAsP+7MC9Kp56PlRbAGgPbBddgpW3Rz7TTWExw8b1kiHX3aC/ylx8kd
3nY/Qy6KFoj/5Mu10wAAWodJZwmLMlkiqJddHVivnCM9eGfjdVm80ptx3kjZQdL+99MKeZ/wkvc2pP2yL+9tKCadZYf1SG9vl0YMH7yv70lp3OjKbSNnS2+djJ5/1yjpzaekjbd6D0n6xgbp5rsGp110s3Tfj6LnDQBoDQJ2mzj7495zdY+3Y4g09Qrp1f2N533keGWP+ZePDO5pS5yzBoB2xjnsNlMeNF2v9NC25oK1n3MXeNdtl/84IFgDQHujh92GrEcaO1I68rR07eXeIyndc5u7LhwA0Br0sNvU0RNe4F62Kpn8l97h5U+wBoBsoIfd5tZu9B5SPHfUYugbALKJHnaGlK7Htp6Bu3mVW7Fm8LZzLqs8DgCQTfSwM+rXb/kH4NX3tr4uAIDk0cMGACADCNgAAGQAARsAgAxIfS1xM8v1Qrhpf79JK8Aav7RhxtF+2VeANoy0ljg9bAAAMoBZ4kCr7IyhJzQj3z0NAMHoYQNJOniHF6jjCNbSQF4HE1oCD0Db4hx2wtL+fpPG+bMAp96Udo+PvzLVzj8gdU5sKou8tyF/g9lXgDbkfthAKuLqTUex+xzvmaFyIPcYEgfi1Mpg3Q7lAmgZAjYQh13D0g+aO006sindOgBIDAEbaNZOk9y7TWdzw+0x1GXv4vR/OABIBJPOEpb295u0wk942TVccr9tKn+/m7g0fStVGypdGK1eeW9D/gazrwBtyMIpQOIiBOvuudK9P/TfF3TL06ZvhRpDjx9Ae6GHnbC0v9+kFfrXfY2h5yg957DAXCvtR6dJP70/tAqRZo/nvQ35G8y+ArQhPWwgMTWC9bfv89/eaM/Z77iX90Q4kPPZQG4QsIF6nT5UM8nSO1pQD0X8AXC6L/F6AEgeARuo10vNrSxWLmhyWdOTzsq91B1jZgDSwkpnQD3eGLj2KuwcteuNPvzteqUTJ6VRs6Xjz0gjR0SvzvqvDLwOPWd+YI10zo3RMwbQduhhA/XY/xeSgoPxvrLR8lnTB+8P6jmXgnRQsA467rqF3vOvDvjvf6+ery/3TwAgMwjYQIymzB94vX1dZaANG+b+8FXe87hLg9NU51X+/twF9dUTQPYQsIGompxx/XrIXLVXXvOejxwPThO2LxJmjAOZRsAGYjR/VvC+yfOD90UR1vtecElzeQNofwRsoAEnd/hvf3Rta+tR8vAa/+3vPNvaegBIDgEbiOJU5ayus4Z555DPGjawLcqlWBsebqz4h7bVTlNe/ojh3vvhQ6sSnTrcWAUApI6lSROW9vebtMIsixhy/vf0GalzZn9an6BdPaO8Ok358ZJ0+Alp/Jj68ihPc2yrNPp9gdUdtFxp3tuQv8HsK0AbsjQp0AodQ5o7fujFle+75zaXX2iwBpBZBGwgRlEWS1m0svJ9rc7D574WT7kAsi2RgG1mr5rZv5jZi2YW5yKLQObdt6W+9Os3J1MPANmSZA/7E865C6KMywPtbvnq6Glb3dutp7x6PgeA9sKQOBDB6phX9vzCbdHSxX3Xr7g/B4DWSSpgO0lbzGynmS2p3mlmS8ysl+Fy5NWCZeH7v/OA97xtl//+zc94z0H31S65ckXl+2svr103ANmUyGVdZvYB59x+M5sg6UeSvuiceyYgba7n6xfgcoS0q5C4Wpd1SdK0K6S9+6uO6/85GjRkXeuOXmH7g/KOdFtOLuvKlby3n1SINkzvsi7n3P7+50OSHpR0URLlAO3ix3cP3jZvafgxXSFLjUrS2E+E71+2Knw/gHyJPWCb2dlmNrL0WtIfS/pp3OUALTU9fIWwSRMGb3usxrKgR2vczOPYifD9azeG7/d1fl8DBwFoBx0J5DlR0oP9wzQdkr7nnHssgXKA1ukY39BhSc0Yv+qmBg/sHBdrPQC0TuwB2zm3R9L0uPMFMOAHW9OuAYBW47IuICYTu9Itf+Z56ZYPIFnc/CNhaX+/SSvcDNUas8UbHQL/2Ie8gL93v/SLfY3lUXOG+Az/f4t5
b0P+BrOvAG0YaZZ4EuewgcIKuxRr/qzm7pd92Q3SlueCywWQbwRsoB6T75T2hc/4OrZVGjPHe31wizShaqj8ulukex6JXuSs6dL2ddLjdw1s27vfu/Zbkg5EWZt8yreiFwigLTEknrC0v9+kFXI4rsawuOT1sku93k1bpMUrw9PX43tflxZfNricUAHD4VL+25C/wewrQBtGGhInYCcs7e83aYX8z+LUYWm3z4XXVaKez144W7p+oTRnhnT0hPST3dKt66Wf7YlQtyjB+vy+0Mu58t6G/A1mXwHakHPYQCI6uxs+dPNqL0AHGTtKmjZJunpe5fbtL0qXfL7BQrn2GsgFetgJS/v7TVqhf91HHBrv7JDefW7w9sjlV/WiO2dKp880PxT+Xl1y3ob8DWZfAdqQHjaQqBm1bwoiDQTrRi/5Kj/uzAvSqecj5hUhWAPIDhZOAZoxtfaC3tYTHGBvWSIdfdrrLZceJ3d42/0MuShisJ76/QiJAGQJQ+IJS/v7TRrDcQrsZVcH1ivnSA/e2Xg9Fq/0ZpxX1C1oWLyO3nXe25C/wewrQBsyS7wdpP39Jo3/LPrtGiG5dyo2WY/U96Q0bnRl0pGzpbdORi+/a5T05lOV276xQbr5Lp+APXWj1LUoeubKfxvyN5h9BWhDzmEDLXNhfwSu6m13DJGmXiG9ur/xrI8cr+yt//KRwT1tSZyzBnKOc9hAnMqCpuuVHtrWXLD2c+4C77rtit41wRrIPYbEE5b295s0huMCnDoi7W7B9c/nH2rqunAp/23I32D2FaANIw2J08MGktDZ5fV6p6xJJv8pa738mwzWALKDHnbC0v5+k8av+zpEuGa7pgSGvvPehvwNZl8B2pAeNtBWZriBx/Sjg3av8OuMn/9G5XEACosedsLS/n6Txq/77Mt7G9J+2VeANqSHDQBAXhCwAQDIAAI2AAAZkPpKZzNmzFBvb5T7BGZT3s8v5f3ckkQbZh3tl315b8Oo6GEDAJABBGwAADIg9SFxAMiKwNuZ1iHS/cwBH/SwASDETdd4gTqOYC0N5LX86njyQ3EQsAHAR9coL7De8aVk8l91o5f/hK5k8kf+MCQOAFXi6k1HcbD/3uYMlaMWetgAUKaVwbodykV2ELABQNJvnk0/aLpe6c8+lW4d0L4I2AAKz/VKw4Y2n88Ntzefx6bb0v/hgPbEOWwAhfbOjubzKD///Nf3e8/NBt3fPCsN/6Pm8kC+0MMGUGjDh9VO0z1XuveH/vuCJos1O4ksjh4/8oWADaCwavWCrcd79B2TPvtXzQfhUn6lx3l/2lz9UCwEbACFVCsYfvs+/+2NBm2/417eU/s4gjZKCNgACqc7wmIlS+9Ivh5StB8A40YnXw+0PwI2gMI5tCW+vIJ6wHH2jPuejC8vZBezxAEUyp9fM/Dar3dbCrSuN/rwt+uVTpyURs2Wjj8jjRwRvT7rvxKtPssWS9/cGD1f5A89bACFcnv/2uBBwXjfoYHXs6YP3h/Ucy4F6aBgHXTcdQu9518d8N9fqueaFf77URwEbAAoM2X+wOvt6yoDbdgw94ev8p7HXRqcpjqv8vfnLqivnigeAjaAwmj2vPLrh4L3vfKa93zkeHCasH1RMGO82AjYAFBm/qzgfZPnB++LIqz3veCS5vJG/hGwARTSyYAlSR9d29p6lDy8xn/7O8+2th5oXwRsAIUwcVzl+7OGeUPMZ5UtTRplyHnDw42V/9C22mnKyx8x3Hs/vGqJ0vFjGisf2UfABlAIBx73335yh3Tqee91lMu4rv/q4G2nz1S+7zs2OM2VEWZ5l8o/tlV6e7t/msNP1M4H+UTABlB4HUOaO37oxZXvu+c2l9/o9zV3PPIpkYBtZmPM7O/N7F/N7Odm9odJlAMAcYvSy160svK9c+HpP/e1eMpFsSXVw14r6THn3L+TNF3SzxMqBwBa7r46lzZdvzmZeqBYYg/YZjZK0mxJ6yTJOfeuc87njA4AtM7y1dHT
trq3W0959XwO5EsSPexpkg5LWm9m/2xmd5vZ2QmUAwCRrV4eb35fuC1aurjv+hX350B2JBGwOyRdKOlvnHO/L+ltSX9ZnsDMlphZr5n1Hj58OIEqAEBzFiwL3/+dB7znbbv8929+xnsOuq92SfXs8Wsvr103FFMSAXufpH3Ouf4LJfT38gL4e5xz33XO9Tjnerq7uxOoAgDUZ+oHKt8/GnBZVbU5S/y3fyZiT7j6+ux7fC4bA6QEArZz7oCk18zsI/2bPinpZ3GXAwBx+vHdg7fNWxp+TFfIUqOSNPYT4fuXrQrfD5RLapb4FyXda2a7JV0g6daEygGASMZ/Mnz/pAmDtz1WY1nQozVu5nHsRPj+tQ3c3zpsPXLkW0cSmTrnXpTEVYUA2sabv27suKRmjF91U2PHNXvHL2QXK50BQAp+sDXtGiBrCNgA0G9iV7rlzzwv3fLR3gjYAAqj1vD2gTpXMCv3sQ9Jcy+Sfndy43k8tyF8P8uXFlsi57ABIKtcb3BgnD+ruftlX3aDtOW54HKBMARsAIWyYo206sbwNMe2SmPmeK8PbpEmVA2VX3eLdM8j0cucNV3avk56/K6BbXv3S9Ou8F5H6dl/MeYV05A95mrdZiZhPT09rrc3vz8tzSztKiQq7X8/rUAbZptf+0XpzVrPQLpNW6TFK8PT1+N7X5cWXza4nFr18ZP39pPy/zcoaadzruYJDwJ2wvL+Dy3tfz+tQBtmm1/7jR8jHX4iwrERzxkvnC1dv1CaM0M6ekL6yW7p1vXSz/bUPjZKsB53afDlXHlvPyn/f4OKGLAZEgdQOH1N3D9w82ovQAcZO0qaNkm6el7l9u0vSpd8vrEyufYaEgEbQEFFGYouTUDr7JDerZosVs+MbdcrffyCgfI6Z0qnzzQ3FI7iIWADKKyo549LwbrR4Fl+3JkXpFPPR8uLYI1yXIcNoNAW3Vw7jfUEB89blkhHn/YCf+lxcoe33c+Qi6IF4j/5cu00KBYmnSUs75Ml0v730wq0YbZFab+gXnZ1YL1yjvTgnY3XZfFKb8Z5I2UHyXv7Sfn/GxSTzgAgGuuR3t4ujRg+eF/fk9K40ZXbRs6W3joZPf+uUdKbT0kbb/UekvSNDdLNdw1Ou+hm6b4fRc8bxUHABgBJZ3/ce67u8XYMkaZeIb26v/G8jxyv7DH/8pHBPW2Jc9YIxzlsAChTHjRdr/TQtuaCtZ9zF3jXbZf/OCBYoxZ62ABQxXqksSOlI09L117uPZLSPbe568JRHPSwAcDH0RNe4F62Kpn8l97h5U+wRlT0sAEgxNqN3kOK545aDH2jUfSwASCi0vXY1jNwN69yK9YM3nbOZZXHAY2ihw0ADfj1W/4BePW9ra8LioEeNgAAGUDABgAgAwjYAABkQOpriZtZrhfCTfv7TVoB1vilDTOO9su+ArRhpLXE6WEDAJABzBJH2+AaVwAIRg8bqbrpmoF7CMehlNfyq+PJDwDaBeewE5b295u0Rs+flW43mLSJfywdOtJcHrRhttF+2VeANuR+2GhPcfWmozjYfwtDhsoBZB1D4mipVgbrdigXAOJCwEZL/ObZ9IOm65X+7FPp1gEAGkXARuJcrzRsaPP53HB783lsui39Hw4A0AgmnSUs7e83abUmvLyzQxo+rMkyfM4/Nxt0f/uuNPyPoqUtehtmHe2XfQVoQxZOQfqiBOvuudK9P/TfFzRZrNlJZHH0+AGglehhJyzt7zdpYb/ua/WCo/ScwwJzrbQfnSb99P766zConAK3YR7QftlXgDakh4301ArW377Pf3ujPWe/417eU/s4zmcDyAoCNmLX3VU7zdI7kq+HFO0HwLjRydcDAJpFwEbsDm2JL6+gHnCcPeO+J+PLCwCSwkpniNWfXzPwOuwcteuNPvzteqUTJ6VRs6Xjz0gjR0Svz/qvRKvPssXSNzdGzxcAWo0eNmJ1+5e856BgvO/QwOtZ0wfvD+o5l4J0ULAOOu66hd7zrw74
7y/Vc80K//0A0C4I2GipKfMHXm9fVxlow4a5P3yV9zzu0uA01XmVvz93QX31BIB2Q8BGbJo9r/z6oeB9r7zmPR85HpwmbF8UzBgH0M4I2Gip+bOC902eH7wvirDe94JLmssbANJGwEYiTu7w3/7o2tbWo+ThNf7b33m2tfUAgEYRsBGLieMq3581zBtiPqtsadIoQ84bHm6s/Ie21U5TXv6I4d774VVLlI4f01j5AJA0liZNWNrfb9JKyyKGBePTZ6TOmQpMVz2jvDpN+fGSdPiJwYG1Vh7laY5tlUa/L7i+g/IqSBvmFe2XfQVoQ5YmRXvoGNLc8UMvrnzfPbe5/MKCNQC0KwI2WirKYimLVla+r/Xj+nNfi6dcAGhnsQdsM/uImb1Y9jhuZsviLgf5dV+dS5uu35xMPQCgncQesJ1z/+acu8A5d4GkGZJOSnow7nLQXpavjp621b3desqr53MAQCslPST+SUm/cM79MuFykLLVy+PN7wu3RUsX912/4v4cABCXpAP2IkmDbqlgZkvMrNfMWFuqoBbUOEnynQe85227/PdvfsZ7DrqvdsmVVWuEX3t57boBQDtK7LIuMxsqab+kjzrnDoaky/V8/QJcjiCp9jXW066Q9u6v3FY6JmjIutYdvcL2B+Ud5VpwLuvKF9ov+wrQhqlf1jVP0q6wYI3i+PHdg7fNWxp+TFfIUqOSNPYT4fuXrQrfDwBZkmTAXiyf4XDk0/hPhu+fNGHwtsdqLAt6tMbNPI6dCN+/toF/fWHrkQNAmhIJ2GY2QtKnJP1DEvmj/bz568aOS2rG+FU3NXZcs3f8AoCkdCSRqXPupKRxNRMCCfnB1rRrAADxYqUztMzErnTLn3leuuUDQDO4+UfC0v5+k1Y9Q7XWLOxGh8A/9iEv4O/dL/1iX2N5NFq3orVh3tB+2VeANow0SzyRIXEgSNilWPNnNXe/7MtukLY8F1wuAGQZARuxWrFGWnVjeJpjW6Uxc7zXB7dIE6qGyq+7RbrnkehlzpoubV8nPX7XwLa9+71rvyXpQIS1yb8Y84ppABA3hsQTlvb3mzS/4bioi5OU0m3aIi1eGZ6+Ht/7urT4ssHl1KpPkCK2YZ7QftlXgDaMNCROwE5Y2t9v0vz+sxg/Rjr8RIRjI57PXjhbun6hNGeGdPSE9JPd0q3rpZ/tqX1slGA97tLwy7mK2IZ5QvtlXwHakHPYSEffscaP3bzaC9BBxo6Spk2Srp5XuX37i9Iln2+sTK69BpAF9LATlvb3m7SwX/dRh6I7O6R3nxu8ParqcjpnSqfPND8U/l7+BW7DPKD9sq8AbUgPG+mKev64FKwbveSr/LgzL0inno+WV6vvyw0AzWDhFCRq0c2101hPcPC8ZYl09Gkv8JceJ3d42/0MuShaIP6TL9dOAwDthCHxhKX9/SYtynBcUC+7OrBeOUd68M7G67J4pTfjvJGyw9CG2Ub7ZV8B2pBZ4u0g7e83aVH/s3h7uzRieNWxPVLfk9K40ZXbR86W3joZvQ5do6Q3n6rc9o0N0s13DQ7Yi26W7vtR9Lwl2jDraL/sK0Abcg4b7ePsj3vP1QG0Y4g09Qrp1f2N533keGWP+ZePDO5pS5yzBpBtnMNGS5UHTdcrPbStuWDt59wF3nXb5T8OCNYAso4h8YSl/f0mrdHhuLEjpSNPx1wZH91zm7suXKINs472y74CtGGkIXF62EjF0RNer3fZqmTyX3pH/znyJoM1ALQLetgJS/v7TVqcv+7juKNWEkPftGG20X7ZV4A2pIeNbCldj209A3fzKrdizeBt51xWeRwA5BU97ISl/f0mjV/32Zf3NqT9sq8AbUgPGwCAvCBgAwCQAQRsAAAyoB1WOuuT9MsWlje+v8yWSOn8Uks/Ywry3oa0X4xov9i1/PMVoA3PjZIo9UlnrWZmvVFO7mdZ3j8jny/b+HzZlvfPJ7XvZ2RIHACADCBgAwCQAUUM2N9NuwItkPfPyOfLNj5f
tuX980lt+hkLdw4bAIAsKmIPGwCAzCFgAwCQAYUK2Gb2aTP7NzN7xcz+Mu36xMnM/tbMDpnZT9OuSxLMbIqZPW1mPzezl83sS2nXKW5mNtzMXjCzl/o/41fTrlPczGyImf2zmT2Sdl2SYGavmtm/mNmLZhbD/efai5mNMbO/N7N/7f9b/MO06xQXM/tIf7uVHsfNbFna9SpXmHPYZjZE0v8n6VOS9kn6J0mLnXM/S7ViMTGz2ZLekvTfnHPnpV2fuJnZ+yW93zm3y8xGStop6cq8tJ8kmbc6xNnOubfMrFPSdklfcs49l3LVYmNmyyX1SBrlnFuQdn3iZmavSupxzuVy4RQzu0fSj51zd5vZUEkjnHO5u+t8f7x4XdJM51wrF/YKVaQe9kWSXnHO7XHOvStpk6TPpFyn2DjnnpF0JO16JMU594Zzblf/6xOSfi5pUrq1ipfzvNX/tl2ok/MAAAJgSURBVLP/kZtf1GY2WdLlku5Ouy6on5mNkjRb0jpJcs69m8dg3e+Tkn7RTsFaKlbAniTptbL3+5Sz//CLwsw+KOn3JT2fbk3i1z9k/KKkQ5J+5JzL02f8pqQvS/ofaVckQU7SFjPbaWZL0q5MzKZJOixpff9pjbvN7Oy0K5WQRZI2pl2JakUK2H6L0eam91IUZvY+SQ9IWuacO552feLmnDvjnLtA0mRJF5lZLk5vmNkCSYecczvTrkvCZjnnLpQ0T9J/6j9VlRcdki6U9DfOud+X9LakXM0FkqT+of4rJH0/7bpUK1LA3idpStn7yZL2p1QXNKD/vO4Dku51zv1D2vVJUv9Q41ZJn065KnGZJemK/nO8myRdamZ/l26V4uec29//fEjSg/JOxeXFPkn7ykZ9/l5eAM+beZJ2OecOpl2RakUK2P8k6cNmNrX/F9QiSZtTrhMi6p+QtU7Sz51zq9OuTxLMrNvMxvS/PkvSXEn/mm6t4uGcu9k5N9k590F5f3tPOec+m3K1YmVmZ/dPiFT/UPEfS8rNVRvOuQOSXjOzj/Rv+qSk3Ez6LLNYbTgcLrXH7TVbwjl32sxukPS4pCGS/tY593LK1YqNmW2UNEfSeDPbJ+krzrl16dYqVrMkXSPpX/rP8UrSSufcP6ZYp7i9X9I9/TNUf0fS/c65XF7+lFMTJT3YfyvIDknfc849lm6VYvdFSff2d3r2SLo+5frEysxGyLuS6D+mXRc/hbmsCwCALCvSkDgAAJlFwAYAIAMI2AAAZAABGwCADCBgAwCQAQRsAAAygIANAEAG/P+uMuaa/akHvAAAAABJRU5ErkJggg==", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "plot_NQueens(ucs)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "`depth_first_tree_search` is almost 20 times faster than `breadth_first_tree_search` and more than 200 times faster than `uniform_cost_search`." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can also solve this problem using `astar_search` with a suitable heuristic function. \n", + "
    \n", + "A good heuristic function for this problem is one that returns the number of conflicting pairs of queens in the current state." ] }, { "cell_type": "code", "execution_count": 91, "metadata": {}, "outputs": [ { "data": { "text/html": [ "\n", "\n", "\n", "\n", " \n", " \n", "\n", "\n", "

    \n", + "\n", + "
+       "        def h(self, node):\n",
    +       "        """Return number of conflicting queens for a given node"""\n",
    +       "        num_conflicts = 0\n",
    +       "        for (r1, c1) in enumerate(node.state):\n",
    +       "            for (r2, c2) in enumerate(node.state):\n",
    +       "                if (r1, c1) != (r2, c2):\n",
    +       "                    num_conflicts += self.conflict(r1, c1, r2, c2)\n",
    +       "\n",
    +       "        return num_conflicts\n",
    +       "
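The `h` method above can be reproduced outside the notebook. The following is a minimal standalone sketch, not the repository code: the `conflict` test (in the repository a method of `NQueensProblem`) is inlined here as a free function, and `state[r]` is assumed to hold the column of the queen in row `r`, with -1 marking an unplaced queen.

```python
def conflict(r1, c1, r2, c2):
    """Two queens attack each other if they share a column or a diagonal."""
    return (c1 == c2 or
            r1 - c1 == r2 - c2 or
            r1 + c1 == r2 + c2)

def num_conflicts(state):
    """Count ordered pairs of attacking queens, mirroring NQueensProblem.h.

    Each attacking pair is counted twice because the pairs are ordered,
    exactly as in the double loop of the h method above.
    """
    placed = [(r, c) for r, c in enumerate(state) if c != -1]
    return sum(conflict(r1, c1, r2, c2)
               for r1, c1 in placed
               for r2, c2 in placed
               if (r1, c1) != (r2, c2))

# Queens at (0, 0) and (1, 1) share a diagonal: one attacking pair,
# counted twice because pairs are ordered.
print(num_conflicts([0, 1, -1, -1]))  # 2
print(num_conflicts([1, 3, 0, 2]))    # 0 -> a 4-queens solution
```

A state is a goal exactly when this count is zero, which is what makes it usable both as an A* heuristic and as the objective in the local-search formulations.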
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(NQueensProblem.h)" + ] + }, + { + "cell_type": "code", + "execution_count": 92, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "8.85 ms ± 424 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)\n" + ] + } + ], + "source": [ + "%%timeit\n", + "astar_search(nqp)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "`astar_search` is faster than both `uniform_cost_search` and `breadth_first_tree_search`." + ] + }, + { + "cell_type": "code", + "execution_count": 93, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "astar = astar_search(nqp).solution()" + ] + }, + { + "cell_type": "code", + "execution_count": 94, + "metadata": { + "scrolled": false + }, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAewAAAHwCAYAAABkPlyAAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4wLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvpW3flQAAIABJREFUeJzt3X+4FdWd7/nP93IOIIZfBw6YAGOgkyczHSO2nBa7iQwxpA0IRmd6umGMXs1kuJO5hqDY6Zbn6Scmz41mVCB07OncXGnw3jagaduI2lGiEQwYtQ+00jHpnseAiYj8OMIJKCYCd80fdbZn/6iqXWfvql27qt6v59nP3rtq1Vpr73Xgu9eqVavMOScAANDe/l3aFQAAAPURsAEAyAACNgAAGUDABgAgAwjYAABkAAEbAIAMIGADAJABBGwAADKAgA20GTP7oJn9o5kdM7ODZna3mXWEpB9nZn8zkPakmf2Lmf37VtYZQPII2ED7+X8lHZb0fkkXSPqfJf3ffgnNbLikJyWdK+kPJI2V9GeS7jCz5S2pLYCWIGAD7We6pAecc79xzh2U9LikjwakvUbS/yDpf3PO7XPOnXLOPS5puaT/ZGajJcnMnJl9qHSQmW00s/9U9n6Rmb1oZv1m9qyZnV+27wNm9qCZHTGzfeU/BMzsVjN7wMz+q5mdMLOXzaynbP+fm9nrA/v+zcw+Gc9XBBQPARtoP+skLTGzUWY2RdICeUHbz6ck/cA593bV9gcljZJ0cb3CzOxCSX8r6T9ImiDpP0vaYmYjzOzfSXpE0kuSpkj6pKQVZnZZWRZXSNosaZykLZLuHsj3I5JukPT7zrnRki6T9Gq9+gDwR8AG2s92eT3q45L2S+qV9P2AtBMlvVG90Tl3WlKfpO4I5f2fkv6zc+5559wZ59y9kn4rL9j/vqRu59zXnHPvOuf2SvovkpaUHb/DOfePzrkzkv6bpJkD289IGiHpd82s0zn3qnPuFxHqA8AHARtoIwM92ick/YOks+UF5PGS/p+AQ/rk
neuuzqdj4NgjEYo9V9LKgeHwfjPrlzRN0gcG9n2gat8qSZPLjj9Y9vqkpJFm1uGce0XSCkm3SjpsZpvN7AMR6gPABwEbaC9d8oLl3c653zrn3pS0QdLCgPRPSlpgZmdXbf9fJZ2S9MLA+5PyhshLzil7/ZqkrzvnxpU9RjnnNg3s21e1b7RzLqg+FZxz33XOfVxe4HcK/uEBoA4CNtBGnHN9kvZJ+oKZdZjZOEn/Xt45ZD//Td6w+fcGLgfrHDi//FeS7nDO/Xog3YuS/nczG2Zmn5Y387zkv0j6v8xstnnONrPLByasvSDp+MDksbMGjj/PzH6/3mcxs4+Y2aVmNkLSbyS9I2+YHEADCNhA+/lfJH1a3nD2K5JOS7rRL6Fz7reS5svrCT8vLyg+Lumbkr5alvRLkhZL6pd0tcrOiTvneuWdx75b0rGBMq8b2Hdm4LgL5P2Q6JN0j7zLx+oZIekbA8cclDRJ3nA6gAaYcy7tOgCIiZl1SvqBpNclXef4Bw7kBj1sIEecc6fknb/+haSPpFwdADGihw0AQAbQwwYAIAMCbyjQKhMnTnQf/OAH065GYnbt2pV2FRI1a9astKuQONow22i/7Mt7G0rqc87VXeQo9SHxnp4e19vbm2odkmRmaVchUWn//bRCXG3oYvgzH1ylOz55b0P+DWZf3ttQ0i7nXN1/3QyJAwm6+RovUMcRrKXBvG66Op78AGQHARtIQNcYL7De+aVk8l99o5f/pK5k8gfQflI/hw3kTVy96SgObfWekxgqB9Be6GEDMWplsG6HcgG0DgEbiMFvnk0/aLpe6U8/lW4dACSHgA00yfVKI4Y3n88NdzSfx+bb0//hACAZnMMGmvDOzubzKD///NcPeM/NBt3fPCuN/MPm8gDQXuhhA00YOaJ+mu750n0/8N8XNFms2UlkcfT4AbQXAjbQoHq9YOvxHn390mf/svkgXMqv9DjvT5qrH4BsIWADDagXDL91v//2RoO233Ev761/HEEbyA8CNjBE3REWK1l+Z/L1kKL9AJgwNvl6AEgeARsYosNb48srqAccZ8+476n48gKQHmaJA0PwZ9cMvvbr3ZYCreuNPvzteqUTJ6Uxc6Xjz0ijR0Wvz4avRKvPiqXSNzdFzxdA+6GHDQzBHQNrgwcF4/2HB1/PmVm7P6jnXArSQcE66LjrFnvPvzrov79Uz7Ur/fcDyA4CNhCjaQsHX+9YXxlow4a5P3yV9zzh0uA01XmVvz930dDqCSB7CNhARM2eV379cPC+V17zno8eD04Tti8KZowD2UbABmK0cE7wvqkLg/dFEdb7XnRJc3kDaH8EbKABJwOWJH1sXWvrUfLIWv/t7zzb2noASA4BG4hg8oTK92eN8IaYzypbmjTKkPPGRxor/+Ht9dOUlz9qpPd+ZNUSpRPHNVY+gPQRsIEIDj7hv/3kTunU897rKJdxXf/V2m2nz1S+7+uvTXNlhFnepfL7t0lv7/BPc+TJ+vkAaE8EbKBJHcOaO374xZXvu+c3l9/Y9zV3PID2RMAGYhSll71kVeV758LTf+5r8ZQLINsI2ECL3T/EpU03bEmmHgCyJZGAbWafNrN/M7NXzOwvkigDaKWb1kRP2+re7lDKG8rnANBeYg/YZjZM0l9LWiDpdyUtNbPfjbscoJXW3BRvfl+4PVq6uO/6FffnANA6SfSwL5L0inNur3PuXUmbJX0mgXKAtrVoRfj+bz/oPW/f7b9/yzPec9B9tUuqZ49fe3n9ugHIpiQC9hRJr5W93z+w7T1mtszMes2s98iRIwlUAWit6R+ofP9YwGVV1eYt89/+mYg94errs+/1uWwMQD4kEbDNZ1vFPFjn3Heccz3OuZ7u7u4EqgC01o/vqd22YHn4MV0hS41K0vhPhO9fsTp8P4B8SSJg75c0rez9VEkHEigHaJmJnwzfP2VS7bbH6ywLeqzOzTz6T4TvX9fA/a3D1iMH0N6SCNj/JOnDZjbdzIZLWiKJ
C1OQaW/+urHjkpoxftXNjR3X7B2/AKSnI+4MnXOnzewGSU9IGibpb51zL8ddDlBk39+Wdg0AtFrsAVuSnHP/KOkfk8gbaFeTu6RDR9Mrf/Z56ZUNIHmsdAZEVG94++AQVzAr97EPSfMvkn5nauN5PLcxfD/LlwLZlkgPGygq1xscGBfOae5+2ZfdIG19LrhcAPlGwAaGYOVaafWN4Wn6t0nj5nmvD22VJnVV7r/uVuneR6OXOWemtGO99MTdg9v2HZBmXOG9jtKz/2LMK6YBaD1z9W4VlLCenh7X25vf7oGZ32Xp+ZH2308rVLdhlN6s9Qym27xVWroqPP1QfPfr0tLLasupV58geW9D/g1mX97bUNIu51zdk1YE7ITl/Q8t7b+fVqhuw4njpCNPRjgu4jnjxXOl6xdL82ZJx05IP9kj3bZB+tne+sdGCdYTLg2/nCvvbci/wezLexsqYsBmSBwYor7+xo/dssYL0EHGj5FmTJGuXlC5fceL0iWfb6xMrr0G8oGADTQgylB0aQJaZ4f0btVksaHM2Ha90scvGCyvc7Z0+kzzQ+EAsoWADTQo6vnjUrBuNHiWH3fmBenU89HyIlgD+cJ12EATltxSP431BAfPW5dJx572An/pcXKnt93PsIuiBeI//nL9NACyhUlnCcv7ZIm0/35aoV4bBvWyqwPrlfOkh+5qvB5LV3kzzhspO0ze25B/g9mX9zYUk86A1rAe6e0d0qiRtfv6npImjK3cNnqu9NbJ6Pl3jZHe/JG06TbvIUnf2Cjdcndt2iW3SPf/MHreALKDgA3E4OyPe8/VPd6OYdL0K6RXm7jB7NHjlT3mXz5a29OWOGcN5B3nsIEYlQdN1ys9vL25YO3n3EXeddvlPw4I1kD+0cMGYmY90vjR0tGnpWsv9x5J6Z7f3HXhALKDHjaQgGMnvMC9YnUy+S+/08ufYA0UBz1sIEHrNnkPKZ47ajH0DRQXPWygRUrXY1vP4N28yq1cW7vtnMsqjwNQXPSwgRT8+i3/ALzmvtbXBUA20MMGACADCNgAAGQAARsAgAwgYAMAkAGp3/zDzHK9cn3a32/SCrAoP22YcbRf9hWgDbn5R66dOSa92FWxaeVaafWNVenOPyB1vr919QIAJIIedsJi/X53xfBLela8Xze/7rMv721I+2VfAdowUg+bc9jt7tCdXqCOI1hLg3kdSmjNTABAIuhhJ6zh7/fUm9KeifFWxs/5B6XOyQ0fzq/77Mt7G9J+2VeANuQcdmbF1ZuOYs853nPMQ+UAgHgxJN5uWhms26FcAEAkBOx2sXtE+kFzl0lHN6dbBwCALwJ2O9hlknu36WxuuCOGuuxbmv4PBwBADSadJazu97t7pOR+21QZfnd9avreyzZcurB+vZjwkn15b0PaL/sK0IZc1pUJEYJ193zpvh/47wu6R3LT906OoccPAIgPPeyEhX6/dYaeo/ScwwJzvbQfnSH99IHQKtSdPc6v++zLexvSftlXgDakh93W6gTrb93vv73RnrPfcS/vjXAg57MBoC0QsNNw+nDdJMvvbEE9FPEHwOm+xOsBAAhHwE7DS42vLFYtaHJZ05POyr3UHWNmAIBGsNJZq70xeO1V2Dlq1xt9+Nv1SidOSmPmSsefkUaPil6dDV8ZfB16zvzgWumc6luBAQBahR52qx34c0nBwXh/2Wj5nJm1+4N6zqUgHRSsg467brH3/KuD/vvfq+frN/knAAC0BAG7zUxbOPh6x/rKQBs2zP3hq7znCZcGp6nOq/z9uYuGVk8AQGsRsFupyRnXr4fMVXvlNe/56PHgNGH7ImHGOACkhoDdZhbOCd43dWHwvijCet+LLmkubwBAsgjYKTm503/7Y+taW4+SR9b6b3/n2dbWAwDgj4DdKqcqZ3WdNcI7h3zWiMFtUS7F2vhIY8U/vL1+mvLyR4303o8cXpXo1JHGKgAAaApLkybsve835Pzv6TNS5+yB9D5Bu3pGeXWa8uMl
6ciT0sRxQ8ujPE3/Nmns+wKrW7FcKcsiZl/e25D2y74CtCFLk2ZFx7Dmjh9+ceX77vnN5RcarAEAqSBgt5koi6UsWVX5vt6Pz899LZ5yAQDpiT1gm9nfmtlhM/tp3HnDc//WoaXfsCWZegAAWieJHvZGSZ9OIN9Mu2lN9LSt7u0OpbyhfA4AQHxiD9jOuWckHY0736xbE/PKnl+4PVq6uO/6FffnAABEwznsNrVoRfj+bz/oPW/f7b9/yzPec9B9tUuuXFn5/trL69cNANB6qQRsM1tmZr1mFudNIDNt+gcq3z+2I9px85b5b/9MxJ5w9fXZ93412nEAgNZKJWA7577jnOuJct1ZUfz4ntptC5aHH9MVstSoJI3/RPj+FavD9wMA2gdD4q0yM3yFsCmTarc9XmdZ0GN1bubRfyJ8/7pN4ft9nd/XwEEAgGYlcVnXJkk/kfQRM9tvZv9H3GVkUsfEhg5Lasb4VTc3eGDnhFjrAQCIpiPuDJ1zS+POE/H7/ra0awAAGAqGxNvI5K50y599XrrlAwCCcfOPhNV8vyE3AZEaHwL/2Ie8gL/vgPSL/Y3lUfduYbNqm4obD2Rf3tuQ9su+ArRhpJt/xD4kjua43uCgvXBOc/fLvuwGaetzweUCANoXAbvVpt4l7Q+f8dW/TRo3z3t9aKs0qWqo/LpbpXsfjV7knJnSjvXSE3cPbtt3QJpxhff6YJS1yaf9VfQCAQCxY0g8Yb7fb51hccnrZZd6vZu3SktXhacfiu9+XVp6WW05oXyGwyWG4/Ig721I+2VfAdow0pA4ATthvt/vqSPSHp8Lr6tEPZ+9eK50/WJp3izp2AnpJ3uk2zZIP9sboX5RgvX5fYGXc/GfRfblvQ1pv+wrQBtyDrttdXY3fOiWNV6ADjJ+jDRjinT1gsrtO16ULvl8g4Vy7TUApI4edsJCv9+IQ+OdHdK7z9Vuj1yHql5052zp9JnmhsLfqwe/7jMv721I+2VfAdqQHnbbm+UiBe1SsG70kq/y4868IJ16PmJedYI1AKB1WDglbdPrL+htPcEB9tZl0rGnvd5y6XFyp7fdz7CLIgbr6d+LkAgA0CoMiScs0vcb0MuuDqxXzpMeuqvxuixd5c04Lxc4LB6xd81wXPblvQ1pv+wrQBsyS7wdRP5+d4+S3DsVm6xH6ntKmjC2MunoudJbJ6PXoWuM9OaPKrd9Y6N0y90+AXv6JqlrSeS8+c8i+/LehrRf9hWgDTmHnSkXDkTgqt52xzBp+hXSqwcaz/ro8cre+i8fre1pS+KcNQC0Mc5ht5uyoOl6pYe3Nxes/Zy7yLtuu6J3TbAGgLbGkHjCGv5+Tx2V9rTg+ufzDzd1XTjDcdmX9zak/bKvAG0YaUicHna76uzyer3T1iaT/7R1Xv5NBGsAQOvQw05YrN9vhGu264p56Jtf99mX9zak/bKvAG1IDzt3ZrnBx8xjNbtX+nXGz3+j8jgAQCbRw05Y2t9v0vh1n315b0PaL/sK0Ib0sAEAyAsCNgAAGUDABgAgA1Jf6WzWrFnq7Y1yn8dsyvv5pbyfW5Jow6yj/bIv720YFT1sAAAyIPUeNgCgjbTheg/w0MMGgKI7dKcXqOMI1tJgXodWx5MfJBGwAaC4Tr3pBdb9X04m//03e/mfOpRM/gXDkDgAFFFcveko9pzjPTNU3hR62ABQNK0M1u1Qbk4QsAGgKHaPSD9o7jLp6OZ065BRBGwAKIJdJrl3m87mhjtiqMu+pen/cMggzmEDQN7tHtl0FlZ2a4q/fsB7ds2uebV7hHThb5vMpDjoYQNA3rn6QbF7vnTfD/z3WcB9pIK2RxZDj79ICNgAkGd1hp6tx3v09Uuf/cvmg3Apv9LjvD9prn4YRMAGgLyqEwy/db//9kaDtt9xL++NcCBBOxICNgDk0enDdZMsv7MF9VDEHwCn+xKvR9YRsAEgj16aHFtWQZPLmp50Vu6l7hgzyydmiQNA3rwxeO2VX++2FGhd
b/Thb9crnTgpjZkrHX9GGj0qenU2fGXwdVh9dHCtdM6N0TMuGHrYAJA3B/5cUnAw3l82Wj5nZu3+oJ5zKUgHBeug465b7D3/6qD//vfq+fpN/gkgiYANAIUzbeHg6x3rKwNt2DD3h6/ynidcGpymOq/y9+cuGlo9UYmADQB50uSM69dD5qq98pr3fPR4cJqwfZEwYzwQARsACmbhnOB9UxcG74sirPe96JLm8i46AjYA5NTJnf7bH1vX2nqUPLLWf/s7z7a2HllFwAaAvDhVOavrrBHeOeSzRgxui3Ip1sZHGiv+4e3105SXP2qk937k8KpEp440VoGcI2ADQF7seb/v5pM7pVPPe6+jXMZ1/Vdrt50+U/m+r782zZUr6+ddKr9/m/T2joBEeybVz6iACNgAUAAdw5o7fvjFle+75zeX39j3NXd8ERGwAaBgovSyl6yqfO9cePrPfS2echGMgA0AqHH/1qGl37AlmXpgUOwB28ymmdnTZvZzM3vZzL4UdxkAgFo3rYmettW93aGUN5TPUSRJ9LBPS1rpnPufJF0s6T+a2e8mUA4AoMyamFf2/MLt0dLFfdevuD9HXsQesJ1zbzjndg+8PiHp55KmxF0OAKA5i1aE7//2g97z9t3++7c84z0H3Ve7pHr2+LWX168baiV6DtvMPijp9yQ9X7V9mZn1mlnvkSNcbwcArTD9A5XvHwu6rKrKvGX+2z8TsSdcfX32vT6XjaG+xAK2mb1P0oOSVjjnKlaXdc59xznX45zr6e7mHqgA0Ao/vqd224Ll4cd0hSw1KknjPxG+f8Xq8P2ILpGAbWad8oL1fc65f0iiDABAlZnhI5ZTfNYjebzOsqDH6tzMo/9E+P51m8L3+zq/r4GD8i+JWeImab2knzvnmOsHAK3SMbGhw5KaMX7VzQ0e2Dkh1nrkRRI97DmSrpF0qZm9OPBo8v4vAICs+f62tGuQLx1xZ+ic2yGJG5oCQBua3CUdOppe+bPPS6/srGOlMwDIk1nha4geHOIKZuU+9iFp/kXS70xtPI/nNtZJUKf+RRZ7DxsA0N5cb/B564Vzmrtf9mU3SFufCy4XjSNgA0DeTL1L2h8+46t/mzRunvf60FZpUlfl/utule59NHqRc2ZKO9ZLT9w9uG3fAWnGFd7rSD37aX8VvcACYkgcAPJmcv0bU5dub+l6vWC9eavX6y49hhKsJWnnS5XHb3rCW6il1Kue3BV+vCRp0heHVmjBmKt3z7SE9fT0uN7e/I6TeFe55Vfafz+tQBtmW2Hb79QRaY/PhddVol7StXiudP1iad4s6dgJ6Sd7pNs2SD/bG6GOUf6LP78v8HKuvLehpF3OubotwZA4AORRZ+OrSG5Z4wXoIOPHSDOmSFcvqNy+40Xpks83WCjXXtdFwAaAvJrlpF3hvdPSBLTODundqsliQ1lQxfVKH79gsDfdOVs6fSZi75qZ4ZEQsAEgzyIEbWkwWDe66ln5cWdekE49HzEvgnVkTDoDgLybXn9B79JkMT+3LpOOPe31lkuPkzu97X6GXRQxWE//XoREKGHSWcLyPlki7b+fVqANs432GxDQy64OrFfOkx66q/H6LF3lzTgvFzgsHrF3nfc2FJPOAADvmeWk3aMk907Nrr6npAljK7eNniu9dTJ69l1jpDd/JG26zXtI0jc2Srfc7ZN4+iapa0n0zCGJgA0AxXHhQASu6m13DJOmXyG9eqDxrI8er+yt//LR2p62JM5ZN4Fz2ABQNGVB0/VKD29vLlj7OXeRd912xXA4wbop9LABoIhmOenUUWnPBF17uXTt5QmWdf7hpq4Lh4ceNgAUVWeXF7inrU0m/2nrvPwJ1rGghw0ARTdphfeQIl2zXRdD34mghw0AGDTLDT5mHqvZvdKvM37+G5XHIRH0sAEA/jrG1QTg1X+XUl1ADxsAgCwgYAMAkAEEbAAAMoCADQBABqR+8w8zy/WUwrS/36QVYFF+2jDjaL/sK0AbRrr5
Bz1stKVxoytv5ed6pZuurt12zoS0awoArUEPO2Fpf79Ji/PXfeAt+IYg0j14h4g2zDbaL/sK0Ib0sNH+br5msLcch/LeOADkCT3shKX9/Sat0V/3pXvnJm3yH0mHjzaXB22YbbRf9hWgDSP1sFnpDC0XV286ikMD9+NNYqgcAFqJIXG0VCuDdTuUCwBxIWCjJX7zbPpB0/VKf/qpdOsAAI0iYCNxrlcaMbz5fG64o/k8Nt+e/g8HAGgEk84Slvb3m7R6E17e2SmNHNFkGT7nn5sNur99Vxr5h9HSFr0Ns472y74CtCGXdSF9UYJ193zpvh/47wuaLNbsJLI4evwA0Er0sBOW9vebtLBf9/V6wVF6zmGBuV7aj86QfvrA0OtQU06B2zAPaL/sK0Ab0sNGeuoF62/d77+90Z6z33Ev761/HOezAWQFARux6+6qn2b5ncnXQ4r2A2DC2OTrAQDNImAjdoe3xpdXUA84zp5x31Px5QUASWGlM8Tqz64ZfB12jtr1Rh/+dr3SiZPSmLnS8Wek0aOi12fDV6LVZ8VS6ZuboucLAK1GDxuxuuNL3nNQMN5/ePD1nJm1+4N6zqUgHRSsg467brH3/KuD/vtL9Vy70n8/ALQLAjZaatrCwdc71lcG2rBh7g9f5T1PuDQ4TXVe5e/PXTS0egJAuyFgIzbNnld+/XDwvlde856PHg9OE7YvCmaMA2hnBGy01MI5wfumLgzeF0VY73vRJc3lDQBpI2AjESd3+m9/bF1r61HyyFr/7e8829p6AECjCNiIxeQJle/PGuENMZ9VtjRplCHnjY80Vv7D2+unKS9/1Ejv/ciqJUonjmusfABIGkuTJizt7zdppWURw4Lx6TNS52wFpqueUV6dpvx4STryZG1grZdHeZr+bdLY9wXXtyavgrRhXtF+2VeANmRpUrSHjmHNHT/84sr33fObyy8sWANAuyJgo6WiLJayZFXl+3o/rj/3tXjKBYB2FnvANrORZvaCmb1kZi+b2VfjLgP5dv8QlzbdsCWZegBAO0mih/1bSZc652ZKukDSp83s4jrHIONuWhM9bat7u0MpbyifAwBaKfaA7TxvDbztHHjke8YAtOamePP7wu3R0sV916+4PwcAxCWRc9hmNszMXpR0WNIPnXPPV+1fZma9ZsbaUgW1aEX4/m8/6D1v3+2/f8sz3nPQfbVLrqxaI/zay+vXDQDaUaKXdZnZOEkPSfqic+6nAWly3fsuwOUIkupfYz3jCmnfgcptpWOChqzr3dErbH9Q3lGuBeeyrnyh/bKvAG2Y/mVdzrl+SdskfTrJctD+fnxP7bYFy8OP6QpZalSSxn8ifP+K1eH7ASBLkpgl3j3Qs5aZnSVpvqR/jbsctJeJnwzfP2VS7bbH6ywLeqzOzTz6T4TvX9fA/a3D1iMHgDR1JJDn+yXda2bD5P0geMA592gC5aCNvPnrxo5Lasb4VTc3dlyzd/wCgKTEHrCdc3sk/V7c+QJD8f1tadcAAOLFSmdomcld6ZY/+7x0yweAZnDzj4Sl/f0mrXqGar1Z2I0OgX/sQ17A33dA+sX+xvJotG5Fa8O8of2yrwBtGGmWeBLnsIFAYZdiLZzT3P2yL7tB2vpccLkAkGUEbMRq5Vpp9Y3hafq3SePmea8PbZUmVQ2VX3erdO8QpinOmSntWC89cffgtn0HvGu/JelghLXJvxjzimkAEDeGxBOW9vebNL/huKiLk5TSbd4qLV0Vnn4ovvt1aellteXUq0+QIrZhntB+2VeANow0JE7ATlja32/S/P6zmDhOOvJkhGMjns9ePFe6frE0b5Z07IT0kz3SbRukn+2tf2yUYD3h0vDLuYrYhnlC+2VfAdqQc9hIR19/48duWeMF6CDjx0gzpkhXL6jcvuNF6ZLPN1Ym114DyAJ62AlL+/tNWtiv+6hD0Z0d0rvP1W6PqrqcztnS6TPND4W/l3+B2zAPaL/sK0Ab0sNGuqKePy4F60Yv+So/
7swL0qnno+XV6vtyA0AzWDgFiVpyS/001hMcPG9dJh172gv8pcfJnd52P8MuihaI//jL9dOjhkQyAAAgAElEQVQAQDthSDxhaX+/SYsyHBfUy64OrFfOkx66q/G6LF3lzThvpOwwtGG20X7ZV4A2ZJZ4O0j7+01a1P8s3t4hjRpZdWyP1PeUNGFs5fbRc6W3TkavQ9cY6c0fVW77xkbplrtrA/aSW6T7fxg9b4k2zDraL/sK0Iacw0b7OPvj3nN1AO0YJk2/Qnr1QON5Hz1e2WP+5aO1PW2Jc9YAso1z2Gip8qDpeqWHtzcXrP2cu8i7brv8xwHBGkDWMSSesLS/36Q1Ohw3frR09OmYK+Oje35z14VLtGHW0X7ZV4A2jDQkTg8bqTh2wuv1rlidTP7L7xw4R95ksAaAdkEPO2Fpf79Ji/PXfRx31Epi6Js2zDbaL/sK0Ib0sJEtpeuxrWfwbl7lVq6t3XbOZZXHAUBe0cNOWNrfb9L4dZ99eW9D2i/7CtCG9LABAMgLAjYAABlAwAYAIANSX+ls1qxZ6u2NYXpwm8r7+aW8n1uSaMOso/2yL+9tGBU9bAAAMiD1HjYAZEW7rhWAYqCHDQAhbr5m8F7scSjlddPV8eSH4iBgA4CPrjFeYL3zS8nkv/pGL/9JXcnkj/xhSBwAqsTVm47i0MCtYBkqRz30sAGgTCuDdTuUi+wgYAOApN88m37QdL3Sn34q3TqgfRGwARSe65VGDG8+nxvuaD6Pzben/8MB7Ylz2AAK7Z2dzedRfv75rx/wnpsNur95Vhr5h83lgXyhhw2g0EaOqJ+me7503w/89wVNFmt2ElkcPX7kCwEbQGHV6wWX7rPe1y999i+bD8Ll9263Hum8P2mufigWAjaAQqoXDL91v//2RoO233Ev761/HEEbJQRsAIXTHWGxkuV3Jl8PKdoPgAljk68H2h8BG0DhHN4aX15BPeA4e8Z9T8WXF7KLWeIACuXPrhl87de7LQVa1xt9+Nv1SidOSmPmSsefkUaPil6fDV+JVp8VS6VvboqeL/KHHjaAQrljYG3woGC8//Dg6zkza/cH9ZxLQTooWAcdd91i7/lXB/33l+q5dqX/fhQHARsAykxbOPh6x/rKQBs2zP3hq7znCZcGp6nOq/z9uYuGVk8UDwEbQGE0e1759cPB+155zXs+ejw4Tdi+KJgxXmwEbAAos3BO8L6pC4P3RRHW+150SXN5I/8I2AAK6WTAkqSPrWttPUoeWeu//Z1nW1sPtC8CNoBCmDyh8v1ZI7wh5rPKliaNMuS88ZHGyn94e/005eWPGum9H1m1ROnEcY2Vj+wjYAMohINP+G8/uVM69bz3OsplXNd/tXbb6TOV7/v6a9NcGWGWd6n8/m3S2zv80xx5sn4+yCcCNoDC6xjW3PHDL6583z2/ufzGvq+545FPBGwAKBOll71kVeV758LTf+5r8ZSLYkskYJvZMDP7ZzN7NIn8ASBN9w9xadMNW5KpB4olqR72lyT9PKG8AWDIbloTPW2re7tDKW8onwP5EnvANrOpki6XdE/ceQNAo9bcFG9+X7g9Wrq47/oV9+dAdiTRw/6mpC9L+u9BCcxsmZn1mlnvkSNHEqgCADRn0Yrw/d9+0Hvevtt//5ZnvOeg+2qXVM8ev/by+nVDMcUasM1skaTDzrldYemcc99xzvU453q6u7vjrAIANGT6ByrfPxZwWVW1ecv8t38mYk+4+vrse30uGwOk+HvYcyRdYWavStos6VIz+7uYywCA2P3Y5yTeguXhx3SFLDUqSeM/Eb5/xerw/UC5WAO2c+4W59xU59wHJS2R9CPn3GfjLAMAGjHxk+H7p0yq3fZ4nWVBj9W5mUf/ifD96xq4v3XYeuTIN67DBlAIb/66seOSmjF+1c2NHdfsHb+QXR1JZeyc2yZpW1L5A0CWfX9b2jVA1tDDBoABk7vSLX/2eemWj/ZGwAZQGPWGtw8OcQWzch/7kDT/Iul3
pjaex3Mbw/ezfGmxJTYkDgBZ5HqDA+PCOc3dL/uyG6StzwWXC4QhYAMolJVrpdU3hqfp3yaNm+e9PrRVmlQ1VH7drdK9Q7hTwpyZ0o710hN3D27bd0CacYX3OkrP/osxr5iG7DFX7zYzCevp6XG9vfn9aWlmaVchUWn//bQCbZhtfu0XpTdrPYPpNm+Vlq4KTz8U3/26tPSy2nLq1cdP3ttPyv+/QUm7nHN1T3gQsBOW9z+0tP9+WoE2zDa/9ps4TjryZIRjI54zXjxXun6xNG+WdOyE9JM90m0bpJ/trX9slGA94dLgy7ny3n5S/v8NKmLAZkgcQOH09Td+7JY1XoAOMn6MNGOKdPWCyu07XpQu+XxjZXLtNSQCNoCCijIUXZqA1tkhvVs1WWwoM7Zdr/TxCwbL65wtnT7T3FA4ioeADaCwop4/LgXrRoNn+XFnXpBOPR8tL4I1ynEdNoBCW3JL/TTWExw8b10mHXvaC/ylx8md3nY/wy6KFoj/+Mv106BYmHSWsLxPlkj776cVaMNsi9J+Qb3s6sB65Tzpobsar8vSVd6M80bKDpL39pPy/29QTDoDgGisR3p7hzRqZO2+vqekCWMrt42eK711Mnr+XWOkN38kbbrNe0jSNzZKt9xdm3bJLdL9P4yeN4qDgA0Aks7+uPdc3ePtGCZNv0J69UDjeR89Xtlj/uWjtT1tiXPWCMc5bAAoUx40Xa/08PbmgrWfcxd5122X/zggWKMeetgAUMV6pPGjpaNPS9de7j2S0j2/uevCURz0sAHAx7ETXuBesTqZ/Jff6eVPsEZU9LABIMS6Td5DiueOWgx9o1H0sAEgotL12NYzeDevcivX1m4757LK44BG0cMGgAb8+i3/ALzmvtbXBcVADxsAgAwgYAMAkAEEbAAAMiD1tcTNLNcL4ab9/SatAGv80oYZR/tlXwHaMNJa4vSwAQDIAGaJAwCKY1cMIxKz0unx08MGAOTboTu9QB1HsJYG8zqU0DJ4ATiHnbC0v9+kcf4s+/LehrRf9jXchqfelPZMjLcyfs4/KHVObvjwqOewGRIHAORPXL3pKPac4z0nPFTOkDgAIF9aGaxbWC4BGwCQD7tHpBesS3aZdHRzIlkTsAEA2bfLJPdu09nccEcMddm3NJEfDkw6S1ja32/SmPCSfXlvQ9ov++q24e6RkvttU2X43cil6dup2nDpwvr1YuEUAEAxRAjW3fOl+37gvy/otqdN3w41hh5/OXrYCUv7+00av+6zL+9tSPtlX2gb1hl6jtJzDgvM9dJ+dIb00wdCq1B39jg9bABAvtUJ1t+63397oz1nv+Ne3hvhwJjOZxOwAQDZc/pw3STL72xBPRTxB8DpvqbLIWADALLnpcZXFqsWNLms6Uln5V7qbjoLVjoDAGTLG4PXXoWdo3a90Ye/Xa904qQ0Zq50/Blp9Kjo1dnwlcHXoefMD66VzrkxesZV6GEDALLlwJ9LCg7G+8tGy+fMrN0f1HMuBemgYB103HWLvedfHfTf/149X7/JP0FEBGwAQK5MWzj4esf6ykAbNsz94au85wmXBqepzqv8/bmLhlbPoSJgAwCyo8kZ16+HzFV75TXv+ejx4DRh+yJpov4EbABAriycE7xv6sLgfVGE9b4XXdJc3vUQsAEAmXRyp//2x9a1th4lj6z13/7Os/HkT8AGAGTDqcpZXWeN8M4hnzVicFuUS7E2PtJY8Q9vr5+mvPxRI733I4dXJTp1pKHyWZo0YWl/v0kr/LKIOZD3NqT9su+9Ngw5/3v6jNQ5eyC9T9CunlFenab8eEk68qQ0cdzQ8ihP079NGvu+wOpWLFfK0qQAgMLoGNbc8cMvrnzfPb+5/EKDdYMI2ACAXImyWMqSVZXv6w3EfO5r8ZTbjEQCtpm9amb/YmYvmlmci7sBANC0+7cOLf2GLcnUYyiS7GF/wjl3QZRxeQAA6rlpTfS0Sfd2mylvKJ+jHEPiAIBMWNPcyp41vnB7
tHRx3/Wr0c+RVMB2kraa2S4zW1a908yWmVkvw+UAgKQsWhG+/9sPes/bd/vv3/KM9xx0X+2SK1dWvr/28vp1a0Qil3WZ2QeccwfMbJKkH0r6onPumYC0ub7mgktKso82zDbaL/uiXNYlSTOukPYdqDp2oFsYNGRd745eYfuD8o50W852uazLOXdg4PmwpIckXZREOQAAlPz4ntptC5aHH9MVstSoJI3/RPj+FavD98cp9oBtZmeb2ejSa0l/JOmncZcDACiYmeErhE2ZVLvt8TrLgh6rczOP/hPh+9dtCt/v6/y+Bg6SOho6KtxkSQ8NDNN0SPquc+7xBMoBABRJx8SGDktqxvhVNzd4YOeEhg6LPWA75/ZK8rllOAAA+fH9ba0tj8u6AAC5Mbkr3fJnn5dc3tz8I2Fpf79JK9QM1ZzKexvSftlX04Z1Zos3OgT+sQ95AX/fAekX+xvLo+4M8Vm1f49RZ4kncQ4bAIDUhF2KtXBOc/fLvuwGaetzweUmiYANAMiWqXdJ+8NnfPVvk8bN814f2ipNqhoqv+5W6d5Hoxc5Z6a0Y730xN2D2/Yd8K79lqSDUdYmn/ZX0Qv0wZB4wtL+fpNWyOG4nMl7G9J+2efbhnWGxSWvl13q9W7eKi1dFZ5+KL77dWnpZbXlhPIZDpeiD4kTsBOW9vebtML+Z5EjeW9D2i/7fNvw1BFpj8+F11Wins9ePFe6frE0b5Z07IT0kz3SbRukn+2NUL8owfr8vsDLuTiHDQDIr87uhg/dssYL0EHGj5FmTJGuXlC5fceL0iWfb7DQBq+9LkcPO2Fpf79JK+yv+xzJexvSftkX2oYRh8Y7O6R3n6vdHrkOVb3oztnS6TPNDYW/Vw962ACA3JvlIgXtUrBu9JKv8uPOvCCdej5iXnWC9VCwcAoAINum11/Q23qCA+yty6RjT3u95dLj5E5vu59hF0UM1tO/FyFRdAyJJyzt7zdphR+Oy4G8tyHtl32R2jCgl10dWK+cJz10V+N1WbrKm3FeLnBYPGLvmlnibSLt7zdp/GeRfXlvQ9ov+yK34e5RknunYpP1SH1PSRPGViYdPVd662T0OnSNkd78UeW2b2yUbrnbJ2BP3yR1LYmcN+ewAQDFcuFABK7qbXcMk6ZfIb16oPGsjx6v7K3/8tHanrakWM9ZV+McNgAgX8qCpuuVHt7eXLD2c+4i77rtit51gsFaYkg8cWl/v0ljOC778t6GtF/2NdyGp45Ke5q//rmu8w83dV141CFxetgAgHzq7PJ6vdPWJpP/tHVe/k0E66Ggh52wtL/fpPHrPvvy3oa0X/bF2oYRrtmuK+ahb3rYAABUm+UGHzOP1exe6dcZP/+NyuNSQg87YWl/v0nj13325b0Nab/sK0Ab0sMGACAvCNgAAGQAARsAgAxIfaWzWbNmqbc3yv3Jsinv55fyfm5Jog2zjvbLvry3YVT0sAEAyAACNgAAGZD6kDiAHGnDRSmAvKCHDaA5h+70AnUcwVoazOvQ6njyA3KCgA2gMafe9ALr/i8nk//+m738Tx1KJn8gYxgSBzB0cfWmo9hzjvfMUDkKjh42gKFpZbBuh3KBNkHABhDN7hHpB81dJh3dnG4dgJQQsAHUt8sk927T2dxwRwx12bc0/R8OQAo4hw0g3O6RTWdhZfch+usHvGfX7AKHu0dIF/62yUyA7KCHDSCcqx8Uu+dL9/3Af58F3DQwaHtkMfT4gSwhYAMIVmfo2Xq8R1+/9Nm/bD4Il/IrPc77k+bqB+QJARuAvzrB8Fv3+29vNGj7Hffy3ggHErRREARsALVOH66bZPmdLaiHIv4AON2XeD2AtBGwAdR6aXJsWQVNLmt60lm5l7pjzAxoT8wSB1DpjcFrr/x6t6VA63qjD3+7XunESWnMXOn4M9LoUdGrs+Erg6/D6qODa6VzboyeMZAx9LABVDrw55KCg/H+stHyOTNr9wf1nEtBOihYBx133WLv+VcH/fe/V8/X
b/JPAOQEARvAkExbOPh6x/rKQBs2zP3hq7znCZcGp6nOq/z9uYuGVk8gbwjYAAY1OeP69ZC5aq+85j0fPR6cJmxfJMwYR44RsAEMycI5wfumLgzeF0VY73vRJc3lDWQdARuAr5M7/bc/tq619Sh5ZK3/9neebW09gLQQsAF4TlXO6jprhHcO+awRg9uiXIq18ZHGin94e/005eWPGum9Hzm8KtGpI41VAGhzBGwAnj3v9918cqd06nnvdZTLuK7/au2202cq3/f116a5cmX9vEvl92+T3t4RkGjPpPoZARlEwAZQV8ew5o4ffnHl++75zeU39n3NHQ9kUSIB28zGmdnfm9m/mtnPzewPkigHQOtF6WUvWVX53rnw9J/7WjzlAnmWVA97naTHnXP/o6SZkn6eUDkA2tD9W4eWfsOWZOoB5EnsAdvMxkiaK2m9JDnn3nXO+ZyxAtBObloTPW2re7tDKW8onwPIkiR62DMkHZG0wcz+2czuMbOzEygHQIzWxLyy5xduj5Yu7rt+xf05gHaRRMDukHShpL9xzv2epLcl/UV5AjNbZma9ZtZ75AiXYABZtGhF+P5vP+g9b9/tv3/LM95z0H21S6pnj197ef26AXmURMDeL2m/c27gQhD9vbwA/h7n3Heccz3OuZ7ubm6LB2TB9A9Uvn8s6LKqKvOW+W//TMSecPX12ff6XDYGFEHsAds5d1DSa2b2kYFNn5T0s7jLAdBaP76ndtuC5eHHdIUsNSpJ4z8Rvn/F6vD9QJEkdT/sL0q6z8yGS9or6fqEygEQl5lHpJeCR7ym+KxH8nidZUGP1bmZR/+J8P3rNoXv93V+XwMHAe0vkYDtnHtREldNAlnSMbGhw5KaMX7VzQ0e2Dkh1noA7YKVzgC0pe9vS7sGQHshYAOIbHJXuuXPPi/d8oE0EbABDJoVvobowSGuYFbuYx+S5l8k/c7UxvN4bmOdBHXqD2RZUpPOAOSU6w0+b71wTnP3y77sBmnrc8HlAkVGwAZQaepd0v7wGV/926Rx87zXh7ZKk6qGyq+7Vbr30ehFzpkp7VgvPXH34LZ9B6QZV3ivI/Xsp/1V9AKBDGJIHEClyfVvTF26vaXr9YL15q1er7v0GEqwlqSdL1Uev+kJb6GWUq860rnzSV8cWqFAxpird9+7hPX09Lje3vyOdZlZ2lVIVNp/P61QyDY8dUTa43PhdZWol3Qtnitdv1iaN0s6dkL6yR7ptg3Sz/ZGqF+U/x7O7wu8nKuQ7ZczeW9DSbucc3X/NTEkDqBWZ+NLBm9Z4wXoIOPHSDOmSFcvqNy+40Xpks83WCjXXqMACNgA/M1y0q7wnk1pAlpnh/Ru1WSxoSyo4nqlj18w2JvunC2dPhOxd83McBQEARtAsAhBWxoM1o2uelZ+3JkXpFPPR8yLYI0CYdIZgHDT6y/oXZos5ufWZdKxp73eculxcqe33c+wiyIG6+nfi5AIyA8mnSUs75Ml0v77aQXaUIG97OrAeuU86aG7Gq/L0lXejPNygcPiEXvXtF/25b0NxaQzALGZ5aTdoyT3Ts2uvqekCWMrt42eK711Mnr2XWOkN38kbbrNe0jSNzZKt9ztk3j6JqlrSfTMgZwgYAOI5sKBCFzV2+4YJk2/Qnr1QONZHz1e2Vv/5aO1PW1JnLNGoXEOG8DQlAVN1ys9vL25YO3n3EXeddsVw+EEaxQcPWwAQzfLSaeOSnsm6NrLpWsvT7Cs8w83dV04kBf0sAE0prPLC9zT1iaT/7R1Xv4Ea0ASPWwAzZq0wntIka7Zrouhb8AXPWwA8ZnlBh8zj9XsXunXGT//jcrjAPiihw0gGR3jagLw6r9LqS5ADtDDBgAgAwjYAABkAAEbAIAMSH0tcTPL9SyTtL/fpBVgjV/aMONov+wrQBtGWkucHjYAABmQm1nikW50X0ej9/IFACBpme5h33zN4P1141DK66ar48kPAIC4ZPIcdulWfEmb/EfS4aPN5ZH295s0
zp9lX97bkPbLvgK0YT7vhx1XbzqKQwO392OoHACQtkwNibcyWLdDuQAAlGQiYP/m2fSDpuuV/vRT6dYBAFBcbR+wXa80Ynjz+dxwR/N5bL49/R8OAIBiautJZ+/slEaOaDJ/n/PPzQbd374rjfzDaGnT/n6TxoSX7Mt7G9J+2VeANsz+wilRgnX3fOm+H/jvC5os1uwksjh6/AAADEXb9rDr9YKj9JzDAnO9tB+dIf30gaHXoaac/P8yTLsKiaMNs432y74CtGF2e9j1gvW37vff3mjP2e+4l/fWP47z2QCAVmm7gN3dVT/N8juTr4cU7QfAhLHJ1wMAgLYL2Ie3xpdXUA84zp5x31Px5QUAQJC2Wunsz64ZfB12jtr1Rh/+dr3SiZPSmLnS8Wek0aOi12fDV6LVZ8VS6ZuboucLAMBQtVUP+44vec9BwXj/4cHXc2bW7g/qOZeCdFCwDjruusXe868O+u8v1XPtSv/9AADEpa0Cdj3TFg6+3rG+MtCGDXN/+CrvecKlwWmq8yp/f+6iodUTAIC4tU3Abva88uuHg/e98pr3fPR4cJqwfVEwYxwAkKS2CdhRLJwTvG/qwuB9UYT1vhdd0lzeAAA0qy0D9smd/tsfW9faepQ8stZ/+zvPtrYeAIDiaouAPXlC5fuzRnhDzGeVLU0aZch54yONlf/w9vppyssfNdJ7P7JqidKJ4xorHwCAetpiadKwYHz6jNQ523vtl656Rnl1mvLjJenIk7WBtV4e5Wn6t0lj3xdc35q88r+kXtpVSBxtmG20X/YVoA2zuzRpuY5hzR0//OLK993zm8svLFgDAJCUtg/Y5aIslrJkVeX7ej/MPve1eMoFACBJsQdsM/uImb1Y9jhuZiviLifI/UNc2nTDlmTqAQBAnGIP2M65f3POXeCcu0DSLEknJT0UdsxNa6Ln3+re7lDKG8rnAABgKJIeEv+kpF84534ZlmjNTfEW+oXbo6WL+65fcX8OAABKkg7YSyTV3BbDzJaZWa+ZNbQ+2KI6A+zfftB73r7bf/+WZ7znoPtql1xZtUb4tZfXrxsAAElI7LIuMxsu6YCkjzrnDoWkC72sS5JmXCHtO1C5rXRM0JB1vTt6he0PyjvKteBc1pU/tGG20X7ZV4A2TP2yrgWSdocF66h+fI9P5svDj+kKWWpUksZ/Inz/itXh+wEAaKUkA/ZS+QyH+5n4yfD9UybVbnu8zrKgx+rczKP/RPj+dQ3c3zpsPXIAAJqRSMA2s1GSPiXpH6Kkf/PXDZaT0Izxq25u7Lhm7/gFAECQjiQydc6dlDShbsI29f1tadcAAIBKmVnpbHJXuuXPPi/d8gEAxdYWN/8ova43C7vRIfCPfcgL+PsOSL/Y31gejdYt7e83acxQzb68tyHtl30FaMNIs8QTGRJPStilWAvnNHe/7MtukLY+F1wuAABpaquAvXKttPrG8DT926Rx87zXh7ZKk6qGyq+7Vbr30ehlzpkp7VgvPXH34LZ9B7xrvyXpYIS1yb8Y84ppAABUa6shcSn64iSldJu3SktXhacfiu9+XVp6WW059eoTJO3vN2kMx2Vf3tuQ9su+ArRhpCHxtgvYE8dJR56McFzE89mL50rXL5bmzZKOnZB+ske6bYP0s731j40SrCdcGn45V9rfb9L4zyL78t6GtF/2FaANs3kOu6+/8WO3rPECdJDxY6QZU6SrF1Ru3/GidMnnGyuTa68BAK3Qdj3skqhD0Z0d0rvP1W6PqrqcztnS6TPND4W/l3/+fxmmXYXE0YbZRvtlXwHaMJs97JKo549LwbrRS77KjzvzgnTq+Wh5tfq+3ACAYmvrhVOW3FI/jfUEB89bl0nHnvYCf+lxcqe33c+wi6IF4j/+cv00AADEqW2HxEuCetnVgfXKedJDdzVej6WrvBnnjZQdJu3vN2kMx2Vf3tuQ9su+ArRhNmeJ+3l7hzRqZNVxPVLfU9KEsZXbR8+V3joZvfyuMdKbP6rc
9o2N0i131wbsJbdI9/8wet5SIf7Q0q5C4mjDbKP9sq8AbZjtc9jlzv6491wdQDuGSdOvkF490HjeR49X9ph/+WhtT1vinDUAIF1tfQ67WnnQdL3Sw9ubC9Z+zl3kXbdd/uOAYA0ASFsmhsSrjR8tHX06idpU6p7f3HXhUiGGctKuQuJow2yj/bKvAG0YaUg8Uz3skmMnvF7vitXJ5L/8zoFz5E0GawAA4pLJHrafOO6olcTQd9rfb9L4dZ99eW9D2i/7CtCG+e1h+yldj209g3fzKrdybe22cy6rPA4AgHaVmx52u0r7+00av+6zL+9tSPtlXwHasFg9bAAA8oyADQBABhCwAQDIgHZY6axP0i9bWN7EgTJbIqXzSy39jCnIexvSfjGi/WLX8s9XgDY8N0qi1CedtZqZ9UY5uZ9lef+MfL5s4/NlW94/n9S+n5EhcQAAMoCADQBABhQxYH8n7Qq0QN4/I58v2/h82Zb3zye16Wcs3DlsAACyqIg9bAAAMoeADQBABhQqYJvZp83s38zsFTP7i7TrEycz+1szO2xmP027Lkkws2lm9rSZ/dzMXjazL6Vdp7iZ2Ugze8HMXhr4jF9Nu05xM7NhZvbPZvZo2nVJgpm9amb/YmYvmlkM9xBsL2Y2zsz+3sz+deDf4h+kXae4mNlHBtqt9DhuZivSrle5wpzDNrNhkv4/SZ+StF/SP0la6pz7WaoVi4mZzZX0lqT/6pw7L+36xM3M3i/p/c653WY2WtIuSVfmpf0kybzVIc52zr1lZp2Sdkj6knPuuZSrFhszu0lSj6QxzrlFadcnbmb2qqQe51wuF04xs3sl/dg5d4+ZDZc0yjnXn3a94jYQL16XNNs518qFvUIVqYd9kaRXnHN7nXPvStos6TMp1yk2zrlnJB1Nux5Jcc694ZzbPfD6hKSfS5qSbq3i5TxvDbztHHjk5he1mU2VdLmke9KuC4bOzMZImitpvSQ5597NY7Ae8ElJv2inYC0VK2BPkRolYLIAAAIzSURBVPRa2fv9ytl/+EVhZh+U9HuSnk+3JvEbGDJ+UdJhST90zuXpM35T0pcl/fe0K5IgJ2mrme0ys2VpVyZmMyQdkbRh4LTGPWZ2dtqVSsgSSZvSrkS1IgVsv8Voc9N7KQoze5+kByWtcM4dT7s+cXPOnXHOXSBpqqSLzCwXpzfMbJGkw865XWnXJWFznHMXSlog6T8OnKrKiw5JF0r6G+fc70l6W1Ku5gJJ0sBQ/xWSvpd2XaoVKWDvlzSt7P1USQdSqgsaMHBe90FJ9znn/iHt+iRpYKhxm6RPp1yVuMyRdMXAOd7Nki41s79Lt0rxc84dGHg+LOkheafi8mK/pP1loz5/Ly+A580CSbudc4fSrki1IgXsf5L0YTObPvALaomkLSnXCRENTMhaL+nnzrk1adcnCWbWbWbjBl6fJWm+pH9Nt1bxcM7d4pyb6pz7oLx/ez9yzn025WrFyszOHpgQqYGh4j+SlJurNpxzByW9ZmYfGdj0SUm5mfRZZqnacDhcao/ba7aEc+60md0g6QlJwyT9rXPu5ZSrFRsz2yRpnqSJZrZf0lecc+vTrVWs5ki6RtK/DJzjlaRVzrl/TLFOcXu/pHsHZqj+O0kPOOdyeflTTk2W9NDArSA7JH3XOfd4ulWK3Rcl3TfQ6dkr6fqU6xMrMxsl70qi/5B2XfwU5rIuAACyrEhD4gAAZBYBGwCADCBgAwCQAQRsAAAygIANAEAGELABAMgAAjYAABnw/wPRIOc/pYUmbAAAAABJRU5ErkJggg==", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "plot_NQueens(astar)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + 
"source": [ + "## AND-OR GRAPH SEARCH\n", + "An _AND-OR_ graph is a graphical representation of the reduction of goals to _conjunctions_ and _disjunctions_ of subgoals.\n", + "
    \n", + "An _AND-OR_ graph can be seen as a generalization of a directed graph.\n", + "It contains a number of vertices and generalized edges that connect the vertices.\n", + "
    \n", + "Each connector in an _AND-OR_ graph connects a set of vertices $V$ to a single vertex, $v_0$.\n", + "A connector can be an _AND_ connector or an _OR_ connector.\n", + "An __AND__ connector connects two edges having a logical _AND_ relationship,\n", + "while and __OR__ connector connects two edges having a logical _OR_ relationship.\n", + "
    \n", + "A vertex can have more than one _AND_ or _OR_ connector.\n", + "This is why _AND-OR_ graphs can be expressed as logical statements.\n", + "
    \n", + "
    \n", + "_AND-OR_ graphs also provide a computational model for executing logic programs and you will come across this data-structure in the `logic` module as well.\n", + "_AND-OR_ graphs can be searched in depth-first, breadth-first or best-first ways searching the state sapce linearly or parallely.\n", + "
    \n", + "Our implementation of _AND-OR_ search searches over graphs generated by non-deterministic environments and returns a conditional plan that reaches a goal state in all circumstances.\n", + "Let's have a look at the implementation of `and_or_graph_search`." + ] + }, + { + "cell_type": "code", + "execution_count": 76, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    def and_or_graph_search(problem):\n",
    +       "    """[Figure 4.11]Used when the environment is nondeterministic and completely observable.\n",
    +       "    Contains OR nodes where the agent is free to choose any action.\n",
    +       "    After every action there is an AND node which contains all possible states\n",
    +       "    the agent may reach due to stochastic nature of environment.\n",
    +       "    The agent must be able to handle all possible states of the AND node (as it\n",
    +       "    may end up in any of them).\n",
    +       "    Returns a conditional plan to reach goal state,\n",
    +       "    or failure if the former is not possible."""\n",
    +       "\n",
    +       "    # functions used by and_or_search\n",
    +       "    def or_search(state, problem, path):\n",
    +       "        """returns a plan as a list of actions"""\n",
    +       "        if problem.goal_test(state):\n",
    +       "            return []\n",
    +       "        if state in path:\n",
    +       "            return None\n",
    +       "        for action in problem.actions(state):\n",
    +       "            plan = and_search(problem.result(state, action),\n",
    +       "                              problem, path + [state, ])\n",
    +       "            if plan is not None:\n",
    +       "                return [action, plan]\n",
    +       "\n",
    +       "    def and_search(states, problem, path):\n",
    +       "        """Returns plan in form of dictionary where we take action plan[s] if we reach state s."""\n",
    +       "        plan = {}\n",
    +       "        for s in states:\n",
    +       "            plan[s] = or_search(s, problem, path)\n",
    +       "            if plan[s] is None:\n",
    +       "                return None\n",
    +       "        return plan\n",
    +       "\n",
    +       "    # body of and or search\n",
    +       "    return or_search(problem.initial, problem, [])\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(and_or_graph_search)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The search is carried out by two functions `and_search` and `or_search` that recursively call each other, traversing nodes sequentially.\n", + "It is a recursive depth-first algorithm for searching an _AND-OR_ graph.\n", + "
    \n", + "A very similar algorithm `fol_bc_ask` can be found in the `logic` module, which carries out inference on first-order logic knowledge bases using _AND-OR_ graph-derived data-structures.\n", + "
    \n", + "_AND-OR_ trees can also be used to represent the search spaces for two-player games, where a vertex of the tree represents the problem of one of the players winning the game, starting from the initial state of the game.\n", + "
    \n", + "Problems involving _MIN-MAX_ trees can be reformulated as _AND-OR_ trees by representing _MAX_ nodes as _OR_ nodes and _MIN_ nodes as _AND_ nodes.\n", + "`and_or_graph_search` can then be used to find the optimal solution.\n", + "Standard algorithms like `minimax` and `expectiminimax` (for belief states) can also be applied on it with a few modifications." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here's how `and_or_graph_search` can be applied to a simple vacuum-world example." + ] + }, + { + "cell_type": "code", + "execution_count": 77, + "metadata": {}, + "outputs": [], + "source": [ + "vacuum_world = GraphProblemStochastic('State_1', ['State_7', 'State_8'], vacuum_world)\n", + "plan = and_or_graph_search(vacuum_world)" + ] + }, + { + "cell_type": "code", + "execution_count": 78, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "['Suck',\n", + " {'State_5': ['Right', {'State_6': ['Suck', {'State_8': []}]}], 'State_7': []}]" + ] + }, + "execution_count": 78, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "plan" + ] + }, + { + "cell_type": "code", + "execution_count": 79, + "metadata": {}, + "outputs": [], + "source": [ + "def run_plan(state, problem, plan):\n", + " if problem.goal_test(state):\n", + " return True\n", + " if len(plan) is not 2:\n", + " return False\n", + " predicate = lambda x: run_plan(x, problem, plan[1][x])\n", + " return all(predicate(r) for r in problem.result(state, plan[0]))" + ] + }, + { + "cell_type": "code", + "execution_count": 80, + "metadata": { + "scrolled": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "True" + ] + }, + "execution_count": 80, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "run_plan('State_1', vacuum_world, plan)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## ONLINE DFS AGENT\n", + "So far, we have seen agents that use __offline search__ 
algorithms,\n", + "which is a class of algorithms that compute a complete solution before executing it.\n", + "In contrast, an __online search__ agent interleaves computation and action.\n", + "Online search is better for most dynamic environments and necessary for unknown environments.\n", + "
    \n", + "Online search problems are solved by an agent executing actions, rather than just by pure computation.\n", + "For a fully observable environment, an online agent cycles through three steps: taking an action, computing the step cost and checking if the goal has been reached.\n", + "
    \n", + "For online algorithms in partially-observable environments, there is usually a tradeoff between exploration and exploitation to be taken care of.\n", + "
    \n", + "
    \n", + "Whenever an online agent takes an action, it receives a _percept_ or an observation that tells it something about its immediate environment.\n", + "Using this percept, the agent can augment its map of the current environment.\n", + "For a partially observable environment, this is called the belief state.\n", + "
    \n", + "Online algorithms expand nodes in a _local_ order, just like _depth-first search_ as it does not have the option of observing farther nodes like _A* search_.\n", + "Whenever an action from the current state has not been explored, the agent tries that action.\n", + "
    \n", + "Difficulty arises when the agent has tried all actions in a particular state.\n", + "An offline search algorithm would simply drop the state from the queue in this scenario whereas an online search agent has to physically move back to the previous state.\n", + "To do this, the agent needs to maintain a table where it stores the order of nodes it has been to.\n", + "This is how our implementation of _Online DFS-Agent_ works.\n", + "This agent works only in state spaces where the action is reversible, because of the use of backtracking.\n", + "
    \n", + "Let's have a look at the `OnlineDFSAgent` class." + ] + }, + { + "cell_type": "code", + "execution_count": 81, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    class OnlineDFSAgent:\n",
    +       "\n",
    +       "    """[Figure 4.21] The abstract class for an OnlineDFSAgent. Override\n",
    +       "    update_state method to convert percept to state. While initializing\n",
    +       "    the subclass a problem needs to be provided which is an instance of\n",
    +       "    a subclass of the Problem class."""\n",
    +       "\n",
    +       "    def __init__(self, problem):\n",
    +       "        self.problem = problem\n",
    +       "        self.s = None\n",
    +       "        self.a = None\n",
    +       "        self.untried = dict()\n",
    +       "        self.unbacktracked = dict()\n",
    +       "        self.result = {}\n",
    +       "\n",
    +       "    def __call__(self, percept):\n",
    +       "        s1 = self.update_state(percept)\n",
    +       "        if self.problem.goal_test(s1):\n",
    +       "            self.a = None\n",
    +       "        else:\n",
    +       "            if s1 not in self.untried.keys():\n",
    +       "                self.untried[s1] = self.problem.actions(s1)\n",
    +       "            if self.s is not None:\n",
    +       "                if s1 != self.result[(self.s, self.a)]:\n",
    +       "                    self.result[(self.s, self.a)] = s1\n",
    +       "                    self.unbacktracked[s1].insert(0, self.s)\n",
    +       "            if len(self.untried[s1]) == 0:\n",
    +       "                if len(self.unbacktracked[s1]) == 0:\n",
    +       "                    self.a = None\n",
    +       "                else:\n",
    +       "                    # else a <- an action b such that result[s', b] = POP(unbacktracked[s'])\n",
    +       "                    unbacktracked_pop = self.unbacktracked.pop(s1)\n",
    +       "                    for (s, b) in self.result.keys():\n",
    +       "                        if self.result[(s, b)] == unbacktracked_pop:\n",
    +       "                            self.a = b\n",
    +       "                            break\n",
    +       "            else:\n",
    +       "                self.a = self.untried.pop(s1)\n",
    +       "        self.s = s1\n",
    +       "        return self.a\n",
    +       "\n",
    +       "    def update_state(self, percept):\n",
    +       "        """To be overridden in most cases. The default case\n",
    +       "        assumes the percept to be of type state."""\n",
    +       "        return percept\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(OnlineDFSAgent)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "It maintains two dictionaries `untried` and `unbacktracked`.\n", + "`untried` contains nodes that have not been visited yet.\n", + "`unbacktracked` contains the sequence of nodes that the agent has visited so it can backtrack to it later, if required.\n", + "`s` and `a` store the state and the action respectively and `result` stores the final path or solution of the problem.\n", + "
    \n", + "Let's look at another online search algorithm." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## LRTA* AGENT\n", + "We can infer now that hill-climbing is an online search algorithm, but it is not very useful natively because for complicated search spaces, it might converge to the local minima and indefinitely stay there.\n", + "In such a case, we can choose to randomly restart it a few times with different starting conditions and return the result with the lowest total cost.\n", + "Sometimes, it is better to use random walks instead of random restarts depending on the problem, but progress can still be very slow.\n", + "
    \n", + "A better improvement would be to give hill-climbing a memory element.\n", + "We store the current best heuristic estimate and it is updated as the agent gains experience in the state space.\n", + "The estimated optimal cost is made more and more accurate as time passes and each time the the local minima is \"flattened out\" until we escape it.\n", + "
    \n", + "This learning scheme is a simple improvement upon traditional hill-climbing and is called _learning real-time A*_ or __LRTA*__.\n", + "Similar to _Online DFS-Agent_, it builds a map of the environment and chooses the best possible move according to its current heuristic estimates.\n", + "
    \n", + "Actions that haven't been tried yet are assumed to lead immediately to the goal with the least possible cost.\n", + "This is called __optimism under uncertainty__ and encourages the agent to explore new promising paths.\n", + "This algorithm might not terminate if the state space is infinite, unlike A* search.\n", + "
    \n", + "Let's have a look at the `LRTAStarAgent` class." + ] + }, + { + "cell_type": "code", + "execution_count": 82, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    class LRTAStarAgent:\n",
    +       "\n",
    +       "    """ [Figure 4.24]\n",
    +       "    Abstract class for LRTA*-Agent. A problem needs to be\n",
    +       "    provided which is an instance of a subclass of Problem Class.\n",
    +       "\n",
    +       "    Takes a OnlineSearchProblem [Figure 4.23] as a problem.\n",
    +       "    """\n",
    +       "\n",
    +       "    def __init__(self, problem):\n",
    +       "        self.problem = problem\n",
    +       "        # self.result = {}      # no need as we are using problem.result\n",
    +       "        self.H = {}\n",
    +       "        self.s = None\n",
    +       "        self.a = None\n",
    +       "\n",
    +       "    def __call__(self, s1):     # as of now s1 is a state rather than a percept\n",
    +       "        if self.problem.goal_test(s1):\n",
    +       "            self.a = None\n",
    +       "            return self.a\n",
    +       "        else:\n",
    +       "            if s1 not in self.H:\n",
    +       "                self.H[s1] = self.problem.h(s1)\n",
    +       "            if self.s is not None:\n",
    +       "                # self.result[(self.s, self.a)] = s1    # no need as we are using problem.output\n",
    +       "\n",
    +       "                # minimum cost for action b in problem.actions(s)\n",
    +       "                self.H[self.s] = min(self.LRTA_cost(self.s, b, self.problem.output(self.s, b),\n",
    +       "                                     self.H) for b in self.problem.actions(self.s))\n",
    +       "\n",
    +       "            # an action b in problem.actions(s1) that minimizes costs\n",
    +       "            self.a = argmin(self.problem.actions(s1),\n",
    +       "                            key=lambda b: self.LRTA_cost(s1, b, self.problem.output(s1, b), self.H))\n",
    +       "\n",
    +       "            self.s = s1\n",
    +       "            return self.a\n",
    +       "\n",
    +       "    def LRTA_cost(self, s, a, s1, H):\n",
    +       "        """Returns cost to move from state 's' to state 's1' plus\n",
    +       "        estimated cost to get to goal from s1."""\n",
    +       "        print(s, a, s1)\n",
    +       "        if s1 is None:\n",
    +       "            return self.problem.h(s)\n",
    +       "        else:\n",
    +       "            # sometimes we need to get H[s1] which we haven't yet added to H\n",
    +       "            # to replace this try, except: we can initialize H with values from problem.h\n",
    +       "            try:\n",
    +       "                return self.problem.c(s, a, s1) + self.H[s1]\n",
    +       "            except:\n",
    +       "                return self.problem.c(s, a, s1) + self.problem.h(s1)\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(LRTAStarAgent)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "`H` stores the heuristic cost of the paths the agent may travel to.\n", + "
    \n", + "`s` and `a` store the state and the action respectively.\n", + "
    \n", + "`problem` stores the problem definition and the current map of the environment is stored in `problem.result`.\n", + "
    \n", + "The `LRTA_cost` method computes the cost of a new path given the current state `s`, the action `a`, the next state `s1` and the estimated cost to get from `s` to `s1` is extracted from `H`." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's use `LRTAStarAgent` to solve a simple problem.\n", + "We'll define a new `LRTA_problem` instance based on our `one_dim_state_space`." + ] + }, + { + "cell_type": "code", + "execution_count": 83, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 83, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "one_dim_state_space" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's define an instance of `OnlineSearchProblem`." + ] + }, + { + "cell_type": "code", + "execution_count": 84, + "metadata": {}, + "outputs": [], + "source": [ + "LRTA_problem = OnlineSearchProblem('State_3', 'State_5', one_dim_state_space)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now we initialize a `LRTAStarAgent` object for the problem we just defined." + ] + }, + { + "cell_type": "code", + "execution_count": 85, + "metadata": {}, + "outputs": [], + "source": [ + "lrta_agent = LRTAStarAgent(LRTA_problem)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We'll pass the percepts `[State_3, State_4, State_3, State_4, State_5]` one-by-one to our agent to see what action it comes up with at each timestep." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 86, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "State_3 Right State_4\n", + "State_3 Left State_2\n" + ] + }, + { + "data": { + "text/plain": [ + "'Right'" + ] + }, + "execution_count": 86, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "lrta_agent('State_3')" + ] + }, + { + "cell_type": "code", + "execution_count": 87, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "State_3 Right State_4\n", + "State_3 Left State_2\n", + "State_4 Right State_5\n", + "State_4 Left State_3\n" + ] + }, + { + "data": { + "text/plain": [ + "'Left'" + ] + }, + "execution_count": 87, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "lrta_agent('State_4')" ] }, { "cell_type": "code", - "execution_count": 11, + "execution_count": 88, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ - "[[0, 2, 7, 1, 7, 3, 2, 4], [2, 7, 5, 4, 4, 5, 2, 0], [7, 1, 6, 0, 1, 3, 0, 2], [0, 3, 6, 1, 3, 0, 5, 4], [0, 4, 6, 4, 7, 4, 1, 6]]\n" + "State_4 Right State_5\n", + "State_4 Left State_3\n", + "State_3 Right State_4\n", + "State_3 Left State_2\n" ] + }, + { + "data": { + "text/plain": [ + "'Right'" + ] + }, + "execution_count": 88, + "metadata": {}, + "output_type": "execute_result" } ], "source": [ - "population = init_population(100, range(8), 8)\n", - "print(population[:5])" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "We have a population of 100 and each individual has 8 genes. The gene pool is the integers from 0 to 7, in string form. Above you can see the first five individuals.\n", - "\n", - "Next we need to write our fitness function. Remember, queens threaten each other if they are at the same row, column or diagonal.\n", - "\n", - "Since positionings are mutual, we must take care not to count them twice. 
Therefore for each queen, we will only check for conflicts for the queens after her.\n", - "\n", - "A gene's value in an individual `q` denotes the queen's column, and the position of the gene denotes its row. We can check if the aforementioned values between two genes are the same. We also need to check for diagonals. A queen *a* is in the diagonal of another queen, *b*, if the difference of the rows between them is equal to either their difference in columns (for the diagonal on the right of *a*) or equal to the negative difference of their columns (for the left diagonal of *a*). Below is given the fitness function." + "lrta_agent('State_3')" ] }, { "cell_type": "code", - "execution_count": 12, - "metadata": { - "collapsed": true - }, - "outputs": [], + "execution_count": 89, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "State_3 Right State_4\n", + "State_3 Left State_2\n", + "State_4 Right State_5\n", + "State_4 Left State_3\n" + ] + }, + { + "data": { + "text/plain": [ + "'Right'" + ] + }, + "execution_count": 89, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ - "def fitness(q):\n", - " non_attacking = 0\n", - " for row1 in range(len(q)):\n", - " for row2 in range(row1+1, len(q)):\n", - " col1 = int(q[row1])\n", - " col2 = int(q[row2])\n", - " row_diff = row1 - row2\n", - " col_diff = col1 - col2\n", - "\n", - " if col1 != col2 and row_diff != col_diff and row_diff != -col_diff:\n", - " non_attacking += 1\n", - "\n", - " return non_attacking" + "lrta_agent('State_4')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Note that the best score achievable is 28. That is because for each queen we only check for the queens after her. For the first queen we check 7 other queens, for the second queen 6 others and so on. In short, the number of checks we make is the sum 7+6+5+...+1. 
Which is equal to 7\\*(7+1)/2 = 28.\n", - "\n", - "Because it is very hard and will take long to find a perfect solution, we will set the fitness threshold at 25. If we find an individual with a score greater or equal to that, we will halt. Let's see how the genetic algorithm will fare." + "If you manually try to see what the optimal action should be at each step, the outputs of the `lrta_agent` will start to make sense if it doesn't already." ] }, { "cell_type": "code", - "execution_count": 16, + "execution_count": 90, "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "[5, 0, 6, 3, 7, 4, 1, 3]\n", - "26\n" - ] - } - ], + "outputs": [], "source": [ - "solution = genetic_algorithm(population, fitness, f_thres=25, gene_pool=range(8))\n", - "print(solution)\n", - "print(fitness(solution))" + "lrta_agent('State_5')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Above you can see the solution and its fitness score, which should be no less than 25." + "There is no possible action for this state." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "With that this tutorial on the genetic algorithm comes to an end. Hope you found this guide helpful!" + "
    \n", + "This concludes the notebook.\n", + "Hope you learned something new!" ] } ], @@ -1464,420 +6529,49 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.5.2+" + "version": "3.7.6" + }, + "pycharm": { + "stem_cell": { + "cell_type": "raw", + "metadata": { + "collapsed": false + }, + "source": [] + } }, "widgets": { "state": { - "013d8df0a2ab4899b09f83aa70ce5d50": { - "views": [] - }, - "01ee7dc2239c4b0095710436453b362d": { - "views": [] - }, - "04d594ae6a704fc4b16895e6a7b85270": { - "views": [] - }, - "052ea3e7259346a4b022ec4fef1fda28": { - "views": [ - { - "cell_index": 32 - } - ] - }, - "0ade4328785545c2b66d77e599a3e9da": { - "views": [ - { - "cell_index": 29 - } - ] - }, - "0b94d8de6b4e47f89b0382b60b775cbd": { - "views": [] - }, - "0c63dcc0d11a451ead31a4c0c34d7b43": { - "views": [] - }, - "0d91be53b6474cdeac3239fdffeab908": { - "views": [ - { - "cell_index": 39 - } - ] - }, - "0fe9c3b9b1264d4abd22aef40a9c1ab9": { - "views": [] - }, - "10fd06131b05455d9f0a98072d7cebc6": { - "views": [] - }, - "1193eaa60bb64cb790236d95bf11f358": { - "views": [ - { - "cell_index": 38 - } - ] - }, - "11b596cbf81a47aabccae723684ac3a5": { - "views": [] - }, - "127ae5faa86f41f986c39afb320f2298": { - "views": [] - }, - "16a9167ec7b4479e864b2a32e40825a1": { - "views": [ - { - "cell_index": 39 - } - ] - }, - "170e2e101180413f953a192a41ecbfcc": { - "views": [] - }, - "181efcbccf89478792f0e38a25500e51": { - "views": [] - }, - "1894a28092604d69b0d7d465a3b165b1": { - "views": [] - }, - "1a56cc2ab5ae49ea8bf2a3f6ca2b1c36": { - "views": [] - }, - "1cfd8f392548467696d8cd4fc534a6b4": { - "views": [] - }, - "1e395e67fdec406f8698aa5922764510": { - "views": [] - }, - "23509c6536404e96985220736d286183": { - "views": [] - }, - "23bffaca1206421fb9ea589126e35438": { - "views": [] - }, - "25330d0b799e4f02af5e510bc70494cf": { - "views": [] - }, - "2ab8bf4795ac4240b70e1a94e14d1dd6": { - "views": [ - { - "cell_index": 30 - } - ] - }, - 
"2bd48f1234e4422aaedecc5815064181": { - "views": [] - }, - "2d3a082066304c8ebf2d5003012596b4": { - "views": [] - }, - "2dc962f16fd143c1851aaed0909f3963": { - "views": [ - { - "cell_index": 35 - } - ] - }, - "2f659054242a453da5ea0884de996008": { - "views": [] - }, - "30a214881db545729c1b883878227e95": { - "views": [] - }, - "3275b81616424947be98bf8fd3cd7b82": { - "views": [] - }, - "330b52bc309d4b6a9b188fd9df621180": { - "views": [] - }, - "3320648123f44125bcfda3b7c68febcf": { - "views": [] - }, - "338e3b1562e747f197ab3ceae91e371f": { - "views": [] - }, - "34658e2de2894f01b16cf89905760f14": { - "views": [ - { - "cell_index": 39 - } - ] - }, - "352f5fd9f698460ea372c6af57c5b478": { - "views": [] - }, - "35dc16b828a74356b56cd01ff9ddfc09": { - "views": [] - }, - "3805ce2994364bd1b259373d8798cc7a": { - "views": [] - }, - "3d1f1f899cfe49aaba203288c61686ac": { - "views": [] - }, - "3d7e943e19794e29b7058eb6bbe23c66": { - "views": [] - }, - "3f6652b3f85740949b7711fbcaa509ba": { - "views": [] - }, - "43e48664a76342c991caeeb2d5b17a49": { - "views": [ - { - "cell_index": 35 - } - ] - }, - "4662dec8595f45fb9ae061b2bdf44427": { - "views": [] - }, - "47ae3d2269d94a95a567be21064eb98a": { - "views": [] - }, - "49c49d665ba44746a1e1e9dc598bc411": { - "views": [ - { - "cell_index": 39 - } - ] - }, - "4a1c43b035f644699fd905d5155ad61f": { - "views": [ - { - "cell_index": 39 - } - ] - }, - "4eb88b6f6b4241f7b755f69b9e851872": { - "views": [] - }, - "4fbb3861e50f41c688e9883da40334d4": { - "views": [] - }, - "52d76de4ee8f4487b335a4a11726fbce": { - "views": [] - }, - "53eccc8fc0ad461cb8277596b666f32a": { - "views": [ - { - "cell_index": 29 - } - ] - }, - "54d3a6067b594ad08907ce059d9f4a41": { - "views": [] - }, - "612530d3edf8443786b3093ab612f88b": { - "views": [] - }, - "613a133b6d1f45e0ac9c5c270bc408e0": { - "views": [] - }, - "636caa7780614389a7f52ad89ea1c6e8": { - "views": [ - { - "cell_index": 39 - } - ] - }, - "63aa621196294629b884c896b6a034d8": { - "views": [] - }, - 
"66d1d894cc7942c6a91f0630fc4321f9": { - "views": [] - }, - "6775928a174b43ecbe12608772f1cb05": { - "views": [] - }, - "6bce621c90d543bca50afbe0c489a191": { - "views": [] - }, - "6ebbb8c7ec174c15a6ee79a3c5b36312": { - "views": [] - }, - "743219b9d37e4f47a5f777bb41ad0a96": { - "views": [ - { - "cell_index": 29 - } - ] - }, - "774f464794cc409ca6d1106bcaac0cf1": { - "views": [] - }, - "7ba3da40fb26490697fc64b3248c5952": { - "views": [] - }, - "7e79fea4654f4bedb5969db265736c25": { - "views": [] - }, - "85c82ed0844f4ae08a14fd750e55fc15": { - "views": [] - }, - "86e8f92c1d584cdeb13b36af1b6ad695": { - "views": [ - { - "cell_index": 35 - } - ] - }, - "88485e72d2ec447ba7e238b0a6de2839": { - "views": [] - }, - "892d7b895d3840f99504101062ba0f65": { - "views": [] - }, - "89be4167713e488696a20b9b5ddac9bd": { - "views": [] - }, - "8a24a07d166b45498b7d8b3f97c131eb": { - "views": [] - }, - "8e7c7f3284ee45b38d95fe9070d5772f": { - "views": [] - }, - "98985eefab414365991ed6844898677f": { - "views": [] - }, - "98df98e5af87474d8b139cb5bcbc9792": { - "views": [] - }, - "99f11243d387409bbad286dd5ecb1725": { - "views": [] - }, - "9ab2d641b0be4cf8950be5ba72e5039f": { - "views": [] - }, - "9b1ffbd1e7404cb4881380a99c7d11bc": { - "views": [] - }, - "9c07ec6555cb4d0ba8b59007085d5692": { - "views": [] - }, - "9cc80f47249b4609b98223ce71594a3d": { - "views": [] - }, - "9d79bfd34d3640a3b7156a370d2aabae": { - "views": [] - }, - "a015f138cbbe4a0cad4d72184762ed75": { - "views": [] - }, - "a27d2f1eb3834c38baf1181b0de93176": { - "views": [] - }, - "a29b90d050f3442a89895fc7615ccfee": { - "views": [ - { - "cell_index": 29 - } - ] - }, - "a725622cfc5b43b4ae14c74bc2ad7ad0": { - "views": [] - }, - "ac2e05d7d7e945bf99862a2d9d1fa685": { - "views": [] - }, - "b0bb2ca65caa47579a4d3adddd94504b": { - "views": [] - }, - "b8995c40625d465489e1b7ec8014b678": { - "views": [] - }, - "ba83da1373fe45d19b3c96a875f2f4fb": { - "views": [] - }, - "baa0040d35c64604858c529418c22797": { - "views": [] - }, - 
"badc9fd7b56346d6b6aea68bfa6d2699": { - "views": [ - { - "cell_index": 38 - } - ] - }, - "bdb41c7654e54c83a91452abc59141bd": { - "views": [] - }, - "c2399056ef4a4aa7aa4e23a0f381d64a": { + "1516e2501ddd4a2e8e3250bffc0164db": { "views": [ { - "cell_index": 38 + "cell_index": 59 } ] }, - "c73b47b242b4485fb1462abcd92dc7c9": { - "views": [] - }, - "ce3f28a8aeee4be28362d068426a71f6": { + "17be64c89a9a4a43b3272cb018df0970": { "views": [ { - "cell_index": 32 + "cell_index": 59 } ] }, - "d3067a6bb84544bba5f1abd241a72e55": { - "views": [] - }, - "db13a2b94de34ce9bea721aaf971c049": { - "views": [] - }, - "db468d80cb6e43b6b88455670b036618": { - "views": [] - }, - "e2cb458522b4438ea3f9873b6e411acb": { - "views": [] - }, - "e77dca31f1d94d4dadd3f95d2cdbf10e": { - "views": [] - }, - "e7bffb1fed664dea90f749ea79dcc4f1": { + "ac05040009a340b0af81b0ee69161fbc": { "views": [ { - "cell_index": 39 + "cell_index": 59 } ] }, - "e80abb145fce4e888072b969ba8f455a": { - "views": [] - }, - "e839d0cf348c4c1b832fc1fc3b0bd3c9": { - "views": [] - }, - "e948c6baadde46f69f105649555b84eb": { - "views": [] - }, - "eb16e9da25bf4bef91a34b1d0565c774": { - "views": [] - }, - "ec82b64048834eafa3e53733bb54a713": { - "views": [] - }, - "edbb3a621c87445e9df4773cc60ec8d2": { - "views": [] - }, - "ef6c99705936425a975e49b9e18ac267": { - "views": [] - }, - "f1b494f025dd48d1ae58ae8e3e2ebf46": { - "views": [] - }, - "f435b108c59c42989bf209a625a3a5b5": { + "d9735ffe77c24f13ae4ad3620ce84334": { "views": [ { - "cell_index": 32 + "cell_index": 59 } ] - }, - "f71ed7e15a314c28973943046c4529d6": { - "views": [] - }, - "f81f726f001c4fb999851df532ed39f2": { - "views": [] } }, - "version": "1.1.1" + "version": "1.2.0" } }, "nbformat": 4, diff --git a/search.py b/search.py index 68b77a5a8..5012c1a18 100644 --- a/search.py +++ b/search.py @@ -1,36 +1,26 @@ -"""Search (Chapters 3-4) +""" +Search (Chapters 3-4) The way to use this code is to subclass Problem to create a class of problems, then create problem instances and solve 
them with calls to the various search -functions.""" - -from utils import ( - is_in, argmin, argmax, argmax_random_tie, probability, weighted_sampler, - memoize, print_table, open_data, Stack, FIFOQueue, PriorityQueue, name, - distance -) +functions. +""" -from collections import defaultdict -import math -import random import sys -import bisect - -infinity = float('inf') - -# ______________________________________________________________________________ +from collections import deque +from utils import * -class Problem(object): - """The abstract class for a formal problem. You should subclass +class Problem: + """The abstract class for a formal problem. You should subclass this and implement the methods actions and result, and possibly __init__, goal_test, and path_cost. Then you will create instances of your subclass and solve them with the various search functions.""" def __init__(self, initial, goal=None): """The constructor specifies the initial state, and possibly a goal - state, if there is a unique goal. Your subclass's constructor can add + state, if there is a unique goal. Your subclass's constructor can add other arguments.""" self.initial = initial self.goal = goal @@ -62,24 +52,25 @@ def path_cost(self, c, state1, action, state2): """Return the cost of a solution path that arrives at state2 from state1 via action, assuming cost c to get up to state1. If the problem is such that the path doesn't matter, this function will only look at - state2. If the path does matter, it will consider c and maybe state1 + state2. If the path does matter, it will consider c and maybe state1 and action. The default method costs 1 for every step in the path.""" return c + 1 def value(self, state): - """For optimization problems, each state has a value. Hill-climbing + """For optimization problems, each state has a value. 
Hill Climbing and related algorithms try to maximize this value.""" raise NotImplementedError + + # ______________________________________________________________________________ class Node: - """A node in a search tree. Contains a pointer to the parent (the node that this is a successor of) and to the actual state for this node. Note that if a state is arrived at by two paths, then there are two nodes with - the same state. Also includes the action that got us to this state, and - the total path_cost (also known as g) to reach the node. Other functions + the same state. Also includes the action that got us to this state, and + the total path_cost (also known as g) to reach the node. Other functions may add an f and h value; see best_first_graph_search and astar_search for an explanation of how the f and h values are handled. You will not need to subclass this class.""" @@ -107,10 +98,9 @@ def expand(self, problem): def child_node(self, problem, action): """[Figure 3.10]""" - next = problem.result(self.state, action) - return Node(next, self, action, - problem.path_cost(self.path_cost, self.state, - action, next)) + next_state = problem.result(self.state, action) + next_node = Node(next_state, self, action, problem.path_cost(self.path_cost, self.state, action, next_state)) + return next_node def solution(self): """Return the sequence of actions to go from the root to this node.""" @@ -124,7 +114,7 @@ def path(self): node = node.parent return list(reversed(path_back)) - # We want for a queue of nodes in breadth_first_search or + # We want for a queue of nodes in breadth_first_graph_search or # astar_search to have no duplicated states, so we treat nodes # with the same state as equal. [Problem: this may not be what you # want in other contexts.] 
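The duplicated-state comment above is easiest to see in isolation. Below is a minimal, hypothetical sketch (not the repository's full `Node` class, which also carries `depth` and the `expand`/`path` machinery) showing why defining `__eq__` and `__hash__` in terms of the wrapped state lets a set-based frontier or explored set treat two different paths to the same state as one entry:

```python
# Minimal sketch: nodes compare and hash by state, not by identity,
# so duplicate paths to the same state collapse in sets and dicts.

class Node:
    def __init__(self, state, parent=None, action=None, path_cost=0):
        self.state = state
        self.parent = parent
        self.action = action
        self.path_cost = path_cost

    def __eq__(self, other):
        # Equality ignores how the state was reached.
        return isinstance(other, Node) and self.state == other.state

    def __hash__(self):
        # Hashing the (hashable) state keeps __hash__ consistent with
        # __eq__, giving O(1) membership tests by state.
        return hash(self.state)


# Two different paths that arrive at the same state (1, 2):
via_right = Node((1, 2), parent=Node((0, 2)), action='Right', path_cost=3)
via_up = Node((1, 2), parent=Node((1, 1)), action='Up', path_cost=5)

assert via_right == via_up            # equal despite different parents/costs
assert len({via_right, via_up}) == 1  # a set-based frontier deduplicates them
```

As the original comment warns, this may not be what you want in other contexts — e.g. when the path itself matters, two nodes with the same state are not interchangeable.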
@@ -133,17 +123,24 @@ def __eq__(self, other): return isinstance(other, Node) and self.state == other.state def __hash__(self): + # We use the hash value of the state + # stored in the node instead of the node + # object itself to quickly search a node + # with the same state in a Hash Table return hash(self.state) + # ______________________________________________________________________________ class SimpleProblemSolvingAgentProgram: - - """Abstract framework for a problem-solving agent. [Figure 3.1]""" + """ + [Figure 3.1] + Abstract framework for a problem-solving agent. + """ def __init__(self, initial_state=None): - """State is an sbstract representation of the state + """State is an abstract representation of the state of the world, and seq is the list of actions required to get to a particular state from the initial state(root).""" self.state = initial_state @@ -161,7 +158,7 @@ def __call__(self, percept): return None return self.seq.pop(0) - def update_state(self, percept): + def update_state(self, state, percept): raise NotImplementedError def formulate_goal(self, state): @@ -173,15 +170,41 @@ def formulate_problem(self, state, goal): def search(self, problem): raise NotImplementedError + # ______________________________________________________________________________ # Uninformed Search algorithms -def tree_search(problem, frontier): - """Search through the successors of a problem to find a goal. +def breadth_first_tree_search(problem): + """ + [Figure 3.7] + Search the shallowest nodes in the search tree first. + Search through the successors of a problem to find a goal. The argument frontier should be an empty queue. - Don't worry about repeated paths to a state. [Figure 3.7]""" - frontier.append(Node(problem.initial)) + Repeats infinitely in case of loops. 
+ """ + + frontier = deque([Node(problem.initial)]) # FIFO queue + + while frontier: + node = frontier.popleft() + if problem.goal_test(node.state): + return node + frontier.extend(node.expand(problem)) + return None + + +def depth_first_tree_search(problem): + """ + [Figure 3.7] + Search the deepest nodes in the search tree first. + Search through the successors of a problem to find a goal. + The argument frontier should be an empty queue. + Repeats infinitely in case of loops. + """ + + frontier = [Node(problem.initial)] # Stack + while frontier: node = frontier.pop() if problem.goal_test(node.state): @@ -190,11 +213,17 @@ def tree_search(problem, frontier): return None -def graph_search(problem, frontier): - """Search through the successors of a problem to find a goal. +def depth_first_graph_search(problem): + """ + [Figure 3.7] + Search the deepest nodes in the search tree first. + Search through the successors of a problem to find a goal. The argument frontier should be an empty queue. - If two paths reach a state, only use the first one. [Figure 3.7]""" - frontier.append(Node(problem.initial)) + Does not get trapped by loops. + If two paths reach a state, only use the first one. 
+ """ + frontier = [(Node(problem.initial))] # Stack + explored = set() while frontier: node = frontier.pop() @@ -202,36 +231,23 @@ def graph_search(problem, frontier): return node explored.add(node.state) frontier.extend(child for child in node.expand(problem) - if child.state not in explored and - child not in frontier) + if child.state not in explored and child not in frontier) return None -def breadth_first_tree_search(problem): - """Search the shallowest nodes in the search tree first.""" - return tree_search(problem, FIFOQueue()) - - -def depth_first_tree_search(problem): - """Search the deepest nodes in the search tree first.""" - return tree_search(problem, Stack()) - - -def depth_first_graph_search(problem): - """Search the deepest nodes in the search tree first.""" - return graph_search(problem, Stack()) - - -def breadth_first_search(problem): - """[Figure 3.11]""" +def breadth_first_graph_search(problem): + """[Figure 3.11] + Note that this function can be implemented in a + single line as below: + return graph_search(problem, FIFOQueue()) + """ node = Node(problem.initial) if problem.goal_test(node.state): return node - frontier = FIFOQueue() - frontier.append(node) + frontier = deque([node]) explored = set() while frontier: - node = frontier.pop() + node = frontier.popleft() explored.add(node.state) for child in node.expand(problem): if child.state not in explored and child not in frontier: @@ -241,7 +257,7 @@ def breadth_first_search(problem): return None -def best_first_graph_search(problem, f): +def best_first_graph_search(problem, f, display=False): """Search the nodes with the lowest f scores first. 
You specify the function f(node) that you want to minimize; for example, if f is a heuristic estimate to the goal, then we have greedy best @@ -251,34 +267,34 @@ def best_first_graph_search(problem, f): a best first search you can examine the f values of the path returned.""" f = memoize(f, 'f') node = Node(problem.initial) - if problem.goal_test(node.state): - return node - frontier = PriorityQueue(min, f) + frontier = PriorityQueue('min', f) frontier.append(node) explored = set() while frontier: node = frontier.pop() if problem.goal_test(node.state): + if display: + print(len(explored), "paths have been expanded and", len(frontier), "paths remain in the frontier") return node explored.add(node.state) for child in node.expand(problem): if child.state not in explored and child not in frontier: frontier.append(child) elif child in frontier: - incumbent = frontier[child] - if f(child) < f(incumbent): - del frontier[incumbent] + if f(child) < frontier[child]: + del frontier[child] frontier.append(child) return None -def uniform_cost_search(problem): +def uniform_cost_search(problem, display=False): """[Figure 3.14]""" - return best_first_graph_search(problem, lambda node: node.path_cost) + return best_first_graph_search(problem, lambda node: node.path_cost, display) def depth_limited_search(problem, limit=50): """[Figure 3.17]""" + def recursive_dls(node, problem, limit): if problem.goal_test(node.state): return node @@ -305,17 +321,19 @@ def iterative_deepening_search(problem): if result != 'cutoff': return result + # ______________________________________________________________________________ # Bidirectional Search # Pseudocode from https://webdocs.cs.ualberta.ca/%7Eholte/Publications/MM-AAAI2016.pdf def bidirectional_search(problem): - e = problem.find_min_edge() - gF, gB = {problem.initial : 0}, {problem.goal : 0} - openF, openB = [problem.initial], [problem.goal] + e = 0 + if isinstance(problem, GraphProblem): + e = problem.find_min_edge() + gF, gB = 
{Node(problem.initial): 0}, {Node(problem.goal): 0} + openF, openB = [Node(problem.initial)], [Node(problem.goal)] closedF, closedB = [], [] - U = infinity - + U = np.inf def extend(U, open_dir, open_other, g_dir, g_other, closed_dir): """Extend search in given direction""" @@ -324,14 +342,14 @@ def extend(U, open_dir, open_other, g_dir, g_other, closed_dir): open_dir.remove(n) closed_dir.append(n) - for c in problem.actions(n): + for c in n.expand(problem): if c in open_dir or c in closed_dir: - if g_dir[c] <= problem.path_cost(g_dir[n], n, None, c): + if g_dir[c] <= problem.path_cost(g_dir[n], n.state, None, c.state): continue open_dir.remove(c) - g_dir[c] = problem.path_cost(g_dir[n], n, None, c) + g_dir[c] = problem.path_cost(g_dir[n], n.state, None, c.state) open_dir.append(c) if c in open_other: @@ -339,33 +357,32 @@ def extend(U, open_dir, open_other, g_dir, g_other, closed_dir): return U, open_dir, closed_dir, g_dir - def find_min(open_dir, g): """Finds minimum priority, g and f values in open_dir""" - m, m_f = infinity, infinity + # pr_min_f isn't forward pr_min instead it's the f-value + # of node with priority pr_min. 
+ pr_min, pr_min_f = np.inf, np.inf for n in open_dir: f = g[n] + problem.h(n) - pr = max(f, 2*g[n]) - m = min(m, pr) - m_f = min(m_f, f) - - return m, m_f, min(g.values()) + pr = max(f, 2 * g[n]) + pr_min = min(pr_min, pr) + pr_min_f = min(pr_min_f, f) + return pr_min, pr_min_f, min(g.values()) def find_key(pr_min, open_dir, g): """Finds key in open_dir with value equal to pr_min and minimum g value.""" - m = infinity - state = -1 + m = np.inf + node = Node(-1) for n in open_dir: - pr = max(g[n] + problem.h(n), 2*g[n]) + pr = max(g[n] + problem.h(n), 2 * g[n]) if pr == pr_min: if g[n] < m: m = g[n] - state = n - - return state + node = n + return node while openF and openB: pr_min_f, f_min_f, g_min_f = find_min(openF, gF) @@ -382,22 +399,202 @@ def find_key(pr_min, open_dir, g): # Extend backward U, openB, closedB, gB = extend(U, openB, openF, gB, gF, closedB) - return infinity + return np.inf + # ______________________________________________________________________________ # Informed (Heuristic) Search greedy_best_first_graph_search = best_first_graph_search + + # Greedy best-first search is accomplished by specifying f(n) = h(n). -def astar_search(problem, h=None): +def astar_search(problem, h=None, display=False): """A* search is best-first graph search with f(n) = g(n)+h(n). You need to specify the h function when you call astar_search, or else in your Problem subclass.""" h = memoize(h or problem.h, 'h') - return best_first_graph_search(problem, lambda n: n.path_cost + h(n)) + return best_first_graph_search(problem, lambda n: n.path_cost + h(n), display) + + +# ______________________________________________________________________________ +# A* heuristics + +class EightPuzzle(Problem): + """ The problem of sliding tiles numbered from 1 to 8 on a 3x3 board, where one of the + squares is a blank. 
A state is represented as a tuple of length 9, where element at + index i represents the tile number at index i (0 if it's an empty square) """ + + def __init__(self, initial, goal=(1, 2, 3, 4, 5, 6, 7, 8, 0)): + """ Define goal state and initialize a problem """ + super().__init__(initial, goal) + + def find_blank_square(self, state): + """Return the index of the blank square in a given state""" + + return state.index(0) + + def actions(self, state): + """ Return the actions that can be executed in the given state. + The result would be a list, since there are only four possible actions + in any given state of the environment """ + + possible_actions = ['UP', 'DOWN', 'LEFT', 'RIGHT'] + index_blank_square = self.find_blank_square(state) + + if index_blank_square % 3 == 0: + possible_actions.remove('LEFT') + if index_blank_square < 3: + possible_actions.remove('UP') + if index_blank_square % 3 == 2: + possible_actions.remove('RIGHT') + if index_blank_square > 5: + possible_actions.remove('DOWN') + + return possible_actions + + def result(self, state, action): + """ Given state and action, return a new state that is the result of the action. 
+ Action is assumed to be a valid action in the state """ + + # blank is the index of the blank square + blank = self.find_blank_square(state) + new_state = list(state) + + delta = {'UP': -3, 'DOWN': 3, 'LEFT': -1, 'RIGHT': 1} + neighbor = blank + delta[action] + new_state[blank], new_state[neighbor] = new_state[neighbor], new_state[blank] + + return tuple(new_state) + + def goal_test(self, state): + """ Given a state, return True if state is a goal state or False, otherwise """ + + return state == self.goal + + def check_solvability(self, state): + """ Checks if the given state is solvable """ + + inversion = 0 + for i in range(len(state)): + for j in range(i + 1, len(state)): + if (state[i] > state[j]) and state[i] != 0 and state[j] != 0: + inversion += 1 + + return inversion % 2 == 0 + + def h(self, node): + """ Return the heuristic value for a given state. Default heuristic function used is + h(n) = number of misplaced tiles """ + + return sum(s != g for (s, g) in zip(node.state, self.goal)) + + +# ______________________________________________________________________________ + + +class PlanRoute(Problem): + """ The problem of moving the Hybrid Wumpus Agent from one place to other """ + + def __init__(self, initial, goal, allowed, dimrow): + """ Define goal state and initialize a problem """ + super().__init__(initial, goal) + self.dimrow = dimrow + self.goal = goal + self.allowed = allowed + + def actions(self, state): + """ Return the actions that can be executed in the given state. 
+ The result would be a list, since there are only three possible actions + in any given state of the environment """ + + possible_actions = ['Forward', 'TurnLeft', 'TurnRight'] + x, y = state.get_location() + orientation = state.get_orientation() + + # Prevent Bumps + if x == 1 and orientation == 'LEFT': + if 'Forward' in possible_actions: + possible_actions.remove('Forward') + if y == 1 and orientation == 'DOWN': + if 'Forward' in possible_actions: + possible_actions.remove('Forward') + if x == self.dimrow and orientation == 'RIGHT': + if 'Forward' in possible_actions: + possible_actions.remove('Forward') + if y == self.dimrow and orientation == 'UP': + if 'Forward' in possible_actions: + possible_actions.remove('Forward') + + return possible_actions + + def result(self, state, action): + """ Given state and action, return a new state that is the result of the action. + Action is assumed to be a valid action in the state """ + x, y = state.get_location() + proposed_loc = list() + + # Move Forward + if action == 'Forward': + if state.get_orientation() == 'UP': + proposed_loc = [x, y + 1] + elif state.get_orientation() == 'DOWN': + proposed_loc = [x, y - 1] + elif state.get_orientation() == 'LEFT': + proposed_loc = [x - 1, y] + elif state.get_orientation() == 'RIGHT': + proposed_loc = [x + 1, y] + else: + raise Exception('InvalidOrientation') + + # Rotate counter-clockwise + elif action == 'TurnLeft': + if state.get_orientation() == 'UP': + state.set_orientation('LEFT') + elif state.get_orientation() == 'DOWN': + state.set_orientation('RIGHT') + elif state.get_orientation() == 'LEFT': + state.set_orientation('DOWN') + elif state.get_orientation() == 'RIGHT': + state.set_orientation('UP') + else: + raise Exception('InvalidOrientation') + + # Rotate clockwise + elif action == 'TurnRight': + if state.get_orientation() == 'UP': + state.set_orientation('RIGHT') + elif state.get_orientation() == 'DOWN': + state.set_orientation('LEFT') + elif state.get_orientation() == 
'LEFT': + state.set_orientation('UP') + elif state.get_orientation() == 'RIGHT': + state.set_orientation('DOWN') + else: + raise Exception('InvalidOrientation') + + if proposed_loc in self.allowed: + state.set_location(proposed_loc[0], proposed_loc[1]) + + return state + + def goal_test(self, state): + """ Given a state, return True if state is a goal state or False, otherwise """ + + return state.get_location() == tuple(self.goal) + + def h(self, node): + """ Return the heuristic value for a given state.""" + + # Manhattan Heuristic Function + x1, y1 = node.state.get_location() + x2, y2 = self.goal + + return abs(x2 - x1) + abs(y2 - y1) + # ______________________________________________________________________________ # Other search algorithms @@ -409,10 +606,10 @@ def recursive_best_first_search(problem, h=None): def RBFS(problem, node, flimit): if problem.goal_test(node.state): - return node, 0 # (The second value is immaterial) + return node, 0 # (The second value is immaterial) successors = node.expand(problem) if len(successors) == 0: - return None, infinity + return None, np.inf for s in successors: s.f = max(s.path_cost + h(s), node.f) while True: @@ -424,27 +621,29 @@ def RBFS(problem, node, flimit): if len(successors) > 1: alternative = successors[1].f else: - alternative = infinity + alternative = np.inf result, best.f = RBFS(problem, best, min(flimit, alternative)) if result is not None: return result, best.f node = Node(problem.initial) node.f = h(node) - result, bestf = RBFS(problem, node, infinity) + result, bestf = RBFS(problem, node, np.inf) return result def hill_climbing(problem): - """From the initial node, keep choosing the neighbor with highest value, - stopping when no neighbor is better. [Figure 4.2]""" + """ + [Figure 4.2] + From the initial node, keep choosing the neighbor with highest value, + stopping when no neighbor is better. 
+ """ current = Node(problem.initial) while True: neighbors = current.expand(problem) if not neighbors: break - neighbor = argmax_random_tie(neighbors, - key=lambda node: problem.value(node.state)) + neighbor = argmax_random_tie(neighbors, key=lambda node: problem.value(node.state)) if problem.value(neighbor.state) <= problem.value(current.state): break current = neighbor @@ -453,7 +652,7 @@ def hill_climbing(problem): def exp_schedule(k=20, lam=0.005, limit=100): """One possible schedule function for simulated annealing""" - return lambda t: (k * math.exp(-lam * t) if t < limit else 0) + return lambda t: (k * np.exp(-lam * t) if t < limit else 0) def simulated_annealing(problem, schedule=exp_schedule()): @@ -467,10 +666,29 @@ def simulated_annealing(problem, schedule=exp_schedule()): neighbors = current.expand(problem) if not neighbors: return current.state - next = random.choice(neighbors) - delta_e = problem.value(next.state) - problem.value(current.state) - if delta_e > 0 or probability(math.exp(delta_e / T)): - current = next + next_choice = random.choice(neighbors) + delta_e = problem.value(next_choice.state) - problem.value(current.state) + if delta_e > 0 or probability(np.exp(delta_e / T)): + current = next_choice + + +def simulated_annealing_full(problem, schedule=exp_schedule()): + """ This version returns all the states encountered in reaching + the goal state.""" + states = [] + current = Node(problem.initial) + for t in range(sys.maxsize): + states.append(current.state) + T = schedule(t) + if T == 0: + return states + neighbors = current.expand(problem) + if not neighbors: + return current.state + next_choice = random.choice(neighbors) + delta_e = problem.value(next_choice.state) - problem.value(current.state) + if delta_e > 0 or probability(np.exp(delta_e / T)): + current = next_choice def and_or_graph_search(problem): @@ -509,38 +727,38 @@ def and_search(states, problem, path): return or_search(problem.initial, problem, []) +# Pre-defined actions for 
PeakFindingProblem +directions4 = {'W': (-1, 0), 'N': (0, 1), 'E': (1, 0), 'S': (0, -1)} +directions8 = dict(directions4) +directions8.update({'NW': (-1, 1), 'NE': (1, 1), 'SE': (1, -1), 'SW': (-1, -1)}) + + class PeakFindingProblem(Problem): """Problem of finding the highest peak in a limited grid""" - def __init__(self, initial, grid): + def __init__(self, initial, grid, defined_actions=directions4): """The grid is a 2 dimensional array/list whose state is specified by tuple of indices""" - Problem.__init__(self, initial) + super().__init__(initial) self.grid = grid + self.defined_actions = defined_actions self.n = len(grid) assert self.n > 0 self.m = len(grid[0]) assert self.m > 0 def actions(self, state): - """Allows movement in only 4 directions""" - # TODO: Add flag to allow diagonal motion + """Returns the list of actions which are allowed to be taken from the given state""" allowed_actions = [] - if state[0] > 0: - allowed_actions.append('N') - if state[0] < self.n - 1: - allowed_actions.append('S') - if state[1] > 0: - allowed_actions.append('W') - if state[1] < self.m - 1: - allowed_actions.append('E') + for action in self.defined_actions: + next_state = vector_add(state, self.defined_actions[action]) + if 0 <= next_state[0] <= self.n - 1 and 0 <= next_state[1] <= self.m - 1: + allowed_actions.append(action) + return allowed_actions def result(self, state, action): """Moves in the direction specified by action""" - x, y = state - x = x + (1 if action == 'S' else (-1 if action == 'N' else 0)) - y = y + (1 if action == 'E' else (-1 if action == 'W' else 0)) - return (x, y) + return vector_add(state, self.defined_actions[action]) def value(self, state): """Value of a state is the value it is the index to""" @@ -551,18 +769,20 @@ def value(self, state): class OnlineDFSAgent: - - """[Figure 4.21] The abstract class for an OnlineDFSAgent. Override + """ + [Figure 4.21] + The abstract class for an OnlineDFSAgent. 
Override update_state method to convert percept to state. While initializing the subclass a problem needs to be provided which is an instance of - a subclass of the Problem class.""" + a subclass of the Problem class. + """ def __init__(self, problem): self.problem = problem self.s = None self.a = None - self.untried = defaultdict(list) - self.unbacktracked = defaultdict(list) + self.untried = dict() + self.unbacktracked = dict() self.result = {} def __call__(self, percept): @@ -581,13 +801,13 @@ def __call__(self, percept): self.a = None else: # else a <- an action b such that result[s', b] = POP(unbacktracked[s']) - unbacktracked_pop = self.unbacktracked[s1].pop(0) + unbacktracked_pop = self.unbacktracked.pop(s1) for (s, b) in self.result.keys(): if self.result[(s, b)] == unbacktracked_pop: self.a = b break else: - self.a = self.untried[s1].pop(0) + self.a = self.untried.pop(s1) self.s = s1 return self.a @@ -596,6 +816,7 @@ def update_state(self, percept): assumes the percept to be of type state.""" return percept + # ______________________________________________________________________________ @@ -606,15 +827,14 @@ class OnlineSearchProblem(Problem): Carried in a deterministic and a fully observable environment.""" def __init__(self, initial, goal, graph): - self.initial = initial - self.goal = goal + super().__init__(initial, goal) self.graph = graph def actions(self, state): - return self.graph.dict[state].keys() + return self.graph.graph_dict[state].keys() def output(self, state, action): - return self.graph.dict[state][action] + return self.graph.graph_dict[state][action] def h(self, state): """Returns least possible cost to reach a goal for the given state.""" @@ -634,10 +854,9 @@ def goal_test(self, state): class LRTAStarAgent: - """ [Figure 4.24] Abstract class for LRTA*-Agent. A problem needs to be - provided which is an instanace of a subclass of Problem Class. + provided which is an instance of a subclass of Problem Class. 
Takes a OnlineSearchProblem [Figure 4.23] as a problem. """ @@ -649,7 +868,7 @@ def __init__(self, problem): self.s = None self.a = None - def __call__(self, s1): # as of now s1 is a state rather than a percept + def __call__(self, s1): # as of now s1 is a state rather than a percept if self.problem.goal_test(s1): self.a = None return self.a @@ -661,11 +880,11 @@ def __call__(self, s1): # as of now s1 is a state rather than a percept # minimum cost for action b in problem.actions(s) self.H[self.s] = min(self.LRTA_cost(self.s, b, self.problem.output(self.s, b), - self.H) for b in self.problem.actions(self.s)) + self.H) for b in self.problem.actions(self.s)) # an action b in problem.actions(s1) that minimizes costs - self.a = argmin(self.problem.actions(s1), - key=lambda b: self.LRTA_cost(s1, b, self.problem.output(s1, b), self.H)) + self.a = min(self.problem.actions(s1), + key=lambda b: self.LRTA_cost(s1, b, self.problem.output(s1, b), self.H)) self.s = s1 return self.a @@ -684,11 +903,12 @@ def LRTA_cost(self, s, a, s1, H): except: return self.problem.c(s, a, s1) + self.problem.h(s1) + # ______________________________________________________________________________ # Genetic Algorithm -def genetic_search(problem, fitness_fn, ngen=1000, pmut=0.1, n=20): +def genetic_search(problem, ngen=1000, pmut=0.1, n=20): """Call genetic_algorithm on the appropriate parts of a problem. 
This requires the problem to have states that can mate and mutate, plus a value method that scores states.""" @@ -702,27 +922,28 @@ def genetic_search(problem, fitness_fn, ngen=1000, pmut=0.1, n=20): return genetic_algorithm(states[:n], problem.value, ngen, pmut) -def genetic_algorithm(population, fitness_fn, gene_pool=[0, 1], f_thres=None, ngen=1000, pmut=0.1): # noqa +def genetic_algorithm(population, fitness_fn, gene_pool=[0, 1], f_thres=None, ngen=1000, pmut=0.1): """[Figure 4.8]""" for i in range(ngen): - new_population = [] - random_selection = selection_chances(fitness_fn, population) - for j in range(len(population)): - x = random_selection() - y = random_selection() - child = reproduce(x, y) - if random.uniform(0, 1) < pmut: - child = mutate(child, gene_pool) - new_population.append(child) + population = [mutate(recombine(*select(2, population, fitness_fn)), gene_pool, pmut) + for i in range(len(population))] + + fittest_individual = fitness_threshold(fitness_fn, f_thres, population) + if fittest_individual: + return fittest_individual + + return max(population, key=fitness_fn) + - population = new_population +def fitness_threshold(fitness_fn, f_thres, population): + if not f_thres: + return None - if f_thres: - fittest_individual = argmax(population, key=fitness_fn) - if fitness_fn(fittest_individual) >= f_thres: - return fittest_individual + fittest_individual = max(population, key=fitness_fn) + if fitness_fn(fittest_individual) >= f_thres: + return fittest_individual - return argmax(population, key=fitness_fn) + return None def init_population(pop_number, gene_pool, state_length): @@ -739,25 +960,41 @@ def init_population(pop_number, gene_pool, state_length): return population -def selection_chances(fitness_fn, population): +def select(r, population, fitness_fn): fitnesses = map(fitness_fn, population) - return weighted_sampler(population, fitnesses) + sampler = weighted_sampler(population, fitnesses) + return [sampler() for i in range(r)] -def 
reproduce(x, y): +def recombine(x, y): n = len(x) - c = random.randrange(1, n) + c = random.randrange(0, n) return x[:c] + y[c:] -def mutate(x, gene_pool): +def recombine_uniform(x, y): + n = len(x) + result = [0] * n + indexes = random.sample(range(n), n) + for i in range(n): + ix = indexes[i] + result[ix] = x[ix] if i < n / 2 else y[ix] + + return ''.join(str(r) for r in result) + + +def mutate(x, gene_pool, pmut): + if random.uniform(0, 1) >= pmut: + return x + n = len(x) g = len(gene_pool) c = random.randrange(0, n) r = random.randrange(0, g) new_gene = gene_pool[r] - return x[:c] + [new_gene] + x[c+1:] + return x[:c] + [new_gene] + x[c + 1:] + # _____________________________________________________________________________ # The remainder of this file implements examples for the search algorithms. @@ -767,30 +1004,29 @@ def mutate(x, gene_pool): class Graph: - - """A graph connects nodes (verticies) by edges (links). Each edge can also - have a length associated with it. The constructor call is something like: + """A graph connects nodes (vertices) by edges (links). Each edge can also + have a length associated with it. The constructor call is something like: g = Graph({'A': {'B': 1, 'C': 2}) this makes a graph with 3 nodes, A, B, and C, with an edge of length 1 from - A to B, and an edge of length 2 from A to C. You can also do: + A to B, and an edge of length 2 from A to C. You can also do: g = Graph({'A': {'B': 1, 'C': 2}, directed=False) This makes an undirected graph, so inverse links are also added. The graph stays undirected; if you add more links with g.connect('B', 'C', 3), then - inverse link is also added. You can use g.nodes() to get a list of nodes, + inverse link is also added. You can use g.nodes() to get a list of nodes, g.get('A') to get a dict of links out of A, and g.get('A', 'B') to get the - length of the link from A to B. 'Lengths' can actually be any object at + length of the link from A to B. 
'Lengths' can actually be any object at all, and nodes can be any hashable object.""" - def __init__(self, dict=None, directed=True): - self.dict = dict or {} + def __init__(self, graph_dict=None, directed=True): + self.graph_dict = graph_dict or {} self.directed = directed if not directed: self.make_undirected() def make_undirected(self): """Make a digraph into an undirected graph by adding symmetric edges.""" - for a in list(self.dict.keys()): - for (b, dist) in self.dict[a].items(): + for a in list(self.graph_dict.keys()): + for (b, dist) in self.graph_dict[a].items(): self.connect1(b, a, dist) def connect(self, A, B, distance=1): @@ -802,13 +1038,13 @@ def connect(self, A, B, distance=1): def connect1(self, A, B, distance): """Add a link from A to B of given distance, in one direction only.""" - self.dict.setdefault(A, {})[B] = distance + self.graph_dict.setdefault(A, {})[B] = distance def get(self, a, b=None): """Return a link distance or a dict of {node: distance} entries. .get(a,b) returns the distance or None; .get(a) returns a dict of {node: distance} entries, possibly {}.""" - links = self.dict.setdefault(a, {}) + links = self.graph_dict.setdefault(a, {}) if b is None: return links else: @@ -816,12 +1052,15 @@ def get(self, a, b=None): def nodes(self): """Return a list of nodes in the graph.""" - return list(self.dict.keys()) + s1 = set([k for k in self.graph_dict.keys()]) + s2 = set([k2 for v in self.graph_dict.values() for k2, v2 in v.items()]) + nodes = s1.union(s2) + return list(nodes) -def UndirectedGraph(dict=None): +def UndirectedGraph(graph_dict=None): """Build a Graph where every edge (including future ones) goes both ways.""" - return Graph(dict=dict, directed=False) + return Graph(graph_dict=graph_dict, directed=False) def RandomGraph(nodes=list(range(10)), min_links=2, width=400, height=300, @@ -845,9 +1084,10 @@ def RandomGraph(nodes=list(range(10)), min_links=2, width=400, height=300, def distance_to_node(n): if n is node or g.get(node, n): 
- return infinity + return np.inf return distance(g.locations[n], here) - neighbor = argmin(nodes, key=distance_to_node) + + neighbor = min(nodes, key=distance_to_node) d = distance(g.locations[neighbor], here) * curvature() g.connect(node, neighbor, int(d)) return g @@ -893,7 +1133,7 @@ def distance_to_node(n): 7 - CCL Clean Clean Left 8 - CCR Clean Clean Right """ -vacumm_world = Graph(dict( +vacuum_world = Graph(dict( State_1=dict(Suck=['State_7', 'State_5'], Right=['State_2']), State_2=dict(Suck=['State_8', 'State_4'], Left=['State_2']), State_3=dict(Suck=['State_7'], Right=['State_4']), @@ -902,7 +1142,7 @@ def distance_to_node(n): State_6=dict(Suck=['State_8'], Left=['State_5']), State_7=dict(Suck=['State_7', 'State_3'], Right=['State_8']), State_8=dict(Suck=['State_8', 'State_6'], Left=['State_7']) - )) +)) """ [Figure 4.23] One-dimensional state space Graph @@ -914,7 +1154,7 @@ def distance_to_node(n): State_4=dict(Right='State_5', Left='State_3'), State_5=dict(Right='State_6', Left='State_4'), State_6=dict(Left='State_5') - )) +)) one_dim_state_space.least_costs = dict( State_1=8, State_2=9, @@ -937,11 +1177,10 @@ def distance_to_node(n): class GraphProblem(Problem): - """The problem of searching a graph from one node to another.""" def __init__(self, initial, goal, graph): - Problem.__init__(self, initial, goal) + super().__init__(initial, goal) self.graph = graph def actions(self, A): @@ -953,12 +1192,12 @@ def result(self, state, action): return action def path_cost(self, cost_so_far, A, action, B): - return cost_so_far + (self.graph.get(A, B) or infinity) + return cost_so_far + (self.graph.get(A, B) or np.inf) def find_min_edge(self): """Find minimum value of edges.""" - m = infinity - for d in self.graph.dict.values(): + m = np.inf + for d in self.graph.graph_dict.values(): local_min = min(d.values()) m = min(m, local_min) @@ -973,7 +1212,7 @@ def h(self, node): return int(distance(locs[node.state], locs[self.goal])) else: - return infinity + return 
np.inf class GraphProblemStochastic(GraphProblem): @@ -996,35 +1235,34 @@ def path_cost(self): class NQueensProblem(Problem): - """The problem of placing N queens on an NxN board with none attacking - each other. A state is represented as an N-element array, where + each other. A state is represented as an N-element array, where a value of r in the c-th entry means there is a queen at column c, - row r, and a value of None means that the c-th column has not been - filled in yet. We fill in columns left to right. + row r, and a value of -1 means that the c-th column has not been + filled in yet. We fill in columns left to right. >>> depth_first_tree_search(NQueensProblem(8)) - + """ def __init__(self, N): + super().__init__(tuple([-1] * N)) self.N = N - self.initial = [None] * N def actions(self, state): """In the leftmost empty column, try all non-conflicting rows.""" - if state[-1] is not None: + if state[-1] != -1: return [] # All columns filled; no successors else: - col = state.index(None) + col = state.index(-1) return [row for row in range(self.N) if not self.conflicted(state, row, col)] def result(self, state, row): """Place the next queen at the given row.""" - col = state.index(None) - new = state[:] + col = state.index(-1) + new = list(state[:]) new[col] = row - return new + return tuple(new) def conflicted(self, state, row, col): """Would placing a queen at (row, col) conflict with anything?""" @@ -1036,15 +1274,26 @@ def conflict(self, row1, col1, row2, col2): return (row1 == row2 or # same row col1 == col2 or # same column row1 - col1 == row2 - col2 or # same \ diagonal - row1 + col1 == row2 + col2) # same / diagonal + row1 + col1 == row2 + col2) # same / diagonal def goal_test(self, state): """Check if all columns filled, no conflicts.""" - if state[-1] is None: + if state[-1] == -1: return False return not any(self.conflicted(state, state[col], col) for col in range(len(state))) + def h(self, node): + """Return number of conflicting queens for a 
given node""" + num_conflicts = 0 + for (r1, c1) in enumerate(node.state): + for (r2, c2) in enumerate(node.state): + if (r1, c1) != (r2, c2): + num_conflicts += self.conflict(r1, c1, r2, c2) + + return num_conflicts + + # ______________________________________________________________________________ # Inverse Boggle: Search for a high-scoring Boggle board. A good domain for # iterative-repair and related search techniques, as suggested by Justin Boyan. @@ -1065,6 +1314,7 @@ def random_boggle(n=4): random.shuffle(cubes) return list(map(random.choice, cubes)) + # The best 5x5 board found by Boyan, with our word list this board scores # 2274 words, for a score of 9837 @@ -1099,7 +1349,7 @@ def boggle_neighbors(n2, cache={}): on_top = i < n on_bottom = i >= n2 - n on_left = i % n == 0 - on_right = (i+1) % n == 0 + on_right = (i + 1) % n == 0 if not on_top: neighbors[i].append(i - n) if not on_left: @@ -1122,15 +1372,15 @@ def boggle_neighbors(n2, cache={}): def exact_sqrt(n2): """If n2 is a perfect square, return its square root, else raise error.""" - n = int(math.sqrt(n2)) + n = int(np.sqrt(n2)) assert n * n == n2 return n + # _____________________________________________________________________________ class Wordlist: - """This class holds a list of words. 
You can use (word in wordlist) to check if a word is in the list, or wordlist.lookup(prefix) to see if prefix starts any of the words in the list.""" @@ -1165,11 +1415,11 @@ def __contains__(self, word): def __len__(self): return len(self.words) + # _____________________________________________________________________________ class BoggleFinder: - """A class that allows you to find all the words in a Boggle board.""" wordlist = None # A class variable, holding a wordlist @@ -1226,6 +1476,7 @@ def __len__(self): """The number of words found.""" return len(self.found) + # _____________________________________________________________________________ @@ -1257,13 +1508,13 @@ def mutate_boggle(board): board[i] = random.choice(random.choice(cubes16)) return i, oldc + # ______________________________________________________________________________ # Code to compare searchers on various problems. class InstrumentedProblem(Problem): - """Delegates to a problem, and keeps statistics.""" def __init__(self, problem): @@ -1302,7 +1553,7 @@ def __repr__(self): def compare_searchers(problems, header, searchers=[breadth_first_tree_search, - breadth_first_search, + breadth_first_graph_search, depth_first_graph_search, iterative_deepening_search, depth_limited_search, @@ -1311,6 +1562,7 @@ def do(searcher, problem): p = InstrumentedProblem(problem) searcher(p) return p + table = [[name(s)] + [do(s, p) for p in problems] for s in searchers] print_table(table, header) diff --git a/search4e.ipynb b/search4e.ipynb new file mode 100644 index 000000000..7c636f2e7 --- /dev/null +++ b/search4e.ipynb @@ -0,0 +1,2652 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Search for AIMA 4th edition\n", + "\n", + "Implementation of search algorithms and search problems for AIMA.\n", + "\n", + "# Problems and Nodes\n", + "\n", + "We start by defining the abstract class for a `Problem`; specific problem domains will subclass this. 
To make it easier for algorithms that use a heuristic evaluation function, `Problem` has a default `h` function (uniformly zero), and subclasses can define their own default `h` function.\n",
+ "\n",
+ "We also define a `Node` in a search tree, and some functions on nodes: `expand` to generate successors; `path_actions` and `path_states` to recover aspects of the path from the node. "
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 93,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "%matplotlib inline\n",
+ "import matplotlib.pyplot as plt\n",
+ "import copy\n",
+ "import random\n",
+ "import heapq\n",
+ "import math\n",
+ "import sys\n",
+ "from collections import defaultdict, deque, Counter\n",
+ "from itertools import combinations\n",
+ "\n",
+ "\n",
+ "class Problem(object):\n",
+ " \"\"\"The abstract class for a formal problem. A new domain subclasses this,\n",
+ " overriding `actions` and `result`, and perhaps other methods.\n",
+ " The default heuristic is 0 and the default action cost is 1 for all states.\n",
+ " When you create an instance of a subclass, specify `initial` and `goal` states \n",
+ " (or give an `is_goal` method) and perhaps other keyword args for the subclass.\"\"\"\n",
+ "\n",
+ " def __init__(self, initial=None, goal=None, **kwds): \n",
+ " self.__dict__.update(initial=initial, goal=goal, **kwds) \n",
+ " \n",
+ " def actions(self, state): raise NotImplementedError\n",
+ " def result(self, state, action): raise NotImplementedError\n",
+ " def is_goal(self, state): return state == self.goal\n",
+ " def action_cost(self, s, a, s1): return 1\n",
+ " def h(self, node): return 0\n",
+ " \n",
+ " def __str__(self):\n",
+ " return '{}({!r}, {!r})'.format(\n",
+ " type(self).__name__, self.initial, self.goal)\n",
+ " \n",
+ "\n",
+ "class Node:\n",
+ " \"A Node in a search tree.\"\n",
+ " def __init__(self, state, parent=None, action=None, path_cost=0):\n",
+ " self.__dict__.update(state=state, parent=parent, action=action, path_cost=path_cost)\n",
+ "\n", + " def __repr__(self): return '<{}>'.format(self.state)\n", + " def __len__(self): return 0 if self.parent is None else (1 + len(self.parent))\n", + " def __lt__(self, other): return self.path_cost < other.path_cost\n", + " \n", + " \n", + "failure = Node('failure', path_cost=math.inf) # Indicates an algorithm couldn't find a solution.\n", + "cutoff = Node('cutoff', path_cost=math.inf) # Indicates iterative deepening search was cut off.\n", + " \n", + " \n", + "def expand(problem, node):\n", + " \"Expand a node, generating the children nodes.\"\n", + " s = node.state\n", + " for action in problem.actions(s):\n", + " s1 = problem.result(s, action)\n", + " cost = node.path_cost + problem.action_cost(s, action, s1)\n", + " yield Node(s1, node, action, cost)\n", + " \n", + "\n", + "def path_actions(node):\n", + " \"The sequence of actions to get to this node.\"\n", + " if node.parent is None:\n", + " return [] \n", + " return path_actions(node.parent) + [node.action]\n", + "\n", + "\n", + "def path_states(node):\n", + " \"The sequence of states to get to this node.\"\n", + " if node in (cutoff, failure, None): \n", + " return []\n", + " return path_states(node.parent) + [node.state]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Queues\n", + "\n", + "First-in-first-out and Last-in-first-out queues, and a `PriorityQueue`, which allows you to keep a collection of items, and continually remove from it the item with minimum `f(item)` score." 
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 2,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "FIFOQueue = deque\n",
+ "\n",
+ "LIFOQueue = list\n",
+ "\n",
+ "class PriorityQueue:\n",
+ " \"\"\"A queue in which the item with minimum f(item) is always popped first.\"\"\"\n",
+ "\n",
+ " def __init__(self, items=(), key=lambda x: x): \n",
+ " self.key = key\n",
+ " self.items = [] # a heap of (score, item) pairs\n",
+ " for item in items:\n",
+ " self.add(item)\n",
+ " \n",
+ " def add(self, item):\n",
+ " \"\"\"Add item to the queue.\"\"\"\n",
+ " pair = (self.key(item), item)\n",
+ " heapq.heappush(self.items, pair)\n",
+ "\n",
+ " def pop(self):\n",
+ " \"\"\"Pop and return the item with min f(item) value.\"\"\"\n",
+ " return heapq.heappop(self.items)[1]\n",
+ " \n",
+ " def top(self): return self.items[0][1]\n",
+ "\n",
+ " def __len__(self): return len(self.items)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Search Algorithms: Best-First\n",
+ "\n",
+ "Best-first search with various *f(n)* functions gives us different search algorithms. Note that A\\*, weighted A\\* and greedy search can be given a heuristic function, `h`, but if `h` is not supplied they use the problem's default `h` function (if the problem does not define one, it is taken as *h(n)* = 0)."
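Reviewer note: a quick standalone check of the `PriorityQueue` added above; the class is restated here verbatim so the sketch runs on its own, outside the notebook.

```python
import heapq

class PriorityQueue:
    """A queue in which the item with minimum f(item) is always popped first."""

    def __init__(self, items=(), key=lambda x: x):
        self.key = key
        self.items = []  # a heap of (score, item) pairs
        for item in items:
            self.add(item)

    def add(self, item):
        """Add item to the queue."""
        heapq.heappush(self.items, (self.key(item), item))

    def pop(self):
        """Pop and return the item with min f(item) value."""
        return heapq.heappop(self.items)[1]

    def top(self):
        return self.items[0][1]

    def __len__(self):
        return len(self.items)

# Items come out in order of increasing key, regardless of insertion order.
q = PriorityQueue([3, -7, 2], key=abs)
print([q.pop() for _ in range(len(q))])  # [2, 3, -7]
```

One subtlety: when two items have equal scores, `heapq` falls back to comparing the items themselves, which is why the notebook's `Node` defines `__lt__`; items that are neither distinguishable by key nor comparable would raise a `TypeError` on push.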
+ ] + }, + { + "cell_type": "code", + "execution_count": 356, + "metadata": {}, + "outputs": [], + "source": [ + "def best_first_search(problem, f):\n", + " \"Search nodes with minimum f(node) value first.\"\n", + " node = Node(problem.initial)\n", + " frontier = PriorityQueue([node], key=f)\n", + " reached = {problem.initial: node}\n", + " while frontier:\n", + " node = frontier.pop()\n", + " if problem.is_goal(node.state):\n", + " return node\n", + " for child in expand(problem, node):\n", + " s = child.state\n", + " if s not in reached or child.path_cost < reached[s].path_cost:\n", + " reached[s] = child\n", + " frontier.add(child)\n", + " return failure\n", + "\n", + "\n", + "def best_first_tree_search(problem, f):\n", + " \"A version of best_first_search without the `reached` table.\"\n", + " frontier = PriorityQueue([Node(problem.initial)], key=f)\n", + " while frontier:\n", + " node = frontier.pop()\n", + " if problem.is_goal(node.state):\n", + " return node\n", + " for child in expand(problem, node):\n", + " if not is_cycle(child):\n", + " frontier.add(child)\n", + " return failure\n", + "\n", + "\n", + "def g(n): return n.path_cost\n", + "\n", + "\n", + "def astar_search(problem, h=None):\n", + " \"\"\"Search nodes with minimum f(n) = g(n) + h(n).\"\"\"\n", + " h = h or problem.h\n", + " return best_first_search(problem, f=lambda n: g(n) + h(n))\n", + "\n", + "\n", + "def astar_tree_search(problem, h=None):\n", + " \"\"\"Search nodes with minimum f(n) = g(n) + h(n), with no `reached` table.\"\"\"\n", + " h = h or problem.h\n", + " return best_first_tree_search(problem, f=lambda n: g(n) + h(n))\n", + "\n", + "\n", + "def weighted_astar_search(problem, h=None, weight=1.4):\n", + " \"\"\"Search nodes with minimum f(n) = g(n) + weight * h(n).\"\"\"\n", + " h = h or problem.h\n", + " return best_first_search(problem, f=lambda n: g(n) + weight * h(n))\n", + "\n", + " \n", + "def greedy_bfs(problem, h=None):\n", + " \"\"\"Search nodes with minimum h(n).\"\"\"\n", 
+ " h = h or problem.h\n", + " return best_first_search(problem, f=h)\n", + "\n", + "\n", + "def uniform_cost_search(problem):\n", + " \"Search nodes with minimum path cost first.\"\n", + " return best_first_search(problem, f=g)\n", + "\n", + "\n", + "def breadth_first_bfs(problem):\n", + " \"Search shallowest nodes in the search tree first; using best-first.\"\n", + " return best_first_search(problem, f=len)\n", + "\n", + "\n", + "def depth_first_bfs(problem):\n", + " \"Search deepest nodes in the search tree first; using best-first.\"\n", + " return best_first_search(problem, f=lambda n: -len(n))\n", + "\n", + "\n", + "def is_cycle(node, k=30):\n", + " \"Does this node form a cycle of length k or less?\"\n", + " def find_cycle(ancestor, k):\n", + " return (ancestor is not None and k > 0 and\n", + " (ancestor.state == node.state or find_cycle(ancestor.parent, k - 1)))\n", + " return find_cycle(node.parent, k)\n", + "\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Other Search Algorithms\n", + "\n", + "Here are the other search algorithms:" + ] + }, + { + "cell_type": "code", + "execution_count": 234, + "metadata": {}, + "outputs": [], + "source": [ + "def breadth_first_search(problem):\n", + " \"Search shallowest nodes in the search tree first.\"\n", + " node = Node(problem.initial)\n", + " if problem.is_goal(problem.initial):\n", + " return node\n", + " frontier = FIFOQueue([node])\n", + " reached = {problem.initial}\n", + " while frontier:\n", + " node = frontier.pop()\n", + " for child in expand(problem, node):\n", + " s = child.state\n", + " if problem.is_goal(s):\n", + " return child\n", + " if s not in reached:\n", + " reached.add(s)\n", + " frontier.appendleft(child)\n", + " return failure\n", + "\n", + "\n", + "def iterative_deepening_search(problem):\n", + " \"Do depth-limited search with increasing depth limits.\"\n", + " for limit in range(1, sys.maxsize):\n", + " result = depth_limited_search(problem, limit)\n", + " if 
result != cutoff:\n", + " return result\n", + " \n", + " \n", + "def depth_limited_search(problem, limit=10):\n", + " \"Search deepest nodes in the search tree first.\"\n", + " frontier = LIFOQueue([Node(problem.initial)])\n", + " result = failure\n", + " while frontier:\n", + " node = frontier.pop()\n", + " if problem.is_goal(node.state):\n", + " return node\n", + " elif len(node) >= limit:\n", + " result = cutoff\n", + " elif not is_cycle(node):\n", + " for child in expand(problem, node):\n", + " frontier.append(child)\n", + " return result\n", + "\n", + "\n", + "def depth_first_recursive_search(problem, node=None):\n", + " if node is None: \n", + " node = Node(problem.initial)\n", + " if problem.is_goal(node.state):\n", + " return node\n", + " elif is_cycle(node):\n", + " return failure\n", + " else:\n", + " for child in expand(problem, node):\n", + " result = depth_first_recursive_search(problem, child)\n", + " if result:\n", + " return result\n", + " return failure" + ] + }, + { + "cell_type": "code", + "execution_count": 236, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "['N', 'I', 'V', 'U', 'B', 'F', 'S', 'O', 'Z', 'A', 'T', 'L']" + ] + }, + "execution_count": 236, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "path_states(depth_first_recursive_search(r2))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Bidirectional Best-First Search" + ] + }, + { + "cell_type": "code", + "execution_count": 412, + "metadata": {}, + "outputs": [], + "source": [ + "def bidirectional_best_first_search(problem_f, f_f, problem_b, f_b, terminated):\n", + " node_f = Node(problem_f.initial)\n", + " node_b = Node(problem_f.goal)\n", + " frontier_f, reached_f = PriorityQueue([node_f], key=f_f), {node_f.state: node_f}\n", + " frontier_b, reached_b = PriorityQueue([node_b], key=f_b), {node_b.state: node_b}\n", + " solution = failure\n", + " while frontier_f and frontier_b and not terminated(solution, 
frontier_f, frontier_b):\n",
+ " def S1(node, f):\n",
+ " return str(int(f(node))) + ' ' + str(path_states(node))\n",
+ " print('Bi:', S1(frontier_f.top(), f_f), S1(frontier_b.top(), f_b))\n",
+ " if f_f(frontier_f.top()) < f_b(frontier_b.top()):\n",
+ " solution = proceed('f', problem_f, frontier_f, reached_f, reached_b, solution)\n",
+ " else:\n",
+ " solution = proceed('b', problem_b, frontier_b, reached_b, reached_f, solution)\n",
+ " return solution\n",
+ "\n",
+ "def inverse_problem(problem):\n",
+ " if isinstance(problem, CountCalls):\n",
+ " return CountCalls(inverse_problem(problem._object))\n",
+ " else:\n",
+ " inv = copy.copy(problem)\n",
+ " inv.initial, inv.goal = inv.goal, inv.initial\n",
+ " return inv"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 413,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "def bidirectional_uniform_cost_search(problem_f):\n",
+ " def terminated(solution, frontier_f, frontier_b):\n",
+ " n_f, n_b = frontier_f.top(), frontier_b.top()\n",
+ " return g(n_f) + g(n_b) > g(solution)\n",
+ " return bidirectional_best_first_search(problem_f, g, inverse_problem(problem_f), g, terminated)\n",
+ "\n",
+ "def bidirectional_astar_search(problem_f):\n",
+ " def terminated(solution, frontier_f, frontier_b):\n",
+ " nf, nb = frontier_f.top(), frontier_b.top()\n",
+ " return g(nf) + g(nb) > g(solution)\n",
+ " problem_b = inverse_problem(problem_f)\n",
+ " return bidirectional_best_first_search(problem_f, lambda n: g(n) + problem_f.h(n),\n",
+ " problem_b, lambda n: g(n) + problem_b.h(n), \n",
+ " terminated)\n",
+ " \n",
+ "\n",
+ "def proceed(direction, problem, frontier, reached, reached2, solution):\n",
+ " node = frontier.pop()\n",
+ " for child in expand(problem, node):\n",
+ " s = child.state\n",
+ " print('proceed', direction, S(child))\n",
+ " if s not in reached or child.path_cost < reached[s].path_cost:\n",
+ " frontier.add(child)\n",
+ " reached[s] = child\n",
+ " if s in reached2: # Frontiers collide; solution
found\n", + " solution2 = (join_nodes(child, reached2[s]) if direction == 'f' else\n", + " join_nodes(reached2[s], child))\n", + " #print('solution', path_states(solution2), solution2.path_cost, \n", + " # path_states(child), path_states(reached2[s]))\n", + " if solution2.path_cost < solution.path_cost:\n", + " solution = solution2\n", + " return solution\n", + "\n", + "S = path_states\n", + "\n", + "#A-S-R + B-P-R => A-S-R-P + B-P\n", + "def join_nodes(nf, nb):\n", + " \"\"\"Join the reverse of the backward node nb to the forward node nf.\"\"\"\n", + " #print('join', S(nf), S(nb))\n", + " join = nf\n", + " while nb.parent is not None:\n", + " cost = join.path_cost + nb.path_cost - nb.parent.path_cost\n", + " join = Node(nb.parent.state, join, nb.action, cost)\n", + " nb = nb.parent\n", + " #print(' now join', S(join), 'with nb', S(nb), 'parent', S(nb.parent))\n", + " return join\n", + " \n", + " " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "#A , B = uniform_cost_search(r1), uniform_cost_search(r2)\n", + "#path_states(A), path_states(B)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "#path_states(append_nodes(A, B))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# TODO: RBFS" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Problem Domains\n", + "\n", + "Now we turn our attention to defining some problem domains as subclasses of `Problem`." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Route Finding Problems\n", + "\n", + "![](romania.png)\n", + "\n", + "In a `RouteProblem`, the states are names of \"cities\" (or other locations), like `'A'` for Arad. The actions are also city names; `'Z'` is the action to move to city `'Z'`. 
The layout of cities is given by a separate data structure, a `Map`, which is a graph where there are vertexes (cities), links between vertexes, distances (costs) of those links (if not specified, the default is 1 for every link), and optionally the 2D (x, y) location of each city can be specified. A `RouteProblem` takes this `Map` as input and allows actions to move between linked cities. The default heuristic is straight-line distance to the goal, or is uniformly zero if locations were not given."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 398,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "class RouteProblem(Problem):\n",
+ " \"\"\"A problem to find a route between locations on a `Map`.\n",
+ " Create a problem with RouteProblem(start, goal, map=Map(...)).\n",
+ " States are the vertexes in the Map graph; actions are destination states.\"\"\"\n",
+ " \n",
+ " def actions(self, state): \n",
+ " \"\"\"The places neighboring `state`.\"\"\"\n",
+ " return self.map.neighbors[state]\n",
+ " \n",
+ " def result(self, state, action):\n",
+ " \"\"\"Go to the `action` place, if the map says that is possible.\"\"\"\n",
+ " return action if action in self.map.neighbors[state] else state\n",
+ " \n",
+ " def action_cost(self, s, action, s1):\n",
+ " \"\"\"The distance (cost) to go from s to s1.\"\"\"\n",
+ " return self.map.distances[s, s1]\n",
+ " \n",
+ " def h(self, node):\n",
+ " \"Straight-line distance between state and the goal.\"\n",
+ " locs = self.map.locations\n",
+ " return straight_line_distance(locs[node.state], locs[self.goal])\n",
+ " \n",
+ " \n",
+ "def straight_line_distance(A, B):\n",
+ " \"Straight-line distance between two points.\"\n",
+ " return sum(abs(a - b)**2 for (a, b) in zip(A, B)) ** 0.5"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 16,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "class Map:\n",
+ " \"\"\"A map of places in a 2D world: a graph with vertexes and links between them.
\n", + " In `Map(links, locations)`, `links` can be either [(v1, v2)...] pairs, \n", + " or a {(v1, v2): distance...} dict. Optional `locations` can be {v1: (x, y)} \n", + " If `directed=False` then for every (v1, v2) link, we add a (v2, v1) link.\"\"\"\n", + "\n", + " def __init__(self, links, locations=None, directed=False):\n", + " if not hasattr(links, 'items'): # Distances are 1 by default\n", + " links = {link: 1 for link in links}\n", + " if not directed:\n", + " for (v1, v2) in list(links):\n", + " links[v2, v1] = links[v1, v2]\n", + " self.distances = links\n", + " self.neighbors = multimap(links)\n", + " self.locations = locations or defaultdict(lambda: (0, 0))\n", + "\n", + " \n", + "def multimap(pairs) -> dict:\n", + " \"Given (key, val) pairs, make a dict of {key: [val,...]}.\"\n", + " result = defaultdict(list)\n", + " for key, val in pairs:\n", + " result[key].append(val)\n", + " return result" + ] + }, + { + "cell_type": "code", + "execution_count": 400, + "metadata": {}, + "outputs": [], + "source": [ + "# Some specific RouteProblems\n", + "\n", + "romania = Map(\n", + " {('O', 'Z'): 71, ('O', 'S'): 151, ('A', 'Z'): 75, ('A', 'S'): 140, ('A', 'T'): 118, \n", + " ('L', 'T'): 111, ('L', 'M'): 70, ('D', 'M'): 75, ('C', 'D'): 120, ('C', 'R'): 146, \n", + " ('C', 'P'): 138, ('R', 'S'): 80, ('F', 'S'): 99, ('B', 'F'): 211, ('B', 'P'): 101, \n", + " ('B', 'G'): 90, ('B', 'U'): 85, ('H', 'U'): 98, ('E', 'H'): 86, ('U', 'V'): 142, \n", + " ('I', 'V'): 92, ('I', 'N'): 87, ('P', 'R'): 97},\n", + " {'A': ( 76, 497), 'B': (400, 327), 'C': (246, 285), 'D': (160, 296), 'E': (558, 294), \n", + " 'F': (285, 460), 'G': (368, 257), 'H': (548, 355), 'I': (488, 535), 'L': (162, 379),\n", + " 'M': (160, 343), 'N': (407, 561), 'O': (117, 580), 'P': (311, 372), 'R': (227, 412),\n", + " 'S': (187, 463), 'T': ( 83, 414), 'U': (471, 363), 'V': (535, 473), 'Z': (92, 539)})\n", + "\n", + "\n", + "r0 = RouteProblem('A', 'A', map=romania)\n", + "r1 = RouteProblem('A', 'B', 
map=romania)\n",
+ "r2 = RouteProblem('N', 'L', map=romania)\n",
+ "r3 = RouteProblem('E', 'T', map=romania)\n",
+ "r4 = RouteProblem('O', 'M', map=romania)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 232,
+ "metadata": {},
+ "outputs": [
+ {
+ "data": {
+ "text/plain": [
+ "['A', 'S', 'R', 'P', 'B']"
+ ]
+ },
+ "execution_count": 232,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
+ "source": [
+ "path_states(uniform_cost_search(r1)) # Lowest-cost path from Arad to Bucharest"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 233,
+ "metadata": {},
+ "outputs": [
+ {
+ "data": {
+ "text/plain": [
+ "['A', 'S', 'F', 'B']"
+ ]
+ },
+ "execution_count": 233,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
+ "source": [
+ "path_states(breadth_first_search(r1)) # Breadth-first: fewer steps, higher path cost"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Grid Problems\n",
+ "\n",
+ "A `GridProblem` involves navigating on a 2D grid, with some cells being impassable obstacles. By default you can move to any of the eight neighboring cells that are not obstacles (but in a problem instance you can supply a `directions=` keyword to change that). Again, the default heuristic is straight-line distance to the goal. States are `(x, y)` cell locations, such as `(4, 2)`, and actions are `(dx, dy)` cell movements, such as `(0, -1)`, which means leave the `x` coordinate alone, and decrement the `y` coordinate by 1."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 20,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "class GridProblem(Problem):\n",
+ " \"\"\"Finding a path on a 2D grid with obstacles.
Obstacles are (x, y) cells.\"\"\"\n", + "\n", + " def __init__(self, initial=(15, 30), goal=(130, 30), obstacles=(), **kwds):\n", + " Problem.__init__(self, initial=initial, goal=goal, \n", + " obstacles=set(obstacles) - {initial, goal}, **kwds)\n", + "\n", + " directions = [(-1, -1), (0, -1), (1, -1),\n", + " (-1, 0), (1, 0),\n", + " (-1, +1), (0, +1), (1, +1)]\n", + " \n", + " def action_cost(self, s, action, s1): return straight_line_distance(s, s1)\n", + " \n", + " def h(self, node): return straight_line_distance(node.state, self.goal)\n", + " \n", + " def result(self, state, action): \n", + " \"Both states and actions are represented by (x, y) pairs.\"\n", + " return action if action not in self.obstacles else state\n", + " \n", + " def actions(self, state):\n", + " \"\"\"You can move one cell in any of `directions` to a non-obstacle cell.\"\"\"\n", + " x, y = state\n", + " return {(x + dx, y + dy) for (dx, dy) in self.directions} - self.obstacles\n", + " \n", + "class ErraticVacuum(Problem):\n", + " def actions(self, state): \n", + " return ['suck', 'forward', 'backward']\n", + " \n", + " def results(self, state, action): return self.table[action][state]\n", + " \n", + " table = dict(suck= {1:{5,7}, 2:{4,8}, 3:{7}, 4:{2,4}, 5:{1,5}, 6:{8}, 7:{3,7}, 8:{6,8}},\n", + " forward= {1:{2}, 2:{2}, 3:{4}, 4:{4}, 5:{6}, 6:{6}, 7:{8}, 8:{8}},\n", + " backward={1:{1}, 2:{1}, 3:{3}, 4:{3}, 5:{5}, 6:{5}, 7:{7}, 8:{7}})" + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": {}, + "outputs": [], + "source": [ + "# Some grid routing problems\n", + "\n", + "# The following can be used to create obstacles:\n", + " \n", + "def random_lines(X=range(15, 130), Y=range(60), N=150, lengths=range(6, 12)):\n", + " \"\"\"The set of cells in N random lines of the given lengths.\"\"\"\n", + " result = set()\n", + " for _ in range(N):\n", + " x, y = random.choice(X), random.choice(Y)\n", + " dx, dy = random.choice(((0, 1), (1, 0)))\n", + " result |= line(x, y, dx, dy, 
random.choice(lengths))\n",
+ " return result\n",
+ "\n",
+ "def line(x, y, dx, dy, length):\n",
+ " \"\"\"A line of `length` cells starting at (x, y) and going in (dx, dy) direction.\"\"\"\n",
+ " return {(x + i * dx, y + i * dy) for i in range(length)}\n",
+ "\n",
+ "random.seed(42) # To make this reproducible\n",
+ "\n",
+ "frame = line(-10, 20, 0, 1, 20) | line(150, 20, 0, 1, 20)\n",
+ "cup = line(102, 44, -1, 0, 15) | line(102, 20, -1, 0, 20) | line(102, 44, 0, -1, 24)\n",
+ "\n",
+ "d1 = GridProblem(obstacles=random_lines(N=100) | frame)\n",
+ "d2 = GridProblem(obstacles=random_lines(N=150) | frame)\n",
+ "d3 = GridProblem(obstacles=random_lines(N=200) | frame)\n",
+ "d4 = GridProblem(obstacles=random_lines(N=250) | frame)\n",
+ "d5 = GridProblem(obstacles=random_lines(N=300) | frame)\n",
+ "d6 = GridProblem(obstacles=cup | frame)\n",
+ "d7 = GridProblem(obstacles=cup | frame | line(50, 35, 0, -1, 10) | line(60, 37, 0, -1, 17) | line(70, 31, 0, -1, 19))"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# 8 Puzzle Problems\n",
+ "\n",
+ "![](https://ece.uwaterloo.ca/~dwharder/aads/Algorithms/N_puzzles/images/puz3.png)\n",
+ "\n",
+ "A sliding tile puzzle where you can swap the blank with an adjacent piece, trying to reach a goal configuration. The cells are numbered 0 to 8, starting at the top left and going row by row left to right. The pieces are numbered 1 to 8, with 0 representing the blank. An action is the cell index number that is to be swapped with the blank (*not* the actual number to be swapped but the index into the state). So the diagram above left is the state `(5, 2, 7, 8, 4, 0, 1, 3, 6)`, and the action is `8`, because the cell number 8 (the 9th or last cell, the `6` in the bottom right) is swapped with the blank.\n",
+ "\n",
+ "There are two disjoint sets of states that cannot be reached from each other. One set has an even number of \"inversions\"; the other has an odd number.
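Reviewer note: the solvability claim about inversion parity can be verified with a small standalone sketch. The `inversions` counter below mirrors the helper defined further down in this diff (the blank, 0, is excluded from the count); the board values are illustrative.

```python
from itertools import combinations

def inversions(board):
    "Number of (earlier, later) pairs where a larger piece precedes a smaller one (blank excluded)."
    return sum(a > b and a != 0 and b != 0 for a, b in combinations(board, 2))

solved = (1, 2, 3, 4, 5, 6, 7, 8, 0)
reachable = (1, 4, 2, 0, 7, 5, 3, 6, 8)    # even parity, same as solved
unreachable = (2, 1, 3, 4, 5, 6, 7, 8, 0)  # one tile swap flips the parity

print(inversions(solved), inversions(reachable), inversions(unreachable))  # 0 6 1
```

Because no sequence of blank swaps ever changes the parity, a board is reachable from the goal exactly when their parities match, which is what the `assert` in `EightPuzzle.__init__` below enforces.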
An inversion is when a piece in the state is larger than a piece that follows it.\n",
+ "\n",
+ "\n"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 397,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "class EightPuzzle(Problem):\n",
+ " \"\"\" The problem of sliding tiles numbered from 1 to 8 on a 3x3 board,\n",
+ " where one of the squares is a blank, trying to reach a goal configuration.\n",
+ " A board state is represented as a tuple of length 9, where the element at index i \n",
+ " represents the tile number at index i, or 0 for the empty square, e.g. the goal:\n",
+ " _ 1 2\n",
+ " 3 4 5 ==> (0, 1, 2, 3, 4, 5, 6, 7, 8)\n",
+ " 6 7 8\n",
+ " \"\"\"\n",
+ "\n",
+ " def __init__(self, initial, goal=(0, 1, 2, 3, 4, 5, 6, 7, 8)):\n",
+ " assert inversions(initial) % 2 == inversions(goal) % 2 # Parity check\n",
+ " self.initial, self.goal = initial, goal\n",
+ " \n",
+ " def actions(self, state):\n",
+ " \"\"\"The indexes of the squares that the blank can move to.\"\"\"\n",
+ " moves = ((1, 3), (0, 2, 4), (1, 5),\n",
+ " (0, 4, 6), (1, 3, 5, 7), (2, 4, 8),\n",
+ " (3, 7), (4, 6, 8), (7, 5))\n",
+ " blank = state.index(0)\n",
+ " return moves[blank]\n",
+ " \n",
+ " def result(self, state, action):\n",
+ " \"\"\"Swap the blank with the square numbered `action`.\"\"\"\n",
+ " s = list(state)\n",
+ " blank = state.index(0)\n",
+ " s[action], s[blank] = s[blank], s[action]\n",
+ " return tuple(s)\n",
+ " \n",
+ " def h1(self, node):\n",
+ " \"\"\"The misplaced tiles heuristic.\"\"\"\n",
+ " return hamming_distance(node.state, self.goal)\n",
+ " \n",
+ " def h2(self, node):\n",
+ " \"\"\"The Manhattan heuristic.\"\"\"\n",
+ " X = (0, 1, 2, 0, 1, 2, 0, 1, 2)\n",
+ " Y = (0, 0, 0, 1, 1, 1, 2, 2, 2)\n",
+ " return sum(abs(X[s] - X[g]) + abs(Y[s] - Y[g])\n",
+ " for (s, g) in zip(node.state, self.goal) if s != 0)\n",
+ " \n",
+ " def h(self, node): return self.h2(node)\n",
+ " \n",
+ " \n",
+ "def hamming_distance(A, B):\n",
+ " \"Number of positions where
vectors A and B are different.\"\n", + " return sum(a != b for a, b in zip(A, B))\n", + " \n", + "\n", + "def inversions(board):\n", + " \"The number of times a piece is a larger number than a following piece.\"\n", + " return sum((a > b and a != 0 and b != 0) for (a, b) in combinations(board, 2))\n", + " \n", + " \n", + "def board8(board, fmt=(3 * '{} {} {}\\n')):\n", + " \"A string representing an 8-puzzle board.\"\n", + " return fmt.format(*board).replace('0', '_')\n", + "\n", + "class Board(defaultdict):\n", + " empty = '.'\n", + " off = '#'\n", + " def __init__(self, board=None, width=8, height=8, to_move=None, **kwds):\n", + " if board is not None:\n", + " self.update(board)\n", + " self.width, self.height = (board.width, board.height) \n", + " else:\n", + " self.width, self.height = (width, height)\n", + " self.to_move = to_move\n", + "\n", + " def __missing__(self, key):\n", + " x, y = key\n", + " if x < 0 or x >= self.width or y < 0 or y >= self.height:\n", + " return self.off\n", + " else:\n", + " return self.empty\n", + " \n", + " def __repr__(self):\n", + " def row(y): return ' '.join(self[x, y] for x in range(self.width))\n", + " return '\\n'.join(row(y) for y in range(self.height))\n", + " \n", + " def __hash__(self): \n", + " return hash(tuple(sorted(self.items()))) + hash(self.to_move)" + ] + }, + { + "cell_type": "code", + "execution_count": 23, + "metadata": {}, + "outputs": [], + "source": [ + "# Some specific EightPuzzle problems\n", + "\n", + "e1 = EightPuzzle((1, 4, 2, 0, 7, 5, 3, 6, 8))\n", + "e2 = EightPuzzle((1, 2, 3, 4, 5, 6, 7, 8, 0))\n", + "e3 = EightPuzzle((4, 0, 2, 5, 1, 3, 7, 8, 6))\n", + "e4 = EightPuzzle((7, 2, 4, 5, 0, 6, 8, 3, 1))\n", + "e5 = EightPuzzle((8, 6, 7, 2, 5, 4, 3, 0, 1))" + ] + }, + { + "cell_type": "code", + "execution_count": 24, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "1 4 2\n", + "_ 7 5\n", + "3 6 8\n", + "\n", + "1 4 2\n", + "3 7 5\n", + "_ 6 8\n", + "\n", 
+ "1 4 2\n", + "3 7 5\n", + "6 _ 8\n", + "\n", + "1 4 2\n", + "3 _ 5\n", + "6 7 8\n", + "\n", + "1 _ 2\n", + "3 4 5\n", + "6 7 8\n", + "\n", + "_ 1 2\n", + "3 4 5\n", + "6 7 8\n", + "\n" + ] + } + ], + "source": [ + "# Solve an 8 puzzle problem and print out each state\n", + "\n", + "for s in path_states(astar_search(e1)):\n", + " print(board8(s))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Water Pouring Problems\n", + "\n", + "![](http://puzzles.nigelcoldwell.co.uk/images/water22.png)\n", + "\n", + "In a [water pouring problem](https://en.wikipedia.org/wiki/Water_pouring_puzzle) you are given a collection of jugs, each of which has a size (capacity) in, say, litres, and a current level of water (in litres). The goal is to measure out a certain level of water; it can appear in any of the jugs. For example, in the movie *Die Hard 3*, the heroes were faced with the task of making exactly 4 gallons from jugs of size 5 gallons and 3 gallons.) A state is represented by a tuple of current water levels, and the available actions are:\n", + "- `(Fill, i)`: fill the `i`th jug all the way to the top (from a tap with unlimited water).\n", + "- `(Dump, i)`: dump all the water out of the `i`th jug.\n", + "- `(Pour, i, j)`: pour water from the `i`th jug into the `j`th jug until either the jug `i` is empty, or jug `j` is full, whichever comes first." + ] + }, + { + "cell_type": "code", + "execution_count": 25, + "metadata": {}, + "outputs": [], + "source": [ + "class PourProblem(Problem):\n", + " \"\"\"Problem about pouring water between jugs to achieve some water level.\n", + " Each state is a tuples of water levels. In the initialization, also provide a tuple of \n", + " jug sizes, e.g. 
PourProblem(initial=(0, 0), goal=4, sizes=(5, 3)), \n", + " which means two jugs of sizes 5 and 3, initially both empty, with the goal\n", + " of getting a level of 4 in either jug.\"\"\"\n", + " \n", + " def actions(self, state):\n", + " \"\"\"The actions executable in this state.\"\"\"\n", + " jugs = range(len(state))\n", + " return ([('Fill', i) for i in jugs if state[i] < self.sizes[i]] +\n", + " [('Dump', i) for i in jugs if state[i]] +\n", + " [('Pour', i, j) for i in jugs if state[i] for j in jugs if i != j])\n", + "\n", + " def result(self, state, action):\n", + " \"\"\"The state that results from executing this action in this state.\"\"\"\n", + " result = list(state)\n", + " act, i, *_ = action\n", + " if act == 'Fill': # Fill i to capacity\n", + " result[i] = self.sizes[i]\n", + " elif act == 'Dump': # Empty i\n", + " result[i] = 0\n", + " elif act == 'Pour': # Pour from i into j\n", + " j = action[2]\n", + " amount = min(state[i], self.sizes[j] - state[j])\n", + " result[i] -= amount\n", + " result[j] += amount\n", + " return tuple(result)\n", + "\n", + " def is_goal(self, state):\n", + " \"\"\"True if the goal level is in any one of the jugs.\"\"\"\n", + " return self.goal in state" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In a `GreenPourProblem`, the states and actions are the same, but instead of all actions costing 1, in these problems the cost of an action is the amount of water that flows from the tap. 
(There is an issue that non-*Fill* actions have 0 cost, which in general can lead to indefinitely long solutions, but in this problem there is a finite number of states, so we're ok.)" + ] + }, + { + "cell_type": "code", + "execution_count": 26, + "metadata": {}, + "outputs": [], + "source": [ + "class GreenPourProblem(PourProblem): \n", + " \"\"\"A PourProblem in which the cost is the amount of water used.\"\"\"\n", + " def action_cost(self, s, action, s1):\n", + " \"The cost is the amount of water used.\"\n", + " act, i, *_ = action\n", + " return self.sizes[i] - s[i] if act == 'Fill' else 0" + ] + }, + { + "cell_type": "code", + "execution_count": 27, + "metadata": {}, + "outputs": [], + "source": [ + "# Some specific PourProblems\n", + "\n", + "p1 = PourProblem((1, 1, 1), 13, sizes=(2, 16, 32))\n", + "p2 = PourProblem((0, 0, 0), 21, sizes=(8, 11, 31))\n", + "p3 = PourProblem((0, 0), 8, sizes=(7,9))\n", + "p4 = PourProblem((0, 0, 0), 21, sizes=(8, 11, 31))\n", + "p5 = PourProblem((0, 0), 4, sizes=(3, 5))\n", + "\n", + "g1 = GreenPourProblem((1, 1, 1), 13, sizes=(2, 16, 32))\n", + "g2 = GreenPourProblem((0, 0, 0), 21, sizes=(8, 11, 31))\n", + "g3 = GreenPourProblem((0, 0), 8, sizes=(7,9))\n", + "g4 = GreenPourProblem((0, 0, 0), 21, sizes=(8, 11, 31))\n", + "g5 = GreenPourProblem((0, 0), 4, sizes=(3, 5))" + ] + }, + { + "cell_type": "code", + "execution_count": 28, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "([('Fill', 1), ('Pour', 1, 0), ('Dump', 0), ('Pour', 1, 0)],\n", + " [(1, 1, 1), (1, 16, 1), (2, 15, 1), (0, 15, 1), (2, 13, 1)])" + ] + }, + "execution_count": 28, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# Solve the PourProblem of getting 13 in some jug, and show the actions and states\n", + "soln = breadth_first_search(p1)\n", + "path_actions(soln), path_states(soln)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Pancake Sorting Problems\n", + "\n", + "Given a 
stack of pancakes of various sizes, can you sort them into a stack of decreasing sizes, largest on bottom to smallest on top? You have a spatula with which you can flip the top `i` pancakes. This is shown below for `i = 3`; on the top the spatula grabs the first three pancakes; on the bottom we see them flipped:\n", + "\n", + "\n", + "![](https://upload.wikimedia.org/wikipedia/commons/0/0f/Pancake_sort_operation.png)\n", + "\n", + "How many flips will it take to get the whole stack sorted? This is an interesting [problem](https://en.wikipedia.org/wiki/Pancake_sorting) that Bill Gates has [written about](https://people.eecs.berkeley.edu/~christos/papers/Bounds%20For%20Sorting%20By%20Prefix%20Reversal.pdf). A reasonable heuristic for this problem is the *gap heuristic*: look at neighboring pancakes; if, say, the 2nd smallest is next to the 3rd smallest, that's good; they should stay next to each other. But if the 2nd smallest is next to the 4th smallest, that's bad: we will require at least one move to separate them and insert the 3rd smallest between them. The gap heuristic counts the number of neighbors that have a gap like this. In our specification of the problem, pancakes are ranked by size: the smallest is `1`, the 2nd smallest `2`, and so on, and the representation of a state is a tuple of these rankings, from the top to the bottom pancake. Thus the goal state is always `(1, 2, ..., `*n*`)` and the initial (top) state in the diagram above is `(2, 1, 4, 6, 3, 5)`.\n" + ] + }, + { + "cell_type": "code", + "execution_count": 29, + "metadata": {}, + "outputs": [], + "source": [ + "class PancakeProblem(Problem):\n", + " \"\"\"A PancakeProblem in which the goal is always `tuple(range(1, n+1))`, where the\n", + " initial state is a permutation of `range(1, n+1)`. 
An act is the index `i` \n", + " of the top `i` pancakes that will be flipped.\"\"\"\n", + " \n", + " def __init__(self, initial): \n", + " self.initial, self.goal = tuple(initial), tuple(sorted(initial))\n", + " \n", + " def actions(self, state): return range(2, len(state) + 1)\n", + "\n", + " def result(self, state, i): return state[:i][::-1] + state[i:]\n", + " \n", + " def h(self, node):\n", + " \"The gap heuristic.\"\n", + " s = node.state\n", + " return sum(abs(s[i] - s[i - 1]) > 1 for i in range(1, len(s)))" + ] + }, + { + "cell_type": "code", + "execution_count": 30, + "metadata": {}, + "outputs": [], + "source": [ + "c0 = PancakeProblem((2, 1, 4, 6, 3, 5))\n", + "c1 = PancakeProblem((4, 6, 2, 5, 1, 3))\n", + "c2 = PancakeProblem((1, 3, 7, 5, 2, 6, 4))\n", + "c3 = PancakeProblem((1, 7, 2, 6, 3, 5, 4))\n", + "c4 = PancakeProblem((1, 3, 5, 7, 9, 2, 4, 6, 8))" + ] + }, + { + "cell_type": "code", + "execution_count": 31, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "[(2, 1, 4, 6, 3, 5),\n", + " (6, 4, 1, 2, 3, 5),\n", + " (5, 3, 2, 1, 4, 6),\n", + " (4, 1, 2, 3, 5, 6),\n", + " (3, 2, 1, 4, 5, 6),\n", + " (1, 2, 3, 4, 5, 6)]" + ] + }, + "execution_count": 31, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# Solve a pancake problem\n", + "path_states(astar_search(c0))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Jumping Frogs Puzzle\n", + "\n", + "In this puzzle (which also can be played as a two-player game), the initial state is a line of squares, with N pieces of one kind on the left, then one empty square, then N pieces of another kind on the right. The diagram below uses 2 blue toads and 2 red frogs; we will represent this as the string `'LL.RR'`. The goal is to swap the pieces, arriving at `'RR.LL'`. 
An `'L'` piece moves left-to-right, either sliding one space ahead to an empty space, or two spaces ahead if that space is empty and if there is an `'R'` in between to hop over. The `'R'` pieces move right-to-left analogously. An action will be an `(i, j)` pair meaning to swap the pieces at those indexes. The set of actions for the N = 2 position below is `{(1, 2), (3, 2)}`, meaning either the blue toad in position 1 or the red frog in position 3 can swap places with the blank in position 2.\n", + "\n", + "![](https://upload.wikimedia.org/wikipedia/commons/2/2f/ToadsAndFrogs.png)" + ] + }, + { + "cell_type": "code", + "execution_count": 32, + "metadata": {}, + "outputs": [], + "source": [ + "class JumpingPuzzle(Problem):\n", + " \"\"\"Try to exchange L and R by moving one ahead or hopping two ahead.\"\"\"\n", + " def __init__(self, N=2):\n", + " self.initial = N*'L' + '.' + N*'R'\n", + " self.goal = self.initial[::-1]\n", + " \n", + " def actions(self, state):\n", + " \"\"\"Find all possible move or hop moves.\"\"\"\n", + " idxs = range(len(state))\n", + " return ({(i, i + 1) for i in idxs if state[i:i+2] == 'L.'} # Slide\n", + " |{(i, i + 2) for i in idxs if state[i:i+3] == 'LR.'} # Hop\n", + " |{(i + 1, i) for i in idxs if state[i:i+2] == '.R'} # Slide\n", + " |{(i + 2, i) for i in idxs if state[i:i+3] == '.LR'}) # Hop\n", + "\n", + " def result(self, state, action):\n", + " \"\"\"An action (i, j) means swap the pieces at positions i and j.\"\"\"\n", + " i, j = action\n", + " result = list(state)\n", + " result[i], result[j] = state[j], state[i]\n", + " return ''.join(result)\n", + " \n", + " def h(self, node): return hamming_distance(node.state, self.goal)" + ] + }, + { + "cell_type": "code", + "execution_count": 33, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{(1, 2), (3, 2)}" + ] + }, + "execution_count": 33, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "JumpingPuzzle(N=2).actions('LL.RR')" + ] + }, + { 
+ "cell_type": "code", + "execution_count": 34, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "['LLL.RRR',\n", + " 'LLLR.RR',\n", + " 'LL.RLRR',\n", + " 'L.LRLRR',\n", + " 'LRL.LRR',\n", + " 'LRLRL.R',\n", + " 'LRLRLR.',\n", + " 'LRLR.RL',\n", + " 'LR.RLRL',\n", + " '.RLRLRL',\n", + " 'R.LRLRL',\n", + " 'RRL.LRL',\n", + " 'RRLRL.L',\n", + " 'RRLR.LL',\n", + " 'RR.RLLL',\n", + " 'RRR.LLL']" + ] + }, + "execution_count": 34, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "j3 = JumpingPuzzle(N=3)\n", + "j9 = JumpingPuzzle(N=9)\n", + "path_states(astar_search(j3))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Reporting Summary Statistics on Search Algorithms\n", + "\n", + "Now let's gather some metrics on how well each algorithm does. We'll use `CountCalls` to wrap a `Problem` object in such a way that calls to its methods are delegated to the original problem, but each call increments a counter. Once we've solved the problem, we print out summary statistics." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 35, + "metadata": {}, + "outputs": [], + "source": [ + "class CountCalls:\n", + " \"\"\"Delegate all attribute gets to the object, and count them in ._counts\"\"\"\n", + " def __init__(self, obj):\n", + " self._object = obj\n", + " self._counts = Counter()\n", + " \n", + " def __getattr__(self, attr):\n", + " \"Delegate to the original object, after incrementing a counter.\"\n", + " self._counts[attr] += 1\n", + " return getattr(self._object, attr)\n", + "\n", + " \n", + "def report(searchers, problems, verbose=True):\n", + " \"\"\"Show summary statistics for each searcher (and on each problem unless verbose is false).\"\"\"\n", + " for searcher in searchers:\n", + " print(searcher.__name__ + ':')\n", + " total_counts = Counter()\n", + " for p in problems:\n", + " prob = CountCalls(p)\n", + " soln = searcher(prob)\n", + " counts = prob._counts; \n", + " counts.update(actions=len(soln), cost=soln.path_cost)\n", + " total_counts += counts\n", + " if verbose: report_counts(counts, str(p)[:40])\n", + " report_counts(total_counts, 'TOTAL\\n')\n", + " \n", + "def report_counts(counts, name):\n", + " \"\"\"Print one line of the counts report.\"\"\"\n", + " print('{:9,d} nodes |{:9,d} goal |{:5.0f} cost |{:8,d} actions | {}'.format(\n", + " counts['result'], counts['is_goal'], counts['cost'], counts['actions'], name))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here's a tiny report for uniform-cost search on the jug pouring problems:" + ] + }, + { + "cell_type": "code", + "execution_count": 36, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "uniform_cost_search:\n", + " 948 nodes | 109 goal | 4 cost | 112 actions | PourProblem((1, 1, 1), 13)\n", + " 3,499 nodes | 389 goal | 9 cost | 397 actions | PourProblem((0, 0, 0), 21)\n", + " 124 nodes | 30 goal | 14 cost | 43 actions | PourProblem((0, 0), 8)\n", + " 3,499 nodes | 389 goal | 9 
cost | 397 actions | PourProblem((0, 0, 0), 21)\n", + " 52 nodes | 14 goal | 6 cost | 19 actions | PourProblem((0, 0), 4)\n", + " 8,122 nodes | 931 goal | 42 cost | 968 actions | TOTAL\n", + "\n" + ] + } + ], + "source": [ + "report([uniform_cost_search], [p1, p2, p3, p4, p5])" + ] + }, + { + "cell_type": "code", + "execution_count": 37, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "uniform_cost_search:\n", + " 948 nodes | 109 goal | 4 cost | 112 actions | PourProblem((1, 1, 1), 13)\n", + " 1,696 nodes | 190 goal | 10 cost | 204 actions | GreenPourProblem((1, 1, 1), 13)\n", + " 3,499 nodes | 389 goal | 9 cost | 397 actions | PourProblem((0, 0, 0), 21)\n", + " 4,072 nodes | 454 goal | 21 cost | 463 actions | GreenPourProblem((0, 0, 0), 21)\n", + " 124 nodes | 30 goal | 14 cost | 43 actions | PourProblem((0, 0), 8)\n", + " 124 nodes | 30 goal | 35 cost | 45 actions | GreenPourProblem((0, 0), 8)\n", + " 3,499 nodes | 389 goal | 9 cost | 397 actions | PourProblem((0, 0, 0), 21)\n", + " 4,072 nodes | 454 goal | 21 cost | 463 actions | GreenPourProblem((0, 0, 0), 21)\n", + " 3,499 nodes | 389 goal | 9 cost | 397 actions | PourProblem((0, 0, 0), 21)\n", + " 4,072 nodes | 454 goal | 21 cost | 463 actions | GreenPourProblem((0, 0, 0), 21)\n", + " 3,590 nodes | 719 goal | 7 cost | 725 actions | PancakeProblem((4, 6, 2, 5, 1, 3), (1, 2\n", + " 30,204 nodes | 5,035 goal | 8 cost | 5,042 actions | PancakeProblem((1, 3, 7, 5, 2, 6, 4), (1\n", + " 22,068 nodes | 3,679 goal | 6 cost | 3,684 actions | PancakeProblem((1, 7, 2, 6, 3, 5, 4), (1\n", + " 81,467 nodes | 12,321 goal | 174 cost | 12,435 actions | TOTAL\n", + "\n", + "breadth_first_search:\n", + " 596 nodes | 597 goal | 4 cost | 73 actions | PourProblem((1, 1, 1), 13)\n", + " 596 nodes | 597 goal | 15 cost | 73 actions | GreenPourProblem((1, 1, 1), 13)\n", + " 2,618 nodes | 2,619 goal | 9 cost | 302 actions | PourProblem((0, 0, 0), 21)\n", + " 2,618 nodes | 2,619 goal | 
32 cost | 302 actions | GreenPourProblem((0, 0, 0), 21)\n", + " 120 nodes | 121 goal | 14 cost | 42 actions | PourProblem((0, 0), 8)\n", + " 120 nodes | 121 goal | 36 cost | 42 actions | GreenPourProblem((0, 0), 8)\n", + " 2,618 nodes | 2,619 goal | 9 cost | 302 actions | PourProblem((0, 0, 0), 21)\n", + " 2,618 nodes | 2,619 goal | 32 cost | 302 actions | GreenPourProblem((0, 0, 0), 21)\n", + " 2,618 nodes | 2,619 goal | 9 cost | 302 actions | PourProblem((0, 0, 0), 21)\n", + " 2,618 nodes | 2,619 goal | 32 cost | 302 actions | GreenPourProblem((0, 0, 0), 21)\n", + " 2,951 nodes | 2,952 goal | 7 cost | 598 actions | PancakeProblem((4, 6, 2, 5, 1, 3), (1, 2\n", + " 25,945 nodes | 25,946 goal | 8 cost | 4,333 actions | PancakeProblem((1, 3, 7, 5, 2, 6, 4), (1\n", + " 5,975 nodes | 5,976 goal | 6 cost | 1,002 actions | PancakeProblem((1, 7, 2, 6, 3, 5, 4), (1\n", + " 52,011 nodes | 52,024 goal | 213 cost | 7,975 actions | TOTAL\n", + "\n" + ] + } + ], + "source": [ + "report((uniform_cost_search, breadth_first_search), \n", + " (p1, g1, p2, g2, p3, g3, p4, g4, p4, g4, c1, c2, c3)) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Comparing heuristics\n", + "\n", + "First, let's look at the eight puzzle problems, and compare three different heuristics: the Manhattan heuristic, the less informative misplaced tiles heuristic, and the uninformed (i.e. 
*h* = 0) breadth-first search:" + ] + }, + { + "cell_type": "code", + "execution_count": 38, + "metadata": { + "scrolled": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "breadth_first_search:\n", + " 81 nodes | 82 goal | 5 cost | 35 actions | EightPuzzle((1, 4, 2, 0, 7, 5, 3, 6, 8),\n", + " 160,948 nodes | 160,949 goal | 22 cost | 59,960 actions | EightPuzzle((1, 2, 3, 4, 5, 6, 7, 8, 0),\n", + " 218,263 nodes | 218,264 goal | 23 cost | 81,829 actions | EightPuzzle((4, 0, 2, 5, 1, 3, 7, 8, 6),\n", + " 418,771 nodes | 418,772 goal | 26 cost | 156,533 actions | EightPuzzle((7, 2, 4, 5, 0, 6, 8, 3, 1),\n", + " 448,667 nodes | 448,668 goal | 27 cost | 167,799 actions | EightPuzzle((8, 6, 7, 2, 5, 4, 3, 0, 1),\n", + "1,246,730 nodes |1,246,735 goal | 103 cost | 466,156 actions | TOTAL\n", + "\n", + "astar_misplaced_tiles:\n", + " 17 nodes | 7 goal | 5 cost | 11 actions | EightPuzzle((1, 4, 2, 0, 7, 5, 3, 6, 8),\n", + " 23,407 nodes | 8,726 goal | 22 cost | 8,747 actions | EightPuzzle((1, 2, 3, 4, 5, 6, 7, 8, 0),\n", + " 38,632 nodes | 14,433 goal | 23 cost | 14,455 actions | EightPuzzle((4, 0, 2, 5, 1, 3, 7, 8, 6),\n", + " 124,324 nodes | 46,553 goal | 26 cost | 46,578 actions | EightPuzzle((7, 2, 4, 5, 0, 6, 8, 3, 1),\n", + " 156,111 nodes | 58,475 goal | 27 cost | 58,501 actions | EightPuzzle((8, 6, 7, 2, 5, 4, 3, 0, 1),\n", + " 342,491 nodes | 128,194 goal | 103 cost | 128,292 actions | TOTAL\n", + "\n", + "astar_search:\n", + " 15 nodes | 6 goal | 5 cost | 10 actions | EightPuzzle((1, 4, 2, 0, 7, 5, 3, 6, 8),\n", + " 3,614 nodes | 1,349 goal | 22 cost | 1,370 actions | EightPuzzle((1, 2, 3, 4, 5, 6, 7, 8, 0),\n", + " 5,373 nodes | 2,010 goal | 23 cost | 2,032 actions | EightPuzzle((4, 0, 2, 5, 1, 3, 7, 8, 6),\n", + " 10,832 nodes | 4,086 goal | 26 cost | 4,111 actions | EightPuzzle((7, 2, 4, 5, 0, 6, 8, 3, 1),\n", + " 11,669 nodes | 4,417 goal | 27 cost | 4,443 actions | EightPuzzle((8, 6, 7, 2, 5, 4, 3, 0, 1),\n", + " 
31,503 nodes | 11,868 goal | 103 cost | 11,966 actions | TOTAL\n", + "\n" + ] + } + ], + "source": [ + "def astar_misplaced_tiles(problem): return astar_search(problem, h=problem.h1)\n", + "\n", + "report([breadth_first_search, astar_misplaced_tiles, astar_search], \n", + " [e1, e2, e3, e4, e5])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We see that all three algorithms get cost-optimal solutions, but the better the heuristic, the fewer nodes explored. \n", + "Compared to the uninformed search, the misplaced tiles heuristic explores about 1/4 the number of nodes, and the Manhattan heuristic needs just 2%.\n", + "\n", + "Next, we can show the value of the gap heuristic for pancake sorting problems:" + ] + }, + { + "cell_type": "code", + "execution_count": 39, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "astar_search:\n", + " 1,285 nodes | 258 goal | 7 cost | 264 actions | PancakeProblem((4, 6, 2, 5, 1, 3), (1, 2\n", + " 3,804 nodes | 635 goal | 8 cost | 642 actions | PancakeProblem((1, 3, 7, 5, 2, 6, 4), (1\n", + " 294 nodes | 50 goal | 6 cost | 55 actions | PancakeProblem((1, 7, 2, 6, 3, 5, 4), (1\n", + " 2,256 nodes | 283 goal | 9 cost | 291 actions | PancakeProblem((1, 3, 5, 7, 9, 2, 4, 6, \n", + " 7,639 nodes | 1,226 goal | 30 cost | 1,252 actions | TOTAL\n", + "\n", + "uniform_cost_search:\n", + " 3,590 nodes | 719 goal | 7 cost | 725 actions | PancakeProblem((4, 6, 2, 5, 1, 3), (1, 2\n", + " 30,204 nodes | 5,035 goal | 8 cost | 5,042 actions | PancakeProblem((1, 3, 7, 5, 2, 6, 4), (1\n", + " 22,068 nodes | 3,679 goal | 6 cost | 3,684 actions | PancakeProblem((1, 7, 2, 6, 3, 5, 4), (1\n", + "2,271,792 nodes | 283,975 goal | 9 cost | 283,983 actions | PancakeProblem((1, 3, 5, 7, 9, 2, 4, 6, \n", + "2,327,654 nodes | 293,408 goal | 30 cost | 293,434 actions | TOTAL\n", + "\n" + ] + } + ], + "source": [ + "report([astar_search, uniform_cost_search], [c1, c2, c3, c4])" + ] + }, + 
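The gap heuristic that accounts for the A* savings above is easy to check on its own. Below is a minimal standalone sketch; the `gap` helper is hypothetical, mirroring `PancakeProblem.h` but taking a plain state tuple rather than a search node:

```python
def gap(state):
    """Count adjacent pancake pairs whose size rankings differ by more than 1.
    A single flip changes at most one adjacency, so this count never
    overestimates the number of flips remaining."""
    return sum(abs(state[i] - state[i - 1]) > 1 for i in range(1, len(state)))

# The initial stack of c0 has 4 gaps; a sorted stack has none.
print(gap((2, 1, 4, 6, 3, 5)))  # 4
print(gap((1, 2, 3, 4, 5, 6)))  # 0
```

This makes the tradeoff above concrete: the heuristic is cheap (one linear pass) yet informative enough to cut the explored nodes by orders of magnitude on the larger pancake problems.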
{ + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We need to explore 300 times more nodes without the heuristic.\n", + "\n", + "# Comparing graph search and tree search\n", + "\n", + "Keeping the *reached* table in `best_first_search` allows us to do a graph search, where we notice when we reach a state by two different paths, rather than a tree search, where we have duplicated effort. The *reached* table consumes space and also saves time. How much time? In part it depends on how good the heuristics are at focusing the search. Below we show that on some pancake and eight puzzle problems, the tree search expands roughly twice as many nodes (and thus takes roughly twice as much time):" + ] + }, + { + "cell_type": "code", + "execution_count": 188, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "astar_search:\n", + " 15 nodes | 6 goal | 5 cost | 10 actions | EightPuzzle((1, 4, 2, 0, 7, 5, 3, 6, 8),\n", + " 3,614 nodes | 1,349 goal | 22 cost | 1,370 actions | EightPuzzle((1, 2, 3, 4, 5, 6, 7, 8, 0),\n", + " 5,373 nodes | 2,010 goal | 23 cost | 2,032 actions | EightPuzzle((4, 0, 2, 5, 1, 3, 7, 8, 6),\n", + " 10,832 nodes | 4,086 goal | 26 cost | 4,111 actions | EightPuzzle((7, 2, 4, 5, 0, 6, 8, 3, 1),\n", + " 15 nodes | 6 goal | 418 cost | 9 actions | RouteProblem('A', 'B')\n", + " 34 nodes | 15 goal | 910 cost | 23 actions | RouteProblem('N', 'L')\n", + " 33 nodes | 14 goal | 805 cost | 21 actions | RouteProblem('E', 'T')\n", + " 20 nodes | 9 goal | 445 cost | 13 actions | RouteProblem('O', 'M')\n", + " 19,936 nodes | 7,495 goal | 2654 cost | 7,589 actions | TOTAL\n", + "\n", + "astar_tree_search:\n", + " 15 nodes | 6 goal | 5 cost | 10 actions | EightPuzzle((1, 4, 2, 0, 7, 5, 3, 6, 8),\n", + " 5,384 nodes | 2,000 goal | 22 cost | 2,021 actions | EightPuzzle((1, 2, 3, 4, 5, 6, 7, 8, 0),\n", + " 9,116 nodes | 3,404 goal | 23 cost | 3,426 actions | EightPuzzle((4, 0, 2, 5, 1, 3, 7, 8, 6),\n", + " 19,084 
nodes | 7,185 goal | 26 cost | 7,210 actions | EightPuzzle((7, 2, 4, 5, 0, 6, 8, 3, 1),\n", + " 15 nodes | 6 goal | 418 cost | 9 actions | RouteProblem('A', 'B')\n", + " 47 nodes | 19 goal | 910 cost | 27 actions | RouteProblem('N', 'L')\n", + " 46 nodes | 18 goal | 805 cost | 25 actions | RouteProblem('E', 'T')\n", + " 24 nodes | 10 goal | 445 cost | 14 actions | RouteProblem('O', 'M')\n", + " 33,731 nodes | 12,648 goal | 2654 cost | 12,742 actions | TOTAL\n", + "\n" + ] + } + ], + "source": [ + "report([astar_search, astar_tree_search], [e1, e2, e3, e4, r1, r2, r3, r4])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Comparing different weighted search values\n", + "\n", + "Below we report on problems using these five algorithms:\n", + "\n", + "|Algorithm|*f*|Optimality|\n", + "|:---------|---:|:----------:|\n", + "|Greedy best-first search | *f = h*|nonoptimal|\n", + "|Extra weighted A* search | *f = g + 2 × h*|nonoptimal|\n", + "|Weighted A* search | *f = g + 1.4 × h*|nonoptimal|\n", + "|A* search | *f = g + h*|optimal|\n", + "|Uniform-cost search | *f = g*|optimal|\n", + "\n", + "We will see that greedy best-first search (which ranks nodes solely by the heuristic) explores the fewest nodes, but has the highest path costs. Weighted A* search explores about a third more nodes (on this problem set) but gets about 8% better path costs. A* is optimal, but explores more nodes, and uniform-cost is also optimal, but explores an order of magnitude more nodes."
+ ] + }, + { + "cell_type": "code", + "execution_count": 41, + "metadata": { + "scrolled": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "greedy_bfs:\n", + " 0 nodes | 1 goal | 0 cost | 0 actions | RouteProblem('A', 'A')\n", + " 9 nodes | 4 goal | 450 cost | 6 actions | RouteProblem('A', 'B')\n", + " 29 nodes | 12 goal | 910 cost | 20 actions | RouteProblem('N', 'L')\n", + " 19 nodes | 8 goal | 837 cost | 14 actions | RouteProblem('E', 'T')\n", + " 14 nodes | 6 goal | 572 cost | 10 actions | RouteProblem('O', 'M')\n", + " 15 nodes | 6 goal | 5 cost | 10 actions | EightPuzzle((1, 4, 2, 0, 7, 5, 3, 6, 8),\n", + " 909 nodes | 138 goal | 136 cost | 258 actions | GridProblem((15, 30), (130, 30))\n", + " 974 nodes | 147 goal | 152 cost | 277 actions | GridProblem((15, 30), (130, 30))\n", + " 5,146 nodes | 4,984 goal | 99 cost | 5,082 actions | JumpingPuzzle('LLLLLLLLL.RRRRRRRRR', 'RR\n", + " 1,569 nodes | 568 goal | 58 cost | 625 actions | EightPuzzle((1, 2, 3, 4, 5, 6, 7, 8, 0),\n", + " 1,424 nodes | 257 goal | 164 cost | 406 actions | GridProblem((15, 30), (130, 30))\n", + " 1,899 nodes | 342 goal | 153 cost | 470 actions | GridProblem((15, 30), (130, 30))\n", + " 18,239 nodes | 2,439 goal | 134 cost | 2,564 actions | GridProblem((15, 30), (130, 30))\n", + " 18,329 nodes | 2,460 goal | 152 cost | 2,594 actions | GridProblem((15, 30), (130, 30))\n", + " 287 nodes | 109 goal | 33 cost | 141 actions | EightPuzzle((4, 0, 2, 5, 1, 3, 7, 8, 6),\n", + " 1,128 nodes | 408 goal | 46 cost | 453 actions | EightPuzzle((7, 2, 4, 5, 0, 6, 8, 3, 1),\n", + " 49,990 nodes | 11,889 goal | 3901 cost | 12,930 actions | TOTAL\n", + "\n", + "extra_weighted_astar_search:\n", + " 0 nodes | 1 goal | 0 cost | 0 actions | RouteProblem('A', 'A')\n", + " 9 nodes | 4 goal | 450 cost | 6 actions | RouteProblem('A', 'B')\n", + " 29 nodes | 12 goal | 910 cost | 20 actions | RouteProblem('N', 'L')\n", + " 23 nodes | 9 goal | 805 cost | 16 actions | 
RouteProblem('E', 'T')\n", + " 18 nodes | 8 goal | 445 cost | 12 actions | RouteProblem('O', 'M')\n", + " 15 nodes | 6 goal | 5 cost | 10 actions | EightPuzzle((1, 4, 2, 0, 7, 5, 3, 6, 8),\n", + " 1,575 nodes | 239 goal | 136 cost | 357 actions | GridProblem((15, 30), (130, 30))\n", + " 1,384 nodes | 231 goal | 133 cost | 349 actions | GridProblem((15, 30), (130, 30))\n", + " 10,990 nodes | 10,660 goal | 99 cost | 10,758 actions | JumpingPuzzle('LLLLLLLLL.RRRRRRRRR', 'RR\n", + " 1,720 nodes | 633 goal | 24 cost | 656 actions | EightPuzzle((1, 2, 3, 4, 5, 6, 7, 8, 0),\n", + " 9,282 nodes | 1,412 goal | 163 cost | 1,551 actions | GridProblem((15, 30), (130, 30))\n", + " 1,354 nodes | 228 goal | 134 cost | 345 actions | GridProblem((15, 30), (130, 30))\n", + " 16,024 nodes | 2,098 goal | 129 cost | 2,214 actions | GridProblem((15, 30), (130, 30))\n", + " 16,950 nodes | 2,237 goal | 140 cost | 2,359 actions | GridProblem((15, 30), (130, 30))\n", + " 1,908 nodes | 709 goal | 25 cost | 733 actions | EightPuzzle((4, 0, 2, 5, 1, 3, 7, 8, 6),\n", + " 1,312 nodes | 489 goal | 30 cost | 518 actions | EightPuzzle((7, 2, 4, 5, 0, 6, 8, 3, 1),\n", + " 62,593 nodes | 18,976 goal | 3628 cost | 19,904 actions | TOTAL\n", + "\n", + "weighted_astar_search:\n", + " 0 nodes | 1 goal | 0 cost | 0 actions | RouteProblem('A', 'A')\n", + " 9 nodes | 4 goal | 450 cost | 6 actions | RouteProblem('A', 'B')\n", + " 32 nodes | 14 goal | 910 cost | 22 actions | RouteProblem('N', 'L')\n", + " 29 nodes | 12 goal | 805 cost | 19 actions | RouteProblem('E', 'T')\n", + " 18 nodes | 8 goal | 445 cost | 12 actions | RouteProblem('O', 'M')\n", + " 15 nodes | 6 goal | 5 cost | 10 actions | EightPuzzle((1, 4, 2, 0, 7, 5, 3, 6, 8),\n", + " 1,631 nodes | 236 goal | 128 cost | 350 actions | GridProblem((15, 30), (130, 30))\n", + " 1,706 nodes | 275 goal | 131 cost | 389 actions | GridProblem((15, 30), (130, 30))\n", + " 10,990 nodes | 10,660 goal | 99 cost | 10,758 actions | 
JumpingPuzzle('LLLLLLLLL.RRRRRRRRR', 'RR\n", + " 2,082 nodes | 771 goal | 22 cost | 792 actions | EightPuzzle((1, 2, 3, 4, 5, 6, 7, 8, 0),\n", + " 8,385 nodes | 1,266 goal | 154 cost | 1,396 actions | GridProblem((15, 30), (130, 30))\n", + " 1,400 nodes | 229 goal | 133 cost | 344 actions | GridProblem((15, 30), (130, 30))\n", + " 12,122 nodes | 1,572 goal | 124 cost | 1,686 actions | GridProblem((15, 30), (130, 30))\n", + " 24,129 nodes | 3,141 goal | 127 cost | 3,255 actions | GridProblem((15, 30), (130, 30))\n", + " 3,960 nodes | 1,475 goal | 25 cost | 1,499 actions | EightPuzzle((4, 0, 2, 5, 1, 3, 7, 8, 6),\n", + " 1,992 nodes | 748 goal | 26 cost | 773 actions | EightPuzzle((7, 2, 4, 5, 0, 6, 8, 3, 1),\n", + " 68,500 nodes | 20,418 goal | 3585 cost | 21,311 actions | TOTAL\n", + "\n", + "astar_search:\n", + " 0 nodes | 1 goal | 0 cost | 0 actions | RouteProblem('A', 'A')\n", + " 15 nodes | 6 goal | 418 cost | 9 actions | RouteProblem('A', 'B')\n", + " 34 nodes | 15 goal | 910 cost | 23 actions | RouteProblem('N', 'L')\n", + " 33 nodes | 14 goal | 805 cost | 21 actions | RouteProblem('E', 'T')\n", + " 20 nodes | 9 goal | 445 cost | 13 actions | RouteProblem('O', 'M')\n", + " 15 nodes | 6 goal | 5 cost | 10 actions | EightPuzzle((1, 4, 2, 0, 7, 5, 3, 6, 8),\n", + " 26,711 nodes | 3,620 goal | 127 cost | 3,734 actions | GridProblem((15, 30), (130, 30))\n", + " 12,932 nodes | 1,822 goal | 124 cost | 1,936 actions | GridProblem((15, 30), (130, 30))\n", + " 10,991 nodes | 10,661 goal | 99 cost | 10,759 actions | JumpingPuzzle('LLLLLLLLL.RRRRRRRRR', 'RR\n", + " 3,614 nodes | 1,349 goal | 22 cost | 1,370 actions | EightPuzzle((1, 2, 3, 4, 5, 6, 7, 8, 0),\n", + " 62,509 nodes | 8,729 goal | 154 cost | 8,859 actions | GridProblem((15, 30), (130, 30))\n", + " 15,190 nodes | 2,276 goal | 133 cost | 2,391 actions | GridProblem((15, 30), (130, 30))\n", + " 25,303 nodes | 3,196 goal | 124 cost | 3,310 actions | GridProblem((15, 30), (130, 30))\n", + " 32,572 nodes | 4,149 
goal | 127 cost | 4,263 actions | GridProblem((15, 30), (130, 30))\n", + " 5,373 nodes | 2,010 goal | 23 cost | 2,032 actions | EightPuzzle((4, 0, 2, 5, 1, 3, 7, 8, 6),\n", + " 10,832 nodes | 4,086 goal | 26 cost | 4,111 actions | EightPuzzle((7, 2, 4, 5, 0, 6, 8, 3, 1),\n", + " 206,144 nodes | 41,949 goal | 3543 cost | 42,841 actions | TOTAL\n", + "\n", + "uniform_cost_search:\n", + " 0 nodes | 1 goal | 0 cost | 0 actions | RouteProblem('A', 'A')\n", + " 30 nodes | 13 goal | 418 cost | 16 actions | RouteProblem('A', 'B')\n", + " 42 nodes | 19 goal | 910 cost | 27 actions | RouteProblem('N', 'L')\n", + " 44 nodes | 20 goal | 805 cost | 27 actions | RouteProblem('E', 'T')\n", + " 30 nodes | 12 goal | 445 cost | 16 actions | RouteProblem('O', 'M')\n", + " 124 nodes | 46 goal | 5 cost | 50 actions | EightPuzzle((1, 4, 2, 0, 7, 5, 3, 6, 8),\n", + " 355,452 nodes | 44,984 goal | 127 cost | 45,098 actions | GridProblem((15, 30), (130, 30))\n", + " 326,962 nodes | 41,650 goal | 124 cost | 41,764 actions | GridProblem((15, 30), (130, 30))\n", + " 10,992 nodes | 10,662 goal | 99 cost | 10,760 actions | JumpingPuzzle('LLLLLLLLL.RRRRRRRRR', 'RR\n", + " 214,952 nodes | 79,187 goal | 22 cost | 79,208 actions | EightPuzzle((1, 2, 3, 4, 5, 6, 7, 8, 0),\n", + " 558,084 nodes | 70,738 goal | 154 cost | 70,868 actions | GridProblem((15, 30), (130, 30))\n", + " 370,370 nodes | 47,243 goal | 133 cost | 47,358 actions | GridProblem((15, 30), (130, 30))\n", + " 349,062 nodes | 43,693 goal | 124 cost | 43,807 actions | GridProblem((15, 30), (130, 30))\n", + " 366,996 nodes | 45,970 goal | 127 cost | 46,084 actions | GridProblem((15, 30), (130, 30))\n", + " 300,925 nodes | 112,082 goal | 23 cost | 112,104 actions | EightPuzzle((4, 0, 2, 5, 1, 3, 7, 8, 6),\n", + " 457,766 nodes | 171,571 goal | 26 cost | 171,596 actions | EightPuzzle((7, 2, 4, 5, 0, 6, 8, 3, 1),\n", + "3,311,831 nodes | 667,891 goal | 3543 cost | 668,783 actions | TOTAL\n", + "\n" + ] + } + ], + "source": [ + "def 
extra_weighted_astar_search(problem): return weighted_astar_search(problem, weight=2)\n", + " \n", + "report((greedy_bfs, extra_weighted_astar_search, weighted_astar_search, astar_search, uniform_cost_search), \n", + " (r0, r1, r2, r3, r4, e1, d1, d2, j9, e2, d3, d4, d6, d7, e3, e4))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We see that greedy search expands the fewest nodes, but has the highest path costs. In contrast, A\* gets optimal path costs, but expands 4 or 5 times more nodes. Weighted A* is a good compromise, using half the compute time of A\*, and achieving path costs within 1% or 2% of optimal. Uniform-cost is optimal, but is an order of magnitude slower than A\*.\n", + "\n", + "# Comparing many search algorithms\n", + "\n", + "Finally, we compare a host of algorithms (even the slow ones) on some of the easier problems:" + ] + }, + { + "cell_type": "code", + "execution_count": 42, + "metadata": { + "scrolled": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "astar_search:\n", + " 948 nodes | 109 goal | 4 cost | 112 actions | PourProblem((1, 1, 1), 13)\n", + " 1,696 nodes | 190 goal | 10 cost | 204 actions | GreenPourProblem((1, 1, 1), 13)\n", + " 3,499 nodes | 389 goal | 9 cost | 397 actions | PourProblem((0, 0, 0), 21)\n", + " 4,072 nodes | 454 goal | 21 cost | 463 actions | GreenPourProblem((0, 0, 0), 21)\n", + " 124 nodes | 30 goal | 14 cost | 43 actions | PourProblem((0, 0), 8)\n", + " 124 nodes | 30 goal | 35 cost | 45 actions | GreenPourProblem((0, 0), 8)\n", + " 3,499 nodes | 389 goal | 9 cost | 397 actions | PourProblem((0, 0, 0), 21)\n", + " 4,072 nodes | 454 goal | 21 cost | 463 actions | GreenPourProblem((0, 0, 0), 21)\n", + " 0 nodes | 1 goal | 0 cost | 0 actions | RouteProblem('A', 'A')\n", + " 15 nodes | 6 goal | 418 cost | 9 actions | RouteProblem('A', 'B')\n", + " 34 nodes | 15 goal | 910 cost | 23 actions | RouteProblem('N', 'L')\n", + " 33 nodes | 14 goal | 
805 cost | 21 actions | RouteProblem('E', 'T')\n", + " 20 nodes | 9 goal | 445 cost | 13 actions | RouteProblem('O', 'M')\n", + " 15 nodes | 6 goal | 5 cost | 10 actions | EightPuzzle((1, 4, 2, 0, 7, 5, 3, 6, 8),\n", + " 18,151 nodes | 2,096 goal | 2706 cost | 2,200 actions | TOTAL\n", + "\n", + "uniform_cost_search:\n", + " 948 nodes | 109 goal | 4 cost | 112 actions | PourProblem((1, 1, 1), 13)\n", + " 1,696 nodes | 190 goal | 10 cost | 204 actions | GreenPourProblem((1, 1, 1), 13)\n", + " 3,499 nodes | 389 goal | 9 cost | 397 actions | PourProblem((0, 0, 0), 21)\n", + " 4,072 nodes | 454 goal | 21 cost | 463 actions | GreenPourProblem((0, 0, 0), 21)\n", + " 124 nodes | 30 goal | 14 cost | 43 actions | PourProblem((0, 0), 8)\n", + " 124 nodes | 30 goal | 35 cost | 45 actions | GreenPourProblem((0, 0), 8)\n", + " 3,499 nodes | 389 goal | 9 cost | 397 actions | PourProblem((0, 0, 0), 21)\n", + " 4,072 nodes | 454 goal | 21 cost | 463 actions | GreenPourProblem((0, 0, 0), 21)\n", + " 0 nodes | 1 goal | 0 cost | 0 actions | RouteProblem('A', 'A')\n", + " 30 nodes | 13 goal | 418 cost | 16 actions | RouteProblem('A', 'B')\n", + " 42 nodes | 19 goal | 910 cost | 27 actions | RouteProblem('N', 'L')\n", + " 44 nodes | 20 goal | 805 cost | 27 actions | RouteProblem('E', 'T')\n", + " 30 nodes | 12 goal | 445 cost | 16 actions | RouteProblem('O', 'M')\n", + " 124 nodes | 46 goal | 5 cost | 50 actions | EightPuzzle((1, 4, 2, 0, 7, 5, 3, 6, 8),\n", + " 18,304 nodes | 2,156 goal | 2706 cost | 2,260 actions | TOTAL\n", + "\n", + "breadth_first_search:\n", + " 596 nodes | 597 goal | 4 cost | 73 actions | PourProblem((1, 1, 1), 13)\n", + " 596 nodes | 597 goal | 15 cost | 73 actions | GreenPourProblem((1, 1, 1), 13)\n", + " 2,618 nodes | 2,619 goal | 9 cost | 302 actions | PourProblem((0, 0, 0), 21)\n", + " 2,618 nodes | 2,619 goal | 32 cost | 302 actions | GreenPourProblem((0, 0, 0), 21)\n", + " 120 nodes | 121 goal | 14 cost | 42 actions | PourProblem((0, 0), 8)\n", + " 120 
nodes | 121 goal | 36 cost | 42 actions | GreenPourProblem((0, 0), 8)\n", + " 2,618 nodes | 2,619 goal | 9 cost | 302 actions | PourProblem((0, 0, 0), 21)\n", + " 2,618 nodes | 2,619 goal | 32 cost | 302 actions | GreenPourProblem((0, 0, 0), 21)\n", + " 0 nodes | 1 goal | 0 cost | 0 actions | RouteProblem('A', 'A')\n", + " 18 nodes | 19 goal | 450 cost | 10 actions | RouteProblem('A', 'B')\n", + " 42 nodes | 43 goal | 1085 cost | 27 actions | RouteProblem('N', 'L')\n", + " 36 nodes | 37 goal | 837 cost | 22 actions | RouteProblem('E', 'T')\n", + " 30 nodes | 31 goal | 445 cost | 16 actions | RouteProblem('O', 'M')\n", + " 81 nodes | 82 goal | 5 cost | 35 actions | EightPuzzle((1, 4, 2, 0, 7, 5, 3, 6, 8),\n", + " 12,111 nodes | 12,125 goal | 2973 cost | 1,548 actions | TOTAL\n", + "\n", + "breadth_first_bfs:\n", + " 948 nodes | 109 goal | 4 cost | 112 actions | PourProblem((1, 1, 1), 13)\n", + " 1,062 nodes | 124 goal | 15 cost | 127 actions | GreenPourProblem((1, 1, 1), 13)\n", + " 3,499 nodes | 389 goal | 9 cost | 397 actions | PourProblem((0, 0, 0), 21)\n", + " 3,757 nodes | 420 goal | 24 cost | 428 actions | GreenPourProblem((0, 0, 0), 21)\n", + " 124 nodes | 30 goal | 14 cost | 43 actions | PourProblem((0, 0), 8)\n", + " 124 nodes | 30 goal | 36 cost | 43 actions | GreenPourProblem((0, 0), 8)\n", + " 3,499 nodes | 389 goal | 9 cost | 397 actions | PourProblem((0, 0, 0), 21)\n", + " 3,757 nodes | 420 goal | 24 cost | 428 actions | GreenPourProblem((0, 0, 0), 21)\n", + " 0 nodes | 1 goal | 0 cost | 0 actions | RouteProblem('A', 'A')\n", + " 28 nodes | 12 goal | 450 cost | 14 actions | RouteProblem('A', 'B')\n", + " 55 nodes | 24 goal | 910 cost | 32 actions | RouteProblem('N', 'L')\n", + " 51 nodes | 22 goal | 837 cost | 28 actions | RouteProblem('E', 'T')\n", + " 40 nodes | 16 goal | 445 cost | 20 actions | RouteProblem('O', 'M')\n", + " 124 nodes | 46 goal | 5 cost | 50 actions | EightPuzzle((1, 4, 2, 0, 7, 5, 3, 6, 8),\n", + " 17,068 nodes | 2,032 goal | 2782 
cost | 2,119 actions | TOTAL\n", + "\n", + "iterative_deepening_search:\n", + " 6,133 nodes | 6,118 goal | 4 cost | 822 actions | PourProblem((1, 1, 1), 13)\n", + " 6,133 nodes | 6,118 goal | 15 cost | 822 actions | GreenPourProblem((1, 1, 1), 13)\n", + " 288,706 nodes | 288,675 goal | 9 cost | 36,962 actions | PourProblem((0, 0, 0), 21)\n", + " 288,706 nodes | 288,675 goal | 62 cost | 36,962 actions | GreenPourProblem((0, 0, 0), 21)\n", + " 3,840 nodes | 3,824 goal | 14 cost | 949 actions | PourProblem((0, 0), 8)\n", + " 3,840 nodes | 3,824 goal | 36 cost | 949 actions | GreenPourProblem((0, 0), 8)\n", + " 288,706 nodes | 288,675 goal | 9 cost | 36,962 actions | PourProblem((0, 0, 0), 21)\n", + " 288,706 nodes | 288,675 goal | 62 cost | 36,962 actions | GreenPourProblem((0, 0, 0), 21)\n", + " 0 nodes | 1 goal | 0 cost | 0 actions | RouteProblem('A', 'A')\n", + " 27 nodes | 25 goal | 450 cost | 13 actions | RouteProblem('A', 'B')\n", + " 167 nodes | 173 goal | 910 cost | 82 actions | RouteProblem('N', 'L')\n", + " 117 nodes | 120 goal | 837 cost | 56 actions | RouteProblem('E', 'T')\n", + " 108 nodes | 109 goal | 572 cost | 44 actions | RouteProblem('O', 'M')\n", + " 116 nodes | 118 goal | 5 cost | 47 actions | EightPuzzle((1, 4, 2, 0, 7, 5, 3, 6, 8),\n", + "1,175,305 nodes |1,175,130 goal | 2985 cost | 151,632 actions | TOTAL\n", + "\n", + "depth_limited_search:\n", + " 4,433 nodes | 4,374 goal | 10 cost | 627 actions | PourProblem((1, 1, 1), 13)\n", + " 4,433 nodes | 4,374 goal | 30 cost | 627 actions | GreenPourProblem((1, 1, 1), 13)\n", + " 37,149 nodes | 37,106 goal | 10 cost | 4,753 actions | PourProblem((0, 0, 0), 21)\n", + " 37,149 nodes | 37,106 goal | 54 cost | 4,753 actions | GreenPourProblem((0, 0, 0), 21)\n", + " 452 nodes | 453 goal | inf cost | 110 actions | PourProblem((0, 0), 8)\n", + " 452 nodes | 453 goal | inf cost | 110 actions | GreenPourProblem((0, 0), 8)\n", + " 37,149 nodes | 37,106 goal | 10 cost | 4,753 actions | PourProblem((0, 0, 0), 
21)\n", + " 37,149 nodes | 37,106 goal | 54 cost | 4,753 actions | GreenPourProblem((0, 0, 0), 21)\n", + " 0 nodes | 1 goal | 0 cost | 0 actions | RouteProblem('A', 'A')\n", + " 17 nodes | 8 goal | 733 cost | 14 actions | RouteProblem('A', 'B')\n", + " 40 nodes | 38 goal | 910 cost | 26 actions | RouteProblem('N', 'L')\n", + " 29 nodes | 23 goal | 992 cost | 20 actions | RouteProblem('E', 'T')\n", + " 35 nodes | 29 goal | 895 cost | 22 actions | RouteProblem('O', 'M')\n", + " 351 nodes | 349 goal | 5 cost | 138 actions | EightPuzzle((1, 4, 2, 0, 7, 5, 3, 6, 8),\n", + " 158,838 nodes | 158,526 goal | inf cost | 20,706 actions | TOTAL\n", + "\n", + "greedy_bfs:\n", + " 948 nodes | 109 goal | 4 cost | 112 actions | PourProblem((1, 1, 1), 13)\n", + " 1,696 nodes | 190 goal | 10 cost | 204 actions | GreenPourProblem((1, 1, 1), 13)\n", + " 3,499 nodes | 389 goal | 9 cost | 397 actions | PourProblem((0, 0, 0), 21)\n", + " 4,072 nodes | 454 goal | 21 cost | 463 actions | GreenPourProblem((0, 0, 0), 21)\n", + " 124 nodes | 30 goal | 14 cost | 43 actions | PourProblem((0, 0), 8)\n", + " 124 nodes | 30 goal | 35 cost | 45 actions | GreenPourProblem((0, 0), 8)\n" + ] + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + " 3,499 nodes | 389 goal | 9 cost | 397 actions | PourProblem((0, 0, 0), 21)\n", + " 4,072 nodes | 454 goal | 21 cost | 463 actions | GreenPourProblem((0, 0, 0), 21)\n", + " 0 nodes | 1 goal | 0 cost | 0 actions | RouteProblem('A', 'A')\n", + " 9 nodes | 4 goal | 450 cost | 6 actions | RouteProblem('A', 'B')\n", + " 29 nodes | 12 goal | 910 cost | 20 actions | RouteProblem('N', 'L')\n", + " 19 nodes | 8 goal | 837 cost | 14 actions | RouteProblem('E', 'T')\n", + " 14 nodes | 6 goal | 572 cost | 10 actions | RouteProblem('O', 'M')\n", + " 15 nodes | 6 goal | 5 cost | 10 actions | EightPuzzle((1, 4, 2, 0, 7, 5, 3, 6, 8),\n", + " 18,120 nodes | 2,082 goal | 2897 cost | 2,184 actions | TOTAL\n", + "\n", + "weighted_astar_search:\n", + " 948 nodes | 
109 goal | 4 cost | 112 actions | PourProblem((1, 1, 1), 13)\n", + " 1,696 nodes | 190 goal | 10 cost | 204 actions | GreenPourProblem((1, 1, 1), 13)\n", + " 3,499 nodes | 389 goal | 9 cost | 397 actions | PourProblem((0, 0, 0), 21)\n", + " 4,072 nodes | 454 goal | 21 cost | 463 actions | GreenPourProblem((0, 0, 0), 21)\n", + " 124 nodes | 30 goal | 14 cost | 43 actions | PourProblem((0, 0), 8)\n", + " 124 nodes | 30 goal | 35 cost | 45 actions | GreenPourProblem((0, 0), 8)\n", + " 3,499 nodes | 389 goal | 9 cost | 397 actions | PourProblem((0, 0, 0), 21)\n", + " 4,072 nodes | 454 goal | 21 cost | 463 actions | GreenPourProblem((0, 0, 0), 21)\n", + " 0 nodes | 1 goal | 0 cost | 0 actions | RouteProblem('A', 'A')\n", + " 9 nodes | 4 goal | 450 cost | 6 actions | RouteProblem('A', 'B')\n", + " 32 nodes | 14 goal | 910 cost | 22 actions | RouteProblem('N', 'L')\n", + " 29 nodes | 12 goal | 805 cost | 19 actions | RouteProblem('E', 'T')\n", + " 18 nodes | 8 goal | 445 cost | 12 actions | RouteProblem('O', 'M')\n", + " 15 nodes | 6 goal | 5 cost | 10 actions | EightPuzzle((1, 4, 2, 0, 7, 5, 3, 6, 8),\n", + " 18,137 nodes | 2,090 goal | 2738 cost | 2,193 actions | TOTAL\n", + "\n", + "extra_weighted_astar_search:\n", + " 948 nodes | 109 goal | 4 cost | 112 actions | PourProblem((1, 1, 1), 13)\n", + " 1,696 nodes | 190 goal | 10 cost | 204 actions | GreenPourProblem((1, 1, 1), 13)\n", + " 3,499 nodes | 389 goal | 9 cost | 397 actions | PourProblem((0, 0, 0), 21)\n", + " 4,072 nodes | 454 goal | 21 cost | 463 actions | GreenPourProblem((0, 0, 0), 21)\n", + " 124 nodes | 30 goal | 14 cost | 43 actions | PourProblem((0, 0), 8)\n", + " 124 nodes | 30 goal | 35 cost | 45 actions | GreenPourProblem((0, 0), 8)\n", + " 3,499 nodes | 389 goal | 9 cost | 397 actions | PourProblem((0, 0, 0), 21)\n", + " 4,072 nodes | 454 goal | 21 cost | 463 actions | GreenPourProblem((0, 0, 0), 21)\n", + " 0 nodes | 1 goal | 0 cost | 0 actions | RouteProblem('A', 'A')\n", + " 9 nodes | 4 goal | 450 
cost | 6 actions | RouteProblem('A', 'B')\n", + " 29 nodes | 12 goal | 910 cost | 20 actions | RouteProblem('N', 'L')\n", + " 23 nodes | 9 goal | 805 cost | 16 actions | RouteProblem('E', 'T')\n", + " 18 nodes | 8 goal | 445 cost | 12 actions | RouteProblem('O', 'M')\n", + " 15 nodes | 6 goal | 5 cost | 10 actions | EightPuzzle((1, 4, 2, 0, 7, 5, 3, 6, 8),\n", + " 18,128 nodes | 2,085 goal | 2738 cost | 2,188 actions | TOTAL\n", + "\n" + ] + } + ], + "source": [ + "report((astar_search, uniform_cost_search, breadth_first_search, breadth_first_bfs, \n", + " iterative_deepening_search, depth_limited_search, greedy_bfs, \n", + " weighted_astar_search, extra_weighted_astar_search), \n", + " (p1, g1, p2, g2, p3, g3, p4, g4, r0, r1, r2, r3, r4, e1))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This confirms some of the things we already knew: A* and uniform-cost search are optimal, but the others are not. A* explores fewer nodes than uniform-cost. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Visualizing Reached States\n", + "\n", + "I would like to draw a picture of the state space, marking the states that have been reached by the search.\n", + "Unfortunately, the *reached* variable is inaccessible inside `best_first_search`, so I will define a new version of `best_first_search` that is identical except that it declares *reached* to be `global`. I can then define `plot_grid_problem` to plot the obstacles of a `GridProblem`, along with the initial and goal states, the solution path, and the states reached during a search." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 45, + "metadata": {}, + "outputs": [], + "source": [ + "def best_first_search(problem, f):\n", + " \"Search nodes with minimum f(node) value first.\"\n", + " global reached # <<<<<<<<<<< Only change here\n", + " node = Node(problem.initial)\n", + " frontier = PriorityQueue([node], key=f)\n", + " reached = {problem.initial: node}\n", + " while frontier:\n", + " node = frontier.pop()\n", + " if problem.is_goal(node.state):\n", + " return node\n", + " for child in expand(problem, node):\n", + " s = child.state\n", + " if s not in reached or child.path_cost < reached[s].path_cost:\n", + " reached[s] = child\n", + " frontier.add(child)\n", + " return failure\n", + "\n", + "\n", + "def plot_grid_problem(grid, solution, reached=(), title='Search', show=True):\n", + " \"Use matplotlib to plot the grid, obstacles, solution, and reached.\"\n", + " reached = list(reached)\n", + " plt.figure(figsize=(16, 10))\n", + " plt.axis('off'); plt.axis('equal')\n", + " plt.scatter(*transpose(grid.obstacles), marker='s', color='darkgrey')\n", + " plt.scatter(*transpose(reached), 1**2, marker='.', c='blue')\n", + " plt.scatter(*transpose(path_states(solution)), marker='s', c='blue')\n", + " plt.scatter(*transpose([grid.initial]), 9**2, marker='D', c='green')\n", + " plt.scatter(*transpose([grid.goal]), 9**2, marker='8', c='red')\n", + " if show: plt.show()\n", + " print('{} {} search: {:.1f} path cost, {:,d} states reached'\n", + " .format(' ' * 10, title, solution.path_cost, len(reached)))\n", + " \n", + "def plots(grid, weights=(1.4, 2)): \n", + " \"\"\"Plot the results of 4 heuristic search algorithms for this grid.\"\"\"\n", + " solution = astar_search(grid)\n", + " plot_grid_problem(grid, solution, reached, 'A* search')\n", + " for weight in weights:\n", + " solution = weighted_astar_search(grid, weight=weight)\n", + " plot_grid_problem(grid, solution, reached, '(b) Weighted ({}) A* search'.format(weight))\n", + " solution = 
greedy_bfs(grid)\n", + " plot_grid_problem(grid, solution, reached, 'Greedy best-first search')\n", + " \n", + "def transpose(matrix): return list(zip(*matrix))" + ] + }, + { + "cell_type": "code", + "execution_count": 46, + "metadata": { + "scrolled": false + }, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAA6oAAAJCCAYAAADJHDpFAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvIxREBQAAIABJREFUeJzt3b/PJFt6H/bTw11xTGsvocR2RFiZAgrv3dAADRJWaDhYLNATEPQGBmUl+hMM7ipQ4sxKSBAObqBgXoNYCIQyg7DBBQw48V7QViZIkeFQvCMTl7zwtoN55+47Pd39VnVVnfM9pz4foDC7z+3ues6Pqn5PV/XTh9PpVAAAACDFq9YJAAAAwHMWqgAAAESxUAUAACCKhSoAAABRLFQBAACIYqEKAABAFAtVAAAAolioAgAAEMVCFQAAgCgWqgAAAESxUAUAACCKhSoAAABRLFQBAACIYqEKAABAFAtVAAAAolioAgAAEMVCFQAAgCgWqgAAAESxUAUAACCKhSoAAABRLFQBAACIYqEKAABAFAtVAAAAolioAgAAEMVCFQAAgCgWqgAAAESxUAUAACCKhSoAAABRLFQBAACIYqEKAABAFAtVAAAAolioAgAAEMVCFQAAgCgWqgAAAESxUAUAACCKhSoAAABRLFQBAACIYqEKAABAFAtVAAAAolioAgAAEMVCFQAAgCgWqgAAAESxUAUAACCKhSoAAABRLFQBAACIYqEKAABAFAtVAAAAolioAgAAEMVCFQAAgCgWqgAAAESxUAUAACCKhSoAAABRLFQBAACIYqEKAABAFAtVAAAAolioAgAAEMVCFQAAgCgWqkAzh0M5HA7l88OhHBJjAAC0YaEKtPRQSvmTp38TYwAANHA4nU6tcwB26unq5UMp5cvTqZzSYgAAtGGhCgAAQBS3/gIAABDFQhVYJKn4UcuiS2n5AAD0zEIVWCqp+FHLoktp+QAAdMt3VIFFkooftSy6lJYPAEDPLFQBAACI4tZfAAAAoliows71WjAoKZaWTw/9AABwi4Uq0GvBoKRYWj499AMAwFW+owo712vBoKRYWj499AMAwC0WqgAAAERx6y8AAABRLFRhUAoG1Yul5aMf5ucNAGSxUIVxKRhUL5aWj36YnzcAEMR3VGFQLQvl7C2Wlo9+mJ83AJDFQhUAAIAobv0FAAAgioUqBEsrOJOUT1IsLR/9kBeb+1gA2DsLVciWVnAmKZ+kWFo++iEvNvexALBrvqMKwQ5hBWeS8kmKpeWjH/Jicx8LAHtnoQoAAEAUt/4CAAAQxUIVGkgq8qJQjr7RD/oGAOKcTiebzVZ5K+X0eSmnf13K6fNeYmn5JMXS8tEPebG0fK7laLPZbDZbytY8AZttj1spp8PTH4qHXmJp+STF0vLRD3mxtHyu5Wiz2Ww2W8qmmBIAAABRfEcVAACAKBaqcKekIiiKweibhFhaPkmxtHzm5A0ATbS+99hm63UrQUVQasTS8kmKpeWjH/JiafnMydtms9lsthZb8wRstl63ElQEpUYsLZ+kWFo++iEvlpbPnLxtNpvNZmuxKaYEwCYeHx+/KqV878J/enc8Hj+rnQ8A0A/fUQVgK5cWqbfiAAClFAtV+ERSIZOkWFo+SbG0fNL64ZKkvM2R+W0BgK1ZqMKnHkopf/L0r
9jHkvJJiqXlk9YPlyTlbY7MbwsAbMp3VOHM05WDh1LKl6dTOYmVb08SSfkkxdLySemHt28ff1GuePPm+Colb3NkflsAYGsWqsC3FL9hTY+Pj1ffYI7Ho1tJJ3BM8pz5AOyJW3+B5xS/gSyOSZ4zH4DdsFBlt0YveLJFsZSkHJNiafmk9cMlSXn3MEeS+hAAarBQZc9GL3iyRbGUpByTYmn5pPXDJUl59zBHLkk7PwDAanxHld06XCkScim+l9it4jfH4/GQkGNiLC2flH5QTGn5HEnsQ9rxvW9gTyxUgW/t8Y8gxUm20+N8SpsPPfYh2zEfgD1x6y+wd4qT8Jz5AAABLFTZhVoFT0aJLe3HHmNL2pvWlpZ9uHbf9jgftpgj6X0IAGuzUGUvahU8GSV2TVKONdps3syL3YqfS8q7Rs5p+dQ6ZwDAXXxHlV04zCgSMvWxI8b2WExpjWI1KW1pHduib3ucD2vOkV76kDp8RxXYEwtV4Ft7/CNoj22upce+Tcs5LR/aMh+APXHrLwAAAFEsVAEAAIhiocou1KrMOUpsaT/2GFvS3rS2tOzDtfu2x/mwxRwZpQ8BYCoLVfaiVmXOUWLXJOVYo83mzbzYrfi5pLxr5JyWT8vzCAC8SDElduHpU/6Honrr3RVGVf01b+b2Qy8Va2vknJZPrfnAuhRTAvbEQhX41h7/CNpjm2vpsW/Tck7Lh7ZuzYcL3h2Px882SwZgY279BQAYz/daJwCwhIUqw2lZ8GSU2BZ9mx5b0t60trTsw7X7tsf5sMUc6bEPp7YDAC6xUGVELQuejBK7JinHGm02b+bFbsXPJeVdI+e0fJL6FQA+4TuqDOfp0/uHUrHgySgxxZQ+ppiSYkpr5ZyWT8s5wv1mfkfV95iBrlmoAt/aY+GWPba5lh77Ni3ntHxoy0K1nsfHx6/K5e/5KlIFlbj1FwAAPnatGJUiVVCJhSpd26LQR1LhkbSCJ0k51mizebNOPyzp2x7nwxZzZKqkPlzSDgCwUKV3tYrB7C12TVKONdps3syL3YqfS8q7Rs5b5LN2ji3PLQDwEd9RpWtPn9Q/lI2KwbQuPFI7ppjSxxRTUkxprZy3yOfS9w8T+lAxpe34jmo9vh8O7VmoAt/a4xvz0jZPLbhx43Gj+KTASI/zKS3ntHxoy0K1HscetOfWX4BlphbcGHmRWsr47QMAKrJQpRu1Cn0kFR5JK3iSlGONNm/RX6Nb0uZR5sMW55a1JfUrAFxioUpPWhaD2VvsmqQca7R5i/4a3ZI2jzIftji3rC2pXwHgExaq9OTLUsoPn/7dKlZrP+mxa5JyrNHmLfprdEvaPMp82OLcsrakfgWATyimBDs1t7hPo0IyaQWI7i4YNLcISo/O50iPxUjSck7KJ/B4nOOTY7dHoxRTmlqELiCfdEPMa7jGFVXYrx7elNNyXJLPu9WygDbSjsc5es59RFOL0NXS6/zoNW+Y5DutE4BLDg1/46/VvmvH3r7NH5d7ctzatXnz8mPff+p9Xz9c/y3NNGuPX/qc2/rcUvMYSMoFAFxRJVXLQh9JBV3Sipv0kOPW5uQ3cj9cs3be6XOu1rmlhqRcANg5C1VStSz0kVTQJa24SQ85bq1l8akerJ13+pzrtZjSJUm5ALBzbv0l0tNtZz+vHZvy2A9FF57fBvf4+P7foNi7p1tNb7SjzFZ7XO7JcWvX5s3Ux47SD9esnXf6nFvz3LJGPksk5QIArqjCfD0UL+ghxynSChC1yietHy7pIUeW6XmMe86d7fU6P3rNGyZxRZXmEgoLzSl40ktBkRGKKV0qQHSrsNDaP8UwZ94sef49/dD6WJly/NyaY0l51ygA1TKftfgZDEZ1aW7X+GmopJ+fgkSuqJIgrbDQKAVFRiqm1GtxmaT+ann8XJKU99rza+m5pcfzDQCsykKVBGmFhUYpKDJSMaVei8sk9VfL4+eSpLxrFIBqmQ8AdMetvzTXqnDS/cWUJjQqwBbtSBirp
OIyaz1/QQGvF4tm1Yx9mvf5f22fY435tWahtltu3TZ4h3du7V3Xh/FsnMOQc+RG38bkCCzniirM10PxgrVz7KHNI7v2x+4oRbN4r+V4mkvrG61Pk9rjnAg74IoqVSUUS1laTKmXwjZLirS8eXN8dV/f9FHsZqqUYkq9FCXqNe8aOS/ddw0t5w0AnHNFldqSiqVsUfCkx9glPfRNDUtzaTVWacdPet41cl667xqS+hqAnbNQpbakYilbFDzpMXZJD31TQ1oxpRr7UExpm5yX7ruGpL4GYOfc+ktVCcVSbsXS8mlVNGZaoZw2BX72WUxp+32sEes17xo5L913DS3nDVwztyjVygWkgIZcUQXupZgFrKtl0TIF09Y3Wp+2ao/3FNgpV1TZTEJhlG2KKY0Ru6dwS0KhHMWUttmHYkrb5jztsWMUalurb3pX62dSbl1BPB6Ph62eC7A1V1TZUlJhlC0KnowSm2Pq89PyvtfSXGq0OW0u9Zh3jZxbtiU9disOwE5ZqLKlpMIoiildj82RVCinBsWU5ufXY95pxZT2FrsVB2CnDqfTEHfYADfMLUbRqxa3qi287a7luLzb+tbEjvvmIrdCru/GOG8+P0fR462/c/abWBxprb6p1f+jv9cwLldUYR+i/uDnWy3Hpca+rxVfmVKUJW3OjlYYJ4WibMstOc7Yh9HfaxiUYkpsJqlYx96LKdUsQMQvpY/L9sfP/cWBWvfNmzfHV8W5pdp5N11ym12t+liNq51AHa6osqWkYh0KntBC+rj0cPy00kPfjBLrwR7bDNCUhSpbSirWoeAJLaSPSw/HTys99M0osR7ssc0ATbn1l8083eL0855iafmsFXt8PG8lNaSPS/Lx02PffChY8vy25ad2vDsej58lnAsSY/eqWSDm0pg+3dpetc2XbNUPE299VSgH2IwrqrAPimpkajku6XOix75RGKguBWLe0w+8pMfzKbiiyjqSCnMoeHIp9mlRm7dvH39xbTwvFZKZ8/xWP2uQ5p5xuVxYaN5YjXH83F+IKbVoWUKOibFepRxTrQuPkc9Vb3rliiprSSrMoeDJvNglc/qG61qNleOnbeyapByTYr1KO6YAhmKhylqSCnMoeDIvdsmcvuG6VmPl+GkbuyYpx6RYr9KOKYChHE6nru+8Ae506xbaKbfuLn3+hde7VhDko2Ida+93qRr5pLV5bTWL4rQywjhtYcncbv01gJQx7bEf5oz73Pb1dN6duo+Ac6SiWTThiirs17UCB60KHyhEs1/GmHsoEPOefhhf63Nk6/2zU4opMVtSEY7sYjDpsXnFas7jey3gMacYTI0iPRlzadnxs5e5lDRWSbH7+7C/glvbHFP390OrongAU7iiyj2SinC0LFyxt9it+J7M6YMtxmDrfbQ8fkaXNFZJsTmS8u7hmHLsAd2yUOUeSUU4FIOpF7sV35OWxXNq7KPl8TO6pLFKis2RlHcPx5RjD+iWW3+Z7en2oZ+PGEvLJyl2Hn98PP+v+3Ctby55flvdU3+9e7pNb9YY3Orr5DkyNbaXuZQ0VkmxORLy/lDY5vkt6x/m8NqxGY9d/dwC0JorqgDLzCkmoiDFZQqy0JPE4zgxp1vSivmla90vrffPTrmiyk1JRSVyClfsM3Ye30MBnCl986Fk/9QCJYoptS0Gs2U/3Mql1hj0GOutv1LPfT21xU+dzKO/2CtXVHlJUlGJkQpX9Bi7FR/VFn2zdAzWfL2Rjp+pWuaSNFZJsWuScuzh3DdSWwAsVHlRUlGJkQpX9Bi7FR/VFn2zdsGTHuZIUjGYlrkkjVVS7JqkHHs4943UFgC3/nJbQuGKmrE1X/NGwY27il60jn3avvP/Op4t+mbtYkPJc6RGbO48bJlL0lglxa6pP5emnbNTz301jh+AmlxRhe1cK27RW9EL2lN4BLY39ZydeNwl5gSwiCuqfCuhSEXr2Jqv2UsBHMWU3qvVN7WLDfV6/PR4nL10TCSNVVIspb+mz
6V6x2NOm+uOPUAprqjysaQiFS2LY2zxmueS2ry0b0ZRq2+SxrSH46fH4ywtn/TYNUn5jH5MXdJy7AHK4XTyoRbvJX3CPsIVoVs/VfHmzfFVSpvvv2o472dBXvL4+Hj1ZPT89aY+bq5afdPj2Kdc/Uk/zl46JpLGKiV263hOGr9Wx+3obZ7i1hy5ZMn7wNT99rQP6JmFKmxk9DeguX88hHu35u/UJY39hwIxNfd5p7vGIKyvV8+lo/Gb4pMxHn380vXQZgvVm89veX5Y9X0TLnHrL8A4C4FLemlbL3nWNlK/jNSWUSjU1reWx5Tjmc1ZqO7U4VAOh0P5/OlWHLEN++aSpDYv7ZtR1OqbluOXbpTjrFb7elSrv0aaS1vu53g8fnY8Hg9v3hxfvXlz/P6bN8dXx+PxcDweP0vqf2CfLFT3q1VhiPTYVq95LqnNS/tmFLX6puX4pRvlONsin1HU6q+R5lLS+4BzFVCNhep+fVlK+eHTv2If2+I1zyW1eWnfjKJW37Qcv3SjHGdb5DOKWv010lxKeh9wrgKqsVDdqdOpnE6n8vPnlffEtnvNc0ltXto3o6jVNy3HL90ox1mt9vWoVn+NNJda7Tup/4F9+k7rBAACjFw45F3po+jFyGOwRC/jN4UxZnMXKumOXJ225fnB8czmLFSB1SX8jmoPP7tQw8B/oO2C8YPFRvmg5xPOD4zOrb871aqaX3psq9c8l9TmpX2zdvumvt7aj+uhb9Jjrfed0tdp+aTH0vorLZeksUoaE2B8Fqr7lVThLym21WueS2rz0r65pEZ/rf24HvomPdZ63+eScmmZT3rsmqR89nhMJcWAHTqcTr6/vkdPn1Q+lFK+/FDEQGzdvnn79vEX1/r/zZvjq5Q239s3a7dv6uut/bge+qaXWIt9J/X1rVyOx+MhaaxSYrdu0U8av1bHbdJYtYrdmiNTbfl1lCV8RQVus1CFjYz+BrR2+0b6juroY3/J4+PjV2Xg74KVCQVZ9jjuSyX12RoLojBDFBGyUIX9cusv0JtrlQZVIGxr5EVqKeO3j/GYs/m8n8ENFqqDu1aUIKlIQlJsq9c8l9TmpX2zdvteer3j8fjZ8Xg8vHlzfPXmzfH7b94cXx2Px8PxePysVf/X6pv02OiW9kPSWCXF0vprJEnjbEw+9eH97MLW/ZVwWIOF6vgUZ5gX2+o1zyW1eWnfXJLUX2u/Xsu+SY+Nbmk/JI1VUuyatHx6lDTOxgSYxXdUB/f0ieRDUZyheuGKxMIca/ZNq2JKif3fct8psQG/3/eJl8ZOMaV1501SMaVejXC+Gfk7qsBtFqqwkaQ3uRuFbu4uttGqmFKr1+tl31PsoPDRJl4au/RxT5TUZyN+2DLCvLNQhf1y6y/sw7VFyZLFytpFIBSVqMcidT7zkN6Ys0DXvtM6AdbT4vbW0WJrvubbt8vHKimXT+Pvr8Sul/e6r1ez/1vue+2+GUn9W0dv55Mw9omxlP66NX49XEFLGtNacwQYmyuqY3ko7QoGjRLb6jXPtWzfvbm0zDu9/1vue+2+GUlavyaNfVLsmrR80iWNqTEBFvMd1YG4opp2RTWnoM4auaSNVVL/t9z32n0zkqTjTDGly7Feiim5opo5R6byHVXok4UqzDS1MNHMN9e7ixpNscc3XMWUrhuxaMwlScV4EsY9UVKf1chlwEJmm753laKYEuyZW39hvi0KE430hwv59lBkpUUbFQTjJaOd62u0x/EDO6WY0kDc+lvn1t+phXLmFqxpVSRk1HmjmNKt2NqFsPqcIzX7NSfHvNg1SecMrtt6nD5csfWVBtgfV1TH8lDGLopTIzb3seemPm7qc7do3737rZVjUpuX9k2NfSfF0vJJiqXlkxS7Ji0fLks7foBB+I7qQFz1qHVFdVqhnLmf7m5ZOEQxpW37uuW+k2Jp+STF0vJJie2tmNKI3w+vMU5THrv2+PmOKrRnoQozTX3zmvsHyZZvfFu84U4tKlVD4wIln
7S3xz9wAoq8VJ83SwT0Vw1NC+UMWkxpuD+6Us5pa49f2NyMeb+Fmtz6C9xri6JS92q5YBhlsdK6Ha33P1dv+d6jZaGcUQvojNau0dqTKun9FqpRTGkgbkGrdXvRtDEYvZhSUsGghCIoqX3Tax8m9U1qf9Wwdb9OLZQzyphuefUroQ9T35vv6cO553Fgfa6ojuWhKOqxNDb3seemPm7qc7do37373eI1a7SvlvS+6bUP02OjSzvvGtPrkvow7b15qhrv9cBEFqpj+bKU8sOnf2/F5jx2b7G5jz039XFTn7tF++7d7xavWaN9taT3Ta99mB4bXdp515hel9SHae/NU9V4rwcmUkwJZlJMabvXvLCPLgrWnLc3qQjHVAlFXvY2b3qQOl+3sONz2hAFeQYvpjQ5l9A5dq8h5ib3c0UVSNbDm+0oxURat2PN/fcwb3rQek6MKHFuJuaUoNdCXyON50ht4Q6KKQ1EMaX2BRsUU1q3La0L1tz7G7M9FlM6nZYXtUlpS+t5s4Wpv9Hc4+/0puQz4ryZKqH/a7031zwnLm0z7J0rqmPZoijB3mJzH3tuSYGFWu27d79bvOaSvGvooW+SYon5jGLk+ZWYz94k9X+t9+b0GOye76gOpLdPrxNjUx479WrGrce99Nz1rxQsvwKzxWsuybuG5L5JjKXk03rebMEV1bbzZsXvqEb+0ZU+b6Y8dsTj4oOZ31GNnGP32tP35PmUhSrMpJjSdq85Zx81hPfNtYIZuy8+0XrebGHquWXvf9QNVkimmhHmTY/Hhfn6stSxow63/gLJWhat6LVghj968sdurtHasyXzfz7zqx3z9TZzc+cUU+rUCLdZJcamPFYxpfVec8tiFjVvA2vRN4kFm1ruO33erNm+xLFPmTc9F0S6NL9KyRqr5DnS43HR03x1ZZMWXFHt10MZp3BFUmzuY89NfdzU527Rvnv3u8Vrpre5h76psY89nlsuSeubtXMcad70aPRjqtYcuSSpLaPMV9ic76h2KuXT69FiUx6rmNJ6r7ntJ9Utr6i2LTTVqv9r7afHedNDjsnn3TX6Jp0rqnXem1u3pdf56ooqLViowkzpxZTmFmdILhi0RMv8WheaSuj/XvXQrz3k2ErPhbT2PnbXTC0ct+S4aFWcrqf5an7Sglt/YTyKMwB71WvxlV7zrqFG4bhWxel6Gfde8mQwiil1KuU2q9FiUx6bXkxpq/3O2U/CmNbMr8W+E/u/5b7T+7WHHMeYN/cX0kqMpeXTsh8uWeu4aHdMTZuvNX7jFxK5otqvh5JVuGKU2NzHnpv6uKnPXdqWNfc7Zz9pY7p1fq33vfU+9nhuuSStb9bO0bzJi6Xlk3Q8znnsKMcU7IrvqHbKJ7T5BRtaFVPaar9z9pNQuEIxpczjJz2mmJJ5kxRLy6dF7Nb3OKe+5750XKQfU66osleuqHbqdCqn06n8/Pmb2dTY0uePHJv72HNTHzf1uUvbsuZ+5+wnbUy3zq/1vrfexxbHT3ps7X7tIUfzJi+Wlk/Lfrhk7eNiyXNb9gOMykIVAACAKBaqnTocyuFwKJ8/3RoyK7b0+SPH5j723NTHTX3u0rasud85+0kb063za73vrfexxfGTHlu7X3vI0bzJi6Xl07IfLln7uFjy3Jb9AKOyUO3X0i/j1/jSf4+xuY89N/VxU5+7tC1r7nfOftLGdOv8Wu97633s8dxySVrfrJ2jeZMXS8sn6Xic89hRjinYFcWUOvX0CdtDUZxh1diUxyqmNK8fWo2pYkqZx096TDEl8yYplpZPi5hiSoopsV8WqhM8Pj5+VS7/6PO74/H9b2CxH7feNJ+/Ydx63EvPXaLWfqf2Qyst86ux7/T+n+PGOXYUq75XjDT21NPr3zJrvOe+dFzM2Mfo56pSwucDl/V6fL/Erb/TXDspjX6yYpl3rROATox+Lh29ffRh9L9lrr3nrvlePEpf3bKHNo5oyOPbQrVTS7+M36oQQHps7mPPPX/c8
Xj87Hg8Ht68Ob568+b4/afbhyY9d622TLG0b2q0Ze1+qDVvau17631scfwsmbO9GnnsW86bkWNbvea5pDbPOTc8f+yl99zj8Xg4Ho+frd03QB0Wqv1a+mX8Gl/67zE297Hnajx3yT6W5tyqLWv3Q615U2vfW++j1rlldCOPvfekbWJbvea5pDbPOTe06hugAgvVfn1ZSvnh079zY0ufP3Js7mPP1Xjukn0szblVW9buh1rzpta+t95HrXPL6EYee+9J28S2es1zSW2ec25o1TdABYopTaBwBc/VKNiwhGJK7ymm1I+5c7ZHa45Jq7EPKCRzV1GQUYuMzNXrOSPpfLqHc1Up2fOBy3o9vl/iiirUVaPYA+0Z53lG75dR2te6KMe9+x+yyAhNjHIs37KHNtKJ77ROgJc9fYH/oRS/y7Z1bMpj375dMlbvP73fsi238rsv52u/Fbr8NVv1w/bzpu04Jx8/rY6L1r9Rum6O1zLMOrdsoaf+Snu/7r0ftsx7et9MO1et/bunc66WjXpljX1yRbUPWxQLWPs1R4nNfey5tLZMsbRvlrxmq34Y6Zi6pIfjJz12SVrfrJ3j2ueWLfTYX+bNNjnX2k+vxwp0z0K1Dz0UZxglNvex59LaMsXSvlnymq36YaRj6pIejp/02CVpfbN2jmufW7bQY3+ZN9vkXGs/vR4r0D3FlCZwGwXPpc8HxZTe20vRi0sS+r9X6fO6lKbFlJofU1vfNjmyhPFbW2pxupavZ77v06jj7ooq7NvIRRNGbhvbUQjrutZ90Hr/AFSkmFIHkoszjBab8tj0ghS38luzGEx6P1wqenGrwMVIko+f/Ng2hZ3WfM2tCqaN0Dc9nrNT500PUufIiH0NLbii2octigXUKH7QY2zuY8+ltWXN/Grtp1XfjKSH42dvsa1ec4qkfmg5by5J6oe0edOD9DkCLOA7qhO0vu87+RPa0WJTHrvVz1es98lwnZ/XSO+HuX0zklb9X2s/PcbWfM25P32R1A9trqj2d66qPW96tdb4rT1H/DwNtY067haqE4w6+NxnZkGKd8fj+9vlaqk1X3s8LkYsJrKy6vO1lsfHx69KKd9rncfWUo+9lno8V23B+a8rH52L1xq75/N9B+fEYd/PLhn1POfWX9jWyG8CPVKM5baR5+vIbQPGUuN8Nfo5cfT27YJiSh1IvpVotNiUx7YrZNK2cEit/WwbyyoGk3gr3qjnlj0XN0no/7a3/uqbl/oh+RZoV4Jhv1xR7cNDyS/OMEps7mOnaNmWNXOptZ8eY2s8P8Xo55Y9Sur/lvPmkqR+qNU3lyS12XELlFJ8R3WS1vd9J39CO1psymPnXgWr/Um1YkqZ86aXK6przpHW45Le11tQTKneObG3vum1H/Z6RfXs+6Sr9MEWr5ms5+9mztV6rbIVC9UJRh12sESyAAAgAElEQVR87jP35F57jiim1I/EPxS2HrsdFPBoyrH3qdHPVTeOqckFeZL7IfE8WYOF6nLJ83ptvR7fL3HrL8ynIA9rSZtLNfKxSIV1XTumHGv9qnEuTnv/Wdvo7dsFxZQ6kHwr0WixaY/9tCDPrVuq6t/ieC2Tdfum5wIlOcfU/cWdEm/jm5b3tazzpdwWvcZtzGMcP23PiSl9M7V9vffDJWtdKVr7alQPV7f29NMt9MsV1T48lPziDKPE1nj+uZZtWTOXWvvpMdZ63+d66Icepc2RJfZ4/FyS1A9bHFMj9QOwMxaqffiylPLDp3/XiG3xmqPE1nj+uZZtWTOXWvvpMdZ63+d66Icepc2RJfZ4/FyS1A9bHFMj9QOwMxaqHTidyul0Kj9/fgvMktgWrzlKbI3nn2vZljVz6aHNvc6bpHGu1Q89SpsjrdrS6/GT3g9bHFMj9QOwP76jCkBN70qfRV6aFOaYWyW5USXPd77vRgsX5vuoc3GV86bzA72xUAWgGn+wzNbDor6HHNmHIefitfNmJz8xM+SYUIdbfztwOJTD4VA+f
6qEtzi2xWuOElvj+edatmXNXHpoc6/zJmmce+2H9Ngaz0/Ww7xZO++0eTO1fb32A7A/Fqp96LXCYo+xNZ5/LqlK4hZ9s/Z+eoy13ve5PfZDemyN5yfrYd6snXfavLlkpH4AduZwOvVw10BbrX8P6+lTxYdS+vjNup5j9z4/6Tctt8oluc2tYy32ndj/aeOSFNtinJOk/MZsL8fKmn0ztX299sPc21sTfve01t+Nndz6G/PbsSNrvVbZioXqBKMOPutJmiMJb5COi/vNLZ7TAYU0FujlD9EFNp8fS85VvR+Pz9vX6znbQvW+/SRJnl+j6PX4folbf4F7XauC2qQ66kC6/aP4itHaU9vox1P6/EjPbw7nbFowv7ibqr9hUm4l2mvs3ue/fdt2TGvk8mn8/VWQpPHrbd7MHb9eJY1Vf3Pk/uOsl9uGS8k9J/Z+PI5yzmYdPV9ZY59cUc3zULKKM+wttsbzz7Vsy5q5tGxLemyr1xxF0liNNEdGmkvp58ReJR0Do89hYGW+ozpBzfu+sz/ZHz927/OTilTUKqbUeqySYrXGr1fJhVpqxVrsu5e5dDweD6nnxF768JoRjj3fUb1vP1vvmyyjfkfVQnWCUQef9STNkaRcmK+X4hhzmHdt9DKXtp4fC4spddGH14xw7LVeqFZwd0ExC1U+GPVvP7f+AmQZrfDEaO3pSQ99n55jen57MPoYjFSwC1almFKYUW9B6yV27/P3WUwpb/x6mzeXY30XPEnKJynWZt99zKWt97PsnNi2D6e17/rtyUnjfG/sw9XGkW7JBqZxRTXPQxm7qEd6bI3nn2vZljVzadmW9FhaPvohL5aWT1Ks5n7OJfXD0r65JKktW/QDMDDfUZ1AMaX9xO59vmJK+46l5aMf8mJp+STFtt5P0vm59hXV9Pbd2w+32tzhd1Tv/lvSd1T5YNTvqFqoTjDq4HOfx8fHr8qM75SMUExpbptXdnehCUjQ+PhZYohjb/T38B20r9fjZzILVZYa9Tzg1l+Y4PCTw+Hwk8PvHH5yOJT8N8xrhSeWFKRo2eb0/oaX9DqHe8373BbnROoZZR5eYx7CFYophUm5lWivsUvxw08Oh3Iqf1hK+f1Syh+fTqdyOEz/cKp+W7Yp/tFS0hxxTOmHe/umR2PMmz6KSt1/62/b8dvj8bPkZ42m3o4NuKKa6KFkFWfYW+yj+NMV1D8sp1e/Vw7lUE6vfu+P/58/LjNvmU9q39K+aSWpHxxT12Np+STFemXetI3Nfey5pLbs8fi5ZIw2Hw5/pxwOf+/Z9ndap8R4fEd1AsWU9hP7KP47P/6y/M5P/rCU8rullP/ww3//1cOvlt/69d8qv/+f/P6kK6sjFLNoXWgivQ/T8tEPWbHWx88SNY69LV5zlNiUx45eTCnx+Em5otrsO6qHw2+WUv68lPIrz6L/XynlPy+n0/+5yj6YZdTvqFqoTjDq4HPdt1dS/+bX/mH5W3/16QP+5tfKP/iP/rNJi9UR5kjrPxRG6EP2q/Xxs8Tej70eCsmN/jdK4vGzZKH6/LmhhaJuz7vD4Tf/snz2F98rX310W+YvSinvymfl18tXf99itb5RzwNu/YUz3y5SS/ndi4vUUkr5W39VfvaXPyt33Abcq5bFHhSaoHe9zuFe816TQnLtpc3DKflMLeCVOMbXc3p/e++fny9SS3m/oPhe+aqUUv7cbcCsRTGlMCm3Eu029qFw0unV75VXv/gPbo3VX5/+uvzsL39WSik3r6xGte/O2IdPV1PySYul5aMfsmKOn37nza1CRbUsyTGhD7c4fta+3Xnu77K+nPe0Yz5hfs30H5dSfuXaVa5XpZRTKb/y4/Lj3/4nh/Ivlt7aDK6o5nkoWcUZ9hb77VLK77+0SP3gr09/Xf7s3/1Z+Vd/9a9uPSypfebNNrG0fPRDXiwtn6RYWj7XcmxlSY5JfbjFHLmkxjiPNL9W9Yvy6tVPyw/+h7KjNrMd31GdQDGlHcVmXFEtZ
[base64-encoded PNG data omitted]\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + " A* search search: 154.2 path cost, 7,418 states reached\n" + ] + }, + { + "data": { + "image/png": "[base64-encoded PNG data omitted]\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + " (b) Weighted (1.4) A* search search: 154.2 path cost, 944 states reached\n" + ] + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAA6oAAAJCCAYAAADJHDpFAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvIxREBQAAIABJREFUeJzt3b+PNMl5H/DqVzxxRfNeWpEdEXBmCBSOzGmQkENHBwOzgAniIgpMpP/AIM+BEmdSQoJg8AYKdgzjYAjObMEGD1B6B9pKrcgwHInvwvRLvvA7Dnb3vX1np3u6p6u7nqr+fIADeXXzo7q7qnuerZ7vdIfDIQEAAEAUz0p3AAAAAB5TqAIAABCKQhUAAIBQFKoAAACEolAFAAAgFIUqAAAAoShUAQAACEWhCgAAQCgKVQAAAEJRqAIAABCKQhUAAIBQFKoAAACEolAFAAAgFIUqAAAAoShUAQAACEWhCgAAQCgKVQAAAEJRqAIAABCKQhUAAIBQFKoAAACEolAFAAAgFIUqAAAAoShUAQAACEWhCgAAQCgKVQAAAEJRqAIAABCKQhUAAIBQFKoAAACEolAFAAAgFIUqAAAAoShUAQAACEWhCgAAQCgKVQAAAEJRqAIAABCKQhUAAIBQFKoAAACEolAFAAAgFIUqAAAAoShUAQAACEWhCgAAQCgKVQAAAEJRqAIAABCKQhUAAIBQFKoAAACEolAFAAAgFIUqAAAAoShUAQAACEWhCgAAQCgKVQAAAEJRqAIAABCKQhUAAIBQFKoAAACEolAFAAAgFIUqAAAAoShUAQAACEWhCgAAQCgKVQAAAEJRqAIAABCKQhUAAIBQFKoAAACEolAFAAAgFIUqAAAAoShUAQAACEWhCgAAQCgKVQAAAEJRqAIAABCKQhUAAIBQFKoAAACEolAFAAAgFIUqAAAAoShUAQAACEWhCgAAQCgKVQAAAEJRqAIAABCKQhUAAIBQFKoAAACEolAFAAAgFIUqAAAAoShUAQAACEWhCgAAQCgKVQAAAEJRqAIAABCKQhUAAIBQFKoAAACEolAFAAAgFIUqAAAAoShUAQAACEWhCgAAQCgKVQAAAEJRqAIAABCKQhUAAIBQFKoAAACEolAFAAAgFIUqAAAAoShUAQAACEWhCgAAQCgKVQAAAEJRqAIAABCKQhUAAIBQFKoAAACEolAFAAAgFIUqAAAAoShUAQAACEWhCgAAQCgKVQAAAEJRqAIAABCKQhUAAIBQFKoAAACEolAFAAAgFIUqAAAAoShUAQAACEWhCgAAQCgKVQAAAEL5UukOANCm/X7/MqX0/on/dLvb7Z6v3R8AoB5WVAFYyqkidagdACClpFAFAAAgGIUqAAAAoShUAQAACEWYEvCW8BuIxZzkMeMB2BIrqsBjwm8gFnOSx4wHYDMUqgAAAISiUAUAACAUhSoAAAChCFMCNk04CY8ZDwAQgxVVYOuEk/CY8QAAAShUAQAACEWhCgAAQCgKVQAAAEJRqAIAABCKQhUAAIBQFKoAAACEolAFAAAgFIUqAAAAoXypdAcAAJhnv98fjppud7vd8yKdAcjAiioAQHveL90BgDkUqgAAAISiUAUAACAUhSoAAAChKFQBAAAIReovAAA8st/vX6bTgVTSlGElVlQBAOBdfanJ0pRhJQpVAAAAQlGoAgAAEIpCFQAAg
FCEKQHMMDZwY+BxrRAwAgBkY0UVYJ6xgRstF6kptb99AMCKFKoAAACEolAFAAAgFIUqAAAAoQhTgo2qIdyndB/3+/3hqElgEJtVej7OZO4GMjaELkB/+h5/fG0oxbimaVZUYbtq+MAZrY9z+nObrRdQRrT5OEXNfW/R2BC6tdQ6PmrtN4xiRRVgBXP+6h3or/cAAKuwogoAAEAoClUAAABCcesvTFRJoIiABQAAqmVFFaaLXqSmVEcfx4gWQFSqP9H2wyk19JF5aj7GNfed5dU6PmrtN4xiRRUI69Sq8FCw0G6365btURlWx4nAOKRVpa41W7yewRRWVAEAAAhFoQoAAEAobv0FCG4gwEtoVkOmBrVl/n1dYymzCMF7rY4R50TYBiuqMF0N4QW5+1jDNres78NuK6FZ3Cl5PI2l/Frbp5G2xzkRNsCKKky0hb/WCnEAAKAkK6oAAACEolAFAAAgFLf+AhcRZgHA0gqHjAEFWVEFLiXMAvIqGVomMC2/1vZpqe1xTYGNsqIKAAG4E6Etax3PoRXEc8F4c54LsDQrqgAAAISiUAUAACAUt/7CBqwdRiHMYpypx6XnNS7d16FDr3LsG+ITygbLK3w+NZe5mBVV2AYf+GMqeVzWeO++8JUxoSzRxmxrwThRCGWbb848Yxtav9bQKCuqACyi5r+iC5KhFjXPsyXkmrvuDILyrKgCAAAQikIVAACAUNz6CwAZCAZal4CYO0vth5G3vobZD0B7rKjCNgjViKnkcYk+JmrcN4KB1iUg5o79wDk1nk/Biipswam/eA/9tXxMGMXc51+qpYCLsSsRpfZ1SVZpAPJwPqVWVlQBAAAIRaEKAABAKG79BUIQRLNdhUNxAEILcI50HaYIK6qwXX0BB6WCDwTRbJdjzCUExNyxH9pX+hxZ+v3ZKCuqsFH+OgrUzDnszpz9sMWgNqAeVlQBAAAIRaEKAABAKG79BVjIidvqBFJA5QIE25zi3AI0x4oqwDxTwkSifbiNQiALNYk4jyP2aUi0ML/oSu+X0u/PRllRBZjh1CrGUEAJT7USBuO4wzhWf6exv9gqK6oAAACEolAFAAAgFLf+wkIGAjeEXgAE45wNEIsVVVhOX7hFbaEXlCd4BJY39pwdcd5F7BPALFZUAYKzmgNxmI8A67CiCgAAQCgKVQAAAEJx6y+QXe7fkxz7ejPet9mwlIGAmGiaPQZzVHT8xnCMIaPC5wfzmcVZUQVopxA4pZZtq6Wfa2tpv7S0La0Q1Fa3knPKfGZxVlQBADbIihgQmRVVAAAAQlGoAgAAEIpbfwEAaMKJUD2hP1ApK6oAbQeH1LJttfRzbS3tl5a2hXq0HPpTck6ZzyzOiiqQ3W6366Y+Z+inZR6/Xu7Htc5KQt0cP6CP8wOts6IKAABAKApVAAAAQnHrLwCz7ff7lynYd8GGbv++gECWDcs8ltZizAJVs6IK1KYvwEGwQ1mhitQFtL59tMeYjc/1DAZYUQWqYoUAgBa4nsEwK6oAAACEolAFAAAgFIUqAAAAofiOKmzAQCLrnFTI277XDPJ69IiY0FsB45DaGLNA1RSqsA19RcnFxUruEAihEqvaRJG62+260n2gDcYSwPrc+gsAAEAoClUAAABCUagCAAAQiu+owkQ5gon2+/3h0ucCUKcGg8xcu4DFWFGF6bIHE818Lky1hTTQEtvY955b2N+M09q5fo3tMX9go6yoAmyMFZBl2K+Q34w7lYDKWVEFAAAgFIUqAAAAobj1F7hIjlCpFfrS9/ict4g1ESYSIOSlqv0YYH+toapjAq2KdL2FNVlRBS61RKjUpUoWDK0UK6W3o/T7T1Vbfy9RMiin1QCd1rarte2JKtL1FlZjRRUAKGJrq0Fb216AOayoAgAAEIpCFQAAgFDc+guEtZHAGjIzbogq6NgUyNOQoGPsUsbmxllRBSKr4WLbSphI6e3I+f41jJsalB4TLYo4NiP2KYJag75aOp4tbQsXsKIK0
GO323Wl+7AWf7WO7fFYHPp5pS2NWViScyKUZ0UVAACAUBSqAAAAhOLWX4AKDQRmCJ9gs5YKkhm63RoulWO8Gpu0zIoqEFnJ0IpaAzOET8Q/dlO1tj1LMv6nM77KMV6HGZsbZ0UVCGvOyqDAme0ybqiR8UVkxiclWFEFAAAgFIUqAAAAobj1FxqzVJgIAKxtjeA44XQQkxVVaI8iFdiqWsNXau33GtYIjisVTlfLca+lnzTGiioA0ASrX9Rk7HgV8sZWWVEFAAAgFIUqAAAAoShUAQAACEWhCgAAQCgKVQAAAEJRqAIAABCKQhUAAIBQ/I7qCPv9/mU6/aPPt36zDWCegXNsKEO/ZXiGawXF+SwzX8Rz1YzzUh/joUKtzm8rquP0nZRCnawI57Z0B6ASrZ9LW98+6tD6Z5m+a27Oa3Er+2rIFraxRU3ObyuqcKTrUpdS+iCl9PnhkA7HbTc3417n1F+wFvjLJwBsXs2rRsBpVlThqQ9SSv/+/n+H2gAAgAUoVOGpz1NK//L+f4faAACABbj1F47c3+77WV/bfl+iVwBlBQiSuSgUpNWQEYDWWVGFda0R9kB5jvM0re+XVravdCjHpe/fZMgIRbQyl4dsYRuphBVVOJIrTOkUf73fBsd5mkj7ayjwbLfbdWv2BYhl7Lkq93lkyus5h9ESK6rwlDAlAAAoSKEKTwlTAgCAgtz6C0eEKbXP79kCfME5EYjIiipsW8uhCS1vG8sRhNWv9D4o/f4ArMiKKpt1KjSprz1XmFJpWwpSOBV6YdWAcyIFO0Vj38Rx7jrV2rUL2CYrqmxZX0CSMCUAIht7nXLtAqqlUGXL+gKShCkBENnY65RrF1Att/6yWadCk/rax4Ypnbi19Nbtcqxtxi3OzY7X/X7/MqX0ful+QA6Pr0ldl56M7e7uSx63h0N6nhoOAlzj6xy532Pk640+F1/avw2cE5u9nm2JFVVYVssXgRoJYxnW8nhtedvYtr6xbczXa41j1/r4aH37NsGKKpu1xTClrYv211XhTsA5565JY587dO06FbR3SWDTEm3Ok7BdVlTZMmFKAEQ355o059o1J7ApdxuwQQpVtkyYEgDRzbkmzbl2zQlsyt0GbJBbf9msJcKUgGEbCPCArM4FJ53x5uH/XF/v0tXV6/TixSeT37d0G7BNVlRhOoE85BJtLK3RH0UqXG7W/Hn16r1c/WBZa5yLo11/cmt9+zbBiiqbMDbEYdxj7wJ53g2p2L9JMNGccKehgJFTwSjMF2W/CpdZVpQQoVP9Ger3zc0Xt/tcX++q3ea+/XBKrjmZ+3xaw/k5WrggnGJFla2YEtgg8AFgu6KFCK1xrYm0za6tQEpJocp2TAlsEPgAsF3RQoTWuNZE2mbXViCl5NZfNmJKYMMlgQ8CloAlTA2fKnRb8G1LtxGeCy/qTty4uUbbkiIFJ00JUzox3psai7k5P1AbK6oArKnWgItS/a4hfKqGPl6qim27unpdugtRVHG8LlDreTOldo8JK7CiyibkDVN62nZzs+bWQL38ZZ1Izp3bS/btjGcpY5hfpOCkKWFKW9F33hSsRuusqLIVwpQAOFbruT13HyMFJ9Ww/4EVKFTZCmFKAByr9dyeu4+RgpNq2P/ACtz6yyYIU6IWU8Nzel4j0u1ggjQ2YMaYKzo+zgUnRTXndt+e31h9+3r3QU63h0N6noKHKQFts6IKXKov3KHm0IcIqvigPEFr27O21udTpPERqS8p9Xw9c0xwUoZwpWD7goq1fg5jQVZU2QRhSvlZJYPlzZlnwVbWQxobnHRzU9dtMy9efJJSSun6evc2dCk9WjUdQ5hSe3a73co/fATzWFFlK4QpAXCs9fP4nO0TpgQUpVBlK4QpAXCs9fP4nO0TpgQUpVBlEw6HdDgc0mePbyM61TblsX3PB6AOrZ/H52zfnGth7jZgm3xHFSCW29RWkIkgjXJqGEsVjI9m6qVJ46Hrnmz4QxLw2oqO49zf9T7xe
pLRoYdCFSAQH1jIxVjKpQsdQjO2kBpbZJ4oUB8UKRZPjePGgsKi/zEJinHrL5vQdanruvTN+zTB3rYpj+17PgB1GHsen3NdmHtNmXP9yX09i7YfgLYpVNkKqb8AHBt7Hi+Zdjvn+pP7ehZtPwANU6iyFVJ/ATg29jxeMu12zvUn9/Us2n4AGuY7qmzCfXrgZ+faxjx2v9+/TCm9f3OzWHfDedjmQm8vaIKqFZ4/czQ79z766MP06tV7KaX0Zszjx15DcreNeex+n7ff3fCNtW/31/3jHgKWFtsPU6+5jX1/FTbNiiqM0H3cdd3H3Xe7j7suxf/A2ZeiOSdds+Q2R9/fcE6tY7jWfh97cu67L1Ivfn7jpmzvGmOklXHYZ2vjC0azokpz7oMWPkgpff7wO2xj2061dx93XTqkn6SUfpBS+tnhcEjdmT85l9TqCgjAWO+ex+/OiY/b0vBK6rN04TUkd9uYx+a4u+foPSbtr7X2QyTnUqCHVnUjJ0hDNFZUaVG2wIb7FdSfpMOz76cudenw7Ps/+18/S4dDyGsnAHdqDAxaIkxprOj7i2i67vdT1/3TR//8fuku0R4rqrQoT2DDd3/8eUrpJyml76Vnb34vpZTSsze/9+mvPk0ppfSDf/yD0CurABtWY2DQEmFKY0XfX0TSdd9IKf0ipfQ7j1r/X+q6f5YOh/9WqFc0SKFKc3IEV3Qfd3dF6m+/8sfpd3/9zn//zeE36T//779JKSlWASI6CgeaFGZ1LjzvIbxohbbb3W73fE6Y0ljRA5ZatXbQ2sigqeEQta77xq/S81++n16+c1vmm5TSbXr+y6913R8qVsnFrb9w5O3tvil977hIfet3f50+/dWnaUO3AZcMexA0Qe1qHcO19vvYlELgeJsFyd0pGbAUbRyO6c/YUMNIx/hBf5/ubu/9xXGRmtJdQfF+eplSSr9wGzC5WFGlObNCHB6Ckw7Pvv/2dt8evzn8Jm3lNmABTXA582dZ587vZ54+GJwU4WfI1ghTGvG+xQKWHubPu9u8733v6+vd5DCsode7JPyo4Tn/j1JKv9O3yvUspXRI6Xd+nH78nX/Tpf9QU0AWMVlRpUVzAhu+k1L6wbki9cFvDr9Jf/33f53+9td/O6/HAFxqjSCgkkr1MVLA0lrvzUxv0rNnn6QP/zzZ12SgUKVFcwIb/mtK6WfpzbP/O+aNvtx9Of3RP/yj9Adf+YN5PQbgUmsEAZVUqo+RApbWem9mepbevPkwffKnyb4mA7f+0pw5gQ2HHx0O3cfdD1P3JqWUvpdS+gd97/Pl7svp21/7dvO3/U6xdjBEAcMhEwvZwH5NqdC+zSnocap+v56TKzjpVFuOoKK5SvUxUsDScfvQNl/yGWCN4xz0/HBSX+jSez//efoXf/In6Uu//vXJla43KaWX6Wvv/3n6009SGj72MIYVVThy+NHhkFL6YUrpL9Nvv3L6Qb/9iiL1tCouwjOU2r7W92tKbWxjxG2I2KclzQlOuvQxS4kWInSsZMBSjarfB6+/+tX01x9/nG7T8ydfUL4vUtO306fp79M7WUrVbzflWFGlOXNCHL5oP3yQvvvjH6b3/+cfpz/8y/RO+u9vv5LSL7+XfvC9f65IhQ0pFWozV85gmyXacr7mmV0xOWTncHga5BNt36wUpnSqj6sFLB23D23zJe8Tde5G9PLrX09/82//dfqjH/0odW++ONz/59WX07fTp+m/p288eY4wJS5lRZUW5Qls+C8//iD91U9T+uX30tuV1fsiNf3VTxWpsD21BrLkDrZZKygndyhOpG1eYt/kFmn/r/k+nPHy619P//Ev/iL9pz/7s7f//JP0P04Wqffsay5iRZUWZQxs6FL6q5/etX7r52+L1JQUqbBBtQay5A62WSsoJ3coTqRtXmLf5BZp/6/5Pozw+qtfTa+/+tW3/350u+8x+5qLKFRpzpwwpeP2u0XT+2L1l/8qpb/7TlKkwjadC+25vt71PTMVP
m8cB9u8I1Jbjuf3mRW0V6htzGNLhSmdapsYsPSOgbbbwyE9rz1M6ZSPPvowvXr13oxXKH5uGaVvbMM5bv2FUbqU/u676YILQvQwjNxa395S29f6fk2pvm2cEBAS/4PkBtQ2vmq1xH5uJYznyb6ZV6SmVMm5xdzjYlZU2YRLAxumvs9ut6viqrGUNX4Goy82//79m9z/rf+8yINIAT8zQ3so74LgpHhtYx67ZhBQ7oClS9+7ljClc8FcKdO+CWbU3IMxrKiyFUsENgB5RQqxEb5St0hjpNYwpVNKzpUaw5QiHbu1bHGbWYhCla1YIrAByCtSiI3wlbpFGiO1himdUnKu1BimFOnYrWWL28xCusPBCvw5W7zVkDtd13+Lys3N0/SFS8bDfr9/Espy73Yrt3xO0fp8HBgPtbpoHJ8KK4KxDocvvrzX4JwaLcI5ceg6OsfV1ev04sUnT9ovvA4vfl3Zyjnt8dxjPa1+NrKiCuX1Xbiav6BxUmvH/dLtaW0/sJ7j8BZjqaxFwnTmBxGtboFxGG6xSXASWSlU2YSuS13XpW/ef6G/t22oHchnypyM7OZm/+SfdHdt/VZK6dnhkLqHf061a1vkNZ/XOJZqdW4u3/+0zMXHM3d/htrmmPkeg/vh1Hnm7p9/l0rO3RNt5h5ZKVTZCmFKEEvLgRtzQ3G21rbm+5BftGNXaozMeY+5/Ys0n809slGoshXClCCWlgM35obibK1tzfchv2jHrtQYmfMec/sXaT6be2QjTGmEVr+gzLumBh2cClOqUNbApg2EliwecDV0vqnVufNkzSEjuULVWE6Lcyqz0MF9GcKYbh9+37VP7s95U89p525x9jmUc1odI1ZU4QujLypXV6+X7MeachcHVRYbE6yxfa2FUYzZnirHTUPngda1Nqdyiz7/5h6/Ets35T2NT+jxpdIdgDXcf6n/g5TS54fD3V9nT7X1aWT1lApEXtnI6fH8G3qcucdc0edU7Su+Y6+vl7Y9rIYenTPe5OzjzU3ebT7z9GdDzz33eQS2xIoqW+EL/xCL+QdtKBm4lbuPa7yezyMwkkKVrfCFf4jF/IM2lAzcyt3HNV7P5xEYya2/bML9rTSfnWvrc329e+ffr65epxcvPsnWv5Jqv+1sbSf2V+ggkqgez79uIOah5bkHLRh7fZ3Tdtw+dM7o8fZW4fvnPgQsfZZSSvuJ3zA46svU4KRZn0eONXQNdy3lCSuq8IXRgQavXr23ZD+oS/QgkhqYe2xZ6TCd0u9/iUgBS4KT8nAt5QkrqmzCWoEN19e7USEJ74Y47HvfI1ekeEN/cR20ZAT7VvbhWo7mxaS5d8k8y902NG9hirmrSFPPTWv8VEXuuXei/ck549ScPL4jo6+PQ2FKgpOgHCuqbEW0wAbBCWzdGmEka4W8AO9aK0wp9/V67OMEJ8EKFKpsRbTABsEJbN0aYSRrhbwA71orTCn39Xrs4wQnwQrc+ssmrBXYcElYxNQQB2jBuXkxNPfWCG8xb+FyS4cpnWq7YE6+vVX4+nrXG9Q2Jzjp8e3ID/17fJux8wgMs6IKAFAvAT09rq5ej37syKC20UXqlPcmpWQcc4IVVTZrTGBDjtc8H8oyd0vyWyNs41JCjepzybzI/Xr5w5Ry7BmY71QYU8TzZOYwpVFz8mGF9HEAW5oYkjj2vHRz0/byaOTPBbTLiipbJkwJ1pF7XghTgvqUDFNaI3QJyEyhypYJU4J15J4XwpSgPiXDlNYIXQIyc+svmyVMqd+J28Zu5/7WH9t1ybwYmntjA0qWbCtpv99PCnQJzrmloDXH0ok5dXs47J6nFcKUcl3Xpz4XmMeKKjBGKx+KoQUtzceWtqVGJfe/Y18PQUcUYUWVzRKmBOvIHaYEtGHpMKVTz53Tx6HHPQ5sOt2/fW+I07mgoqFwLCFHtMyKKlsmTAnWYV4Ap7QUpuScBpkpVNkyYUqwDvMCOKWlMCXnNMjMrb9sljAlWEfuMKVWNRaSB
GeNPTeMeWzpMCXXesjPiioAxLDFIlVIS1kl979jDwyyogpHhClBXrnDlM6FlsxpGxt4MhRuQj/BL7FE/GmgWsOUXOshPyuq8JQwJcgr97yY83rmI8QmTAlIKSlU4RRhSpBX7nkx5/XMR4hNmBKQUnLrLzwxNjjh+nr3zr9fXb0WpgQn5A5TuuT1xraZj1BWlDCljz76ML169V5KKfV+HSBX/4DTrKjCsNFhD/cXNIBLtR4u0/r20ZCJ13RjGxZgRRWOHAUiPD9uSwN/XRWmBE/lDlPKFZwUbT5GDLaBKEqFKfWYHOjmWg/TWVGFp0qFvECrhCkBc5UKU5rTF+cWmEGhCk+VCnmBVglTAuYqFaY0py/OLTCDW3/hyJyQl/TotuD7x93e3z5cfcBCwd+NvHVL5Dj7/f5lSun9E//p7D4ceO5sj295exjv59qOw8oeqy1Macl9m5F5RlHn5sm5c8aQx7+PPHRuGfpvj0U5t0DrrKjCdFNCE6J/OK2BfThe374asw/t5+XUsG9r6CNtW2UMXl29nvsSgpNgJQpVGKHrUtd16Ztdl7rDIT0/HFKX7ubPt9KZefT4uUNtS/Z5qfeAEsbOqTltQJtevPgk3dzs3/nnjLfX+sPh7jOAcwasQ6EK46wR/JKbEAdaJUwJWItzBhSiUIVx1gh+yU2IA60SpgSsxTkDChGmBCPkDFi6unqdbm4+GfW+cwKM/GYbtfjoow/Tq1fvjX78+dCSu1CWS0KchoydjwWDx8iscBCWgKsAxgYnAflZUR2n74vzvlDPg9FjYcoHclJK5tlaiu3niXNiTD8FA01nnp1WcixtbRwXG4MDAUvmBbVoslaxojqCv2hyyn2Iwgcppc/vf4Lmnbb0aCU1iuvr3bN037/Hcf1j7HY7oRENm3qeOxr/h6H2c21peK48G3ruqbYW7yYw/2jd3M9aQ3cynJo/l5yrHp/rIJJWaxUrqnC5GgMWovePevSNpdwBRgKRgCU4t0BwClW4XI0BC9H7Rz36xlLuACOBSMASnFsgOIUqXOhwSIfDIX32+FagU22RRO8f9egbS2PnxdixmPv1AFJyboEa+I4qLOc29YRhXF/vsr3J1dXr9OLFqBThqr9Q/0jffg29fYXTQ9dIoh2VUNp1acp+uPSY9s69SoUe240rOZZGHfeJ55ZFx1Lp81yfkee/yCnLveOwZ9sibwuMplCFhTwELD3Wdfn/Ivvq1XubClqp+OIb7sNbZmO3r/dxh0PKMo7njJGxgSxTg1sufR/KquR80zunCoylms9zYfveNw4HziNhtwWmcOsvZNR1qeu69M37lMDetlLvu0Zf2LY5427OOM7dBgCUpVCFvEolBkovJIpSCb+52wCAghSqkFepxEDphURRKuE3dxsAUJDvqEIuwSwPAAANJ0lEQVRG96mAn/W1dcvdWPjm4f/cv8ft/XdkP0vpi4CLm5vL32CFMJ5q2TdfOBrvk4JVzs2fNdr2+/7+jT3Oc8dD5vEkVAWAKllRhXWtld55XBwIVmBpp8b2lHEn2XYZ5j5rqnke19x3aJIVVVjYfUDLBymlzx+SgI/aDpe2pUcrqUPvO2cllTIuSevMkSQ7Z2wet595q2eXvs+SbeYKXK7k6r0kbWiPFVVYXqngFwExXGLuOKwxOMlcAYBgFKqwvFLBLwJiuMTccVhjcJK5AgDBuPUXFrZk8MuZcKa3twVfX+/S1dXr9OLFJ9M3gE25dGzWGJx0qm0oTAmok8A7qJMVVajb6PCHV6/eW7If0EpwUuS+XaK17YHWmKPQw4oqVOZcOFMaCFi6vt69DbC5udn3Pu4UYRTTnQ/umXYM1u7fJWFKPUIGJ51u22UNPOsLmho69uYatMv8hvGsqEJ9BCzVI3pwz1oBRJFCkkqGM0U69gAQmkIV6iNgqR7Rg3vWCiCKFJJUMpwp0rEHgNDc+guVEbBUj+jBPRcGeD0JTToz5kKEJJVuO27Pfez3+31fmNVtyd+2J
LaBcQNQnBVVaI+AJZY09UOtoJB19B0XRQhDjI91OR/CBFZUoQFzApZYzvkwpZK9uywcaOj1DofUxQhEitd23F762EOrzoUVDf1UjaAjiMWKKrQhemjPVkU/LoKT1msbagcAjihUoQ3RQ3u2KvpxEZy0XttQOwBwxK2/0ICZAUshtBjq8fj2zvvwnNv73+msIkzpVHDS3Nc81fZw7E/sr1P7cI22d47TnG3ray997NmeFs+xQNusqAJjrBEAsYUPULVt45T+zhkj0fZLtP5ADlsY18KKoCFWVKFRYwNwTrm+3j17eG5fGEzu0JmthMvM2eY5+zB3cFK6+0NnljES8dgLU4K8BBUBU1lRhXbNCW6ZEgazROhMy3Ifl7Gvl/uYLDFGIhGmBAAFKVShXXOCW6aEwSwROtOy3Mdl7OvlPiZLjJFIhCkBQEHd4dD7c1JAI7oumeiLOaSUnt7RdnX1Or148cmT9se3v835Pb++oKPj9/3oow/Tq1fvDb3URQ6HExt9oaH9sEWX3CLptyG3q5b5E2UctjRXWtoWOMWKKmyDgInFnP4s0FMc5jwOJ4NRjt93iSI15R9Pxie0zRwHJhOmBI06CnR5ftyWUnpTsHubcCqUKlf40bI9f2JwO+YHbu2ejM95wVx7Yxt6WGkDamFFFdpVY4BNa9YIP1rD3MCgUm0AQKUUqtCuGgNsWrNG+NEa5gYGlWoDACrl1l8IaL/fnwzKmeLxbaX7/dO26+vdnJdnnLe3oHZ3N9vd3t+G/VlKXxyXU+5vaf3s/rmTxkPuY/u4L0NtUx67ZNvQfiWvHOeqldzudne3mEez9j4sFL4Udv8DcVlRhZgW/9BydfV6wqOrCJWswaXHtWQhUGMISo19rlUNRWpKsfsZuW+5bGEbgcysqMJGnfrplCWdCxaque24PQ0EVV0SpjS4Yx8FHQ29bxoZiJRzP5Q7LnnDmaYGNo0Jq6nlJ0UAoBQrqsBaIoXsLBHaMyckqdTj1toPLbcBAAtQqAJriRSys0Roz5yQpFKPW2s/tNwGACzArb/AKh7fKnkq3Om+7fb+ts2iYTxT247bu+EbP9/uh+vrXbq6ev3ObdgfffRhevXqvXceN2Ts+5bYDy23CWzahoGgI+FAAAuzogoxbTUMppXAjdHH774o7f33ie/T975bHU8sr5axdWk/+85JOc9VtezDObawjUBmVlQhoDl/qRfSUsZR+M6TMJ80coX0jHOBSIuHCNURprR821AQ1pZYVZxviX04dB0YE/YFEIEVVYA81gjfiRQiJEwJAFiMQhUgjzXCdyKFCAlTAgAW49ZfgAzOhe8MBR1dX++yvEfptmj9EaYEAPWyogpE0nLgxtxta3nfbJHgKwAYYEUVNkSIxrpmBiydC05aJDRLmNI6YUpCiABgmBVVgOXMCeQpFeYjTGlaGwCwAIUqwHLmBPKUCvMRpjStDQBYgFt/gbP2+/3LlPcH7ue6reHWyTkBS1MCjHISpjSuTZgSSxs471Zx/nsQ8PrRa8bvkFd1TKAWVlSBMaJ9yIjWn0sJ1AH69J3najv/1dbfS2xhG2F1ClWAFXVd6roufbPrUnc4pOeHQ+rS3bn4WymlZ4fDXfvjxxXucjrVl77+jX1sK20AwDIUqgDrqjG4R5hSfxsAsACFKsC6agzuEabU3wYALECYEqyodKjEjKCITVnyOD3+/c2HQJ6xbTPMCvq4NEzpYT+e2Jbbw2H3fMxrRm0TpgQAy7KiCusSuJDH0mFDrR2nUtvTShgMcLkthMOV2kaBfDTNiiqwit1uJ4CmIfeBQh+klD6/X2l80v54JXXM82tqG9o24At+tmU59i2ts6IKwCWmhCmNfX6NbQDAAhSqAFxiSpjS2OfX2AYALMCtvwBMNi5Madrza2qbG6a0ZrBaoRC1WQFeAGBFFda11YCD2ra7tv6e09r2tKD1QKnWtw+AhVlRhRWtscIwtHoi0GgcK0HLixCIJEwJAOKyogpACZECk
YQpAUAwClUASogUiCRMCQCCcesvAKs7H1Z0Fzb0+Bbb+wCj28Nh93zouWu0zQ1TIp6pAVeFQqpmv3fJfh8RuAUMsqIKQER9BYOQnjrUGOBlbK3L/gYGWVEFYHVzwoqEKS1D2BoAkVhRBaCEOWFFwpQAoHEKVQBKmBNWJEwJABrn1l+gSVODUSrQVPDInLCikaFLack2YB2Fz+VNnXehNlZUgVa1VKSm1N725FLrfokWNhStPyXYB+sau79LzvFazy/QBCuqAFSlhVCj3Ks0Qz85IiRpnFPHpOR+dUyBrbOiCkBthBoBQOMUqgDURqgRADROoQpAVQ6HdDgc0mcPv28KALTHd1SBsBpM7p1D0MtptynWGHGc2tI3vhzn9ZSc444zFKRQBSKLVID0EmxSjp+OYEnGV3mOAWyXW38BAAAIRaEKAABAKApVAAAAQvEdVQCgOsLWANpmRRWIrIbExRr6SPv6xmHL41ORCtAwK6pAWNIeYRxzBYDWWFEFAAAgFIUqAAAAoShUAQAACEWhCu3ZYqgKsD3OaQANE6YEjRGqAmxB6+e6/X5/KN0HgJKsqAIAABCKQhUAAIBQ3PoLAAP2+/3LlNL7pftx5Lb1W18B2DYrqgAwLFqRmlLMPgFANgpVAAAAQlGoAgAAEIpCFQAAgFCEKQHZBA2dyUmADQCTVX59dO2jCCuqQE61XoTHan37crud2B5VxP5G7BN5tTJ/uFPz9aPmvlMxK6oALKKVv8C3sh3UxbgDts6KKgAAAKEoVAEAAAhFoQoAAEAoClUgp9ZDPlrfvkgEyQAtqfncVXPfqVh3OBxK9wEAAADesqIKAABAKApVAAAAQlGoAgAAEIpCFQAAgFAUqgAAAISiUAUAACAUhSoAAAChKFQBAAAIRaEKAABAKApVAAAAQlGoAgAAEIpCFQAAgFAUqgAAAISiUAUAACAUhSoAAAChKFQBAAAIRaEKAABAKApVAAAAQlGoAgAAEIpCFQAAgFAUqgAAAISiUAUAACAUhSoAAAChKFQBAAAIRaEKAABAKApVAAAAQlGoAgAAEIpCFQAAgFAUqgAAAISiUAUAACAUhSoAAAChKFQBAAAIRaEKAABAKApVAAAAQlGoAgAAEIpCFQAAgFAUqgAAAISiUAUAACAUhSoAAAChKFQBAAAIRaEKAABAKApVAAAAQlGoAgAAEIpCFQAAgFAUqgAAAISiUAUAACAUhSoAAAChKFQBAAAIRaEKAABAKApVAAAAQlGoAgAAEIpCFQAAgFAUqgAAAISiUAUAACAUhSoAAAChKFQBAAAIRaEKAABAKApVAAAAQlGoAgAAEIpCFQAAgFAUqgAAAISiUAUAACAUhSoAAAChKFQBAAAIRaEKAABAKApVAAAAQlGoAgAAEIpCFQAAgFAUqgAAAISiUAUAACAUhSoAAAChKFQBAAAIRaEKAABAKApVAAAAQlGoAgAAEIpCFQAAgFAUqgAAAISiUAUAACAUhSoAAAChKFQBAAAIRaEKAABAKApVAAAAQlGoAgAAEIpCFQAAgFAUqgAAAISiUAUAACAUhSoAAAChKFQBAAAIRaEKAABAKApVAAAAQlGoAgAAEIpCFQAAgFAUqgAAAISiUAUAACAUhSoAAAChKFQBAAAIRaEKAABAKApVAAAAQlGoAgAAEIpCFQAAgFAUqgAAAISiUAUAACAUhSoAAAChKFQBAAAIRaEKAABAKApVAAAAQlGoAgAAEIpCFQAAgFAUqgAAAISiUAUAACAUhSoAAAChKFQBAAAIRaEKAABAKApVAAAAQlGoAgAAEIpCFQAAgFAUqgAAAISiUAUAACAUhSoAAAChKFQBAAAIRaEKAABAKApVAAAAQlGoAgAAEIpCFQAAgFAUqgAAAISiUAUAACCU/w8XqOddMO5L7QAAAABJRU5ErkJggg==\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + " (b) Weighted (2) A* search search: 162.8 path cost, 782 states reached\n" + ] + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAA6oAAAJCCAYAAADJHDpFAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvIxREBQAAIABJREFUeJzt3c+rLOl5H/C3rzSek4lmFG+SgEEku2AkRlkGJljIy+CFMPRZiKBFkK2N/R+EmcnCm+zsjYPQ4mJEchrCJYTsgpDRQJYZ4cTb7EKW1jlkfDWX3M7innNun+6q6qquH+/zvvX5gBhNTf94q+qtqn7OU/3tzX6/TwAAABDFs9wDAAAAgEMKVQAAAEJRqAIAABCKQhUAAIBQFKoAAACEolAFAAAgFIUqAAAAoShUAQAACEWhCgAAQCgKVQAAAEJRqAIAABCKQhUAAIBQFKoAAACEolAFAAAgFIUqAAAAoShUAQAACEWhCgAAQCgKVQAAAEJRqAIAABCKQhUAAIBQFKoAAACEolAFAAAgFIUqAAAAoShUAQAACEWhCgAAQCgKVQAAAEJRqAIAABCKQhUAAIBQFKoAAACEolAFAAAgFIUqAAAAoShUAQAACEWhCgAAQCgKVQAAAEJRqAIAABCKQhUAAIBQFKoAAACEolAFAAAgFIUqAAAAoShUAQAACEWhCgAAQCgKVQAAAEJRqAIAABCKQhUAAIBQFKoAAACEolAFAAAgFIUqAAAAoShUAQAACEWhCgAAQCgKVQAAAEJRqAIAABCKQhUAAIBQFKoAAACEolAFAAAgFIUqAAAAoShUAQAACEWhCgAAQCgKVQAAAEJRqAIAABCKQhUAAIBQFKoAAACEolAFAAAgFIUqAAAAoShUAQAACEWhCgAAQCgKVQAAAEJRqAIAABCKQhUAAIBQFKoAAACEolAFAAAgFIUqAAAAoShUAQAACEWhCgAAQCgKVQAAAEJRqAIAABCKQhUAAIBQFKoAAACEolAFAAAgFIUqAAAAoShUAQAACEWhCgAAQCgKVQAAAEJRqAIAABCKQhUAAIBQFKoAAACEolAFAAAgFIUqAAAAoShUAQAACEWhCgAAQCgKVQAAAEJRqAIAABCKQhUAAIBQFKoAAACEolAFAAAgFIUqAAAAoShUAQAACEWhCgAAQCgKVQAAAEJRqAIAABCKQhUAAIBQFKoAAACEolAFAAAgFIUqAAAAoShUAQAACEWhCgAAQCgKVQAAAEJRqAIAABCKQhUAAIBQFKoAAACEolAFAAAgFIUqAAAAoShUAQAACEWhCgAAQChfzT0AAOq02+1uU0rvN/ynu+12+8HS4wEAyqGjCsBcmorUruUAACklhSoAAADBKFQBAAAIRaEKAABAKMKUgEfCbyAWxySHzAdgTXRUgUPCbyAWxySHzAdgNRSqAAAAhKJQBQAAIBSFKgAAAKEIUwJWTTgJh8wHAIhBRxVYO+EkHDIfACAAhSoAAAChKFQBAAAIRaEKAABAKApVAAAAQlGoAgAAEIpCFQAAgFAUqgAAAISiUAUAACCUr+YeAAAA4+x2u/3RorvtdvtBlsEATEBHFQCgPu/nHgDAGApVAAAAQlGoAgAAEIpCFQAAgFAUqgAAAIQi9RcAAA7sdrvb1BxIJU0ZFqKjCgAAT7WlJktThoUoVAEAAAhFoQoAAEAoClUAAAB
CEaYEMELfwI2Ox9VCwAgAMBkdVYBx+gZu1FykplT/+gEAC1KoAgAAEIpCFQAAgFAUqgAAAIQiTAlWqoRwn9xj3O12+6NFAoNYrdzH40iO3UD6htAFGE/b44+vDbmY11RNRxXWq4QPnNHGOGY8d5ONAvKIdjwOUfLYa9Q3hG4ppc6PUscNveioAixgzF+9A/31HgBgETqqAAAAhKJQBQAAIBS3/sJAhQSKCFgAAKBYOqowXPQiNaUyxthHtACiXOOJth2alDBGxil5H5c8duZX6vwoddzQi44qEFZTV7grWGi73W7mHVEeuuNEYB5Sq1zXmjVez2AIHVUAAABCUagCAAAQilt/AYLrCPASmlWRoUFtE/++rrk0sQjBe7XOEedEWAcdVRiuhPCCqcdYwjrXrO3Dbi2hWbyRc3+aS9OrbZtGWh/nRFgBHVUYaA1/rRXiAABATjqqAAAAhKJQBQAAIBS3/gIXEWYBwNwyh4wBGemoApcSZgHTyhlaJjBterVt01zr45oCK6WjCgABuBOhLkvtz64O4rlgvDHPBZibjioAAAChKFQBAAAIxa2/sAJLh1EIs+hn6H5peY1Lt3Xo0Ksptg3xCWWD+WU+nzqWuZiOKqyDD/wx5dwvS7x3W/hKn1CWaHO2tmCcKISyjTfmOGMdar/WUCkdVQBmUfJf0QXJUIqSj7M5THXsujMI8tNRBQAAIBSFKgAAAKG49RcAJiAYaFkCYt6Yazv0vPU1zHYA6qOjCusgVCOmnPsl+pwocdsIBlqWgJg3bAfOKfF8CjqqsAZNf/Hu+mt5nzCKsc+/VE0BF307Ebm2dU66NADTcD6lVDqqAAAAhKJQBQAAIBS3/gIhCKJZr8yhOAChBThHug6ThY4qrFdbwEGu4ANBNOtlH3MJATFv2A71y32OzP3+rJSOKqyUv44CJXMOe2PMdlhjUBtQDh1VAAAAQlGoAgAAEIpbfwFm0nBbnUAKKFyAYJsmzi1AdXRUAcYZEiYS7cNtFAJZKEnE4zjimLpEC/OLLvd2yf3+rJSOKsAITV2MroASTtUSBmO/Qz+6v8PYXqyVjioAAAChKFQBAAAIxa2/MJOOwA2hFwDBOGcDxKKjCvNpC7coLfSC/ASPwPz6nrMjHncRxwQwio4qQHC6ORCH4xFgGTqqAAAAhKJQBQAAIBS3/gKTm/r3JPu+3oj3rTYspSMgJppq98EYBe2/PuxjmFDm84PjmdnpqALUUwg0KWXdShnn0mraLjWtSy0EtZUt5zHleGZ2OqoAACukIwZEpqMKAABAKApVAAAAQnHrLwAAVWgI1RP6A4XSUQWoOziklHUrZZxLq2m71LQulKPm0J+cx5TjmdnpqAKT2263m6HP6fppmcPXm/pxtdNJKJv9B7RxfqB2OqoAAACEolAFAAAgFLf+AjDabre7TcG+C9Z1+/cFBLKs2MRzaSnmLFA0HVWgNG0BDoId8gpVpM6g9vWjPuZsfK5n0EFHFXrYbNImpfRhSumX+33a91l2c5NrtHXTIQCgBq5n0E1HFfr5MKX0H+//OXQZAAAwgEIV+vllSun37/85dBkAADCAQhV62O/Tfr9Pnz/c4jtkGQAAMIzvqMJAm01qSze92+9TyO+bdCSyjkmFvGt7zSCvR4uICb0FMA8pjTkLFE2hCsO1fcCP/MF/8jFPHQIhVGJRkefqZLbb7Sb3GKiDuQSwPLf+wpHNJm02m/Tt+wTf1mV9nwsAAAyjUIVTY9J8pf4CAMBIClU4NSbNV+ovAACM5DuqcOQ+sffzlDqDk9q8fvg/19fbdHX1Kj1//uLkQbvd7jgVeEyoEQAFqDDIzLULmI2OKnQb9YHi5ct3FnkfGGgNaaA51rHtPdewvemntnP9Euvj+IGV0lGFI/dBSB+mM7fv3tzsHv//9fV25lHBdHRA5mG7wvT6HlcNdyoBhdNRhVMCkQAAICOFKpwSiAQAABm59ReOHIUp0aIjFGTxcI2hASUT3yJWRZh
IgJCXorZjgO21hKL2CdQq0vUWlqSjClyq7UN6jg/vOQuGWoqV3OuR+/2HKm28l8gZlFNrgE5t61Xb+kQV6XoLi9FRhSN9w5QAGGdt3aC1rS/AGDqqcEqYEgAAZKRQhVPClAAAICO3/sIRYUpxrCSwhomZN0QVdG4K5KlI0Dl2KXNz5XRUgchKuNjWEiaSez2mfP8S5k0Jcs+JGkWcmxHHFEGpQV817c+a1oUL6KjCEWFKPNhut6vpqfurdWyHc7Hr55XWNGdhTs6JkJ+OKpwSpgQAABkpVOGUMCUAAMjIrb9wRJgSJegIzBA+wWrNFSTTdbs1XGqK+WpuUjMdVSCynKEVpQZmCJ+Iv++Gqm195mT+D2d+5WO+djM3V05HFY4IU4pjTGdQ4Mx6mTeUyPwiMvOTHHRU4ZQwJQAAyEihCqeEKQEAQEZu/YUjpYcpzRUmAgBLWyI4TjgdxKSjCvVRpAJrVWr4SqnjXsISwXG5wulK2e+ljJPK6KjCEWFKAGXS/aIkfeerkDfWSkcVTglTAgCAjBSqcEqYEgAAZOTWXzhSepgSAACUTkcVAACAUBSqcGSzSZvNJn37PlQJAABYmEIVTglTAgCAjBSqcEqYEgAAZCRMqYfdbnebmn/0+c5vttVn6jCl6+vtk3+/unqVnj9/Mf6FoRId59hQun7L8AzXCrLzWWa8iOeqEeelNuZDgWo9vnVU+2k7KYU6WZHP1dWr3o99+fKdGUcCRar9XFr7+lGG2j/L3A1cfolatlWXNaxjjao8vnVU4ch9iNKHacCtvw8d0uvr7bOD576+9H3vu7oAQA8ld42AZjqqcGpMmFKu5wIAQDUUqnBqTJhSrucCAEA13PoLR8aEKfV9blPA0uFzAaIJECRzUShIrSEjALXTUYX59A5wELBUnSVCPWpS+3apZf1yh3Jc+v5VhoyQRS3Hcpc1rCOF0FGFI5eEKTU9d79PHzS8XmvAUlOYkoClMunSDBNpe3X91MN2u53gB6uAUvU9V019Hhnyes5h1ERHFU5NHYjU9/XGPBcAAKqhUIVTUwci9X29Mc8FAIBquPUXjkwVpnTB6z3eFnz/uLv724cFLE2s69YogLVxTgQi0lGFZQ0JKVgi6KPm0ISa1435CMJql3sb5H5/ABakowpHxoQpnXu93AFLawpSaAq90DXgnEjBTtHYNnH0vTYcLru5yTVagMvoqMKpqQOMBCwBMKW+1wbXC6BYClU4NXWAkYAlAKbU99rgegEUy62/cGRMmFLDraV3+/32JBBJwBJzGnGL812tt3fudrvbtMz3vmF2R9eQk7nddL3Y7ZYd4xKW+DrH1O/R8/V6n4svHd8KzonVXs/WREcV5tXnIhAtYKlmwli61Ty/al431q1tbpvz5Vpi39U+P2pfv1XQUYUjU4cpnXuPqQOWBGa0i/bXVeFOwDnnzvl9n9t1bYgctOc8CeulowqnlgifELAEQB9TXy8AiqBQhVNLhE8IWAKgj6mvFwBFcOsvHBkTpnTJe1zwvp0BSzUGZlCPFQR4wKTOBSed8Xi9uL7epqurV+n58xcTjxBgHjqqEJOApXWIFu60xHjMV7jcqOPn5ct3phoH81riXBzt+jO12tdvFXRU4UjfkIrr6+2z9BhSsWsNP7rkfccELAlTKseYcKeugJHIwSgli7JdhcvM61x40ZLLjpd3jfvm5u3tNNfX25FboQxTHZNTn09LOD9HCxeEJjqqcCpXSIXADID8+p6Ll1jWtRygagpVOJUrpEJgBkB+fc/FSyzrWg5QNbf+wpG+oUaHj5siwGiqgCWBGVCPoeFTmW4LvqvpNsJz4UVN5+IllnGqYb5XNRen5vxAaXRUoRy9gwEEZhBYqQEXucZdQvhUCWO8VBHrdnX1KvcQoihif12g1PNmSvXuExagowpH+gZXLBFgNCZgCSLyl3UiORdqlHNsZ8wZ5iesK5i286ZgNWqnowqnIgUYjQlYAqBbqefYJcL8ALJSqMK
pSAFGYwKWAOhW6jl2iTA/gKzc+gtHcoUpnXuPoeOjTEPDc1peI9LtYII0VmDEnMs6P84FJ0U15nbf499Yvbp61XitAchNRxW4VFu4Q8mhDxEU8UF5gNrWZ2m1H0+R5keksaTU8lXRPsFJQ8KVhO8xs9rPYcxIRxWORApTOve+y73rKV0ymN+Y4yxYZz2kvufTm5uZbpuZycPPk11fbx9Dl1JH+N65UCkBS3XYbrfuw6IoOqpwKlKYUqT3BahN7efTMdez2rcNEJxCFU5FClOK9L4Atan9fDrmelb7tgGCU6jCkf0+7ff79Pm5W536Pm5qud4XoDa1n0/HXM9q3zZAfL6jClXyuaJgdylcqMsogjTyKWEuFTA/qjmfts6HzeZkJe/2+xQlhyDrPJ76u94NrycZHVooVKFK8hJK5QMLUzGXprIJHULTt5BqKjwbCtQHYf7A0TSPKwsKC7OtIRq3/sKRzSZtNpv07fvEw9GPm9qY9801ZoCIxpzvl1g29LGXjnvq5wJMQaEKp2pO/ZXiCPDW1Km4Uy8b+thLxz31cwFGU6jCqZpTf6U4Arw1dSru1MuGPvbScU/9XIDRfEcVjtwnHH6eUkqbjpubbm52rT+ePqe+47u+3j7596urV0+eO8Rut7tN+b5HI2iComU+fsao/tjrez5tOncusazPY3e7y8fdtc4ppcdr3P3jHgKWBl9Dxhh6/FT2/VVYNR1V6GWf0j/6eQqY/tg7MfPly3fGvE/OD9klfsCHQ6XO4VLHfaztPFlA4vDshmyDXPOhlnnYxjyEFjqqcOQ+JOLD9Hhr0z6l3/vDlP7pT1L67/8qpf/871LOVN3D8T2kOB6NubXTe/Tc/ZBlOY0Z9xLLoo3Hdoi3rFR1zJvtyXmy6XG5t8Ol2+bmpmvkZ9dlsWtITcfPuRTorq5u5ARpiEZHFU4dhEXcF6nf+mlKz16/+efv/WHK3FldIhwjWmBGriCTOQJPal4WbTyRlpVqjfOmSQnHVF+lbgci2Wx+M202/+Tgf7+Ze0jUZ7Pfh/xjVSj+MrYuj3/N/c4nv0zv/+/X6Vs/Tek3vnj7gC/fS+mvvp/+w/d/N23OfMEnpennSI+uQNd3Z591PbdtWe7v/Fxfby8a91LLoo3Hdoi1LPfxM8YSx94cr3nJsjTDuXOJbdOVl9B0/SltO0Q8fqJ0VIdum8nee7P5ZkrpFymlrxws/X8ppX+e9vv/Mcl7MEittYpCtYdadz7tNp9uNimlP09fvvcHT4rUB1++l3737/+z9MN/+MOzxerSc2TT/gPuI+xTSqY6XMbxU7qbm47Eovn0CrOa+jPKPNeQy11dvUrPn7/IPYwnpipUgwatdc+7zeabv0of/NX76fbJbZmvU0p36YP09XT7LcXq8mqtVdz6C0cei9SUvt9YpKaU0m98kT771Wfpx//nxyngH3tmCGYo9hwHATh+SnZ19SrXW2dLWs/0vo1GBgHOoc/26RvgFa1ITalrTG9u7/3FcZGa0puC4v10m1JKv3AbMFMRpgQHNp9uNmmf/jztn/3L9Oz13+l67K/3v06f/eqzlFLq1Vmd05hwDACeeLy9NdfPkB2aOkypx3uEu4Zst9tN39udL7ldfejt0z3GW+vPOv2DlNJX2rpcz1JK+5S+8kn65Hf+zSb9p5ICsohJRxWe+p2U0g/PFakPfr3/dfrZ3/ws/fUXfz3zsM4SSAEwjWjnziXO7yVcQ5YIfGKk1+nZsxfpe3+abGsmoFCFp/4ypfTj9PrZ3/Z58Lubd9N3/95302+/99szD+usX6aUfj89/YmFpmUAdIt27lzi/F7CNaTvePquS7T1q8Kz9Pr199KLP062NRNw6y8c2H+8328+3fwobV6nlNL3U0p/t+2x727eTR99/aPst/2mlNL9rTSfty3LPDyAYhyeO3dZMpSeOnd+n2KM0a8hxwFP19fb1pClc+vStGyJ/Rw0OKlRWzDPOz/5SfoXf/RH6atffNHY6Xq
dUrpNX3//T9Mfv0jpcd7c3d9O/nnDU6CTjioc2X+836eUfpRS+mn68r3mB335XpgitadQ4RjUyNeO2tk2BTk+V+Y8d0Y6b0caS0opZMjSOUUUqV1efe1r6Weffpru0gcnX1q+L1LTR+mz9DfpSZZS8etNPjqqcOTNl/73H6bvfPKj9J1PUzrqrL67eTd91POnaXI6F46x9O/51aIpqCOleX9LcA3bNaX4v5db6n6Kvl2jjefpsu3oc+f8592uvX+5qa8hly4b+nM5lx27l2yhdbr9xjfSf/u3/zp99+OP0+b121Pe/335bvoofZb+Z/rmyXOEKXEpHVU49eZL/z//5MP00Fl9+M7q62d/W1AndUygRFvwgUCK8dtG0Ee7JbbhGvdTpG2T8/gpcdnQx04p2nZYYtz0cPuNb6T/8md/lv7rn/zJ4//+cfpfjUXqPduaiyhU4dTjl/4fbwPevP6LtE/7tHn9F4UUqSmNC5RoCz4QSDF+2wj6aLfENlzjfoq0bXIePyUuG/rYKUXbDkuMm55efe1r6e63fuvxf0e3+x6zrbmIW3/hyHHowmPAUkr/PqX0l5vN5g+yDW6ASwIlupb1eWyE4JG5Xbptxixbw3ZNad5tuMSyqPspwrbpWhZtPJGW9XnsXPMu2naYc9y5jt0f/OB7I79ru08pxf/D+dh9ynrpqEIP+4/3+/3H+5/fd1iHCBdAMbPa1zfX+tW+XVOqYx0jrkPEMcEanRyL4wOh4hepyTmIEXRU4ciY0IXtdlvEVWMu2+2b4JE5tcXm379/ldt/ie3KePZTeWIENsUMU8plbPBOpDClc8FcKZ2E59agV3gb9KGjCqeELgCsQ6TAoGhhSrnUFKa0tn2X0jrXmZkoVOGU0AWAdYgUGBQtTCmXmsKU1rbvUlrnOjOTzX7wV+7WZ423GtJu6vmw2+1uU/MPYt+5lfBU7cdjx3woVZXz2HFbjgqPqd5qOCd2/Y7qzc3bFKSOYKK7h9+BbbPEdWWzSauYh/t9GV+crU2tn410VCG/tgtX9Rc0GtW232tbnweO23LYJyvQEUwUZf/PMI5wzSbBSUxKmBIcWVtwBQBEMUfwzhLX9XPvcebpnQFENze71tCl7Xa7iRQAJjiJKemowilBAACQxxzX2yWu62PeY+z4IgWA+bzEZBSqcEoQAADkMcf1donr+pj3GDu+SAFgPi8xGbf+wpH7W1U+T+ltCEff24K6vsx+ialfr8GkwS+5Q0tK2160yz2XoAY9z4mhzmuH1+Bzrq+3fV/28dbZzZtYmYeApftr/aAhNjoc99DgpKZ1fvpZZNzzcy+DS+moQrfaPyhPvX6213i1hVFcuj61zyWWU9sxNbXox9oc+2/udR7y+uYntNBRhSOCk8gpUmcDahD9mFrgTpCiHIfxPPy0zFEwUWu4UDoIJup63Jhr/ZzBSQKI4C0dVTglCAAA8mi7Bo8JJhrzuL7PXSo4CVZDoQqnBAEAQB5t1+AxwURjHtf3uUsFJ8FquPUXjgwJMKiB286GadheoYJIAErWFsZzFFbU6vA3R8+ELT153NXVq/T8+YvBY5w6OGmoiq7hrqWc0FGFbkIOOCd6EEmpHHusRe65nvv9L9E45qurV53/3uXly3cuHYvgpGm4lnJCRxU6NP11r+uvl9vttuPvvM2mfr2h71GTqbZXk7Vswyii/2XdfGAqY+f60Lk453lyTkeBQycBS4ed1AcPHdLr622vgKUe7ys4CRakowoAQHRTBxjlel/BSdCTQhUAgOimDjDK9b6Ck6Ant/4CABDauRCirvDDvkFMPZ47KDjp8Hbkh/Ed/mbrGgIbYQwdVQCAcgnoWU7vInVIkBMpJfOYBjqqwInIYRtCbADeGhr6t0YDwo8ufu7NTd3t0cifC6iXjioAADWbOkwJWIBCFQCAmk0dpgQswK2/wImG28buov+uJfXb7XaDgkxqVdl2cG7JKPNcWmzfTxim1Or6evvk36+uXj3+jit
wGR1VoI9aPhRTNvPwjZq2Q03rUqKc27/Efd878Ofly3fmHMfSBB2RhY4qAADVmipMab9PHzS83uu2515fb589PO7wp2qOnQsq6grHEnJEzXRUAQCo2dRhSn1fTxATjKBQBQCgZlOHKfV9PUFMMIJbfwEggMpCkiCMCcOUBh2jh8/d1f0zqzALHVUASlF7oMcai9Ta92l0Obd/ift+yDFa4vpBKDqqABRhiZ+y6Bta0vU42gl+iWUtPw00VZjSmYc+Bifdd1KfPPfmZug7AzqqAADUbOowpb6PE6YEIyhUAQCo2dRhSn0fJ0wJRnDrLwAA1bokTOkHP/heevnynZQ6fie17T2alglTguF0VGG4toAEwQnAGLWfQ2pfPypyX6T2ZW7DDHRUYaC1hE8Ay3JugXmMCVNq0RmcJEwJpqGjCgBAzaYONeobnCRMCUZQqAIAULOpQ436BicJU4IR3PoL9JLxdyPv3BLZz263u03NP0h/dht2PDenKvZ90G17rIptTbnmPE5ubnaPgUjX19vWx3X9t0PngpOalglTguF0VIHoon/Aj6RtW/XZhhG3c8QxXaKE9ShhjNRtkTl4dfVq7EsIToKF6KgCALAKz5+/OFl2ppM6ODhJmBJMQ0cVAACajQlOEqYEIyhUAQCg2ZjgJGFKMIJbf2FBQ8MiMgYYQbHmCmXpezw6buuROQhLwFUAlwQnNS0TpgTD6aj20/bFeV+oZyiBJcM5zpYRcTtfOibH2XAR938EOefS2uZxtjnYEbDkuKAUVdYqOqo9+Ismtdhut5uH/z+063P4XOrjPBeb44/ajT0HdV3Tmo6fSwKRHpZBNLVew3VUAQBYG4FIEJxCFQCAtRGIBMG59RcAgFUZE5IELEOhCsu6SxkTJDO979TatmHo9cucHrpEEm2khNKcx9kcQs/tyoU/Zw88t8w6l3Kf59r0PP9FOocda52HLesWeV2gN4UqLMiFY7yCt2G4D28TC7N+Y+ZI30CWocEtl74PeRVyvmk99jLMpTDngQuEHXvbPOw4j4RdFxjCd1QBAAAIRaEKAABAKApVAAAAQvEdVViBKQIuFgjjKZZtsw599/PY+TDxfBKqAkCRdFRhHQQrMDfJtDE59llSyeeBkscOVdJRBQjokrROSbJATjm7985/UB8dVQAAAEJRqAIAABCKW38BAKiWwDsok44qrIOQCOintmOltvWB2jhGoYWOKqxAU8DF0L8wC6OYnr/yx7NUGIzgF1gnxzf0p6MKAABAKApVAAAAQnHrLwAUbrfb3aaU3m/4T3c5f9uS2DrmDUB2OqoAUL62YkMRQhfzY1mCk2AAHVUAAKpwLqxIkBmUQ0cVAACAUBSqAAAAhOLWXyCElYR6CLZpEHDf209UJ+BxBtBJRxXoY4l03D+5AAAHrklEQVQAiDV8gFrDOl4i2naJNh6YwhrmtbAiqIiOKnBCoAQAU3JdAYbSUQUAACAUhSoAAAChuPUXAI50/dYilMI8Bkqmowrr1RY6kSuMYg0hGGtYx0vYLlA3xzgwmI4qrFS0n9+INh6WM/W+10WCdkKNgFLoqAIAABCKQhUAAIBQ3PoLAe12u9tU14+z37m1F+pT0Lkq7Dlo6W2Y6db4sNsfiEtHFWIq4YPfELWtT1TRArJyWdv65lTKsR15nJHHNpU1rCMwMR1VgEroWLyxxHbo6kr1CasR+AQA3XRUAQAACEWhCgAAQChu/QUW0fNWR4EbQBgdQUfOVQAz01GFmNYaBiNwA8pSyrnq0nG2nZOmPFeVsg3HWMM6AhPTUYWAxvylXkgLsBRdxfHm2IZjw74AItBRBQAAIBSFKgAAAKEoVAEAAAhFoQpEsrbAjbb1Xdt2WCP7HgA6CFOCFRGiEYsgmvWy7wGgm44qAAAAoShUAQAACMWtv8BZu93uNk37A/dj3bl1EqhZx3m3qPNfwOtHqxG/Q17UPoFS6KgCfUT7kBFtPABTazvPlXb+K228l1jDOsLiFKo
AAACEolAFAAAgFIUqAAAAoQhTggXlDpUYERSxKrn30wyyBH3UEgYDACxPRxWWVVPxk9PdzK9f237KtT61hMEAl5v7fB1BrnVse981bHNWQEcVWMR2u93kHgMAy3L3xHxsW2qnowoAAEAoClUAAABCcesvACxsycCuTCFqArMAGEVHFZa11oCD0ta7tPGeU9v61KD2QKna1w+AmemowoKW6DB0dU8EGvWjEwQAkJeOKgAAAKEoVAEAAAjFrb8AhNMRNiSkh1kMDbjKFFI1+r1zjvuIYxnopKMKQERtBYOQnjKUGOBlbi3L9gY66agCAMLWAAhFRxUAAIBQFKoAAACE4tZfoEpDg1EKIHikQYX7GTiQ+Rh33oWMdFSBWtVWvNS2PlMpdbtECxuKNp4cbINl9d3eOY/xUs8vUAUdVQBY2NRdmq6fHBGS1E/TPsm5Xe1TYO10VAEAAAhFoQoAAEAoClUAAABC8R1VICyJrk8Ieml2l2LNEfupLm3zy35eTs5j3H6GjBSqQGSRCpBWgk3y8dMRzMn8ys8+gPVy6y8AAAChKFQBAAAIRaEKAABAKL6jCgAUR9gaQN10VIHISkhcLGGM1K9tHtY8PxWpABXTUQXCkvYI/ThWAKiNjioAAAChKFQBAAAIRaEKAABAKApVqM8aQ1WA9XFOA6iYMCWojFAVYA1qP9ftdrt97jEA5KSjCgAAQCgKVQAAAEJx6y8AdNjtdrcppfdzj+PIXe23vgKwbjqqANAtWpGaUswxAcBkFKoAAACEolAFAAAgFIUqAAAAoQhTAiYTNHRmSgJsABis8Oujax9Z6KgCUyr1ItxX7es3tbuBy6OKON6IY2JatRw/vFHy9aPksVMwHVUAZlHLX+BrWQ/KYt4Ba6ejCgAAQCgKVQAAAEJRqAIAABCKQhWYUu0hH7WvXySCZICalHzuKnnsFGyz3+9zjwEAAAAe6agCAAAQikIVAACAUBSqAAAAhKJQBQAAIBSFKgAAAKEoVAEAAAhFoQoAAEAoClUAAABCUagCAAAQikIVAACAUBSqAAAAhKJQBQAAIBSFKgAAAKEoVAEAAAhFoQoAAEAoClUAAABCUagCAAAQikIVAACAUBSqAAAAhKJQBQAAIBSFKgAAAKEoVAEAAAhFoQoAAEAoClUAAABCUagCAAAQikIVAACAUBSqAAAAhKJQBQAAIBSFKgAAAKEoVAEAAAhFoQoAAEAoClUAAABCUagCAAAQikIVAACAUBSqAAAAhKJQBQAAIBSFKgAAAKEoVAEAAAhFoQoAAEAoClUAAABCUagCAAAQikIVAACAUBSqAAAAhKJQBQAAIBSFKgAAAKEoVAEAAAhFoQoAAEAoClUAAABCUagCAAAQikIVAACAUBSqAAAAhKJQBQAAIBSFKgAAAKEoVAEAAAhFoQoAAEAoClUAAABCUagCAAAQikIVAACAUBSqAAAAhKJQBQAAIBSFKgAAAKEoVAEAAAhFoQoAAEAoClUAAABCUagCAAAQikIVAACAUBSqAAAAhKJQBQAAIBSFKgAAAKEoVAEAAAhFoQoAAEAoClUAAABCUagCAAAQikIVAACAUBSqAAAAhKJQBQAAIBSFKgAAAKEoVAEAAAhFoQoAAEAoClUAAABCUagCAAAQikIVAACAUBSqAAAAhKJQBQAAIBSFKgAAAKEoVAEAAAhFoQoAAEAoClUAAABCUagCAAAQikIVAACAUBSqAAAAhKJQBQAAIBSFKgAAAKEoVAEAAAhFoQoAAEAoClUAAABCUagCAAAQikIVAACAUBSqAAAAhKJQBQAAIBSFKgAAAKEoVAEAAAhFoQoAAEAoClUAAABCUagCAAAQikIVAACAUBSqAAAAhKJQBQAAIBSFKgAAAKEoVAEAAAhFoQoAAEAoClUAAABCUagCAAAQikIVAACAUBSqAAAAhKJQBQAAIBSFKgAAAKEoVAEAAAhFoQoAAEAoClUAAABCUagCAAAQikIVAACAUBS
qAAAAhKJQBQAAIBSFKgAAAKEoVAEAAAjl/wPl7yXh8I1XfgAAAABJRU5ErkJggg==\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + " Greedy best-first search search: 164.5 path cost, 448 states reached\n" + ] + } + ], + "source": [ + "plots(d3)" + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "metadata": { + "scrolled": false + }, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAA6IAAAJCCAYAAADay3qxAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvIxREBQAAIABJREFUeJzt3cGOJMl5H/CoEUmsZOxQOvmum2HIS/psQX4BWyAE1BwEiSfZPOkRRFKPoJNswQdK4GEKMAjZL0DBvnvXkq9+BFPahihChKZ86O5hT09ldWZl5BdfRP5+wGLJ2KrKyMjIrPo6sv51OJ/PBQAAAKK8at0BAAAA9kUhCgAAQCiFKAAAAKEUogAAAIRSiAIAABBKIQoAAEAohSgAAAChFKIAAACEUogCAAAQSiEKAABAKIUoAAAAoRSiAAAAhFKIAgAAEEohCgAAQCiFKAAAAKEUogAAAIRSiAIAABBKIQoAAEAohSgAAAChFKIAAACEUogCAAAQSiEKAABAKIUoAAAAoRSiAAAAhFKIAgAAEEohCgAAQCiFKAAAAKEUogAAAIRSiAIAABBKIQoAAEAohSgAAAChFKIAAACEUogCAAAQSiEKAABAKIUoAAAAoRSiAAAAhFKIAgAAEEohCgAAQCiFKAAAAKEUogAAAIRSiAIAABBKIQoAAEAohSgAAAChFKIAAACEUogCAAAQSiEKAABAKIUoAAAAoRSiAAAAhFKIAgAAEEohCgAAQCiFKAAAAKEUogAAAIRSiAIAABBKIQoAAEAohSgAAAChFKIAAACEUogCAAAQSiEKAABAKIUoAAAAoRSiAAAAhFKIAgAAEEohCgAAQCiFKAAAAKEUogAAAIRSiAIAABBKIQoAAEAohSgAAAChFKIAAACEUogCAAAQSiEKAABAKIUoAAAAoRSiAAAAhFKIAgAAEEohCgAAQCiFKAAAAKEUogAAAIRSiAIAABBKIQoAAEAohSgAAAChFKIAAACEUogCAAAQSiEKAABAKIUoAAAAoRSiAAAAhFKIAgAAEEohCgAAQCiFKAAAAKEUogAAAIRSiAIAABBKIQoAAEAohSgAAAChFKIAAACEUogCAAAQSiEKAABAKIUoAAAAoRSiAAAAhFKIAgAAEEohCgAAQCiFKAAAAKEUogAAAIRSiAIAABBKIQoAAEAohSgAAAChvtK6AwDAfp1Opy9LKZ9e+E93x+PxdXR/AIhhRRQAaOlSEXqtHYABKEQBAAAIpRAFAAAglEIUAACAUApRAAAAQilEAQAACKUQBQAAIJRCFAAAgFAKUQAAAEIpRAEAAAilEAUAACCUQhQAAIBQClEAAABCKUQBAAAI9ZXWHQC2cTqdviylfHrhP90dj8fX0f0BAIBHVkRhXJeK0GvtAAAQQiEKAABAKIUoAAAAoRSiAAAAhBJWBMBm5oZmbRGulT2wK3v/AGBLVkQB2NLc0KwtwrWyB3Zl7x8AbEYhCgAAQCiFKAAAAKEUogAAAIRSiAIAABBKIQoAAEAohSgAAAChFKIAAACEUogCAAAQ6iut
OwDA9k6n05ellE8v/Ke74/H4Oro/AMC+WREF2IdLRei1dgCAzShEAQAACKUQBQAAIJRCFAAAgFAKUQAAAEJJzQVoTKItALA3VkQB2pNoCwDsikIUAACAUApRAAAAQilEAQAACCWsCABmECq1D44zQAwrogAwj1CpfXCcAQIoRAEAAAilEAUAACCUQhQAAIBQClEAAABCKUQBAAAIpRAFAAAglEIUAACAUApRAAAAQn2ldQcAmOd0On1ZSvl0g9c93/jUu+Px+LpqZ2Cntjq/g7gWAItZEQXoR7YPqdn6Az3r+Xzque9AIwpRAAAAQilEAQAACKUQBQAAIJSwImBoVwJAhGt0qkaoy4qApjWvZ84tVPs4reU4A9RjRRQY3VTB0mO4xl3rDjzTqj89HrtS+u03y9x6nLOd30v03HegESuiAJ1Ys8pybSXneDwebn1doA6rqMDeWBEFAAAglEIUAACAUG7NhY0Jy9lG48Aax46bzJ1zDUN6XpzbNc49Ynj/ATKzIgrbaxWWMxUeMUqoRMsPwj6EtzXKHM5oztw2/6/LND9HCmsDBmNFFAblr92MqtXcXhP4lO1nSHq2dbiWYC+AGFZEAQAACKUQBQAAIJRbc4GqBJmwd72fA24jBiCCFVGgtm4/gEMlzoHtZAoCAmAFK6IAwGKCewBYw4ooAAAAoRSiAAAAhHJrLlTUKqTkynbv/J5oG44JwHWNg71ci6ExK6JQV6s31KnttujPHsJE5uxjpmPS0tRY9ThP5u5Lj/sGLbS8Hu7tWgzpWBEFqsr2F+ZrP0UhbGV72ebDGnP3ZaR99lMuAGzFiigAAAChFKIAAACEcmsuEKJxKAWVRR7PidtDBY0Qzq3KLCG0Dq6zIgpEUYSOpfXxbL19YL2WwV4R2xZaB1dYEQUAIJxVQdg3K6IAAACEUogCAAAQyq25wEf2EizUKnhE4AlsS0gMQH5WRIFLhi9Cd2gqmOPWwI6WISMZtk9uQmIAkrMiCrADtVeB1rzetRXh4/F4uPV1YWsZ5qc7KoBRWBEFAAAglEIUAACAUG7NhYYu3GIlSAMgyF6C2WpZEwIlQAp4zooo5JLlA5EgGGAPslxze7EmBEqAFPABK6LAR0b663SrYJwl2+05fORwKIdSymellC/O53Ke0/b2baveAgBZWBEFYI3PSin/9eHfS9sAgJ1SiAKwxhellN95+PfSNgBgp9yaCxBsya24jW7bnR0e8nDr7edL2k6nSr0EALplRRSA54SHwMeEuAFUZEUUgjwNqOk5nIZ9uCWESFgRI9kyzAwAK6IAXLYmhEhYEQBwlUIUgEvWhBAJKwIArnJrLnTmdDp9WXyHj43dEkI0t01YETVlvyZm7x9AK1ZEoT8+0CwzFTAieGSaseFRD+dP9mti9v4BNGFFFBja3J8hifQ8BOVaeFVEYMqWwUTCivqW8fwBYAxWRAGICCYSVgQAvKcQBSAimEhYEQDwnltzAUpsoMiS35Fd8Zuzd3Nvq9w2mOh+XOfejhv0G7svjs1OAmY+GIed7DMASVgRBbg32gfwLPuTpR9PzelTxn7X9nwf97DPLWQKdgJIw4oozDQ3lAV6I5gItlM78CnorgGAzVkRhfkEsDAqwUQAQCiFKMwngIVRCSYCAEIpRGGm87mcz+fy+dNbcC+1QW/mzu01bQAAT/mOKMC9uzJWWEuWgJSU4+p7dveMAxk0mocfpWdfSY6WtA0bUIgClNhAkePxeLj1sb25NK61xiaTW45TL/u2RIv5OuI4EuJS0ThVSErahg24NZfdOxzK4XAo33hI+lzdBi9pOW9qz/fa50qv55Rrxr1McwmA3BSiIB2UeC3nTURC7ppzpddzyjXjXqa5BEBiClGQDkq8lvMmIiF3zbnS6znlmnEv01wCIDHfEWX3HpI9P6/VBi9pOW9qz/db2k6nZf3rQe1x6FWr+bU1QTQA9VkRhbFlSU6FW/Uwh2/tYw/7tsRo+/OUInQ8I89X6IIV
UXblIdzis1LKF4+/cVi7raXeE1b3ouW8iTgHXmp7+3Z+/x5TdzP0u37bsn3L0+98bSyXaWy9d8E+WRFlb4RmkIGwomX9y9bvluOVqT+Z2ljO2AJNKUTZG6EZZCCsaFn/svW75Xhl6k+mNpYztkBTbs1lV1qGZsz90fU1P84e8MPud4+3Sm5lD6EgW4etXBnDu4fbQbsLK8oQstS6bc5jH4/909ufH8Y7xbHfqm2JiGtxD7IEQQH7ZUUUWCKiQBy6CA0yNYbGdnyOPXMI6gGasyLKsIRmkJWwomX9y9Lv1m1bjO0obdm8eXN8VTY7f07vlvTleDweRhpbYBxWRBmZ0AyyajlvsgfMCOmZblv62Ocy7cvo191s+zzS2AKDUIgyMqEZZCWsaFn/svW7h7CiSzLty+jX3Wz7PNLYAoM4nM/uvIBaRg+3KGX733vbwxh26sWgqgRBUz30sbYP9vna+TPqbzVmvGbcMtbXQsbmHuMFbgqeWzO/epibW+3fLa83sY3Rrl+XbB6KSB5WRIElIgIuhGjkNOfDT+sPSD30sbbR9mcEt17DIoOmWsybqXFxzZ9vD+f7HvaRB8KKGEIPoRlP//o596+urf/SH/VX6g/H8f4voZmOaauQkWzWBOVE6aGPtQkrmvZSaNC1c3LLwKE1QVPXHve8z9feQ2r38SVWuYDnrIgyCqEZfct0/MyRaT3sXw99rG3u/mU6B6LOqTXPz7QvtR+3RR8BFlGIMgqhGX3LdPzMkWk97F8PfaxNWNG0HsOd5vZlzeO26CPAIm7NZQgPtxB9nqGN5TIdvy3bTqfStR72r4c+1jZ3/zKcA1u1Pfftb3+r/OxnXy2llPe33h7uv2hwdz6X1yXheM3ty9I+L3lsT+fP3HAnIC8ropCbEIex9Hw85/S99f710MfaRtufKh6K0EueFy4CdJbJNF6R4U5Z7GFe7mEfeWBFlO5kCshYEppxi0t/1a0dEd86EKmUXMdv27bpIKaa22kXwLJN0FQPfYy6ttQKvGm9Ly2vu1nnQ42woiVjs+b8oQ0rvYzGiig9yhSQIcShjkzHr+W8iZiLmfa55dj02Lb0sc9l2peW191M/V5z7NY8bos+AiyiEKVHmQIyhDjUken4tZw3rUJPsrdl608P43BJpn1ped3N1O81x27N47boI8Aih/O5+V15MIwavw9a+1baNa8X9Tui1NXjMb0SPNKrJoEpPR77tX75l39+vvKd0DkeA4w2MTdUZ4v3j5Hnwxb7luGrKg0Id6IZK6IAZDBSEVrKePuT1soitJTtj1XtUJ1MgUH0z7WKZhSidOdwKIfDoXzjIUAhXRvXZTpWmdq2es3nMu3z6OdPyzkS0Z9MbVPevj29/6f12Mzd7kuPOx6Pr4/H4+HNm+OrN2+O33zz5vjqeDwejsfj6yVj4zwFWlOI0qNMoRJCHJbLdKwytW31ms9l2ufRzx9hRXFta/UYQrR2bJynQFMKUXqUKVRCiMNymY5VpratXvO5TPs8+vkjrCiuba0eQ4jWjo3zFGjK74jSnYffNvs8axvXZTpWL7U9how8/d2+0/0dfncPv6dXdd7U6/fzV95iG5NjU25pG9EW15YMx75V2+FQPgr9OdS7WfT9b+8+vOZjgFHY+bjVsYt8TYAlrIgCTKsdMjISY3Cd4Jj6Fs25Tz75+dX/X3Nb0DHXKpqxIkpqD+EIn5VSvnj4q2z6Nq7LdKxearu2YrfFvOmp3z2tZr55c3xVEs6vLeZI5JyNbpves1LO53L4cBxO754/5gc/+FEp5cP5UJ6shG41XnOPyVbHju30/hM40JoVUbLLFIYhxKGOTMeqdsjI2m302u/sMs2lqDlySaZ92WLOrRmHuY+LOB9r9wUgJYUo2WUKwxDiUEemY5Up+KXnfmeXaS5FzZFLMu1LVPhO7cdFnI+1+wKQ0uF8ducG1HI6nfZ4Qt0dj8fXrTuxhWvHM/MtWWv6/RhCVL1TDWU4ViOO61Pf/va3
ys9+9tVm2z+fP/yNy7nnwOGw6PbVxwCjReb2ZYvrTa/XsDmix6vmdoB7VkSBtYb9cL1Tox3PLEEco43rB1oWoWXdMV7y3KGPIUA0YUWkkSEMQ1jR9jIdq0zBL1n6nTGE6Hg8HjLMh72EO3Xgo/CpUm4+B14/byuVA4yEFQFcZkWUTDKFYQiG2E6mY5Up+CVbv7PJNB9cH9pacv7MfX7EeT/39SK2C9CcQpRMMoVhCIbYTqZjlSn4JVu/s8k0H1wf2lpy/sx9fqvQIGFFwG4JK4KKdhpWlE218KRegz5WhhWlm8OZx3qujON6q9bBRG/fnlY9/6X5VDvAqHZYUUfBV5sG2Y0UVnTlmL44hmueC61ZEQVG08MHtK1NBbDMCWbJEu7zKFt/bjXKfjQtQj/55OcRm8keYNTLNa6XfmYwNVZzxnDNc6EpYUU0kSE8JDqs6M2b40cBG/WCUE6T4RovbXfuc689LpsMoT8tw4rO5+NHASwxz93unMrUn1bjOuexa64Fc9vKlTCftauVEWbsc9UAo9phRYKvgFFYEaWVTOEhUWEkrbYdEaSRTaZxXbKNTPM4U1u2/vQwDpdEXeuy6/EaO9L4A5RSFKK0kyk8JCqMJHuoztzn9iDTuLYMKxqlLVt/ehiHS6Kuddn1eI0dafwBSiluzfUl70Yebl/6fMS2KVtu+3Tlbrhaz732uGwyjOvSti1ec5S2bP2p1XY4lI/efw4Xok8utU21v3lzvPzge+9vL527nalt9+zGY3XN83F9DDCqdo3t6asRT10I//HZCm4war1iRdSXvGGuXsJWeuknDPU+ExQk1Er2AKNeGBu4zZD1yu5XRGkjR3hIbFjRltteE6oz/7kxYSuZ2voNKxqrLVt/arWV/s0KM2vx8zvXfobjxuN3c4CRECKAy6yI0kqmAJCoAI9W284WbpLpWBmbPtqy9SfqmpFdr/vS6vqw5nEAw1GI0kqmAJCoAI9W284WbpLpWBmbPtqy9SfqmpFdr/vS6vqw5nEAw3FrLk1kCArZqm1K72FFNdq2eM2s47q0bet96bktW39qhRD1bm6AzrXbZF+wSQhHxeN8zfuxefPmWD755OflBz/40dW+9BQIt8aK+QAMxoooMBXCIfQHbrNBeESuz+5BwUTZQzhmXyN/9rOvbtkPgC5ZEWVzGUJBhBVde731IUSjhs4IK8rRlq0/tUOI3r7dyVJYEhWP/aIAo5f6cu168+bN8X0wVA/nT68/NwPEsiJKhEyhIC2DR1ptO9vYZOpPpnGN2pce27L1Z+21gLYyHfuI683a5zsHgE0oRImQKRSkZfDIKKE6ewyducTYxLVl68/aawFtZTr2Edebtc93DgCbcGsum8sQHhLZNuXDUIrTl6WUT5/eivUYVHFL2zVPb5G68np3D7foCp2pFFY0N7BmKvBk7mP31patPyOGEI1u46Cqm/uyVTja2ufXvnYCPLIiCnXNDf7JFsKRrT8jMKa8KCj0p1cC0/rl2AEvsiLK5jKEh0S1Pf7UwJoQnFaEztQNK5o/8ozqUgjRx6EzXy2Hw7H78+daOM3xeEy1bpwpqCoiHG3rfb7ctn0I3tJApDljs/S9eckxAD5mRZQImcJDosIZegxyyDY2mdou6fEY097o508PMo1D7etNr/Om5VjXfs1ezwsIpxAlQqbwkGzBKplkG5tMbZf0eIxpb/TzpweZxmH0sKIexrr2a/Z6XkC4w/m877sGTqfT5ABku52IcVybd628NN8fA5aCulPb3eNt01MuhQvBGpduze3tfeXKef/BOZXtvbRVfw6HdbdifvLJz8sPfvCjj9p7mzcVrqd3j7/TOmWLY7z0vbnWcVmzL9nOPbYx6nG2IgptZAtymNOfnou0OX3vef9IZqAQoqnzwvly2apr+89+9tVa/Wht7fwwv2AHhBWxuQxBGpFt8x5bN8jhWmjDx+Eo9YN7etAqXGheYM28Pu61LVt/1pyPvY3D3ACdnq4PG4/h
R9f2UsqiUJ3afQ7Y502up1uGyUWHFY3+/gq3siJKhEyhBlHBCZlCGyK20YNM+5dt3mRvy9afVudjD/vSg0xjGNHntc/Pvs9rn7um35nGC7qjECVCplCD0QN5LtljGMklmfYv27zJ3patP63Oxx72pQeZxjCiz2ufn32f1z53Tb8zjRd0R1jRoF/+JY+FoQ0vBjRcEjGPtwhY+va3vzXSd6IuunRrbkMvhjZF6Cj46qbxGul9Ze6+ZNvnleEvswKa5lobYMR7H7w/Cita/1z6MepxtiIK21vygTvzh/PqAUujF6EJA2uyzK8s/XhJL/2krtoBTdnC6XqV7Xx0XGElYUVUlSEopHXb8/aIMawR2hARsHRhbFaHeNSWbAWzugznSk/BHJnOx7XPF1Z0r8Hxm3XtLAmvh2vNDWubOw4tw4pqBf/1fk2EmqyIUlurQIRMbdfa56gddDDS2LBetvmQXabzce3zI/alB9mvp6OLmHNbvCc59lCZQpTaMgWFZAtWmStTOEq2sWG9bPMhu0zn49rnR+xLD7JfT0cXMee2eE9y7KEyhShVnc/lfD6Xz5/e9rK3tmvtW49h7dfLNjasl20+ZJfpfFz7/Ih96UH26+noIubcFu9Jjj3U5zuiUNHChNyp13j+pnRTkm42NcZmawnDhWrLEq5xV5LPhQdZxovlpuZY5mPay3kxy4rr6eQ4fPj+eCyffPLz8oMf/OjW7SzZduZ5A91SiEJdkx8iLoU2PI3cvhLxP8oHk5vHJs5XSynHKq80atR6DRl+Qoax9TjHRviD46M1P/d1aRym3h9rJ6/3OG+gZ27NparDoRwOh/KNhzS4XbatHa+5j6v93B7Gpse2ufuWrd+Z2rL1p9X52MO+ZJPp+O3x/Kk9hnP1MDY9nD+wNYUotWVK5MyULjlliwTAW5/bw9j02HZJtkTU7G3Z+tPqfOxhX7LJdPz2eP5cUvv11mxjbX8ixguGdTif9/0dabfP1fXwl73PysRvZY3edpi+vfbF31Mr139D7urvl719e5p87pa/fRY1NtmO85y2pcckS78ztmXrT+1jn3kc5u7Ltce1fC/NdPz2dP7UHsNy5f1xzftHi7FZOl4vnT8+x+7DqMdZITrogWVbhxuCd1Z8R/SSDwKMWs3jW8bhkjzfEa3HtWW/5h770+mUPsBrI3eZv4vn3F2v9hgufH9cZYMApFUUopQy7nF2ay7cZtGHx5npgUtS+bJ8eF3djx0k1cKULOdxtL3uN7cLS62tHYAETFOIUlWmL/1HtF1yPpdDuT+3vllKefX27Wnyr6tPX/N8Lq+fP/fadub2p9U4XHLr2PTcNnffsvU7U1u2/mxx7OlDpvnVw/lTawyXvj8CfXAiU1umL/23DBaY+9g126m9jS3GYW5/5j6ux7ZLph6Xqd+Z2rL1J+K8JadM86uH8+eSlu8/QCIKUWr7opTyOw//3kPblLmPXbOd2tvYYhzm9mfu43psu2TqcZn6naktW38izltyyjS/ejh/Lmn5/gMkIqxo0C//Us9hYSDPw+1D762ZY4cFAQ1TAQu15vHScbjkUihRQ5sGpqy9tlwJskkd9PLcTgJ5PjgmC8KKdvsGnOX9dQfzs8n1IuKz1ZL3xx5kC0ma0NX7z0hGrVesiMLLlnxIqR2oMPv1AgIWVn1YSxhKlP3D51T/svf7ud76e4tb9zEsgCWZTPs9+vwcef8yzaPVOglJGnk+0cBXWneAsTwEDXxWSvvfTqvV9sIuv/h7kG/frhqv1xf6c+33Rm/ZxupxSLbSOduW82bpcV/7/Kxt1/ZjJLcdu+NH53e23zzM+vugLJPpvK+43UXnz5xzqix8fwXWsSJKbZkCGloGj0SENswlGGLaHudNpvNnJFHnvOPHLTLNm2zXJfMdGlGIUlumgIZsoTNzn187oEEwxLQ9zptM589Ios55x49bZJo32a5L5js0Iqxo0C//cpu1wUSX1J5jGQMaer01t5WZYUVDXJv2
HMhzya3HruNQndThJnuYn+bcfBnfX5+7NdTo29/+1sT3UM+lXPgos3UAIsuM8pngOd8RZfcO3z8cSim/VUr5q1LOLYOJlmw3zYeDhCFE5JJqvja25prR6xhm7/fo89OcWyb9fLg11Gj6eZdrmInHDxUQRXsKUarKEI6ypO3w/cOhnMufllL+oJTyZ1N/GXzwUTBRo9CZ1QFGKz0bh6+Ww+F4UzBE7SCUPaxulJLn/JnXtiyQJ0tIT40wnzdvjrOuGUvGgbourZwtvY60Wo3Y+vzpOWhsxdisvl7dGAa46D386bVl7jVo6TaeP991iC34jii1ZQohuNr2sBL6p+X86vfKoRzK+dXvlX/3H0uZvtZuEX7QY6CIYIj2mp8/wW3Z+jPy+U0/os6fHvV6bZlrzTVoll8tPyl/VL7/279R/vd/+6Py/d8uh8Ov3dBPuMqKKLVlCiGYbvu33/uilPKnpZTfLa/e/XIppZRX7365/MYP7x/13/9TubAyukX4QY+BIoIh2st3TrUJFMnUdkmP5zf9iDp/etTrtWWuNdegF/3L8jflf5Z/U75e/u7P/6j88atX5d2fl1L+qRwOv1nO57+5ob9wkUKUqh5u3fg8c9vh+4f7IvQff+U/lK/99MMd+NpPy1Qx+vT1HkMcnt66dLrP67k7Ho+vP3xsmVRr/w6BN4at6fPz9mtjw7Rs59TWbdn6E31+wyVbnz89X597u7bc8B7+/jbbN2+OHwULPQkmWnw77mMR+rr8XTmU8ukvPbzEu1LKXXn9118/HH5DMUotbs1lV97fjlvK735UhD56LEY/vE33+Rf0p8IMWoUcRAUIZA8qyN4/uFWvc7vXftPvseux36v6/DxYaFmg0S++jvSr5Sfvi9DnBcKrUsqn5ctSSvkfbtOlFiuiVJUhPOTFYKLzq997fzvulK/9tJRv/pd/KP/6z/6iHMp3zt89n5eEOGwcVjQrwCjb+C8dm1tk+pmILYOTMh3TFvOmdX8anN/nx7mdKYzpkp5/RqA3258/t8+5yCC6NaFgt49N9bZZY10qBBNe+rm14/F4OBzK4Q/Ln/z218vf/flh4o/qD8XpL5VS/nkp5Sdr+wJWRKkt05f+n7f9VinlD14sQh/dP+4PHp43tY0pa4IERm671s58mY5py3mTqe2SkcaBfMybZX3JNjbZxvWzH5Vv/cm78kptQBiTjdoyfen/edtflVL+rLx79Q+z9uT+cX/28LypbUxpFWaSve1aO/NlOqYt502mtktGGgfyMW+W9SXb2GQb1y++VX70h6/Ku8ifg2PnDufzvn8W6NrtHm4xGs8H3xEt5Z9deejfl1J+WEr5ztt/8fbvyoLvfj6fN+bYtDVj8xgYVb1Tbdw9v614sLH5aP/WuLJ/Vbczsy9Dn9+tflOz5THu5XdEM6t9XmQ7z1rNz8Nh/W95Tt2a+7CBXyul/N93pfzqpZWqd6WUV6X8bSnl18v5/JO1fWG+bOdALVZE2ZXzd8/nUsp3Sik/LP/4K5cfdN/+w1Luvxtacn2g5xdGOi619yXb2ESsPxR/AAAW3UlEQVTtX7b95nYtj3GPYTfEajU/V83NTz75+fUH3BeXv3lXXn/0ZdSH1NxSSvlNRSi1KESp6nAoh8OhfOPhy/Up28r3zp+VH3/3O+VrP/3P5X7l86m/L1/76X8uP/7ud8r3zp89fe7acZj7uJHblozNHtWeN5lEzZuWc3vrvmTbv1Edj8fXx+Px8PSf1n2aa4/zpuU5Vbs/L7Wdz+X1+VwO5f7z+zfLy5/j3z/u7dvTBz/xMtnvcv4/Xy9f/sarUv72XMrdP5VXf38u5e5VKX/79fKln26hKoUotWX6Mv9024+/91l5XBl9/M7o/b9/WEr5zsN/XxMO0CrMJHvbtXbqz5tMouZNpvCQqPOn13AU6tvjvGl5TtXuT0RfXn7sfbH5698r3/v9b5b/9f++V773++X+dlxFKFUpRKkt05f5r7a9v0338O4v
yrmcy+HdX5Rf3I67NhygVZhJ9rZr7dSfN5lEzZtM4SFR50+v4SjUt8d50/Kcqt2fiL7Me+z5/JM/Lt/9y78u/+rf/3H57l+6HZctCCsa9Mu/zPcQYPRbpZS/eihCP7Dl70EybUYgz1DHpWbIVcaxqXk9zXTdbtmXhKFUl9wU3rJkXEcZh6gAnE7Ga42mgVbR16DDlQCjh9t4Sym5+sxyox6/r7TuALT2UHz+uHU/WOyujPNhqnY4SraxEf6yjUzHeEpEH0cZh6gAnB7Ga43R9++5qeu96y7pKUSp6uEL9p+VUr44n+//Std729u3W40Wa9T+i/dWf22cO+eet1+bdy+/5v3YZDh/arTVHZu460PLcWCfzJt7t55TGc/7GW2zrvd7OO70x3dEqa3Vl/RHCV1gPGtDM9a85iht19qfy3R9aDkO7JN5c2/tOZXpvPdZhmEpRKktUzBOj6ELjGekQJ6W5172sYnoy5Jts0/mzb09hhX5LEN33JpLVQ+3g3w+Utvp9NFuEuDCrbKbB1BsYe6ce95+bd5lOC8i2563ZxqbyL4s2TbjuhY29PT2y6fzY2+3Zd56Tl1qexzvuWOY6drpekF2VkSBXuwtgAJe0kMYSUQfexiHmlwLYxlv2IgVUarKFHDSKqyo5s9wjC7jz4ysMWogT+u2zGPTMqwoIpTq7dvTu+k9/FiLa9qlOyVqX3fnvt5o17Qpa35Cqvb4z3VrWNHSzwCZrp3CisjOiii1ZfpCvi/4Ey0qiCbTeRF17mUfm4i+RO2L6x8jWntOrdlOpjZIQyFKbZm+kO8L/kQbPZBHWFHbvkTti+sfI1p7Tq3ZTqY2SMOtuVSV4Qv5a9qWhhLAU6MG8rRue96eaWzm9uXSteXxuXPbljy2VhuM4vawovXbydAmrIiMrIjCh4QSAFtwbaGW1uFMc7Y/9ZjWfQcSsSJKVRm+kO/L/LQyaiBP67bMYzO3L64t1NLDz1j10EegPSui1JbpC/m+zE+00QN5Wp6P2cdmbl8AgKIQpb5MX8j3ZX6ijR7II6xofV8AgFLK4XzexU9eTfIbjzxV47fKlvyOKLt0N+e2tQ1+h+/Lcvl7irP608qVfsOjj+Zwxt8Rjf7tzZFUeB+9+Tq3dNszjnO6a9re51cPRr0+WBEFiNXqA8jUdlN9ILoge/9ozxzhJZnmSKa+QFPCiqgqU8CJsCKyahHIkyng55awopHd+tfs2is1NbYBAHNZEaW2TAEnworIqmUgT+3Xcz4CAIspRKktU8CJsCKyahnIU/v1nI8AwGJuzaWqh1vpPu+17XS6tndQx6V5+Lz92lysPbfrnT/3IRxPbwN+2O7d+Xx8fes29qTXUKnWltxC7HbjfYo67he249yFCVZEAWLdte7AhnoNRGrp+XwwhjCW5+fuyO8BsIgVUTaXKfQkW1hRz5HbL5kTyHP5GJzeTb3mmzfHV9eeu+7Yt9nuSGFFW21jiehzatRI/VvV2merloyq1eqoc4qMrIgSIVPoiXCUOEsCeTKF6rTc7ihhRRF9BgA6phAlQqbQE+EocZYE8mQK1Wm53VHCiiL6DAB0zK25bC5DCNHcthphRR2EZoQEJ8wJ5LnUFhGqk2m7Uf2JCSvaZhtLZLr9rHZf1r5eprEZydxxXTP+Gxw7ATpAc1ZEYX+EnvRhKtBC0AWwlveBOK7ZMMGKKJvLEEKUNaxoFLcG8rQK1cm03emxuV+t6Gn/MoQVMb45gUhrQqSsHI/p1iCtpfNhjyFlcCsrokTIFEIkHGUbawN5soUGtdruKPtXexsAwGAUokTIFEIkHGUbawN5soUGtdruKPtXexsAwGDcmsvmMoQQzW2rEVbUg8q3nt093Ea6OJDnUtsew4pOp9OXpZRPn97a+tCf2WObYf8yhBUBjOrxvWKD113zmUDwFTezIgqsJfRivakxNLYQbw/hMr3tY2/93UrG94SMfaITVkTZXIYQosiwouPxePjwNU/v1r9qbsKK6oQV
bbntvYQVbRkUUuNOgjVBOc+fGxGissfgnkurO3OPy1YhSXsPwFl6TIA+WBElQqYQoqhwlL0Frggr2ma7UdvJvg0AYDAKUSJkCiGKCkfZW+CKsKJtthu1nezbAAAG49ZcNpchhGhuW62wor0FIC0JnckQqpNpuy/paf+EFcHYtgrLieBWXsjHiiiwlhAJoBTXgj3osggNtPU5kPEcy9gnOmFFlCYyBBMtDVt58+b46hePux5ANPc1ewygWBs6kyFUJ9N2hRXVDStqZW5gzUh6vH5BTdHngJ9JYTRWRGklUzDRmrCVGo/tzdpxzRSqk327UdvJvg0AYDAKUVrJFEy0JmylxmN7s3ZcM4XqZN9u1HaybwMAGIxClCbO53I+n8vnT2+9y9Q2t881HtubteNa+xi0OvYR2+11/2pvAwAYj++IAty7K5eDOFIHMVxJsbzb4feJUh+r53pOIGUTPVyDpvoIsJhCFKB0HQIx9aFw+A+LA4TlDH+MmK+Ha1CmPi4NBbv1erGX8DFowa25NHE4lMPhUL7xkJCZrm1un2s8tjdrx7X2MRilbe0+z33umm20OnYAwHgUorSSKSFXau4yo6fmZppza58vNRcASEkhSiuZEnKl5i4zempupjm39vlScwGAlHxHlCYeEjE/z9Z2Os3r87XHLX3sUq0DTt6+fdqXRW13x+Pxda1jMFrblIg5G33+LG2Dp9Z+Z893/qjlwlzaY0gc3MyK6HQaXaaUOvJpOW96DTiZ22/n5DaMK8C2en1/Jr8h38N3vyLqL1d5PISTfFZK+eLxNwSj256u4l3v3/28ufR6S15zj14+LtNjO3LbreNVY85GnD8AwG1GrVesiJJJ9uCYtYE83Mt0nDO1Tekx3AkA4CqFKJlkD45ZG8jDvUzHOVPblB7DnQAArtr9rbnkkSE4Zm3YymOQ0NxbFPcWmnE6nc4rwo5WBSU93JqaIpioTVjRx3Oz9tjUDuYiVkQQWuuwtS3MvY5Xvt4LxQG6Z0UUPrT2y+BDfcAaiOMyPQbZx2bIgIakIuZC9vnWC+MYx7UGNmJFlNTig2PWB7rs0Zs3x1flhbFpvfqbKZioRVjR/FCjiLCv+duw6gO0dOka1Pr9DEZhRZTsMoXJCGqZ1sPYZJojLcOK1myjVXASADAYhSjZZQqTEdQyrYexyTRHWoYVrdlGq+AkAGAwbs0ltUxhMmtDZ0bWw9hkmiNtwopu30btvixtA/Zjq0Art9NCPlZEoa49hhrM3eeWY7PH4wIZORfrGHkcBTHBTlgRpTuZAmamglWy9CdTm7HJG3KVOayodh+32j/m6TV86tpq2vF4PET2BWAUVkTpUaaAmalglUz9ydSWrT+Z2lqK2JeR5g0AsJJClB5lCpiZClbJ1J9Mbdn6k6mtpR7CihwrABiIW3PpTqaAmalglUz9qRdEcx8g8fT2xIdwmruH31/d7djUaGuph7CiDMfq2v6xvSshNne93fI70r7woQu3cTumMMGKKDDXVICEYAkeTQWojByscolx2MZI16CR9qW20c4TxxQmWBGlOxlCS3oIVokMatn72AgremzbJpBqi9fcMqzI6gfcbs354ydaoC9WROlRptCSHoJVMgXRZNuXTG0tZRqHHuYNALCSQpQeZQot6SFYJVMQTbZ9ydTWUqZx6GHeAAAruTWX7mQILbnWlq0/mYJosuxLxraWMo3DrfOmRpiWsKK6rgTybLGt2rdkCpgB2JgVUQBGsMfwl+yhLj2Pfc9937OM50TGPkEKVkQZQoYgk2zBKsKK+mrLJtvYRMzPWmFFUS6t2AlrYc+WrGJfO1eOx+Nhq+cCv2BFlFFkCzLJ1J+IoBZjs74tm2xjEzE/ez1WANAdhSijyBZkkqk/wor6aMsm29hEzM9ejxUAdMetuQyht2CVHtuuBbW8fXt69/i/rwXEZNmXjG3ZZBubiDAtYUUAEMeKKLAFQR9xpoIwBGTkN/ox6nn/eu47QBesiDKszMEq
PbYtDWrZ09i0DCuq8RMTmcZhi7CirEYPG6r98ycCYgDGYkWUkfUQrNJj21x7HJteA3AyjcPaeQMAdEAhysh6CFbpsW2uPY5NrwE4mcZhi7AiACAZt+YyrMzBKj22LQ1qGXVsTqfTl6WUT5/eCnotoKmHsKIM43qtbc5jBQnBxx6vV402f1fr9uwr+1FtG0A8K6IAy0x9qBPQxJ7sLcyn11Cwltelmtt23YUBWRFlV7IEq/TYJqzo5XFoEVZUQ4Zx3WNYUU/2HgZk1Q2gPiui7E22YJUe2+YafWwuEVa0TdvSxwIAySlE2ZtswSo9ts01+thcIqxom7aljwUAknNrLruSJVilxzZhRS+Pw9p5c0nE70q+fXt694vtlVJWBi/VbJsbDiWsaCxLQ3Y6+P3V2aE6HezLbBf2ZVfhQo3DoqLs6phSlxVRAJ7L9MFJSMk+jXZ8R9ufW+1tHPawv3vYRzaiEGX3DodyOBzKNx6CT1a3bfGamdrmGn1s5u5z7XGNkn2slz4WqOd4PB6u/dO6f0AfFKIgrOiW/Ztj9LG5pNewokuyj/XSxwIAiShEQVjRLfs3x+hjc0mvYUWXZB/rpY8FABIRVsTuCSua1yas6OVxWBP6k02GsV7SR2FF+VwJahFusmO1w5i2CHcaKTAKMrMiCrCduSEOd5v2Yrls/WG+qWM355iuee4la4KmRpuDI+3PSPtS2/Ox2cNY7WEf2YgVUZjpIRDls1LKFw8rMRfbljy2p7anP59x63hl2ZfW4/C87XF1KMP+ZWybM7ZLj8uo1qw0ZlqlvNSXa6tUIwXkvLQvexmHHrw03pnOKfOGjKyIwnzCipYZfWzmMm/Wt00RVgQAnVKIwnzCipYZfWzmMm/Wt00RVgQAnXJrLswkrOj5Xl436tjUGIdRx2ZN22OwzdNbbF8aa2FFQEsrQo0EdkGxIgpADnODnVqqHeYD7FMP1zvYnBVR2ECmoJdMIT1Z9qX1OIw6NpHj2oIVDFjvlmAcP6cCY7IiCtvIFPSSKaSnZb8zjcPoYxMxrgBAxxSisI1MQS+ZQnpa9jvTOIw+NhHjCgB0zK25sIEM4S+124QV7Tes6FqQUK026ng8Vq378RK3WgJgRRSAl6QvbHjPseqD4Kt9c5yhWBGFpjKExGQIk8mwf8KK+g4Sgp4IvqrnlvCjUq6vyt/6msAyVkShrUwhMS3DZDLtn7Ci630EAFhNIQptZQqJaRkmk2n/hBVd7yMAwGpuzYWNXQkPuTufj69LwuCYGmEyS8JILmz7prGJaNtnWNHU3gEwpfPwsDu3kLM1K6Kwvak3oVvfnNK/qVWwh33sSQ/BGsJf7u1tfyGznt/Leu47nbAiCh0QHDNtpOCeDMFEl9u2/6v42p/z8Jf7e5nGoXYYjJ98ARiLFVHog+CYaSMF92QKJjLnAIDNKEShD4Jjpo0U3JMpmMicAwA249Zc6IDgmGkjBfdkCCa61gbk5vZloCdWRKE/ewgjybyPmfsGwDwRAWc9v1/03Hc6YUUUOtMqjGTpX9pvCSPpwaXxtwoB0JeI99KRw8OgBiuiAAAAhFKIAgAAEEohCgAAQCjfEQWgitPp9GUp5dPA7d363dy7DN/dWjpevotc7srl8RKqAtAhhSgAtYQVoStl6WeWfnQhwx8PsssSOuOPJsAcbs0FAAAglEIUAACAUApRAAAAQvmOKLxgywCWRt+jSRHUAi0591jiyvuAYwpwIyui8LLRAkVG2x/ykF563fNzz3j1Y+q62eJ6OjVvzCegK1ZEAahizcrQLSuULyWEZk/uvDRe1/qcJRF1ruzj3ysrsMAorIgCAAAQSiEKAABAKLfmAgxmy4AtAIAarIjCy0YLgBhtf/iYIvRetrmerT/Afgi5Ih0rovCCtcEQI4WPQE+EugDccz0kIyuiAAAAhFKIAgAAEMqtuVDR0pCYRr+zd+cWHaA3WUO4XMcBbmNFFOpK9yHpgog+Cj9oaw/jP+o+ChSZ1sP1
NcrexsJ5AQOyIgqsJnQplx5XSoR63evx2MHWnBcwJiuiAAAAhFKIAgAAEMqtuQCwsaxBOxcIwYEBuQaRkRVRqKuH4IRb+ygsYpqxWW/0MezhA2Apufs5ylyoIftYjH4+9yjzuf1UL/2kgsP53CJ1HPZDCAvQ6Cc+blLrujTStW+kfWGf9ngNIj8rogAAAIRSiAIAABBKWBEAwEY6ColZQ8AMsJgVUdie0Aagl/O9l372ZPQitJR97GPvejm3e+knFVgRhY35KzHgOgC05BpERlZEAQAACKUQBQAAIJRCFAAAgFAKUQCA7ewhfGUP+whUJqwIAGAjQmIALrMiCgAAQCiFKAAAAKEUogAAAIRSiAIAW5gKsOkx2GakfQFI4XA+n1v3AQAAgB2xIgoAAEAohSgAAAChFKIAAACEUogCAAAQSiEKAABAKIUoAAAAoRSiAAAAhFKIAgAAEEohCgAAQCiFKAAAAKEUogAAAIRSiAIAABBKIQoAAEAohSgAAAChFKIAAACEUogCAAAQSiEKAABAKIUoAAAAoRSiAAAAhFKIAgAAEEohCgAAQCiFKAAAAKEUogAAAIRSiAIAABBKIQoAAEAohSgAAAChFKIAAACEUogCAAAQSiEKAABAKIUoAAAAoRSiAAAAhFKIAgAAEEohCgAAQCiFKAAAAKEUogAAAIRSiAIAABBKIQoAAEAohSgAAAChFKIAAACEUogCAAAQSiEKAABAKIUoAAAAoRSiAAAAhFKIAgAAEEohCgAAQCiFKAAAAKEUogAAAIRSiAIAABBKIQoAAEAohSgAAAChFKIAAACEUogCAAAQSiEKAABAKIUoAAAAoRSiAAAAhFKIAgAAEEohCgAAQCiFKAAAAKEUogAAAIRSiAIAABBKIQoAAEAohSgAAAChFKIAAACEUogCAAAQSiEKAABAKIUoAAAAoRSiAAAAhFKIAgAAEEohCgAAQCiFKAAAAKEUogAAAIRSiAIAABBKIQoAAEAohSgAAAChFKIAAACEUogCAAAQSiEKAABAKIUoAAAAoRSiAAAAhFKIAgAAEEohCgAAQCiFKAAAAKEUogAAAIRSiAIAABBKIQoAAEAohSgAAAChFKIAAACEUogCAAAQSiEKAABAKIUoAAAAoRSiAAAAhFKIAgAAEEohCgAAQCiFKAAAAKEUogAAAIRSiAIAABBKIQoAAEAohSgAAAChFKIAAACEUogCAAAQSiEKAABAKIUoAAAAoRSiAAAAhFKIAgAAEEohCgAAQCiFKAAAAKEUogAAAIRSiAIAABBKIQoAAEAohSgAAAChFKIAAACEUogCAAAQSiEKAABAKIUoAAAAoRSiAAAAhFKIAgAAEOr/A27LKJSZmrwOAAAAAElFTkSuQmCC\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + " A* search search: 133.0 path cost, 2,196 states reached\n" + ] + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAA6IAAAJCCAYAAADay3qxAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvIxREBQAAIABJREFUeJzt3c2OLMl5HuCoJoc4lDGH4op77QyLHtFrCfIN2AJhoHpBULOiPVfBmeFV0CK8GBOE0QUIhOwbICHtPbTki+DGJE+Do4EGPOVFd5+prpNZlT+RX0ZEPg9ADJinfiIjs7Lq6y/qrd3xeEwAAAAQ5WbtAQAAALAtClEAAABCKUQBAAAIpRAFAAAglEIUAACAUApRAAAAQilEAQAACKUQBQAAIJRCFAAAgFAKUQAAAEIpRAEAAAilEAUAACCUQhQAAIBQClEAAABCKUQBAAAIpRAFAAAglEIUAACAUApRAAAAQilEAQAACKUQBQAAIJRCFAAAgFAKUQAAAEIpRAEAAAilEAUAACCUQhQAAIBQClEAAABCKUQBAAAIpRAFAAAglEIUAACAUApRAAAAQilEAQAACKUQBQAAIJRCFAAAgFAKUQAAAEIpRAEAAAilEAUAACCUQhQAAIBQClEAAABCKUQBAAAIpRAFAAAglEIUAACAUApRAAAAQilEAQAACKUQBQAAIJRCFAAAgFAKUQAAAEIpRAEAAAilEAUAACCUQhQAAIBQClEAAABCKUQBAAAIpRAFAAAglEIUAACAUApRAAAAQilEAQAACKUQBQAAIJRCFAAAgFAKUQAAAEIpRAEAAAilEAUAACCUQhQAAIBQClEAAABCKUQBAAAIpRAFAAAglEIUAACAUApRAAAAQilEAQAACKUQBQAAIJRCFAAAgFAKUQAAAEIpRAEAAAilEAUAACCUQhQAAIBQClEAAABCKUQBAAAIpRAFAAAglEIUAACAUApRAAAAQilEAQAACKUQBQAAIJRCFAAAgFAKUQAAAEIpRAEAAAilEAUAACCUQhQAAIBQClEAAABCKUQBAAAIpRAFAAAglEIUAACAUApRAAAAQilEAQAACKUQBQAAIJRCFAAAgFAKUQAAAEIpRAEAAAilEAUAACCUQhQAAIBQClEAAABCKUQBAAAIpRAFAAAglEIUAACAUApRAAAAQilEAQAACPXVtQcAAGzX4XB4lVJ6t+Of7vf7/cvo8QAQQ0cUAFhTVxF6aTsADVCIAgAAEEohCgAAQCiFKAAAAKEUogAAAIRSiAIAABBKIQoAAEAohSgAAAChFKIAAACEUogCAAAQSiEKAABAKIUoAAAAoRSiAAAAhFKIAgAAEOqraw8AWMbhcHiVUnq345/u9/v9y+jxAADAEx1RaFdXEXppOwAAhFCIAgAAEEohCgAAQCiFKAAAAKGEFQGwmKGhWUuEa5Ue2FX6+ABgSTqiACxpaGjWEuFapQd2lT4+AFiMQhQAAIBQClEAAABCKUQBAAAIpRAFAAAglEIUAACAUApRAAAAQilEAQAACKUQBQAAINRX1x4AAMs7HA6vUkrvdvzT/X6/fxk9HgBg23REAbahqwi9tB0AYDEKUQAAAEIpRAEAAAilEAUAACCUQhQAAIBQUnMBVibRFgDYGh1RgPVJtAUANkUhCgAAQCiFKAAAAKEUogAAAIQSVgQAAwiV2gbHGSCGjigADCNUahscZ4
AAClEAAABCKUQBAAAIpRAFAAAglEIUAACAUApRAAAAQilEAQAACKUQBQAAIJRCFAAAgFBfXXsAAAxzOBxepZTeXeBxjxPver/f719mHQxs1FKv7yCuBcBoOqIA9SjtQ2pp44Ga1fx6qnnswEoUogAAAIRSiAIAABBKIQoAAEAoYUVA0y4EgAjXqFSOUJcZAU1zHs85N1Lu4zSX4wyQj44o0Lq+gqXGcI37tQdwZq3x1HjsUqp33Iwz9TiX9voeo+axAyvREQWoxJwuy6VOzn6/3019XCAPXVRga3REAQAACKUQBQAAIJSlubAwYTnLWDmwxrFjkqHn3IohPVfP7RyvPWJ4/wFKpiMKy1srLKcvPKKVUIk1Pwj7EL6uVs7hEg05t53/l5V0frYU1gY0RkcUGuWv3bRqrXN7TuBTaT9DUrOlw7UEewHE0BEFAAAglEIUAACAUJbmAlkJMmHran8NWEYMQAQdUSC3aj+AQyZeA8spKQgIgBl0RAGA0QT3ADCHjigAAAChFKIAAACEsjQXMlorpOTC8977PdF1OCYAl60c7OVaDCvTEYW81npD7XveNcazhTCRIftY0jFZU99c1XieDN2XGvcN1rDm9XBr12Iojo4okFVpf2G+9FMUwlaWV9r5MMfQfWlpn/2UCwBL0REFAAAglEIUAACAUJbmAiFWDqUgs8jj2bM8VNAI4SxVZgyhdXCZjigQRRHalrWP59rPD8y3ZrBXxHMLrYMLdEQBAAinKwjbpiMKAABAKIUoAAAAoSzNBd6ylWChtYJHBJ7AsoTEAJRPRxTo0nwRukF9wRxTAzvWDBkp4fkpm5AYgMLpiAJsQO4u0JzHu9QR3u/3u6mPC0sr4fy0ogJohY4oAAAAoRSiAAAAhLI0F1bUscRKkAZAkK0Es+UyJwRKgBRwTkcUylLKByJBMMAWlHLNrcWcECgBUsAzOqLAW1r66/RawThjnlf4CACwNTqiAAAAhFKIAgAAEMrSXIBgY5birrRsV3gIALAoHVEAzgkPgbcJcQPISEcUgpwG1AinASjbkmFmAOiIAgAAEEwhCgAAQChLc6Eyh8PhVfIdPoCUUvnXxNLHB7AWHVGojw804/QFjAge6WdueFLD66f0a2Lp4wNYhY4o0LQSf4bkPATlUniVwBTWVOLrB4A26IgCAAAQSiEKAABAKEtzAVJsoMiY35Gd8Zuz9yUsqxw7r0G/sXt1bjYSMPNsHjayzwAUQkcU4EFrH8BL2Z9SxnFqyJhKHHdu5/u4hX1eQ0nBTgDF0BEFAFhI7pUJQasGABanIwoAAEAohSgAAAChFKIAAACE8h1RgAf3qa2wllICUoqcV9+ze2AeKMFK5+Fb6dkXkqMlbcMCFKIAKTZQZL/f76betjZd85prbkoy5TjVsm9jrHG+tjiPhOgqGvsKSUnbsABLcwEAAAilEAUAACCUQhQAAIBQviMKAHCBIBqA/HREoW2lJKfCVDWcw1PHWMO+jdHa/pxShLan5fMVqqAjCg2pPWEVzuVOMy5Jy/sGY3jvgm3SEQUAACCUQhQAAIBQluZCkKE/uj7nx9kDftj9funlhEJB5rswh4sfP9bl2F8XcS0G4DodUWCMiAJRETpf3xya2/Y59gwhqAdYnY4oAEBGS4bvjO3UCgICSqUjCgAAQCiFKAAAAKEszQUINmZpXe7AlBmPdzXsJkfQ1Mz9DRljYYQQNWKpoKme15TzpkINXr+6ODc3REcUGCMi4EKIRpmGfPhZ+wNSDWPMrbX9acHUa1hk0NQa503fvLjmD7eF1/sW9pFHOqIQ5DQw4lLXZ+jtIqwRctHyX0LXPp6wNbe3+5uU0nsppV8dj+mYUkq7Xdo9bbu7O7zuu2+NIT/nYy7pmtPytR2YRkcUAGjVeymlv33876VtAARTiAIArfpVSuk/Pf730jYAglmaCwA05f33v5s+//ydlFJ6s/R297Bo9f54TC9TSp+mlNLhsMboyGGpcCcgjo4olE
2IQ1tqPp5Dxr72/tUwxtxa258sHovQLueFiwCdcUqar8hwp1Js4bzcwj7ySEcUCtb1V92x4RPXAjdKCrNoXdRf6YeGYeVWQxeihjHypdNgoa6woadt0x/z4XyY85hDxzh33CXw+lmX+ac1OqIAQKmWCBvK/ZhDH09IEsAJhSgAUKolwoZyP+bQxxOSBHDC0lwAVncheKRWAlMyeFzC+umlbSfBRENdDDDKPcanc/vu7st/fwxJco5UJPfXWAr6WozzkNXoiAJQgpaK0JTa259ijSxCuyx9rHKH6pQUGET9XKtYjY4oABBqTsDPmNCfu7svf5/l9nYfNp7Tbaed0Bx0r4BW6IgCANHmBPwsEfqTezyCiQCuUIgCANHmBPwsEfqTezyCiQCusDQXoMeFAJ3Nhzs0GC5EoCEhROfbdrv01jm3y/eruIMCjKaM+3BIAHTQEQXolztkpCXm4DLBMfmNOudevPji4v/P+VxQMdcqVqMjCkDT9vt9vr4ZbywZ8NO17dJYjse0ex4QdHh9fptPPvl5Siml29v9zcljvnW73PuXO6yIcri2wDw6ogDAFBEBP2NCf4beds7thBUBZKIQBQCmiAj4GRP6M/S2c24nrAggE0tzIcjhcLj4e3djb5f7eWfcd/PBPS0RQrSMEuY197XldMnp17/+3fT55+88+/euIKE5264ZGhB0FoB0SZYAI2FFAN10RIG5FC1tae14lhLE0dq8PnNehAabc4zH3LfpYwgQTUcUgKIJBOHMm7Chp3CglIYHBJ2FC70835YyBxgJKwLopiMKANSkLwgodwjRnPsKKwK4QiEKANSkLwgodwjRnPsKKwK4wtJcYLbcISgzCU+CBb3//tvBRJFOfyP0NAho6BLYa+FCuQOMcocV5Qq+CrhuuxYPdOGYXp3DOfeFtemIAq0RKNIfwDIkmKWUcJ8npY1nqlb2Y9Ui9MWLLyKepvQAo1qucbWMswR9czVkDufcF1alIwpBlgxcufSX7WvPO/S+hXU9uWDOX8H9BX0ZT/M6JeymL5CnOxjn0Bu0c3u7fyvkZ8p40oUwn7u7On+rZMkAI2FFAN10RAEgztywm7WCdloP34mYVwBOKEQBIM7csJu1gnZaD9+JmFcATmx+aa4veQMQ5SwU5633n66gnL7wnK7tt7f7S09/HrRz9fGuBPc0Y8kAoxxhRbV+NaJj3D5bwQSt1is6or7kDUPVErZSyzihqfeZoCChtZQeYFQLcwPTNFmvbL4jCgxT81/cYGkTQ39q9yb86FJI0pJBbX1ydBBzBRgJKwLopiMKAPNtMfSnpX3pkjvACIATClEAmG+LoT8t7UuX3AFGAJywNBcAZpoSQlS7S8txT81YJrtqCEeuAKPb23168eKL9MknP19glPWpNXgJyE9HFOgL4RD6A9MsEB5R1mf3oGCi0kM4Bl8jP//8nSXHAVAlHVHYOCFE+UwJrHnaNvf+LW8rbTy5Q4ju7jL8vgch5gQYzbFG4NMcup7AEDqiAPnMDayZc/+Wt5U2ntZDiOjn2ANkohAFyGduYM2c+7e8rbTxtB5CRD/HHiATS3NhBYfD4a0wkwWfa8gSqVVDQVoxJbCmL/Bk6G23tq208bQYQkS/mQFGAJzQEYW8hgb/lBbCUdp4WmBOuSoo9KdWAtPq5dgBV+mIQka6inW5Fjpzdzf9vsuOnBp0hRDd3u5v0rPz5p202+2rD2269FMutQXtjDXndT/0elOD5+fI/q0gp8hzDqiDjiiwZXOCR4SWMEXroU1bNGceWppD5xwwikIU2LI5wSNCS5ii9dCmLZozDy3NoXMOGMXSXGCQyIClBXSGMV0LF7q93V96zDfLwgSUMNS1sJtL2+bef8q2p9f96bLRw8OK4/vH5ZefnmzbpKFhRT3Xk9en//7ixRfpk09+nnmEMXKFtT1uu3/8ndYL52aGQQOr0hGFdZQW5DBkPLUWoSkNG3vN+0dhGgoh6ntdeL10m3Vt//zzd3
KNY21zzw/nF2yAjigs4HrwQt4ghy0HhQy1VrjQsMCaYWPc6rbSxjPn9VjbPAwN7Ko9aGeOszl869qeTrqemZ6j+PNm7v4OeY4tn3PQCh1RWIbQhvKUNIe1BtGseV6XNJ4551Kt8zB0X7Yo4trS0nkzVEnXbGABClFYhtCG8pQ0h7UG0ax5Xpc0noiQq9LmYei+bFHEtaWl82aokq7ZwAJ2x+Px+q0adjgceifAkkZy6AptuOApoGGUiPP40nNM9f77323pO1GdupbmrqgztClaRcFXk+arpfeVoftS2j7PGc+F83PS+bDbpW1/0Mrn2fvjEufc2Pe5iPfXAedrUa89ltHqcdYRheWN+cBd8ofz7AFLrRehBQbWlHJ+lTKOa2oZJ3nlDmgqLZyuVqW9Hh1XmElYEcwwJBgix2OWENowthMwcG5mh3jkVlgHkw0oOXRGWNE01wKMegJ+irsezjU0rG3oPKx5ztXcdYJS6YjCPGOCIeY8Zo2hDUvMDbSo1tAZr+V+rVzHl+CcA1JKClGYa0wwxJzHrDG0YYm5gRbVGjrjtdyvlev4EpxzQEpJIQqzHI/peDymT0+Xzl3aPvUxh24ryRJzAy0a8/qec33IfW3xWu7XynV8Cc454InviEJGIxNy+x7j/A12UpJuaXLMzdIKDBfKrZRwjftU+LnwqJT5Yry+c6zkY1rL62KQGdfT3nl4/v64Ty9efJE++eTnU59nzHOXfN5AtRSikFfvh4iu0IbT8IMLEf+tfDCZPDdx3kkp7bM8UqtR6zmU8BMytK3Gc6yFPzg+mfNzX13z0Pf+mDt5vcbzBmpmaS4MtNul3W6X/uwxta9329zHjHruqYaOZYm5qXEbTNF3LpV0Hjvf54s6diWdN3PmYYnHcy2H9ShEYbglEg/nJAWulR4YlQa5VjpoRNooXCM1dxuijl1J502XNR/PtRxWohCF4ZZIPJyTFLhWemBUGuRa6aARaaNwjdTcbYg6diWdN13WfDzXcliJ74jCQI8pfZ+m1B28s5uwgOfsMS958wPfj7d7CjD6NKWUDm9/xXIxU+Zh7tzUuC3ymFCnw+HQG+B1d3d6u8vbl9x2yd3d4fX1W3V+X/Ded/EeRFyXop5nzvXv2uNdeg+5ve38Xv/5e+YzfY+32z2ELN3dfRmA5FoOy9ERhWlGBQgNTA8ck8pXSoDR7HFsIKkW+pTyOo621f1murDU2twBSEA/hSh0mBNecDymXXp4bX0npXRzd3fojZc/fczjMb08v+/YMeaWO8RhzNxEKCmsA6AWEde/Oe+PQB28kKHb3PCCiBCiiDCFNQOaIpQU1gFQi4jrn2ssNE4hCt3mhhdEhBBFhCmsGdAUoaSwDoBaRFz/XGOhccKKoMO1QJ4x978UdDAnoCGdhDHc3u7TixdfZF/mOncezh0Oh+PQIJQ5P4g+0P3xuH8T+PSklLCiC0E2VQW9XArkaUhVx4QvjT0/A65LuS1ybkaEtc18f5ylJwBpUX3v4XPOuQXOV9c6stIRhevGfIjOHagw+PECAhZmFRMFhhKVXhz1ja/0cZ+rbbxTTN3HsACWwpS0362fny3vX0nn0WyVhCS1fD6xAh1RNu8xWOG9lNKvHv/a+mzblbvfnN/3/P5DO4A943nZMZ5BP5nQ83hv7d/QbZee4+6unXz7XPM19rizPTV0Fi51VPb7vVAuQl17fxx7zT7fnka+vwLz6IjCMoFBcwIVcgc0CN8Zx3wBlGmJ67PrNqxEIQrLBAbNCVTIHdAgfGcc8wVQpiWuz67bsBJLc9m8OYE8XWEK59vHhjbMCWjoCVh4s9So675Dt7Woa9nh6ZLap2M3ZRvbsnSITYmhTwP3WbhJpZY65+a8VgZeiwcH0Z1vX/u9b0hI0tRgwvff/27n91CXCDqEoRSibN7u490upfSXKaVfpnRcM5hozPMW84G0wBAiylLU+bqyOdeMWuew9HG3fn4658Yp/nyYGmrUd7+Rj9dUQBTrU4
iyKedhBbuPd7t0TD9OKf0gpfSTlI4ppd4/ib4VTDQk/CBHlyx3gNFMZ/PwTtrt9pOCIe7uDr3jnhKEUuFPKzSv9W6YMJ+6dZ2fY68jjnPdcgcgTQwDHPUefnu7f/M+fOl99PR2l55jv9/vxgQ8QS6+I8rWvAkleOyE/jgdb76fdmmXjjffT//hv6TUf61dM/ygpGAcwRAAtCJ3AFJEuN3cQMXrt93tvvnD9PFffTv9n//5w/TxX6Xd7psTxgkX6YiyNQ+hBP/+o1+llH6cUvpeunn99ZRSSjevv56+/bOHW/2v/5o6OqNrhh+UFIwjGAKAVuQOQIoIt5sbqHj5trvdn6aU/v6j9NFXfph+dHOTXv/3lNIf0m73F+l4/KcJ44VOClE25XE57kMR+i9/9J/T1z57foOvfZb6itHnAUQPIQ5dIQn7/f7lnLCivnGniQFGuV0by6Vt59tzzA0ATDXnPS3Xtgnv4W+W2d7e7t8KHDoJJhq05He3+3Ip2L9J/5R+m76RXqbfpZuU0lceH+J1Suk+vfzHb+x231aMkouluWzKm+W4KX3vrSL0yVMx+nyZ7vkX9PvCDNYKOYgKECg9qKD08cFUtZ7btY6beo9djeOeNebzwKGpgUZ/nH6T/iH9+Zsi9NRNSund9CqllP7eMl1y0RFlM94EEx1vvv9mOW6fr32W0nf+2z+nf/eTn6Zd+uD44fE4JoAod1jRtedYK2Bh7Lal56akYBzBSeS01rktuGe75pxzuUO8WgwFm/IenjIEE97dPV+KdPqTMd9Kv0436XVvl+px+1dSSt9KKf1m7lhAR5Qt+cuU0g+uFqFPHm73g8f7pTTnS//LKClgQVgRAAwXEWoERVOIsiW/TCn9JL2++edBt3643U8e75fS1C/9L6ekgAVhRQAwXESoERTN0lw24/jh8bj7ePdB2r1OKaXvpZT+1YWb/z7dvP5ZSumDu39997vD4fDumGWkEYE8JQQsjN12vn3O3DwFRk1/hPV0LDO7z7n0ssC5idq/rM/DehxjSpbj/MwdanS6xHaqX6dvpdfpJr1O3Z2qx+1/SCn9evaTQdIRZWOOHx6PKaUPUko/S//yR903etj+s5QevhuayvpAz5daOi6596W0uYnav9L2m+nWPMY1ht0Qa63zc9a5+eLFFxe3/TZ9M/15+of0Kn3jrS+jPqbmppTSX6Tj0fdDyUIhyqbsdmmXPjq+l37x4Qfpa5/9TUrp92c3+X362md/k37x4Qfpo+N7jyEBk55nt0t/NvX+pejajznbLm0HKMF+v3+53+93p/9be0xs1+l75vGYXh6PaZcePr9/J13/HP/mdnd3h2c/8fLkk09+nk4f8/+mP7354/S7b9+k9NtjSvd/SDe/P6Z0f5PSb7+RXvnpFrJSiLI1D1/6/8VH76WnzujTd0Yf/vuzlNIHj/8+JxyglXABYUUAsJ45AUbTQhYfis0/+Sh99NffSf/7/32UPvrrlNKfKELJTSHK1rz50v+bZbq71z9Nx3RMu9c/TV8ux50bDtBKuICwIgBYz5wAo+khi8fjb36UPvy7f0z/9j/+KH34d5bjsgRhRWzKeRDAmwCjlP5HSumXj0XorECdw+FwHBpsVPpvTZ7ux9M8zNl2vh2YrsBQqsWDuHqec9V5GHgdvzoPUQFNlczXnMdrKtBqToDRmM8yY0IHIReFKJv3WHz+Yu1xMNp9KuxD+Ay5w1FKmxvhL8so6Rj3iRhjK/MQFYBTw3zN0fr+neu73rvuUjyFKJv3GJrzXkrpV49//Xu2TQevTLn/4n3pr/S1hZW01A0A4Lmzzy0vO7b5LEMVfEcU5gUBAABEmhsaCEVQiMK8IAAAgEhzQwOhCJbmsnnXvqA/NqyIPFoPoACYYu2woa0pcb6HBgv5LEPpdESBWhT1QQAKUEMYScQYa5iHnFwLY5lvWIiOKGR2HmzTUghObqX/fA2ULGKFwNjX6BrXtK55yH3dHfp4W7mmXZvDyP
kH6qUjCgAAQCiFKAAAAKEszYUTJYYSAPVzbQGA53RE4TkfFIEluLaQy9rhTEOev+82a48dKIiOKABAJWr4GasaxgisT0cUAACAUApRAAAAQlmaCyvy22jjrDVfmZ/3fo1laxfCclYZz1BjQ35Kek2VNJYoEfvc8RxFn8PEK/kcEVwGX9IRBYi11geQvuct/QNR6eNjfc4RrinpHClpLLAqHVEAWNF+v99Nud/Y7uOU59liVxeAGDqiAAAAhFKIAgAAEMrSXACyqDUQqSTmcJoxS4gtN96mqONeclASlEZHFCDW/doDWFCtgUhrOj8fzCG05fy12/J7AIyiIwormhpS0rJLf7Vecr7Wel7yij5Wzpvncu2zriWtWqs76jVFiXREAQAACKUQBQAAIJSluZBZBaEZghNoVknLz3KPZe7jlTQ3LRk6r3Pmf4Fj530AWJ2OKGyP0JM69AVaCLoA5vI+EMc1G3roiAIUSLcCug0JRJoTIqVz3KapQVpjz4cthpTBVDqiAAAAhFKIAgAAEMrSXNigzEvPhF7MdDgcXqXu72yZWwBSShffK+Y+7pzPBN6nmExHFJhL6MV8fXNobiHeFsJlatvH2sa7lBLfE0ocE5XQEYXMzoMKBF9AnCWDQnK8lucE5cy9tkyZmy1ev7q6O0OPy1IhSVsPwBl7TIA66IgCAAAQSiEKAABAKEtzAQC4aqmwnAiW8kJ5dESBuYRIACm5FmxBlUVooKVfAyW+xkocE5XQEYWBhgZSTH1MoF5LXB9K5/rF1kW/BvxMCq3REQUAACCUQhQAAIBQClEAAABC+Y4owIP71B3EUXQQw4UUy/sNfp+o6GN1ruYEUhZRwzWob4wAoylEAVLVIRB9Hwqb/7DYQFhO88eI4Wq4BpU0xrGhYFOvF1sJH4M1WJoLAABAKIUoAAAAoRSiAAAAhPIdUahMxQEnWwzPATKb+5093/kjl45zyfscjKAj2p9GV1JKHeVZ87ypsQhNafi4vSaXYV4BllXr+zPla/I9fPMdUX+5YgrnzXLM7TLMKwDUqdX3cB1RAAAAQilEAQAACLX5pbmQ09ggoa2FZqy4v5sPkLhwbm5+bngQEYRWcdhar6HXtczXP69boHo6ovDc3C+DN/UBqyGOS/8clD43TQY0FCriXCj9fKuFeYzjWgML0RGFE/7CPM1+v99du83Wur/k4TUJrKnrGuT9DPLQEQUAACCUQhQAAIBQluYCAFCEpQKtLKeF8uiIQl5bDDUYus9rzs0WjwuUyGsxj5bnURATbISOKGQkWKWfuQFqvQ5c6qYNCWsD4G06ogAAAIRSiAIAABDK0lxgkAsBEve1LrcD6tHSNailfeG5jmXcjin00BEFhuoLkBAswZO+AJWWg1Wnib2ZAAAIdElEQVS6mIdltHQNamlfcmvtdeKYQg8dUQCy8Ff/B+YBppvz+vETLVAXHVEAAABCKUQBAAAIZWkuANUT/lKeC8dkiefKvSTTeQOwMB1RAFqwxfCX0kNdap77mse+ZSW+JkocExRBRxQAKtTVsRPWwpaN6WJfeq3s9/vdUvcFvqQjCgAAQCiFKAAAAKEszQVm61imJOgDAIBeOqLAEgR9xOkLwhCQUb7Wj1HN+1fz2AGqoCMKUDGd53q1HjaU+9wUEAPQFh1RAAAAQilEAQAACGVpLsAIh8PhVer+DqyAJqAoF65XEbJdE113oU06ogDj9H2oE9DElmwtzKfWULA1r0s5n9t1FxqkIwoAXLT1MCBdN4D8dEQBAAAIpRAFAAAglKW5AAWL+F3JjucoJgBESMk2jQ3ZqeD3VwefrxXsy2AlX1sirBwWFWVTx5S8dEQBOFfSBychJdvU2vFtbX+m2to8bGF/t7CPLERHFACAwa6FV7XU1QWWoyMKAABAKIUoAAAAoSzNBchk68EccErQFF1yL9tdYhmwpcUQQ0cUYDlDQxzuFx3FeKWNh+H6jt2QYzrnvl3mBE21dg62tD8t7Utu53
Ozhbnawj6yEB1RgJXpDpHLnHOppPOwayyXulTXwnNqMicIqKV5qMG1+S7pNeW8oUQ6ogAAAIRSiAIAABDK0lwAVnch2AagSDNCjQR2QdIRBaAMNRShucN8gG2q4XoHi9MRBYABdDBgvinBOH5OBdqkIwoAAEAohSgAAAChLM0F4CJBQvWo5VhZagmAjigA1xRf2PCGY1UHwVfb5jhD0hEFAAgl+CqfKeFHKV3uyk99TGAcHVEAAABCKUQBAAAIZWkuLOxCeMj9lOVZGwkjmTQ3AFCKyt+vvQ+zOB1RWF7fm9DUN6fi39Qy2MI+1qSGYA3hLw+2tr9Qsprfy2oeO5XQEQXgooi/is/9OQ9/uX9Q0jzkDoPxky8AbdERBQAAIJRCFAAAgFCW5gIANMDyZaAmOqJQny2EkZS8jyWPDYBhIgLOan6/qHnsVEJHFCqzVhjJ2L+0TwkjqUHX/OtCANQl4r205fAwyEFHFAAAgFAKUQAAAEIpRAEAAAjlO6IAZHE4HF6llN4NfL6p3829L+G7W2Pny3eR033qni+hKgAVUogCkEtYETpTKeMsZRxVKOGPB6UrJXTGH02AISzNBQAAIJRCFAAAgFAKUQAAAEL5jihcsWQAy0rfoykiqAXW5LXHGBfeBxxTgIl0ROG61gJFWtsfyiG99LLz1575qkffdXON62nfeeN8AqqiIwpAFnM6Q1M6lNcSQktP7uyar0tjLiURdajS579WOrBAK3REAQAACKUQBQAAIJSluQCNWTJgCwAgBx1RuK61AIjW9oe3KUIflHaulzYeYDuEXFEcHVG4Ym4wREvhI1AToS4AD1wPKZGOKAAAAKEUogAAAISyNBcyGhsSs9Lv7N1bogPUptQQLtdxgGl0RCGv4j4kdYgYo/CDdW1h/lvdR4Ei/Wq4vkbZ2lx4XUCDdESB2YQulaXGTolQrwc1HjtYmtcFtElHFAAAgFAKUQAAAEJZmgsACys1aKeDEBxokGsQJdIRhbxqCE6YOkZhEf3MzXytz2ENHwBTKnucrZwLOZQ+F62/nmtU8mv7VC3jJIPd8bhG6jhshxAWYKWf+Jgk13WppWtfS/vCNm3xGkT5dEQBAAAIpRAFAAAglLAiAICFVBQSM4eAGWA0HVFYntAGoJbXey3jrEnrRWhK29jH2tXy2q5lnGSgIwoL81diwHUAWJNrECXSEQUAACCUQhQAAIBQClEAAABCKUQBAJazhfCVLewjkJmwIgCAhQiJAeimIwoAAEAohSgAAAChFKIAAACEUogCAEvoC7CpMdimpX0BKMLueDyuPQYAAAA2REcUAACAUApRAAAAQilEAQAACKUQBQAAIJRCFAAAgFAKUQAAAEIpRAEAAAilEAUAACCUQhQAAIBQClEAAABCKUQBAAAIpRAFAAAglEIUAACAUApRAAAAQilEAQAACKUQBQAAIJRCFAAAgFAKUQAAAEIpRAEAAAilEAUAACCUQhQAAIBQClEAAABCKUQBAAAIpRAFAAAglEIUAACAUApRAAAAQilEAQAACKUQBQAAIJRCFAAAgFAKUQAAAEIpRAEAAAilEAUAACCUQhQAAIBQClEAAABCKUQBAAAIpRAFAAAglEIUAACAUApRAAAAQilEAQAACKUQBQAAIJRCFAAAgFAKUQAAAEIpRAEAAAilEAUAACCUQhQAAIBQClEAAABCKUQBAAAIpRAFAAAglEIUAACAUApRAAAAQilEAQAACKUQBQAAIJRCFAAAgFAKUQAAAEIpRAEAAAilEAUAACCUQhQAAIBQClEAAABCKUQBAAAIpRAFAAAglEIUAACAUApRAAAAQilEAQAACKUQBQAAIJRCFAAAgFAKUQAAAEIpRAEAAAilEAUAACCUQhQAAIBQClEAAABCKUQBAAAIpRAFAAAglEIUAACAUApRAAAAQilEAQAACKUQBQAAIJRCFAAAgFAKUQAAAEIpRAEAAAilEAUAACCUQhQAAIBQClEAAABCKU
QBAAAIpRAFAAAglEIUAACAUApRAAAAQilEAQAACKUQBQAAIJRCFAAAgFAKUQAAAEIpRAEAAAilEAUAACCUQhQAAIBQClEAAABCKUQBAAAIpRAFAAAglEIUAACAUApRAAAAQilEAQAACKUQBQAAIJRCFAAAgFAKUQAAAEIpRAEAAAilEAUAACCUQhQAAIBQClEAAABCKUQBAAAIpRAFAAAglEIUAACAUApRAAAAQilEAQAACKUQBQAAIJRCFAAAgFAKUQAAAEIpRAEAAAilEAUAACDU/weM6W3qaeGRAwAAAABJRU5ErkJggg==\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + " (b) Weighted (1.4) A* search search: 133.0 path cost, 440 states reached\n" + ] + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAA6IAAAJCCAYAAADay3qxAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvIxREBQAAIABJREFUeJzt3c+OJNl1H+CTTQ7RpDBNcUXAK0Nbiaa5l0C+gC0QBrIWBDULQ/Q8BaeHT0FD8KItEEYlYAxkvwAJaq+hKe+9MsCNKXXB9IADdnpRVT3ZWfkn/tw4cW/E922GzK7KvHHjZmSeOjd/udnv9wEAAABZns09AAAAANZFIQoAAEAqhSgAAACpFKIAAACkUogCAACQSiEKAABAKoUoAAAAqRSiAAAApFKIAgAAkEohCgAAQCqFKAAAAKkUogAAAKRSiAIAAJBKIQoAAEAqhSgAAACpFKIAAACkUogCAACQSiEKAABAKoUoAAAAqRSiAAAApFKIAgAAkEohCgAAQCqFKAAAAKkUogAAAKRSiAIAAJBKIQoAAEAqhSgAAACpFKIAAACkUogCAACQSiEKAABAKoUoAAAAqRSiAAAApFKIAgAAkEohCgAAQCqFKAAAAKkUogAAAKRSiAIAAJBKIQoAAEAqhSgAAACpFKIAAACkUogCAACQSiEKAABAKoUoAAAAqRSiAAAApFKIAgAAkEohCgAAQCqFKAAAAKkUogAAAKRSiAIAAJBKIQoAAEAqhSgAAACpFKIAAACkUogCAACQSiEKAABAKoUoAAAAqRSiAAAApFKIAgAAkEohCgAAQCqFKAAAAKkUogAAAKRSiAIAAJBKIQoAAEAqhSgAAACpFKIAAACkUogCAACQSiEKAABAKoUoAAAAqRSiAAAApFKIAgAAkEohCgAAQCqFKAAAAKkUogAAAKRSiAIAAJBKIQoAAEAqhSgAAACpFKIAAACkUogCAACQSiEKAABAKoUoAAAAqRSiAAAApFKIAgAAkEohCgAAQCqFKAAAAKkUogAAAKRSiAIAAJBKIQoAAEAqhSgAAACpFKIAAACkUogCAACQSiEKAABAKoUoAAAAqRSiAAAApFKIAgAAkEohCgAAQCqFKAAAAKkUogAAAKRSiAIAAJBKIQoAAEAqhSgAAACpFKIAAACkUogCAACQSiEKAABAKoUoAAAAqb489wAAgPXa7XavI+L9E/90t91uX2SPB4AcOqIAwJxOFaGXbgdgARSiAAAApFKIAgAAkEohCgAAQCqFKAAAAKkUogAAAKRSiAIAAJBKIQoAAEAqhSgAAACpFKIAAACkUogCAACQSiEKAABAKoUoAAAAqRSiAAAApPry3AMAprHb7V5HxPsn/uluu92+yB4PAAA80hGF5TpVhF66HQAAUihEAQAASKUQBQAAIJVCFAAAgFTCigCYTNfQrCnCtWoP7Kp9fAAwJR1RAKbUNTRrinCt2gO7ah8fAExGIQoAAEAqhSgAAACpFKIAAACkUogCAACQSiEKAABAKoUoAAAAqRSiAAAApFKIAgAAkOrLcw8AgOntdrvXEfH+iX+62263L7LHAwCsm44owDqcKkIv3Q4AMBmFKAAAAKkUogAAAKRSiAIAAJBKIQoAAEAqqbkAM5NoCwCsjY4owPwk2gIAq6IQBQAAIJVCFAAAgFQKUQAAAFIJKwKADoRKr
YPzDJBDRxQAuhEqtQ7OM0AChSgAAACpFKIAAACkUogCAACQSiEKAABAKoUoAAAAqRSiAAAApFKIAgAAkEohCgAAQKovzz0AALrZ7XavI+L9Ce53P/BX77bb7Yuig4GVmur5ncS1AOhNRxSgHbW9Sa1tPNCylp9PLY8dmIlCFAAAgFQKUQAAAFIpRAEAAEglrAhYtAsBIMI1GlUi1GVEQNOY+7Pmeip9nsZyngHK0REFlu5cwdJiuMbd3AM4Mtd4Wjx3Ee2Om36Gnufant99tDx2YCY6ogCNGNNludTJ2W63m6H3C5ShiwqsjY4oAAAAqRSiAAAApLI1FyYmLGcaMwfWOHcM0nXNzRjSc3Vtl3jukcPrD1AzHVGY3lxhOefCI5YSKjHnG2Fvwue1lDVcoy5r2/q/rKb1uaSwNmBhdERhofy1m6Waa22PCXyq7WtIWjZ1uJZgL4AcOqIAAACkUogCAACQytZcoChBJqxd688B24gByKAjCpTW7BtwKMRzYDo1BQEBMIKOKADQm+AeAMbQEQUAACCVQhQAAIBUtuZCQXOFlFx43DvfJzoP5wTgspmDvVyLYWY6olDWXC+o5x53jvGsIUykyzHWdE7mdG6uWlwnXY+lxWODOcx5PVzbtRiqoyMKFFXbX5gvfRWFsJXp1bYexuh6LEs6Zl/lAsBUdEQBAABIpRAFAAAgla25QIqZQykoLPN8ntkeKmiEdLYq04fQOrhMRxTIoghdlrnP59yPD4w3Z7BXxmMLrYMLdEQBAEinKwjrpiMKAABAKoUoAAAAqWzNBZ5YS7DQXMEjAk9gWkJiAOqnIwqcsvgidIXOBXMMDeyYM2SkhsenbkJiACqnIwqwAqW7QGPu71JHeLvdbobeL0ythvVpRwWwFDqiAAAApFKIAgAAkMrWXJjRiS1WgjQAkqwlmK2UMSFQAqSAYzqiUJda3hAJggHWoJZrbivGhEAJkALeoSMKPLGkv07PFYzT53GFjwAAa6MjCgAAQCqFKAAAAKlszQVI1mcr7kzbdoWHAACT0hEF4JjwEHhKiBtAQTqikOQwoEY4DUDdpgwzA0BHFAAAgGQKUQAAAFLZmguN2e12r8Nn+AAiov5rYu3jA5iLjii0xxuafs4FjAgeOc/c8KiF50/t18TaxwcwCx1RYNFq/BqS4xCUS+FVAlOYU43PHwCWQUcUAACAVApRAAAAUtmaCxC5gSJ9vkd2xHfO3tWwrbLvvCZ9x+7VuVlJwMw787CSYwagEjqiAPeW9ga8luOpZRyHuoypxnGXdnyMazjmOdQU7ARQDR1RAICJlN6ZkLRrAGByOqIAAACkUogCAACQSiEKAABAKp8RBbh3F8sKa6klIKXKefU5u3vmgRrMtA6fpGdfSI6WtA0TUIgCRG6gyHa73Qz92dacmtdSc1OTIeeplWPrY471usR5JMWpovFcISlpGyZgay4AAACpFKIAAACkUogCAACQymdEAQAuEEQDUJ6OKCxbLcmpMFQLa3joGFs4tj6WdjyHFKHLs+T1Ck3QEYUFaT1hFY6VTjOuyZKPDfrw2gXrpCMKAABAKoUoAAAAqWzNhSRdv3R9zJezJ3yx+93U2wmFgox3YQ4nP3/My7m/LuNaDMB1OqJAHxkFoiJ0vHNzaG6Xz7mnC0E9wOx0RAEACpoyfKdvp1YQEFArHVEAAABSKUQBAABIZWsuQLI+W+tKB6aMuL+rYTclgqZGHm/KGCsjhGghpgqaOvOcsm4atMDr1ynW5oroiAJ9ZARcCNGoU5c3P3O/QWphjKUt7XiWYOg1LDNoao51c25eXPO7W8PzfQ3HyAMdUUhyGBhxqevT9ecyzBFyseS/hM59PmFtbm62zyLi2xHxq/0+9hERm01sHm+7vd29Ofe7LYb8HI+5pmvOkq/twDA6ogDAUn07Iv7rw38v3QZAMoUoALBUv4qIf/fw30u3AZDM1lwAYFE++OD78dln70VEvN16u7nftHq338eLiPg0I
mK3m2N0lDBVuBOQR0cU6ibEYVlaPp9dxj738bUwxtKWdjxFPBShpxwXLgJ0+qlpvjLDnWqxhnW5hmPkgY4oVOzUX3X7hk9cC9yoKcxi6bL+St81DKu0FroQLYyRyw7Dhh4DiIb//v16GHufa+H5My/zz9LoiAIALRkbNiTACKACClEAoCVjw4YEGAFUwNZcAGZ3IXikVQJTJvKwdfbTx/9/EEzU1cUAo9KE6ixD6Y+xVPSxGOuQ2eiIAlCDJRWhEcs7nmr1LEJPmfpclQ7VqSkwiPa5VjEbHVEAoEqnQoT6BAvd3n7x/Sw3N9vJHieT7hWwFDqiAECtsoKFBBgBJFOIAgC1ygoWEmAEkMzWXIAzhIyct8BwISp0GEy02cSTNbcp9624qQFGAOiIAlxSOmRkSczBZYJjyuu15p4///zi/y/5WNAw1ypmoyMKwKJtt9tyfTPe6hrwU+q2S2PZ72Nz+LO3t7s3xz/z6tUnERFxc7N9dnCfT36u7/GxXq4tMI6OKAAwRNeAn9K39RlP6Z8TYARQiEIUABiia8BP6dv6jKf0zwkwAijE1lxIstvtOm3j6vpzpR93xO+uPrhnSYQQTaOGeS19bbm9/eJ/f/Wr34/PPnvvnX8/FSQ05rZrDoONdrtuP3flcQQYAUxIRxQYS9GyLEs7n7UEcSxtXt9xXIQmG3OO+/zuos8hQDYdUQCqJhCEI2/Dhg4Dg94NKzr/y0eBQy+ObwsBRgApdEQBgJacCwzKCCESYARQiEIUAGjJucCgjBAiAUYAhdiaC4xWOgRlJOFJMKEPPngaTJTp8DtCD0OJLm3HPXQYVnTqttoDjEoFXyVct12LO7pwTq/O4ZjfhbnpiAJLI1DkfABLl2CWWsJ9HtU2nqGWchyzFqHPn3+e8TC1Bxi1co1rZZw1ODdXXeZwzO/CrHREIcmUgSuX/rJ97XG7/m5lXU8uGPNXcH9Bn8bjvHYNuzkXgHPtZw+7hcdubrZPQn6GjCcuhPnc3l743pSKTRlg1LVTC7A2OqIAkKdr2E2fQJ7SQTtrDOQRYASQTCEKAHm6ht30CeQpHbSzxkAeAUYAyVa/NdeHvAHIchSK8+T151RQzrnwnFO339xsLz38cdDO1fu7EtyzGFMGGO0K7FZu9aMRJ8btvRUMsNR6RUfUh7yhq1bCVloZJyzqdSYpSGgutQcYtcLcwDCLrFdW3xEFumn5L24wtYGhP617G350KSRpyqC2c0p0EEsFGAkrAjhNRxQAxltj6M+SjuWU0gFGABxQiALAeGsM/VnSsZxSOsAIgAO25gLASENCiFp3aTvuoRHbZGcN4SgVYHRzs43nzz+PV68+mWCU7Wk1eAkoT0cUOBfCIfQHhpkgPKKu9+5JwUS1h3B0vkZ+9tl7U44DoEk6orByQojKGRJY83jb2N9f8m21jad0CNHtbYHv9yDFmACjMeYIfBpD1xPoQkcUoJyxgTVjfn/Jt9U2nqWHEHGecw9QiEIUoJyxgTVjfn/Jt9U2nqWHEHGecw9QiK25MIPdbvckzGTCx+qyRWrWUJClGBJYcy7wpOvPru222sazxBAizhsZYATAAR1RKKtr8E9tIRy1jWcJzClXJYX+tEpgWrucO+AqHVEoSFexLddCZ25vh//utCOnBadCiG5uts/inXXzXmw22+ZDmy59lUtrQTt9jXned73etODdNbJ9EuR0bW33JRAJ2qcjCqzZmOARoSUMsfTQpjUaMw9LmkPrBuhFIQqs2ZjgEaElDLH00KY1GjMPS5pD6wboxdZcoJPMgKUJnAxjuhYudHOzvXSfb7ciCiihq2thN5duG/v7Q257fN4fbhvd3e84vnvYfvnpwW2r1DWs6Mz15M3hvz9//nm8evVJ4RHm6BjWdvfw/atP1jawPjqiMI/aghy6jKfVIjSi29hbPj4qs6AQonPPC8+X00Zd2z/77L1S45ibdQNcpSMKE7ge0NAvyEFQyHhzhQt1C6zpNsa13lbbe
MY8H1ubh66BXa0H7YxxNIdPru1x0PUs9BjVr5sSxwIsn44oTENQSH1qmsNWg2jmXNc1jWfMWmp1HroeyxplXFuWvm6sJVghhShMQ1BIfWqaw1aDaOZc1zWNJyPkqrZ56Hosa5RxbVn6urGWYIU2+/26d0Bc+h4qWxop4VRoQ0+P4Q5nZazjKb6z7YMPvr+kz0SddGpr7oxOhjZlayj4atB8Lel1peux1HbMY8ZzYX0OWg+bja2mV1x9jTtlijXX93Uu4/W1w3qt6rnHNJZ6nnVEYXpj33DX8oa9eMDS0ovQCgNrallLtYzjmlbGSVmlg3ZqC6erTavPM+cVRhJWBCN0CYaY6nGyg0L6dgI6zs3oEI/SKutgsgI1h84IKxrmWoDRmYCf6q6HYx1eTy99HVbptTmFlrtOUCsdURinTzBE6cepPdwha26gdUsKneFe6bleOvMFK6QQhXH6BEOUfpzawx2y5gZat6TQGe6VnuulM1+wQgpRGGG/j/1+H58ef+/ZudtLPk7pxygta26gdX2e311/tvRtXcfNvdJzvXTmC9bJZ0ShoAIJuefu9/hFd1DK4JymmpuSKgwXKq2WcI27qHwtPKhlvujv3Bqr+Zy28rzo5Ph6+vz552cD6oYlC2/j+fPP49WrT4YM75wW1w00SyEKZZ19E3EqBOda+MGFF+cW36wUnZtpvBcR5wM1+lhq1HoJNXyFDMvW4hpr7Y+Ll5y6/j0WjIfXv7FfbVM6eb3FdQMtszUXRthsYrPZxL9+SPIr8vtd73PsYw/VdcxZc1P7bXBNn7VU0zq23sfLOndrWzdTzA1QnkIUxhmb4NdiYmVWuuFc6aAZaaNwSGruemWdu7WtmynmBihMIQrjjE3wazGxMivdcK500Iy0UTgkNXe9ss7d2tbNFHMDFLbZ79cdPOZzXJR06fMuhT8j+sS50IYp1/GmUABRPZ8RLce1Zb26nvvdbld9gNdE7mr+LJ7n7nhd57DPa1yWCQKQRrm25qzXdVjqedYRhbp1TuorHdrQ0eg30StIqoVz1liERqz3uHmqujTamV5LYZUUotBRVqDB4X3u9/Fiv49N3D9XvxPJz9nSx3x8LLe3u1n/8iysCCDXtde4/T42XW6b7QCAYjyRobusQIOaghOyQojmIqwIIJdrLBARClHoIyvQoKbghKwQorkIKwLI5RoLRETEl+ceALRiv499RHwacTqkZ1NoM+bh45x57DTXjrmv3W63v73t/rNjHquDu/1++yIuzPWY23ZP85d6uRBkU3XQy7GVBPI0dU74Qt/1mXBdKq26tVnqGjvla+HNzXa6Oz/jXEjSmDU3wXqtbj3RNh1RGKbXG+uFBPKMKiYqnIPai6Nz46t93MdaG+8QQ4+xuqCWJDUd99LX55KPr6Z1NFojIUlLXk/MQEcUTngIm/l2RPzq4S+w79x26Xf3+9gc/uzt7e5NyfGM+d1Tx9L1tkuPcerrV1pVar66dn5ZrxY6C0v9ygDadHTdfXHitl7X7OPbI2L06zXQnY4onDY2JKF0oMKY+xMM0Y/5AqjTFNdn122YiUIUThsbklA6UGHM/QmG6Md8AdRpiuuz6zbMxNZcOGFsSE/J0JoT4znrTMDC261Gp363621LdGrb4eGW2sdzN+Q21mXqEJsaQ586HrNwk0ZNtebGPFc6Xos7B9Ed3z73a1+XkKRzoUbXfPDB909+DnXo/UEJClFWb/PxZhMR342IX+w/2p96gezzQpwRnnAXFb0hrTCEiLpUtV5nNub60Ooc1j7upa9Pa66f6tfD0FCjc7/X8/4WFRDF/BSirMpxWMHm480m9vHTiPjriPibzcebD/cf7fc9woGeHd7fqccp0SW7FtAQuQELR8f8Xmw220HBEJeCnIYEoTT41QqLt/RumDCftp1an32vI85z20oHIA0MA+z1Gn5zs337OnzpdfTw5y49xna73fQJeIJSfEaUtXkbSvDQCf1p7J/9MDaxif2zH0bET
x9u7xpekBV+UFMwjmAIAJaidABSRrjd2Pco1392s/nGj+Pjv/xW/I//9uP4+C9js/nGgHHCRTqirM19KMH3Xv4qIn4aET+IZ2++GhHx8N8fRETE915+GD9/2SW8ICv8oKZgHMEQACxF6QCkjHC7rr87LGRxs/mziPjly3j5pR/HT549izf/OSL+EJvNX8R+/48DxgsnKURZlYftuPdF6O+/9qP4yu+Of+SP4vdf+1G8/79/FLGPiMs7rg63xDyGJGy32xdThhWdui0zYOHaWC7ddnx7ibkBgKHGvKaVum3Aa/jb9x43N9sngUMHwUSdtvxuNl9su/3T+Mf4p/h6vIh/jmcR8aWHu3gTEXfx4tdf32y+pRilFFtzWZW323EjfnCiCL33ld9FfOtnEf/mP0Rc+EjEmZCeuUIOsgIEag8qqH18MFSra7vVcdPuuWtx3KPGfBw4NDTQ6I/jt/H38edvi9BDzyLi/XgdEfFL23QpRUeU1XgbTLR/9sO323HPeSxGIyL++3+Mh85op3CAiPJhRdceY66Ahb63TT03NQXjCE6ipLnWtuCe9Rqz5kqHeC0xFGzIa3gUCCa8vX13K9LhV8Z8M34Tz+LN2S7Vw+1fiohvRsRvx44FdERZk+9GxF9fLUIffeV3Ed/5TxH/8hePtwz/0P80agpYEFYEAN1lhBpB1RSirMkvIuJv4s2z/9fpp3//tYh/+PcR/+u7j7cM+9D/dGoKWBBWBADdZYQaQdVszWU19h/t95uPNx/G5k3EfTruH5394d9/LeLXPzjclhvXtuO+81gJgTw1BCz0ve349jFzs9vtXkflXzx+zoltZnclt15WODdZx1f0cZiPc0zNSqzP0qFGh1tsh/pNfDPexLN4E6c7VQ+3/yEifjP6wSB0RFmZ/Uf7fUR8GBE/i99/7fQPnShCzwQTMa+aCq2xSh9LbXOTdXy1HTfDzXmOWwy7Iddc63PU2jz1Xubwtn+Kb8Sfx9/H6/j6kw+jPqTmRkT8Rez3Ph9KETqirMr9h/73347vvfwwvvdxxNPO6P+Nr/zuZ3H3Lz6M2HQKJjr/ONOGFWVoLawIYKxTHS3hY8zlWqhRXA4wuhqy+OrVJ7HdbjeP9/k/489+9cfxz38aEb/cR3zpTTx79izevHkW8Yevx2vfI0pROqKszf2H/n/+8tvx2Bl9/Mzo/X9/FhEfPvz7mHCApYQLCCsCgPmMCTAaFrJ4X2z+yct4+VffiX/4Py/j5V9FxJ8oQilNIcravP3Q/9ttups3fxv72Mfmzd9GxIcPt48NB1hKuICwIgCYz5gAo+Ehi/v9b38SH/3dr+Nf/dufxEd/ZzsuU7A1l1U5DgJ4G2AU8V8i4hcPReioQJ3dbrfvuuW09u1eh8fxOA9jbju+HRiuwlCqyYO4zjzmrPPQ8Tp+dR6yApoama8x97eoQKsxAUZ93sv0CR2EUhSirN5D8fnzucdBb3dR2ZvwEUqHo9Q2N8JfplHTOT4nY4xLmYesAJwW5muMpR/fsXPXe9ddqqcQBZpU+i/el/5Kv91uL/zNuT5L6gYA8K5rAUanQgPtRqJGPiMKAADtGBsaCFVQiAIAQDvGhgZCFWzNBaq09AAKgCHmDhtamxrnu2uw0JjgRcigIwq0oqo3AlCBFsJIMsbYwjyU5FqYy3zDRHREobDjYJslheCUVvvX10DNMnYI9H2OznFNOzUPpa+7Xe9vLde0a3OYOf9Au3REAQAASKUQBQAAIJWtuXCgxlACoH2uLQDwLh1ReJc3isAUXFsoZe5wpi6Pf+5n5h47UBEdUQCARrTwNVYtjBGYn44oAAAAqRSiAAAApLI1F2bku9H6mWu+Cj/u3Rzb1i6E5cwynq76hvzU9JyqaSxZMo75xGNUvYbJV/MaEVwGX9ARBcg11xuQc49b+xui2sfH/KwRrqlpjdQ0FpiVjigAzGi73W6G/F7f7uOQx1ljVxeAHDqiAAAApFKIAgAAkMrWXACKaDUQq
SbmcJg+W4htN16nrPNec1AS1EZHFCDX3dwDmFCrgUhzOl4P5hCW5fi5u+TXAOhFRxRmNDSkZMku/dV6yvma63EpK/tcWTfvKnXMupYs1VzdUc8paqQjCgAAQCqFKAAAAKlszYXCGgjNEJzAYtW0/az0WMbeX01zsyRd53XM/E9w7rwOALPTEYX1EXrShnOBFoIugLG8DuRxzYYzdEQBKqRbAad1CUQaEyKlc7xMQ4O0+q6HNYaUwVA6ogAAAKRSiAIAAJDK1lxYocJbz4RejLTb7V7H6c9smVsAIuLia8XY+x3znsDrFIPpiAJjCb0Y79wcmlvIt4ZwmdaOsbXxTqXG14Qax0QjdEShsOOgAsEXkGfKoJASz+UxQTljry1D5maN169T3Z2u52WqkKS1B+D0PSdAG3REAQAASKUQBQAAIJWtuQAAXDVVWE4GW3mhPjqiwFhCJIAI14I1aLIITTT1c6DG51iNY6IROqLQUddAiqH3CbRriutD7Vy/WLvs54CvSWFpdEQBAABIpRAFAAAglUIUAACAVD4jCnDvLk4HcVQdxHAhxfJuhZ8nqvpcHWs5gZRJtHANOjdGgN4UogDRdAjEuTeFi3+zuICwnMWfI7pr4RpU0xj7hoINvV6sJXwM5mBrLgAAAKkUogAAAKRSiAIAAJDKZ0ShMQ0HnKwxPAcobOxn9nzmj1JOrCWvc9CDjuj5NLqaUuqoz5zrpsUiNKL7uD0np2FeAabV6usz9Vvka/jqO6L+csUQ1s10zO00zCsAtGmpr+E6ogAAAKRSiAIAAJBq9VtzoaS+QUJrC82Y8XhXHyBxYW2ufm64lxGE1nDY2lldr2uFr3+et0DzdEThXWM/DL6oN1gL4rycn4Pa52aRAQ2VylgLta+3VpjHPK41MBEdUTjgL8zDbLfbzbWfWVv3lzI8J4E5nboGeT2DMnREAQAASKUQBQAAIJWtuQAAVGGqQCvbaaE+OqJQ1hpDDboe85xzs8bzAjXyXCxjyfMoiAlWQkcUChKscp65AVq9DlzqpnUJawPgKR1RAAAAUilEAQAASGVrLtDJhQCJu1a32wHtWNI1aEnHwrtObON2TuEMHVGgq3MBEoIleHQuQGXJwSqnmIdpLOkatKRjKW1pzxPnFM7QEQWgCH/1v2ceYLgxzx9f0QJt0REFAAAglUIUAACAVLbmAtA84S/1uXBOpnis0lsyrRuAiemIArAEawx/qT3UpeW5b3nsa1bjc6LGMUEVdEQBoEGnOnbCWlizPl3sS8+V7Xa7mep3gS/oiAIAAJBKIQoAAEAqW3OB0U5sUxL0AQDAWTqiwBQEfeQ5F4QhIKN+Sz9HLR9fy2MHaIKOKEDDdJ7btfSwodJrU0AMwLLoiAIAAJBKIQoAAEAqW3MBetjtdq9CzRnhAAAHpElEQVTj9GdgBTQBVblwvcpQ7JrougvLpCMK0M+5N3UCmliTtYX5tBoKNud1qeRju+7CAumIAgAXrT0MSNcNoDwdUQAAAFIpRAEAAEhlay5AxTK+V/LEY1QTACKkZJ36huw08P2rnddrA8fSWc3Xlgwzh0VlWdU5pSwdUQCO1fTGSUjJOi3t/C7teIZa2zys4XjXcIxMREcUAIDOroVXLamrC0xHRxQAAIBUClEAAABS2ZoLUMjagzngkKApTim9bXeKbcC2FkMOHVGA6XQNcbibdBT91TYeujt37rqc0zG/e8qYoKmlrcElHc+SjqW047lZw1yt4RiZiI4owMx0hyhlzFqqaR2eGsulLtW18JyWjAkCWtI8tODafNf0nLJuqJGOKAAAAKkUogAAAKSyNReA2V0ItgGo0ohQI4FdEDqiANShhSK0dJgPsE4tXO9gcjqiANCBDgaMNyQYx9epwDLpiAIAAJBKIQoAAEAqW3MBuEiQUDtaOVe2WgKgIwrANdUXNrzlXLVB8NW6Oc8QOqIAAKkEX5UzJPwo4nJXfuh9Av3oiAIAAJBKIQoAAEAqW3NhYhfCQ
+6GbM9aSRjJoLkBgFo0/nrtdZjJ6YjC9M69CA19car+Ra2ANRxjS1oI1hD+cm9txws1a/m1rOWx0wgdUQAuyvir+Niv8/CX+3s1zUPpMBhf+QKwLDqiAAAApFKIAgAAkMrWXACABbB9GWiJjii0Zw1hJDUfY81jA6CbjICzll8vWh47jdARhcbMFUbS9y/tQ8JIWnBq/nUhANqS8Vq65PAwKEFHFAAAgFQKUQAAAFIpRAEAAEjlM6IAFLHb7V5HxPuJjzf0s7l3NXx2q+98+Sxy3MXp+RKqAtAghSgApaQVoSPVMs5axtGEGv54ULtaQmf80QTowtZcAAAAUilEAQAASKUQBQAAIJXPiMIVUwawzPQ5miqCWmBOnnv0ceF1wDkFGEhHFK5bWqDI0o6Hekgvvez4uWe+2nHuujnH9fTcurGegKboiAJQxJjO0JAO5bWE0NqTO0/N16Ux15KI2lXt898qHVhgKXREAQAASKUQBQAAIJWtuQALM2XAFgBACTqicN3SAiCWdjw8pQi9V9tar208wHoIuaI6OqJwxdhgiCWFj0BLhLoA3HM9pEY6ogAAAKRSiAIAAJDK1lwoqG9IzEzfs3dniw7QmlpDuFzHAYbREYWyqnuTdELGGIUfzGsN87/UYxQocl4L19csa5sLzwtYIB1RYDShS3VpsVMi1Otei+cOpuZ5AcukIwoAAEAqhSgAAACpbM0FgInVGrRzghAcWCDXIGqkIwpltRCcMHSMwiLOMzfjLX0OW3gDGFH3OJeyFkqofS6W/nxuUc3P7UOtjJMCNvv9HKnjsB5CWICZvuJjkFLXpSVd+5Z0LKzTGq9B1E9HFAAAgFQKUQAAAFIJKwIAmEhDITFjCJgBetMRhekJbQBaeb63Ms6WLL0IjVjHMbauled2K+OkAB1RmJi/EgOuA8CcXIOokY4oAAAAqRSiAAAApFKIAgAAkEohCgAwnTWEr6zhGIHChBUBAExESAzAaTqiAAAApFKIAgAAkEohCgAAQCqFKAAwhXMBNi0G2yzpWACqsNnv93OPAQAAgBXREQUAACCVQhQAAIBUClEAAABSKUQBAABIpRAFAAAglUIUAACAVApRAAAAUilEAQAASKUQBQAAIJVCFAAAgFQKUQAAAFIpRAEAAEilEAUAACCVQhQAAIBUClEAAABSKUQBAABIpRAFAAAglUIUAACAVApRAAAAUilEAQAASKUQBQAAIJVCFAAAgFQKUQAAAFIpRAEAAEilEAUAACCVQhQAAIBUClEAAABSKUQBAABIpRAFAAAglUIUAACAVApRAAAAUilEAQAASKUQBQAAIJVCFAAAgFQKUQAAAFIpRAEAAEilEAUAACCVQhQAAIBUClEAAABSKUQBAABIpRAFAAAglUIUAACAVApRAAAAUilEAQAASKUQBQAAIJVCFAAAgFQKUQAAAFIpRAEAAEilEAUAACCVQhQAAIBUClEAAABSKUQBAABIpRAFAAAglUIUAACAVApRAAAAUilEAQAASKUQBQAAIJVCFAAAgFQKUQAAAFIpRAEAAEilEAUAACCVQhQAAIBUClEAAABSKUQBAABIpRAFAAAglUIUAACAVApRAAAAUilEAQAASKUQBQAAIJVCFAAAgFQKUQAAAFIpRAEAAEilEAUAACCVQhQAAIBUClEAAABSKUQBAABIpRAFAAAglUIUAACAVApRAAAAUilEAQAASKUQBQAAIJVCFAAAgFQKUQAAAFIpRAEAAEilEAUAACCVQhQAAIBUClEAAABSKUQBAABIpRAFAAAglUIUAACAVApRAAAAUilEAQAASKUQBQAAIJVCFAAAgFQKUQAAAFIpRAEAAEilEAUAACCVQhQAAIBUClEAAABSKUQBAABIpRAFAAAglUIUAACAVApRAAAAUilEAQAASKUQBQAAIJVCFAAAgFQKUQAAAFIpRAEAAEilEAUAACCVQhQAAIBUClEAAABSKUQBAABIpRAFAAAglUIUA
ACAVApRAAAAUilEAQAASPX/AeIfP+8eZiEsAAAAAElFTkSuQmCC\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + " (b) Weighted (2) A* search search: 134.2 path cost, 418 states reached\n" + ] + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAA6IAAAJCCAYAAADay3qxAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvIxREBQAAIABJREFUeJzt3cGOJMl5H/CoJpcYUtileCLgk6GrRNO6SxBfQBYIA9UHgtqDIXqfgpzlU9AQfFgLPHQBBiH7BUhQd+2a8t0nA7yY0jRML7jglA/dPVNTnZmVWRn5ZUTk7wcQK+V0dUVGRkbV1xH1r93xeEwAAAAQ5WbtBgAAALAtClEAAABCKUQBAAAIpRAFAAAglEIUAACAUApRAAAAQilEAQAACKUQBQAAIJRCFAAAgFAKUQAAAEIpRAEAAAilEAUAACCUQhQAAIBQClEAAABCKUQBAAAIpRAFAAAglEIUAACAUApRAAAAQilEAQAACKUQBQAAIJRCFAAAgFAKUQAAAEIpRAEAAAilEAUAACCUQhQAAIBQClEAAABCKUQBAAAIpRAFAAAglEIUAACAUApRAAAAQilEAQAACKUQBQAAIJRCFAAAgFAKUQAAAEIpRAEAAAilEAUAACCUQhQAAIBQClEAAABCKUQBAAAIpRAFAAAglEIUAACAUApRAAAAQilEAQAACKUQBQAAIJRCFAAAgFAKUQAAAEIpRAEAAAilEAUAACCUQhQAAIBQClEAAABCKUQBAAAIpRAFAAAglEIUAACAUApRAAAAQilEAQAACKUQBQAAIJRCFAAAgFAKUQAAAEIpRAEAAAilEAUAACCUQhQAAIBQClEAAABCKUQBAAAIpRAFAAAglEIUAACAUApRAAAAQilEAQAACKUQBQAAIJRCFAAAgFAKUQAAAEIpRAEAAAilEAUAACCUQhQAAIBQClEAAABCKUQBAAAIpRAFAAAglEIUAACAUApRAAAAQilEAQAACKUQBQAAIJRCFAAAgFAKUQAAAEIpRAEAAAilEAUAACCUQhQAAIBQClEAAABCKUQBAAAIpRAFAAAglEIUAACAUApRAAAAQilEAQAACKUQBQAAIJRCFAAAgFAKUQAAAEIpRAEAAAilEAUAACCUQhQAAIBQClEAAABCKUQBAAAIpRAFAAAglEIUAACAUApRAAAAQilEAQAACPXltRsAAGzX4XB4lVJ6v+Of7vf7/QfR7QEghhVRAGBNXUXo0HEAGqAQBQAAIJRCFAAAgFAKUQAAAEIpRAEAAAilEAUAACCUQhQAAIBQClEAAABCKUQBAAAIpRAFAAAglEIUAACAUApRAAAAQilEAQAACKUQBQAAINSX124AsIzD4fAqpfR+xz/d7/f7D6LbAwAAT6yIQru6itCh4wAAEEIhCgAAQCiFKAAAAKEUogAAAIQSVgTAYsaGZi0RrlV6YFfp7QOAJVkRBWBJY0OzlgjXKj2wq/T2AcBiFKIAAACEUogCAAAQSiEKAABAKIUoAAAAoRSiAAAAhFKIAgAAEEohCgAAQCiFKAAAAKG+vHYDAFje4XB4lVJ6v+Of7vf7/QfR7QEAts2KKMA2dBWhQ8cBABajEAUAACCUQhQAAIBQClEAAABCKUQBAAAIJTUXYGUSbQGArbEiCrA+ibYAwKYoRAEAAAilEAUAACCUQhQAAIBQwooAYAShUtv
gOgPEsCIKAOMIldoG1xkggEIUAACAUApRAAAAQilEAQAACKUQBQAAIJRCFAAAgFAKUQAAAEIpRAEAAAilEAUAACDUl9duAADjHA6HVyml9xf4vccrH3q/3+8/yNoY2Kil7u8g5gJgMiuiAPUo7U1qae2BmtV8P9XcdmAlClEAAABCKUQBAAAIpRAFAAAglLAioGkDASDCNSqVI9RlRkDTnN9nzE2U+zrN5ToD5GNFFGhdX8FSY7jG/doNOLNWe2q8dinV226mufY6l3Z/T1Fz24GVWBEFqMScVZahlZz9fr+79vcCeVhFBbbGiigAAAChFKIAAACEsjUXFiYsZxkrB9a4dlxl7JhbMaTn4tjOce8Rw+sPUDIrorC8tcJy+sIjWgmVWPONsDfh62plDJdozNg2/oeVND5bCmsDGmNFFBrlr920aq2xPSfwqbSvIanZ0uFagr0AYlgRBQAAIJRCFAAAgFC25gJZCTJh62q/B2wjBiCCFVEgt2rfgEMm7oHllBQEBMAMVkQBgMkE9wAwhxVRAAAAQilEAQAACGVrLmS0VkjJwPPe+z7RdbgmAMNWDvYyF8PKrIhCXmu9oPY97xrt2UKYyJhzLOmarKmvr2ocJ2PPpcZzgzWsOR9ubS6G4lgRBbIq7S/MQ19FIWxleaWNhznGnktL5+yrXABYihVRAAAAQilEAQAACGVrLhBi5VAKMou8nj3bQwWNEM5WZaYQWgfDrIgCURShbVn7eq79/MB8awZ7RTy30DoYYEUUAIBwVgVh26yIAgAAEEohCgAAQChbc4FnthIstFbwiMATWJaQGIDyWREFujRfhG5QXzDHtYEda4aMlPD8lE1IDEDhrIgCbEDuVaA5v29oRXi/3++u/b2wtBLGpx0VQCusiAIAABBKIQoAAEAoW3NhRR1brARpAATZSjBbLnNCoARIAeesiEJZSnlDJAgG2IJS5txazAmBEiAFvMOKKPBMS3+dXisYZ8rzCh8BALbGiigAAAChFKIAAACEsjUXINiUrbgrbdsVHgIALMqKKADnhIfAc0LcADKyIgpBTgNqhNMAlG3JMDMArIgCAAAQTCEKAABAKFtzoTKHw+FV8hk+gJRS+XNi6e0DWIsVUaiPNzTT9AWMCB7pp294UsP9U/qcWHr7AFZhRRRoWolfQ3IegjIUXiUwhTWVeP8A0AYrogAAAIRSiAIAABDK1lyAFBsoMuV7ZGd85+x9Cdsqp/Zr0HfsXuybjQTMvNMPGzlnAAphRRTgQWtvwEs5n1LacWpMm0psd27n57iFc15DScFOAMWwIgoAsJDcOxOCdg0ALM6KKAAAAKEUogAAAIRSiAIAABDKZ0QBHtyntsJaSglIKbJffc7ugX6gBCuNw2fp2QPJ0ZK2YQEKUYAUGyiy3+931/5sbbr6NVfflOSa61TLuU2xxnhtsR8J0VU09hWSkrZhAbbmAgAAEEohCgAAQCiFKAAAAKF8RhQAYIAgGoD8rIhC20pJToVr1TCGr21jDec2RWvnc0oR2p6WxytUwYooNKT2hFU4lzvNuCQtnxtM4bULtsmKKAAAAKEUogAAAISyNReCjP3S9Tlfzh7wxe73S28nFAoy30AfLn79WJdrf1nEXAzAZVZEgSkiCkRF6Hx9fahv2+faM4agHmB1VkQBADJaMnxn6kqtICCgVFZEAQAACKUQBQAAIJStuQDBpmytyx2YMuP3XQy7yRE0NfN8Q9pYGCFEjVgqaKrnnjJuKtTg/NXF2NwQK6LAFBEBF0I0yjTmzc/ab5BqaGNurZ1PC66dwyKDptYYN339Ys4fbwv3+xbOkUdWRCHIaWDE0KrP2J+LsEbIRct/CV37esLW3N7ub1JK304pfXY8pmNKKe12afd07O7u8LrvsTWG/Jy3uaQ5p+W5HbiOFVEAoFXfTin918f/Dh0DIJhCFABo1WcppX//+N+hYwAEszUXAGjKhx9+N33++XsppfRm6+3uYdPq/fGYPkgpfZp
SSofDGq0jh6XCnYA4VkShbEIc2lLz9RzT9rXPr4Y25tba+WTxWIR2OS9cBOhMU1J/RYY7lWIL43IL58gjK6JQsK6/6k4Nn7gUuFFSmEXrov5KPzYMK7caViFqaCPDTsOGngKIrn/8w3iY+zu3wv2zLv1Pa6yIAgA1mRs2JMAIoAAKUQCgJnPDhgQYARTA1lwAVjcQPFIrgSkLedw6++nT/38STDTWYIBRbkJ12pD7YywFfSzGOGQ1VkQBKEFLRWhK7Z1PsSYWoV2Wvla5Q3VKCgyifuYqVmNFFACoxpRgobu7t9/Pcnu7n/Q7Sw0wsnoFtMKKKABQkyWChQQYAQRTiAIANVkiWEiAEUAwW3MBeggZ6ddguBCVOA8ryiQ0wAgAK6IAQ3KHjLREHwwTHFOYFy++mPLjxjdbYa5iNVZEAWjafr/frd2GFo0N+Ik4NsYnn/wspZTS7e3+5unx6WQl9NrzY7vMLTCPFVEA4BpjA34ijs1t99ifE2AEkIlCFAC4xtiAn4hjc9s99ucEGAFkYmsuBDkcDqO2cY39udzPO+Oxmw/uaYkQomWU0K+555a7u7f/91e/+t30+efvvfPvu45NixHHLjkNO7rweAFGAAuyIgrMpWhpS2vXs5Qgjtb69R3nRWgJRoYTTRkfTV9DgGhWRAEomkAQLjke0+40SOju7jA2hOiD82NJgBFACCuiAEALIkKIBBgBZKIQBQBaEBFCJMAIIBNbc4HZcoegzCQ8CRb04YfPg4nWdjgcjqfhSUNOw4q6jpUeYJQr+Cpg3jYXjzRwTS/24ZzHwtqsiAKtESjSH8AyJpillHCfJ6W151qtnEdxRejIUKIpSg8wqmWOq6WdJejrqzF9OOexsCorohBkycCVob9sX3resY8tbNWTAXP+Cu4v6Mt46texYTd9ATiXfnYopOf2dn9z7XOPDfO5uztM7JkyLBlgNHalFmBrrIgCQJyxYTd9ATgRQTtbDOQRYAQQTCEKAHHGht30BeBEBO1sMZBHgBFAsM1vzfUhbwCinIXiPHv96QrK6QvP6Tp+e7sfevrzoJ2Lv+9CcE8zlgwwOmTYrVzrRyM62u29FVyh1XrFiqgPecNYtYSt1NJOaOp1ZoHQoJKUHmBUC30D12myXtn8iigwTs1/cYM1jAj9qd2b8KOhkKQlg9r65FhBzBVgJKwIoJsVUQBYRushNi2dS5fcAUYAnFCIAsAyWg+xaelcuuQOMALghK25ALCAS8FEtRvajntqxjbZVUM4cgUY3d7u04sXX6RPPvnZAq2sT63BS0B+VkSBvhAOoT+Qz8witKz37kHBRKUX7qPnyM8/f2/JdgBUyYoobJwQonwuhdMMHaM9Y4OJ7u4yfL8HIeYEGM2xRuDTHFY9gTGsiALkMzbcRJDJNrjO7XE/A2SiEAXIZ2y4iSCTbXCd2+N+BsjE1lxYweFwCAsuGblFatVQkFZcCqd5DDe5f9zS9+n542nD0/19+v2Rt7f71dpDPjMDjAA4YUUU8hob/FNaCEdp7WlBX5/q6/a5xnkITKuXawdcZEUUMrKqWJdLQUKnK1pTHzv1ecf8zqWOsa7b2/1NGnmd1hojl++V/q9yqS1oZ6qx9/2lxw7NNzV4d4zsnwU55Z6DBCJB/ayIAls2J3gk92On/M7cx1jXlOu01hgxlvrN6YeW+tC4ASZRiAJbNid4JPdjp/zO3MdY15TrtNYYMZb6zemHlvrQuAEmsTUXGCUyYGkBnWFMl8KFLgTMvNmKeEVAycXHdh3PcExQUoEuBeBc87O5jnUFLx0evvb0/nH75acnxzZpbFjR+Xzy4sUXZ329WBNDCGsDprIiCusoLchhTHtqLUJTGtf2ms9vrC2cI3kJ3Zpm9Nz++efvLdmOtRk3wEVWRGEBlwMa+oMcrjEU2tB6UMhYc8KF5ri7e7vMUcJXeAgwKs+UaxIfQjSu3bUH7cxx1ofP5vZ0sgNi6LE19WHusDZzEGyTFVFYhtCG8uj
/B/qhPDWEFY1t9xZtsQ/XCnoDGqIQhWUIbSiP/n+gH8pTQ1jR2HZv0Rb7cK2gN6AhtubCAoQ2jPPhh9/t/JzUixdfpE8++VnW57p0TTbkPCgpfBxWFHzVGXKVW9lhRePaXXvQzqmB8Xkx9Kzr2IUwszf34+3tfpG5bwkz59PV5yCgDFZEYXmthDZkD1jqC+tYIMTjvO0hff/ixReD//9bq34kao1xWMvYr6Wd5JV7zp4bYFRauN25ufdJrfdZ6dcFimdFFGYYEygy9fGlhjZMXRka2Te9IR63t/ub88fP6Zux1+Q0XCi3NVc6hoKSahqHLRJW1J45AUa5574lXDOf1j4HCf6D/KyIwjxTAkXGPr6V0IbS+qaVfl1Cy+OwBsKK2tN6mE/u9tRwzkBmClGYZ0qgyNjHtxLaUFrftNKvS2h5HNZAWFF7Wg/zyd2eGs4ZyEwhCjMcj+l4PKZPz7cN9R0f8/ixjy1daX3TSr8uoeVxWIMp/T/2Z3MfG9tuHuTu19L6Ond7ajhnID+FKGS026VXu106nv7vws8fz/73Kqqt0erom3bf7/QHJa3S17WEfGRsZ7tjq1B9166UsdfbjnZeF94d84XNQX1KHzfQFGFFkFdv+t/I0IZa0wPHqKBvdtkCKQ6HQ2/lsUboxb6jWwf+GLBoX0d8JUp55JxEKn2MPQUYnVrrflzOu2P+KaztdP4r7ZxLHzfQGiuiMMNul3a7Xfq3j+l+i/3OJZ7nWmPbt2bfzGlPxHNEiegH5pvS17nHZ+4xQr+15qUpzzPnXHI/1hwE7VOIwjxLpPqVnh6YO3kz6rlzJ1aWfp1Saj+5sxVSc7dhrXlpyvOMFZFgbg6CxilEYZ4lUv1KTw/MnbwZ9dy5EytLv04ptZ/c2Qqpuduw1rw05XnGikgwNwdB43xGFGZ4TPP7NPOvffNF57uHzUf3j58nyv08Vzk958dAiXc+y7Pr2DDVdexK531z8XmmPnfXNb3m2OGQVnepjRf6puhx2JK7u8Obvn4aN3d3b//9dCx1HV/y2Nh2D+n4vPT9Fj+Ll/l+fMfAsfvjMX2Q+7VqQrvnPHZwDlpijj0cDs9e04DlWBGFFQylB3Yo+UVxdtvO+2Ji3+S2tWTEKedb8jikLsZStyXmn9L7es4ctETCben9BU2xIgozPAYmfDul9NmU7zp7Sg+8vd3fPD0+nfz1d8zznB4bu4Ix1aXnnfr7jse0e7fdz1dUpvbNTG+eo+v85hxb6prMddbuD86PpRnjcCvf91frtac8c+7HGc+Tda6b2ZYZc9D+2WO3PC9BjayIwjxzgxNKD21Y4nlLCkJZK+RlTQKM5tMP5BI1lkqa69YMbQIKohCFeeYGJ5Qe2rDE85YUhLJWyMuaBBjNpx/Ipdagt9xhRVGhTUBBbM2FGeYGQJQc2tDRvtkhDofD4XhNEMrt7X7O0/bKFUzUdWzuNRkIzZgV9BIdYNRi+MfpGP7qV7+bPv/8vXf+/fZ2n168+OLNNnPKNXV8dgQvzXIpLCrH3Lfbvbs1dejevSaMbqqA0KamgtXmjLnc4zVtNGiM5VgRhXKUHhwz6znnhBAtEWC0cijSGH39vfS1zz0OmypCz50XoZeOk1IqKxSs6PG54Dy1+L07o+1zx8ecdpc0NktU9P1CfayIwkhLBCLkCm2ICCsa+rm7u2W/q2SplaXdbr9YgEdNgTVLBhjV1A9rOQ3mujS3LBk60339+r+iZb/f5/tiJjrNnfuGVlRzhxClk/C3sV/t09OW2aFN175er7Xat8DKJVTBiiiMFxXcU3qYT0tKCvBYkwCjdU3pw4gx6/ptQ0QIUUltMa6hMApRGC8quKf0MJ+WlBTgsSYBRuua0ocRY9b124aIEKKS2mJcQ2F2x+O2dwMMbYew7Yg+1wT3dG1fvTTGzkMmhvSFo+Qax0NtWXprbmvGXJOS5qYp4zC9DQpJKbW15ezDD58HE+VwPKYx46HW0Keiw01aGp9dlgp667L
k60CG83hnXipNxDjsm7+mBqt5b7yOkt4T5GRFlM3bfbzb7T7efWf38aQ8wElvCCNCG3reIAteIIc5AUbNjMGFAojG9k+NRWhK5be7mfHZJSqUbennyfD7Nz8OMwWrNX2/EE9YEZtyHlaw+3i3S8f0k5TS36SU/nb38e6j44+OxzkhDsdj2o0N+xjRxkmhDWNDT645NtTma8NWcgehtL66ESlfgNH+2WOnjsO5j894D/Se8+lq0IXVm6vvUfLrWq2dOo+UtBpxPm72++fH08TgnxJ2vIxdsZsaztR3PP7Y9fPk0Ovo6WtzGrju+/1+Zx5iDVZE2Zo3YQWPK6A/Sceb76dd2qXjzfdTSj95PD436GCtkIW1Qkvmhq1QnpLG4RK/c63gHoFBLMm826+GuSXitXncz+523/hh+vivvpX+x3/7Yfr4r9Ju940RvwMmsSLK1jyEFXzn5WcppZ+klL6Xbl5/NaWUHv/7vZRSSt95+VH6+cs5QQdrhSysFVoyN2yF8pQ0Dpf4nWsF9wgMYknm3X41zC0Rr82Xf3a3+5OU0i9fppdf+mH68c1Nev1fUkq/T7vdn6fj8Z9G/C4YRSHKpjxux30oQn/3tR+kr/z2/Ef+IP3uaz9I7//vH6R0TCnt0qRPjp48T0rp05RSOkzc1XT62K5jF9rzZutN18/NOTbkUpv7jk/tG+KUPA5z/84c98BYY++VvvsHhoyZd5ca2xUYnJf6jpd+7PZ2/yxw6CSYaNQ27NNAuj9O/5T+OX09fZD+Jd2klL70+Ctep5Tu0we/+vpu9y3FKLnYmsumvNmOm9L3OorQB1/5bUrf+mlKf/kfU5r4kYigYIjSwgJKak9JbWmdvn50ft8PzANz+qzW/q613S0bfU2iwo5yqa29uZwHDl0brPaH6TfpH9KfvSlCT92klN5Pr1JK6Ze26ZKLFVE2400w0fHm+2+24/Z5KkZTSum//6eUer5doTvEYX8WJDCn1Q/mBMcsYFbYSu6+OVXS10S0GJxU2Dhc05t7oCso5GllImd42NPYjg5RmRq2VlJwD29dunf7x8N7abfbZx03uYPoLt1naTvz0ijn71tOA56+mX6dbtLr3lWqx+NfSil9M6X0m0UayKZYEWVL/iKl9DcXi9AnX/ltSn/6n1P617+45rnWCjiJsEToDHUoaRyuaYuhTdSt5XFTUluACRSibMkvUkp/m17f/L9RP/27r6X0j/8hpf/1F9c811oBJxGWCJ2hDiWNwzVtMbSJurU8bkpqCzCBrblsxvFHx+Pu491Hafc6pYd03D/o/eHffS2lX31vcFvu4HNlDuSZGRyT1dywlVx9czgcXqXyv6S8U8c2s/uc24qX6pvTrdRP1+702IXvz2zG2DE8514Zc/9EHFsrUGxgDGe9V7ai5XFT0uvjmnLMv79O30yv0016nbpXqh6P/z6l9OvZTwbJiigbc/zR8ZhS+iil9NP0u691/9DIIrSwUISoQJCSgkeqLEJ75D6XVfqmsHtiKSXdAy3rG8MRY9s1bovr+ahrjj499s/pG+nP0j+kV+nrzz5Y+5iam1JKf56OR58PJQsromzKQ4DB8dvpOy8/St/5OKXnK6P/N33ltz9N9//qo5R2V4V1vH2eZQJ5up5jWvjEesei+oZ1nH59wLk5wT3nx4fuyZwBQWOODY3hpe+ftc+vVV0rrhsIH2tq3Ix9few7PieMKXoOSsNhTIPBaik9zNv7/X739Dv/Z/qTz/4w/csfp5R+eUzpS6/Tzc1Nev36JqXffz298j2iZGVFlK15CDD4+ctvp6eV0afPjD7896cppY8e/31O0EFEUMJa4RPCirhG1LgpKZQl6v5pJXSGWC2Pm6h7as5zrzUHXf7Zh2Lzj16ml3/9p+kf/8/L9PKvU0p/pAglN4UoW/MmwODNNt3d679Lx3RMu9d/l1L66PH43KCDiKCEtcInhBVxjahxU1Ioi7AiStbyuIm6p+Y891pz0LifPR5/8+P
0o7//Vfo3/+7H6Ud/bzsuS9gdj83tNplkaLuN70Pbht3Hu116+GqXXzwWoe9ocUtWDS7df61dl/PznTM3ldg3OefTkubtNdtSSWDXVeFCU/q1lX6ICmiqpL/mWDzQqqQ5aLdLvW05Ht8GXZTUZqZr9fr5jCib91h8/nztdjDZfWrnzVTuMI3S+kZYyDJKusZ9ItrYSj9EBTTV0F9ztH5+5/rme/MuxVOIAlXK/Rfvlv7a6Ost8ik5rGjtQBhgHdeEFZovKJHPiAJAv5JCS6Y8N9CuGgOk4BmFKAD0Kym0ZMpzA+2qMUAKnrE1FyhSx1bZxQMo4NzjFrdP+44dDtc/duqxKc9NuzYQNlSUEvv7mnnEfEGJrIgCtSjqjQAUoIYwkog21tAPOZkLY+lvWIgVUcgs59dwtK7ErxmBXJYOKzoe96NCSuYcu7s7vJ5yzmvMaV07JXLPu2N/31bmtDlfIZW7/4F6WREFgGVEhRVFHAOArBSiALCMqLCiiGMAkJWtuXCixFACoE7vBoU8zC2n3+X3FB4y9tiUn811DACWYkUU3qUIBZZgbiGXtcOZxjx/38+s3XagIFZEAWAB74b+rN0aWlHD11jV0EZgfVZEAWAZQn8AoIdCFACWIfQHAHrYmgsr8t1o06zVX5mf936NbWsDQVyrtGesqQFiJd1TU7+DswUR/d/xHEWPYeKVPEaEIsJbVkQBYq31BqTveUt/Q1R6+1ifMcIlJY2RktoCq7IiCgAr2u/3u2seN3X18ZrnKWmFGYC2WBEFAAAglEIUAACAULbmApBFrYFIJdGH15myhdh2422Kuu4lByVBaayIAsS6X7sBC6o1EGlN5+NBH0Jbzu/dll8DYBIrorCia0NKWjb0V+sl+2ut5yWv6Gtl3Lwr1zlbtaRVa62OuqcokRVRAAAAQilEAQAACGVrLmRWQWiG4ASaVdL2s9xtmfv7Suqblozt1zn9v8C18zoArM6KKGyP0JM69AVaCLoA5vI6EMecDT2siAIUyGoFdBsTiDQnRMrKcZuuDdKaOh62GFIG17IiCgAAQCiFKAAAAKFszYUNyrz1TOjFTIfD4VXq/syWvgUgpTT4WjH39855T+B1iqtZEQXmEnoxX18f6luIt4VwmdrOsbb2LqXE14QS20QlrIhCZudBBYIvIM6SQSE57uU5QTlz55Zr+maL81fX6s7Y67JUSNLWA3CmXhOgDlZEAQAACKUQBQAAIJStuQAAXLRUWE4EW3mhPFZEgbmESAApmQu2oMoiNNDS90CJ91iJbaISVkRhpLGBFNf+TqBeS8wPpTN/sXXR94CvSaE1VkQBAAAIpRAFAAAglEIUAACAUD4jCvDgPnUHcRQdxDCQYnm/wc8TFX2tztWcQMoiapiD+toIMJlCFCApCEN+AAAJ/ElEQVRVHQLR96aw+TeLDYTlNH+NGK+GOaikNk4NBbt2vthK+BiswdZcAAAAQilEAQAACKUQBQAAIJTPiEJlKg442WJ4DpDZ3M/s+cwfuXSMJa9zMIEV0f40upJS6ijPmuOmxiI0pfHtdk8uQ78CLKvW12fK1+Rr+OZXRP3limsYN8vRt8vQrwBQp1Zfw62IAgAAEEohCgAAQKjNb82FnKYGCW0tNGPF8918gMTA2Nx83/AgIgit4rC1XmPntczzn/sWqJ4VUXjX3A+DN/UGqyGuS38flN43TQY0FCpiLJQ+3mqhH+OYa2AhVkThhL8wX2e/3+8u/czWVn/Jwz0JrKlrDvJ6BnlYEQUAACCUQhQAAIBQtuYCAFCEpQKtbKeF8lgRhby2GGow9pzX7JstXhcokXsxj5b7URATbIQVUchIsEo/fQPUOg8MraaNCWsD4DkrogAAAIRSiAIAABDK1lxglIEAiftat9sB9WhpDmrpXHhXxzZu1xR6WBEFxuoLkBAswZO+AJWWg1W66IdltDQHtXQuubV2n7im0MOKKABZ+Kv/A/0A15tz//iKFqiLFVEAAABCKUQ
BAAAIZWsuANUT/lKegWuyxHPl3pJp3AAszIooAC3YYvhL6aEuNfd9zW3fshLviRLbBEWwIgoAFepasRPWwpZNWcUeulf2+/1uqccCb1kRBQAAIJRCFAAAgFC25gKzdWxTEvQBAEAvK6LAEgR9xOkLwhCQUb7Wr1HN51dz2wGqYEUUoGJWnuvVethQ7rEpIAagLVZEAQAACKUQBQAAIJStuQATHA6HV6n7M7ACmoCiDMxXEbLNieZdaJMVUYBp+t7UCWhiS7YW5lNrKNia81LO5zbvQoOsiAIAg7YeBmTVDSA/K6IAAACEUogCAAAQytZcgIJFfK9kx3MUEwAipGSbpobsVPD9q6PHawXnMlrJc0uElcOiomzqmpKXFVEAzpX0xklIyTa1dn1bO59rba0ftnC+WzhHFmJFFACA0S6FV7W0qgssx4ooAAAAoRSiAAAAhLI1FyCTrQdzwClBU3TJvW13iW3AthZDDCuiAMsZG+Jwv2grpiutPYzXd+3GXNM5j+0yJ2iqtTHY0vm0dC65nffNFvpqC+fIQqyIAqzM6hC5zBlLJY3DrrYMrVJdCs+pyZwgoJb6oQaX+ruke8q4oURWRAEAAAilEAUAACCUrbkArG4g2AagSDNCjQR2QbIiCkAZaihCc4f5ANtUw3wHi7MiCgAjWMGA+a4JxvF1KtAmK6IAAACEUogCAAAQytZcAAYJEqpHLdfKVksArIgCcEnxhQ1vuFZ1EHy1ba4zJCuiAAChBF/lc034UUrDq/LX/k5gGiuiAAAAhFKIAgAAEMrWXFjYQHjI/TXbszYSRnJV3wBAKSp/vfY6zOKsiMLy+l6Ern1xKv5FLYMtnGNNagjWEP7yYGvnCyWr+bWs5rZTCSuiAAyK+Kv43K/z8Jf7ByX1Q+4wGF/5AtAWK6IAAACEUogCAAAQytZcAIAG2L4M1MSKKNRnC2EkJZ9jyW0DYJyIgLOaXy9qbjuVsCIKlVkrjGTqX9qvCSOpQVf/W4UAqEvEa2nL4WGQgxVRAAAAQilEAQAACKUQBQAAIJTPiAKQxeFweJVSej/w+a79bO59CZ/dmtpfPouc7lN3fwlVAaiQQhSAXMKK0JlKaWcp7ahCCX88KF0poTP+aAKMYWsuAAAAoRSiAAAAhFKIAgAAEMpnROGCJQNYVvocTRFBLbAm9x5TDLwOuKYAV7IiCpe1FijS2vlQDumlw87vPf1Vj755c435tG/cGE9AVayIApDFnJWha1YoLyWElp7c2dVfQ20uJRF1rNL7v1ZWYIFWWBEFAAAglEIUAACAULbmAjRmyYAtAIAcrIjCZa0FQLR2PjynCH1Q2lgvrT3Adgi5ojhWROGCucEQLYWPQE2EugA8MB9SIiuiAAAAhFKIAgAAEMrWXMhoakjMSt+zd2+LDlCbUkO4zOMA17EiCnkV9yapQ0QbhR+sawv93+o5ChTpV8P8GmVrfeG+gAZZEQVmE7pUlhpXSoR6Pajx2sHS3BfQJiuiAAAAhFKIAgAAEMrWXABYWKlBOx2E4ECDzEGUyIoo5FVDcMK1bRQW0U/fzNd6H9bwBjClstvZyljIofS+aP1+rlHJ9/apWtpJBrvjcY3UcdgOISzASl/xcZVc81JLc19L58I2bXEOonxWRAEAAAilEAUAACCUsCIAgIVUFBIzh4AZYDIrorA8oQ1ALfd7Le2sSetFaErbOMfa1XJv19JOMrAiCgvzV2LAPACsyRxEiayIAgAAEEohCgAAQCiFKAAAAKEUogAAy9lC+MoWzhHITFgRAMBChMQAdLMiCgAAQCiFKAAAAKEUogAAAIRSiAIAS+gLsKkx2KalcwEowu54PK7dBgAAADbEiigAAAChFKIAAACEUogCAAAQSiEKAABAKIUoAAAAoRSiAAAAhFKIAgAAEEohCgAAQCiFKAAAAKEUogAAAIRSiAIAABBKIQoAAEAohSgAAAChFKIAAACEUogCAAAQSiEKAABAKIUoAAAAoRSiAAA
AhFKIAgAAEEohCgAAQCiFKAAAAKEUogAAAIRSiAIAABBKIQoAAEAohSgAAAChFKIAAACEUogCAAAQSiEKAABAKIUoAAAAoRSiAAAAhFKIAgAAEEohCgAAQCiFKAAAAKEUogAAAIRSiAIAABBKIQoAAEAohSgAAAChFKIAAACEUogCAAAQSiEKAABAKIUoAAAAoRSiAAAAhFKIAgAAEEohCgAAQCiFKAAAAKEUogAAAIRSiAIAABBKIQoAAEAohSgAAAChFKIAAACEUogCAAAQSiEKAABAKIUoAAAAoRSiAAAAhFKIAgAAEEohCgAAQCiFKAAAAKEUogAAAIRSiAIAABBKIQoAAEAohSgAAAChFKIAAACEUogCAAAQSiEKAABAKIUoAAAAoRSiAAAAhFKIAgAAEEohCgAAQCiFKAAAAKEUogAAAIRSiAIAABBKIQoAAEAohSgAAAChFKIAAACEUogCAAAQSiEKAABAKIUoAAAAoRSiAAAAhFKIAgAAEEohCgAAQCiFKAAAAKEUogAAAIRSiAIAABBKIQoAAEAohSgAAAChFKIAAACEUogCAAAQSiEKAABAKIUoAAAAoRSiAAAAhFKIAgAAEEohCgAAQCiFKAAAAKEUogAAAIRSiAIAABBKIQoAAEAohSgAAAChFKIAAACEUogCAAAQSiEKAABAKIUoAAAAoRSiAAAAhFKIAgAAEEohCgAAQCiFKAAAAKEUogAAAIRSiAIAABBKIQoAAEAohSgAAAChFKIAAACEUogCAAAQSiEKAABAKIUoAAAAoRSiAAAAhFKIAgAAEEohCgAAQKj/D3dlYGiWPh2hAAAAAElFTkSuQmCC\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + " Greedy best-first search search: 153.0 path cost, 502 states reached\n" + ] + } + ], + "source": [ + "plots(d4)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# The cost of weighted A* search\n", + "\n", + "Now I want to try a much simpler grid problem, `d6`, with only a few obstacles. We see that A* finds the optimal path, skirting below the obstacles. Weighted A* with a weight of 1.4 finds the same optimal path while exploring only 1/3 the number of states. But weighted A* with weight 2 takes the slightly longer path above the obstacles, because that path allowed it to stay closer to the goal in straight-line distance, which it over-weights. And greedy best-first search has a bad showing, not deviating from its path towards the goal until it is almost inside the cup made by the obstacles."
+ ] + }, + { + "cell_type": "code", + "execution_count": 48, + "metadata": { + "scrolled": false + }, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAA6oAAAJCCAYAAADJHDpFAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvIxREBQAAIABJREFUeJzt3cGOZOd53+GvRorgKOA4WgXIKs42EpraZJVAvoFA8IZeCPLOia5CHN5CVoqDLBxACw4gGEj2gQRlH8iOgWwM30EscyDGUOA5WbDFUD3dzaruqnN+56vnAV4QeDmc89ZXVcP+d59657AsywAAAICKF1sPAAAAAF8kqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAIAAJAiqAKbORzG4XAY7x8O43BqDwCAeQm
qwJZuxhg/uf3nqT0AACZ1WJZl6xmAK3X7E9KbMcYvlmUsp/QAAJiXoAoAAECKW38BAABIEVSBs3vOkqQ99mrzOIfHZwQA+gRV4BKesyRpj73aPM7h8RkBgDifUQXO7jlLkvbYq83jHCzhAoC9E1QBAABIcesvAAAAKYIqcLTSUpxSrzaPc+j1avOs9ZgB4KkEVeAUpaU4pV5tHufQ69XmWesxA8DTLMuilFJH1RjLYYzl/TGWg97/79XmcQ69Xm2etR6zUkop9dSyTAkAAIAUt/4CAACQIqjClbMM5vm92jzOoderzVPq1ea5xOMD4Am2vvdYKbVt3X6u7K/GWN5/rHfKr722Xm0e59Dr1eYp9WrzXOLxKaWUOr02H0AptW0Ny2Ce3avN4xx6vdo8pV5tnks8PqWUUqeXZUoAAACk+IwqAAAAKYIqTMrCk/V6tXmcQ69Xm6fUq81ziccHwBNsfe+xUuoyNSw8Wa1Xm8c59Hq1eUq92jyXeHxKKaVOr80HUEpdpoaFJ6v1avM4h16vNk+pV5vnEo9PKaXU6WWZEgAwrdevX38yxnhv6zl25s0HH3zwcushgOvmM6oAwMyE1NM5M2BzgirsjIUnvV5tHufQ69XmKfXWvA4A+yGowv7cjDF+cvvPNXtbXrveq83jHHq92jyl3prXAWAnfEYVdub2JwQ3Y4xfLMtY1uptee16rzaPc+j1avOUepe+zscfv347ONkHH3zgp9HApgRVAGBar1+/9oXOEwiqwNbc+gsAzOzN1gMA8ARb//04SqmHq/Z3AZbmKfVq8ziHXq82T6lXm6fUu/R1Pv744+Wh+uIMSim1RfmJKrRd48KTPfZq8ziHXq82T6lXm6fUW/M6AC1bJ2Wl1MN1Td/Z33OvNo9z6PVq85R6tXlKvUtfx09UlVLlOizLcnK4BQBg3x5bNGWZErA1t/4CAACQIqjCBg6HcTgcxvu3f9ffLnq1eUq92jzOoderzVPq1eYp9da8DkCNoArbKC3rqC312GOvNo9z6PVq85R6tXlKvTWvA9Cy9YdklbrGKi3rqCz12HOvNo9z6PVq85R6tXlKvUtfxzIlpVS5LFMCALhClikBZW79BQAAIEVQhTMqLeHY61KPPfZq8ziHXq82T6lXm6fUW/M6ADWCKpxXaQnHXpd67LFXm8c59Hq1eUq92jyl3prXAWjZ+kOySs1UpSUce1vqsedebR7n0OvV5in1avOUepe+jmVKSqlyWaYEAHCFLFMCytz6CwAAQIqgCkcoLdeYfanHHnu1eZxDr1ebp9SrzVPqrXkdgBpBFY5TWq4x+1KPPfZq8ziHXq82T6lXm6fUW/M6AC1bf0hWqT1UabnGrEs99tyrzeMcer3aPKVebZ5S79LXsUxJKVUuy5SO8Pr160/GGO/d86/efPDBBy/XngcA4LksU4I5zJpV3Pp7nPue+Mf6AAAAa5gyqwiqcEdpkUapV5un1KvN4xx6vdo8pV5tnlJvzesA1Aiq8K7SIo1SrzZPqVebxzn0erV5Sr3aPKXemtcBSPEZ1SP4DMd1uf0u880Y4xfLMha98fnrvzRPqVebxzn0erV5Sr3aPKXepa/z8cev344H+PoG9mPWrCKoHmHWJx8AuF6+voE5zPpedusvAAAAKYIqV2vLxRV77NXmKfVq8ziHXq82T6lXm6fUW/M6ADWCKtfMUo/TerV5Sr3aPM6h16vNU+rV5in11rwOQIrPqB5h1vu+r93td5NvhqUeiaUee+7V5nEOvV5tnlKvNk+pd+nrWKYEc5g1qwiqR5j1yQcArpevb2AOs76X3foLAABAiqDKVagtrthjrzZPqVebxzn0erV5Sr3aPKXemtcBqBFUuRa1xRV77NXmKfVq8ziHXq82T6lXm6fUW/M6ACk+o3qEWe/7via33zm+GZZ6ZJd67LlXm8c59Hq1eUq92jyl3qWvY5kSzGHWrCKoHmHWJx8AuF6+voE5zPpedusvAAAAKYIqAAA
AKYIqV6G2YXGPvdo8pV5tHufQ69XmKfVq85R6a14HoEZQ5VrUNizusVebp9SrzeMcer3aPKVebZ5Sb83rAKRYpnSEWT+gfE1uv3N8M2yfzG6f3HOvNo9z6PVq85R6tXlKvUtfx9ZfmMOsWUVQPcKsTz4AcL18fQNzmPW97NZfAAAAUgRVprOHxRV77NXmKfVq8ziHXq82T6lXm6fUW/M6ADWCKjPaw+KKPfZq85R6tXmcQ69Xm6fUq81T6q15HYAUn1E9wqz3fc/q9rvENyO4uGLPvdo8pV5tHufQ69XmKfVq85R6l76OZUowh1mziqB6hFmffADgevn6BuYw63vZrb8AAACkCKpMZw+LK/bYq81T6tXmcQ69Xm2eUq82T6m35nUAagRVZrSHxRV77NXmKfVq8ziHXq82T6lXm6fUW/M6ACk+o3qEWe/7ntXtd4lvRnBxxZ57tXlKvdo8zqHXq81T6tXmKfUufR3LlGAOs2YVQfUIsz75AMD18vUNzGHW97JbfwEAAEgRVNm1vS6u2GOvNk+pV5vHOfR6tXlKvdo8pd6a1wGoEVTZu70urthjrzZPqVebxzn0erV5Sr3aPKXemtcBSPEZ1SPMet/3DG6/I3wzdrK4Ys+92jylXm0e59Dr1eYp9WrzlHqXvo5lSjCHWbOKoHqEWZ98AOB6+foG5jDre9mtvwAAAKQIquzaXhdX7LFXm6fUq83jHHq92jylXm2eUm/N6wDUCKrs3V4XV+yxV5un1KvN4xx6vdo8pV5tnlJvzesApPiM6hFmve97BrffEb4ZO1lcsedebZ5SrzaPc+j1avOUerV5Sr1LX8cyJZjDrFlFUD3CrE8+AHC9fH0Dc5j1vfzVrQeAPTh8dDiMMb4zxvjZ8qHv7gAAwCX5jCq7tsbiisNHh8NYxo/GMv7bWMaPbkNrauHGGr3aPKVebR7n0OvV5in1avOUemteZxal52+m18g1vpbYnqDK3l10ccVtKP3RWF58fxzGYSwvvj/G52G1tHBjrWUbpXlKvdo8zqHXq81T6tXmKfXWvM4sSs/fTK+R3+4dDt/44fjou98af/5ffjg++u44HL4x4Mx8RvUIs973PYPb7+LdjEssrvj9V78Yv//Rj8YY3xtj/KMvXPZXY4wfj59++IPx01dnu3a9V5un1KvN4xx6vdo8pV5tnlLv0teZcZlS6fmb4TVyb28c/sUY4+fLGF95O168eDHevj2M8fdjjH89luV/Pu2Z4zlmzSqC6hFmffJ52Oc/Sf311//t+Nqn7/6CX399jL/43hj/9T+M4Y4XYH1vlmW83HoI9s3XN5zscPjm346Xf/He+OS3bst8O8Z4M16O3x2ffEtYXd+s72W3/sIdn4fUMb53b0gdY4yvfTrGt348xr/5d2MM3+wBVvfe1gMAV+az23t/fjekjvFZoHhvfDLGGD93GzDnIqiya2dfDPCbxUlvX3x//Pbtvu8SVoENnf3Pv416tXlKvTWvs0el52r218jhMA4fjlffWcb4ykPh4bb/lTHGP3ngl8BJBFX27tzLAr4zxvjj8eLtPzzq6l/7dIxv/6cx/tnPTp8c4HnWX6BymV5tnlJvzevsUem5mv01cvNn4w/+/dvxQnZgNT6jeoRZ7/ueweHcywJ+8xPV5cX3jwqrPqsKbOfFiCx0eU6vNk+pd+nr7H2ZUum5mvU18sXeD8dH3301Xv3nw+MfPXgzxviXY1n+12PPHec1a1YRVI8w65PP/X7rM6qP3f4rpAItFixxEl/fXK/DYXwyTvys+z8efzP+evzeeDn+9t5bMt+OMV6M8csxxj8fy/I355iT48z6Xvbje7hj+XBZxhg/GGP8ePz66/f/IiEV6LFgCTjWyX9e/HJ8Y/yr8d/HJ+N3x90fxf9m6+/47K+oEVI5C0GVXbvUUoLxarkZP/3wB+Nrn/7J+OzvTf2iX42vffon480/fTHG4dtjjBfLMg7LMg7js/fUlL3aPKVebR7n0Ou
[base64 PNG data elided]\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + " A* search search: 124.1 path cost, 3,305 states reached\n" + ] + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAA6oAAAJCCAYAAADJHDpFAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvIxREBQAAGO9JREFUeJzt3T+PXNd9x+HfHSuCo0BrqTKQKnGTwhEoNq4cyG8gMAIDq0KQXTnRq7Cod6E4SOFChQkEAeI+kGBXaSIlDuAudZpIJiHFUGDeFLtkqOXu8s7u3HO/58zzNIKPaJ3D+UPNR3Pvb6d5ngsAAABS7LY+AAAAADxNqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAAB
BFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABBFqAIAABDlha0PAACwlvv37z+oqpe3PkdnHp6enp5sfQjguPlGFQAYmUjdn8cM2JxQBQCGMk01TVO9Pk01bX0WAG5GqAIAo7lTVf9w/lcAOiRUAYDRfFJVPzj/KwAdEqoAwFDmueZ5ro/nueatzwLAzQhVAGBkD7c+AAD78+NpAIAunA9HulNVnzz+tvT5a2c/ZuVm/99t19be5+c/v//oNs8HwJp8owoA9OKyIUkjr7XcByDKNM9u3wAA8iV92zn6N6qnp6d+tA+wKaEKAHCE7t+/f+WHQKEKbM2lvwAAAEQRqgDApqappmmq188vS7XW+LEBSCRUAYCtJQ0wSlpruQ9AFPeoAgCbShpglLS29j6GKQHJhCoAwBEyTAlI5tJfAAAAoghVAODg0gYG9bjWch+ANEIVAFhD2sCgHtda7gMQxT2qAMDBpQwM6nlt7X0MUwKSCVUAgCNkmBKQzKW/AAAARBGqAMBivQ4M6nGt5T4AaYQqALCPXgcG9bjWch+AKO5RBQAW621gUM9ra+9jmBKQTKgucP/+/QdV9fIlf+vh6enpSevzAADclmFKMIZRW8Wlv8tc9sRftw4AANDCkK0iVAGA4QcG9bjWch+ANEIVAKgaf2BQj2st9wGI4h7VBdzDAcDoRh0Y1PPa2vsYpgRjGLVVfKMKANQ81zzP9fHTkdRibcu909da7gOQRqgCAAAQRagCwMCShgMZppT52AAkEqoAMLak4UCGKe231nIfgCiGKS0w6g3KAIwvaTiQYUpZj41hSjCGUVvFN6oAMLCk4UCGKWU+NgCJhCoAAABRhCoAAABRhCoADCJpYm3aZNse11ruA5BGqALAOJIm1qZNtu1xreU+AFFM/V1g1ElaAIwlaWJtymTbntfW3sfUXxjDqK0iVBcY9ckHAI6XzzcwhlHfyy79BQAAIIpQBYAOJQ396WFgUI9rLfcBSCNUAaBPSUN/ehgY1ONay30AorhHdYFRr/sGoF9JQ3+SBwb1vLb2PoYpwRhGbRWhusCoTz4AcLx8voExjPpedukvAAAAUYQqAARJGuYz0sCgHtda7gOQRqgCQJakYT4jDQzqca3lPgBR3KO6wKjXfQOQJ2mYzwgDg3peW3sfw5RgDKO2ilBdYNQnHwA4Xj7fwBhGfS+79BcAAIAoQhUANpI0uGf0gUE9rrXcByCNUAWA7SQN7hl9YFCPay33AYjiHtUFRr3uG4BtJQ3uGXVgUM9ra+9jmBKMYdRWEaoLjPrkAwDHy+cbGMOo72WX/gIAABBFqAJAA0lDepLW0s6TtNZyH4A0QhUA2kga0pO0lnaepLWW+wBEcY/qAqNe9w1AO0lDepLW0s6TtLb2PoYpwRhGbRWhusCoTz4AcLx8voExjPpedukvAAAAUYQqAByQgUH7raWdJ2mt5T4AaYQqAByWgUH7raWdJ2mt5T4AUdyjusCo130DcHgGBmUNDOp5be19DFOCMYzaKkJ1gVGffADgePl8A2MY9b38wtYHgB5M701TVb1RVR/N7/qvOwAAsCb3qMIFFwdNTO9NU831fs31zzXX++fRGjVwY6ShHj2upZ3H45C3lnaepLW08ySttdwHHvNaIoVQhWc9GTRxHqXv17x7u6aaat69XfUkVpMGbow01KPHtbTzeBzy1tLOk7SWdp6ktZb7wGPPf91M06s/qfe+/1r92z
/9pN77fk3Tq+2Pyejco7rAqNd9c7nz/zJ4p75375P63nvvV9VbVfVHT/2Sz6vqg/rw3Xfqw3t3KmDgRou1tPMkraWdx+OQt5Z2nqS1tPMkra29j2FKXOa5r6Wavl1Vv5yrvvaodrtdPXo0Vf2+qv6i5vnX2538eI3aKkJ1gVGffK725JvUL1/663rxi2d/wZcvVf37W1W/+NsqV7wA7T2c5zrZ+hD0zecbnmea6kFVvfz4f3+7fl2/qu/WSf32K5dlPqqqh3VS36gHr4nV9kZ9L7v0Fy54EqlVb10aqVVVL35R9doHVX/5N1XlP/YAzb38/F8CcGtP/qx5pT69NFKrzoLi5XpQVfVLlwFzKEIVnvJkcNKj3dv11ct9nyVWgQ21GLLTYi3tPElrLfdhLGu8Rr5Z/1W7enRlPJyvf62qvnnI3wvHS6jCV71RVT+u3aM/XPSrX/yi6u7fV/3JR+ueCuBZSUN/ehgY1ONay30Yi9cI3XOP6gKjXvfNs/7/R9Hs3l4Uq+5VBbazq5ChP8kDg3peW3sfw5TGdajXSJ3dflpVVX9Wv6l/qe/UST28buuHVfWdmuffHP53xVVGbRWhusCoTz6X+8o9qtdd/itSgSwGLLEXn2+Ow3RhINJNvVKf1n/Wn156j2rVWdHuqj6rqm/VPH962/1YbtT3skt/4YL53Xmuqneq6oP68qXLf5FIBfIYsARc5iB/NnxWr9Z361f1oL5RF7+Kfzz1t85+RI1I5SCEKlwwTTXVvflOffjuO/XiFz+ts5+b+rTP68UvfloP/3hXNd2tqt081zTPNdXZe2rItbTzJK2lncfjkLd2yH9mXSNpEFDawKAe11ruw3ZavUZuaVdVd/+j/nz3Sv32tV3VZ3PVw9/X7vO56uGu6jM/moZDE6rwrLMhAh/eu1OPv1l9tPufqqrzv35QVe+c//2UgRsjDfXocS3tPB6HvLW1/pkXJf2evW5uv9ZyH7bT6jVymDOexei37tW9H96tf/3ve3Xvh3V2ua9I5aDco7rAqNd9c7mLgwWeDFiq+nFV/V1N9c787jwnDdwYYahHz2tp5/E45K0d8p9Z9cxVd0/rbsBS2nmS1tbexzClDGu/Rur6PzOWWvRnC9sYtVWE6gKjPvksdz5g6Y2q+uj8HlaATUzTXh8IDVjiSj7f5JkONPjo0M5vQSDUqO/lF7Y+APTgPE4/3PocAHX24x+WfpCN+8ALXCvxPXvtz6OBtbhHFQAOaO2hOPNcJyMNWEo7T9Jay324nU6ek5sOdDvxumELQhUADittKE6LfVsNgzm2tZb7cDs9PCc9nBGecI/qAqNe9w3A4W0xFKc6HrCUdp6ktbX3MUzpcBq9l2/roH8WkGPUVhGqC4z65AMwhsmAJW7A55t1TAYi0dio72WX/gJA//YZdhL3ARoGk/geMxCJ7ghVAGhgzaE4PQ9YSjtP0lrLfYh7XBcNPjIQiZEJVQBoY8uBPClnMUxpv7WW+5D1uLZ6jUAs96guMOp13wC003ogT3UyYGnLvdPX1t7HMKWvavTeW2rRe/S252YMo7aKUF1g1CcfgHFNBizxHD7fXG3aeCCSwUfsY9T3skt/AWBMBizBzW35njD4CEqoAsBm1hyU08uApS33Tl9ruc8owh6bRQORLlkz+AhKqALAlgxY2nbv9LWW+4wi6bFJOgt0xz2qC4x63TcA22o9pKcCByxt8Tj0srb2PiMOU2r0XlnqoO8puMqorSJUFxj1yQfguEwGLPGU0T/fTAYicSRGfS+79BcAjocBSxwTA5GgY0IVAIKsOTwnccBSq316XGu5T4o1HodbMhAJNiJUASDLsQ1YarVPj2st90mxxuNw6POM8lhDNPeoLjDqdd8A5Gk9uKc2HrC0xe+5l7W190kcpnTIx6EMROJIjNoqQnWBUZ98AJiuGbBkGMzYRv98c91reynvAXow6nv5ha0PAABs6mFdMXTmkg/6JgHT1LTt5F4DkWBDQhUAjt
hl4XnNN1EmAdParV9zvhWFPhmmBAAd2moabK+TbXtca7nPTaWfL/E8wDJCFQD6NMok4Fa/lx7XWu5zU+nnSzwPsIBhSguMeoMyAP1acxpsNZwEvPbvpee1tfc5xNTftR+H6QADkcrkXgY3aqsI1QVGffIB4DJ7xoEBS5069OebadvBR1dyjyqjG7VVXPoLAFy0z7TTuDBhM4mvBZN7oVNCFQAGcaihMfNcJ+ffQu2q6m495/NCDwODelxruc8SWw0buvhanOea9lg7MRAJ+iRUAWAcPQ5YanXuHtda7rPElsOGDESCI+Me1QVGve4bgLH0OGBp7XP3vLb2PvsOU7rla+TG5rmmNR5bGMWorSJUFxj1yQeApQxYGs9tPt+0HJxkGBJcb9RWcekvALCEAUs8rdVzbBgSHCmhCgADSx+wdMgzjrbWcp+Lbjk46aaDjy4dhrTn3sAghCoAjC19wFKrM/a41nKfi1r8f2/7OAADc4/qAqNe9w3A+NIHLK19xp7X1t7numFKb755eu3zVysN19rncQDOjNoqQnWBUZ98ALgNA5b6tvTzzb6Dkww/grZGbRWX/gIAN2XA0nHY57kz/Ag4CKEKAGw2YOmmex/DWst9LrrF4KQTg4+AQxCqAEDVdgN61th7lLWW+1zUYugSwJXco7rAqNd9A8BjWw1YOuTeo62tvc/SYUp1oKFZwDpGbRWhusCoTz4AHNo+A5a+/vX/rZ/97B/XPA57+tGP/qp+97s/WPzrDU6C7Y3aKi79BQAOafEwnX2CiDb2fE4MTgJWI1QBgMUOPWCJrhicBDTjXx4AwD4OPbSHfniegWaEKgCwj0+q6gfnf913jb55noFmXtj6AABAP86nuH68z9p0zUWhb755evAzjmGuCruadulzD3AIvlFd5qphAYYIAMDz+ffl3rIitTyHkGzIVvHjaQCAg1vycz33+VE2NOfnowKb8o0qALAGg3f65vkDNuUbVQDg4Hyj2j3fqAKbEqoAwCaEaq7zn4ULsBmX/gIAW+l60MfAPC/A5oQqANDENNU0TfX6+SWkNc91cv7N3a6q7lbVbp5rsna2tuHeJxefK4DWhCoA0MpVw3iWDu45trXE8wA04R5VAKCJq4bxLBm8dIxriecBaEWoAgAAEMWlvwAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAE
QRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAEQRqgAAAET5Pzxfqg2F7ogAAAAAAElFTkSuQmCC\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + " (b) Weighted (1.4) A* search search: 124.1 path cost, 975 states reached\n" + ] + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAA6oAAAJCCAYAAADJHDpFAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvIxREBQAAGSNJREFUeJzt3TGPHOd9x/H/nAXBUSBaqgykStykcARZjSsH9hsIXBigCkHunOhVWNS7UBykVKED0iR9IMGu0kRKHMBd6jSRzIMUQIE5Kbh3Od7t3s3e7sz8nmc/n0bwY5Iz3FuK/Gp3fxzGcSwAAABIcbb2DQAAAMB1QhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQ
hUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUAAIAoQhUA6Mow1DAM9YNhqKHls8T7AViKUAUAevNmVf3D5p8tnyXeD8AihnEc174HAICj2bwK+GZVfT6ONbZ6lng/AEsRqgBAt4ahnlbVq2vfR2MuxrEerX0TwGkTqgBAt4bBq4EPMY4+mwqsy2dUAYBmGQKah8cVWJtQBQBaZghoHh5XYFXe+gsANOu+IaCqerbi7bXsrAwsASsSqgBAFwwnzcrAErAooQoAdKGv4aSxKuyjoAaWgCW9tPYNAABMMeFtvjt9/PH5/DfYgbfffrzz//P3rQJLMqYEALTCwM+6PP7AYoQqANCKz6vqZ/Xiq6fbzpiHxx9YjLf+AgBN2Ly19LNdZ4NPUM7takF581hfDix9tus7ADyUV1QBAKiqqm9/+3/3+eYWloHZWP0FAOJMHe7Z4+9MnfT3giadzX2djz8+3/l4vf3246vH6xiPK8C+vKIKACSaOtwzdcznkB9vrbMlr3PTsR9XgL14RRUAiOMV1Yc/Dl5RBXogVAGALgzD7igaxzK1dMP5+fnOx+vx48dXj9ddj+sWlwNLAAfx1l8AAO5ysce3NbAEHIVQBQBWNQw1DEP9YPOW0YPPlrjGEmdLXueux2sc69HmFemzqnqr7vnz4yHXBbgkVAGAta01DpR+tuR1blrr+wJUlc+oAgArO9Y4UBlTmmVMaa7HGuAuQhUAaM4w1NPa4/OQxpRumzqmtI2BJWBu3voLALRon9GefcaAmMbAEjAroQoALGKOwaAdrkZ/xvH5GFDSSFILY0rbGFgCliRUAYClzDEYNPU6LZ4teZ0pDCwBi/EZVQBgEcccDKrOhpPSxpS2fUZ1ia8JwCWhCgBEM5w0j0PGlLYxsAQck7f+AgDpDCe1wcAScDRCFQA4ujkGg3boYjiphTGlbQwsAXMRqgDAHOYYDDrkOi2eLXmdhzKwBMzCZ1QBgKM75mBQncBwUgtjStsYWALmIlQBgGh3jfQYTnq4Y48pbeNrBzzUS2vfQAvOz893rQ1ePH782GIdABzJngu/hpPyXdSOr+eWiLUEDA/Qa6sI1Wl2/YZpsQ4Ajmvn761egWvPtvC841VWf66Ch+myVYwpAQBHd+iy7SE/Zi9nS17nmA657qHPEaAfQhUAmMOhy7aH/Ji9nC15nWM69hLwPt8f6IQxpQmWGBsAgJ48dNm2TnTht9XV322OvQS868cEnuu1VXxGFQA4uk1MfHbX2Z7DSZN+zJ7O5r7O+fnNqx3Hfdcd7v5j81XEbr7dxTjWo12PD9Avb/0FANayz9CHhd9+7PO1bHoMBng4oQoAHGSmIZ+zqnqrqs7GsYZxrEdJQ0fGlPZz/bqbV0iHuvY13uf733UG9EOoAgCHmmPIJ2nUyJjS4ZZ6PgCdMKY0Qa8fUAaAY3jouM+w++/TrDqx4aSexpS2OWRgaRxr2Ocxg1PTa6sYUwIADjJhPOfWaNI9gzpRo0bGlA53yMDSzf+gcX1k6eaPCfTDW38BgLntO4hjOOn07Ps1N7IEnROqAMBkxx7tuTmqc6rDSb2PKW1z38DS5n9P+v53nQFtEqoAwD4MJy13tuR11mBgCdjJmNIEvX5AGQD29ZDRnjrSUM6pnc19nSXHlLZZanALetdrqxhTAgAme8hw0qE/5qmezX2dJceUttnncdjhK
rQNLEF/vPUXADimfSLVaBL32ec5YmAJOiJUAYCtZhjouTWadOh1ej5b8jopbt7ftpGlfb7/rjMgn1AFAHY59kDPmoNBLZ4teZ0U+zwOU79/+s8Z2MKY0gS9fkAZAO5y7OGk2jJ+89DrnMLZ3NdZe0xpmymPQ+35HDOwRO96bRVjSgDAVsceTlpjMKjls7mvs/aY0jZTHofh7j92G1iCTnjrLwDwUIaTWIOBJTgBQhUAWGQ4ac3BoBbPlrxOuuv3bWAJToNQBQCqlhlOMqa039mS10m31HMRCGFMaYJeP6AMAJeWGE5aYzCo5bO5r5M4prTLUs9FaFGvrWJMCQBYZDjJmNJ+Z3NfJ3FMaZcJz8+7GFiCBnnrLwAwheEkkhlYgs4IVQBgkeEkY0r7P9bGlHZbYmDJYw3rEaoAQNUyYzXGlPY7W/I6LVrrOXuKjzUszpjSBL1+QBkALi0xVmNMKeuxaWlMaZu1nrOHfk3h2HptFWNKAMAiw0nGlPY7m/s6LY0pbXPkgaUXHOHMYBMcyFt/AYCbDCfRgzWfmwab4EBCFQBOzFrDSWsOBrV4tuR1enH957fvwNKc9zLXGfRMqALA6UkaoZnjx+zlbMnr9CLp5+xrBwcwpjRBrx9QBuA0JY3QHPr9ez6b+zqtjyltc+Bz+9hmH2yCqn5bxZgSAJyYtYaTjCntdzb3dVofU9rmwIGlYzvmYJNxJk6Ot/4CwGkznMQpafU5bJyJkyNUAaBjScNJaw4GtXi25HV6dv3nvG1gafPcPtrZEj+P+849R+iBUAWAviUNJxlT2u9syev0bM2v3zF5jnBSjClN0OsHlAHo34HjMouMwSQNGCWdzX2dHseUtln661fzDTbd+vW41M+PbL22ilCdoNcvPgCnZdh/OMnvcR3z55t5DEPTcXc52kRDev21bPUXJhg+GIaq+nFVfTq+77/uAM0ynATzu6h2x49avW865DOqcMPNEYHhg2GosT6ssf65xvpwE61Rgxs9jXq0eJZ2Px6HvLPE+9li9uGkFh6bpLMlr8Nhrj+uxx5sWvPnsu8ZHJNQhduuRgQ2UfphjWfv1lBDjWfvVl3FatLgRk+jHi2epd2PxyHvLPF+bvLY5J0teR0O09PX6f77GYbXf1kf/PSN+rd//GV98NMahteXv0165zOqE/T6vm+22/yXwTfrJ08+r5988GFVvVNVf3ztm3xVVR/VJ++/V588ebMCBjeWOEu7n6SztPvxOOSdpdxPrTyclPzYJJ7NfZ1TGVNawoq/budw978Lavh+Vf16rPrWszo7O6tnz4aqP1TVX9Y4/nbhe6X6bRWhOkGvX3x2u3ol9ZtX/rpe/vr2N/jmlap/f6fqn/62yjtegA4YTjo9/nzThiFonOn79dv6Tf2oHtXvX3hb5rOquqhH9Z16+oZYXV6vv5a99RduuIrUqne2RmpV1ctfV73xUdVf/U1Vzu8fAA9lOAlyRfz6fK2+2BqpVc+D4tV6WlX1a28D5liEKlxzNZz07OzdevHtvreJVaBdqwwn7RpfWeva6WdLXocs179Od40zLTna9N36rzqrZzu/4eb8W1X13QN+6nBFqMKLflxVv6izZ3806Vu//HXVW39f9aefzntXAMe11jjQroGYpPtJOlvyOmRZ8zkCEXxGdYJe3/fNbf//V9GcvTspVn1WFWjTKsNJxpSyHhtjSrnWeI7UPaNNf16/q3+pH9aju9+JfFFVP6xx/N3+P2seqtdWEaoT9PrFZ7sXPqN619t/RSrQKMNJVPnzDS+6b7Tptfqi/rP+bOtnVKueV+5Z1ZdV9b0axy9muUm26vXXsrf+wg3j++NYVe9V1Uf1zSvbv5FIBdoVMcwCxLnz3w1f1uv1o/pNPa3v3Hrp9XL1t57/FTUilaMQqnDDMNRQT8Y365P336uXv/5VPf97U6/7ql7++ld18SdnVcPB4wWtnKXdT9JZ2v14HPLOwu5nteEkY0qZjw2na+po0+XZf9Rfn
L1Wv3/jrOrLseriD3X21Vh1cVb1pb+ahmMTqnDb82GBT568WZevrD47+5+qqs0/P6qq9zb/f8rgRk+jHi2epd2PxyHvLO1+ks7S7ifpbMnrcJr2f948j9HvPaknP3+r/vW/n9STn9fzt/uKVI7KZ1Qn6PV932x3c2zgamCp6hdV9Xc11Hvj++OYNLjRw6hHy2dp9+NxyDtLu5+ks7T7STqb+zrGlJjjOcvyem0VoTpBr198ptsMLP24qj7dfIYVAJrmzzfQh15/Lb+09g1ACzZx+sna9wEAAKfAZ1QBoGNLjPEcOuSTdD9JZ0teByCNUAWAviWNA605GNTi2ZLXAYjiM6oT9Pq+bwD6lzQOZEwp67ExpgR96LVVvKIKAB0bxxrHsT67Hj9JZ2n3k3S25HUA0ghVAAAAoghVADgxpzgY1OLZktcBSCNUAeD0nOJgUItnS14HIIoxpQl6/YAyAKfplAaDWj6b+zrGlKAPvbaKV1QB4MSc4mBQi2dLXgcgjVAFAAAgilAFAAAgilAFALpftm3xbMnrAKQRqgBAVf/Lti2eLXkdgChWfyfodUkLAC71umzb8tnc17H6C33otVW8ogoAdL9s2+LZktcBSCNUAQAAiCJUAYDuB4NaPFvyOgBphCoAUNX/YFCLZ0teByCKMaUJev2AMgBc6nUwqOWzua9jTAn60GureEUVAOh+MKjFsyWvA5BGqAIAABBFqAIAW/U0GNTi2ZLXAUgjVAGAXXoaDGrxbMnrAEQxpjRBrx9QBoC79DAY1PLZ3NcxpgR96LVVvKIKAGzV02BQi2dLXgcgjVAFAAAgilAFACZrdTCoxbMlrwOQRqgCAPtodTCoxbMlrwMQxZjSBL1+QBkA9tXaYFDLZ3Nfx5gS9KHXVvGKKgAwWauDQS2eLXkdgDRCFQAAgChCFQA4SAuDQS2eLXkdgDRCFQA4VAuDQS2eLXkdgCjGlCbo9QPKAHAMyYNBLZ/NfR1jStCHXlvFK6oAwEFaGAxq8WzJ6wCkEaoAAABEEaoAwNGlDQa1eLbkdQDSCFUAYA5pg0Etni15HYAoxpQm6PUDygAwl5TBoJbP5r6OMSXoQ6+tIlQn6PWLDwBrOj8/f1pVr659H9zmzzfQjl5bxVt/AYC1iFQAthKqAMDRGfIB4BBCFQCYgyEfAB5MqAIAc/i8qn62+eddZwBwy0tr3wAA0J/Nyuxn950BwDZeUZ3mYs9zAOB+fh/N5OsCbemyVfz1NAAAAETxiioAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRh
CoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABRhCoAAABR/g/qIHhYi9KlRgAAAABJRU5ErkJggg==\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + " (b) Weighted (2) A* search search: 128.6 path cost, 879 states reached\n" + ] + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAA6oAAAJCCAYAAADJHDpFAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvIxREBQAAF0RJREFUeJzt3T+PHPd9x/HvrATCoSHKqgSkCtw6Aq3egfwEAhcCVoUgN4EVPgqTfBa0jVSBCi6QJukDCnYv0n96VwHUxPIdxAAMdOPilofjcvdul7cz+/nNvl6AQXi0pxnOHe17c3c/1/V9XwAAAJBidugLAAAAgMuEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgA
AAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAAAFGEKgAwKV1XXdfVj7uuupaPJV4PwFiEKgAwNXer6j+Wv7Z8LPF6AEbR9X1/6GsAANib5bOAd6vqWd9X3+qxxOsBGItQBQAmq+vqpKreOfR1NOa07+vOoS8COG5CFQCYrK7zbOCb6HvvTQUOy3tUAYBmGQIahvsKHJpQBQBaZghoGO4rcFBe+gsANOu6IaCqOjvg5bVsVgaWgAMSqgDAJBhOGpSBJWBUQhUAmIRpDSf1VWFvBTWwBIzp7UNfAADAm1p5me9Gjx8vxrmgxn3yyXzjP/PzVoExGVMCAFpm4Gc8BpaA0QhVAKBlz6rq47rmGVX2Yt29dv+BQXjpLwDQrOXLTZ9WVXXeQTm0iwXl5b1+ObD09FAXBEyXZ1QBAKiqqu997/93ebiFZWAwVn8BgGbt8DNTt/q5oEnHhj7P48eLjffrk0/mF/drH/cVYFeeUQUAWrbtmM+2Q0BJx8Y8z6p931eAnXhGFQBolmdUPaMKTJNQBQAmoes2R1Hfl6mlFYvFYuP9ms/nF/frqvu6xsuBJYAb8dJfAACucrrDYw0sAXshVAGAZnVddV1XP16+3HSnx6UfG/M8V92vvq87y2ekZ1X1YV3z/eNNzgvwklAFAFpmTGk/51l1qI8FqCrvUQUAGmZMafgxpXUfu497DXAVoQoATIIxpd1sO6a0joElYGhe+gsAwK4MLAGDEqoAQBP2NQ7UyrExz7MNA0vAmIQqANCKMQZ+ko6NeZ5tGFgCRuM9qgBAE64bBypjSnsbU1r3HtUx7j/AS0IVAJgEY0q7ucmY0jruP7BPXvoLAMA+bBxY6rrqV/5zMuaFAe15+9AXAABA+9b9CJornmW1BAxcyTOqAEATrP4Od543NcbnBDhOQhUAaIXV3+HO86YsAQODMKYEADTB6u9hV3/XsQQMDEWobmGxWJzU+vdSnM7n89fejwEAjM/q7G72vfq7zlWfkzVO173PFbjaVFvFS3+3s+kN/4YAAAA227gEvIbvq+DNTLJVhCoA0ARjSsOdZ58un6Pv687y2exZVX1Y13zvuct9AKZNqAIArTCmNNx59mnfn6ddPh6YCKEKALTiWVV9vPz1qmM3+dikY2OeZ5/2/Xna5eOBiRCqAEAT+r76vq+nl9dg1x27yccmHRvzPPt0w/OeVdVXVXXWddV3XZ2Mdd1AFqEKAMCYDCwB1xKqAEATjCkNd56h3WRgafXjrzoGTIdQBQBaYUxpuPMM7abXkvR7AUYgVAGAVhhTGu48Q7vptST9XoARCFUAoAnGlIY7z9Buci1dV32tGVkysATTJlQBADi0XQaWqowsweQJVQCgCcaUhjvPIVw3sLT871t9/FXHgDYJVQCgFcaUhjvPIRhYAjYSqgBAK4wpDXeeQzCwBGwkVAGAJhhTGu48h7CH6zOwBBMmVAEASLXLyJKBJZgQoQoANMGY0nDnSbF6fetGlnb5+E3HgHxCFQBohTGl4c6TYpf7sO3Hp/+egTWEKgDQCmNKw50nxS73YduPT/89A2sIVQCgCcaUhjtPil3uwwYGlmAihCoAAC0xsARHQKgCAHH2PQ6UNJJkTGl3l6/bwBIcB6EKACTa9zhQ0kiSMaXdjfG5B4J0fe/l+tdZLBYbb9J8Pvc3cQCwZ8tnuu5W1bOX7y287lidvz9
xk9mu/75DHxv6PI8fLzber7Tvb8b43EOrptoqnlEFAOLsexwoaSTJmNLubvh7MbAEDRKqAAC0zsASTIxQBQDiGFMa9zwtMrAE0yZUAYBExpTGPU+LDCzBhBlT2sJU36AMAKmMKb35fZjimNI6Bpbg3FRbxTOqAEAcY0rjnqdFBpZg2oQqAABTZGAJGiZUAYA4xpTGPc9UGFiC6RCqAEAiY0rjnmcqDCzBRBhT2sJU36AMAKmMKb35fTiWMaV1DCxxjKbaKm8f+gIAAFYto+DppmNdVye1w/sKr/v3JR4b+jyLxerZ2rfF181VLiJ2+bjTvq87q/8+YBxe+gsAtGiX8ZtdRnWYNgNL0AihCgDEueGwzcV4Tt+fj+okjSQZUxqXgSVok1AFABId23CSMaXhGFiCBhlT2sJU36AMAKmObTjJmNJwDCwxdVNtFWNKAECcYxtOMqY0HANL0CahClvoHnZdVX1UVV/2970MAWBkhpMY0mlt/zVmYAlG4j2qsGJ1JKF72HXV16Pq67+rr0fLaI0a3JjSqEeLx9Kux33IO5Z2PUnHdn3sGpMYTjrk180xunwfhhpYmtLXiK8lDkGowusuRhKWUfqo+tln1VVX/eyzqotYTRrcmNKoR4vH0q7Hfcg7lnY9Scd2feyqpN9Lq183x2gqX3OHOXfXvffLevizD+r3//nLeviz6rr3CvbMmNIWpvoGZdZb/s3g3frpg2f104ePqurTqvr+pYd8W1Vf1JP79+rJg7sVMLgxxrG060k6lnY97kPesbTrSTq2zWPrCIaTDvF1cyxjSutM5WvuIOeu7kdV9du+6q2zms1mdXbWVX1XVf9Uff/H7T8L7MtUW8V7VGFF31ffPeyeVdWjenH787r1fPUh368Xtz+vd/7n86q+qrrq1vxPwJSPpV1P0rG063Ef8o6lXU/SsauOb5I0fmRMqR17Hlh6Rdqfn30e+1H9sb6pd+tO/bVmVfXW8lacVdVp3fnDu133gVhlX7z0F1ZcvNy36tM1kXru1vOqD76o+ud/raqNf4kFwLAMJzEUX1srflB/qd/VTy4i9bJZVb1TJ1VVvy0vA2ZPhCpccjGcdDb7rF59ue/rxCrA2CY7nLTu2Jjn4dV7s+vA0jF4v76uWZ1tvBHL429V1ftjXRPTdvR/6GDFR1X1i5qd/d1Wj771vOrDf6v6hy+HvSoAqrJGbMY4NuZ5cL8gilCFV31ZVb+ps9n/bfXoF7ervvqXqj9/NOxVAVB1PnLz8fLXYzg25nlwvyCKUIVL+vt9X13dq9nZv9f5uu9mL25X/eHTqv/6VZVXUAEMru+r7/t6ennxdMrHxjwP7td1vq7366xmG+eQl8e/q6qvx7ompk2owor+ft9X1b2q+qJe3F7/IJEKMDbjNhzaUX8NflPv1U/qd3VS774Wq8vV36rzH1Hzl/GvjikSqrCi66qrB/3denL/Xt16/ut6/ZnVb+vW81/X6d/PqrrLox6vjC5M7Vja9SQdS7se9yHvWNr1JB3b4bGTHk4yppTpuoGlhv787OXYn+ofZz+ov34wq/qmrzr9rmbf9lWns6pv3q0TP5qGvRKq8Lrz4YQnD+7Wy2dWX75n9fzXL6rq3vKfpwxuTGnUo8VjadfjPuQdS7uepGNp15N0bMzzsF7S10PGn5/zGP3hg3rw8w/rq/99UA9+XlU/FKnsW9f3R/+S+2stFouNN2k+n/sbyYlZ/i3z3ap61vfVX/zImqpfVNVvqqt7/f2+X33cuo+d0rG060k6lnY97kPesbTrSTqWdj1Jx4Y+z+PHi01vN/T9zVLS10MLf344jKm2ilDdwlQ/+Wyve9h1df6ja75cvocVAJrm+xuYhqn+WX770BcALVjG6ZNDXwcAABwD71EFAOKMMSKUfmzM8wCkEaoAQKKkwZpDHRvzPABRhCoAkOhZVX28/PVYj415HoAoQhUAiNP31fd9Pb28Jnpsx8Y8D0AaoQoAAEAUoQoAxEkaNTK
mBDA+oQoAJEoaNTKmBDAyoQoAJEoaNTKmBDAyoQoAxEkaNTKmBDA+oQoAAEAUoQoAAEAUoQoAxEla37X6CzA+oQoAJEpa37X6CzAyoQoAJEpa37X6CzAyoQoAxEla37X6CzA+oQoAAEAUoQoAxEkaNTKmBDA+oQoAJEoaNTKmBDAyoQoAJEoaNTKmBDAyoQoAxEkaNTKmBDA+oQoAAEAUoQoANCFp6MiYEsCwhCoA0IqkoSNjSgADEqoAQCuSho6MKQEMSKgCAE1IGjoypgQwLKEKAABAFKEKADQhaejImBLAsIQqANCKpKEjY0oAAxKqAEArkoaOjCkBDEioAgBNSBo6MqYEMCyhCgAAQBShCgA0IWnoyJgSwLCEKgDQiqShI2NKAAMSqgBAK5KGjowpAQxIqAIATUgaOjKmBDAsoQoAAEAUoQoANCFp6MiYEsCwhCoA0IqkoSNjSgADEqoAQCuSho6MKQEMqOt776W/zmKx2HiT5vO5l84AwBtYLBYnVfXOoa+D1/n+Btox1VbxjCoAcCgiFYC1hCoAAABRhCoAAABRhCoAAABRhCoAAABRhOp2Tnc8DgBcz/+PZvJ5gbZMslX8eBoAAACieEYVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEI
VAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKEIVAACAKH8DxgVAk+apKx8AAAAASUVORK5CYII=\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + " Greedy best-first search search: 133.9 path cost, 758 states reached\n" + ] + } + ], + "source": [ + "plots(d6)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In the next problem, `d7`, we see a similar story. the optimal path found by A*, and we see that again weighted A* with weight 1.4 does great and with weight 2 ends up erroneously going below the first two barriers, and then makes another mistake by reversing direction back towards the goal and passing above the third barrier. Again, greedy best-first makes bad decisions all around." + ] + }, + { + "cell_type": "code", + "execution_count": 49, + "metadata": { + "scrolled": false + }, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAA6oAAAJCCAYAAADJHDpFAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvIxREBQAAIABJREFUeJzt3cGqJOmVH/AT16Jpy6gkrwxemdkacbs3XnnQvIARs8m7EJrd2L3yI6irX2FW8hgvxqBFFYgBe28k7L1pewzeyI9gSV2oDRJT4UWliuqsvFkRNzMi/t8Xvx8cGk7fqjhxMjKrzr0Rp4ZxHAsAAABS3G1dAAAAALzLoAoAAEAUgyoAAABRDKoAAABEMagCAAAQxaAKAABAFIMqAAAAUQyqAAAARDGoAgAAEMWgCgAAQBSDKgAAAFEMqgAAAEQxqAIAABDFoAoAAEAUgyoAAABRDKoAAABEMagCAAAQxaAKAABAFIMqAAAAUQyqAAAARDGoAgAAEMWgCgAAQBSDKgAAAFEMqgAAAEQxqAIAABDFoAoAAEAUgyoAAABRDKoAAABEMagCAAAQxaAKAABAFIMqAAAAUQyqAAAARDGoAgAAEMWgCgAAQBSDKgAAAFEMqgAAAEQxqAIAABDFoAoAAEAUgyoAAABRDKoAAABEMagCAAAQxaAKAABAFIMqAAAAUQyqAAAARDGoAgAAEMWgCgAAQBSDKgAAAFEMqgAAAEQxqAIAABDFoAoAAEAUgyoAAABRDKoAAABEMagCAAAQxaAKAABAFIMqAAAAUQyqAAAARDGoAgAAEMWgCgAAQBSDKgAAAFEMqgAAAEQxqAIAABDFoAoAAEAUgyoAAABRDKoAAABEMagCAAAQxaAKAABAFIMqAAAAUQyqAAAARDGoAgAAEMWgCgAAQBSDKgAAAFEMqgAAAEQxqAIAABDFoAoAAEAUgyoAAABRDKoAAABEMagCAAAQxaAKAABAFIMqAAAAUQyqAAAARDGoAgAAEMWgCgAAQBSDKgAAAFEMqgAAAEQxqAIAABDFoAoAAEAUgyoAAABRDKoAAABEMagCAAAQxaAKAABAFIMqAA
AAUQyqAAAARDGoAgAAEMWgCgAAQBSDKgAAAFEMqgAAAEQxqAIAABDFoAoAAEAUgyoAAABRDKoAAABEMagCAAAQxaAKAABAFIMqAAAAUQyqAAAARDGoAgAAEMWgCgAAQBSDKgAAAFEMqgAAAEQxqAIAABDFoAoAAEAUgyoAAABRDKoAAABEMagCAAAQxaAKAABAFIMqAAAAUQyqAAAARDGoAgAAEMWgCgAAQBSDKgAAAFEMqgAAAEQxqAIAABDFoAoAAEAUgyoAAABRDKoAAABEMagCAAAQxaAKAABAFIMqAAAAUQyqAAAARDGoAgAAEMWgCgAAQBSDKgAAAFEMqgAAAEQxqAIAABDFoAoAAEAUgyoAAABRDKoAAABEMagCAAAQxaAKAABAFIMqAAAAUQyqAAAARDGoAgAAEMWgCgAAQBSDKgAAAFEMqgAAAEQxqAIAABDFoAoAAEAUgyoAAABRDKoAAABEMagCAAAQxaAKAABAFIMqAAAAUQyqAAAARDGoAgAAEMWgCgAAQBSDKgAAAFEMqgAAAEQxqAIAABDFoAoAAEAUgyrQtWGoYRjqk2GoYakcAAC3ZVAFendfVT8//nepHAAANzSM47h1DQCLOf7k876qvhzHGpfIAQBwWwZVAAAAorj1FwAAgCgGVaBJayxJmrNMKame3vsAAPTPoAq0ao0lSXOWKSXV03sfAIDOeUYVaNIaS5LmLFNKqqf3PgAA/TOoAgAAEMWtvwAAAEQxqAJRkpYDpS0RSs+l1ZOUW/M4ANADgyqQJmk5UNoSofRcWj1JuTWPAwDN84wqEGWNZTy3zqXVow95uTWPAwA9MKgCAAAQxa2/AAAARDGoAqtIWmzT6qKc9FxaPUm5tHqScnO/FoB9MKgCa0labNPqopz0XFo9Sbm0epJyc78WgD0Yx1EIIRaPqnGoGj+pGofecmn16ENeLq2epNzcrxVCCLGPsEwJAACAKG79BQAAIIpBFQAAgCgGVeAqSdtDW9ha2nMurZ6kXFo9Sbm0epY4PwCeYOuHZIUQbcdx2cmvqsZP9ppLq0cf8nJp9STl0upZ4vyEEELMj80LEEK0HRW0PXSrXFo9+pCXS6snKZdWzxLnJ4QQYn7Y+gsAAEAUz6gCAAAQxaAKvGePC08sg9EHvdGbtNxSvydAE7a+91gIkRe1w4Un1+TS6tGHvFxaPUm5tHqSckv9nkII0UJsXoAQIi9qhwtPrsml1aMPebm0epJyafUk5Zb6PYUQooWwTAkAAIAonlEFAAAgikEVdiRtqUcvubR69CEvl1ZPUi6tnqRcWj3XngvALFvfeyyEWC8qbKlHL7m0evQhL5dWT1IurZ6kXFo9156LEELMic0LEEKsFxW21KOXXFo9+pCXS6snKZdWT1IurZ5rz0UIIeaEZUoAAABE8YwqAAAAUQyq0AFLPbbNpdWjD3m5tHqScmn1JOXS6mmhD0BHtr73WAhxfZSlHpvm0urRh7xcWj1JubR6knJp9bTQByFEP7F5AUKI66Ms9dg0l1aPPuTl0upJyqXVk5RLq6eFPggh+gnLlAAAAIjiGVUAAACiGFQhWKvLLPaWS6tHH/JyafUk5dLqScql1bPHPgAb2vreYyHE41GNLrPYWy6tHn3Iy6XVk5RLqycpl1bPHvsghNguNi9ACPF4VKPLLPaWS6tHH/JyafUk5dLqScql1bPHPgghtgvLlACAbg1DfVVV39m6jsa8Gsd6tnURwL4ZVAGAbg1D+YvOE4yj5zSBbVmmBBtIWlLRwjKL9FxaPfqQl0urJym35nGYbo/XiGsJwmx977EQe4wKWlIxNZdWT1IurR59yMul1ZOUW/o4VeMonhS7uUaeUo8QYvnYvAAh9hgVtKRiai6tnqRcWj36kJdLqycpt/RxqsatB75WYzfXyFPqEUIsH55RBQC6MFictCQLloBVGVQBgC4MXS1OGqvCHoscLVgCVvStrQuAnhyXLdxX1Zfj+OYvTL3k0upJyq
XVow95ubR6knK3/D3rghcvXl763xw9PBwe/X89XCNL1gjclq2/cFv3VfXz4397y6XVk5RLq0cf8nJp9STllvo9ub2erhHXHIRz6y/cUNJ3d/f4HW290YfUXFo9Sblb/p5V9boe4Seq01z6iWq9+QFH09eIn6hCOwyqAEBzhpmLkwyq03xgUD1lwRKwGLf+AgAtmjykfvzxH5asoysze2XDMrCcrf99HCFaiKR/v82/Y5eXS6tHH/JyafUk5Z7666vG8ULEnF/ydfPixYvxsUjqddrrMqduIcTTY/MChGghjn/4/Kpq/GSvubR6knJp9ehDXi6tnqTcU3991XhpeIo5v+Tr5gODakyv016XOXULIZ4emxcgRAtRQd+h3SqXVk9SLq0efcjLpdWTlHvqr68aLw1PMeeXfN34ierydQshnh7DOI4FAJBq7uKkcaxhwXK68fLly0f/Eng4HN72cBhmbba1YAm4CcuUAIB0c5b2vFqsiv2a01MLloCbMKjCiWGoYRjqk+O/kyanN3qjD3oT0ptH3FXVp1V19+ZOsXqWdH4tXDfnvPt141jPjj+lftvrqb+2pT5s1WvgEVvfeyxEWlTQIoakXFo9Sbm0evQhL5dWT1JuytdWjZeekYw5l9aum6nLlLZ+TdJel2vPRQgxLTYvQIi0qKBFDEm5tHqScmn16ENeLq2epNyUr60aLw1FMefS2nUzdZnS1q9J2uty7bkIIaaFZUoAQIzB4qTVTF2mdM5gwRKwMM+oAgBJLE5qgwVLwKIMquxW7wsbLLPQG33IyaXVk5S7lD+j28VJW14353zo144rLlhaqw9bvs7AGVvfeyzEVlGdL2y4dS6tnqRcWj36kJdLqycpd5qvGi89+xhTdw/XzTXLlNZ87dJelyVeZyHE+7F5AUJsFdX5woZb59LqScql1aMPebm0epJyp/mq8dKwE1N3D9fNNcuU1nzt0l6XJV5nIcT7YZkSALCJweKkTV2zTOmcYd6CpaksYoKd8owqALAVi5P6ssRrZBET7JRBlV2wsOH6XFo9Sbm0evQhL5dWT1Lugl0tTtryujnnKb/fOHPB0hxJr8vG7wvYj63vPRZijagZCw2mfu3ecmn1JOXS6tGHvFxaPSm5qvHSM40RNfZ83dx6mdK53Ade40mR9rqs9doLsffYvAAh1oiysOHqXFo9Sbm0evQhL5dWT0quarw0oETU2PN1c+tlSudyH3iNJ0Xa67LWay/E3sMypQlevnz52LKHV4fDwQP+AHDBMHNpUlXVaHHS4m69TOmcYZkFS4+xeIld6nVW8YzqNI/94eoBfwD4sLl/Xlqc1I81X0t/L2OvupxVDKrsQtriihZzafUk5dLq0Ye8XFo9W/bh1HiyfGfc6eKkLa+bc251jPHMgqXjazwpd6nGJetOzMHeGFTZi/uq+vnxv5dyc752b7m0epJyafXoQ14urZ4t+3BOUo1JuTWPcyqtD1Ml1b1lH6B5nlGdYI1nOFjW8TuS91X15Ti+eV7mXG7O1+4tl1ZPUi6tHn3Iy6XVs3auql7XI8axhoQaE3NLH+fFi5ePvi4PD4e7hD4M859xjah7iRw8ptdZxaA6Qa8vPkBLhics5CGfpUnbaeHvN08YVKewdImutPBefgq3/gLQCkNqfyxN4kOWuEZ8lkADDKp059pFBdf8+p5zafUk5dLq6b0PNO29pUlVWddxUm7N40yxRR+mLmNq4VyWyEHPDKr06NpFBUmLE5JyafUk5dLq6b0PtMv7Z15uzeNM0UIfpkp6nX0ewhmeUZ2g1/u+e3X8TuN9BS6uaDmXVk9SLq2eXvtQFxby0Iz3Ft1UZV3HSbmlj3NpmdK5v98k96Hmfz50sXQJqvqdVQyqE/T64gOkGixO6pLFSVl6+vvNYOkSO9bTe/ldbv0FIJEhtT8WJ7EkS5egMwZVmrbEUoKkJQlJubR6knJp9fTUh3NevHj5XtQHFq1snUurZ6PcM++febk1j/NUKX
0Yd7p06VIeWmdQpXVbLmzYWy6tnqRcWj099WGqpD64Rubl0upJyq15nKdqtQ9TJV0Pc3oDzfOM6gS93vfdg+N3D++rkcUVLefS6knKpdXTQx/qwmKU409Qv+Hh4RC9GCWtnqRcWj1JuaWPM3eZ0jmt9aE6W7p0Kc9+9DqrGFQn6PXFB0g1XFiMcm5Q9VkM8+3x7zeXPlvOGS0AowG9vpe/tXUBACxn6Gx77scf/2HrEoC2vaoZn4kTB1vbgWEBBlWAvjU9pL7704xL3zEGmGLqQDnzJ69Nf85CKsuUaMatN+U9tiVvjeO0mEurJymXVs9jNbZo6rkk9brVa0Rv8nJrHueWWu3DNb1p4RqB1hhUacmW2/Pk8upJyqXV81iNLZp6Lkm9bvUa0Zu83JrHuaVW+3BNb1q4RqAplilN0OsDyq05flfwvhrZLNhbLq2epFxaPe/mav6GyzRvN25e2lBq62+7ubR6knJLH+cWW3/Paa0PLW8Hnvu19KnXWcWgOkGvLz7Ql6GzxUlV059R9VkM83lPPW6YuR34ChYxcbVe38tu/QXoR1dDar3ZzgmwhbU+f3r73IabMajSjLSlBHvLpdWTlEus59SLFy/fi3rzZ8CnVXU3jjUcf3r55Ny1v/5M7tnU80vqfwvXSFIurZ6k3JrH2UJaH97NjWM9m/JZtWYflugNJDOo0pK0pQR7y6XVk5RLrGeKVvuwxrm02psWc2n1JOXWPM4W0vqwVQ+3vEYglmdUJ+j1vu/WHL8DeF8hSwn2lkurJymXUk9dWP5x/AnqN9x6AdHS52eZUp+5tHqScksfZ6llSlOl9GGpz90Znvz5de250IdeZxWD6gS9vvhAu4aZi5PODaqtfX75LIbb8p663mDpEgF6fS+79RegTZOH1I8//sOSdQDsmaVLsBCDKs24ZlnAnKUCaxynxVxaPUm5rY99xttFHy9evKy/+Zu/PftFrfZhjXNptTct5tLqScqteZwUrV0jU5cu3WIR01q9gRQGVVpy68UHSywl6DmXVk9Sbutjn7rm61rowxrn0mpvWsyl1ZOUW/M4KfZ4jUy1Vm8ggmdUJ+j1vu/WHL/bd18NL65oOZdWT1Jui2PX5QUebxdzrLmAaOlztkypz1xaPUm5pY+z9TKlc/Z0jdT8RUzvfc4tUSPt6XVWMahO0OuLD7RhmLk46Xh7WVX19fnV07lAAu+pbQ3LLWKyeGlnen0vu/UXIN+cJRprLfYA4DpLfV5bvEQXDKo049bLAh5bILDGcVrMpdWTlFvzOGecLut4NvXX9tSHpOvB+0dvWutNC5Jel1vlpi5iSu0XLM2gSkvWWmiwxnFazKXVk5Rb8zin1vi1PfWh92ukxVxaPUm5NY+TLul12fJ6mCqtHpjNM6oT9Hrfd2uO38W7r4YXV7ScS6snKbf0cWri4qStFxAt3QfLlPrMpdWTlFv6OInLlB6T9Lqsnav5S5eqPvBng6VLfel1VjGoTtDriw/kGa5YnHROT59fPZ0LJPCeasNg6RIf0Ot72a2/AFksTgLgXZYusUsGVSKtsQTgscUAWx07PZdWT1Juqd/zjEmLk6b+fi30YatzabU3LebS6knKrXmcXiS9frfKXVq6dO3iJdcSyQyqpFpjCcBjiwG2OnZ6Lq2epNxSv+eppN9vy2tkjXNptTct5tLqScqteZxeJL1+W14jU7mWiOUZ1Ql6ve872fG7c/fV2eKKlnNp9STlbvl71hWLk7ZeQLR0vy1T6jOXVk9SbunjtLRMaaqk12+La6TmL166+Z8DrK/XWcWgOkGvLz6wreHGi5PO6enzq7Nzeey1f3U4HCw3YRU9vad4Y1hu8dIpi5iC9Ppe/tbWBUALhi+Goap+UFW/HD/33R1uxuKk/XrstbfcBLjGq1rnc8RnFYvzjCqRbv1w/5wlAKf54YthqLF+WmP9lxrrp8ehdbMak3ojd5vePOLJi5OmHiOtD0nnslZvWjyXVq+bFnNrHqdnSa
/p0tfIpcVLdcXSpXP2eC2xLoMqqdZYQPDYEoC3+eNQ+tMa735cQw013v246u2wulWNEb2Ru2lvzlmixjWOsVa/lz6XtXrT4rm0et20mFvzOD1Lek3TrpEn+V79un5SX/zw+/U//tNP6osf1jD842t/TzjlGdUJer3vO9nxu3P3teVSgj97/mX92Rc/raofVdU/eqe831XVz+oXn39Wv3i+eo0RvQmoJyn31F9fN16ctPUCoqX73dMypZbPJeX9s4fc0sfpcZnSOUmvaco1UvOXLr31z+vv6r/Vv6zv1m9fva67u7t6/Xqo+vuq+tMax7976u/L0/U6qxhUJ+j1xedxb3+S+vtv/+v66Ov3v+D33676nz+q+s//rsodLyzgKYuTzunp88u5wG25DvdreOLSpT8Oqc/qt9+4LfN1Vb2qZ/Xd+ur7htX19fpedusvnHg7pFb96OyQWlX10ddV3/9Z1b/6N1U2snN7FicBsKTZf858r359dkitejNQfKe+qqr6r24D5lYMqkTaainB28VJr+9+XN+83fd9hlVu56aLk87lzlniGGu8T9c4l7V60+K5tHrdtJhb8zhkvfZLXyNTly69m/u39Vd//t367avHhodj/h9U1T+50GaYzKBKqq2WEvygqv6y7l7/w0lVfvR11af/oeqf/XLSl8Mj1ri21zhuT+eyVm9aPJdWr5sWc2seh6zXPu0auf/b+vO/el13ZgdW4xnVCXq97zvZ8TuA97X2ooI//kR1vPvxpGHVs6rcxmLLcyxTysj1dC5L92brepJySx9nL8uUpkp67VOukXdzP6kvfvi8nv/H4fK/ofqqqv5FjeP/ntN7rtPrrGJQnaDXF5/zvvGM6qXbfw2p3MitFied09Pnl3OB23IdMsubZ0//z+uq7537serrqrqr+k1V/UmN46/XLW7fen0v+/E9nBg/H8eq+qyqfla///b5LzKkcjsWJwGQ783w+aev6tl7/7bNH7f+1pt/osaQyk0YVIm09VKCej7e1y8+/6w++vqv682/m/qu39VHX/91vfqnd1XDpAUEPeTS6knKXfnrb7446bFr+1TKUo/Ec1mrNy2eS6vXTYu5NY/DeUnXQ8T7p8b/9d366vt3Vb8Zq179fd39bqx6dVf1G/80DbdmUCXV9ksJfvH8vv74k9XXd/+vqur4359V1WfH/5+yTKGrhQ0N5tLqeazGU632YY1zWas3LZ5Lq9dNi7k1j8N5SddDxvvnzTD6J8/r+V98Wv/9/z6v539Rb273NaRyU55RnaDX+76THb+Ld18BSwneLliq+suq+vc11Gfj5+O4VY1JvZHL741lShm5ns5lT++frXNLH8cypQ9Luh5aeP+wjV5nFYPqBL2++Ex3XLD0g6r65fEZVmhCT59fzgVuy3UIfej1vfytrQuAFhyH019sXQcAAOyBZ1SJtOpigC2XEjSUS6snKZdWz2M1nmq1D2ucy1q9afFcWr1uWsyteRyANAZVUu1nKUE7ubR6knJp9TxW46lW+7DGuazVmxbPpdXrpsXcmscBiOIZ1Ql6ve872fE7vfe186UESbm0epJyafW8m7NMKSPX07ns6f2zdW7p41imBH3odVYxqE7Q64sP9K+nzy/nArflOoQ+9PpedusvAAAAUQyqNKOnxRUt5tLqScql1fNYjada7cMa57JWb1o8l1avmxZzax4HII1BlZb0tLiixVxaPUm5tHoeq/FUq31Y41zW6k2L59LqddNibs3jAETxjOoEvd733Zrjd3/vq+HFFS3n0upJyqXV827OMqWMXE/nsqf3z9a5pY9jmRL0oddZxaA6Qa8vPtC/nj6/nAvclusQ+tDre9mtvwAAAEQxqNKMnhZXtJhLqycpl1bPYzWearUPa5zLWr1p8VxavW5azK15HIA0BlVa0tPiihZzafUk5dLqeazGU632YY1zWas3LZ5Lq9dNi7k1jwMQxTOqE/R633drjt/9va+GF1e0nEurJymXVs
+7OcuUMnI9ncue3j9b55Y+jmVK0IdeZxWD6gS9vvhA/3r6/HIucFuuQ+hDr+9lt/4C9O3VzDwAwOYMqjSjp8UVLebS6knKpdXzbu5wODw7HA7Dw8Ph7uHh8OnDw+HucDgMh8PhWat9OCeh10/pTYvn0up102JuzeMApDGo0pKeFle0mEurJymXVk/vfTgnqQ9zetPiubR63bSYW/M4AFE8ozpBr/d9t+b43d/7anhxRcu5tHqScmn19NqHlhcQ9XQurV03LeeWPo5lStCHXmcVg+oEvb74AC3p6bO4p3OhXa5D6EOv72W3/gIAABDFoEozelpc0WIurZ6kXFo9vffhnKQ+zOlNi+fS6nXTYm7N4wCkMajSkp4WV7SYS6snKZdWT+99OCepD3N60+K5tHrdtJhb8zgAUTyjOkGv93235vjd3/tqeHFFy7m0epJyafX02oeWFxD1dC6tXTct55Y+jmVK0IdeZxWD6gS9vvgALenps7inc6FdrkPoQ6/vZbf+AgAAEMWgCgAAQBSDKk1rdcNii7m0epJyafX03odzkvowpzdTJZ1Lq9dNi7k1jwOQxqBK61rdsNhiLq2epFxaPb334ZykPszpzVRJ59LqddNibs3jAESxTGmCXh9Q7sHxO8L31ciGxZZzafUk5dLq6bUPLW/KnXMu5/5cSTqX1q6blnNLH8fWX+hDr7OKQXWCXl98gJb09Fnc07nQLtch9KHX97JbfwEAAIhiUKU7LSyuaDGXVk9SLq2e3vtwTlIf5vTm1pLOOe26aTG35nEA0hhU6VELiytazKXVk5RLq6f3PpyT1Ic5vbm1pHNOu25azK15HIAonlGdoNf7vnt1/C7xfQUurmg5l1ZPUi6tnl77sOdlSlMlnXPKddNybunjWKYEfeh1VjGoTtDriw/Qkp4+i3s6F9rlOoQ+9PpedusvAAAAUQyq7ELa4ooWc2n1JOXS6um9D+ck9WFOb7bSQm/k1j0OQBqDKnuRtriixVxaPUm5tHp678M5SX2Y05uttNAbuXWPAxDFM6oT9Hrf954cv3N8X5Z6xC71aDmXVk+vfbBM6XaSe7N1PUm5pY+z9XUI3Eavs4pBdYJeX3yAlvT0WdzTuVzj5cuXX1XVd7aug/ft6TqE1vX6Z4pbfwFoxauZefIZUgE4y6DKLqQtrmgxl1ZPUi6tnl77cDgcnh0Oh+Hh4XD38HD49OHhcHc4HIbD4fAsqQ9zerOGpD5s1QMA2mNQZS/SFle0mEurJymXVo8+5OUu5ZeW1IetegBAYzyjOkGv933vyfG79/dlqUfsUo+Wc2n16ENe7jS/5hKbpD6c5i79+cq2/P0G2tHrrGJQnaDXFx+Abfhz5Q2Daq49XYfQul7/THHr7zQWeADA7flzNJPXBdrS56wyjqMQu4yqcagaP6kahw/l5fLqScql1aMPebnT/IsXL8bH4prPsBZzafUk5RLrEUKItcJPVNkzy2Dm5dLqScql1aMPeblL+adKOj/XzTK5xHoA1rH1pCzEVtHCd6+Tcmn1JOXS6tGHvNxp3k9UXTct9kYIIdaMYRzH6VMtAHC1XhdfAMCtuPUXAACAKAZVODEMNQxDfXL89/7k9EZv9GGx3kyVdC6um/32BmBNBlV4X9LiiqRcWj1JubR69CEvdyk/RdK5uG7WyyXWA7COrR+SFSItkhZXJOXS6knKpdWjD3m50/zcZUpJ5+K62W9vhBBizbBMCQBWZpkSAFzm1l8AAACiGFRhgqRlFpZ65OXS6tGHvNyl/Kmkul03evOh6xVgKQZVmCZpmYWlHnm5tHr0IS93KX8qqW7Xzba5xHoA1rH1Q7JCtBBJyyws9cjLpdWjD3m50/ylZUpJdbtu9Oa0HiGEWCssUwKAlVmmBACXufUXAACAKAZVuKGkpRe9L/VIyqXVow95uUv5U0l1u2705kPXK8BSDKpwW0lLL3pf6pGUS6tHH/Jyl/Knkup23WybS6wHYB1bPyQrRE+RtP
Si96UeSbm0evTlRM9RAAAIBUlEQVQhL3eat0zJddNib4QQYs2wTAkAVmaZEgBc5tZfAAAAohhUYQNJyzFaXeqRlEurRx/ycpfyp5Lqdt3ozYeuV4ClGFRhG0nLMVpd6pGUS6tHH/Jyl/Knkup23WybS6wHYB1bPyQrxB4jaTlGq0s9knJp9ehDXu40b5mS66bF3gghxJphmRIArMwyJQC4zK2/AAAARDGoQrC0JRpJ9STl0urRh7zcpfyppLpdN/vtDcDWDKqQLW2JRlI9Sbm0evQhL3cpfyqpbtfNtrmtjw2wna0fkhVCPB5pSzSS6knKpdWjD3m507xlSq6b9N4IIcTWYZkSAKzMMiUAuMytvwAAAEQxqEIHel/qkZ5Lq0cf8nKX8qeS6nbd7Lc3AFszqEIfel/qkZ5Lq0cf8nKX8qeS6nbdbJvb+tgA29n6IVkhxPXR+1KP9FxaPfqQlzvNW6bkuknvjRBCbB2WKQHAyixTAoDL3PoLAOt7NTMPALtiUIUdaXWpR3ourR59yMud5g+Hw7PD4TA8PBzuHh4Onz48HO4Oh8NwOByeJdXtutlHbwAibX3vsRBivTg+g/SrqvGTublrf33PubR69CEvl1ZPUi6tnqTcmscRQoi02LwAIcR6UY0u9UjPpdWjD3m5tHqScmn1JOXWPI4QQqSFZUoAAABE8YwqAAAAUQyqwHssPJmXS6tHH/JyafUk5dLqScot9XsCNGHre4+FEHlRFp7MyqXVow95ubR6knJp9STllvo9hRCihdi8ACFEXpSFJ7NyafXoQ14urZ6kXFo9Sbmlfk8hhGghLFMCAAAgimdUAQAAiGJQBa6StHikp4UnLebS6knKpdWTlEurZ4nzA+AJtr73WAjRdlTQ4pGtcmn16ENeLq2epFxaPUucnxBCiPmxeQFCiLajghaPbJVLq0cf8nJp9STl0upZ4vyEEELMD8uUAAAAiOIZVQAAAKIYVAEAAIhiUAVWkbSF09ZSfdCbnNzcrwVgHwyqwFruq+rnx//2lkurRx/ycmn1JOXmfi0Ae7D1NichxD4iaQunraX6oDc5ublfK4QQYh9h6y8AAABR3PoLAABAFIMqECVpyYtFOfrQWm8AoBcGVSBN0pIXi3Lm5dLqScqteRwAaJ5nVIEox58O3VfVl+NYYwu5tHr0IS+35nEAoAcGVQAAAKK49RcAAIAoBlWgSXtclJOeW/M4AEDfDKpAq/a4KCc9t+ZxAICOeUYVaNIeF+Wk59Y8DgDQN4MqAAAAUdz6CwAAQBSDKtA1y30AANpjUAV6Z7kPAEBjPKMKdM1yHwCA9hhUAQAAiOLWXwAAAKIYVAEAAIhiUAUAACCKQRUAAIAoBlUAAACiGFQBAACIYlAFAAAgikEVAACAKAZVAAAAohhUAQAAiGJQBQAAIIpBFQAAgCgGVQAAAKIYVAEAAIhiUAUAACCKQRUAAIAoBlUAAACiGFQBAACIYlAFAAAgikEVAACAKAZVAAAAohhUAQAAiGJQBQAAIIpBFQAAgCgGVQAAAKIYVAEAAIhiUAUAACCKQRUAAIAoBlUAAACiGFQBAACIYlAFAAAgikEVAACAKAZVAAAAohhUAQAAiGJQBQAAIIpBFQAAgCgGVQAAAKIYVAEAAIhiUAUAACCKQRUAAIAoBlUAAACiGFQBAACIYlAFAAAgikEVAACAKAZVAAAAohhUAQAAiGJQBQAAIIpBFQAAgCgGVQAAAKIYVAEAAIhiUAUAACCKQRUAAIAoBlUAAACiGFQBAACIYlAFAAAgikEVAACAKAZVAAAAohhUAQAAiGJQBQAAIIpBFQAAgCgGVQAAAKIYVAEAAIhiUAUAACCKQRUAAIAoBlUAAACiGFQBAACIYlAFAAAgikEVAACAKAZVAAAAohhUAQAAiGJQBQAAIIpBFQAAgCgGVQAAAKIYVAEAAIhiUAUAACCKQRUAAIAoBlUAAACiGFQBAACIYlAFAAAgikEVAACAKA
ZVAAAAohhUAQAAiGJQBQAAIIpBFQAAgCgGVQAAAKIYVAEAAIhiUAUAACCKQRUAAIAoBlUAAACiGFQBAACIYlAFAAAgikEVAACAKAZVAAAAohhUAQAAiGJQBQAAIIpBFQAAgCgGVQAAAKIYVAEAAIhiUAUAACCKQRUAAIAoBlUAAACiGFQBAACIYlAFAAAgikEVAACAKAZVAAAAohhUAQAAiGJQBQAAIIpBFQAAgCgGVQAAAKIYVAEAAIhiUAUAACCKQRUAAIAoBlUAAACiGFQBAACIYlAFAAAgikEVAACAKAZVAAAAohhUAQAAiGJQBQAAIIpBFQAAgCgGVQAAAKIYVAEAAIhiUAUAACCKQRUAAIAoBlUAAACiGFQBAACIYlAFAAAgikEVAACAKAZVAAAAohhUAQAAiGJQBQAAIIpBFQAAgCgGVQAAAKIYVAEAAIhiUAUAACCKQRUAAIAoBlUAAACiGFQBAACIYlAFAAAgikEVAACAKAZVAAAAohhUAQAAiGJQBQAAIIpBFQAAgCgGVQAAAKIYVAEAAIhiUAUAACCKQRUAAIAoBlUAAACiGFQBAACIYlAFAAAgikEVAACAKAZVAAAAohhUAQAAiGJQBQAAIIpBFQAAgCgGVQAAAKIYVAEAAIhiUAUAACCKQRUAAIAoBlUAAACiGFQBAACIYlAFAAAgikEVAACAKAZVAAAAohhUAQAAiGJQBQAAIIpBFQAAgCgGVQAAAKIYVAEAAIhiUAUAACDK/wcquXLwuqOEZgAAAABJRU5ErkJggg==\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + " A* search search: 127.4 path cost, 4,058 states reached\n" + ] + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAA6oAAAJCCAYAAADJHDpFAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvIxREBQAAHjZJREFUeJzt3UGPJdlZJuAvklar8cgFXllihdjMgrHavWHFCP4AYmEpa2GZHUP/Crr7X3gYzcJCXlRKCGlmP2oLVrOhegCJv+ANmC65kWy5YhaVlc7Oupl5b0bEifec+zwSavE5s86JuHFv1ls37pvTPM8FAAAAKS723gAAAADcJqgCAAAQRVAFAAAgiqAKAABAFEEVAACAKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAACIIqgCAAAQRVAFAAAgiqAKAABAFEEVAACAKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAACIIqgCAAAQRVAFAAAgiqAKAABAFEEVAACAKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAACIIqgCAAAQRVAFAAAgiqAKAABAFEEVAACAKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAACIIqgCAAAQRVAFAAAgiqAKAABAFEEVAACAKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAACIIqgCAAAQRVAFAAAgiqAKAABAFEEVAACAKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAACIIqgCAAAQRVAFAAAgiqAKAABAFEEVAACAKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAA
CIIqgCAAAQRVAFAAAgiqAKAABAFEEVAACAKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAACIIqgCAAAQRVAFAIYyTTVNU313mmrqeZa4H4BWBFUAYDQfVtXfXP+351nifgCamOZ53nsPAACruX4X8MOq+mKea+51lrgfgFYEVQBgWNNUX1bVN/feR2dezXM923sTwHkTVAGAYU2TdwOfYp59NhXYl8+oAgBdUATUjvMK7E1QBQB6oQioHecV2JVbfwGALjylCKiqXu+03d5dlIIlYEeCKgAwBMVJm1KwBDQlqAIAQxirOGmuCvsoqIIloKX39t4AAHDe1vp9nw+t8eLF1XYHMJDnzy/v/d/8vlWgJWVKAMDelpQkKfhpx/kHmhFUAYC9fVFV36uvvyu6ZMY2nH+gGbf+AgC7ur5l9OUpM8VJu7hpUJ7efFr1bcHSy/u+AeCpvKMKAPTo6JD6wQe/3HIfQznxXPmHAmAzWn8BgEWWlCE99fvr4d+PetTvAE2fbb3OixdX957D588vb87hGuca4FTeUQUAllpafLR2cdLa5Ux7zVquc9fa5xrgJN5RBQAW8Y6qd1SPORaAUwiqAEC0U4uT5rmmDbczjKurq3v/Enh5eXlzDqfppLD5tmAJYBG3/gIA6U4p7Xm12S7O1ynnVMESsApBFQA4aJpqmqb67vWtnJvMTv3aAy6q6qOqupjnmua5nrXYd9q5WbrOXbe/bp7r2fW71Dfn+tjvPXVdgLcEVQDgPmmFQXvtUZlSu+8FqCqfUQUA7pFSGFRnUJyUXKa01WMC8BBBFQCIoTipnWPLlA5RsARsza2/AEASxUl9ULAEbEpQBQDiCoPuMWxxUlqZ0iEKloCWBFUAoCqvMChpj2nnpkWZ0iEKloBmfEYVAIgpDKozLU5KK1M69BnVFqVXAG8JqgDALhQn7WtJmdIhCpaANbn1FwDYi+KksShYAlYjqALAwJLKgU4szzmr4qQeypQOUbAEbEVQBYCxJZUDnVKek7RHZUr3U7AEbMJnVAFgYEnlQHdnj3ym8ayKk3ooUzpEwRKwFUH1CFdXV/eVPby6vLxUBAAADzi1NKlKcVILa5cpHaJgCbY3alZx6+9x7vvhqggAAB536s9LxUnjULAE2xsyqwiqAMRKKrbpoRQnfXbI3fKdcy1O2vO6WdvtNdYoWGq1byCLoApAsqRimx5KcdJn90naY9Ks5TprWrtg6ZTvBwbhM6pHaPEZDgDeNQUV2yyZpe2n9aweKNSZ55oS9pg423qdNcqUDllyPdSBgqX7/kzgjVGzynt7bwAA7nP9F9KXVYcLeaYDP37TZ2n7aXXMD7n9OJt93ZbrXF3dXW0dj637yDVyE2Kvv+7V9e3DB88PMC63/gLQi65LIThIadJ5UrAEPEpQBSDCXsUvNPVOaVJVVoFR0qzlOlu7ve6pBUt3v/+hGTAOQRWAFHsVv9COoqnTZi3X2drSvSQdC9CAMqUjjPoBZYAkCwtY6MPRRTlm25+brcqUDlny/D6lcAvO0ahZRZkSABEeK06if3sUBvU823qdrcqUDllSsDRNXw+ht0uW7v6ZwDjc+gtAIiF1PIqTeMip14fXCBicoAowmJGKXw558eLqnf+rd0t6pqRZ2n52mj3bszCox1nLdfbwWMHS9f9/1PefOgPyCaoA4xmp+OVYSQU4CoNOm6XtJ2nWcp09LN1fj8cMHEmZ0hFG/YAyMKYRil/qgWKV63dQv+b588t3Snr2Og9bn5vRZmn7SZptvU7LMqVDjtnzdOfzqXc8+XkPIxk1qwiqRxj1wQfGNw1YSnQoqHothtP18PebR4LqMd6WLsGwenguP4VbfwHGNlRI/eCDX+69BaCtpS
VcQ70GwjkRVAE6dgbFITfFKi9eXNWPfvS3e+9nF0kFP70WBvU4a7lOirv7O1SytPTPXDoD2hBUAfo2enHISMeyRFLBT6+FQT3OWq6T4pTzsOTPHOV8wbB8RvUIo973DfTvseKQeqCUqBM3ZSl7F7/sactCnb1naftJmm29TuJz6pjzUKe/rq1atgZpRs0qguoRRn3wgfFNy4tIdnX79yh6LYZ19fqcOvV17bHfxwq96/W5/Jj39t4AAOuYxmv4XVqiAozpVZ3wWrfgH+w0BsOOBFWAcdz7Fze/0gUYxbHhcYU7Skb6hz/ojjIlgE6s3UipoTRTUutsq8czaT9Js5brpNvr+M7xXEMKQRWgH2s3UmoozZTUOjtSs22Ps5brpNvr+M7xXEMEZUpHGPUDykBfpkcaKeuBJsxDt/4+f365ehPmku9/bJbYULqFLc9h4ixtP0mzrdfp6Tn1lOOrdVrPNQYTb9SsIqgeYdQHH+jXdGJx0gifUfVaDOsa/Tk17dt6roiJZkZ9Lrv1F6BPR4fUDz745Zb7AEi1Z3O4IiZYSFAFCLSwmOOiqj6qqosXL67qRz/62033+hTKSN5IKu7Za5a2n6RZy3VGcfv45rmeXf8O1ZvXxHmu6ZjZ2ntZYwbnRlAFyLSkmKOHAg9lJG8kFfeMXhjU46zlOqNIOjdJe4Hu+IzqEUa97xvINS0rCbkp/0gtS3ns+HovfjnWU87DaLO0/STNtl7Hc+rJr7HHUsREE6NmFe+oAgSa55rnuV7Oc83TVF9Ob0pBXlfVP9Qjf4G6/b1NNvsEh/Z47GwkS87DKLO0/STNWq4zirBzc/OaPU01330df2D25eiPExxDUAXId0opx57lIQCjUMQEOxNUAXa2sEjjbiHIs7QSDoUibySV9CTN0vaTNGu5Dn0WMZ3ytR57eiOoAuxvSZFGDyUcCkXeSCrpSZql7Sdp1nIdss5rq2sEYilTOsKoH1AGMkzLSj0eLOtIKEt57PiOnSUcyxJrnYfRZmn7SZptvU7vz6m1rXVeq2ER09J9M4ZRs4qgeoRRH3wgzzTVl3XC55Oubzm710ivXyMdCyTwnNrGNEWGwFfzXM/23gTbGPW57NZfgCyKkwD6lvjarKCJ7giqAA0tLLg4qjhpz8IMxS8Kg06dpe0nadZyHZa5fV73LmI6Zo9rzGBrgipAW0sKLnoozFD8ojDo1FnafpJmLddhmR4ekx72CDd8RvUIo973DbQ3LSvhOKpc4/asdVnKY/s5h+KXQ8dx39zMuVGmNIa1HpNap4jpPif/DFHE1IdRs4qgeoRRH3xgX9PKxUmHjPT6Ndix3PfYv7q8vFR4QhMjPadGMSli4glGfS6/t/cGoAfTZ9NUVX9UVT+ZP/GvO6xGcdL5uu+xV3gC5+1V5b0OpO2HM+EzqnDH3cKA6bNpqrl+WHP9n5rrh9ehNapwY6RSjx5na3z/AU8uTjphjSfb8xpJsvQaWfJnjjJL20/SrOU67Of2Y3JqEVOrgibXEnsQVOFdN4UB16H0hzVf/KCmmmq++EHVTVhNKtwYqdSjx9ka339XeunFntdIkqXXyJI/c5RZ2n6SZi3XYT+trpH19jhN3/rL+uxPv1P/73/9ZX32pzVN31ppHbjhM6pHGPW+bw67/pfBD+uPP/2i/vizH1bV96vqP936kp9X1Y/r808+rs8//bACCjdazNL2kzR76vfXysVJLctS9rhGEotfnnqNPHQsz59frlp4kj5L20/SbOt1Ep9T52jra6TWKWj69etSTb9fVX83V/3G67q4uKjXr6eqX1XVf615/qcV1uJEo2YVQfUIoz743O/mndRffOO/1ftfvfsFv/hG1T9+v+p///cqd7ywgacUJx0y0uuXY4F1uQ7Pw7RiQdPv1z/V39cf1rP696/dlvm6ql7Vs/qt+vI7wmp7oz6X3foLd9yE1KrvHwypVVXvf1X1nR9X/clfVEUW9NE5xUkArGWVnym/Xf92MKRWvQ
kU36wvq6r+zm3ArEVQhVtuipNeX/ygvn6777uEVdazanHS2gUXil/u38vSc7P2Oj3O0vaTNGu5DmO5/TifWtBU9xQxfbt+Whf1+t7wcD3/jar69iYHxdkRVOHr/qiq/rwuXv/mUV/9/ldVH/3Pqt/9yba7YnTpZSmKX97YovBk7XV6nKXtJ2nWch3G4hqhez6jeoRR7/vmXb/+VTQXPzgqrPqsKuvYrDxnjbKUY9ddc9+JxS9LjuPuXJmSMqWEc7P3c4rtrHWN1K0ipv9c/1L/t/6gnj18J/GrqvqDmud/Wf+ouM+oWcU7qnDL/Mk811Qf18Xrv6437b73E1JZyTzXPM/18vZfTteetdhfD8eyxNI9H3ssLc5h0ixtP0mzluswli2ukZ/Wt+t1XdxbIXw9/1VV/XTNY+F8Capwx/zJPFfVx1X14/rFNw5/kZDKehQnAZDq5mfUz+pb9Yf19/Vl/dY7YfVt62+9+RU1/9Zyg4xLUIU7pqmm+nT+sD7/5ON6/6u/qnffWf15vf/VX9Wr37momk4qJeh5lrafpNnC71+9OGlJWYril+32fOyxtDiHSbO0/STNWq4Db92+RuY7RUz/XP/l4rfr379zUfWzuerVr+ri53PVq4uqn/nVNKxNUIV3vSkR+PzTD+vtO6uvL/6jqur6vz+uqo+v//eUwo2RSj16nKXtZ0kRRg/noYVW10iLtdNnaftJmrVcB956+Lp5E0Z/79P69M8+qn/410/r0z+rqt8TUlmbMqUjjPoBZQ6b7hQL3BQsVf15Vf2Pmurj+ZN5vvt1h753pFnafpJmafu5PTu1LCX5PLQsftn6GlGm1MfzZ+/Z1usoU+KQpdcs7Y2aVQTVI4z64HO86bNpqje/uuYn159hhS6M9PrlWGBdrkMYw6jP5ff23gD04Dqcfr73PgAA4Bz4jCpAQ3uWsmy9lz2PZYlWe15yLHtdN3ueG7O26wCkEVQB2tqzlGXrvex5LEu02vOSY9nrutnz3Ji1XQcgis+oHmHU+76B9qbGpSwtS3v2PJYlr8UtHpOlx9L6uhmpMKjn2dbrKFOCMYyaVbyjCtDQPNc8z/Xy9l9EW8xa7GXPY1mi1Z6XHMte182e58as7ToAaQRVAAAAogiqADvbqyzlHItf0kpxejyWXq+bHmct1wFII6gC7G+vspRzLH5JK8VZIqn0p4frpsdZy3UAoihTOsKoH1AGMkw7laWcY5nSlvs75dz0fCxbn5u995M023odZUowhlGzindUAXa2V1nKORa/pJXi9HgsvV43Pc5argOQRlAFAAAgiqAKEKhFWcroxS89lOKsLemY066bHmct1wFII6gCZGpRljJ68UsPpThrSzrmtOumx1nLdQCiKFM6wqgfUAZyTQ3KUkYvU9pyL0vPzVYlNknHnHLd9Dzbeh1lSjCGUbOKd1QBArUoSxm9+KWHUpy1JR1z2nXT46zlOgBpBFWAsb06cQ4AsDtBFaATTylLuby8fHZ5eTk9f3558fz55UfPn19eXF5eTpeXl89GKn5JKsA55dy0kHQe9rxuepy1XAcgjaAK0I+kkpe04pek83DKuWkh6Tzsed30OGu5DkAUZUpHGPUDykBfpqCSl0OzrddpWQy19blpWWKTdB72uG56nm29jjIlGMOoWUVQPcKoDz5AT0Z6LR7pWOiX6xDGMOpz2a2/AAAARBFUATqm+CXvPJxybvbSw7kxa7sOQBpBFaBvil/arbvFudlLD+fGrO06AFF8RvUIo973DfRvUvxSVcqUTpV8bvbeT9Js63X2vg6BdYyaVQTVI4z64AP0ZKTX4pGOhX65DmEMoz6X3foLAABAFEEV4AyMXvySVIBzyrlJN9J10+Os5ToAaQRVgPMwevFLUgHOKecm3UjXTY+zlusARPEZ1SOMet83cD6mwYtflCltY4TrpufZ1uv0ch0CDxs1q3hHFeAMzHPN81wvb/8FeO1Zy3X2OL4tzk26ka6bHmct1wFII6
gCAAAQRVAFAAAgiqAKcKZGaihNamo95dz0qNfrpsdZy3UA0giqAOdrpIbSpKbWkVp/D+n1uulx1nIdgChaf48wapMWcN7WbhPd4s+8PdP6m6G366bn2dbr9HwdAr82albxjirAmRqpoTSpqfWUc9OjXq+bHmct1wFII6gCAAAQRVAF4EavxS9JBTinnJuRpV03Pc5argOQRlAF4LZei1+SCnBGL1M6Vtp10+Os5ToAUZQpHWHUDygD3NVr8YsypTwp103Ps63XOYfrEM7BqFlFUD3CqA8+QE9Gei0e6ViWuLq6+rKqvrn3PnjXOV2H0LtRf6a49ReAXrw6cU4+IRWAg97bewMA9GefWzgvn22/Rqtbf5ecfQAYn3dUAXgKpTinzR6aAwB3CKoAPMUXVfW96/8+Njd7eA4A3CGoAnCyea55nuvl7Vtb75ubPTwHAN4lqB5HgQcArM/P0UweF+jLkFnFr6cBYDVJBUZJs7tzv78SAB7mHVUA1pRUYJQ0e2gOANzhHVUAVpP0LmbS7O7cO6oA8DBBFQAau7q6uveHr6AKAG79BQAAIIygCkBz01TTNNV3r2+HHX720BwAeJegCsAekoqOlCkBQBifUQWguaSiI2VKAJBHUAWAxpQpAcDD3PoLAABAFEEVgAhJ5UfKlABgX4IqACmSyo+UKQHAjnxGFYAISeVHypQAYF+CKgA0pkwJAB7m1l8AAACiCKoAdCWpJEmZEgBsQ1AFoDdJJUnKlABgAz6jCkBXkkqSlCkBwDYEVQBoTJkSADzMrb8AAABEEVQBGJIyJQDol6AKwKiUKQFAp3xGFYAhKVMCgH55RxWAIc1zzfNcL2+Hxb1mD80BgHcJqgAAAEQRVAE4a8qUACCPoArAuVOmBABhBFUAzt0XVfW96/9uNXtoDgDcIagCcNaUKQFAHkEVAACAKIIqABxBmRIAtCOoAsBxlCkBQCOCKgAcR5kSADQiqALAEZQpAUA7gioAtPfqxDkAnJVpnv3DLgCs5bos6cOq+uL2u6eH5vd9LQCcO++oAsC6TilTUrAEAAd4RxUAVuQdVQBYTlAFAAAgilt/AQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAACIIqgCAAAQRVAFAAAgiqAKAABAFEEVAACAKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAACIIqgCAAAQRVAFAAAgiqAKAABAFEEVAACAKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAACIIqgCAAAQRVAFAAAgiqAKAABAFEEVAACAKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAACIIqgCAAAQRVAFAAAgiqAKAABAFEEVAACAKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAA
CIIqgCAAAQRVAFAAAgiqAKAABAFEEVAACAKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAACIIqgCAAAQRVAFAAAgiqAKAABAFEEVAACAKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAACIIqgCAAAQRVAFAAAgiqAKAABAFEEVAACAKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAACIIqgCAAAQRVAFAAAgiqAKAABAFEEVAACAKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACi/H9tufMPU4+G/wAAAABJRU5ErkJggg==\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + " (b) Weighted (1.4) A* search search: 127.4 path cost, 1,289 states reached\n" + ] + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAA6oAAAJCCAYAAADJHDpFAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvIxREBQAAHp1JREFUeJzt3b+PZNlZBuDvtn+wLNoxjpwQIEQG1thCIgIZ/gCE0Eo9gWUcIMz+FewuOfkCIkBogy4JIUFCBFgm9wwQEhAhOTLMCGMNcl+Crhl6u6uqq7rqnvueW88jjcb+tqvPubduV8/bVfX2MI5jAQAAQIqLuTcAAAAAtwmqAAAARBFUAQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAACIIqgCAAAQRVAFAAAgiqAKAABAFEEVAACAKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAACIIqgCAAAQRVAFAAAgiqAKAABAFEEVAACAKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAACIIqgCAAAQRVAFAAAgiqAKAABAFEEVAACAKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAACIIqgCAAAQRVAFAAAgiqAKAABAFEEVAACAKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAACIIqgCAAAQRVAFAAAgiqAKAABAFEEVAACAKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAACIIqgCAAAQRVAFAAAgiqAKAABAFEEVAACAKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAACIIqgCAAAQRVAFAAAgiqAKAABAFEEVAAC
AKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAACIIqgCAAAQRVAFAAAgiqAKAABAFEEVAACAKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQAWZRhqGIb62jDU0PMscT8ArQiqAMDSPK2qv1r/3fMscT8ATQzjOM69BwCAk1k/C/i0ql6MY429zhL3A9CKoAoALNYw1Muqem/ufXTm1TjWk7k3AZw3QRUAWKxh8GzgY4yj96YC8/IeVQCgW4qApuG8AnMTVAGAnikCmobzCszKS38BgG49VARUVdczbq9nF6VgCZiRoAoALILipEkpWAKaElQBgEVYVnHSWBX2VlAFS0BLn597AwAA+9jjZb5bXV2tpt/gAjx7drn1v/l9q0BLypQAgF4o+JmX8w80I6gCAL14UVXv12efPd00YxrOP9CMl/4CAF1Yv7T0+bbZ4B2Uk3ro/AOckmdUAYCF89bJfb3zzv9u/W/DUOOdPy8bbg04M1p/AYBF2NX6q7H2vtVqtfV8XV5evj1fziswB8+oAgBxhqGGYaivrVtlD5qd+vPNNWu5zj7na9+P2/e2ALsIqgBAon0bZvdtnT3m8801a7nOXac+rwAH8dJfACDOvr+z887vUb3e8SkvDv18c8+mXufqarX1fD17dvn2fJ3ivAIcSlAFALqzLvJ5b9+P917K+07xHtUNXo1jPTlqYwDlpb8AQJ/2DqlV9WqyXZyHQ87fIfcLwFaCKgAwq1MXAdXNv2++XlUX41jDONaTpJKk3sqUxrGerJ+Rfnte973toesCvCGoAgBzU5zUR5mSgiWgGe9RBQBmpTgpu0xpqvMPsIugCgBEU5w0jX3LlDZRsARMzUt/AYB0ipPyKFgCJiWoAgBNHFsYtMUiipPSypQ2UbAEtCSoAgCtHFsYtO/nXMqs5Tr7ULAENOM9qgBAE48tDKozKE5KK1Pa9B7VFvcTwBuCKgAQQ3FSO8eUKW2iYAk4JS/9BQCSKE7ql4Il4GQEVQDg5I4tDNpiscVJPZQpbaJgCZiKoAoATOHYwqBjPudSZi3XeSwFS8AkvEcVADi5xxYG1ZkWJ/VQprSJgiVgKoLqHlar1bZih1eXl5eKAADgERQnzevUZUqbKFiC6S01q3jp7362fRNVBAAAj6c4afkULMH0FplVBFUA4OROUJRzVsVJvZYpbXLqgqVW+wayCKoAwBSOLcpJKjVSpnSYUxcsHXJ7YCG8R3UPLd7DAQBLsk+5zwPvXzyr4qRey5Q2OXXB0rbPCdxYalb5/NwbAACWZx0mnldtLk0aHvin0+3bn+ts6nVWq7urncZD6z5w378NseuPe7V++fDG8wMsl5f+AgBTO7TQQ3HSsilYAh4kqAIARzmmtOdu0c65FictqUxpk2MKlu7eftcMWA5BFQA4luKkaWYt15laq2sEWAhlSntY6huUAeAUjinPGccakgqMkmZTrzNVmdImra4ROEdLzSrKlADowmq1ulfIs/bq8vLySev98P8eKk465PZmn9VjmdImxxQs3W2Hvl2ydPdzAsvhpb8A9GJb+FG2kuWQ+0NpEm8cei34uoeFE1QBgL0dWWpzrzTpkM95brOW68zhoYKl9f/f6/a7ZkCfBFUA4BDHlNrMWRjU46zlOnNQsARsJagCAId4UVXvr//eNdv3tod8znObtVxnDsfuL/34gCMoUwIA9nZMcdIchUE9z6Zep2WZ0iaHnIct3jYFK1iC5fGMKgDwWIqTmNoh142CJVgQQRUA2OjUxUlzFgb1OGu5Toq7+9tUsnTI7bfNgHyCKgCwzamLk5QpHTZruU6KQ87DvrdPP2Zgg2Ecx4c/6sytVqutJ+ny8tJP5wAa8Fjc3vo
ZqKdV9WL93sHPzOrWewQ3uNh12zezfdY519nU61xdrbbef3N9Te1zHupE1x0sxVK/PypTAliw1Wq1rezm1eXl5ZPW+6Evpy5OUqZ02GzqdeYuU9pkn/Mw7P5nt4IlWAgv/QVYtm3BQukIh1KcRAoFS3AGBFWAjikO4VRaFCcpUzr8XJ9bmdI2t/etYAnOg6AK0DfFIZxKi+IkZUqHzVquk67F9QkEEVQB+vaiqt5f/71rBg855lra97bbPt8xt1/yrOU66Vpcn0AQQRWgY+NY4zjW89tNlptm8JDb180w1MthqLFuimm+X7tbVve+Drddm8fcfsmzluukO/JY3l7Hw3Bzffd6HuCcCKoAwF2Kk+iNgiVYGEEVoGNKQniMuYqTlCllnptetShYcq5hPoIqQN+UhPAYcxUnKVM6bNZynR4lXcdLP9fQ3DCOXpr/kNVqtfUkXV5e+ikZMJv1T+qfVtWLN++1uj27ulptfW9hb49fHotP56Hrpna/J/Vi120PnR17+yXPpl6n98eHpOv4kPsUTm2p3x8/P/cGAHi89T9+nm+brVZz7Ip0t6+RYaiXdcB79h665g6dTfE5lzKbep3eHx8eOr5h9z/P34bYTR93gtmrcawnd/cH7M9LfwHgvClOYqnmvF4VNsGRBFWATijw6FtSSc8B18hJi5PmLAzqcdZynaW4fXyHFixNuZepZrBkgipAPxR49C2ppCepcKbVOj3OWq6zFEnH7L6DIyhT2sNS36AM9GVTMcdDs97LUm7r/bH4Mfdfi1nNXDiTch4SZ1Ovs6THhzeOvN5PrcnXD/T+/XEbZUoAnTjHspQleajAaJim0OXB2S5LKAzqeTb1Okt8fDiyYOnUTlnYpJyJs+OlvwDQXg9FK4qTWKJer+seHjPgpARVgEAKN/p1SClOmMmLk+YsDOpx1nKdJbt9zJsKltbX+8lmLY7joblrhCUQVAEyKdzo1yGlOEmWXhjU46zlOks25/13Sq4RzooypT0s9Q3KQK5TFW4sqSyll8fibYUnt+fVttBlX5MXv+xzbuYuMEqaTb3Okh4fdml9/9V0X9/3vkZbHR/Zevn+eChlSgCBlKX06+59MmwoTkq01MKgnmdTr3Mujw+t779huliws5xp2/zAmdImYgiqsIfh42Goqm9U1XfHD70MAThIfEitfgtmINGr6uPrfpNe980CeY8q3HG3RGD4eBhqrE9qrL+vsT5Zh9aowo0llXr0OEvbz7Y9plvKsRyy56ur1b0/NWHJy45Zk+KkXr9+ln5uON7t83rqwqY5j+XQGZySoAr3vS0RWIfST2q8+FYNNdR48a2qt2E1qXBjSaUePc7S9tNrEcZSjuXYPSddN75+5p21XIfjLOl+eng/w/DlP6yPf/ur9c9/84f18W/XMHy5/TZZOmVKe1jqG5TZbP2Twaf1Gx+9qN/4+JOq+mZV/cytD/nvqvq0/vHDD+ofP3paAYUbLWZp+0mape3n9qynspSlHMvd4xiG7WUk62dQP+PZs8tZSo18/eTNpl6nl6+pHkx5P1X78rXdj0E1/FJVfW+s+tx1XVxc1PX1UPWTqvr1Gsd/bbxXarlZxXtU4Y5xrHH4eHhRVZ/U63e/U1/80d0P+Zl6/e536r3/+E7VWFVDDRseApY8S9tP0mzCdTYWXCypLGUpx7KtFOeY2y95lrafpNnU6/TyNdWDKe+nbd9rJrS1tOmX6l/rP+tL9aT+qy6q6nPrD72uqlf15F++NAxfFVY5FS/9hTvevty36psbQuqNL/6o6qufVv3WH1Rtf7IETknBBcB5iig7+9n6Yf1T/drbkHrbRVW9Vy+rqr7nZcCciqAKt7wtTrq++FZ99uW+9wmrNLb0gosej6XHPVe1Keg55Nwk7Sdp1nIdsty+n3aVM7UsbfpK/aAu6nrrB67nn6uqrxxx6PCWoAqf9Y2q+v26uP7pvT76iz+q+vqfV/38d6fdFdzooYTjGD0eS497rjrPwqA
eZy3XIcuc1whEEFThs75bVX9W1xf/s9dHv3636vu/V/Xv35h2V3DjRVW9v/770FkPejyWHvdcddy1dOpZ2n6SZi3XIcuc1whEEFThlvHDcayhPqiL67+sm3bf7V6/W/Uv36z62z+p8goqGhjHGsexnt9uBN131oMej6XHPVcddy2depa2n6RZy3XIMuc1ss0P6it1XRdbK4jX859U1Q/2PU7YRVCFO8YPx7GqPqiqT+v1u5s/SEhlBsNw86tObv15OfeeOIacAETZWdr0n/Xl+rX6p3pZX7oXVtetv1U3v6Lmh1NtkPPi19PABuOH4zh8PHywbv29/3tUv/ijT+tX/uyD8W/+1L80mdyw/fdwagLumh9yATnGm1+B9oBfrhr+65er6nt1U5xUVVUXVT/5Ur30e1Q5Kc+owh1vmvbqo7HqzTOr4/plwDd/f1pVH9RHY6U0Qy6pfbLHWct17uq11XMpDaU97nmbpX/99DhruQ7n6VHXzU0Y/YXX9YVf/bA++t3X9YVfrapfEFI5NUEV7nvbgPf2ZcAvf+7v6nqoevlzf1dVH6znSc2QS2qf7HHWcp27em1xXEojZY973mbpXz89zlquw3l63HUzjj/8qXr9zh/Vh3/8U/X6HS/3ZQrDOHrl4kNWq9XWk3R5eeknkguz/gni06p68aZcYPjNj4b68r/9Xv3wF/98/Iebp1o3ftyCZ2n7SZpNvU7V1u6KqpsfOG697dXVautt53z8esx5SDyWfY5j2P7S7bq6Wt2bJR/LFLM5106fTb1O4tcUbU1xzdLeUrOKoLqHpd75QB92BZ0NXt1+n9GSHr96PZZegirnp9evKeCzlvq17KW/APl2NjHeoWAJAOieoAoQ6HZxxTjWk3GsoW4es79eDzx291CW0mPxS497PlarIp8W6/Q4a7kOQBpBFSDTMSUoPZSl9Fj80uOej7WkwqAeZy3XAYgiqAJkelFV76//3jXb97Zp9j2+pGPpcc/HOuaYDzk3LdbpcdZyHYAogipAoHGscRzr+e0mxU2zLa6r6vtVdf3s2WV9+9u/M+leH2Pf4zvgmCfX456PdcwxH3JuWqzT46zlOgBpBFWAPu1dsPTjH39hyn0AAJycoArQiWMKlubUY/FLj3uewpIKg3qctVwHIE3sP2wAuKfXYpQei1963PMUllQY1OOs5ToAUQRVgH70WozSY/FLj3uewpIKg3qctVwHIIqgCtCJXotReix+6XHPU1hSYVCPs5brAKQRVAEAAIgiqAJ07JhiFMUv2/W451Z6vW56nLVcByCNoArQt2OKURS/bNfjnlvp9brpcdZyHYAogipA344pRlH8sl2Pe26l1+umx1nLdQCifH7uDQDweOtClOdVVcOOF/I9e3a5aXz95n9suu0xswc+9tU41pNa7/uN28eyabZabV5nCg/tZdNsGOplVb13+7/vuk969Zhzs2s2xedcymzqdVp+TQEcyjOqAMvxau4N7Om9hz+kSwcd1zvv/O9U+wCA7gmqAB27XYwyjvVkHGuom8f2r1fwY3xS8UuLYpq798vV1ar+4i/+eoKjmUcPhUE9zlquA5Am9h8xAOyl17KUpOKXVsU0Pdwvj9VDYVCPs5brAEQRVAH61mtZSlLxS6timh7ul8fqoTCox1nLdQCiCKoAHRvHGsexnq8LUrbOAl1X1fer6noYahyGejnXsey77u3ZMNTLYajx9nE8Zp2leMw5PNXtlzxruQ5AGkEVYNkULE3jkP32ch8AQAxBFWBhHipYGscappjt87H77nvX7NQmKKG5ew6eHHj77vRQGNTjrOU6AGkEVYDl6aH45Zh9n9qpS2gOOTdL0UNhUI+zlusARBnG0VsUHrJarbaepMvLSz+RBKKsnyl5WlUv3rwPrcVsn4+t3e/lvNh126ur1dbbHvNY/JhjPvQ4Wh3LXKa+bs51NvU6S7sO4VwtNat4RhVgYXooftlir4KlU3toz4cWJx1ybpaih8KgHmc
t1wFII6gC0NIhxUIpBUuKkwCgMUEVgEk9VO60721PvZdDZ1vcK046pBRnydIKg3qctVwHII2gCsDUpigmmnovUxRAnVuJTVphUI+zlusARBFUAZjai6p6f/33rtm+t22xl2P2t+22pz6WdMeeG7O26wBEEVQBmNSpCpaePbusb3/7dybdyymKk86xTGmTtMKgHmct1wFII6gCMLe9C4h+/OMvTLmPKsVJABBBUAWguWMKlk617q7ZFnsVJylT2k6Z0mGzlusApBFUAZjDXCUvLYqTlCltp0zpsFnLdQCiCKoAzGGukpcWxUnKlLZTpnTYrOU6AFEEVQCam6vkpUVxkjKl7ZQpHTZruQ5AGkEVgHOlOAkAQgmqAEQ4puRlgiKZRxcnKVM6XFKBUdKs5ToAaQRVAFIcU/Jy6iKZVqU43EgqMEqatVwHIIqgCkCKY0peTl0k06oUhxtJBUZJs5brAEQZxtF76R+yWq22nqTLy0svnQE4sXXJ0WzWv9d1Mr6v3FitVi/rsPcK08g5XYfQu6V+T/GMKgCJ5iwvUpzUjpAKwEaCKgARbpe8jGM9WT+r+bbUaMKlT1qcpEwJAI4nqAKQYq7ilzlLcQCADQRVAFLMVfwyZykOALCBoApAhHGscRzr+Tj+f5HSptlc6x4za3UsALAUgup+thVrKNwAaGeKx1yP4/Ny/jO5X6Avi8wqfj0NALHWxUNPq+rFm2cie5zdnV9dra63HXPPv0oAAE7FM6oAJGtRdKRMCQDCeEYVgFhJz4p6RhUA2hFUAaCx1Wq19ZuvoAoAXvoLAABAGEEVgLMxDDUMQ31t/TLcZrNdcwDgPkEVgHOiTAkAOuA9qgCcDWVKANAHz6gCcDbGscZxrOe3A2SL2a45AHCfoAoAAEAUQRWAs6ZMCQDyCKoAnDtlSgAQRlAF4Ny9qKr3139PNds1BwDuEFQBOGvKlAAgj6AKAABAFEEVAO5QpgQA8xJUAeA+ZUoAMCNBFQDuU6YEADMSVAHgDmVKADAvQRUAAIAogioAAABRBFUA2IPWXwBoR1AFgP1o/QWARgRVANiP1l8AaERQBYA9aP0FgHYEVQAAAKIIqgDwSMqUAGAagioAPJ4yJQCYgKAKAI+nTAkAJiCoAsAjKVMCgGkIqgAAAEQRVAHghJQpAcDxBFUAOC1lSgBwJEEVAE5LmRIAHElQBYATUqYEAMcTVAGgvVcHzgHgrAzj6Ae7AAAA5PCMKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAACIIqgCAAAQRVAFAAAgiqAKAABAFEEVAACAKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAACIIqgCAAAQRVAFAAAgiqAKAABAFEEVAACAKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAACIIqgCAAAQRVAFAAAgiqAKAABAFEEVAACAKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAACIIqgCAAAQRVAFAAAgiqAKAABAFEEVAACAKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQA
AIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAACIIqgCAAAQRVAFAAAgiqAKAABAFEEVAACAKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAACIIqgCAAAQRVAFAAAgiqAKAABAFEEVAACAKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAACIIqgCAAAQRVAFAAAgiqAKAABAFEEVAACAKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAACIIqgCAAAQRVAFAAAgiqAKAABAFEEVAACAKIIqAAAAUQRVAAAAogiqAAAARBFUAQAAiCKoAgAAEEVQBQAAIIqgCgAAQBRBFQAAgCiCKgAAAFEEVQAAAKIIqgAAAEQRVAEAAIgiqAIAABBFUAUAACCKoAoAAEAUQRUAAIAogioAAABRBFUAAACiCKoAAABEEVQBAACIIqgCAAAQRVAFAAAgiqAKAABAlP8DscRvdl+TaAMAAAAASUVORK5CYII=\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + " (b) Weighted (2) A* search search: 140.4 path cost, 982 states reached\n" + ] + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAA6oAAAJCCAYAAADJHDpFAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvIxREBQAAG55JREFUeJzt3c+OZHd9xuHvaYw1MfIYVpayilhFCpbxnghxASyIpZ6FE7xIIL4HFh4vuAcCYmFFjjSNIha5ARDsPRZEyjJbbwLMyI5lizlZdM2kpqeq+9TU+fP+Tj2PZBmOq/ucru5q+zPV9XbX930BAABAirOlLwAAAAC2CVUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAAC
iCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAYFW6rrquq292XXUtH0u8HoC5CFUAYG1er6p/3/y95WOJ1wMwi67v+6WvAQBgNJtnAV+vqo/6vvpWjyVeD8BchCoAsFpdVw+q6uWlr6MxD/u+bi99EcBpE6oAwGp1nWcDn0ffe20qsCyvUQUAmmUIaBruV2BpQhUAaJkhoGm4X4FF+dFfAKBZNw0BVdWjBS+vZWdlYAlYkFAFAFbBcNKkDCwBsxKqAMAqrGs4qa8KeymogSVgTi8sfQEAAM/ryo/57nXv3sU8F9S4O3fO9/4zv28VmJMxJQCgZQZ+5mNgCZiNUAUAWvZRVb1ZNzyjyih23dfuf2ASfvQXAGjW5sdN71dVdV5BOant+/q6YwBj8IwqALByXjo51K1bX+z9Z11X/ZW/Hsx4acCJsfoLAKzCdau/FmufdXFxsff+Oj8/f3J/uV+BJXhGFQBoVtdV13X1zc367EG3Sz8253mG3F9Dbzf0bQGuI1QBgJYNXZ0dulibdGzO81w19v0KcBA/+gsANOvK71F9dM1Nzx7f7rrfAZp0bOrz3Lt3sff+unPn/Mn9Ncb9CnAooQoArILXUh5mjNeo7vCw7+v2URcGUH70FwCA6z084LYvT3YVwEkRqgBAs4wpjXOe6+6vvq/bm2ekz6rqjbrhvx8NLAFjEKoAQMuMKY1znquWeluAqvIaVQCgYcaUph9T2vW2Y9zXANcRqgDAKhhTOszQMaVdDCwBU/OjvwAAHMrAEjApoQoANGGscaBWjs15niEMLAFzEqoAQCvmGPhJOjbneYYwsATMxmtUAYAm3DQOVMaURhtT2vUa1Tnuf4DHhCoAsArGlA5zzJjSLgaWgDH50V8AAMZgYAkYjVAFAJpgTGm68zwvA0vAVIQqANAKY0rTned5GVgCJuE1qgBAE4wpLTumtIuBJWAqQnWAi4uLB7X7tRQPz8/PDQEAQABjSocZe0xpFwNLML21toof/R1m3wv+DQEAAOxnYAmmt8pWEaoAxFpqIIZMxpSmO8+Yxh5Ymuu6gSxCFYBkSw3EkMmY0nTnGdPYn6dD3h5YCa9RHWCO13AA8Kxjh2hYF2NKeWNKu4z9edr3PoFLa22VF5a+AADYZ/MfpPerqrqunhmL6C7/9ft4gOX+3NfHvLa/HnYd6675z7Gb3jbx2NTnubi4erZxHPN5qq2I3X5877t/gPXyo78AtGKVYxFwggwsATcSqgBEmGMoh7YZU5ruPFM7ZmDp6ttfdwxYD6EKQIo5hnJomzGl6c4ztWOvJeljAWZgTGmAtb5AGSDJHEM5tM2YUhtjSrsc87nr++qMqMF+a20VY0oARLhpOOkGOwdYyvjKqhhTGvfYVGNKuxzzueu6pyPUYxxOgx/9BSDRsQMqBligLYcMLFV5jMPqCVUAIgwdRrl37+LJX4e+P+MrbTOmNN15lnDTwNLm/w96++uOAW0SqgCkGHsYxfjK+hhTmu48SzCwBOwlVAFI8VFVvbn5+1Tvb+xzMK9jPqdD3zbp2JznWcKx15f+8QFHMKYEQIShwyoHMLC0MsaUxj0255jSLofcD3t4jMOKeUYVgGbduvXFITc3vgLtOWRkyWMcVkSoAhDheUZQ3n//l49HlZ4MsBx6DuMr7TCmNN15Uly9vl0jS4e8/b5jQD6hCkCKY0ZQxh7UIZMxpenOk+KQ+2Ho26d/zMAOQhWAFMeMoIw9qEMmY0rTnSfFIffD0LdP/5iBHYwpAazYxcXFg9r9uq2H5+fnt+e+nuscM6Z0wNsaX2mYMaVxjy09prTLkPvBYxxOg2dUAdZt37j
ImkdHjK/AunmMwwkQqgBEOGbwZPttja+sw9jjQEkjScaUDucxDqdHqAKQYuwxJeMrbRt7HChpJMmY0uE8xuHECFUAUow9pmR8pW1jjwMljSQZUzqcxzicGGNKAEQYa0zpOd6f8ZVAzzMOZEyp/TGlfTzG4fR4RhWAU2B8BdbNYxxWRqgCNGxNIyFjX7fxlbYZU5r3PC3yGId1E6oAbVvTSMjY1218pW3GlOY9T4s8xmHFhCpA29Y0EjL2dRtfaZsxpXnP0yKPcVgxoQrQsL6vvu/r/mZUZO+xFox93UfeN4+q6sOqetR11XddPWj1fm3V0M/f0M/LMe9vqWNznqdFHuOwbkIVgFNlfAXWzWMcGiZUARq2pkGQOa57yfGVUxy7OYYxpXnPsxYGlmA9hCpA29Y0CDLHdS85vnKKYzfHMKY073nWwsASrIRQBWjbmgZB5rjuJcdXTnHs5hjGlOY9z1oYWIKVEKoADVvTWMoc173k+Mr2bbuuHnRd9ce+zzW76XO16z485v0lHpvzPGthYAnWQ6gCwP+ba3xl39sadBnukPvqkM8r62ZgCRohVAEatqbxj6Wue67xlbFHf9buyPvhyeev7y8/r0kjScaU5mVgCdokVAHatqbxj6Wue67xFYMuhzm14SRjStMxsAQNEqoAbVvT+MdS1z3X+IpBl8Oc2nCSMaXpGFiCBglVgIataSxlqeueenzl0NGfIe/zFJzacJIxpekYWII2CVUAuN6x4yvHDrIYdDGcxLQMLEEgoQpX7BxO+M7drvu7v/+n7jt3Fx/XWPuoR4vH0q5njqGPue6HpWxfyxjjK/vcu3fx5K9D3+dajh162x1WMZy05PeWUzT2Y3ztXyO+lliCUIVnPTWS0L3XdfXGz39Rr/3bz+qNn/+ie6/rdt3uBI6lXU/SsbTrmWPoY677YSljj6/MdT0tHjv0tlclfSytfm85RWv5mku8HhhF1/d+vP4mFxcXe++k8/Nzf4q0Mps/GXy9qj6qyydQf1J9vVVdfaX6+qS6+qCq3qm7fT2+3dZr0bq1Hku7nqRjadezfezevYu9r+U75vvX1PdDXfMaxF3PPo79vfiY66vLPwR+8rZdt/91bNsfy50759dd0lPvc8g1tnJsyG3rgPs77eNL/t4y1feHFqzla26xc3fd1z6vL7/64/rRX/+ofvxfL9YXH1ff/+GQzwHjWWurCNUB1vrJ53qbZ05/UlVvVdVXtv7RJ1WXsdq/6wFEtla/fw2Nu8fm/liuu77NjxAOuu3QUL36Pk/NIfc3w7X6/WEOvuau0XXfqKrfVNWXto7+uar+tvr+98tc1Glb62P5haUvABI9idTPX/phvfjp1X/8lfr8pR/W7976Ydf1VSf+7yviPOz7uj33SbuuHtQMIyO3bn0x9SmGelh7Pt7r/gN329WP5datL+qzz76887ZD3+cJMpzEVI5+jK/R39Tv67f1St2uPz31+sFHVfWwbv/ula57TawyFqEKVzz1TOqzkXrpxU+rXvvg8n//x7+UWCXIUouUk5x3+5mL6/7EeG67/jDgkGdgdn0s77//y6p6+k+/T/k/iHc5+WeymM2hj/FT8NX6Q/22vvVMpFZd/iz0y/Wgquo31XVf92PAjMGYEmzp3uu66usn9ejsH+rpH/d91uNY/e4/V532v7sIM/YS45Jrj0mrksfeD0Nvm/Qxpxn6OVjLsTnPg/vrJq/Wx3VWj/bGw+b4l6rq1bmuiXUTqvC0b1fVD+rs0V8MuvWLn1a98fOqv/r1tFcFhxl7iXHJtcekVclj74eht036mNMkrZvOcWzO8+D+gihCFZ7266r6WT06+99Bt/78paoP/7Hqv7897VXBYT6qqjc3f5/q/Y19jkPOvZRj74eht036mNMM/Rys5dic58H9BVGEKmzp3+376uqdOnv0r3W57rvf5y9V/e4tr1El0aOq+rCqHt25c15vv/29o95Z31ff93V/86tWHmxep/XkHFO+bmv
73FOd45hrOeT6ht426WNOM/RzsJZjc54H99dNPq5X61Gd7f29PZvjf66qj+e6JtZNqMIVm185805VfVCfv7T7RiKVhuxbkn1Oc441tbbouu96j/k4WrsPpuS+YGkn/TX4x/pafat+Ww/qlWdidbP6W3X5K2oMKTEKoQpXdF11dbd/vX717jv14qc/rWefWf2kXvz0p/XwL8+qujeq6qzvq9usUZ5V1SqPpV1P0rGU69nx5XywY8ZEJvj4bqcPmWxfX9/X7X0fx9XbjvA+V3Ps2K+HNR+b8zzslv54nPvc/1nfOPtq/em1s6o/9lUP/1xnn/RVD8+q/vhKPfCraRiVUIVnXQ4n/Oru6/X4mdXHr1m9/PsHVfXO5p+nDG6sadSjxWOJ1/O8phgMSvr4xnbs18gx73Mtx9KuJ+nYnOdht6Svh4zHz2WMfv1u3f3+G/Xh/9ytu9+vqq+LVMbW9f3J/8j9ja773X3bv++Oddj8KfPrVfVR31f/5FfWVP2gqn5WXb3Tv9v3V2+3623XdCztepKOpVxP1d6XDtW9exfPHNv1/euYc/R9dVN+fPfuXew991Lfi5/3a+S6j+XOnfOzIe9zLcfSrifp2NTnSXxMpUn6emjh8cMy1toqQnWAtX7yGa57r+vq8lfX/HrzGlaI010zajQ0VK+8vwd1wGtSNz8eNpk1fS9e08dCu3wdwjqs9bH8wtIXAC3YxOmvlr4OmNkhw0knPTICAIzLa1QBGnbMMMqRQytHDQZNcD2szNiDQS0em/M8AGmEKkDbjhlGOWZo5ZBRjzmuh/VZapwm6dic5wGIIlQB2vZRVb25+fsYbzv0/e273VLXw/oM/XpY87E5zwMQRagCNKzvq+/7uv88i4vbb9t19WAzxvSoqj6sa9Z9rzvvWNczxvujbUO/HtZ8bM7zAKQRqgBUGU4CAIIIVYCGjT1etMczw0lTDLIYfmFb0qiRMSWA+QlVgLaNPV409HZTDLIYfmFb0qiRMSWAmQlVgLaNPV409HZTDLIYfmFb0qiRMSWAmb2w9AUA8Pw2gyj3q6q6gT/I9/bb36vPPvty1Q2DSbvOcd2xY910nouLMc9GuqFfd2s+NvV5PKaAZJ5RBTgxm0gdynASADA7oQrQsAmGUWYZTtrF8AvbkkaNjCkBzE+oArRt7GGUJcdXDL+wLWnUyJgSwMyEKkDbxh5GWXJ8xfAL25JGjYwpAczMmBJAw4aOKd25c37w+7vu2BQMv7AtadTImBLA/DyjCrAexw4fGU4CACIIVYCGbQ+j9H3d7vvqamsQ6YY3X2w4aRfDL2xLGjUypgQwP6EK0LZjxlLShlbSrodlJY0aGVMCmJlQBWjbMWMpaUMradfDspJGjYwpAcxMqAI0rO+r7/u6vxlI2Xts7LedQtr1sKyhXw9rPjbneQDSCFWAdds3kGQ4CQCIJVQBVuamgaWE4aRdDL9wk6ShI2NKANMSqgDr0+qoSgvXyLKSho6MKQFMSKgCrE+royotXCPLSho6MqYEMCGhCrAyrY6qtHCNLCtp6MiYEsC0hCoAAABRhCoAEQy/cJOkoSNjSgDTEqoApDD8wk2Sho6MKQFMSKgCkMLwCzdJGjoypgQwIaEKQATDL9wkaejImBLAtIQqAAAAUYQqANCEpKEjY0oA0xKqAEArkoaOjCkBTEioAgCtSBo6MqYEMCGhCgA0IWnoyJgSwLSEKgAAAFGEKgDQhKShI2NKANMSqgBAK5KGjowpAUxIqAIArUgaOjKmBDChru+9lv4mFxcXe++k8/NzPzoDMIM1fS9e08dyjIuLiwdV9fLS18GzTunrEFq31n+neEYVgFY8PPA4+UQqADu9sPQFAMAQ5+fnt5e+BgBgHp5RBQAAIIpQBQAAIIpQBQAAIIpQHcaABwCMz79HM/m8QFtW2Sp+PQ0AzGytv0oAAMbiGVUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFU
AAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAAACiCFUAmN/DA48DwEnp+r5f+hoAAADgCc+oAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEEWoAgAAEOX/AIS6zUzgLYR1AAAAAElFTkSuQmCC\n", + "text/plain": [ + "
    " + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + " Greedy best-first search search: 151.6 path cost, 826 states reached\n" + ] + } + ], + "source": [ + "plots(d7)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Nondeterministic Actions\n", + "\n", + "To handle problems with nondeterministic problems, we'll replace the `result` method with `results`, which returns a collection of possible result states. We'll represent the solution to a problem not with a `Node`, but with a plan that consist of two types of component: sequences of actions, like `['forward', 'suck']`, and condition actions, like\n", + "`{5: ['forward', 'suck'], 7: []}`, which says that if we end up in state 5, then do `['forward', 'suck']`, but if we end up in state 7, then do the empty sequence of actions." + ] + }, + { + "cell_type": "code", + "execution_count": 50, + "metadata": {}, + "outputs": [], + "source": [ + "def and_or_search(problem):\n", + " \"Find a plan for a problem that has nondterministic actions.\"\n", + " return or_search(problem, problem.initial, [])\n", + " \n", + "def or_search(problem, state, path):\n", + " \"Find a sequence of actions to reach goal from state, without repeating states on path.\"\n", + " if problem.is_goal(state): return []\n", + " if state in path: return failure # check for loops\n", + " for action in problem.actions(state):\n", + " plan = and_search(problem, problem.results(state, action), [state] + path)\n", + " if plan != failure:\n", + " return [action] + plan\n", + " return failure\n", + "\n", + "def and_search(problem, states, path):\n", + " \"Plan for each of the possible states we might end up in.\"\n", + " if len(states) == 1: \n", + " return or_search(problem, next(iter(states)), path)\n", + " plan = {}\n", + " for s in states:\n", + " plan[s] = or_search(problem, s, path)\n", + " if plan[s] == 
failure: return failure\n", + "    return [plan]" + ] + }, + { + "cell_type": "code", + "execution_count": 51, + "metadata": {}, + "outputs": [], + "source": [ + "class MultiGoalProblem(Problem):\n", + "    \"\"\"A version of `Problem` with a collection of `goals` instead of one `goal`.\"\"\"\n", + "    \n", + "    def __init__(self, initial=None, goals=(), **kwds): \n", + "        self.__dict__.update(initial=initial, goals=goals, **kwds)\n", + "    \n", + "    def is_goal(self, state): return state in self.goals\n", + "    \n", + "class ErraticVacuum(MultiGoalProblem):\n", + "    \"\"\"In this 2-location vacuum problem, the suck action in a dirty square will either clean up that square,\n", + "    or clean up both squares. A suck action in a clean square will either do nothing, or\n", + "    will deposit dirt in that square. Forward and backward actions are deterministic.\"\"\"\n", + "    \n", + "    def actions(self, state): \n", + "        return ['suck', 'forward', 'backward']\n", + "    \n", + "    def results(self, state, action): return self.table[action][state]\n", + "    \n", + "    table = {'suck':{1:{5,7}, 2:{4,8}, 3:{7}, 4:{2,4}, 5:{1,5}, 6:{8}, 7:{3,7}, 8:{6,8}},\n", + "             'forward': {1:{2}, 2:{2}, 3:{4}, 4:{4}, 5:{6}, 6:{6}, 7:{8}, 8:{8}},\n", + "             'backward': {1:{1}, 2:{1}, 3:{3}, 4:{3}, 5:{5}, 6:{5}, 7:{7}, 8:{7}}}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's find a plan to get from state 1 to the goal of no dirt (states 7 or 8):" + ] + }, + { + "cell_type": "code", + "execution_count": 52, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "['suck', {5: ['forward', 'suck'], 7: []}]" + ] + }, + "execution_count": 52, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "and_or_search(ErraticVacuum(1, {7, 8}))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This plan says \"First suck, and if we end up in state 5, go forward and suck again; if we end up in state 7, do nothing because that is a goal.\"\n", + 
"\n", + "Here are the plans to get to a goal state starting from any one of the 8 states:" + ] + }, + { + "cell_type": "code", + "execution_count": 53, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{1: ['suck', {5: ['forward', 'suck'], 7: []}],\n", + " 2: ['suck', {8: [], 4: ['backward', 'suck']}],\n", + " 3: ['suck'],\n", + " 4: ['backward', 'suck'],\n", + " 5: ['forward', 'suck'],\n", + " 6: ['suck'],\n", + " 7: [],\n", + " 8: []}" + ] + }, + "execution_count": 53, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "{s: and_or_search(ErraticVacuum(s, {7,8})) \n", + " for s in range(1, 9)}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Comparing Algorithms on EightPuzzle Problems of Different Lengths" + ] + }, + { + "cell_type": "code", + "execution_count": 54, + "metadata": {}, + "outputs": [], + "source": [ + "from functools import lru_cache\n", + "\n", + "def build_table(table, depth, state, problem):\n", + " if depth > 0 and state not in table:\n", + " problem.initial = state\n", + " table[state] = len(astar_search(problem))\n", + " for a in problem.actions(state):\n", + " build_table(table, depth - 1, problem.result(state, a), problem)\n", + " return table\n", + "\n", + "def invert_table(table):\n", + " result = defaultdict(list)\n", + " for key, val in table.items():\n", + " result[val].append(key)\n", + " return result\n", + "\n", + "goal = (0, 1, 2, 3, 4, 5, 6, 7, 8)\n", + "table8 = invert_table(build_table({}, 25, goal, EightPuzzle(goal)))" + ] + }, + { + "cell_type": "code", + "execution_count": 78, + "metadata": { + "scrolled": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "2.6724" + ] + }, + "execution_count": 78, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "def report8(table8, M, Ds=range(2, 25, 2), searchers=(breadth_first_search, astar_misplaced_tiles, astar_search)):\n", + " \"Make a table of average nodes generated and 
effective branching factor\"\n", + "    for d in Ds:\n", + "        line = [d]\n", + "        N = min(M, len(table8[d]))\n", + "        states = random.sample(table8[d], N)\n", + "        for searcher in searchers:\n", + "            nodes = 0\n", + "            for s in states:\n", + "                problem = CountCalls(EightPuzzle(s))\n", + "                searcher(problem)\n", + "                nodes += problem._counts['result']\n", + "            nodes = int(round(nodes/N))\n", + "            line.append(nodes)\n", + "        line.extend([ebf(d, n) for n in line[1:]])\n", + "        print('{:2} & {:6} & {:5} & {:5} && {:.2f} & {:.2f} & {:.2f}'\n", + "              .format(*line))\n", + "\n", + "    \n", + "def ebf(d, N, possible_bs=[b/100 for b in range(100, 300)]):\n", + "    \"Effective Branching Factor\"\n", + "    return min(possible_bs, key=lambda b: abs(N - sum(b**i for i in range(1, d+1))))\n", + "\n", + "import math\n", + "\n", + "def edepth_reduction(d, N, b=2.67):\n", + "    # Body was missing; one plausible definition: true depth d minus log_b(N),\n", + "    # the depth of a uniform b-ary tree containing N nodes.\n", + "    return d - math.log(N, b)\n", + "\n", + "from statistics import mean \n", + "\n", + "def random_state():\n", + "    x = list(range(9))\n", + "    random.shuffle(x)\n", + "    return tuple(x)\n", + "\n", + "meanbf = mean(len(e3.actions(random_state())) for _ in range(10000))\n", + "meanbf" + ] + }, + { + "cell_type": "code", + "execution_count": 72, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{0: 1,\n", + " 1: 2,\n", + " 2: 4,\n", + " 3: 8,\n", + " 4: 16,\n", + " 5: 20,\n", + " 6: 36,\n", + " 7: 60,\n", + " 8: 87,\n", + " 9: 123,\n", + " 10: 175,\n", + " 11: 280,\n", + " 12: 397,\n", + " 13: 656,\n", + " 14: 898,\n", + " 15: 1452,\n", + " 16: 1670,\n", + " 17: 2677,\n", + " 18: 2699,\n", + " 19: 4015,\n", + " 20: 3472,\n", + " 21: 4672,\n", + " 22: 3311,\n", + " 23: 3898,\n", + " 24: 1945,\n", + " 25: 1796,\n", + " 26: 621,\n", + " 27: 368,\n", + " 28: 63,\n", + " 29: 19,\n", + " 30: 0}" + ] + }, + "execution_count": 72, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "{n: len(v) for (n, v) in table30.items()}" + ] + }, + { + "cell_type": "code", + "execution_count": 67, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + 
"output_type": "stream", + "text": [ + "CPU times: user 24min 7s, sys: 11.6 s, total: 24min 19s\n", + "Wall time: 24min 44s\n" + ] + } + ], + "source": [ + "%time table30 = invert_table(build_table({}, 30, goal, EightPuzzle(goal)))" + ] + }, + { + "cell_type": "code", + "execution_count": 68, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + " 2 & 5 & 6 & 6 && 1.79 & 2.00 & 2.00\n", + " 4 & 33 & 12 & 12 && 2.06 & 1.49 & 1.49\n", + " 6 & 128 & 24 & 19 && 2.01 & 1.42 & 1.34\n", + " 8 & 368 & 48 & 31 && 1.91 & 1.40 & 1.30\n", + "10 & 1033 & 116 & 48 && 1.85 & 1.43 & 1.27\n", + "12 & 2672 & 279 & 84 && 1.80 & 1.45 & 1.28\n", + "14 & 6783 & 678 & 174 && 1.77 & 1.47 & 1.31\n", + "16 & 17270 & 1683 & 364 && 1.74 & 1.48 & 1.32\n", + "18 & 41558 & 4102 & 751 && 1.72 & 1.49 & 1.34\n", + "20 & 91493 & 9905 & 1318 && 1.69 & 1.50 & 1.34\n", + "22 & 175921 & 22955 & 2548 && 1.66 & 1.50 & 1.34\n", + "24 & 290082 & 53039 & 5733 && 1.62 & 1.50 & 1.36\n", + "CPU times: user 6min, sys: 3.63 s, total: 6min 4s\n", + "Wall time: 6min 13s\n" + ] + } + ], + "source": [ + "%time report8(table30, 20, range(26, 31, 2))" + ] + }, + { + "cell_type": "code", + "execution_count": 70, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "26 & 395355 & 110372 & 10080 && 1.58 & 1.50 & 1.35\n", + "28 & 463234 & 202565 & 22055 && 1.53 & 1.49 & 1.36\n" + ] + }, + { + "ename": "ZeroDivisionError", + "evalue": "division by zero", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mZeroDivisionError\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n", + "\u001b[0;32m\u001b[0m in \u001b[0;36mreport8\u001b[0;34m(table8, M, Ds, searchers)\u001b[0m\n\u001b[1;32m 11\u001b[0m 
\u001b[0msearcher\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mproblem\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 12\u001b[0m \u001b[0mnodes\u001b[0m \u001b[0;34m+=\u001b[0m \u001b[0mproblem\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_counts\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m'result'\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m---> 13\u001b[0;31m \u001b[0mnodes\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mint\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mround\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnodes\u001b[0m\u001b[0;34m/\u001b[0m\u001b[0mN\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 14\u001b[0m \u001b[0mline\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnodes\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 15\u001b[0m \u001b[0mline\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mextend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mebf\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0md\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mn\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mn\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mline\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mZeroDivisionError\u001b[0m: division by zero" + ] + } + ], + "source": [ + "%time report8(table30, 20, range(26, 31, 2))" + ] + }, + { + "cell_type": "code", + "execution_count": 315, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "0 116 116 ['A']\n", + "140 0 140 ['A', 'S']\n", + "0 83 83 ['A']\n", + "118 0 118 ['A', 'T']\n", + "0 45 45 ['A']\n", + "75 0 75 ['A', 'Z']\n", + "0 176 176 ['B']\n", + "101 92 193 ['B', 'P']\n", + "211 0 211 ['B', 'F']\n", + "0 77 77 ['B']\n", + "90 0 90 ['B', 'G']\n", + "0 100 100 ['B']\n", 
+ "101 0 101 ['B', 'P']\n", + "0 80 80 ['B']\n", + "85 0 85 ['B', 'U']\n", + "0 87 87 ['C']\n", + "120 0 120 ['C', 'D']\n", + "0 109 109 ['C']\n", + "138 0 138 ['C', 'P']\n", + "0 128 128 ['C']\n", + "146 0 146 ['C', 'R']\n", + "0 47 47 ['D']\n", + "75 0 75 ['D', 'M']\n", + "0 62 62 ['E']\n", + "86 0 86 ['E', 'H']\n", + "0 98 98 ['F']\n", + "99 0 99 ['F', 'S']\n", + "0 77 77 ['H']\n", + "98 0 98 ['H', 'U']\n", + "0 85 85 ['I']\n", + "87 0 87 ['I', 'N']\n", + "0 78 78 ['I']\n", + "92 0 92 ['I', 'V']\n", + "0 36 36 ['L']\n", + "70 0 70 ['L', 'M']\n", + "0 86 86 ['L']\n", + "111 0 111 ['L', 'T']\n", + "0 136 136 ['O']\n", + "151 0 151 ['O', 'S']\n", + "0 48 48 ['O']\n", + "71 0 71 ['O', 'Z']\n", + "0 93 93 ['P']\n", + "97 0 97 ['P', 'R']\n", + "0 65 65 ['R']\n", + "80 0 80 ['R', 'S']\n", + "0 127 127 ['U']\n", + "142 0 142 ['U', 'V']\n" + ] + }, + { + "data": { + "text/plain": [ + "(1.2698088530709188, 1.2059558858330393)" + ] + }, + "execution_count": 315, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "from itertools import combinations\n", + "from statistics import median, mean\n", + "\n", + "# Detour index for Romania\n", + "\n", + "L = romania.locations\n", + "def ratio(a, b): return astar_search(RouteProblem(a, b, map=romania)).path_cost / sld(L[a], L[b])\n", + "nums = [ratio(a, b) for a,b in combinations(L, 2) if b in r1.actions(a)]\n", + "mean(nums), median(nums) # 1.7, 1.6 # 1.26, 1.2 for adjacent cities" + ] + }, + { + "cell_type": "code", + "execution_count": 300, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 300, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "sld" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + 
"name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.7.0" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/tests/test_agents.py b/tests/test_agents.py index 59ab6bce9..d1a669486 100644 --- a/tests/test_agents.py +++ b/tests/test_agents.py @@ -1,12 +1,19 @@ import random -from agents import Direction -from agents import Agent -from agents import ReflexVacuumAgent, ModelBasedVacuumAgent, TrivialVacuumEnvironment, compare_agents,\ - RandomVacuumAgent +import pytest -random.seed("aima-python") +from agents import (ReflexVacuumAgent, ModelBasedVacuumAgent, TrivialVacuumEnvironment, compare_agents, + RandomVacuumAgent, TableDrivenVacuumAgent, TableDrivenAgentProgram, RandomAgentProgram, + SimpleReflexAgentProgram, ModelBasedReflexAgentProgram, Wall, Gold, Explorer, Thing, Bump, Glitter, + WumpusEnvironment, Pit, VacuumEnvironment, Dirt, Direction, Agent) +# random seed may affect the placement +# of things in the environment which may +# lead to failure of tests. Please change +# the seed if the tests are failing with +# current changes in any stochastic method +# function or variable. 
+random.seed(9) def test_move_forward(): d = Direction("up") @@ -55,7 +62,24 @@ def test_add(): assert l2.direction == Direction.D -def test_RandomVacuumAgent() : +def test_RandomAgentProgram(): +    # create a list of all the actions a Vacuum cleaner can perform +    actions = ['Right', 'Left', 'Suck', 'NoOp'] +    # create a program and then an object of the RandomAgentProgram +    program = RandomAgentProgram(actions) + +    agent = Agent(program) +    # create an object of TrivialVacuumEnvironment +    environment = TrivialVacuumEnvironment() +    # add agent to the environment +    environment.add_thing(agent) +    # run the environment +    environment.run() +    # check final status of the environment +    assert environment.status == {(1, 0): 'Clean', (0, 0): 'Clean'} + + +def test_RandomVacuumAgent(): # create an object of the RandomVacuumAgent agent = RandomVacuumAgent() # create an object of TrivialVacuumEnvironment @@ -65,10 +89,46 @@ def test_RandomVacuumAgent() : # run the environment environment.run() # check final status of the environment -    assert environment.status == {(1,0):'Clean' , (0,0) : 'Clean'} +    assert environment.status == {(1, 0): 'Clean', (0, 0): 'Clean'} + + +def test_TableDrivenAgent(): +    random.seed(10) +    loc_A, loc_B = (0, 0), (1, 0) +    # table defining all the possible states of the agent +    table = {((loc_A, 'Clean'),): 'Right', +             ((loc_A, 'Dirty'),): 'Suck', +             ((loc_B, 'Clean'),): 'Left', +             ((loc_B, 'Dirty'),): 'Suck', +             ((loc_A, 'Dirty'), (loc_A, 'Clean')): 'Right', +             ((loc_A, 'Clean'), (loc_B, 'Dirty')): 'Suck', +             ((loc_B, 'Clean'), (loc_A, 'Dirty')): 'Suck', +             ((loc_B, 'Dirty'), (loc_B, 'Clean')): 'Left', +             ((loc_A, 'Dirty'), (loc_A, 'Clean'), (loc_B, 'Dirty')): 'Suck', +             ((loc_B, 'Dirty'), (loc_B, 'Clean'), (loc_A, 'Dirty')): 'Suck'} +    # create a program and then an object of the TableDrivenAgent +    program = TableDrivenAgentProgram(table) +    agent = Agent(program) +    # create an object of TrivialVacuumEnvironment +    environment = TrivialVacuumEnvironment() +    # initializing 
some environment status +    environment.status = {loc_A: 'Dirty', loc_B: 'Dirty'} +    # add agent to the environment +    environment.add_thing(agent) + +    # run the environment one step at a time to check how it evolves using TableDrivenAgentProgram +    environment.run(steps=1) +    assert environment.status == {(1, 0): 'Clean', (0, 0): 'Dirty'} + +    environment.run(steps=1) +    assert environment.status == {(1, 0): 'Clean', (0, 0): 'Dirty'} -def test_ReflexVacuumAgent() : +    environment.run(steps=1) +    assert environment.status == {(1, 0): 'Clean', (0, 0): 'Clean'} + + +def test_ReflexVacuumAgent(): # create an object of the ReflexVacuumAgent agent = ReflexVacuumAgent() # create an object of TrivialVacuumEnvironment @@ -78,10 +138,76 @@ def test_ReflexVacuumAgent() : # run the environment environment.run() # check final status of the environment -    assert environment.status == {(1,0):'Clean' , (0,0) : 'Clean'} +    assert environment.status == {(1, 0): 'Clean', (0, 0): 'Clean'} + + +def test_SimpleReflexAgentProgram(): +    class Rule: + +        def __init__(self, state, action): +            self.__state = state +            self.action = action + +        def matches(self, state): +            return self.__state == state +    loc_A = (0, 0) +    loc_B = (1, 0) -def test_ModelBasedVacuumAgent() : +    # create rules for a two-state Vacuum Environment +    rules = [Rule((loc_A, "Dirty"), "Suck"), Rule((loc_A, "Clean"), "Right"), +             Rule((loc_B, "Dirty"), "Suck"), Rule((loc_B, "Clean"), "Left")] + +    def interpret_input(state): +        return state + +    # create a program and then an object of the SimpleReflexAgentProgram +    program = SimpleReflexAgentProgram(rules, interpret_input) +    agent = Agent(program) +    # create an object of TrivialVacuumEnvironment +    environment = TrivialVacuumEnvironment() +    # add agent to the environment +    environment.add_thing(agent) +    # run the environment +    environment.run() +    # check final status of the environment +    assert environment.status == {(1, 0): 'Clean', (0, 0): 'Clean'} + + +def 
test_ModelBasedReflexAgentProgram(): + class Rule: + + def __init__(self, state, action): + self.__state = state + self.action = action + + def matches(self, state): + return self.__state == state + + loc_A = (0, 0) + loc_B = (1, 0) + + # create rules for a two-state Vacuum Environment + rules = [Rule((loc_A, "Dirty"), "Suck"), Rule((loc_A, "Clean"), "Right"), + Rule((loc_B, "Dirty"), "Suck"), Rule((loc_B, "Clean"), "Left")] + + def update_state(state, action, percept, model): + return percept + + # create a program and then an object of the ModelBasedReflexAgentProgram class + program = ModelBasedReflexAgentProgram(rules, update_state, None) + agent = Agent(program) + # create an object of TrivialVacuumEnvironment + environment = TrivialVacuumEnvironment() + # add agent to the environment + environment.add_thing(agent) + # run the environment + environment.run() + # check final status of the environment + assert environment.status == {(1, 0): 'Clean', (0, 0): 'Clean'} + + +def test_ModelBasedVacuumAgent(): # create an object of the ModelBasedVacuumAgent agent = ModelBasedVacuumAgent() # create an object of TrivialVacuumEnvironment @@ -91,16 +217,29 @@ def test_ModelBasedVacuumAgent() : # run the environment environment.run() # check final status of the environment - assert environment.status == {(1,0):'Clean' , (0,0) : 'Clean'} + assert environment.status == {(1, 0): 'Clean', (0, 0): 'Clean'} + + +def test_TableDrivenVacuumAgent(): + # create an object of the TableDrivenVacuumAgent + agent = TableDrivenVacuumAgent() + # create an object of the TrivialVacuumEnvironment + environment = TrivialVacuumEnvironment() + # add agent to the environment + environment.add_thing(agent) + # run the environment + environment.run() + # check final status of the environment + assert environment.status == {(1, 0): 'Clean', (0, 0): 'Clean'} -def test_compare_agents() : +def test_compare_agents(): environment = TrivialVacuumEnvironment agents = [ModelBasedVacuumAgent, 
ReflexVacuumAgent] result = compare_agents(environment, agents) - performance_ModelBasedVacummAgent = result[0][1] - performance_ReflexVacummAgent = result[1][1] + performance_ModelBasedVacuumAgent = result[0][1] + performance_ReflexVacuumAgent = result[1][1] # The performance of ModelBasedVacuumAgent will be at least as good as that of # ReflexVacuumAgent, since ModelBasedVacuumAgent can identify when it has @@ -108,12 +247,141 @@ def test_compare_agents() : # NoOp leading to 0 performance change, whereas ReflexVacuumAgent cannot # identify the terminal state and thus will keep moving, leading to worse # performance compared to ModelBasedVacuumAgent. - assert performance_ReflexVacummAgent <= performance_ModelBasedVacummAgent + assert performance_ReflexVacuumAgent <= performance_ModelBasedVacuumAgent + + +def test_TableDrivenAgentProgram(): + table = {(('foo', 1),): 'action1', + (('foo', 2),): 'action2', + (('bar', 1),): 'action3', + (('bar', 2),): 'action1', + (('foo', 1), ('foo', 1),): 'action2', + (('foo', 1), ('foo', 2),): 'action3'} + agent_program = TableDrivenAgentProgram(table) + assert agent_program(('foo', 1)) == 'action1' + assert agent_program(('foo', 2)) == 'action3' + assert agent_program(('invalid percept',)) is None def test_Agent(): def constant_prog(percept): return percept + agent = Agent(constant_prog) result = agent.program(5) assert result == 5 + + +def test_VacuumEnvironment(): + # initialize Vacuum Environment + v = VacuumEnvironment(6, 6) + # get an agent + agent = ModelBasedVacuumAgent() + agent.direction = Direction(Direction.R) + v.add_thing(agent) + v.add_thing(Dirt(), location=(2, 1)) + + # check if things are added properly + assert len([x for x in v.things if isinstance(x, Wall)]) == 20 + assert len([x for x in v.things if isinstance(x, Dirt)]) == 1 + + # let the action begin! 
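The `test_TableDrivenAgentProgram` assertions above only make sense once you notice that the table is keyed on the entire percept *history*, not the latest percept: the second call with `('foo', 2)` looks up `(('foo', 1), ('foo', 2))` and so returns `'action3'`, not `'action2'`. A minimal sketch of such a program, assuming the closure-based design used in `agents.py` (the lowercase name is illustrative, not the library's):

```python
def table_driven_agent_program(table):
    # sketch of a table-driven program: the table is keyed on the
    # tuple of ALL percepts seen so far, not just the latest one
    percepts = []

    def program(percept):
        percepts.append(percept)
        return table.get(tuple(percepts))  # None for unknown histories

    return program


table = {(('foo', 1),): 'action1',
         (('foo', 1), ('foo', 2)): 'action3'}
program = table_driven_agent_program(table)
print(program(('foo', 1)))  # action1 -- history is (('foo', 1),)
print(program(('foo', 2)))  # action3 -- history now holds both percepts
```

Because the history only grows, any table-driven agent is limited to the finite prefix of behavior its table enumerates, which is exactly why the tests drive the environment one step at a time.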
+ assert v.percept(agent) == ("Clean", "None") + v.execute_action(agent, "Forward") + assert v.percept(agent) == ("Dirty", "None") + v.execute_action(agent, "TurnLeft") + v.execute_action(agent, "Forward") + assert v.percept(agent) == ("Dirty", "Bump") + v.execute_action(agent, "Suck") + assert v.percept(agent) == ("Clean", "None") + old_performance = agent.performance + v.execute_action(agent, "NoOp") + assert old_performance == agent.performance + + +def test_WumpusEnvironment(): + def constant_prog(percept): + return percept + + # initialize Wumpus Environment + w = WumpusEnvironment(constant_prog) + + # check if things are added properly + assert len([x for x in w.things if isinstance(x, Wall)]) == 20 + assert any(map(lambda x: isinstance(x, Gold), w.things)) + assert any(map(lambda x: isinstance(x, Explorer), w.things)) + assert not any(map(lambda x: not isinstance(x, Thing), w.things)) + + # check that gold and wumpus are not present on (1,1) + assert not any(map(lambda x: isinstance(x, Gold) or isinstance(x, WumpusEnvironment), w.list_things_at((1, 1)))) + + # check if w.get_world() segments objects correctly + assert len(w.get_world()) == 6 + for row in w.get_world(): + assert len(row) == 6 + + # start the game! 
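All of these tests drive the same sense-decide-act cycle through `environment.run()`: get a percept for the agent, feed it to the agent's program, then execute the returned action. A toy sketch of that loop under stated assumptions (the class and attribute names here are illustrative, not the actual `agents.py` interfaces):

```python
class OneCellVacuum:
    # toy environment sketching the percept -> program -> execute_action
    # loop that environment.run() drives in these tests
    def __init__(self):
        self.status = 'Dirty'
        self.agents = []

    def percept(self, agent):
        return self.status

    def execute_action(self, agent, action):
        if action == 'Suck':
            self.status = 'Clean'
            agent.performance += 10


class SimpleAgent:
    def __init__(self, program):
        self.program = program
        self.performance = 0


def run(env, steps=1000):
    for _ in range(steps):
        for agent in env.agents:
            # one step of the agent loop: sense, decide, act
            action = agent.program(env.percept(agent))
            env.execute_action(agent, action)


env = OneCellVacuum()
agent = SimpleAgent(lambda percept: 'Suck' if percept == 'Dirty' else 'NoOp')
env.agents.append(agent)
run(env, steps=3)
print(env.status, agent.performance)  # Clean 10
```

Only the first step changes anything here; the remaining steps produce `'NoOp'`, mirroring the `NoOp`/unchanged-performance assertion in `test_VacuumEnvironment`.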
+ agent = [x for x in w.things if isinstance(x, Explorer)][0] + gold = [x for x in w.things if isinstance(x, Gold)][0] + pit = [x for x in w.things if isinstance(x, Pit)][0] + + assert not w.is_done() + + # check Walls + agent.location = (1, 2) + percepts = w.percept(agent) + assert len(percepts) == 5 + assert any(map(lambda x: isinstance(x, Bump), percepts[0])) + + # check Gold + agent.location = gold.location + percepts = w.percept(agent) + assert any(map(lambda x: isinstance(x, Glitter), percepts[4])) + agent.location = (gold.location[0], gold.location[1] + 1) + percepts = w.percept(agent) + assert not any(map(lambda x: isinstance(x, Glitter), percepts[4])) + + # check agent death + agent.location = pit.location + assert w.in_danger(agent) + assert not agent.alive + assert agent.killed_by == Pit.__name__ + assert agent.performance == -1000 + + assert w.is_done() + + +def test_WumpusEnvironmentActions(): + random.seed(9) + def constant_prog(percept): + return percept + + # initialize Wumpus Environment + w = WumpusEnvironment(constant_prog) + + agent = [x for x in w.things if isinstance(x, Explorer)][0] + gold = [x for x in w.things if isinstance(x, Gold)][0] + pit = [x for x in w.things if isinstance(x, Pit)][0] + + agent.location = (1, 1) + assert agent.direction.direction == "right" + w.execute_action(agent, 'TurnRight') + assert agent.direction.direction == "down" + w.execute_action(agent, 'TurnLeft') + assert agent.direction.direction == "right" + w.execute_action(agent, 'Forward') + assert agent.location == (2, 1) + + agent.location = gold.location + w.execute_action(agent, 'Grab') + assert agent.holding == [gold] + + agent.location = (1, 1) + w.execute_action(agent, 'Climb') + assert not any(map(lambda x: isinstance(x, Explorer), w.things)) + + assert w.is_done() + + +if __name__ == "__main__": + pytest.main() diff --git a/tests/test_agents4e.py b/tests/test_agents4e.py new file mode 100644 index 000000000..295a1ee47 --- /dev/null +++ 
b/tests/test_agents4e.py @@ -0,0 +1,386 @@ +import random + +import pytest + +from agents4e import (ReflexVacuumAgent, ModelBasedVacuumAgent, TrivialVacuumEnvironment, compare_agents, + RandomVacuumAgent, TableDrivenVacuumAgent, TableDrivenAgentProgram, RandomAgentProgram, + SimpleReflexAgentProgram, ModelBasedReflexAgentProgram, Wall, Gold, Explorer, Thing, Bump, + Glitter, WumpusEnvironment, Pit, VacuumEnvironment, Dirt, Direction, Agent) + +# The random seed affects the placement +# of things in the environment, which can +# cause tests to fail. Change the seed if +# the tests start failing after changes +# to any stochastic method, function or +# variable. +random.seed(9) + +def test_move_forward(): + d = Direction("up") + l1 = d.move_forward((0, 0)) + assert l1 == (0, -1) + + d = Direction(Direction.R) + l1 = d.move_forward((0, 0)) + assert l1 == (1, 0) + + d = Direction(Direction.D) + l1 = d.move_forward((0, 0)) + assert l1 == (0, 1) + + d = Direction("left") + l1 = d.move_forward((0, 0)) + assert l1 == (-1, 0) + + l2 = d.move_forward((1, 0)) + assert l2 == (0, 0) + + +def test_add(): + d = Direction(Direction.U) + l1 = d + "right" + l2 = d + "left" + assert l1.direction == Direction.R + assert l2.direction == Direction.L + + d = Direction("right") + l1 = d.__add__(Direction.L) + l2 = d.__add__(Direction.R) + assert l1.direction == "up" + assert l2.direction == "down" + + d = Direction("down") + l1 = d.__add__("right") + l2 = d.__add__("left") + assert l1.direction == Direction.L + assert l2.direction == Direction.R + + d = Direction(Direction.L) + l1 = d + Direction.R + l2 = d + Direction.L + assert l1.direction == Direction.U + assert l2.direction == Direction.D + + +def test_RandomAgentProgram(): + # create a list of all the actions a Vacuum cleaner can perform + actions = ['Right', 'Left', 'Suck', 'NoOp'] + # create a program and then an object of the RandomAgentProgram + program = RandomAgentProgram(actions) + + agent = Agent(program) + # create 
an object of TrivialVacuumEnvironment + environment = TrivialVacuumEnvironment() + # add agent to the environment + environment.add_thing(agent) + # run the environment + environment.run() + # check final status of the environment + assert environment.status == {(1, 0): 'Clean', (0, 0): 'Clean'} + + +def test_RandomVacuumAgent(): + # create an object of the RandomVacuumAgent + agent = RandomVacuumAgent() + # create an object of TrivialVacuumEnvironment + environment = TrivialVacuumEnvironment() + # add agent to the environment + environment.add_thing(agent) + # run the environment + environment.run() + # check final status of the environment + assert environment.status == {(1, 0): 'Clean', (0, 0): 'Clean'} + + +def test_TableDrivenAgent(): + random.seed(10) + loc_A, loc_B = (0, 0), (1, 0) + # table defining all the possible states of the agent + table = {((loc_A, 'Clean'),): 'Right', + ((loc_A, 'Dirty'),): 'Suck', + ((loc_B, 'Clean'),): 'Left', + ((loc_B, 'Dirty'),): 'Suck', + ((loc_A, 'Dirty'), (loc_A, 'Clean')): 'Right', + ((loc_A, 'Clean'), (loc_B, 'Dirty')): 'Suck', + ((loc_B, 'Clean'), (loc_A, 'Dirty')): 'Suck', + ((loc_B, 'Dirty'), (loc_B, 'Clean')): 'Left', + ((loc_A, 'Dirty'), (loc_A, 'Clean'), (loc_B, 'Dirty')): 'Suck', + ((loc_B, 'Dirty'), (loc_B, 'Clean'), (loc_A, 'Dirty')): 'Suck'} + + # create a program and then an object of the TableDrivenAgent + program = TableDrivenAgentProgram(table) + agent = Agent(program) + # create an object of TrivialVacuumEnvironment + environment = TrivialVacuumEnvironment() + # initialize some environment status + environment.status = {loc_A: 'Dirty', loc_B: 'Dirty'} + # add agent to the environment + environment.add_thing(agent, location=(1, 0)) + # run the environment a single step at a time to check how the environment evolves under TableDrivenAgentProgram + environment.run(steps=1) + assert environment.status == {(1, 0): 'Clean', (0, 0): 'Dirty'} + + environment.run(steps=1) + assert environment.status == {(1, 0): 
'Clean', (0, 0): 'Dirty'} + + environment.run(steps=1) + assert environment.status == {(1, 0): 'Clean', (0, 0): 'Clean'} + + +def test_ReflexVacuumAgent(): + # create an object of the ReflexVacuumAgent + agent = ReflexVacuumAgent() + # create an object of TrivialVacuumEnvironment + environment = TrivialVacuumEnvironment() + # add agent to the environment + environment.add_thing(agent) + # run the environment + environment.run() + # check final status of the environment + assert environment.status == {(1, 0): 'Clean', (0, 0): 'Clean'} + + +def test_SimpleReflexAgentProgram(): + class Rule: + + def __init__(self, state, action): + self.__state = state + self.action = action + + def matches(self, state): + return self.__state == state + + loc_A = (0, 0) + loc_B = (1, 0) + + # create rules for a two-state Vacuum Environment + rules = [Rule((loc_A, "Dirty"), "Suck"), Rule((loc_A, "Clean"), "Right"), + Rule((loc_B, "Dirty"), "Suck"), Rule((loc_B, "Clean"), "Left")] + + def interpret_input(state): + return state + + # create a program and then an object of the SimpleReflexAgentProgram + program = SimpleReflexAgentProgram(rules, interpret_input) + agent = Agent(program) + # create an object of TrivialVacuumEnvironment + environment = TrivialVacuumEnvironment() + # add agent to the environment + environment.add_thing(agent) + # run the environment + environment.run() + # check final status of the environment + assert environment.status == {(1, 0): 'Clean', (0, 0): 'Clean'} + + +def test_ModelBasedReflexAgentProgram(): + class Rule: + + def __init__(self, state, action): + self.__state = state + self.action = action + + def matches(self, state): + return self.__state == state + + loc_A = (0, 0) + loc_B = (1, 0) + + # create rules for a two-state Vacuum Environment + rules = [Rule((loc_A, "Dirty"), "Suck"), Rule((loc_A, "Clean"), "Right"), + Rule((loc_B, "Dirty"), "Suck"), Rule((loc_B, "Clean"), "Left")] + + def update_state(state, action, percept, transition_model, 
sensor_model): + return percept + + # create a program and then an object of the ModelBasedReflexAgentProgram class + program = ModelBasedReflexAgentProgram(rules, update_state, None, None) + agent = Agent(program) + # create an object of TrivialVacuumEnvironment + environment = TrivialVacuumEnvironment() + # add agent to the environment + environment.add_thing(agent) + # run the environment + environment.run() + # check final status of the environment + assert environment.status == {(1, 0): 'Clean', (0, 0): 'Clean'} + + +def test_ModelBasedVacuumAgent(): + # create an object of the ModelBasedVacuumAgent + agent = ModelBasedVacuumAgent() + # create an object of TrivialVacuumEnvironment + environment = TrivialVacuumEnvironment() + # add agent to the environment + environment.add_thing(agent) + # run the environment + environment.run() + # check final status of the environment + assert environment.status == {(1, 0): 'Clean', (0, 0): 'Clean'} + + +def test_TableDrivenVacuumAgent(): + # create an object of the TableDrivenVacuumAgent + agent = TableDrivenVacuumAgent() + # create an object of the TrivialVacuumEnvironment + environment = TrivialVacuumEnvironment() + # add agent to the environment + environment.add_thing(agent) + # run the environment + environment.run() + # check final status of the environment + assert environment.status == {(1, 0): 'Clean', (0, 0): 'Clean'} + + +def test_compare_agents(): + environment = TrivialVacuumEnvironment + agents = [ModelBasedVacuumAgent, ReflexVacuumAgent] + + result = compare_agents(environment, agents) + performance_ModelBasedVacuumAgent = result[0][1] + performance_ReflexVacuumAgent = result[1][1] + + # The performance of ModelBasedVacuumAgent will be at least as good as that of + # ReflexVacuumAgent, since ModelBasedVacuumAgent can identify when it has + # reached the terminal state (both locations being clean) and will perform + # NoOp leading to 0 performance change, whereas ReflexVacuumAgent cannot + # identify the 
terminal state and thus will keep moving, leading to worse + # performance compared to ModelBasedVacuumAgent. + assert performance_ReflexVacuumAgent <= performance_ModelBasedVacuumAgent + + +def test_TableDrivenAgentProgram(): + table = {(('foo', 1),): 'action1', + (('foo', 2),): 'action2', + (('bar', 1),): 'action3', + (('bar', 2),): 'action1', + (('foo', 1), ('foo', 1),): 'action2', + (('foo', 1), ('foo', 2),): 'action3'} + agent_program = TableDrivenAgentProgram(table) + assert agent_program(('foo', 1)) == 'action1' + assert agent_program(('foo', 2)) == 'action3' + assert agent_program(('invalid percept',)) is None + + +def test_Agent(): + def constant_prog(percept): + return percept + + agent = Agent(constant_prog) + result = agent.program(5) + assert result == 5 + + +def test_VacuumEnvironment(): + # initialize Vacuum Environment + v = VacuumEnvironment(6, 6) + # get an agent + agent = ModelBasedVacuumAgent() + agent.direction = Direction(Direction.R) + v.add_thing(agent, location=(1, 1)) + v.add_thing(Dirt(), location=(2, 1)) + + # check if things are added properly + assert len([x for x in v.things if isinstance(x, Wall)]) == 20 + assert len([x for x in v.things if isinstance(x, Dirt)]) == 1 + + # let the action begin! 
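Several of these tests set `agent.direction = Direction(Direction.R)` and rely on `move_forward` using screen-style coordinates, where "up" *decreases* `y` (as `test_move_forward` asserts). A rough sketch consistent with those assertions, offered as an illustration rather than the actual `agents4e.py` class:

```python
class Direction:
    # sketch: headings in screen coordinates, so 'up' decreases y
    R, L, U, D = 'right', 'left', 'up', 'down'

    def __init__(self, direction):
        self.direction = direction

    def __add__(self, heading):
        # turning 'right' or 'left' cycles through the four headings
        turns = {self.R: {self.R: self.D, self.L: self.U},
                 self.L: {self.R: self.U, self.L: self.D},
                 self.U: {self.R: self.R, self.L: self.L},
                 self.D: {self.R: self.L, self.L: self.R}}
        return Direction(turns[self.direction][heading])

    def move_forward(self, from_location):
        x, y = from_location
        return {self.U: (x, y - 1), self.D: (x, y + 1),
                self.R: (x + 1, y), self.L: (x - 1, y)}[self.direction]
```

Note that the argument to `__add__` is a *relative* turn, not an absolute heading: `Direction("up") + "right"` faces `"right"`, while `Direction("down") + "right"` faces `"left"`, matching `test_add`.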
+ assert v.percept(agent) == ("Clean", "None") + v.execute_action(agent, "Forward") + assert v.percept(agent) == ("Dirty", "None") + v.execute_action(agent, "TurnLeft") + v.execute_action(agent, "Forward") + assert v.percept(agent) == ("Dirty", "Bump") + v.execute_action(agent, "Suck") + assert v.percept(agent) == ("Clean", "None") + old_performance = agent.performance + v.execute_action(agent, "NoOp") + assert old_performance == agent.performance + + +def test_WumpusEnvironment(): + def constant_prog(percept): + return percept + + # initialize Wumpus Environment + w = WumpusEnvironment(constant_prog) + + # check if things are added properly + assert len([x for x in w.things if isinstance(x, Wall)]) == 20 + assert any(map(lambda x: isinstance(x, Gold), w.things)) + assert any(map(lambda x: isinstance(x, Explorer), w.things)) + assert not any(map(lambda x: not isinstance(x, Thing), w.things)) + + # check that gold and wumpus are not present on (1,1) + assert not any(map(lambda x: isinstance(x, Gold) or isinstance(x, WumpusEnvironment), w.list_things_at((1, 1)))) + + # check if w.get_world() segments objects correctly + assert len(w.get_world()) == 6 + for row in w.get_world(): + assert len(row) == 6 + + # start the game! 
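The `Rule` class defined inside `test_SimpleReflexAgentProgram` suggests the shape of the program under test: interpret the percept into a state, find the first matching condition-action rule, and return its action. A hedged sketch of that rule-matching loop (the names mirror the test code, not necessarily the library internals):

```python
class Rule:
    # same shape as the Rule class the tests define
    def __init__(self, state, action):
        self.state = state
        self.action = action

    def matches(self, state):
        return self.state == state


def simple_reflex_agent_program(rules, interpret_input):
    # sketch: condition-action rules applied to the current percept only,
    # with no memory of earlier percepts
    def program(percept):
        state = interpret_input(percept)
        rule = next(r for r in rules if r.matches(state))
        return rule.action

    return program


loc_A, loc_B = (0, 0), (1, 0)
rules = [Rule((loc_A, 'Dirty'), 'Suck'), Rule((loc_A, 'Clean'), 'Right'),
         Rule((loc_B, 'Dirty'), 'Suck'), Rule((loc_B, 'Clean'), 'Left')]
program = simple_reflex_agent_program(rules, lambda percept: percept)
print(program((loc_A, 'Dirty')))  # Suck
print(program((loc_B, 'Clean')))  # Left
```

Because the program is stateless, it can never emit `NoOp` in a clean world; that is exactly the asymmetry `test_compare_agents` exploits when comparing it against the model-based agent.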
+ agent = [x for x in w.things if isinstance(x, Explorer)][0] + gold = [x for x in w.things if isinstance(x, Gold)][0] + pit = [x for x in w.things if isinstance(x, Pit)][0] + + assert not w.is_done() + + # check Walls + agent.location = (1, 2) + percepts = w.percept(agent) + assert len(percepts) == 5 + assert any(map(lambda x: isinstance(x, Bump), percepts[0])) + + # check Gold + agent.location = gold.location + percepts = w.percept(agent) + assert any(map(lambda x: isinstance(x, Glitter), percepts[4])) + agent.location = (gold.location[0], gold.location[1] + 1) + percepts = w.percept(agent) + assert not any(map(lambda x: isinstance(x, Glitter), percepts[4])) + + # check agent death + agent.location = pit.location + assert w.in_danger(agent) + assert not agent.alive + assert agent.killed_by == Pit.__name__ + assert agent.performance == -1000 + + assert w.is_done() + + +def test_WumpusEnvironmentActions(): + random.seed(9) + def constant_prog(percept): + return percept + + # initialize Wumpus Environment + w = WumpusEnvironment(constant_prog) + + agent = [x for x in w.things if isinstance(x, Explorer)][0] + gold = [x for x in w.things if isinstance(x, Gold)][0] + pit = [x for x in w.things if isinstance(x, Pit)][0] + + agent.location = (1, 1) + assert agent.direction.direction == "right" + w.execute_action(agent, 'TurnRight') + assert agent.direction.direction == "down" + w.execute_action(agent, 'TurnLeft') + assert agent.direction.direction == "right" + w.execute_action(agent, 'Forward') + assert agent.location == (2, 1) + + agent.location = gold.location + w.execute_action(agent, 'Grab') + assert agent.holding == [gold] + + agent.location = (1, 1) + w.execute_action(agent, 'Climb') + assert not any(map(lambda x: isinstance(x, Explorer), w.things)) + + assert w.is_done() + + +if __name__ == "__main__": + pytest.main() diff --git a/tests/test_csp.py b/tests/test_csp.py index 5a10f5ce5..a070cd531 100644 --- a/tests/test_csp.py +++ b/tests/test_csp.py @@ -1,26 +1,30 
@@ import pytest +from utils import failure_test from csp import * +import random + +random.seed("aima-python") def test_csp_assign(): var = 10 val = 5 assignment = {} - australia.assign(var, val, assignment) + australia_csp.assign(var, val, assignment) - assert australia.nassigns == 1 + assert australia_csp.nassigns == 1 assert assignment[var] == val def test_csp_unassign(): var = 10 assignment = {var: 5} - australia.unassign(var, assignment) + australia_csp.unassign(var, assignment) assert var not in assignment -def test_csp_nconflits(): +def test_csp_nconflicts(): map_coloring_test = MapColoringCSP(list('RGB'), 'A: B C; B: C; C: ') assignment = {'A': 'R', 'B': 'G'} var = 'C' @@ -63,17 +67,16 @@ def test_csp_result(): def test_csp_goal_test(): map_coloring_test = MapColoringCSP(list('123'), 'A: B C; B: C; C: ') state = (('A', '1'), ('B', '3'), ('C', '2')) - assert map_coloring_test.goal_test(state) is True + assert map_coloring_test.goal_test(state) state = (('A', '1'), ('C', '2')) - assert map_coloring_test.goal_test(state) is False + assert not map_coloring_test.goal_test(state) def test_csp_support_pruning(): map_coloring_test = MapColoringCSP(list('123'), 'A: B C; B: C; C: ') map_coloring_test.support_pruning() - assert map_coloring_test.curr_domains == {'A': ['1', '2', '3'], 'B': ['1', '2', '3'], - 'C': ['1', '2', '3']} + assert map_coloring_test.curr_domains == {'A': ['1', '2', '3'], 'B': ['1', '2', '3'], 'C': ['1', '2', '3']} def test_csp_suppose(): @@ -84,8 +87,7 @@ def test_csp_suppose(): removals = map_coloring_test.suppose(var, value) assert removals == [('A', '2'), ('A', '3')] - assert map_coloring_test.curr_domains == {'A': ['1'], 'B': ['1', '2', '3'], - 'C': ['1', '2', '3']} + assert map_coloring_test.curr_domains == {'A': ['1'], 'B': ['1', '2', '3'], 'C': ['1', '2', '3']} def test_csp_prune(): @@ -96,16 +98,14 @@ def test_csp_prune(): map_coloring_test.support_pruning() map_coloring_test.prune(var, value, removals) - assert 
map_coloring_test.curr_domains == {'A': ['1', '2'], 'B': ['1', '2', '3'], - 'C': ['1', '2', '3']} + assert map_coloring_test.curr_domains == {'A': ['1', '2'], 'B': ['1', '2', '3'], 'C': ['1', '2', '3']} assert removals is None map_coloring_test = MapColoringCSP(list('123'), 'A: B C; B: C; C: ') removals = [('A', '2')] map_coloring_test.support_pruning() map_coloring_test.prune(var, value, removals) - assert map_coloring_test.curr_domains == {'A': ['1', '2'], 'B': ['1', '2', '3'], - 'C': ['1', '2', '3']} + assert map_coloring_test.curr_domains == {'A': ['1', '2'], 'B': ['1', '2', '3'], 'C': ['1', '2', '3']} assert removals == [('A', '2'), ('A', '3')] @@ -121,9 +121,9 @@ def test_csp_choices(): assert map_coloring_test.choices(var) == ['1', '2'] -def test_csp_infer_assignement(): +def test_csp_infer_assignment(): map_coloring_test = MapColoringCSP(list('123'), 'A: B C; B: C; C: ') - map_coloring_test.infer_assignment() == {} + assert map_coloring_test.infer_assignment() == {} var = 'A' value = '3' @@ -131,7 +131,7 @@ def test_csp_infer_assignement(): value = '1' map_coloring_test.prune(var, value, None) - map_coloring_test.infer_assignment() == {'A': '2'} + assert map_coloring_test.infer_assignment() == {'A': '2'} def test_csp_restore(): @@ -141,8 +141,7 @@ def test_csp_restore(): map_coloring_test.restore(removals) - assert map_coloring_test.curr_domains == {'A': ['2', '3', '1'], 'B': ['1', '2', '3'], - 'C': ['2', '3']} + assert map_coloring_test.curr_domains == {'A': ['2', '3', '1'], 'B': ['1', '2', '3'], 'C': ['2', '3']} def test_csp_conflicted_vars(): @@ -169,7 +168,7 @@ def test_csp_conflicted_vars(): def test_revise(): neighbors = parse_neighbors('A: B; B: ') domains = {'A': [0], 'B': [4]} - constraints = lambda X, x, Y, y: x % 2 == 0 and (x+y) == 4 + constraints = lambda X, x, Y, y: x % 2 == 0 and (x + y) == 4 csp = CSP(variables=None, domains=domains, neighbors=neighbors, constraints=constraints) csp.support_pruning() @@ -177,35 +176,98 @@ def test_revise(): 
Xj = 'B' removals = [] - assert revise(csp, Xi, Xj, removals) is False + consistency, _ = revise(csp, Xi, Xj, removals) + assert not consistency assert len(removals) == 0 domains = {'A': [0, 1, 2, 3, 4], 'B': [0, 1, 2, 3, 4]} csp = CSP(variables=None, domains=domains, neighbors=neighbors, constraints=constraints) csp.support_pruning() - assert revise(csp, Xi, Xj, removals) is True + assert revise(csp, Xi, Xj, removals) assert removals == [('A', 1), ('A', 3)] def test_AC3(): neighbors = parse_neighbors('A: B; B: ') domains = {'A': [0, 1, 2, 3, 4], 'B': [0, 1, 2, 3, 4]} - constraints = lambda X, x, Y, y: x % 2 == 0 and (x+y) == 4 and y % 2 != 0 + constraints = lambda X, x, Y, y: x % 2 == 0 and x + y == 4 and y % 2 != 0 + removals = [] + + csp = CSP(variables=None, domains=domains, neighbors=neighbors, constraints=constraints) + + consistency, _ = AC3(csp, removals=removals) + assert not consistency + + constraints = lambda X, x, Y, y: x % 2 == 0 and x + y == 4 + removals = [] + csp = CSP(variables=None, domains=domains, neighbors=neighbors, constraints=constraints) + + assert AC3(csp, removals=removals) + assert (removals == [('A', 1), ('A', 3), ('B', 1), ('B', 3)] or + removals == [('B', 1), ('B', 3), ('A', 1), ('A', 3)]) + + domains = {'A': [2, 4], 'B': [3, 5]} + constraints = lambda X, x, Y, y: (X == 'A' and Y == 'B') or (X == 'B' and Y == 'A') and x > y + removals = [] + csp = CSP(variables=None, domains=domains, neighbors=neighbors, constraints=constraints) + + assert AC3(csp, removals=removals) + + +def test_AC3b(): + neighbors = parse_neighbors('A: B; B: ') + domains = {'A': [0, 1, 2, 3, 4], 'B': [0, 1, 2, 3, 4]} + constraints = lambda X, x, Y, y: x % 2 == 0 and x + y == 4 and y % 2 != 0 + removals = [] + + csp = CSP(variables=None, domains=domains, neighbors=neighbors, constraints=constraints) + + consistency, _ = AC3b(csp, removals=removals) + assert not consistency + + constraints = lambda X, x, Y, y: x % 2 == 0 and x + y == 4 + removals = [] + csp = 
CSP(variables=None, domains=domains, neighbors=neighbors, constraints=constraints) + + assert AC3b(csp, removals=removals) + assert (removals == [('A', 1), ('A', 3), ('B', 1), ('B', 3)] or + removals == [('B', 1), ('B', 3), ('A', 1), ('A', 3)]) + + domains = {'A': [2, 4], 'B': [3, 5]} + constraints = lambda X, x, Y, y: (X == 'A' and Y == 'B') or (X == 'B' and Y == 'A') and x > y + removals = [] + csp = CSP(variables=None, domains=domains, neighbors=neighbors, constraints=constraints) + + assert AC3b(csp, removals=removals) + + +def test_AC4(): + neighbors = parse_neighbors('A: B; B: ') + domains = {'A': [0, 1, 2, 3, 4], 'B': [0, 1, 2, 3, 4]} + constraints = lambda X, x, Y, y: x % 2 == 0 and x + y == 4 and y % 2 != 0 removals = [] csp = CSP(variables=None, domains=domains, neighbors=neighbors, constraints=constraints) - assert AC3(csp, removals=removals) is False + consistency, _ = AC4(csp, removals=removals) + assert not consistency - constraints = lambda X, x, Y, y: (x % 2) == 0 and (x+y) == 4 + constraints = lambda X, x, Y, y: x % 2 == 0 and x + y == 4 removals = [] csp = CSP(variables=None, domains=domains, neighbors=neighbors, constraints=constraints) - assert AC3(csp, removals=removals) is True + assert AC4(csp, removals=removals) assert (removals == [('A', 1), ('A', 3), ('B', 1), ('B', 3)] or removals == [('B', 1), ('B', 3), ('A', 1), ('A', 3)]) + domains = {'A': [2, 4], 'B': [3, 5]} + constraints = lambda X, x, Y, y: (X == 'A' and Y == 'B') or (X == 'B' and Y == 'A') and x > y + removals = [] + csp = CSP(variables=None, domains=domains, neighbors=neighbors, constraints=constraints) + + assert AC4(csp, removals=removals) + def test_first_unassigned_variable(): map_coloring_test = MapColoringCSP(list('123'), 'A: B C; B: C; C: ') @@ -235,7 +297,7 @@ def test_num_legal_values(): def test_mrv(): neighbors = parse_neighbors('A: B; B: C; C: ') domains = {'A': [0, 1, 2, 3, 4], 'B': [4], 'C': [0, 1, 2, 3, 4]} - constraints = lambda X, x, Y, y: x % 2 == 0 and (x+y) == 
4 + constraints = lambda X, x, Y, y: x % 2 == 0 and x + y == 4 csp = CSP(variables=None, domains=domains, neighbors=neighbors, constraints=constraints) assignment = {'A': 0} @@ -257,13 +319,13 @@ def test_mrv(): def test_unordered_domain_values(): map_coloring_test = MapColoringCSP(list('123'), 'A: B C; B: C; C: ') assignment = None - assert unordered_domain_values('A', assignment, map_coloring_test) == ['1', '2', '3'] + assert unordered_domain_values('A', assignment, map_coloring_test) == ['1', '2', '3'] def test_lcv(): neighbors = parse_neighbors('A: B; B: C; C: ') domains = {'A': [0, 1, 2, 3, 4], 'B': [0, 1, 2, 3, 4, 5], 'C': [0, 1, 2, 3, 4]} - constraints = lambda X, x, Y, y: x % 2 == 0 and (x+y) == 4 + constraints = lambda X, x, Y, y: x % 2 == 0 and (x + y) == 4 csp = CSP(variables=None, domains=domains, neighbors=neighbors, constraints=constraints) assignment = {'A': 0} @@ -291,52 +353,105 @@ def test_forward_checking(): var = 'B' value = 3 assignment = {'A': 1, 'C': '3'} - assert forward_checking(csp, var, value, assignment, None) == True + assert forward_checking(csp, var, value, assignment, None) assert csp.curr_domains['A'] == A_curr_domains assert csp.curr_domains['C'] == C_curr_domains assignment = {'C': 3} - assert forward_checking(csp, var, value, assignment, None) == True + assert forward_checking(csp, var, value, assignment, None) assert csp.curr_domains['A'] == [1, 3] csp = CSP(variables=None, domains=domains, neighbors=neighbors, constraints=constraints) csp.support_pruning() assignment = {} - assert forward_checking(csp, var, value, assignment, None) == True + assert forward_checking(csp, var, value, assignment, None) assert csp.curr_domains['A'] == [1, 3] assert csp.curr_domains['C'] == [1, 3] csp = CSP(variables=None, domains=domains, neighbors=neighbors, constraints=constraints) - domains = {'A': [0, 1, 2, 3, 4], 'B': [0, 1, 2, 3, 4, 7], 'C': [0, 1, 2, 3, 4]} csp.support_pruning() value = 7 assignment = {} - assert forward_checking(csp, var, 
value, assignment, None) == False + assert not forward_checking(csp, var, value, assignment, None) assert (csp.curr_domains['A'] == [] or csp.curr_domains['C'] == []) def test_backtracking_search(): - assert backtracking_search(australia) - assert backtracking_search(australia, select_unassigned_variable=mrv) - assert backtracking_search(australia, order_domain_values=lcv) - assert backtracking_search(australia, select_unassigned_variable=mrv, - order_domain_values=lcv) - assert backtracking_search(australia, inference=forward_checking) - assert backtracking_search(australia, inference=mac) - assert backtracking_search(usa, select_unassigned_variable=mrv, - order_domain_values=lcv, inference=mac) + assert backtracking_search(australia_csp) + assert backtracking_search(australia_csp, select_unassigned_variable=mrv) + assert backtracking_search(australia_csp, order_domain_values=lcv) + assert backtracking_search(australia_csp, select_unassigned_variable=mrv, order_domain_values=lcv) + assert backtracking_search(australia_csp, inference=forward_checking) + assert backtracking_search(australia_csp, inference=mac) + assert backtracking_search(usa_csp, select_unassigned_variable=mrv, order_domain_values=lcv, inference=mac) def test_min_conflicts(): - random.seed("aima-python") - assert min_conflicts(australia) - assert min_conflicts(usa) - assert min_conflicts(france) + assert min_conflicts(australia_csp) + assert min_conflicts(france_csp) + + tests = [(usa_csp, None)] * 3 + assert failure_test(min_conflicts, tests) >= 1 / 3 + australia_impossible = MapColoringCSP(list('RG'), 'SA: WA NT Q NSW V; NT: WA Q; NSW: Q V; T: ') assert min_conflicts(australia_impossible, 1000) is None + assert min_conflicts(NQueensCSP(2), 1000) is None + assert min_conflicts(NQueensCSP(3), 1000) is None + + +def test_nqueens_csp(): + csp = NQueensCSP(8) + + assignment = {0: 0, 1: 1, 2: 2, 3: 3, 4: 4} + csp.assign(5, 5, assignment) + assert len(assignment) == 6 + csp.assign(6, 6, assignment) + 
assert len(assignment) == 7 + csp.assign(7, 7, assignment) + assert len(assignment) == 8 + assert assignment[5] == 5 + assert assignment[6] == 6 + assert assignment[7] == 7 + assert csp.nconflicts(3, 2, assignment) == 0 + assert csp.nconflicts(3, 3, assignment) == 0 + assert csp.nconflicts(1, 5, assignment) == 1 + assert csp.nconflicts(7, 5, assignment) == 2 + csp.unassign(1, assignment) + csp.unassign(2, assignment) + csp.unassign(3, assignment) + assert 1 not in assignment + assert 2 not in assignment + assert 3 not in assignment + + assignment = {0: 0, 1: 1, 2: 4, 3: 1, 4: 6} + csp.assign(5, 7, assignment) + assert len(assignment) == 6 + csp.assign(6, 6, assignment) + assert len(assignment) == 7 + csp.assign(7, 2, assignment) + assert len(assignment) == 8 + assert assignment[5] == 7 + assert assignment[6] == 6 + assert assignment[7] == 2 + assignment = {0: 0, 1: 1, 2: 4, 3: 1, 4: 6, 5: 7, 6: 6, 7: 2} + assert csp.nconflicts(7, 7, assignment) == 4 + assert csp.nconflicts(3, 4, assignment) == 0 + assert csp.nconflicts(2, 6, assignment) == 2 + assert csp.nconflicts(5, 5, assignment) == 3 + csp.unassign(4, assignment) + csp.unassign(5, assignment) + csp.unassign(6, assignment) + assert 4 not in assignment + assert 5 not in assignment + assert 6 not in assignment + + for n in range(5, 9): + csp = NQueensCSP(n) + solution = min_conflicts(csp) + assert not solution or sorted(solution.values()) == list(range(n)) def test_universal_dict(): @@ -350,10 +465,10 @@ def test_parse_neighbours(): def test_topological_sort(): root = 'NT' - Sort, Parents = topological_sort(australia,root) - - assert Sort == ['NT','SA','Q','NSW','V','WA'] - assert Parents['NT'] == None + Sort, Parents = topological_sort(australia_csp, root) + + assert Sort == ['NT', 'SA', 'Q', 'NSW', 'V', 'WA'] + assert Parents['NT'] is None assert Parents['SA'] == 'NT' assert Parents['Q'] == 'SA' assert Parents['NSW'] == 'Q' @@ -362,12 +477,186 @@ def test_topological_sort(): def test_tree_csp_solver(): - 
australia_small = MapColoringCSP(list('RB'), - 'NT: WA Q; NSW: Q V') + australia_small = MapColoringCSP(list('RB'), 'NT: WA Q; NSW: Q V') tcs = tree_csp_solver(australia_small) assert (tcs['NT'] == 'R' and tcs['WA'] == 'B' and tcs['Q'] == 'B' and tcs['NSW'] == 'R' and tcs['V'] == 'B') or \ (tcs['NT'] == 'B' and tcs['WA'] == 'R' and tcs['Q'] == 'R' and tcs['NSW'] == 'B' and tcs['V'] == 'R') +def test_ac_solver(): + assert ac_solver(csp_crossword) == {'one_across': 'has', + 'one_down': 'hold', + 'two_down': 'syntax', + 'three_across': 'land', + 'four_across': 'ant'} or {'one_across': 'bus', + 'one_down': 'buys', + 'two_down': 'search', + 'three_across': 'year', + 'four_across': 'car'} + assert ac_solver(two_two_four) == {'T': 7, 'F': 1, 'W': 6, 'O': 5, 'U': 3, 'R': 0, 'C1': 1, 'C2': 1, 'C3': 1} or \ + {'T': 9, 'F': 1, 'W': 2, 'O': 8, 'U': 5, 'R': 6, 'C1': 1, 'C2': 0, 'C3': 1} + assert ac_solver(send_more_money) == \ + {'S': 9, 'M': 1, 'E': 5, 'N': 6, 'D': 7, 'O': 0, 'R': 8, 'Y': 2, 'C1': 1, 'C2': 1, 'C3': 0, 'C4': 1} + + +def test_ac_search_solver(): + assert ac_search_solver(csp_crossword) == {'one_across': 'has', + 'one_down': 'hold', + 'two_down': 'syntax', + 'three_across': 'land', + 'four_across': 'ant'} or {'one_across': 'bus', + 'one_down': 'buys', + 'two_down': 'search', + 'three_across': 'year', + 'four_across': 'car'} + assert ac_search_solver(two_two_four) == {'T': 7, 'F': 1, 'W': 6, 'O': 5, 'U': 3, 'R': 0, + 'C1': 1, 'C2': 1, 'C3': 1} or \ + {'T': 9, 'F': 1, 'W': 2, 'O': 8, 'U': 5, 'R': 6, 'C1': 1, 'C2': 0, 'C3': 1} + assert ac_search_solver(send_more_money) == {'S': 9, 'M': 1, 'E': 5, 'N': 6, 'D': 7, 'O': 0, 'R': 8, 'Y': 2, + 'C1': 1, 'C2': 1, 'C3': 0, 'C4': 1} + + +def test_different_values_constraint(): + assert different_values_constraint('A', 1, 'B', 2) + assert not different_values_constraint('A', 1, 'B', 1) + + +def test_flatten(): + sequence = [[0, 1, 2], [4, 5]] + assert flatten(sequence) == [0, 1, 2, 4, 5] + + +def test_sudoku(): + h = 
Sudoku(easy1) + assert backtracking_search(h, select_unassigned_variable=mrv, inference=forward_checking) is not None + g = Sudoku(harder1) + assert backtracking_search(g, select_unassigned_variable=mrv, inference=forward_checking) is not None + + +def test_make_arc_consistent(): + neighbors = parse_neighbors('A: B; B: ') + domains = {'A': [0], 'B': [3]} + constraints = lambda X, x, Y, y: x % 2 == 0 and (x + y) == 4 + + csp = CSP(variables=None, domains=domains, neighbors=neighbors, constraints=constraints) + csp.support_pruning() + Xi = 'A' + Xj = 'B' + + assert make_arc_consistent(Xi, Xj, csp) == [] + + domains = {'A': [0], 'B': [4]} + constraints = lambda X, x, Y, y: x % 2 == 0 and (x + y) == 4 + + csp = CSP(variables=None, domains=domains, neighbors=neighbors, constraints=constraints) + csp.support_pruning() + Xi = 'A' + Xj = 'B' + + assert make_arc_consistent(Xi, Xj, csp) == [0] + + domains = {'A': [0, 1, 2, 3, 4], 'B': [0, 1, 2, 3, 4]} + csp = CSP(variables=None, domains=domains, neighbors=neighbors, constraints=constraints) + csp.support_pruning() + + assert make_arc_consistent(Xi, Xj, csp) == [0, 2, 4] + + +def test_assign_value(): + neighbors = parse_neighbors('A: B; B: ') + domains = {'A': [0, 1, 2, 3, 4], 'B': [0, 1, 2, 3, 4]} + constraints = lambda X, x, Y, y: x % 2 == 0 and (x + y) == 4 + Xi = 'A' + Xj = 'B' + + csp = CSP(variables=None, domains=domains, neighbors=neighbors, constraints=constraints) + csp.support_pruning() + + assignment = {'A': 1} + assert assign_value(Xi, Xj, csp, assignment) is None + + assignment = {'A': 2} + assert assign_value(Xi, Xj, csp, assignment) == 2 + + constraints = lambda X, x, Y, y: (x + y) == 4 + csp = CSP(variables=None, domains=domains, neighbors=neighbors, constraints=constraints) + csp.support_pruning() + + assignment = {'A': 1} + assert assign_value(Xi, Xj, csp, assignment) == 3 + + +def test_no_inference(): + neighbors = parse_neighbors('A: B; B: ') + domains = {'A': [0, 1, 2, 3, 4], 'B': [0, 1, 2, 3, 4, 5]} + 
constraints = lambda X, x, Y, y: (x + y) < 8 + csp = CSP(variables=None, domains=domains, neighbors=neighbors, constraints=constraints) + + var = 'B' + value = 3 + assignment = {'A': 1} + assert no_inference(csp, var, value, assignment, None) + + +def test_mac(): + neighbors = parse_neighbors('A: B; B: ') + domains = {'A': [0], 'B': [0]} + constraints = lambda X, x, Y, y: x % 2 == 0 + var = 'B' + value = 0 + assignment = {'A': 0} + + csp = CSP(variables=None, domains=domains, neighbors=neighbors, constraints=constraints) + assert mac(csp, var, value, assignment, None) + + neighbors = parse_neighbors('A: B; B: ') + domains = {'A': [0, 1, 2, 3, 4], 'B': [0, 1, 2, 3, 4]} + constraints = lambda X, x, Y, y: x % 2 == 0 and (x + y) == 4 and y % 2 != 0 + var = 'B' + value = 3 + assignment = {'A': 1} + + csp = CSP(variables=None, domains=domains, neighbors=neighbors, constraints=constraints) + consistency, _ = mac(csp, var, value, assignment, None) + assert not consistency + + constraints = lambda X, x, Y, y: x % 2 != 0 and (x + y) == 6 and y % 2 != 0 + csp = CSP(variables=None, domains=domains, neighbors=neighbors, constraints=constraints) + _, consistency = mac(csp, var, value, assignment, None) + assert consistency + + +def test_queen_constraint(): + assert queen_constraint(0, 1, 0, 1) + assert queen_constraint(2, 1, 4, 2) + assert not queen_constraint(2, 1, 3, 2) + + +def test_zebra(): + z = Zebra() + algorithm = min_conflicts + # would take very long + ans = algorithm(z, max_steps=10000) + assert ans is None or ans == {'Red': 3, 'Yellow': 1, 'Blue': 2, 'Green': 5, 'Ivory': 4, 'Dog': 4, 'Fox': 1, + 'Snails': 3, 'Horse': 2, 'Zebra': 5, 'OJ': 4, 'Tea': 2, 'Coffee': 5, 'Milk': 3, + 'Water': 1, 'Englishman': 3, 'Spaniard': 4, 'Norwegian': 1, 'Ukranian': 2, + 'Japanese': 5, 'Kools': 1, 'Chesterfields': 2, 'Winston': 3, 'LuckyStrike': 4, + 'Parliaments': 5} + + # restrict search space + z.domains = {'Red': [3, 4], 'Yellow': [1, 2], 'Blue': [1, 2], 'Green': [4, 5], 'Ivory': 
[4, 5], 'Dog': [4, 5], + 'Fox': [1, 2], 'Snails': [3], 'Horse': [2], 'Zebra': [5], 'OJ': [1, 2, 3, 4, 5], + 'Tea': [1, 2, 3, 4, 5], 'Coffee': [1, 2, 3, 4, 5], 'Milk': [3], 'Water': [1, 2, 3, 4, 5], + 'Englishman': [1, 2, 3, 4, 5], 'Spaniard': [1, 2, 3, 4, 5], 'Norwegian': [1], + 'Ukranian': [1, 2, 3, 4, 5], 'Japanese': [1, 2, 3, 4, 5], 'Kools': [1, 2, 3, 4, 5], + 'Chesterfields': [1, 2, 3, 4, 5], 'Winston': [1, 2, 3, 4, 5], 'LuckyStrike': [1, 2, 3, 4, 5], + 'Parliaments': [1, 2, 3, 4, 5]} + ans = algorithm(z, max_steps=10000) + assert ans == {'Red': 3, 'Yellow': 1, 'Blue': 2, 'Green': 5, 'Ivory': 4, 'Dog': 4, 'Fox': 1, 'Snails': 3, + 'Horse': 2, 'Zebra': 5, 'OJ': 4, 'Tea': 2, 'Coffee': 5, 'Milk': 3, 'Water': 1, 'Englishman': 3, + 'Spaniard': 4, 'Norwegian': 1, 'Ukranian': 2, 'Japanese': 5, 'Kools': 1, 'Chesterfields': 2, + 'Winston': 3, 'LuckyStrike': 4, 'Parliaments': 5} + + if __name__ == "__main__": pytest.main() diff --git a/tests/test_deep_learning4e.py b/tests/test_deep_learning4e.py new file mode 100644 index 000000000..34676b02b --- /dev/null +++ b/tests/test_deep_learning4e.py @@ -0,0 +1,80 @@ +import pytest +from keras.datasets import imdb + +from deep_learning4e import * +from learning4e import DataSet, grade_learner, err_ratio + +random.seed("aima-python") + +iris_tests = [([5.0, 3.1, 0.9, 0.1], 0), + ([5.1, 3.5, 1.0, 0.0], 0), + ([4.9, 3.3, 1.1, 0.1], 0), + ([6.0, 3.0, 4.0, 1.1], 1), + ([6.1, 2.2, 3.5, 1.0], 1), + ([5.9, 2.5, 3.3, 1.1], 1), + ([7.5, 4.1, 6.2, 2.3], 2), + ([7.3, 4.0, 6.1, 2.4], 2), + ([7.0, 3.3, 6.1, 2.5], 2)] + + +def test_neural_net(): + iris = DataSet(name='iris') + classes = ['setosa', 'versicolor', 'virginica'] + iris.classes_to_numbers(classes) + n_samples, n_features = len(iris.examples), iris.target + + X, y = (np.array([x[:n_features] for x in iris.examples]), + np.array([x[n_features] for x in iris.examples])) + + nnl_gd = NeuralNetworkLearner(iris, [4], l_rate=0.15, epochs=100, optimizer=stochastic_gradient_descent).fit(X, y) 
+ assert grade_learner(nnl_gd, iris_tests) > 0.7 + assert err_ratio(nnl_gd, iris) < 0.15 + + nnl_adam = NeuralNetworkLearner(iris, [4], l_rate=0.001, epochs=200, optimizer=adam).fit(X, y) + assert grade_learner(nnl_adam, iris_tests) > 0.7 + assert err_ratio(nnl_adam, iris) < 0.15 + + +def test_perceptron(): + iris = DataSet(name='iris') + classes = ['setosa', 'versicolor', 'virginica'] + iris.classes_to_numbers(classes) + n_samples, n_features = len(iris.examples), iris.target + + X, y = (np.array([x[:n_features] for x in iris.examples]), + np.array([x[n_features] for x in iris.examples])) + + pl_gd = PerceptronLearner(iris, l_rate=0.01, epochs=100, optimizer=stochastic_gradient_descent).fit(X, y) + assert grade_learner(pl_gd, iris_tests) == 1 + assert err_ratio(pl_gd, iris) < 0.2 + + pl_adam = PerceptronLearner(iris, l_rate=0.01, epochs=100, optimizer=adam).fit(X, y) + assert grade_learner(pl_adam, iris_tests) == 1 + assert err_ratio(pl_adam, iris) < 0.2 + + +def test_rnn(): + data = imdb.load_data(num_words=5000) + + train, val, test = keras_dataset_loader(data) + train = (train[0][:1000], train[1][:1000]) + val = (val[0][:200], val[1][:200]) + + rnn = SimpleRNNLearner(train, val) + score = rnn.evaluate(test[0][:200], test[1][:200], verbose=False) + assert score[1] >= 0.2 + + +def test_autoencoder(): + iris = DataSet(name='iris') + classes = ['setosa', 'versicolor', 'virginica'] + iris.classes_to_numbers(classes) + inputs = np.asarray(iris.examples) + + al = AutoencoderLearner(inputs, 100) + print(inputs[0]) + print(al.predict(inputs[:1])) + + +if __name__ == "__main__": + pytest.main() diff --git a/tests/test_games.py b/tests/test_games.py index b5c30ee67..b7541ee93 100644 --- a/tests/test_games.py +++ b/tests/test_games.py @@ -1,18 +1,21 @@ +import pytest + from games import * # Creating the game instances f52 = Fig52Game() ttt = TicTacToe() +random.seed("aima-python") + -def gen_state(to_move='X', x_positions=[], o_positions=[], h=3, v=3, k=3): +def 
gen_state(to_move='X', x_positions=[], o_positions=[], h=3, v=3): """Given whose turn it is to move, the positions of X's on the board, the positions of O's on the board, and, (optionally) number of rows, columns and how many consecutive X's or O's required to win, return the corresponding game state""" - moves = set([(x, y) for x in range(1, h + 1) for y in range(1, v + 1)]) \ - - set(x_positions) - set(o_positions) + moves = set([(x, y) for x in range(1, h + 1) for y in range(1, v + 1)]) - set(x_positions) - set(o_positions) moves = list(moves) board = {} for pos in x_positions: @@ -22,41 +25,45 @@ def gen_state(to_move='X', x_positions=[], o_positions=[], h=3, v=3, k=3): return GameState(to_move=to_move, utility=0, board=board, moves=moves) -def test_minimax_decision(): - assert minimax_decision('A', f52) == 'a1' - assert minimax_decision('B', f52) == 'b1' - assert minimax_decision('C', f52) == 'c1' - assert minimax_decision('D', f52) == 'd3' +def test_minmax_decision(): + assert minmax_decision('A', f52) == 'a1' + assert minmax_decision('B', f52) == 'b1' + assert minmax_decision('C', f52) == 'c1' + assert minmax_decision('D', f52) == 'd3' -def test_alphabeta_search(): - assert alphabeta_search('A', f52) == 'a1' - assert alphabeta_search('B', f52) == 'b1' - assert alphabeta_search('C', f52) == 'c1' - assert alphabeta_search('D', f52) == 'd3' +def test_alpha_beta_search(): + assert alpha_beta_search('A', f52) == 'a1' + assert alpha_beta_search('B', f52) == 'b1' + assert alpha_beta_search('C', f52) == 'c1' + assert alpha_beta_search('D', f52) == 'd3' state = gen_state(to_move='X', x_positions=[(1, 1), (3, 3)], o_positions=[(1, 2), (3, 2)]) - assert alphabeta_search(state, ttt) == (2, 2) + assert alpha_beta_search(state, ttt) == (2, 2) state = gen_state(to_move='O', x_positions=[(1, 1), (3, 1), (3, 3)], o_positions=[(1, 2), (3, 2)]) - assert alphabeta_search(state, ttt) == (2, 2) + assert alpha_beta_search(state, ttt) == (2, 2) state = gen_state(to_move='O', 
x_positions=[(1, 1)], o_positions=[]) - assert alphabeta_search(state, ttt) == (2, 2) + assert alpha_beta_search(state, ttt) == (2, 2) state = gen_state(to_move='X', x_positions=[(1, 1), (3, 1)], o_positions=[(2, 2), (3, 1)]) - assert alphabeta_search(state, ttt) == (1, 3) + assert alpha_beta_search(state, ttt) == (1, 3) def test_random_tests(): - assert Fig52Game().play_game(alphabeta_player, alphabeta_player) == 3 + assert Fig52Game().play_game(alpha_beta_player, alpha_beta_player) == 3 # The player 'X' (one who plays first) in TicTacToe never loses: - assert ttt.play_game(alphabeta_player, alphabeta_player) >= 0 + assert ttt.play_game(alpha_beta_player, alpha_beta_player) >= 0 # The player 'X' (one who plays first) in TicTacToe never loses: - assert ttt.play_game(alphabeta_player, random_player) >= 0 + assert ttt.play_game(alpha_beta_player, random_player) >= 0 + + +if __name__ == "__main__": + pytest.main() diff --git a/tests/test_games4e.py b/tests/test_games4e.py new file mode 100644 index 000000000..7dfa47f11 --- /dev/null +++ b/tests/test_games4e.py @@ -0,0 +1,96 @@ +import pytest + +from games4e import * + +# Creating the game instances +f52 = Fig52Game() +ttt = TicTacToe() +con4 = ConnectFour() + +random.seed("aima-python") + + +def gen_state(to_move='X', x_positions=[], o_positions=[], h=3, v=3): + """Given whose turn it is to move, the positions of X's on the board, the + positions of O's on the board, and, (optionally) number of rows, columns + and how many consecutive X's or O's required to win, return the corresponding + game state""" + + moves = set([(x, y) for x in range(1, h + 1) for y in range(1, v + 1)]) - set(x_positions) - set(o_positions) + moves = list(moves) + board = {} + for pos in x_positions: + board[pos] = 'X' + for pos in o_positions: + board[pos] = 'O' + return GameState(to_move=to_move, utility=0, board=board, moves=moves) + + +def test_minmax_decision(): + assert minmax_decision('A', f52) == 'a1' + assert minmax_decision('B', f52) 
== 'b1' + assert minmax_decision('C', f52) == 'c1' + assert minmax_decision('D', f52) == 'd3' + + +def test_alpha_beta_search(): + assert alpha_beta_search('A', f52) == 'a1' + assert alpha_beta_search('B', f52) == 'b1' + assert alpha_beta_search('C', f52) == 'c1' + assert alpha_beta_search('D', f52) == 'd3' + + state = gen_state(to_move='X', x_positions=[(1, 1), (3, 3)], + o_positions=[(1, 2), (3, 2)]) + assert alpha_beta_search(state, ttt) == (2, 2) + + state = gen_state(to_move='O', x_positions=[(1, 1), (3, 1), (3, 3)], + o_positions=[(1, 2), (3, 2)]) + assert alpha_beta_search(state, ttt) == (2, 2) + + state = gen_state(to_move='O', x_positions=[(1, 1)], + o_positions=[]) + assert alpha_beta_search(state, ttt) == (2, 2) + + state = gen_state(to_move='X', x_positions=[(1, 1), (3, 1)], + o_positions=[(2, 2), (3, 1)]) + assert alpha_beta_search(state, ttt) == (1, 3) + + +def test_monte_carlo_tree_search(): + state = gen_state(to_move='X', x_positions=[(1, 1), (3, 3)], + o_positions=[(1, 2), (3, 2)]) + assert monte_carlo_tree_search(state, ttt) == (2, 2) + + state = gen_state(to_move='O', x_positions=[(1, 1), (3, 1), (3, 3)], + o_positions=[(1, 2), (3, 2)]) + assert monte_carlo_tree_search(state, ttt) == (2, 2) + + # uncomment the following when removing the 3rd edition + # state = gen_state(to_move='O', x_positions=[(1, 1)], + # o_positions=[]) + # assert monte_carlo_tree_search(state, ttt) == (2, 2) + + state = gen_state(to_move='X', x_positions=[(1, 1), (3, 1)], + o_positions=[(2, 2), (3, 1)]) + assert monte_carlo_tree_search(state, ttt) == (1, 3) + + # should never lose to a random or alpha_beta player in a ttt game + assert ttt.play_game(mcts_player, random_player) >= 0 + assert ttt.play_game(mcts_player, alpha_beta_player) >= 0 + + # should never lose to a random player in a connect four game + assert con4.play_game(mcts_player, random_player) >= 0 + + +def test_random_tests(): + assert Fig52Game().play_game(alpha_beta_player, alpha_beta_player) == 3 + + # The 
player 'X' (one who plays first) in TicTacToe never loses: + assert ttt.play_game(alpha_beta_player, alpha_beta_player) >= 0 + + # The player 'X' (one who plays first) in TicTacToe never loses: + assert ttt.play_game(alpha_beta_player, random_player) >= 0 + + +if __name__ == "__main__": + pytest.main() diff --git a/tests/test_knowledge.py b/tests/test_knowledge.py index 89fe479a0..d3829de02 100644 --- a/tests/test_knowledge.py +++ b/tests/test_knowledge.py @@ -1,35 +1,76 @@ +import pytest + from knowledge import * from utils import expr import random random.seed("aima-python") +party = [ + {'Pizza': 'Yes', 'Soda': 'No', 'GOAL': True}, + {'Pizza': 'Yes', 'Soda': 'Yes', 'GOAL': True}, + {'Pizza': 'No', 'Soda': 'No', 'GOAL': False}] + +animals_umbrellas = [ + {'Species': 'Cat', 'Rain': 'Yes', 'Coat': 'No', 'GOAL': True}, + {'Species': 'Cat', 'Rain': 'Yes', 'Coat': 'Yes', 'GOAL': True}, + {'Species': 'Dog', 'Rain': 'Yes', 'Coat': 'Yes', 'GOAL': True}, + {'Species': 'Dog', 'Rain': 'Yes', 'Coat': 'No', 'GOAL': False}, + {'Species': 'Dog', 'Rain': 'No', 'Coat': 'No', 'GOAL': False}, + {'Species': 'Cat', 'Rain': 'No', 'Coat': 'No', 'GOAL': False}, + {'Species': 'Cat', 'Rain': 'No', 'Coat': 'Yes', 'GOAL': True}] + +conductance = [ + {'Sample': 'S1', 'Mass': 12, 'Temp': 26, 'Material': 'Cu', 'Size': 3, 'GOAL': 0.59}, + {'Sample': 'S1', 'Mass': 12, 'Temp': 100, 'Material': 'Cu', 'Size': 3, 'GOAL': 0.57}, + {'Sample': 'S2', 'Mass': 24, 'Temp': 26, 'Material': 'Cu', 'Size': 6, 'GOAL': 0.59}, + {'Sample': 'S3', 'Mass': 12, 'Temp': 26, 'Material': 'Pb', 'Size': 2, 'GOAL': 0.05}, + {'Sample': 'S3', 'Mass': 12, 'Temp': 100, 'Material': 'Pb', 'Size': 2, 'GOAL': 0.04}, + {'Sample': 'S4', 'Mass': 18, 'Temp': 100, 'Material': 'Pb', 'Size': 3, 'GOAL': 0.04}, + {'Sample': 'S4', 'Mass': 18, 'Temp': 100, 'Material': 'Pb', 'Size': 3, 'GOAL': 0.04}, + {'Sample': 'S5', 'Mass': 24, 'Temp': 100, 'Material': 'Pb', 'Size': 4, 'GOAL': 0.04}, + {'Sample': 'S6', 'Mass': 36, 'Temp': 26, 'Material': 
'Pb', 'Size': 6, 'GOAL': 0.05}] + + +def r_example(Alt, Bar, Fri, Hun, Pat, Price, Rain, Res, Type, Est, GOAL): + return {'Alt': Alt, 'Bar': Bar, 'Fri': Fri, 'Hun': Hun, 'Pat': Pat, 'Price': Price, + 'Rain': Rain, 'Res': Res, 'Type': Type, 'Est': Est, 'GOAL': GOAL} + + +restaurant = [ + r_example('Yes', 'No', 'No', 'Yes', 'Some', '$$$', 'No', 'Yes', 'French', '0-10', True), + r_example('Yes', 'No', 'No', 'Yes', 'Full', '$', 'No', 'No', 'Thai', '30-60', False), + r_example('No', 'Yes', 'No', 'No', 'Some', '$', 'No', 'No', 'Burger', '0-10', True), + r_example('Yes', 'No', 'Yes', 'Yes', 'Full', '$', 'Yes', 'No', 'Thai', '10-30', True), + r_example('Yes', 'No', 'Yes', 'No', 'Full', '$$$', 'No', 'Yes', 'French', '>60', False), + r_example('No', 'Yes', 'No', 'Yes', 'Some', '$$', 'Yes', 'Yes', 'Italian', '0-10', True), + r_example('No', 'Yes', 'No', 'No', 'None', '$', 'Yes', 'No', 'Burger', '0-10', False), + r_example('No', 'No', 'No', 'Yes', 'Some', '$$', 'Yes', 'Yes', 'Thai', '0-10', True), + r_example('No', 'Yes', 'Yes', 'No', 'Full', '$', 'Yes', 'No', 'Burger', '>60', False), + r_example('Yes', 'Yes', 'Yes', 'Yes', 'Full', '$$$', 'No', 'Yes', 'Italian', '10-30', False), + r_example('No', 'No', 'No', 'No', 'None', '$', 'No', 'No', 'Thai', '0-10', False), + r_example('Yes', 'Yes', 'Yes', 'Yes', 'Full', '$', 'No', 'No', 'Burger', '30-60', True)] + def test_current_best_learning(): examples = restaurant hypothesis = [{'Alt': 'Yes'}] h = current_best_learning(examples, hypothesis) - values = [] - for e in examples: - values.append(guess_value(e, h)) + values = [guess_value(e, h) for e in examples] assert values == [True, False, True, True, False, True, False, True, False, False, False, True] examples = animals_umbrellas initial_h = [{'Species': 'Cat'}] h = current_best_learning(examples, initial_h) - values = [] - for e in examples: - values.append(guess_value(e, h)) + values = [guess_value(e, h) for e in examples] assert values == [True, True, True, False, False, False, 
True] examples = party initial_h = [{'Pizza': 'Yes'}] h = current_best_learning(examples, initial_h) - values = [] - for e in examples: - values.append(guess_value(e, h)) + values = [guess_value(e, h) for e in examples] assert values == [True, True, False] @@ -58,227 +99,187 @@ def test_minimal_consistent_det(): assert minimal_consistent_det(conductance, {'Mass', 'Temp', 'Size'}) == {'Mass', 'Temp', 'Size'} +A, B, C, D, E, F, G, H, I, x, y, z = map(expr, 'ABCDEFGHIxyz') + +# knowledge base containing family relations +small_family = FOILContainer([expr("Mother(Anne, Peter)"), + expr("Mother(Anne, Zara)"), + expr("Mother(Sarah, Beatrice)"), + expr("Mother(Sarah, Eugenie)"), + expr("Father(Mark, Peter)"), + expr("Father(Mark, Zara)"), + expr("Father(Andrew, Beatrice)"), + expr("Father(Andrew, Eugenie)"), + expr("Father(Philip, Anne)"), + expr("Father(Philip, Andrew)"), + expr("Mother(Elizabeth, Anne)"), + expr("Mother(Elizabeth, Andrew)"), + expr("Male(Philip)"), + expr("Male(Mark)"), + expr("Male(Andrew)"), + expr("Male(Peter)"), + expr("Female(Elizabeth)"), + expr("Female(Anne)"), + expr("Female(Sarah)"), + expr("Female(Zara)"), + expr("Female(Beatrice)"), + expr("Female(Eugenie)")]) + +smaller_family = FOILContainer([expr("Mother(Anne, Peter)"), + expr("Father(Mark, Peter)"), + expr("Father(Philip, Anne)"), + expr("Mother(Elizabeth, Anne)"), + expr("Male(Philip)"), + expr("Male(Mark)"), + expr("Male(Peter)"), + expr("Female(Elizabeth)"), + expr("Female(Anne)")]) + +# target relation +target = expr('Parent(x, y)') + +# positive examples of target +examples_pos = [{x: expr('Elizabeth'), y: expr('Anne')}, + {x: expr('Elizabeth'), y: expr('Andrew')}, + {x: expr('Philip'), y: expr('Anne')}, + {x: expr('Philip'), y: expr('Andrew')}, + {x: expr('Anne'), y: expr('Peter')}, + {x: expr('Anne'), y: expr('Zara')}, + {x: expr('Mark'), y: expr('Peter')}, + {x: expr('Mark'), y: expr('Zara')}, + {x: expr('Andrew'), y: expr('Beatrice')}, + {x: expr('Andrew'), y: expr('Eugenie')}, 
+ {x: expr('Sarah'), y: expr('Beatrice')}, + {x: expr('Sarah'), y: expr('Eugenie')}] + +# negative examples of target +examples_neg = [{x: expr('Anne'), y: expr('Eugenie')}, + {x: expr('Beatrice'), y: expr('Eugenie')}, + {x: expr('Mark'), y: expr('Elizabeth')}, + {x: expr('Beatrice'), y: expr('Philip')}] + + +def test_tell(): + """ + Adds a sentence to the knowledge base. + """ + smaller_family.tell(expr("Male(George)")) + smaller_family.tell(expr("Female(Mum)")) + assert smaller_family.ask(expr("Male(George)")) == {} + assert smaller_family.ask(expr("Female(Mum)")) == {} + assert not smaller_family.ask(expr("Female(George)")) + assert not smaller_family.ask(expr("Male(Mum)")) + + + def test_extend_example(): - assert list(test_network.extend_example({x: A, y: B}, expr('Conn(x, z)'))) == [ - {x: A, y: B, z: B}, {x: A, y: B, z: D}] - assert list(test_network.extend_example({x: G}, expr('Conn(x, y)'))) == [{x: G, y: I}] - assert list(test_network.extend_example({x: C}, expr('Conn(x, y)'))) == [] - assert len(list(test_network.extend_example({}, expr('Conn(x, y)')))) == 10 + """ + Create the extended examples of the given clause. + (The extended examples are a set of examples created by extending the example + with each possible constant value for each new variable in the literal.) + """ assert len(list(small_family.extend_example({x: expr('Andrew')}, expr('Father(x, y)')))) == 2 assert len(list(small_family.extend_example({x: expr('Andrew')}, expr('Mother(x, y)')))) == 0 assert len(list(small_family.extend_example({x: expr('Andrew')}, expr('Female(y)')))) == 6 def test_new_literals(): - assert len(list(test_network.new_literals([expr('p | q'), [expr('p')]]))) == 8 - assert len(list(test_network.new_literals([expr('p'), [expr('q'), expr('p | r')]]))) == 15 assert len(list(small_family.new_literals([expr('p'), []]))) == 8 assert len(list(small_family.new_literals([expr('p & q'), []]))) == 20 +def test_new_clause(): + """ + Finds the best clause to add to the set of clauses.
+ """ + clause = small_family.new_clause([examples_pos, examples_neg], target)[0][1] + assert len(clause) == 1 and (clause[0].op in ['Male', 'Female', 'Father', 'Mother']) + + def test_choose_literal(): - literals = [expr('Conn(p, q)'), expr('Conn(x, z)'), expr('Conn(r, s)'), expr('Conn(t, y)')] - examples_pos = [{x: A, y: B}, {x: A, y: D}] - examples_neg = [{x: A, y: C}, {x: C, y: A}, {x: C, y: B}, {x: A, y: I}] - assert test_network.choose_literal(literals, [examples_pos, examples_neg]) == expr('Conn(x, z)') - literals = [expr('Conn(x, p)'), expr('Conn(p, x)'), expr('Conn(p, q)')] - examples_pos = [{x: C}, {x: F}, {x: I}] - examples_neg = [{x: D}, {x: A}, {x: B}, {x: G}] - assert test_network.choose_literal(literals, [examples_pos, examples_neg]) == expr('Conn(p, x)') - literals = [expr('Father(x, y)'), expr('Father(y, x)'), expr('Mother(x, y)'), expr('Mother(x, y)')] + """ + Choose the best literal based on the information gain + """ + literals = [expr('Father(x, y)'), expr('Father(x, y)'), expr('Mother(x, y)'), expr('Mother(x, y)')] examples_pos = [{x: expr('Philip')}, {x: expr('Mark')}, {x: expr('Peter')}] examples_neg = [{x: expr('Elizabeth')}, {x: expr('Sarah')}] assert small_family.choose_literal(literals, [examples_pos, examples_neg]) == expr('Father(x, y)') literals = [expr('Father(x, y)'), expr('Father(y, x)'), expr('Male(x)')] examples_pos = [{x: expr('Philip')}, {x: expr('Mark')}, {x: expr('Andrew')}] examples_neg = [{x: expr('Elizabeth')}, {x: expr('Sarah')}] - assert small_family.choose_literal(literals, [examples_pos, examples_neg]) == expr('Male(x)') + assert small_family.choose_literal(literals, [examples_pos, examples_neg]) == expr('Father(x,y)') -def test_new_clause(): - target = expr('Open(x, y)') - examples_pos = [{x: B}, {x: A}, {x: G}] - examples_neg = [{x: C}, {x: F}, {x: I}] - clause = test_network.new_clause([examples_pos, examples_neg], target)[0][1] - assert len(clause) == 1 and clause[0].op == 'Conn' and clause[0].args[0] == x - target 
= expr('Flow(x, y)') - examples_pos = [{x: B}, {x: D}, {x: E}, {x: G}] - examples_neg = [{x: A}, {x: C}, {x: F}, {x: I}, {x: H}] - clause = test_network.new_clause([examples_pos, examples_neg], target)[0][1] - assert len(clause) == 2 and \ - ((clause[0].args[0] == x and clause[1].args[1] == x) or \ - (clause[0].args[1] == x and clause[1].args[0] == x)) +def test_gain(): + """ + Calculates the utility of each literal, based on the information gained. + """ + gain_father = small_family.gain(expr('Father(x,y)'), [examples_pos, examples_neg]) + gain_male = small_family.gain(expr('Male(x)'), [examples_pos, examples_neg]) + assert round(gain_father, 2) == 2.49 + assert round(gain_male, 2) == 1.16 -def test_foil(): - target = expr('Reach(x, y)') - examples_pos = [{x: A, y: B}, - {x: A, y: C}, - {x: A, y: D}, - {x: A, y: E}, - {x: A, y: F}, - {x: A, y: G}, - {x: A, y: I}, - {x: B, y: C}, - {x: D, y: C}, - {x: D, y: E}, - {x: D, y: F}, - {x: D, y: G}, - {x: D, y: I}, - {x: E, y: F}, - {x: E, y: G}, - {x: E, y: I}, - {x: G, y: I}, - {x: H, y: G}, - {x: H, y: I}] - nodes = {A, B, C, D, E, F, G, H, I} - examples_neg = [example for example in [{x: a, y: b} for a in nodes for b in nodes] - if example not in examples_pos] - ## TODO: Modify FOIL to recursively check for satisfied positive examples -# clauses = test_network.foil([examples_pos, examples_neg], target) -# assert len(clauses) == 2 - target = expr('Parent(x, y)') - examples_pos = [{x: expr('Elizabeth'), y: expr('Anne')}, - {x: expr('Elizabeth'), y: expr('Andrew')}, - {x: expr('Philip'), y: expr('Anne')}, - {x: expr('Philip'), y: expr('Andrew')}, - {x: expr('Anne'), y: expr('Peter')}, - {x: expr('Anne'), y: expr('Zara')}, - {x: expr('Mark'), y: expr('Peter')}, - {x: expr('Mark'), y: expr('Zara')}, - {x: expr('Andrew'), y: expr('Beatrice')}, - {x: expr('Andrew'), y: expr('Eugenie')}, - {x: expr('Sarah'), y: expr('Beatrice')}, - {x: expr('Sarah'), y: expr('Eugenie')}] - examples_neg = [{x: expr('Anne'), y: 
expr('Eugenie')}, - {x: expr('Beatrice'), y: expr('Eugenie')}, - {x: expr('Mark'), y: expr('Elizabeth')}, - {x: expr('Beatrice'), y: expr('Philip')}] - clauses = small_family.foil([examples_pos, examples_neg], target) - assert len(clauses) == 2 and \ - ((clauses[0][1][0] == expr('Father(x, y)') and clauses[1][1][0] == expr('Mother(x, y)')) or \ - (clauses[1][1][0] == expr('Father(x, y)') and clauses[0][1][0] == expr('Mother(x, y)'))) - target = expr('Grandparent(x, y)') - examples_pos = [{x: expr('Elizabeth'), y: expr('Peter')}, - {x: expr('Elizabeth'), y: expr('Zara')}, - {x: expr('Elizabeth'), y: expr('Beatrice')}, - {x: expr('Elizabeth'), y: expr('Eugenie')}, - {x: expr('Philip'), y: expr('Peter')}, - {x: expr('Philip'), y: expr('Zara')}, - {x: expr('Philip'), y: expr('Beatrice')}, - {x: expr('Philip'), y: expr('Eugenie')}] - examples_neg = [{x: expr('Anne'), y: expr('Eugenie')}, - {x: expr('Beatrice'), y: expr('Eugenie')}, - {x: expr('Elizabeth'), y: expr('Andrew')}, - {x: expr('Philip'), y: expr('Anne')}, - {x: expr('Philip'), y: expr('Andrew')}, - {x: expr('Anne'), y: expr('Peter')}, - {x: expr('Anne'), y: expr('Zara')}, - {x: expr('Mark'), y: expr('Peter')}, - {x: expr('Mark'), y: expr('Zara')}, - {x: expr('Andrew'), y: expr('Beatrice')}, - {x: expr('Andrew'), y: expr('Eugenie')}, - {x: expr('Sarah'), y: expr('Beatrice')}, - {x: expr('Mark'), y: expr('Elizabeth')}, - {x: expr('Beatrice'), y: expr('Philip')}] -# clauses = small_family.foil([examples_pos, examples_neg], target) -# assert len(clauses) == 2 and \ -# ((clauses[0][1][0] == expr('Father(x, y)') and clauses[1][1][0] == expr('Mother(x, y)')) or \ -# (clauses[1][1][0] == expr('Father(x, y)') and clauses[0][1][0] == expr('Mother(x, y)'))) +def test_update_examples(): + """Add to the kb the examples that are represented in extended_examples. + The list of omitted examples is returned.
+ """ + extended_examples = [{x: expr("Mark"), y: expr("Peter")}, + {x: expr("Philip"), y: expr("Anne")}] + uncovered = smaller_family.update_examples(target, examples_pos, extended_examples) + assert {x: expr("Elizabeth"), y: expr("Anne")} in uncovered + assert {x: expr("Anne"), y: expr("Peter")} in uncovered + assert {x: expr("Philip"), y: expr("Anne")} not in uncovered + assert {x: expr("Mark"), y: expr("Peter")} not in uncovered -party = [ - {'Pizza': 'Yes', 'Soda': 'No', 'GOAL': True}, - {'Pizza': 'Yes', 'Soda': 'Yes', 'GOAL': True}, - {'Pizza': 'No', 'Soda': 'No', 'GOAL': False} -] -animals_umbrellas = [ - {'Species': 'Cat', 'Rain': 'Yes', 'Coat': 'No', 'GOAL': True}, - {'Species': 'Cat', 'Rain': 'Yes', 'Coat': 'Yes', 'GOAL': True}, - {'Species': 'Dog', 'Rain': 'Yes', 'Coat': 'Yes', 'GOAL': True}, - {'Species': 'Dog', 'Rain': 'Yes', 'Coat': 'No', 'GOAL': False}, - {'Species': 'Dog', 'Rain': 'No', 'Coat': 'No', 'GOAL': False}, - {'Species': 'Cat', 'Rain': 'No', 'Coat': 'No', 'GOAL': False}, - {'Species': 'Cat', 'Rain': 'No', 'Coat': 'Yes', 'GOAL': True} -] - -conductance = [ - {'Sample': 'S1', 'Mass': 12, 'Temp': 26, 'Material': 'Cu', 'Size': 3, 'GOAL': 0.59}, - {'Sample': 'S1', 'Mass': 12, 'Temp': 100, 'Material': 'Cu', 'Size': 3, 'GOAL': 0.57}, - {'Sample': 'S2', 'Mass': 24, 'Temp': 26, 'Material': 'Cu', 'Size': 6, 'GOAL': 0.59}, - {'Sample': 'S3', 'Mass': 12, 'Temp': 26, 'Material': 'Pb', 'Size': 2, 'GOAL': 0.05}, - {'Sample': 'S3', 'Mass': 12, 'Temp': 100, 'Material': 'Pb', 'Size': 2, 'GOAL': 0.04}, - {'Sample': 'S4', 'Mass': 18, 'Temp': 100, 'Material': 'Pb', 'Size': 3, 'GOAL': 0.04}, - {'Sample': 'S4', 'Mass': 18, 'Temp': 100, 'Material': 'Pb', 'Size': 3, 'GOAL': 0.04}, - {'Sample': 'S5', 'Mass': 24, 'Temp': 100, 'Material': 'Pb', 'Size': 4, 'GOAL': 0.04}, - {'Sample': 'S6', 'Mass': 36, 'Temp': 26, 'Material': 'Pb', 'Size': 6, 'GOAL': 0.05}, -] - -def r_example(Alt, Bar, Fri, Hun, Pat, Price, Rain, Res, Type, Est, GOAL): - return {'Alt': Alt, 'Bar': Bar, 
'Fri': Fri, 'Hun': Hun, 'Pat': Pat, - 'Price': Price, 'Rain': Rain, 'Res': Res, 'Type': Type, 'Est': Est, - 'GOAL': GOAL} - -restaurant = [ - r_example('Yes', 'No', 'No', 'Yes', 'Some', '$$$', 'No', 'Yes', 'French', '0-10', True), - r_example('Yes', 'No', 'No', 'Yes', 'Full', '$', 'No', 'No', 'Thai', '30-60', False), - r_example('No', 'Yes', 'No', 'No', 'Some', '$', 'No', 'No', 'Burger', '0-10', True), - r_example('Yes', 'No', 'Yes', 'Yes', 'Full', '$', 'Yes', 'No', 'Thai', '10-30', True), - r_example('Yes', 'No', 'Yes', 'No', 'Full', '$$$', 'No', 'Yes', 'French', '>60', False), - r_example('No', 'Yes', 'No', 'Yes', 'Some', '$$', 'Yes', 'Yes', 'Italian', '0-10', True), - r_example('No', 'Yes', 'No', 'No', 'None', '$', 'Yes', 'No', 'Burger', '0-10', False), - r_example('No', 'No', 'No', 'Yes', 'Some', '$$', 'Yes', 'Yes', 'Thai', '0-10', True), - r_example('No', 'Yes', 'Yes', 'No', 'Full', '$', 'Yes', 'No', 'Burger', '>60', False), - r_example('Yes', 'Yes', 'Yes', 'Yes', 'Full', '$$$', 'No', 'Yes', 'Italian', '10-30', False), - r_example('No', 'No', 'No', 'No', 'None', '$', 'No', 'No', 'Thai', '0-10', False), - r_example('Yes', 'Yes', 'Yes', 'Yes', 'Full', '$', 'No', 'No', 'Burger', '30-60', True) -] - -""" -A H -|\ /| -| \ / | -v v v v -B D-->E-->G-->I -| / | -| / | -vv v -C F -""" -test_network = FOIL_container([expr("Conn(A, B)"), - expr("Conn(A ,D)"), - expr("Conn(B, C)"), - expr("Conn(D, C)"), - expr("Conn(D, E)"), - expr("Conn(E ,F)"), - expr("Conn(E, G)"), - expr("Conn(G, I)"), - expr("Conn(H, G)"), - expr("Conn(H, I)")]) - -small_family = FOIL_container([expr("Mother(Anne, Peter)"), - expr("Mother(Anne, Zara)"), - expr("Mother(Sarah, Beatrice)"), - expr("Mother(Sarah, Eugenie)"), - expr("Father(Mark, Peter)"), - expr("Father(Mark, Zara)"), - expr("Father(Andrew, Beatrice)"), - expr("Father(Andrew, Eugenie)"), - expr("Father(Philip, Anne)"), - expr("Father(Philip, Andrew)"), - expr("Mother(Elizabeth, Anne)"), - expr("Mother(Elizabeth, Andrew)"), - 
expr("Male(Philip)"), - expr("Male(Mark)"), - expr("Male(Andrew)"), - expr("Male(Peter)"), - expr("Female(Elizabeth)"), - expr("Female(Anne)"), - expr("Female(Sarah)"), - expr("Female(Zara)"), - expr("Female(Beatrice)"), - expr("Female(Eugenie)"), -]) - -A, B, C, D, E, F, G, H, I, x, y, z = map(expr, 'ABCDEFGHIxyz') +def test_foil(): + """ + Test the FOIL algorithm, when target is Parent(x,y) + """ + clauses = small_family.foil([examples_pos, examples_neg], target) + assert len(clauses) == 2 and \ + ((clauses[0][1][0] == expr('Father(x, y)') and clauses[1][1][0] == expr('Mother(x, y)')) or + (clauses[1][1][0] == expr('Father(x, y)') and clauses[0][1][0] == expr('Mother(x, y)'))) + + target_g = expr('Grandparent(x, y)') + examples_pos_g = [{x: expr('Elizabeth'), y: expr('Peter')}, + {x: expr('Elizabeth'), y: expr('Zara')}, + {x: expr('Elizabeth'), y: expr('Beatrice')}, + {x: expr('Elizabeth'), y: expr('Eugenie')}, + {x: expr('Philip'), y: expr('Peter')}, + {x: expr('Philip'), y: expr('Zara')}, + {x: expr('Philip'), y: expr('Beatrice')}, + {x: expr('Philip'), y: expr('Eugenie')}] + examples_neg_g = [{x: expr('Anne'), y: expr('Eugenie')}, + {x: expr('Beatrice'), y: expr('Eugenie')}, + {x: expr('Elizabeth'), y: expr('Andrew')}, + {x: expr('Elizabeth'), y: expr('Anne')}, + {x: expr('Elizabeth'), y: expr('Mark')}, + {x: expr('Elizabeth'), y: expr('Sarah')}, + {x: expr('Philip'), y: expr('Anne')}, + {x: expr('Philip'), y: expr('Andrew')}, + {x: expr('Anne'), y: expr('Peter')}, + {x: expr('Anne'), y: expr('Zara')}, + {x: expr('Mark'), y: expr('Peter')}, + {x: expr('Mark'), y: expr('Zara')}, + {x: expr('Andrew'), y: expr('Beatrice')}, + {x: expr('Andrew'), y: expr('Eugenie')}, + {x: expr('Sarah'), y: expr('Beatrice')}, + {x: expr('Mark'), y: expr('Elizabeth')}, + {x: expr('Beatrice'), y: expr('Philip')}, + {x: expr('Peter'), y: expr('Andrew')}, + {x: expr('Zara'), y: expr('Mark')}, + {x: expr('Peter'), y: expr('Anne')}, + {x: expr('Zara'), y: expr('Eugenie')}] + + clauses = 
small_family.foil([examples_pos_g, examples_neg_g], target_g) + assert len(clauses[0]) == 2 + assert clauses[0][1][0].op == 'Parent' + assert clauses[0][1][0].args[0] == x + assert clauses[0][1][1].op == 'Parent' + assert clauses[0][1][1].args[1] == y + + +if __name__ == "__main__": + pytest.main() diff --git a/tests/test_learning.py b/tests/test_learning.py index aff8903a4..63a7fd9aa 100644 --- a/tests/test_learning.py +++ b/tests/test_learning.py @@ -1,64 +1,18 @@ import pytest -import math -import random -from utils import open_data -from learning import * +from learning import * random.seed("aima-python") -def test_euclidean(): - distance = euclidean_distance([1, 2], [3, 4]) - assert round(distance, 2) == 2.83 - - distance = euclidean_distance([1, 2, 3], [4, 5, 6]) - assert round(distance, 2) == 5.2 - - distance = euclidean_distance([0, 0, 0], [0, 0, 0]) - assert distance == 0 - - -def test_rms_error(): - assert rms_error([2, 2], [2, 2]) == 0 - assert rms_error((0, 0), (0, 1)) == math.sqrt(0.5) - assert rms_error((1, 0), (0, 1)) == 1 - assert rms_error((0, 0), (0, -1)) == math.sqrt(0.5) - assert rms_error((0, 0.5), (0, -0.5)) == math.sqrt(0.5) - - -def test_manhattan_distance(): - assert manhattan_distance([2, 2], [2, 2]) == 0 - assert manhattan_distance([0, 0], [0, 1]) == 1 - assert manhattan_distance([1, 0], [0, 1]) == 2 - assert manhattan_distance([0, 0], [0, -1]) == 1 - assert manhattan_distance([0, 0.5], [0, -0.5]) == 1 - - -def test_mean_boolean_error(): - assert mean_boolean_error([1, 1], [0, 0]) == 1 - assert mean_boolean_error([0, 1], [1, 0]) == 1 - assert mean_boolean_error([1, 1], [0, 1]) == 0.5 - assert mean_boolean_error([0, 0], [0, 0]) == 0 - assert mean_boolean_error([1, 1], [1, 1]) == 0 - - -def test_mean_error(): - assert mean_error([2, 2], [2, 2]) == 0 - assert mean_error([0, 0], [0, 1]) == 0.5 - assert mean_error([1, 0], [0, 1]) == 1 - assert mean_error([0, 0], [0, -1]) == 0.5 - assert mean_error([0, 0.5], [0, -0.5]) == 0.5 - - def 
test_exclude(): iris = DataSet(name='iris', exclude=[3]) assert iris.inputs == [0, 1, 2] def test_parse_csv(): - Iris = open_data('iris.csv').read() - assert parse_csv(Iris)[0] == [5.1, 3.5, 1.4, 0.2, 'setosa'] + iris = open_data('iris.csv').read() + assert parse_csv(iris)[0] == [5.1, 3.5, 1.4, 0.2, 'setosa'] def test_weighted_mode(): @@ -70,114 +24,83 @@ def test_weighted_replicate(): def test_means_and_deviation(): - iris = DataSet(name="iris") - + iris = DataSet(name='iris') means, deviations = iris.find_means_and_deviations() - - assert round(means["setosa"][0], 3) == 5.006 - assert round(means["versicolor"][0], 3) == 5.936 - assert round(means["virginica"][0], 3) == 6.588 - - assert round(deviations["setosa"][0], 3) == 0.352 - assert round(deviations["versicolor"][0], 3) == 0.516 - assert round(deviations["virginica"][0], 3) == 0.636 + assert round(means['setosa'][0], 3) == 5.006 + assert round(means['versicolor'][0], 3) == 5.936 + assert round(means['virginica'][0], 3) == 6.588 + assert round(deviations['setosa'][0], 3) == 0.352 + assert round(deviations['versicolor'][0], 3) == 0.516 + assert round(deviations['virginica'][0], 3) == 0.636 def test_plurality_learner(): - zoo = DataSet(name="zoo") - - pL = PluralityLearner(zoo) - assert pL([1, 0, 0, 1, 0, 0, 0, 1, 1, 1, 0, 0, 4, 1, 0, 1]) == "mammal" - - -def test_naive_bayes(): - iris = DataSet(name="iris") - - # Discrete - nBD = NaiveBayesLearner(iris, continuous=False) - assert nBD([5, 3, 1, 0.1]) == "setosa" - assert nBD([6, 3, 4, 1.1]) == "versicolor" - assert nBD([7.7, 3, 6, 2]) == "virginica" - - # Continuous - nBC = NaiveBayesLearner(iris, continuous=True) - assert nBC([5, 3, 1, 0.1]) == "setosa" - assert nBC([6, 5, 3, 1.5]) == "versicolor" - assert nBC([7, 3, 6.5, 2]) == "virginica" - - # Simple - data1 = 'a'*50 + 'b'*30 + 'c'*15 - dist1 = CountingProbDist(data1) - data2 = 'a'*30 + 'b'*45 + 'c'*20 - dist2 = CountingProbDist(data2) - data3 = 'a'*20 + 'b'*20 + 'c'*35 - dist3 = CountingProbDist(data3) - - 
dist = {('First', 0.5): dist1, ('Second', 0.3): dist2, ('Third', 0.2): dist3} - nBS = NaiveBayesLearner(dist, simple=True) - assert nBS('aab') == 'First' - assert nBS(['b', 'b']) == 'Second' - assert nBS('ccbcc') == 'Third' + zoo = DataSet(name='zoo') + pl = PluralityLearner(zoo) + assert pl([1, 0, 0, 1, 0, 0, 0, 1, 1, 1, 0, 0, 4, 1, 0, 1]) == 'mammal' def test_k_nearest_neighbors(): - iris = DataSet(name="iris") - kNN = NearestNeighborLearner(iris, k=3) - assert kNN([5, 3, 1, 0.1]) == "setosa" - assert kNN([5, 3, 1, 0.1]) == "setosa" - assert kNN([6, 5, 3, 1.5]) == "versicolor" - assert kNN([7.5, 4, 6, 2]) == "virginica" - - -def test_truncated_svd(): - test_mat = [[17, 0], - [0, 11]] - _, _, eival = truncated_svd(test_mat) - assert isclose(abs(eival[0]), 17) - assert isclose(abs(eival[1]), 11) - - test_mat = [[17, 0], - [0, -34]] - _, _, eival = truncated_svd(test_mat) - assert isclose(abs(eival[0]), 34) - assert isclose(abs(eival[1]), 17) - - test_mat = [[1, 0, 0, 0, 2], - [0, 0, 3, 0, 0], - [0, 0, 0, 0, 0], - [0, 2, 0, 0, 0]] - _, _, eival = truncated_svd(test_mat) - assert isclose(abs(eival[0]), 3) - assert isclose(abs(eival[1]), 5**0.5) - - test_mat = [[3, 2, 2], - [2, 3, -2]] - _, _, eival = truncated_svd(test_mat) - assert isclose(abs(eival[0]), 5) - assert isclose(abs(eival[1]), 3) + iris = DataSet(name='iris') + knn = NearestNeighborLearner(iris, k=3) + assert knn([5, 3, 1, 0.1]) == 'setosa' + assert knn([6, 5, 3, 1.5]) == 'versicolor' + assert knn([7.5, 4, 6, 2]) == 'virginica' def test_decision_tree_learner(): - iris = DataSet(name="iris") - dTL = DecisionTreeLearner(iris) - assert dTL([5, 3, 1, 0.1]) == "setosa" - assert dTL([6, 5, 3, 1.5]) == "versicolor" - assert dTL([7.5, 4, 6, 2]) == "virginica" + iris = DataSet(name='iris') + dtl = DecisionTreeLearner(iris) + assert dtl([5, 3, 1, 0.1]) == 'setosa' + assert dtl([6, 5, 3, 1.5]) == 'versicolor' + assert dtl([7.5, 4, 6, 2]) == 'virginica' + + +def test_svc(): + iris = DataSet(name='iris') + classes = 
['setosa', 'versicolor', 'virginica'] + iris.classes_to_numbers(classes) + n_samples, n_features = len(iris.examples), iris.target + X, y = (np.array([x[:n_features] for x in iris.examples]), + np.array([x[n_features] for x in iris.examples])) + svm = MultiClassLearner(SVC()).fit(X, y) + assert svm.predict([[5.0, 3.1, 0.9, 0.1]]) == 0 + assert svm.predict([[5.1, 3.5, 1.0, 0.0]]) == 0 + assert svm.predict([[4.9, 3.3, 1.1, 0.1]]) == 0 + assert svm.predict([[6.0, 3.0, 4.0, 1.1]]) == 1 + assert svm.predict([[6.1, 2.2, 3.5, 1.0]]) == 1 + assert svm.predict([[5.9, 2.5, 3.3, 1.1]]) == 1 + assert svm.predict([[7.5, 4.1, 6.2, 2.3]]) == 2 + assert svm.predict([[7.3, 4.0, 6.1, 2.4]]) == 2 + assert svm.predict([[7.0, 3.3, 6.1, 2.5]]) == 2 + + +def test_information_content(): + assert information_content([]) == 0 + assert information_content([4]) == 0 + assert information_content([5, 4, 0, 2, 5, 0]) > 1.9 + assert information_content([5, 4, 0, 2, 5, 0]) < 2 + assert information_content([1.5, 2.5]) > 0.9 + assert information_content([1.5, 2.5]) < 1.0 def test_random_forest(): - iris = DataSet(name="iris") - rF = RandomForest(iris) - assert rF([5, 3, 1, 0.1]) == "setosa" - assert rF([6, 5, 3, 1]) == "versicolor" - assert rF([7.5, 4, 6, 2]) == "virginica" + iris = DataSet(name='iris') + rf = RandomForest(iris) + tests = [([5.0, 3.0, 1.0, 0.1], 'setosa'), + ([5.1, 3.3, 1.1, 0.1], 'setosa'), + ([6.0, 5.0, 3.0, 1.0], 'versicolor'), + ([6.1, 2.2, 3.5, 1.0], 'versicolor'), + ([7.5, 4.1, 6.2, 2.3], 'virginica'), + ([7.3, 3.7, 6.1, 2.5], 'virginica')] + assert grade_learner(rf, tests) >= 1 / 3 def test_neural_network_learner(): - iris = DataSet(name="iris") - classes = ["setosa", "versicolor", "virginica"] + iris = DataSet(name='iris') + classes = ['setosa', 'versicolor', 'virginica'] iris.classes_to_numbers(classes) - nNL = NeuralNetLearner(iris, [5], 0.15, 75) + nnl = NeuralNetLearner(iris, [5], 0.15, 75) tests = [([5.0, 3.1, 0.9, 0.1], 0), ([5.1, 3.5, 1.0, 0.0], 0), ([4.9, 3.3, 1.1, 
0.1], 0), @@ -187,23 +110,22 @@ def test_neural_network_learner(): ([7.5, 4.1, 6.2, 2.3], 2), ([7.3, 4.0, 6.1, 2.4], 2), ([7.0, 3.3, 6.1, 2.5], 2)] - assert grade_learner(nNL, tests) >= 1/3 - assert err_ratio(nNL, iris) < 0.2 + assert grade_learner(nnl, tests) >= 1 / 3 + assert err_ratio(nnl, iris) < 0.21 def test_perceptron(): - iris = DataSet(name="iris") + iris = DataSet(name='iris') iris.classes_to_numbers() - classes_number = len(iris.values[iris.target]) - perceptron = PerceptronLearner(iris) + pl = PerceptronLearner(iris) tests = [([5, 3, 1, 0.1], 0), ([5, 3.5, 1, 0], 0), ([6, 3, 4, 1.1], 1), ([6, 2, 3.5, 1], 1), ([7.5, 4, 6, 2], 2), ([7, 3, 6, 2.5], 2)] - assert grade_learner(perceptron, tests) > 1/2 - assert err_ratio(perceptron, iris) < 0.4 + assert grade_learner(pl, tests) > 1 / 2 + assert err_ratio(pl, iris) < 0.4 def test_random_weights(): @@ -213,4 +135,23 @@ def test_random_weights(): test_weights = random_weights(min_value, max_value, num_weights) assert len(test_weights) == num_weights for weight in test_weights: - assert weight >= min_value and weight <= max_value + assert min_value <= weight <= max_value + + +def test_ada_boost(): + iris = DataSet(name='iris') + iris.classes_to_numbers() + wl = WeightedLearner(PerceptronLearner) + ab = ada_boost(iris, wl, 5) + tests = [([5, 3, 1, 0.1], 0), + ([5, 3.5, 1, 0], 0), + ([6, 3, 4, 1.1], 1), + ([6, 2, 3.5, 1], 1), + ([7.5, 4, 6, 2], 2), + ([7, 3, 6, 2.5], 2)] + assert grade_learner(ab, tests) > 2 / 3 + assert err_ratio(ab, iris) < 0.25 + + +if __name__ == "__main__": + pytest.main() diff --git a/tests/test_learning4e.py b/tests/test_learning4e.py new file mode 100644 index 000000000..b345efad7 --- /dev/null +++ b/tests/test_learning4e.py @@ -0,0 +1,127 @@ +import pytest + +from deep_learning4e import PerceptronLearner +from learning4e import * + +random.seed("aima-python") + + +def test_exclude(): + iris = DataSet(name='iris', exclude=[3]) + assert iris.inputs == [0, 1, 2] + + +def test_parse_csv(): + 
iris = open_data('iris.csv').read() + assert parse_csv(iris)[0] == [5.1, 3.5, 1.4, 0.2, 'setosa'] + + +def test_weighted_mode(): + assert weighted_mode('abbaa', [1, 2, 3, 1, 2]) == 'b' + + +def test_weighted_replicate(): + assert weighted_replicate('ABC', [1, 2, 1], 4) == ['A', 'B', 'B', 'C'] + + +def test_means_and_deviation(): + iris = DataSet(name='iris') + means, deviations = iris.find_means_and_deviations() + assert round(means['setosa'][0], 3) == 5.006 + assert round(means['versicolor'][0], 3) == 5.936 + assert round(means['virginica'][0], 3) == 6.588 + assert round(deviations['setosa'][0], 3) == 0.352 + assert round(deviations['versicolor'][0], 3) == 0.516 + assert round(deviations['virginica'][0], 3) == 0.636 + + +def test_plurality_learner(): + zoo = DataSet(name='zoo') + pl = PluralityLearner(zoo) + assert pl.predict([1, 0, 0, 1, 0, 0, 0, 1, 1, 1, 0, 0, 4, 1, 0, 1]) == 'mammal' + + +def test_k_nearest_neighbors(): + iris = DataSet(name='iris') + knn = NearestNeighborLearner(iris, k=3) + assert knn.predict([5, 3, 1, 0.1]) == 'setosa' + assert knn.predict([6, 5, 3, 1.5]) == 'versicolor' + assert knn.predict([7.5, 4, 6, 2]) == 'virginica' + + +def test_decision_tree_learner(): + iris = DataSet(name='iris') + dtl = DecisionTreeLearner(iris) + assert dtl.predict([5, 3, 1, 0.1]) == 'setosa' + assert dtl.predict([6, 5, 3, 1.5]) == 'versicolor' + assert dtl.predict([7.5, 4, 6, 2]) == 'virginica' + + +def test_svc(): + iris = DataSet(name='iris') + classes = ['setosa', 'versicolor', 'virginica'] + iris.classes_to_numbers(classes) + n_samples, n_features = len(iris.examples), iris.target + X, y = (np.array([x[:n_features] for x in iris.examples]), + np.array([x[n_features] for x in iris.examples])) + svm = MultiClassLearner(SVC()).fit(X, y) + assert svm.predict([[5.0, 3.1, 0.9, 0.1]]) == 0 + assert svm.predict([[5.1, 3.5, 1.0, 0.0]]) == 0 + assert svm.predict([[4.9, 3.3, 1.1, 0.1]]) == 0 + assert svm.predict([[6.0, 3.0, 4.0, 1.1]]) == 1 + assert svm.predict([[6.1, 
2.2, 3.5, 1.0]]) == 1 + assert svm.predict([[5.9, 2.5, 3.3, 1.1]]) == 1 + assert svm.predict([[7.5, 4.1, 6.2, 2.3]]) == 2 + assert svm.predict([[7.3, 4.0, 6.1, 2.4]]) == 2 + assert svm.predict([[7.0, 3.3, 6.1, 2.5]]) == 2 + + +def test_information_content(): + assert information_content([]) == 0 + assert information_content([4]) == 0 + assert information_content([5, 4, 0, 2, 5, 0]) > 1.9 + assert information_content([5, 4, 0, 2, 5, 0]) < 2 + assert information_content([1.5, 2.5]) > 0.9 + assert information_content([1.5, 2.5]) < 1.0 + + +def test_random_forest(): + iris = DataSet(name='iris') + rf = RandomForest(iris) + tests = [([5.0, 3.0, 1.0, 0.1], 'setosa'), + ([5.1, 3.3, 1.1, 0.1], 'setosa'), + ([6.0, 5.0, 3.0, 1.0], 'versicolor'), + ([6.1, 2.2, 3.5, 1.0], 'versicolor'), + ([7.5, 4.1, 6.2, 2.3], 'virginica'), + ([7.3, 3.7, 6.1, 2.5], 'virginica')] + assert grade_learner(rf, tests) >= 1 / 3 + + +def test_random_weights(): + min_value = -0.5 + max_value = 0.5 + num_weights = 10 + test_weights = random_weights(min_value, max_value, num_weights) + assert len(test_weights) == num_weights + for weight in test_weights: + assert min_value <= weight <= max_value + + +def test_ada_boost(): + iris = DataSet(name='iris') + classes = ['setosa', 'versicolor', 'virginica'] + iris.classes_to_numbers(classes) + wl = WeightedLearner(PerceptronLearner(iris)) + ab = ada_boost(iris, wl, 5) + tests = [([5, 3, 1, 0.1], 0), + ([5, 3.5, 1, 0], 0), + ([6, 3, 4, 1.1], 1), + ([6, 2, 3.5, 1], 1), + ([7.5, 4, 6, 2], 2), + ([7, 3, 6, 2.5], 2)] + assert grade_learner(ab, tests) > 2 / 3 + assert err_ratio(ab, iris) < 0.25 + + +if __name__ == "__main__": + pytest.main() diff --git a/tests/test_logic.py b/tests/test_logic.py index 86bcc9ed6..2ead21746 100644 --- a/tests/test_logic.py +++ b/tests/test_logic.py @@ -1,6 +1,19 @@ import pytest + from logic import * -from utils import expr_handle_infix_ops, count, Symbol +from utils import expr_handle_infix_ops, count + +random.seed("aima-python") + 
+definite_clauses_KB = PropDefiniteKB() +for clause in ['(B & F) ==> E', + '(A & E & F) ==> G', + '(B & C) ==> F', + '(A & B) ==> D', + '(E & F) ==> H', + '(H & I) ==> J', + 'A', 'B', 'C']: + definite_clauses_KB.tell(expr(clause)) def test_is_symbol(): @@ -34,8 +47,7 @@ def test_variables(): def test_expr(): assert repr(expr('P <=> Q(1)')) == '(P <=> Q(1))' assert repr(expr('P & Q | ~R(x, F(x))')) == '((P & Q) | ~R(x, F(x)))' - assert (expr_handle_infix_ops('P & Q ==> R & ~S') - == "P & Q |'==>'| R & ~S") + assert expr_handle_infix_ops('P & Q ==> R & ~S') == "P & Q |'==>'| R & ~S" def test_extend(): @@ -43,7 +55,7 @@ def test_extend(): def test_subst(): - assert subst({x: 42, y:0}, F(x) + y) == (F(42) + 0) + assert subst({x: 42, y: 0}, F(x) + y) == (F(42) + 0) def test_PropKB(): @@ -51,11 +63,11 @@ def test_PropKB(): assert count(kb.ask(expr) for expr in [A, C, D, E, Q]) is 0 kb.tell(A & E) assert kb.ask(A) == kb.ask(E) == {} - kb.tell(E |'==>'| C) + kb.tell(E | '==>' | C) assert kb.ask(C) == {} kb.retract(E) - assert kb.ask(E) is False - assert kb.ask(C) is False + assert not kb.ask(E) + assert not kb.ask(C) def test_wumpus_kb(): @@ -66,10 +78,10 @@ def test_wumpus_kb(): assert wumpus_kb.ask(~P12) == {} # Statement: There is a pit in [2,2]. - assert wumpus_kb.ask(P22) is False + assert not wumpus_kb.ask(P22) # Statement: There is a pit in [3,1]. - assert wumpus_kb.ask(P31) is False + assert not wumpus_kb.ask(P31) # Statement: Neither [1,2] nor [2,1] contains a pit. 
assert wumpus_kb.ask(~P12 & ~P21) == {} @@ -90,16 +102,17 @@ def test_is_definite_clause(): def test_parse_definite_clause(): assert parse_definite_clause(expr('A & B & C & D ==> E')) == ([A, B, C, D], E) assert parse_definite_clause(expr('Farmer(Mac)')) == ([], expr('Farmer(Mac)')) - assert parse_definite_clause(expr('(Farmer(f) & Rabbit(r)) ==> Hates(f, r)')) == ([expr('Farmer(f)'), expr('Rabbit(r)')], expr('Hates(f, r)')) + assert parse_definite_clause(expr('(Farmer(f) & Rabbit(r)) ==> Hates(f, r)')) == ( + [expr('Farmer(f)'), expr('Rabbit(r)')], expr('Hates(f, r)')) def test_pl_true(): assert pl_true(P, {}) is None - assert pl_true(P, {P: False}) is False - assert pl_true(P | Q, {P: True}) is True - assert pl_true((A | B) & (C | D), {A: False, B: True, D: True}) is True - assert pl_true((A & B) & (C | D), {A: False, B: True, D: True}) is False - assert pl_true((A & B) | (A & C), {A: False, B: True, C: True}) is False + assert not pl_true(P, {P: False}) + assert pl_true(P | Q, {P: True}) + assert pl_true((A | B) & (C | D), {A: False, B: True, D: True}) + assert not pl_true((A & B) & (C | D), {A: False, B: True, D: True}) + assert not pl_true((A & B) | (A & C), {A: False, B: True, C: True}) assert pl_true((A | B) & (C | D), {A: True, D: False}) is None assert pl_true(P | P, {}) is None @@ -123,37 +136,80 @@ def test_tt_true(): assert tt_true('(A | (B & C)) <=> ((A | B) & (A | C))') -def test_dpll(): - assert (dpll_satisfiable(A & ~B & C & (A | ~D) & (~E | ~D) & (C | ~D) & (~A | ~F) & (E | ~F) - & (~D | ~F) & (B | ~C | D) & (A | ~E | F) & (~A | E | D)) - == {B: False, C: True, A: True, F: False, D: True, E: False}) +def test_dpll_satisfiable(): + assert dpll_satisfiable(A & ~B & C & (A | ~D) & (~E | ~D) & (C | ~D) & (~A | ~F) & (E | ~F) & (~D | ~F) & + (B | ~C | D) & (A | ~E | F) & (~A | E | D)) == \ + {B: False, C: True, A: True, F: False, D: True, E: False} + assert dpll_satisfiable(A & B & ~C & D) == {C: False, A: True, D: True, B: True} + assert 
dpll_satisfiable((A | (B & C)) | '<=>' | ((A | B) & (A | C))) in ({C: True, A: True}, {C: True, B: True}) + assert dpll_satisfiable(A | '<=>' | B) == {A: True, B: True} assert dpll_satisfiable(A & ~B) == {A: True, B: False} assert dpll_satisfiable(P & ~P) is False +def test_cdcl_satisfiable(): + assert cdcl_satisfiable(A & ~B & C & (A | ~D) & (~E | ~D) & (C | ~D) & (~A | ~F) & (E | ~F) & (~D | ~F) & + (B | ~C | D) & (A | ~E | F) & (~A | E | D)) == \ + {B: False, C: True, A: True, F: False, D: True, E: False} + assert cdcl_satisfiable(A & B & ~C & D) == {C: False, A: True, D: True, B: True} + assert cdcl_satisfiable((A | (B & C)) | '<=>' | ((A | B) & (A | C))) in ({C: True, A: True}, {C: True, B: True}) + assert cdcl_satisfiable(A | '<=>' | B) == {A: True, B: True} + assert cdcl_satisfiable(A & ~B) == {A: True, B: False} + assert cdcl_satisfiable(P & ~P) is False + + +def test_find_pure_symbol(): - assert find_pure_symbol([A, B, C], [A|~B,~B|~C,C|A]) == (A, True) - assert find_pure_symbol([A, B, C], [~A|~B,~B|~C,C|A]) == (B, False) - assert find_pure_symbol([A, B, C], [~A|B,~B|~C,C|A]) == (None, None) + assert find_pure_symbol([A, B, C], [A | ~B, ~B | ~C, C | A]) == (A, True) + assert find_pure_symbol([A, B, C], [~A | ~B, ~B | ~C, C | A]) == (B, False) + assert find_pure_symbol([A, B, C], [~A | B, ~B | ~C, C | A]) == (None, None) def test_unit_clause_assign(): - assert unit_clause_assign(A|B|C, {A:True}) == (None, None) - assert unit_clause_assign(B|C, {A:True}) == (None, None) - assert unit_clause_assign(B|~A, {A:True}) == (B, True) + assert unit_clause_assign(A | B | C, {A: True}) == (None, None) + assert unit_clause_assign(B | C, {A: True}) == (None, None) + assert unit_clause_assign(B | ~A, {A: True}) == (B, True) def test_find_unit_clause(): - assert find_unit_clause([A|B|C, B|~C, ~A|~B], {A:True}) == (B, False) - + assert find_unit_clause([A | B | C, B | ~C, ~A | ~B], {A: True}) == (B, False) + def test_unify(): assert unify(x, x, {}) == {} assert unify(x, 3, &#13;
{}) == {x: 3} + assert unify(x & 4 & y, 6 & y & 4, {}) == {x: 6, y: 4} + assert unify(expr('A(x)'), expr('A(B)')) == {x: B} + assert unify(expr('American(x) & Weapon(B)'), expr('American(A) & Weapon(y)')) == {x: A, y: B} + assert unify(expr('P(F(x,z), G(u, z))'), expr('P(F(y,a), y)')) == {x: G(u, a), z: a, y: G(u, a)} + + # tests for https://github.com/aimacode/aima-python/issues/1053 + # unify(expr('P(A, x, F(G(y)))'), expr('P(z, F(z), F(u))')) + # must return {z: A, x: F(A), u: G(y)} and not {z: A, x: F(z), u: G(y)} + assert unify(expr('P(A, x, F(G(y)))'), expr('P(z, F(z), F(u))')) == {z: A, x: F(A), u: G(y)} + assert unify(expr('P(x, A, F(G(y)))'), expr('P(F(z), z, F(u))')) == {x: F(A), z: A, u: G(y)} + + +def test_unify_mm(): + assert unify_mm(x, x) == {} + assert unify_mm(x, 3) == {x: 3} + assert unify_mm(x & 4 & y, 6 & y & 4) == {x: 6, y: 4} + assert unify_mm(expr('A(x)'), expr('A(B)')) == {x: B} + assert unify_mm(expr('American(x) & Weapon(B)'), expr('American(A) & Weapon(y)')) == {x: A, y: B} + assert unify_mm(expr('P(F(x,z), G(u, z))'), expr('P(F(y,a), y)')) == {x: G(u, a), z: a, y: G(u, a)} + + # tests for https://github.com/aimacode/aima-python/issues/1053 + # unify(expr('P(A, x, F(G(y)))'), expr('P(z, F(z), F(u))')) + # must return {z: A, x: F(A), u: G(y)} and not {z: A, x: F(z), u: G(y)} + assert unify_mm(expr('P(A, x, F(G(y)))'), expr('P(z, F(z), F(u))')) == {z: A, x: F(A), u: G(y)} + assert unify_mm(expr('P(x, A, F(G(y)))'), expr('P(F(z), z, F(u))')) == {x: F(A), z: A, u: G(y)} def test_pl_fc_entails(): assert pl_fc_entails(horn_clauses_KB, expr('Q')) + assert pl_fc_entails(definite_clauses_KB, expr('G')) + assert pl_fc_entails(definite_clauses_KB, expr('H')) + assert not pl_fc_entails(definite_clauses_KB, expr('I')) + assert not pl_fc_entails(definite_clauses_KB, expr('J')) assert not pl_fc_entails(horn_clauses_KB, expr('SomethingSilly')) @@ -161,6 +217,9 @@ def test_tt_entails(): assert tt_entails(P & Q, Q) assert not tt_entails(P | Q, Q) assert 
tt_entails(A & (B | C) & E & F & ~(P | Q), A & E & F & ~P & ~Q) + assert not tt_entails(P | '<=>' | Q, Q) + assert tt_entails((P | '==>' | Q) & P, Q) + assert not tt_entails((P | '<=>' | Q) & ~P, Q) def test_prop_symbols(): @@ -201,10 +260,8 @@ def test_dissociate(): def test_associate(): - assert (repr(associate('&', [(A & B), (B | C), (B & C)])) - == '(A & B & (B | C) & B & C)') - assert (repr(associate('|', [A | (B | (C | (A & B)))])) - == '(A | B | C | (A & B))') + assert repr(associate('&', [(A & B), (B | C), (B & C)])) == '(A & B & (B | C) & B & C)' + assert repr(associate('|', [A | (B | (C | (A & B)))])) == '(A | B | C | (A & B))' def test_move_not_inwards(): @@ -214,47 +271,56 @@ def test_move_not_inwards(): def test_distribute_and_over_or(): - def test_enatilment(s, has_and = False): + def test_entailment(s, has_and=False): result = distribute_and_over_or(s) if has_and: assert result.op == '&' assert tt_entails(s, result) assert tt_entails(result, s) - test_enatilment((A & B) | C, True) - test_enatilment((A | B) & C, True) - test_enatilment((A | B) | C, False) - test_enatilment((A & B) | (C | D), True) + + test_entailment((A & B) | C, True) + test_entailment((A | B) & C, True) + test_entailment((A | B) | C, False) + test_entailment((A & B) | (C | D), True) + def test_to_cnf(): - assert (repr(to_cnf(wumpus_world_inference & ~expr('~P12'))) == - "((~P12 | B11) & (~P21 | B11) & (P12 | P21 | ~B11) & ~B11 & P12)") + assert repr(to_cnf(wumpus_world_inference & ~expr('~P12'))) == \ + '((~P12 | B11) & (~P21 | B11) & (P12 | P21 | ~B11) & ~B11 & P12)' assert repr(to_cnf((P & Q) | (~P & ~Q))) == '((~P | P) & (~Q | P) & (~P | Q) & (~Q | Q))' - assert repr(to_cnf("B <=> (P1 | P2)")) == '((~P1 | B) & (~P2 | B) & (P1 | P2 | ~B))' - assert repr(to_cnf("a | (b & c) | d")) == '((b | a | d) & (c | a | d))' - assert repr(to_cnf("A & (B | (D & E))")) == '(A & (D | B) & (E | B))' - assert repr(to_cnf("A | (B | (C | (D & E)))")) == '((D | A | B | C) & (E | A | B | C))' + assert 
repr(to_cnf('A <=> B')) == '((A | ~B) & (B | ~A))' + assert repr(to_cnf('B <=> (P1 | P2)')) == '((~P1 | B) & (~P2 | B) & (P1 | P2 | ~B))' + assert repr(to_cnf('A <=> (B & C)')) == '((A | ~B | ~C) & (B | ~A) & (C | ~A))' + assert repr(to_cnf('a | (b & c) | d')) == '((b | a | d) & (c | a | d))' + assert repr(to_cnf('A & (B | (D & E))')) == '(A & (D | B) & (E | B))' + assert repr(to_cnf('A | (B | (C | (D & E)))')) == '((D | A | B | C) & (E | A | B | C))' + assert repr(to_cnf('(A <=> ~B) ==> (C | ~D)')) == \ + '((B | ~A | C | ~D) & (A | ~A | C | ~D) & (B | ~B | C | ~D) & (A | ~B | C | ~D))' def test_pl_resolution(): - # TODO: Add fast test cases assert pl_resolution(wumpus_kb, ~P11) + assert pl_resolution(wumpus_kb, ~B11) + assert not pl_resolution(wumpus_kb, P22) + assert pl_resolution(horn_clauses_KB, A) + assert pl_resolution(horn_clauses_KB, B) + assert not pl_resolution(horn_clauses_KB, P) + assert not pl_resolution(definite_clauses_KB, P) def test_standardize_variables(): e = expr('F(a, b, c) & G(c, A, 23)') assert len(variables(standardize_variables(e))) == 3 - # assert variables(e).intersection(variables(standardize_variables(e))) == {} assert is_variable(standardize_variables(expr('x'))) def test_fol_bc_ask(): def test_ask(query, kb=None): q = expr(query) - test_variables = variables(q) answers = fol_bc_ask(kb or test_kb, q) - return sorted( - [dict((x, v) for x, v in list(a.items()) if x in test_variables) - for a in answers], key=repr) + return sorted([dict((x, v) for x, v in list(a.items()) if x in variables(q)) + for a in answers], key=repr) + assert repr(test_ask('Farmer(x)')) == '[{x: Mac}]' assert repr(test_ask('Human(x)')) == '[{x: Mac}, {x: MrsMac}]' assert repr(test_ask('Rabbit(x)')) == '[{x: MrsRabbit}, {x: Pete}]' @@ -264,11 +330,10 @@ def test_ask(query, kb=None): def test_fol_fc_ask(): def test_ask(query, kb=None): q = expr(query) - test_variables = variables(q) answers = fol_fc_ask(kb or test_kb, q) - return sorted( - [dict((x, v) for x, v in 
list(a.items()) if x in test_variables) - for a in answers], key=repr) + return sorted([dict((x, v) for x, v in list(a.items()) if x in variables(q)) + for a in answers], key=repr) + assert repr(test_ask('Criminal(x)', crime_kb)) == '[{x: West}]' assert repr(test_ask('Enemy(x, America)', crime_kb)) == '[{x: Nono}]' assert repr(test_ask('Farmer(x)')) == '[{x: Mac}]' @@ -281,21 +346,26 @@ def test_d(): def test_WalkSAT(): - def check_SAT(clauses, single_solution={}): + def check_SAT(clauses, single_solution=None): # Make sure the solution is correct if it is returned by WalkSat # Sometimes WalkSat may run out of flips before finding a solution - soln = WalkSAT(clauses) - if soln: - assert all(pl_true(x, soln) for x in clauses) + if single_solution is None: + single_solution = {} + sol = WalkSAT(clauses) + if sol: + assert all(pl_true(x, sol) for x in clauses) if single_solution: # Cross check the solution if only one exists assert all(pl_true(x, single_solution) for x in clauses) - assert soln == single_solution + assert sol == single_solution + # Test WalkSat for problems with solution check_SAT([A & B, A & C]) check_SAT([A | B, P & Q, P & B]) check_SAT([A & B, C | D, ~(D | P)], {A: True, B: True, C: True, D: False, P: False}) + check_SAT([A, B, ~C, D], {C: False, A: True, B: True, D: True}) # Test WalkSat for problems without solution assert WalkSAT([A & ~A], 0.5, 100) is None + assert WalkSAT([A & B, C | D, ~(D | B)], 0.5, 100) is None assert WalkSAT([A | B, ~A, ~(B | C), C | D, P | Q], 0.5, 100) is None assert WalkSAT([A | B, B & C, C | D, D & A, P, ~P], 0.5, 100) is None @@ -304,9 +374,9 @@ def test_SAT_plan(): transition = {'A': {'Left': 'A', 'Right': 'B'}, 'B': {'Left': 'A', 'Right': 'C'}, 'C': {'Left': 'B', 'Right': 'C'}} - assert SAT_plan('A', transition, 'C', 2) is None - assert SAT_plan('A', transition, 'B', 3) == ['Right'] - assert SAT_plan('C', transition, 'A', 3) == ['Left', 'Left'] + assert SAT_plan('A', transition, 'C', 1) is None + assert 
SAT_plan('A', transition, 'B', 2) == ['Right'] + assert SAT_plan('C', transition, 'A', 2) == ['Left', 'Left'] transition = {(0, 0): {'Right': (0, 1), 'Down': (1, 0)}, (0, 1): {'Left': (1, 0), 'Down': (1, 1)}, diff --git a/tests/test_logic4e.py b/tests/test_logic4e.py new file mode 100644 index 000000000..5a7399281 --- /dev/null +++ b/tests/test_logic4e.py @@ -0,0 +1,359 @@ +import pytest + +from logic4e import * +from utils4e import expr_handle_infix_ops, count + +definite_clauses_KB = PropDefiniteKB() +for clause in ['(B & F)==>E', + '(A & E & F)==>G', + '(B & C)==>F', + '(A & B)==>D', + '(E & F)==>H', + '(H & I)==>J', + 'A', 'B', 'C']: + definite_clauses_KB.tell(expr(clause)) + + +def test_is_symbol(): + assert is_symbol('x') + assert is_symbol('X') + assert is_symbol('N245') + assert not is_symbol('') + assert not is_symbol('1L') + assert not is_symbol([1, 2, 3]) + + +def test_is_var_symbol(): + assert is_var_symbol('xt') + assert not is_var_symbol('Txt') + assert not is_var_symbol('') + assert not is_var_symbol('52') + + +def test_is_prop_symbol(): + assert not is_prop_symbol('xt') + assert is_prop_symbol('Txt') + assert not is_prop_symbol('') + assert not is_prop_symbol('52') + + +def test_variables(): + assert variables(expr('F(x, x) & G(x, y) & H(y, z) & R(A, z, 2)')) == {x, y, z} + assert variables(expr('(x ==> y) & B(x, y) & A')) == {x, y} + + +def test_expr(): + assert repr(expr('P <=> Q(1)')) == '(P <=> Q(1))' + assert repr(expr('P & Q | ~R(x, F(x))')) == '((P & Q) | ~R(x, F(x)))' + assert (expr_handle_infix_ops('P & Q ==> R & ~S') == "P & Q |'==>'| R & ~S") + + +def test_extend(): + assert extend({x: 1}, y, 2) == {x: 1, y: 2} + + +def test_subst(): + assert subst({x: 42, y: 0}, F(x) + y) == (F(42) + 0) + + +def test_PropKB(): + kb = PropKB() + assert count(kb.ask(expr) for expr in [A, C, D, E, Q]) == 0 + kb.tell(A & E) + assert kb.ask(A) == kb.ask(E) == {} + kb.tell(E | '==>' | C) + assert kb.ask(C) == {} + kb.retract(E) + assert kb.ask(E) is False + &#13;
assert kb.ask(C) is False + + +def test_wumpus_kb(): + # Statement: There is no pit in [1,1]. + assert wumpus_kb.ask(~P11) == {} + + # Statement: There is no pit in [1,2]. + assert wumpus_kb.ask(~P12) == {} + + # Statement: There is a pit in [2,2]. + assert wumpus_kb.ask(P22) is False + + # Statement: There is a pit in [3,1]. + assert wumpus_kb.ask(P31) is False + + # Statement: Neither [1,2] nor [2,1] contains a pit. + assert wumpus_kb.ask(~P12 & ~P21) == {} + + # Statement: There is a pit in either [2,2] or [3,1]. + assert wumpus_kb.ask(P22 | P31) == {} + + +def test_is_definite_clause(): + assert is_definite_clause(expr('A & B & C & D ==> E')) + assert is_definite_clause(expr('Farmer(Mac)')) + assert not is_definite_clause(expr('~Farmer(Mac)')) + assert is_definite_clause(expr('(Farmer(f) & Rabbit(r)) ==> Hates(f, r)')) + assert not is_definite_clause(expr('(Farmer(f) & ~Rabbit(r)) ==> Hates(f, r)')) + assert not is_definite_clause(expr('(Farmer(f) | Rabbit(r)) ==> Hates(f, r)')) + + +def test_parse_definite_clause(): + assert parse_definite_clause(expr('A & B & C & D ==> E')) == ([A, B, C, D], E) + assert parse_definite_clause(expr('Farmer(Mac)')) == ([], expr('Farmer(Mac)')) + assert parse_definite_clause(expr('(Farmer(f) & Rabbit(r)) ==> Hates(f, r)')) == ( + [expr('Farmer(f)'), expr('Rabbit(r)')], expr('Hates(f, r)')) + + +def test_pl_true(): + assert pl_true(P, {}) is None + assert pl_true(P, {P: False}) is False + assert pl_true(P | Q, {P: True}) is True + assert pl_true((A | B) & (C | D), {A: False, B: True, D: True}) is True + assert pl_true((A & B) & (C | D), {A: False, B: True, D: True}) is False + assert pl_true((A & B) | (A & C), {A: False, B: True, C: True}) is False + assert pl_true((A | B) & (C | D), {A: True, D: False}) is None + assert pl_true(P | P, {}) is None + + +def test_tt_true(): + assert tt_true(P | ~P) + assert tt_true('~~P <=> P') + assert not tt_true((P | ~Q) & (~P | Q)) + assert not tt_true(P & ~P) + assert not tt_true(P & Q) + 
assert tt_true((P | ~Q) | (~P | Q)) + assert tt_true('(A & B) ==> (A | B)') + assert tt_true('((A & B) & C) <=> (A & (B & C))') + assert tt_true('((A | B) | C) <=> (A | (B | C))') + assert tt_true('(A ==> B) <=> (~B ==> ~A)') + assert tt_true('(A ==> B) <=> (~A | B)') + assert tt_true('(A <=> B) <=> ((A ==> B) & (B ==> A))') + assert tt_true('~(A & B) <=> (~A | ~B)') + assert tt_true('~(A | B) <=> (~A & ~B)') + assert tt_true('(A & (B | C)) <=> ((A & B) | (A & C))') + assert tt_true('(A | (B & C)) <=> ((A | B) & (A | C))') + + +def test_dpll(): + assert (dpll_satisfiable(A & ~B & C & (A | ~D) & (~E | ~D) & (C | ~D) & (~A | ~F) & (E | ~F) + & (~D | ~F) & (B | ~C | D) & (A | ~E | F) & (~A | E | D)) + == {B: False, C: True, A: True, F: False, D: True, E: False}) + assert dpll_satisfiable(A & B & ~C & D) == {C: False, A: True, D: True, B: True} + assert dpll_satisfiable((A | (B & C)) | '<=>' | ((A | B) & (A | C))) in ({C: True, A: True}, {C: True, B: True}) + assert dpll_satisfiable(A | '<=>' | B) == {A: True, B: True} + assert dpll_satisfiable(A & ~B) == {A: True, B: False} + assert dpll_satisfiable(P & ~P) is False + + +def test_find_pure_symbol(): + assert find_pure_symbol([A, B, C], [A | ~B, ~B | ~C, C | A]) == (A, True) + assert find_pure_symbol([A, B, C], [~A | ~B, ~B | ~C, C | A]) == (B, False) + assert find_pure_symbol([A, B, C], [~A | B, ~B | ~C, C | A]) == (None, None) + + +def test_unit_clause_assign(): + assert unit_clause_assign(A | B | C, {A: True}) == (None, None) + assert unit_clause_assign(B | C, {A: True}) == (None, None) + assert unit_clause_assign(B | ~A, {A: True}) == (B, True) + + +def test_find_unit_clause(): + assert find_unit_clause([A | B | C, B | ~C, ~A | ~B], {A: True}) == (B, False) + + +def test_unify(): + assert unify(x, x, {}) == {} + assert unify(x, 3, {}) == {x: 3} + assert unify(x & 4 & y, 6 & y & 4, {}) == {x: 6, y: 4} + assert unify(expr('A(x)'), expr('A(B)')) == {x: B} + assert unify(expr('American(x) & Weapon(B)'), &#13;
expr('American(A) & Weapon(y)')) == {x: A, y: B} + + +def test_pl_fc_entails(): + assert pl_fc_entails(horn_clauses_KB, expr('Q')) + assert pl_fc_entails(definite_clauses_KB, expr('G')) + assert pl_fc_entails(definite_clauses_KB, expr('H')) + assert not pl_fc_entails(definite_clauses_KB, expr('I')) + assert not pl_fc_entails(definite_clauses_KB, expr('J')) + assert not pl_fc_entails(horn_clauses_KB, expr('SomethingSilly')) + + +def test_tt_entails(): + assert tt_entails(P & Q, Q) + assert not tt_entails(P | Q, Q) + assert tt_entails(A & (B | C) & E & F & ~(P | Q), A & E & F & ~P & ~Q) + assert not tt_entails(P | '<=>' | Q, Q) + assert tt_entails((P | '==>' | Q) & P, Q) + assert not tt_entails((P | '<=>' | Q) & ~P, Q) + + +def test_prop_symbols(): + assert prop_symbols(expr('x & y & z | A')) == {A} + assert prop_symbols(expr('(x & B(z)) ==> Farmer(y) | A')) == {A, expr('Farmer(y)'), expr('B(z)')} + + +def test_constant_symbols(): + assert constant_symbols(expr('x & y & z | A')) == {A} + assert constant_symbols(expr('(x & B(z)) & Father(John) ==> Farmer(y) | A')) == {A, expr('John')} + + +def test_predicate_symbols(): + assert predicate_symbols(expr('x & y & z | A')) == set() + assert predicate_symbols(expr('(x & B(z)) & Father(John) ==> Farmer(y) | A')) == { + ('B', 1), + ('Father', 1), + ('Farmer', 1)} + assert predicate_symbols(expr('(x & B(x, y, z)) & F(G(x, y), x) ==> P(Q(R(x, y)), x, y, z)')) == { + ('B', 3), + ('F', 2), + ('G', 2), + ('P', 4), + ('Q', 1), + ('R', 2)} + + +def test_eliminate_implications(): + assert repr(eliminate_implications('A ==> (~B <== C)')) == '((~B | ~C) | ~A)' + assert repr(eliminate_implications(A ^ B)) == '((A & ~B) | (~A & B))' + assert repr(eliminate_implications(A & B | C & ~D)) == '((A & B) | (C & ~D))' + + +def test_dissociate(): + assert dissociate('&', [A & B]) == [A, B] + assert dissociate('|', [A, B, C & D, P | Q]) == [A, B, C & D, P, Q] + assert dissociate('&', [A, B, C & D, P | Q]) == [A, B, C, D, P | Q] + + +def 
test_associate(): + assert (repr(associate('&', [(A & B), (B | C), (B & C)])) + == '(A & B & (B | C) & B & C)') + assert (repr(associate('|', [A | (B | (C | (A & B)))])) + == '(A | B | C | (A & B))') + + +def test_move_not_inwards(): + assert repr(move_not_inwards(~(A | B))) == '(~A & ~B)' + assert repr(move_not_inwards(~(A & B))) == '(~A | ~B)' + assert repr(move_not_inwards(~(~(A | ~B) | ~~C))) == '((A | ~B) & ~C)' + + +def test_distribute_and_over_or(): + def test_entailment(s, has_and=False): + result = distribute_and_over_or(s) + if has_and: + assert result.op == '&' + assert tt_entails(s, result) + assert tt_entails(result, s) + + test_entailment((A & B) | C, True) + test_entailment((A | B) & C, True) + test_entailment((A | B) | C, False) + test_entailment((A & B) | (C | D), True) + + +def test_to_cnf(): + assert (repr(to_cnf(wumpus_world_inference & ~expr('~P12'))) == + "((~P12 | B11) & (~P21 | B11) & (P12 | P21 | ~B11) & ~B11 & P12)") + assert repr(to_cnf((P & Q) | (~P & ~Q))) == '((~P | P) & (~Q | P) & (~P | Q) & (~Q | Q))' + assert repr(to_cnf('A <=> B')) == '((A | ~B) & (B | ~A))' + assert repr(to_cnf("B <=> (P1 | P2)")) == '((~P1 | B) & (~P2 | B) & (P1 | P2 | ~B))' + assert repr(to_cnf('A <=> (B & C)')) == '((A | ~B | ~C) & (B | ~A) & (C | ~A))' + assert repr(to_cnf("a | (b & c) | d")) == '((b | a | d) & (c | a | d))' + assert repr(to_cnf("A & (B | (D & E))")) == '(A & (D | B) & (E | B))' + assert repr(to_cnf("A | (B | (C | (D & E)))")) == '((D | A | B | C) & (E | A | B | C))' + assert repr(to_cnf( + '(A <=> ~B) ==> (C | ~D)')) == '((B | ~A | C | ~D) & (A | ~A | C | ~D) & (B | ~B | C | ~D) & (A | ~B | C | ~D))' + + +def test_pl_resolution(): + assert pl_resolution(wumpus_kb, ~P11) + assert pl_resolution(wumpus_kb, ~B11) + assert not pl_resolution(wumpus_kb, P22) + assert pl_resolution(horn_clauses_KB, A) + assert pl_resolution(horn_clauses_KB, B) + assert not pl_resolution(horn_clauses_KB, P) + assert not pl_resolution(definite_clauses_KB, P) + + +def 
test_standardize_variables(): + e = expr('F(a, b, c) & G(c, A, 23)') + assert len(variables(standardize_variables(e))) == 3 + # assert variables(e).intersection(variables(standardize_variables(e))) == {} + assert is_variable(standardize_variables(expr('x'))) + + +def test_fol_bc_ask(): + def test_ask(query, kb=None): + q = expr(query) + test_variables = variables(q) + answers = fol_bc_ask(kb or test_kb, q) + return sorted( + [dict((x, v) for x, v in list(a.items()) if x in test_variables) + for a in answers], key=repr) + + assert repr(test_ask('Farmer(x)')) == '[{x: Mac}]' + assert repr(test_ask('Human(x)')) == '[{x: Mac}, {x: MrsMac}]' + assert repr(test_ask('Rabbit(x)')) == '[{x: MrsRabbit}, {x: Pete}]' + assert repr(test_ask('Criminal(x)', crime_kb)) == '[{x: West}]' + + +def test_fol_fc_ask(): + def test_ask(query, kb=None): + q = expr(query) + test_variables = variables(q) + answers = fol_fc_ask(kb or test_kb, q) + return sorted( + [dict((x, v) for x, v in list(a.items()) if x in test_variables) + for a in answers], key=repr) + + assert repr(test_ask('Criminal(x)', crime_kb)) == '[{x: West}]' + assert repr(test_ask('Enemy(x, America)', crime_kb)) == '[{x: Nono}]' + assert repr(test_ask('Farmer(x)')) == '[{x: Mac}]' + assert repr(test_ask('Human(x)')) == '[{x: Mac}, {x: MrsMac}]' + assert repr(test_ask('Rabbit(x)')) == '[{x: MrsRabbit}, {x: Pete}]' + + +def test_d(): + assert d(x * x - x, x) == 2 * x - 1 + + +def test_WalkSAT(): + def check_SAT(clauses, single_solution={}): + # Make sure the solution is correct if it is returned by WalkSat + # Sometimes WalkSat may run out of flips before finding a solution + soln = WalkSAT(clauses) + if soln: + assert all(pl_true(x, soln) for x in clauses) + if single_solution: # Cross check the solution if only one exists + assert all(pl_true(x, single_solution) for x in clauses) + assert soln == single_solution + + # Test WalkSat for problems with solution + check_SAT([A & B, A & C]) + check_SAT([A | B, P & Q, P & B]) + 
check_SAT([A & B, C | D, ~(D | P)], {A: True, B: True, C: True, D: False, P: False}) + check_SAT([A, B, ~C, D], {C: False, A: True, B: True, D: True}) + # Test WalkSat for problems without solution + assert WalkSAT([A & ~A], 0.5, 100) is None + assert WalkSAT([A & B, C | D, ~(D | B)], 0.5, 100) is None + assert WalkSAT([A | B, ~A, ~(B | C), C | D, P | Q], 0.5, 100) is None + assert WalkSAT([A | B, B & C, C | D, D & A, P, ~P], 0.5, 100) is None + + +def test_SAT_plan(): + transition = {'A': {'Left': 'A', 'Right': 'B'}, + 'B': {'Left': 'A', 'Right': 'C'}, + 'C': {'Left': 'B', 'Right': 'C'}} + assert SAT_plan('A', transition, 'C', 2) is None + assert SAT_plan('A', transition, 'B', 3) == ['Right'] + assert SAT_plan('C', transition, 'A', 3) == ['Left', 'Left'] + + transition = {(0, 0): {'Right': (0, 1), 'Down': (1, 0)}, + (0, 1): {'Left': (1, 0), 'Down': (1, 1)}, + (1, 0): {'Right': (1, 0), 'Up': (1, 0), 'Left': (1, 0), 'Down': (1, 0)}, + (1, 1): {'Left': (1, 0), 'Up': (0, 1)}} + assert SAT_plan((0, 0), transition, (1, 1), 4) == ['Right', 'Down'] + + +if __name__ == '__main__': + pytest.main() diff --git a/tests/test_mdp.py b/tests/test_mdp.py index b27c1af71..979b4ba85 100644 --- a/tests/test_mdp.py +++ b/tests/test_mdp.py @@ -1,5 +1,26 @@ +import pytest + from mdp import * +random.seed("aima-python") + +sequential_decision_environment_1 = GridMDP([[-0.1, -0.1, -0.1, +1], + [-0.1, None, -0.1, -1], + [-0.1, -0.1, -0.1, -0.1]], + terminals=[(3, 2), (3, 1)]) + +sequential_decision_environment_2 = GridMDP([[-2, -2, -2, +1], + [-2, None, -2, -1], + [-2, -2, -2, -2]], + terminals=[(3, 2), (3, 1)]) + +sequential_decision_environment_3 = GridMDP([[-1.0, -0.1, -0.1, -0.1, -0.1, 0.5], + [-0.1, None, None, -0.5, -0.1, -0.1], + [-0.1, None, 1.0, 3.0, None, -0.1], + [-0.1, -0.1, -0.1, None, None, -0.1], + [0.5, -0.1, -0.1, -0.1, -0.1, -1.0]], + terminals=[(2, 2), (3, 2), (0, 4), (5, 0)]) + def test_value_iteration(): assert value_iteration(sequential_decision_environment, .01) == { 
@@ -10,6 +31,32 @@ def test_value_iteration(): (2, 0): 0.34461306281476806, (2, 1): 0.48643676237737926, (2, 2): 0.79536093684710951} + assert value_iteration(sequential_decision_environment_1, .01) == { + (3, 2): 1.0, (3, 1): -1.0, + (3, 0): -0.0897388258468311, (0, 1): 0.146419707398967840, + (0, 2): 0.30596200514385086, (1, 0): 0.010092796415625799, + (0, 0): 0.00633408092008296, (1, 2): 0.507390193380827400, + (2, 0): 0.15072242145212010, (2, 1): 0.358309043654212570, + (2, 2): 0.71675493618997840} + + assert value_iteration(sequential_decision_environment_2, .01) == { + (3, 2): 1.0, (3, 1): -1.0, + (3, 0): -3.5141584808407855, (0, 1): -7.8000009574737180, + (0, 2): -6.1064293596058830, (1, 0): -7.1012549580376760, + (0, 0): -8.5872244532783200, (1, 2): -3.9653547121245810, + (2, 0): -5.3099468802901630, (2, 1): -3.3543366255753995, + (2, 2): -1.7383376462930498} + + assert value_iteration(sequential_decision_environment_3, .01) == { + (0, 0): 4.350592130345558, (0, 1): 3.640700980321895, (0, 2): 3.0734806370346943, (0, 3): 2.5754335063434937, + (0, 4): -1.0, + (1, 0): 3.640700980321895, (1, 1): 3.129579352304856, (1, 4): 2.0787517066719916, + (2, 0): 3.0259220379893352, (2, 1): 2.5926103577982897, (2, 2): 1.0, (2, 4): 2.507774181360808, + (3, 0): 2.5336747364500076, (3, 2): 3.0, (3, 3): 2.292172805400873, (3, 4): 2.996383110867515, + (4, 0): 2.1014575936349886, (4, 3): 3.1297590518608907, (4, 4): 3.6408806798779287, + (5, 0): -1.0, (5, 1): 2.5756132058995282, (5, 2): 3.0736603365907276, (5, 3): 3.6408806798779287, + (5, 4): 4.350771829901593} + def test_policy_iteration(): assert policy_iteration(sequential_decision_environment) == { @@ -18,24 +65,104 @@ def test_policy_iteration(): (2, 1): (0, 1), (2, 2): (1, 0), (3, 0): (-1, 0), (3, 1): None, (3, 2): None} + assert policy_iteration(sequential_decision_environment_1) == { + (0, 0): (0, 1), (0, 1): (0, 1), (0, 2): (1, 0), + (1, 0): (1, 0), (1, 2): (1, 0), (2, 0): (0, 1), + (2, 1): (0, 1), (2, 2): (1, 0), (3, 
0): (-1, 0), + (3, 1): None, (3, 2): None} + + assert policy_iteration(sequential_decision_environment_2) == { + (0, 0): (1, 0), (0, 1): (0, 1), (0, 2): (1, 0), + (1, 0): (1, 0), (1, 2): (1, 0), (2, 0): (1, 0), + (2, 1): (1, 0), (2, 2): (1, 0), (3, 0): (0, 1), + (3, 1): None, (3, 2): None} + def test_best_policy(): - pi = best_policy(sequential_decision_environment, - value_iteration(sequential_decision_environment, .01)) + pi = best_policy(sequential_decision_environment, value_iteration(sequential_decision_environment, .01)) assert sequential_decision_environment.to_arrows(pi) == [['>', '>', '>', '.'], ['^', None, '^', '.'], ['^', '>', '^', '<']] + pi_1 = best_policy(sequential_decision_environment_1, value_iteration(sequential_decision_environment_1, .01)) + assert sequential_decision_environment_1.to_arrows(pi_1) == [['>', '>', '>', '.'], + ['^', None, '^', '.'], + ['^', '>', '^', '<']] + + pi_2 = best_policy(sequential_decision_environment_2, value_iteration(sequential_decision_environment_2, .01)) + assert sequential_decision_environment_2.to_arrows(pi_2) == [['>', '>', '>', '.'], + ['^', None, '>', '.'], + ['>', '>', '>', '^']] + + pi_3 = best_policy(sequential_decision_environment_3, value_iteration(sequential_decision_environment_3, .01)) + assert sequential_decision_environment_3.to_arrows(pi_3) == [['.', '>', '>', '>', '>', '>'], + ['v', None, None, '>', '>', '^'], + ['v', None, '.', '.', None, '^'], + ['v', '<', 'v', None, None, '^'], + ['<', '<', '<', '<', '<', '.']] + def test_transition_model(): - transition_model = { - "A": {"a1": (0.3, "B"), "a2": (0.7, "C")}, - "B": {"a1": (0.5, "B"), "a2": (0.5, "A")}, - "C": {"a1": (0.9, "A"), "a2": (0.1, "B")}, - } + transition_model = {'a': {'plan1': [(0.2, 'a'), (0.3, 'b'), (0.3, 'c'), (0.2, 'd')], + 'plan2': [(0.4, 'a'), (0.15, 'b'), (0.45, 'c')], + 'plan3': [(0.2, 'a'), (0.5, 'b'), (0.3, 'c')], + }, + 'b': {'plan1': [(0.2, 'a'), (0.6, 'b'), (0.2, 'c'), (0.1, 'd')], + 'plan2': [(0.6, 'a'), (0.2, 'b'), (0.1, 
'c'), (0.1, 'd')], + 'plan3': [(0.3, 'a'), (0.3, 'b'), (0.4, 'c')], + }, + 'c': {'plan1': [(0.3, 'a'), (0.5, 'b'), (0.1, 'c'), (0.1, 'd')], + 'plan2': [(0.5, 'a'), (0.3, 'b'), (0.1, 'c'), (0.1, 'd')], + 'plan3': [(0.1, 'a'), (0.3, 'b'), (0.1, 'c'), (0.5, 'd')], + }} + + mdp = MDP(init="a", actlist={"plan1", "plan2", "plan3"}, terminals={"d"}, states={"a", "b", "c", "d"}, + transitions=transition_model) + + assert mdp.T("a", "plan3") == [(0.2, 'a'), (0.5, 'b'), (0.3, 'c')] + assert mdp.T("b", "plan2") == [(0.6, 'a'), (0.2, 'b'), (0.1, 'c'), (0.1, 'd')] + assert mdp.T("c", "plan1") == [(0.3, 'a'), (0.5, 'b'), (0.1, 'c'), (0.1, 'd')] + + +def test_pomdp_value_iteration(): + t_prob = [[[0.65, 0.35], [0.65, 0.35]], [[0.65, 0.35], [0.65, 0.35]], [[1.0, 0.0], [0.0, 1.0]]] + e_prob = [[[0.5, 0.5], [0.5, 0.5]], [[0.5, 0.5], [0.5, 0.5]], [[0.8, 0.2], [0.3, 0.7]]] + rewards = [[5, -10], [-20, 5], [-1, -1]] + + gamma = 0.95 + actions = ('0', '1', '2') + states = ('0', '1') + + pomdp = POMDP(actions, t_prob, e_prob, rewards, states, gamma) + utility = pomdp_value_iteration(pomdp, epsilon=5) + + for _, v in utility.items(): + sum_ = 0 + for element in v: + sum_ += sum(element) + + assert -9.76 < sum_ < -9.70 or 246.5 < sum_ < 248.5 or 0 < sum_ < 1 + + +def test_pomdp_value_iteration2(): + t_prob = [[[0.5, 0.5], [0.5, 0.5]], [[0.5, 0.5], [0.5, 0.5]], [[1.0, 0.0], [0.0, 1.0]]] + e_prob = [[[0.5, 0.5], [0.5, 0.5]], [[0.5, 0.5], [0.5, 0.5]], [[0.85, 0.15], [0.15, 0.85]]] + rewards = [[-100, 10], [10, -100], [-1, -1]] + + gamma = 0.95 + actions = ('0', '1', '2') + states = ('0', '1') + + pomdp = POMDP(actions, t_prob, e_prob, rewards, states, gamma) + utility = pomdp_value_iteration(pomdp, epsilon=100) + + for _, v in utility.items(): + sum_ = 0 + for element in v: + sum_ += sum(element) + + assert -77.31 < sum_ < -77.25 or 799 < sum_ < 800 - mdp = MDP(init="A", actlist={"a1","a2"}, terminals={"C"}, states={"A","B","C"}, transitions=transition_model) - assert mdp.T("A","a1") == (0.3, 
"B") - assert mdp.T("B","a2") == (0.5, "A") - assert mdp.T("C","a1") == (0.9, "A") +if __name__ == "__main__": + pytest.main() diff --git a/tests/test_mdp4e.py b/tests/test_mdp4e.py new file mode 100644 index 000000000..e51bda5d6 --- /dev/null +++ b/tests/test_mdp4e.py @@ -0,0 +1,176 @@ +import pytest + +from mdp4e import * + +random.seed("aima-python") + +sequential_decision_environment_1 = GridMDP([[-0.1, -0.1, -0.1, +1], + [-0.1, None, -0.1, -1], + [-0.1, -0.1, -0.1, -0.1]], + terminals=[(3, 2), (3, 1)]) + +sequential_decision_environment_2 = GridMDP([[-2, -2, -2, +1], + [-2, None, -2, -1], + [-2, -2, -2, -2]], + terminals=[(3, 2), (3, 1)]) + +sequential_decision_environment_3 = GridMDP([[-1.0, -0.1, -0.1, -0.1, -0.1, 0.5], + [-0.1, None, None, -0.5, -0.1, -0.1], + [-0.1, None, 1.0, 3.0, None, -0.1], + [-0.1, -0.1, -0.1, None, None, -0.1], + [0.5, -0.1, -0.1, -0.1, -0.1, -1.0]], + terminals=[(2, 2), (3, 2), (0, 4), (5, 0)]) + + +def test_value_iteration(): + ref1 = { + (3, 2): 1.0, (3, 1): -1.0, + (3, 0): 0.12958868267972745, (0, 1): 0.39810203830605462, + (0, 2): 0.50928545646220924, (1, 0): 0.25348746162470537, + (0, 0): 0.29543540628363629, (1, 2): 0.64958064617168676, + (2, 0): 0.34461306281476806, (2, 1): 0.48643676237737926, + (2, 2): 0.79536093684710951} + assert sum(value_iteration(sequential_decision_environment, .01).values()) - sum(ref1.values()) < 0.0001 + + ref2 = { + (3, 2): 1.0, (3, 1): -1.0, + (3, 0): -0.0897388258468311, (0, 1): 0.146419707398967840, + (0, 2): 0.30596200514385086, (1, 0): 0.010092796415625799, + (0, 0): 0.00633408092008296, (1, 2): 0.507390193380827400, + (2, 0): 0.15072242145212010, (2, 1): 0.358309043654212570, + (2, 2): 0.71675493618997840} + assert sum(value_iteration(sequential_decision_environment_1, .01).values()) - sum(ref2.values()) < 0.0001 + + ref3 = { + (3, 2): 1.0, (3, 1): -1.0, + (3, 0): -3.5141584808407855, (0, 1): -7.8000009574737180, + (0, 2): -6.1064293596058830, (1, 0): -7.1012549580376760, + (0, 0): 
-8.5872244532783200, (1, 2): -3.9653547121245810, + (2, 0): -5.3099468802901630, (2, 1): -3.3543366255753995, + (2, 2): -1.7383376462930498} + assert abs(sum(value_iteration(sequential_decision_environment_2, .01).values()) - sum(ref3.values())) < 0.0001 + + ref4 = { + (0, 0): 4.350592130345558, (0, 1): 3.640700980321895, (0, 2): 3.0734806370346943, (0, 3): 2.5754335063434937, + (0, 4): -1.0, + (1, 0): 3.640700980321895, (1, 1): 3.129579352304856, (1, 4): 2.0787517066719916, + (2, 0): 3.0259220379893352, (2, 1): 2.5926103577982897, (2, 2): 1.0, (2, 4): 2.507774181360808, + (3, 0): 2.5336747364500076, (3, 2): 3.0, (3, 3): 2.292172805400873, (3, 4): 2.996383110867515, + (4, 0): 2.1014575936349886, (4, 3): 3.1297590518608907, (4, 4): 3.6408806798779287, + (5, 0): -1.0, (5, 1): 2.5756132058995282, (5, 2): 3.0736603365907276, (5, 3): 3.6408806798779287, + (5, 4): 4.350771829901593} + assert abs(sum(value_iteration(sequential_decision_environment_3, .01).values()) - sum(ref4.values())) < 0.001 + + +def test_policy_iteration(): + assert policy_iteration(sequential_decision_environment) == { + (0, 0): (0, 1), (0, 1): (0, 1), (0, 2): (1, 0), + (1, 0): (1, 0), (1, 2): (1, 0), (2, 0): (0, 1), + (2, 1): (0, 1), (2, 2): (1, 0), (3, 0): (-1, 0), + (3, 1): None, (3, 2): None} + + assert policy_iteration(sequential_decision_environment_1) == { + (0, 0): (0, 1), (0, 1): (0, 1), (0, 2): (1, 0), + (1, 0): (1, 0), (1, 2): (1, 0), (2, 0): (0, 1), + (2, 1): (0, 1), (2, 2): (1, 0), (3, 0): (-1, 0), + (3, 1): None, (3, 2): None} + + assert policy_iteration(sequential_decision_environment_2) == { + (0, 0): (1, 0), (0, 1): (0, 1), (0, 2): (1, 0), + (1, 0): (1, 0), (1, 2): (1, 0), (2, 0): (1, 0), + (2, 1): (1, 0), (2, 2): (1, 0), (3, 0): (0, 1), + (3, 1): None, (3, 2): None} + + +def test_best_policy(): + pi = best_policy(sequential_decision_environment, + value_iteration(sequential_decision_environment, .01)) + assert sequential_decision_environment.to_arrows(pi) == [['>', '>', '>', '.'], + ['^', 
None, '^', '.'], + ['^', '>', '^', '<']] + + pi_1 = best_policy(sequential_decision_environment_1, + value_iteration(sequential_decision_environment_1, .01)) + assert sequential_decision_environment_1.to_arrows(pi_1) == [['>', '>', '>', '.'], + ['^', None, '^', '.'], + ['^', '>', '^', '<']] + + pi_2 = best_policy(sequential_decision_environment_2, + value_iteration(sequential_decision_environment_2, .01)) + assert sequential_decision_environment_2.to_arrows(pi_2) == [['>', '>', '>', '.'], + ['^', None, '>', '.'], + ['>', '>', '>', '^']] + + pi_3 = best_policy(sequential_decision_environment_3, + value_iteration(sequential_decision_environment_3, .01)) + assert sequential_decision_environment_3.to_arrows(pi_3) == [['.', '>', '>', '>', '>', '>'], + ['v', None, None, '>', '>', '^'], + ['v', None, '.', '.', None, '^'], + ['v', '<', 'v', None, None, '^'], + ['<', '<', '<', '<', '<', '.']] + + +def test_transition_model(): + transition_model = {'a': {'plan1': [(0.2, 'a'), (0.3, 'b'), (0.3, 'c'), (0.2, 'd')], + 'plan2': [(0.4, 'a'), (0.15, 'b'), (0.45, 'c')], + 'plan3': [(0.2, 'a'), (0.5, 'b'), (0.3, 'c')], + }, + 'b': {'plan1': [(0.2, 'a'), (0.6, 'b'), (0.2, 'c'), (0.1, 'd')], + 'plan2': [(0.6, 'a'), (0.2, 'b'), (0.1, 'c'), (0.1, 'd')], + 'plan3': [(0.3, 'a'), (0.3, 'b'), (0.4, 'c')], + }, + 'c': {'plan1': [(0.3, 'a'), (0.5, 'b'), (0.1, 'c'), (0.1, 'd')], + 'plan2': [(0.5, 'a'), (0.3, 'b'), (0.1, 'c'), (0.1, 'd')], + 'plan3': [(0.1, 'a'), (0.3, 'b'), (0.1, 'c'), (0.5, 'd')], + }} + + mdp = MDP(init="a", actlist={"plan1", "plan2", "plan3"}, terminals={"d"}, states={"a", "b", "c", "d"}, + transitions=transition_model) + + assert mdp.T("a", "plan3") == [(0.2, 'a'), (0.5, 'b'), (0.3, 'c')] + assert mdp.T("b", "plan2") == [(0.6, 'a'), (0.2, 'b'), (0.1, 'c'), (0.1, 'd')] + assert mdp.T("c", "plan1") == [(0.3, 'a'), (0.5, 'b'), (0.1, 'c'), (0.1, 'd')] + + +def test_pomdp_value_iteration(): + t_prob = [[[0.65, 0.35], [0.65, 0.35]], [[0.65, 0.35], [0.65, 0.35]], [[1.0, 0.0], 
[0.0, 1.0]]] + e_prob = [[[0.5, 0.5], [0.5, 0.5]], [[0.5, 0.5], [0.5, 0.5]], [[0.8, 0.2], [0.3, 0.7]]] + rewards = [[5, -10], [-20, 5], [-1, -1]] + + gamma = 0.95 + actions = ('0', '1', '2') + states = ('0', '1') + + pomdp = POMDP(actions, t_prob, e_prob, rewards, states, gamma) + utility = pomdp_value_iteration(pomdp, epsilon=5) + + for _, v in utility.items(): + sum_ = 0 + for element in v: + sum_ += sum(element) + + assert -9.76 < sum_ < -9.70 or 246.5 < sum_ < 248.5 or 0 < sum_ < 1 + + +def test_pomdp_value_iteration2(): + t_prob = [[[0.5, 0.5], [0.5, 0.5]], [[0.5, 0.5], [0.5, 0.5]], [[1.0, 0.0], [0.0, 1.0]]] + e_prob = [[[0.5, 0.5], [0.5, 0.5]], [[0.5, 0.5], [0.5, 0.5]], [[0.85, 0.15], [0.15, 0.85]]] + rewards = [[-100, 10], [10, -100], [-1, -1]] + + gamma = 0.95 + actions = ('0', '1', '2') + states = ('0', '1') + + pomdp = POMDP(actions, t_prob, e_prob, rewards, states, gamma) + utility = pomdp_value_iteration(pomdp, epsilon=100) + + for _, v in utility.items(): + sum_ = 0 + for element in v: + sum_ += sum(element) + + assert -77.31 < sum_ < -77.25 or 799 < sum_ < 800 + + +if __name__ == "__main__": + pytest.main() diff --git a/tests/test_nlp.py b/tests/test_nlp.py index 1d8320cdc..85d246dfa 100644 --- a/tests/test_nlp.py +++ b/tests/test_nlp.py @@ -1,9 +1,11 @@ +import random + import pytest import nlp from nlp import loadPageHTML, stripRawHTML, findOutlinks, onlyWikipediaURLS -from nlp import expand_pages, relevant_pages, normalize, ConvergenceDetector, getInlinks -from nlp import getOutlinks, Page, determineInlinks, HITS +from nlp import expand_pages, relevant_pages, normalize, ConvergenceDetector, getInLinks +from nlp import getOutLinks, Page, determineInlinks, HITS from nlp import Rules, Lexicon, Grammar, ProbRules, ProbLexicon, ProbGrammar from nlp import Chart, CYK_parse # Clumsy imports because we want to access certain nlp.py globals explicitly, because @@ -12,6 +14,8 @@ from unittest.mock import patch from io import BytesIO 
+random.seed("aima-python") + def test_rules(): check = {'A': [['B', 'C'], ['D', 'E']], 'B': [['E'], ['a'], ['b', 'c']]} @@ -39,7 +43,7 @@ def test_grammar(): def test_generation(): lexicon = Lexicon(Article="the | a | an", - Pronoun="i | you | he") + Pronoun="i | you | he") rules = Rules( S="Article | More | Pronoun", @@ -113,6 +117,11 @@ def test_CYK_parse(): P = CYK_parse(words, grammar) assert len(P) == 52 + grammar = nlp.E_Prob_Chomsky_ + words = ['astronomers', 'saw', 'stars'] + P = CYK_parse(words, grammar) + assert len(P) == 32 + # ______________________________________________________________________________ # Data Setup @@ -148,9 +157,10 @@ def test_CYK_parse(): pageDict = {pA.address: pA, pB.address: pB, pC.address: pC, pD.address: pD, pE.address: pE, pF.address: pF} nlp.pagesIndex = pageDict -nlp.pagesContent ={pA.address: testHTML, pB.address: testHTML2, - pC.address: testHTML, pD.address: testHTML2, - pE.address: testHTML, pF.address: testHTML2} +nlp.pagesContent = {pA.address: testHTML, pB.address: testHTML2, + pC.address: testHTML, pD.address: testHTML2, + pE.address: testHTML, pF.address: testHTML2} + # This test takes a long time (> 60 secs) # def test_loadPageHTML(): @@ -178,12 +188,15 @@ def test_determineInlinks(): assert set(determineInlinks(pE)) == set([]) assert set(determineInlinks(pF)) == set(['E']) + def test_findOutlinks_wiki(): testPage = pageDict[pA.address] outlinks = findOutlinks(testPage, handleURLs=onlyWikipediaURLS) assert "https://en.wikipedia.org/wiki/TestThing" in outlinks assert "https://en.wikipedia.org/wiki/TestThing" in outlinks assert "https://google.com.au" not in outlinks + + # ______________________________________________________________________________ # HITS Helper Functions @@ -212,7 +225,8 @@ def test_relevant_pages(): def test_normalize(): normalize(pageDict) print(page.hub for addr, page in nlp.pagesIndex.items()) - expected_hub = [1/91**0.5, 2/91**0.5, 3/91**0.5, 4/91**0.5, 5/91**0.5, 6/91**0.5] # Works only for 
sample data above + expected_hub = [1 / 91 ** 0.5, 2 / 91 ** 0.5, 3 / 91 ** 0.5, 4 / 91 ** 0.5, 5 / 91 ** 0.5, + 6 / 91 ** 0.5] # Works only for sample data above expected_auth = list(reversed(expected_hub)) assert len(expected_hub) == len(expected_auth) == len(nlp.pagesIndex) assert expected_hub == [page.hub for addr, page in sorted(nlp.pagesIndex.items())] @@ -238,12 +252,12 @@ def test_detectConvergence(): def test_getInlinks(): - inlnks = getInlinks(pageDict['A']) + inlnks = getInLinks(pageDict['A']) assert sorted(inlnks) == pageDict['A'].inlinks def test_getOutlinks(): - outlnks = getOutlinks(pageDict['A']) + outlnks = getOutLinks(pageDict['A']) assert sorted(outlnks) == pageDict['A'].outlinks diff --git a/tests/test_nlp4e.py b/tests/test_nlp4e.py new file mode 100644 index 000000000..2d16a3196 --- /dev/null +++ b/tests/test_nlp4e.py @@ -0,0 +1,139 @@ +import random + +import pytest +import nlp + +from nlp4e import Rules, Lexicon, Grammar, ProbRules, ProbLexicon, ProbGrammar, E0 +from nlp4e import Chart, CYK_parse, subspan, astar_search_parsing, beam_search_parsing + +# Clumsy imports because we want to access certain nlp.py globals explicitly, because +# they are accessed by functions within nlp.py + +random.seed("aima-python") + + +def test_rules(): + check = {'A': [['B', 'C'], ['D', 'E']], 'B': [['E'], ['a'], ['b', 'c']]} + assert Rules(A="B C | D E", B="E | a | b c") == check + + +def test_lexicon(): + check = {'Article': ['the', 'a', 'an'], 'Pronoun': ['i', 'you', 'he']} + lexicon = Lexicon(Article="the | a | an", Pronoun="i | you | he") + assert lexicon == check + + +def test_grammar(): + rules = Rules(A="B C | D E", B="E | a | b c") + lexicon = Lexicon(Article="the | a | an", Pronoun="i | you | he") + grammar = Grammar("Simplegram", rules, lexicon) + + assert grammar.rewrites_for('A') == [['B', 'C'], ['D', 'E']] + assert grammar.isa('the', 'Article') + + grammar = nlp.E_Chomsky + for rule in grammar.cnf_rules(): + assert len(rule) == 3 + + +def 
test_generation(): + lexicon = Lexicon(Article="the | a | an", + Pronoun="i | you | he") + + rules = Rules( + S="Article | More | Pronoun", + More="Article Pronoun | Pronoun Pronoun" + ) + + grammar = Grammar("Simplegram", rules, lexicon) + + sentence = grammar.generate_random('S') + for token in sentence.split(): + found = False + for non_terminal, terminals in grammar.lexicon.items(): + if token in terminals: + found = True + assert found + + +def test_prob_rules(): + check = {'A': [(['B', 'C'], 0.3), (['D', 'E'], 0.7)], + 'B': [(['E'], 0.1), (['a'], 0.2), (['b', 'c'], 0.7)]} + rules = ProbRules(A="B C [0.3] | D E [0.7]", B="E [0.1] | a [0.2] | b c [0.7]") + assert rules == check + + +def test_prob_lexicon(): + check = {'Article': [('the', 0.5), ('a', 0.25), ('an', 0.25)], + 'Pronoun': [('i', 0.4), ('you', 0.3), ('he', 0.3)]} + lexicon = ProbLexicon(Article="the [0.5] | a [0.25] | an [0.25]", + Pronoun="i [0.4] | you [0.3] | he [0.3]") + assert lexicon == check + + +def test_prob_grammar(): + rules = ProbRules(A="B C [0.3] | D E [0.7]", B="E [0.1] | a [0.2] | b c [0.7]") + lexicon = ProbLexicon(Article="the [0.5] | a [0.25] | an [0.25]", + Pronoun="i [0.4] | you [0.3] | he [0.3]") + grammar = ProbGrammar("Simplegram", rules, lexicon) + + assert grammar.rewrites_for('A') == [(['B', 'C'], 0.3), (['D', 'E'], 0.7)] + assert grammar.isa('the', 'Article') + + grammar = nlp.E_Prob_Chomsky + for rule in grammar.cnf_rules(): + assert len(rule) == 4 + + +def test_prob_generation(): + lexicon = ProbLexicon(Verb="am [0.5] | are [0.25] | is [0.25]", + Pronoun="i [0.4] | you [0.3] | he [0.3]") + + rules = ProbRules( + S="Verb [0.5] | More [0.3] | Pronoun [0.1] | nobody is here [0.1]", + More="Pronoun Verb [0.7] | Pronoun Pronoun [0.3]") + + grammar = ProbGrammar("Simplegram", rules, lexicon) + + sentence = grammar.generate_random('S') + assert len(sentence) == 2 + + +def test_chart_parsing(): + chart = Chart(nlp.E0) + parses = chart.parses('the stench is in 2 2') + assert 
len(parses) == 1 + + +def test_CYK_parse(): + grammar = nlp.E_Prob_Chomsky + words = ['the', 'robot', 'is', 'good'] + P = CYK_parse(words, grammar) + assert len(P) == 5 + + grammar = nlp.E_Prob_Chomsky_ + words = ['astronomers', 'saw', 'stars'] + P = CYK_parse(words, grammar) + assert len(P) == 3 + + +def test_subspan(): + spans = subspan(3) + assert next(spans) == (1, 1, 2) + assert next(spans) == (2, 2, 3) + assert next(spans) == (1, 1, 3) + assert next(spans) == (1, 2, 3) + + +def test_text_parsing(): + words = ["the", "wumpus", "is", "dead"] + grammar = E0 + assert astar_search_parsing(words, grammar) == 'S' + assert beam_search_parsing(words, grammar) == 'S' + words = ["the", "is", "wumpus", "dead"] + assert astar_search_parsing(words, grammar) is False + assert beam_search_parsing(words, grammar) is False + + +if __name__ == '__main__': + pytest.main() diff --git a/tests/test_perception4e.py b/tests/test_perception4e.py new file mode 100644 index 000000000..46d534523 --- /dev/null +++ b/tests/test_perception4e.py @@ -0,0 +1,87 @@ +import random + +import pytest + +from perception4e import * +from PIL import Image +import numpy as np +import os + +random.seed("aima-python") + + +def test_array_normalization(): + assert list(array_normalization([1, 2, 3, 4, 5], 0, 1)) == [0, 0.25, 0.5, 0.75, 1] + assert list(array_normalization([1, 2, 3, 4, 5], 1, 2)) == [1, 1.25, 1.5, 1.75, 2] + + +def test_sum_squared_difference(): + image = Image.open(os.path.abspath("./images/broxrevised.png")) + arr = np.asarray(image) + arr1 = arr[10:500, :514] + arr2 = arr[10:500, 514:1028] + assert sum_squared_difference(arr1, arr1)[1] == 0 + assert sum_squared_difference(arr1, arr1)[0] == (0, 0) + assert sum_squared_difference(arr1, arr2)[1] > 200000 + + +def test_gen_gray_scale_picture(): + assert list(gen_gray_scale_picture(size=3, level=3)[0]) == [0, 125, 250] + assert list(gen_gray_scale_picture(size=3, level=3)[1]) == [125, 125, 250] + assert 
list(gen_gray_scale_picture(size=3, level=3)[2]) == [250, 250, 250] + assert list(gen_gray_scale_picture(2, level=2)[0]) == [0, 250] + assert list(gen_gray_scale_picture(2, level=2)[1]) == [250, 250] + + +def test_generate_edge_weight(): + assert generate_edge_weight(gray_scale_image, (0, 0), (2, 2)) == 5 + assert generate_edge_weight(gray_scale_image, (1, 0), (0, 1)) == 255 + + +def test_graph_bfs(): + graph = Graph(gray_scale_image) + assert not graph.bfs((1, 1), (0, 0), []) + parents = [] + assert graph.bfs((0, 0), (2, 2), parents) + assert len(parents) == 8 + + +def test_graph_min_cut(): + image = gen_gray_scale_picture(size=3, level=2) + graph = Graph(image) + assert len(graph.min_cut((0, 0), (2, 2))) == 4 + image = gen_gray_scale_picture(size=10, level=2) + graph = Graph(image) + assert len(graph.min_cut((0, 0), (9, 9))) == 10 + + +def test_gen_discs(): + discs = gen_discs(100, 2) + assert len(discs) == 2 + assert len(discs[1]) == len(discs[0]) == 8 + + +def test_simple_convnet(): + train, val, test = load_MINST(1000, 100, 10) + model = simple_convnet() + model.fit(train[0], train[1], validation_data=(val[0], val[1]), epochs=5, verbose=2, batch_size=32) + scores = model.evaluate(test[0], test[1], verbose=1) + assert scores[1] > 0.2 + + +def test_ROIPoolingLayer(): + # Create feature map input + feature_maps_shape = (200, 100, 1) + feature_map = np.ones(feature_maps_shape, dtype='float32') + feature_map[200 - 1, 100 - 3, 0] = 50 + roiss = np.asarray([[0.5, 0.2, 0.7, 0.4], [0.0, 0.0, 1.0, 1.0]]) + assert pool_rois(feature_map, roiss, 3, 7)[0].tolist() == [[1, 1, 1, 1, 1, 1, 1], + [1, 1, 1, 1, 1, 1, 1], + [1, 1, 1, 1, 1, 1, 1]] + assert pool_rois(feature_map, roiss, 3, 7)[1].tolist() == [[1, 1, 1, 1, 1, 1, 1], + [1, 1, 1, 1, 1, 1, 1], + [1, 1, 1, 1, 1, 1, 50]] + + +if __name__ == '__main__': + pytest.main() diff --git a/tests/test_planning.py b/tests/test_planning.py index 2c355f54c..a39152adc 100644 --- a/tests/test_planning.py +++ b/tests/test_planning.py @@ 
-1,32 +1,40 @@ +import random + +import pytest + from planning import * +from search import astar_search from utils import expr -from logic import FolKB +from logic import FolKB, conjuncts + +random.seed('aima-python') def test_action(): - precond = [[expr("P(x)"), expr("Q(y, z)")], [expr("Q(x)")]] - effect = [[expr("Q(x)")], [expr("P(x)")]] - a=Action(expr("A(x,y,z)"), precond, effect) - args = [expr("A"), expr("B"), expr("C")] - assert a.substitute(expr("P(x, z, y)"), args) == expr("P(A, C, B)") - test_kb = FolKB([expr("P(A)"), expr("Q(B, C)"), expr("R(D)")]) + precond = 'At(c, a) & At(p, a) & Cargo(c) & Plane(p) & Airport(a)' + effect = 'In(c, p) & ~At(c, a)' + a = Action('Load(c, p, a)', precond, effect) + args = [expr('C1'), expr('P1'), expr('SFO')] + assert a.substitute(expr('Load(c, p, a)'), args) == expr('Load(C1, P1, SFO)') + test_kb = FolKB(conjuncts(expr('At(C1, SFO) & At(C2, JFK) & At(P1, SFO) & At(P2, JFK) & Cargo(C1) & Cargo(C2) & ' + 'Plane(P1) & Plane(P2) & Airport(SFO) & Airport(JFK)'))) assert a.check_precond(test_kb, args) a.act(test_kb, args) - assert test_kb.ask(expr("P(A)")) is False - assert test_kb.ask(expr("Q(A)")) is not False - assert test_kb.ask(expr("Q(B, C)")) is not False + assert test_kb.ask(expr('In(C1, P2)')) is False + assert test_kb.ask(expr('In(C1, P1)')) is not False + assert test_kb.ask(expr('Plane(P2)')) is not False assert not a.check_precond(test_kb, args) def test_air_cargo_1(): p = air_cargo() assert p.goal_test() is False - solution_1 = [expr("Load(C1 , P1, SFO)"), - expr("Fly(P1, SFO, JFK)"), - expr("Unload(C1, P1, JFK)"), - expr("Load(C2, P2, JFK)"), - expr("Fly(P2, JFK, SFO)"), - expr("Unload (C2, P2, SFO)")] + solution_1 = [expr('Load(C1 , P1, SFO)'), + expr('Fly(P1, SFO, JFK)'), + expr('Unload(C1, P1, JFK)'), + expr('Load(C2, P2, JFK)'), + expr('Fly(P2, JFK, SFO)'), + expr('Unload(C2, P2, SFO)')] for action in solution_1: p.act(action) @@ -37,12 +45,12 @@ def test_air_cargo_1(): def test_air_cargo_2(): p = 
air_cargo() assert p.goal_test() is False - solution_2 = [expr("Load(C2, P2, JFK)"), - expr("Fly(P2, JFK, SFO)"), - expr("Unload (C2, P2, SFO)"), - expr("Load(C1 , P1, SFO)"), - expr("Fly(P1, SFO, JFK)"), - expr("Unload(C1, P1, JFK)")] + solution_2 = [expr('Load(C1 , P1, SFO)'), + expr('Fly(P1, SFO, JFK)'), + expr('Unload(C1, P1, JFK)'), + expr('Load(C2, P1, JFK)'), + expr('Fly(P1, JFK, SFO)'), + expr('Unload(C2, P1, SFO)')] for action in solution_2: p.act(action) @@ -50,14 +58,59 @@ def test_air_cargo_2(): assert p.goal_test() -def test_spare_tire(): +def test_air_cargo_3(): + p = air_cargo() + assert p.goal_test() is False + solution_3 = [expr('Load(C2, P2, JFK)'), + expr('Fly(P2, JFK, SFO)'), + expr('Unload(C2, P2, SFO)'), + expr('Load(C1 , P1, SFO)'), + expr('Fly(P1, SFO, JFK)'), + expr('Unload(C1, P1, JFK)')] + + for action in solution_3: + p.act(action) + + assert p.goal_test() + + +def test_air_cargo_4(): + p = air_cargo() + assert p.goal_test() is False + solution_4 = [expr('Load(C2, P2, JFK)'), + expr('Fly(P2, JFK, SFO)'), + expr('Unload(C2, P2, SFO)'), + expr('Load(C1, P2, SFO)'), + expr('Fly(P2, SFO, JFK)'), + expr('Unload(C1, P2, JFK)')] + + for action in solution_4: + p.act(action) + + assert p.goal_test() + + +def test_spare_tire_1(): p = spare_tire() assert p.goal_test() is False - solution = [expr("Remove(Flat, Axle)"), - expr("Remove(Spare, Trunk)"), - expr("PutOn(Spare, Axle)")] + solution_1 = [expr('Remove(Flat, Axle)'), + expr('Remove(Spare, Trunk)'), + expr('PutOn(Spare, Axle)')] - for action in solution: + for action in solution_1: + p.act(action) + + assert p.goal_test() + + +def test_spare_tire_2(): + p = spare_tire() + assert p.goal_test() is False + solution_2 = [expr('Remove(Spare, Trunk)'), + expr('Remove(Flat, Axle)'), + expr('PutOn(Spare, Axle)')] + + for action in solution_2: p.act(action) assert p.goal_test() @@ -66,9 +119,22 @@ def test_spare_tire(): def test_three_block_tower(): p = three_block_tower() assert p.goal_test() is False 
- solution = [expr("MoveToTable(C, A)"), - expr("Move(B, Table, C)"), - expr("Move(A, Table, B)")] + solution = [expr('MoveToTable(C, A)'), + expr('Move(B, Table, C)'), + expr('Move(A, Table, B)')] + + for action in solution: + p.act(action) + + assert p.goal_test() + + +def test_simple_blocks_world(): + p = simple_blocks_world() + assert p.goal_test() is False + solution = [expr('ToTable(A, B)'), + expr('FromTable(B, A)'), + expr('FromTable(C, B)')] for action in solution: p.act(action) @@ -79,8 +145,8 @@ def test_three_block_tower(): def test_have_cake_and_eat_cake_too(): p = have_cake_and_eat_cake_too() assert p.goal_test() is False - solution = [expr("Eat(Cake)"), - expr("Bake(Cake)")] + solution = [expr('Eat(Cake)'), + expr('Bake(Cake)')] for action in solution: p.act(action) @@ -88,10 +154,39 @@ def test_have_cake_and_eat_cake_too(): assert p.goal_test() +def test_shopping_problem_1(): + p = shopping_problem() + assert p.goal_test() is False + solution_1 = [expr('Go(Home, SM)'), + expr('Buy(Banana, SM)'), + expr('Buy(Milk, SM)'), + expr('Go(SM, HW)'), + expr('Buy(Drill, HW)')] + + for action in solution_1: + p.act(action) + + assert p.goal_test() + + +def test_shopping_problem_2(): + p = shopping_problem() + assert p.goal_test() is False + solution_2 = [expr('Go(Home, HW)'), + expr('Buy(Drill, HW)'), + expr('Go(HW, SM)'), + expr('Buy(Banana, SM)'), + expr('Buy(Milk, SM)')] + + for action in solution_2: + p.act(action) + + assert p.goal_test() + + def test_graph_call(): - pddl = spare_tire() - negkb = FolKB([expr('At(Flat, Trunk)')]) - graph = Graph(pddl, negkb) + planning_problem = spare_tire() + graph = Graph(planning_problem) levels_size = len(graph.levels) graph() @@ -99,6 +194,336 @@ def test_graph_call(): assert levels_size == len(graph.levels) - 1 +def test_graphPlan(): + spare_tire_solution = spare_tire_graphPlan() + spare_tire_solution = linearize(spare_tire_solution) + assert expr('Remove(Flat, Axle)') in spare_tire_solution + assert 
expr('Remove(Spare, Trunk)') in spare_tire_solution + assert expr('PutOn(Spare, Axle)') in spare_tire_solution + + cake_solution = have_cake_and_eat_cake_too_graphPlan() + cake_solution = linearize(cake_solution) + assert expr('Eat(Cake)') in cake_solution + assert expr('Bake(Cake)') in cake_solution + + air_cargo_solution = air_cargo_graphPlan() + air_cargo_solution = linearize(air_cargo_solution) + assert expr('Load(C1, P1, SFO)') in air_cargo_solution + assert expr('Load(C2, P2, JFK)') in air_cargo_solution + assert expr('Fly(P1, SFO, JFK)') in air_cargo_solution + assert expr('Fly(P2, JFK, SFO)') in air_cargo_solution + assert expr('Unload(C1, P1, JFK)') in air_cargo_solution + assert expr('Unload(C2, P2, SFO)') in air_cargo_solution + + sussman_anomaly_solution = three_block_tower_graphPlan() + sussman_anomaly_solution = linearize(sussman_anomaly_solution) + assert expr('MoveToTable(C, A)') in sussman_anomaly_solution + assert expr('Move(B, Table, C)') in sussman_anomaly_solution + assert expr('Move(A, Table, B)') in sussman_anomaly_solution + + blocks_world_solution = simple_blocks_world_graphPlan() + blocks_world_solution = linearize(blocks_world_solution) + assert expr('ToTable(A, B)') in blocks_world_solution + assert expr('FromTable(B, A)') in blocks_world_solution + assert expr('FromTable(C, B)') in blocks_world_solution + + shopping_problem_solution = shopping_graphPlan() + shopping_problem_solution = linearize(shopping_problem_solution) + assert expr('Go(Home, HW)') in shopping_problem_solution + assert expr('Go(Home, SM)') in shopping_problem_solution + assert expr('Buy(Drill, HW)') in shopping_problem_solution + assert expr('Buy(Banana, SM)') in shopping_problem_solution + assert expr('Buy(Milk, SM)') in shopping_problem_solution + + +def test_forwardPlan(): + spare_tire_solution = astar_search(ForwardPlan(spare_tire())).solution() + spare_tire_solution = list(map(lambda action: Expr(action.name, *action.args), spare_tire_solution)) + assert 
expr('Remove(Flat, Axle)') in spare_tire_solution + assert expr('Remove(Spare, Trunk)') in spare_tire_solution + assert expr('PutOn(Spare, Axle)') in spare_tire_solution + + cake_solution = astar_search(ForwardPlan(have_cake_and_eat_cake_too())).solution() + cake_solution = list(map(lambda action: Expr(action.name, *action.args), cake_solution)) + assert expr('Eat(Cake)') in cake_solution + assert expr('Bake(Cake)') in cake_solution + + air_cargo_solution = astar_search(ForwardPlan(air_cargo())).solution() + air_cargo_solution = list(map(lambda action: Expr(action.name, *action.args), air_cargo_solution)) + assert expr('Load(C2, P2, JFK)') in air_cargo_solution + assert expr('Fly(P2, JFK, SFO)') in air_cargo_solution + assert expr('Unload(C2, P2, SFO)') in air_cargo_solution + assert expr('Load(C1, P2, SFO)') in air_cargo_solution + assert expr('Fly(P2, SFO, JFK)') in air_cargo_solution + assert expr('Unload(C1, P2, JFK)') in air_cargo_solution + + sussman_anomaly_solution = astar_search(ForwardPlan(three_block_tower())).solution() + sussman_anomaly_solution = list(map(lambda action: Expr(action.name, *action.args), sussman_anomaly_solution)) + assert expr('MoveToTable(C, A)') in sussman_anomaly_solution + assert expr('Move(B, Table, C)') in sussman_anomaly_solution + assert expr('Move(A, Table, B)') in sussman_anomaly_solution + + blocks_world_solution = astar_search(ForwardPlan(simple_blocks_world())).solution() + blocks_world_solution = list(map(lambda action: Expr(action.name, *action.args), blocks_world_solution)) + assert expr('ToTable(A, B)') in blocks_world_solution + assert expr('FromTable(B, A)') in blocks_world_solution + assert expr('FromTable(C, B)') in blocks_world_solution + + shopping_problem_solution = astar_search(ForwardPlan(shopping_problem())).solution() + shopping_problem_solution = list(map(lambda action: Expr(action.name, *action.args), shopping_problem_solution)) + assert expr('Go(Home, SM)') in shopping_problem_solution + assert 
expr('Buy(Banana, SM)') in shopping_problem_solution + assert expr('Buy(Milk, SM)') in shopping_problem_solution + assert expr('Go(SM, HW)') in shopping_problem_solution + assert expr('Buy(Drill, HW)') in shopping_problem_solution + + +def test_backwardPlan(): + spare_tire_solution = astar_search(BackwardPlan(spare_tire())).solution() + spare_tire_solution = list(map(lambda action: Expr(action.name, *action.args), spare_tire_solution)) + assert expr('Remove(Flat, Axle)') in spare_tire_solution + assert expr('Remove(Spare, Trunk)') in spare_tire_solution + assert expr('PutOn(Spare, Axle)') in spare_tire_solution + + cake_solution = astar_search(BackwardPlan(have_cake_and_eat_cake_too())).solution() + cake_solution = list(map(lambda action: Expr(action.name, *action.args), cake_solution)) + assert expr('Eat(Cake)') in cake_solution + assert expr('Bake(Cake)') in cake_solution + + air_cargo_solution = astar_search(BackwardPlan(air_cargo())).solution() + air_cargo_solution = list(map(lambda action: Expr(action.name, *action.args), air_cargo_solution)) + # use `in (list_1, list_2)`: `x == list_1 or list_2` is always true because a non-empty list is truthy + assert air_cargo_solution in ([expr('Unload(C1, P1, JFK)'), + expr('Fly(P1, SFO, JFK)'), + expr('Unload(C2, P2, SFO)'), + expr('Fly(P2, JFK, SFO)'), + expr('Load(C2, P2, JFK)'), + expr('Load(C1, P1, SFO)')], [expr('Load(C1, P1, SFO)'), + expr('Fly(P1, SFO, JFK)'), + expr('Unload(C1, P1, JFK)'), + expr('Load(C2, P1, JFK)'), + expr('Fly(P1, JFK, SFO)'), + expr('Unload(C2, P1, SFO)')]) + + sussman_anomaly_solution = astar_search(BackwardPlan(three_block_tower())).solution() + sussman_anomaly_solution = list(map(lambda action: Expr(action.name, *action.args), sussman_anomaly_solution)) + assert expr('MoveToTable(C, A)') in sussman_anomaly_solution + assert expr('Move(B, Table, C)') in sussman_anomaly_solution + assert expr('Move(A, Table, B)') in sussman_anomaly_solution + + blocks_world_solution = astar_search(BackwardPlan(simple_blocks_world())).solution() + blocks_world_solution = list(map(lambda action:
Expr(action.name, *action.args), blocks_world_solution)) + assert expr('ToTable(A, B)') in blocks_world_solution + assert expr('FromTable(B, A)') in blocks_world_solution + assert expr('FromTable(C, B)') in blocks_world_solution + + shopping_problem_solution = astar_search(BackwardPlan(shopping_problem())).solution() + shopping_problem_solution = list(map(lambda action: Expr(action.name, *action.args), shopping_problem_solution)) + assert shopping_problem_solution in ([expr('Go(Home, SM)'), + expr('Buy(Banana, SM)'), + expr('Buy(Milk, SM)'), + expr('Go(SM, HW)'), + expr('Buy(Drill, HW)')], [expr('Go(Home, HW)'), + expr('Buy(Drill, HW)'), + expr('Go(HW, SM)'), + expr('Buy(Banana, SM)'), + expr('Buy(Milk, SM)')]) + + +def test_CSPlan(): + spare_tire_solution = CSPlan(spare_tire(), 3) + assert expr('Remove(Flat, Axle)') in spare_tire_solution + assert expr('Remove(Spare, Trunk)') in spare_tire_solution + assert expr('PutOn(Spare, Axle)') in spare_tire_solution + + cake_solution = CSPlan(have_cake_and_eat_cake_too(), 2) + assert expr('Eat(Cake)') in cake_solution + assert expr('Bake(Cake)') in cake_solution + + air_cargo_solution = CSPlan(air_cargo(), 6) + assert air_cargo_solution in ([expr('Load(C1, P1, SFO)'), + expr('Fly(P1, SFO, JFK)'), + expr('Unload(C1, P1, JFK)'), + expr('Load(C2, P1, JFK)'), + expr('Fly(P1, JFK, SFO)'), + expr('Unload(C2, P1, SFO)')], [expr('Load(C1, P1, SFO)'), + expr('Fly(P1, SFO, JFK)'), + expr('Unload(C1, P1, JFK)'), + expr('Load(C2, P2, JFK)'), + expr('Fly(P2, JFK, SFO)'), + expr('Unload(C2, P2, SFO)')]) + + sussman_anomaly_solution = CSPlan(three_block_tower(), 3) + assert expr('MoveToTable(C, A)') in sussman_anomaly_solution + assert expr('Move(B, Table, C)') in sussman_anomaly_solution + assert expr('Move(A, Table, B)') in sussman_anomaly_solution + + blocks_world_solution = CSPlan(simple_blocks_world(), 3) + assert expr('ToTable(A, B)') in blocks_world_solution + assert expr('FromTable(B, A)') in blocks_world_solution + assert
expr('FromTable(C, B)') in blocks_world_solution + + shopping_problem_solution = CSPlan(shopping_problem(), 5) + assert shopping_problem_solution in ([expr('Go(Home, SM)'), + expr('Buy(Banana, SM)'), + expr('Buy(Milk, SM)'), + expr('Go(SM, HW)'), + expr('Buy(Drill, HW)')], [expr('Go(Home, HW)'), + expr('Buy(Drill, HW)'), + expr('Go(HW, SM)'), + expr('Buy(Banana, SM)'), + expr('Buy(Milk, SM)')]) + + +def test_SATPlan(): + spare_tire_solution = SATPlan(spare_tire(), 3) + assert expr('Remove(Flat, Axle)') in spare_tire_solution + assert expr('Remove(Spare, Trunk)') in spare_tire_solution + assert expr('PutOn(Spare, Axle)') in spare_tire_solution + + cake_solution = SATPlan(have_cake_and_eat_cake_too(), 2) + assert expr('Eat(Cake)') in cake_solution + assert expr('Bake(Cake)') in cake_solution + + sussman_anomaly_solution = SATPlan(three_block_tower(), 3) + assert expr('MoveToTable(C, A)') in sussman_anomaly_solution + assert expr('Move(B, Table, C)') in sussman_anomaly_solution + assert expr('Move(A, Table, B)') in sussman_anomaly_solution + + blocks_world_solution = SATPlan(simple_blocks_world(), 3) + assert expr('ToTable(A, B)') in blocks_world_solution + assert expr('FromTable(B, A)') in blocks_world_solution + assert expr('FromTable(C, B)') in blocks_world_solution + + +def test_linearize_class(): + st = spare_tire() + possible_solutions = [[expr('Remove(Spare, Trunk)'), expr('Remove(Flat, Axle)'), expr('PutOn(Spare, Axle)')], + [expr('Remove(Flat, Axle)'), expr('Remove(Spare, Trunk)'), expr('PutOn(Spare, Axle)')]] + assert Linearize(st).execute() in possible_solutions + + ac = air_cargo() + possible_solutions = [ + [expr('Load(C1, P1, SFO)'), expr('Load(C2, P2, JFK)'), expr('Fly(P1, SFO, JFK)'), expr('Fly(P2, JFK, SFO)'), + expr('Unload(C1, P1, JFK)'), expr('Unload(C2, P2, SFO)')], + [expr('Load(C1, P1, SFO)'), expr('Load(C2, P2, JFK)'), expr('Fly(P1, SFO, JFK)'), expr('Fly(P2, JFK, SFO)'), + expr('Unload(C2, P2, SFO)'), expr('Unload(C1, P1, JFK)')], +
[expr('Load(C1, P1, SFO)'), expr('Load(C2, P2, JFK)'), expr('Fly(P2, JFK, SFO)'), expr('Fly(P1, SFO, JFK)'), + expr('Unload(C1, P1, JFK)'), expr('Unload(C2, P2, SFO)')], + [expr('Load(C1, P1, SFO)'), expr('Load(C2, P2, JFK)'), expr('Fly(P2, JFK, SFO)'), expr('Fly(P1, SFO, JFK)'), + expr('Unload(C2, P2, SFO)'), expr('Unload(C1, P1, JFK)')], + [expr('Load(C2, P2, JFK)'), expr('Load(C1, P1, SFO)'), expr('Fly(P1, SFO, JFK)'), expr('Fly(P2, JFK, SFO)'), + expr('Unload(C1, P1, JFK)'), expr('Unload(C2, P2, SFO)')], + [expr('Load(C2, P2, JFK)'), expr('Load(C1, P1, SFO)'), expr('Fly(P1, SFO, JFK)'), expr('Fly(P2, JFK, SFO)'), + expr('Unload(C2, P2, SFO)'), expr('Unload(C1, P1, JFK)')], + [expr('Load(C2, P2, JFK)'), expr('Load(C1, P1, SFO)'), expr('Fly(P2, JFK, SFO)'), expr('Fly(P1, SFO, JFK)'), + expr('Unload(C1, P1, JFK)'), expr('Unload(C2, P2, SFO)')], + [expr('Load(C2, P2, JFK)'), expr('Load(C1, P1, SFO)'), expr('Fly(P2, JFK, SFO)'), expr('Fly(P1, SFO, JFK)'), + expr('Unload(C2, P2, SFO)'), expr('Unload(C1, P1, JFK)')], + [expr('Load(C1, P1, SFO)'), expr('Fly(P1, SFO, JFK)'), expr('Load(C2, P2, JFK)'), expr('Fly(P2, JFK, SFO)'), + expr('Unload(C1, P1, JFK)'), expr('Unload(C2, P2, SFO)')], + [expr('Load(C1, P1, SFO)'), expr('Fly(P1, SFO, JFK)'), expr('Load(C2, P2, JFK)'), expr('Fly(P2, JFK, SFO)'), + expr('Unload(C2, P2, SFO)'), expr('Unload(C1, P1, JFK)')], + [expr('Load(C2, P2, JFK)'), expr('Fly(P2, JFK, SFO)'), expr('Load(C1, P1, SFO)'), expr('Fly(P1, SFO, JFK)'), + expr('Unload(C1, P1, JFK)'), expr('Unload(C2, P2, SFO)')], + [expr('Load(C2, P2, JFK)'), expr('Fly(P2, JFK, SFO)'), expr('Load(C1, P1, SFO)'), expr('Fly(P1, SFO, JFK)'), + expr('Unload(C2, P2, SFO)'), expr('Unload(C1, P1, JFK)')]] + assert Linearize(ac).execute() in possible_solutions + + ss = socks_and_shoes() + possible_solutions = [[expr('LeftSock'), expr('RightSock'), expr('LeftShoe'), expr('RightShoe')], + [expr('LeftSock'), expr('RightSock'), expr('RightShoe'), expr('LeftShoe')], + [expr('RightSock'), 
expr('LeftSock'), expr('LeftShoe'), expr('RightShoe')], + [expr('RightSock'), expr('LeftSock'), expr('RightShoe'), expr('LeftShoe')], + [expr('LeftSock'), expr('LeftShoe'), expr('RightSock'), expr('RightShoe')], + [expr('RightSock'), expr('RightShoe'), expr('LeftSock'), expr('LeftShoe')]] + assert Linearize(ss).execute() in possible_solutions + + +def test_expand_actions(): + assert len(spare_tire().expand_actions()) == 9 + assert len(air_cargo().expand_actions()) == 20 + assert len(have_cake_and_eat_cake_too().expand_actions()) == 2 + assert len(socks_and_shoes().expand_actions()) == 4 + assert len(simple_blocks_world().expand_actions()) == 12 + assert len(three_block_tower().expand_actions()) == 18 + assert len(shopping_problem().expand_actions()) == 12 + + +def test_expand_feats_values(): + assert len(spare_tire().expand_fluents()) == 10 + assert len(air_cargo().expand_fluents()) == 18 + assert len(have_cake_and_eat_cake_too().expand_fluents()) == 2 + assert len(socks_and_shoes().expand_fluents()) == 4 + assert len(simple_blocks_world().expand_fluents()) == 12 + assert len(three_block_tower().expand_fluents()) == 16 + assert len(shopping_problem().expand_fluents()) == 20 + + +def test_find_open_precondition(): + st = spare_tire() + pop = PartialOrderPlanner(st) + assert pop.find_open_precondition()[0] == expr('At(Spare, Axle)') + assert pop.find_open_precondition()[1] == pop.finish + assert pop.find_open_precondition()[2][0].name == 'PutOn' + + ss = socks_and_shoes() + pop = PartialOrderPlanner(ss) + assert (pop.find_open_precondition()[0] == expr('LeftShoeOn') and + pop.find_open_precondition()[2][0].name == 'LeftShoe') or ( + pop.find_open_precondition()[0] == expr('RightShoeOn') and + pop.find_open_precondition()[2][0].name == 'RightShoe') + assert pop.find_open_precondition()[1] == pop.finish + + cp = have_cake_and_eat_cake_too() + pop = PartialOrderPlanner(cp) + assert pop.find_open_precondition()[0] == expr('Eaten(Cake)') + assert 
pop.find_open_precondition()[1] == pop.finish + assert pop.find_open_precondition()[2][0].name == 'Eat' + + +def test_cyclic(): + st = spare_tire() + pop = PartialOrderPlanner(st) + graph = [('a', 'b'), ('a', 'c'), ('b', 'c'), ('b', 'd'), ('d', 'e'), ('e', 'c')] + assert not pop.cyclic(graph) + + graph = [('a', 'b'), ('a', 'c'), ('b', 'c'), ('b', 'd'), ('d', 'e'), ('e', 'c'), ('e', 'b')] + assert pop.cyclic(graph) + + graph = [('a', 'b'), ('a', 'c'), ('b', 'c'), ('b', 'd'), ('d', 'e'), ('e', 'c'), ('b', 'e'), ('a', 'e')] + assert not pop.cyclic(graph) + + graph = [('a', 'b'), ('a', 'c'), ('b', 'c'), ('b', 'd'), ('d', 'e'), ('e', 'c'), ('e', 'b'), ('b', 'e'), ('a', 'e')] + assert pop.cyclic(graph) + + +def test_partial_order_planner(): + ss = socks_and_shoes() + pop = PartialOrderPlanner(ss) + pop.execute(display=False) + plan = list(reversed(list(pop.toposort(pop.convert(pop.constraints))))) + assert list(plan[0])[0].name == 'Start' + assert (list(plan[1])[0].name == 'LeftSock' and list(plan[1])[1].name == 'RightSock') or ( + list(plan[1])[0].name == 'RightSock' and list(plan[1])[1].name == 'LeftSock') + assert (list(plan[2])[0].name == 'LeftShoe' and list(plan[2])[1].name == 'RightShoe') or ( + list(plan[2])[0].name == 'RightShoe' and list(plan[2])[1].name == 'LeftShoe') + assert list(plan[3])[0].name == 'Finish' + + +def test_double_tennis(): + p = double_tennis_problem() + assert not goal_test(p.goals, p.initial) + + solution = [expr('Go(A, RightBaseLine, LeftBaseLine)'), + expr('Hit(A, Ball, RightBaseLine)'), + expr('Go(A, LeftNet, RightBaseLine)')] + + for action in solution: + p.act(action) + + assert goal_test(p.goals, p.initial) + + def test_job_shop_problem(): p = job_shop_problem() assert p.goal_test() is False @@ -115,33 +540,235 @@ def test_job_shop_problem(): assert p.goal_test() -def test_refinements() : - init = [expr('At(Home)')] - def goal_test(kb): - return kb.ask(expr('At(SFO)')) - - library = {"HLA": ["Go(Home,SFO)","Taxi(Home, SFO)"], - 
"steps": [["Taxi(Home, SFO)"],[]], - "precond_pos": [["At(Home)"],["At(Home)"]], - "precond_neg": [[],[]], - "effect_pos": [["At(SFO)"],["At(SFO)"]], - "effect_neg": [["At(Home)"],["At(Home)"],]} - # Go SFO - precond_pos = [expr("At(Home)")] - precond_neg = [] - effect_add = [expr("At(SFO)")] - effect_rem = [expr("At(Home)")] - go_SFO = HLA(expr("Go(Home,SFO)"), - [precond_pos, precond_neg], [effect_add, effect_rem]) - # Taxi SFO - precond_pos = [expr("At(Home)")] - precond_neg = [] - effect_add = [expr("At(SFO)")] - effect_rem = [expr("At(Home)")] - taxi_SFO = HLA(expr("Go(Home,SFO)"), - [precond_pos, precond_neg], [effect_add, effect_rem]) - prob = Problem(init, [go_SFO, taxi_SFO], goal_test) - result = [i for i in Problem.refinements(go_SFO, prob, library)] - assert(len(result) == 1) - assert(result[0].name == "Taxi") - assert(result[0].args == (expr("Home"), expr("SFO"))) + +# hierarchies +library_1 = { + 'HLA': ['Go(Home,SFO)', 'Go(Home,SFO)', 'Drive(Home, SFOLongTermParking)', 'Shuttle(SFOLongTermParking, SFO)', + 'Taxi(Home, SFO)'], + 'steps': [['Drive(Home, SFOLongTermParking)', 'Shuttle(SFOLongTermParking, SFO)'], ['Taxi(Home, SFO)'], [], [], []], + 'precond': [['At(Home) & Have(Car)'], ['At(Home)'], ['At(Home) & Have(Car)'], ['At(SFOLongTermParking)'], + ['At(Home)']], + 'effect': [['At(SFO) & ~At(Home)'], ['At(SFO) & ~At(Home) & ~Have(Cash)'], ['At(SFOLongTermParking) & ~At(Home)'], + ['At(SFO) & ~At(LongTermParking)'], ['At(SFO) & ~At(Home) & ~Have(Cash)']]} + +library_2 = { + 'HLA': ['Go(Home,SFO)', 'Go(Home,SFO)', 'Bus(Home, MetroStop)', 'Metro(MetroStop, SFO)', 'Metro(MetroStop, SFO)', + 'Metro1(MetroStop, SFO)', 'Metro2(MetroStop, SFO)', 'Taxi(Home, SFO)'], + 'steps': [['Bus(Home, MetroStop)', 'Metro(MetroStop, SFO)'], ['Taxi(Home, SFO)'], [], ['Metro1(MetroStop, SFO)'], + ['Metro2(MetroStop, SFO)'], [], [], []], + 'precond': [['At(Home)'], ['At(Home)'], ['At(Home)'], ['At(MetroStop)'], ['At(MetroStop)'], ['At(MetroStop)'], + ['At(MetroStop)'], 
['At(Home) & Have(Cash)']], + 'effect': [['At(SFO) & ~At(Home)'], ['At(SFO) & ~At(Home) & ~Have(Cash)'], ['At(MetroStop) & ~At(Home)'], + ['At(SFO) & ~At(MetroStop)'], ['At(SFO) & ~At(MetroStop)'], ['At(SFO) & ~At(MetroStop)'], + ['At(SFO) & ~At(MetroStop)'], ['At(SFO) & ~At(Home) & ~Have(Cash)']]} + +# HLA's +go_SFO = HLA('Go(Home,SFO)', precond='At(Home)', effect='At(SFO) & ~At(Home)') +taxi_SFO = HLA('Taxi(Home,SFO)', precond='At(Home)', effect='At(SFO) & ~At(Home) & ~Have(Cash)') +drive_SFOLongTermParking = HLA('Drive(Home, SFOLongTermParking)', 'At(Home) & Have(Car)', + 'At(SFOLongTermParking) & ~At(Home)') +shuttle_SFO = HLA('Shuttle(SFOLongTermParking, SFO)', 'At(SFOLongTermParking)', 'At(SFO) & ~At(LongTermParking)') + +# Angelic HLA's +angelic_opt_description = AngelicHLA('Go(Home, SFO)', precond='At(Home)', effect='$+At(SFO) & $-At(Home)') +angelic_pes_description = AngelicHLA('Go(Home, SFO)', precond='At(Home)', effect='$+At(SFO) & ~At(Home)') + +# Angelic Nodes +plan1 = AngelicNode('At(Home)', None, [angelic_opt_description], [angelic_pes_description]) +plan2 = AngelicNode('At(Home)', None, [taxi_SFO]) +plan3 = AngelicNode('At(Home)', None, [drive_SFOLongTermParking, shuttle_SFO]) + +# Problems +prob_1 = RealWorldPlanningProblem('At(Home) & Have(Cash) & Have(Car) ', 'At(SFO) & Have(Cash)', + [go_SFO, taxi_SFO, drive_SFOLongTermParking, shuttle_SFO]) + +initialPlan = [AngelicNode(prob_1.initial, None, [angelic_opt_description], [angelic_pes_description])] + + +def test_refinements(): + result = [i for i in RealWorldPlanningProblem.refinements(go_SFO, library_1)] + + assert (result[0][0].name == drive_SFOLongTermParking.name) + assert (result[0][0].args == drive_SFOLongTermParking.args) + assert (result[0][0].precond == drive_SFOLongTermParking.precond) + assert (result[0][0].effect == drive_SFOLongTermParking.effect) + + assert (result[0][1].name == shuttle_SFO.name) + assert (result[0][1].args == shuttle_SFO.args) + assert (result[0][1].precond == 
shuttle_SFO.precond) + assert (result[0][1].effect == shuttle_SFO.effect) + + assert (result[1][0].name == taxi_SFO.name) + assert (result[1][0].args == taxi_SFO.args) + assert (result[1][0].precond == taxi_SFO.precond) + assert (result[1][0].effect == taxi_SFO.effect) + + +def test_hierarchical_search(): + # test_1 + prob_1 = RealWorldPlanningProblem('At(Home) & Have(Cash) & Have(Car) ', 'At(SFO) & Have(Cash)', [go_SFO]) + + solution = RealWorldPlanningProblem.hierarchical_search(prob_1, library_1) + + assert (len(solution) == 2) + + assert (solution[0].name == drive_SFOLongTermParking.name) + assert (solution[0].args == drive_SFOLongTermParking.args) + + assert (solution[1].name == shuttle_SFO.name) + assert (solution[1].args == shuttle_SFO.args) + + # test_2 + solution_2 = RealWorldPlanningProblem.hierarchical_search(prob_1, library_2) + + assert (len(solution_2) == 2) + + assert (solution_2[0].name == 'Bus') + assert (solution_2[0].args == (expr('Home'), expr('MetroStop'))) + + assert (solution_2[1].name == 'Metro1') + assert (solution_2[1].args == (expr('MetroStop'), expr('SFO'))) + + +def test_convert_angelic_HLA(): + """ + Converts angelic HLA's into expressions that correspond to their actions + ~ : Delete (Not) + $+ : Possibly add (PosYes) + $- : Possibly delete (PosNot) + $$ : Possibly add / delete (PosYesNot) + """ + ang1 = AngelicHLA('Test', precond=None, effect='~A') + ang2 = AngelicHLA('Test', precond=None, effect='$+A') + ang3 = AngelicHLA('Test', precond=None, effect='$-A') + ang4 = AngelicHLA('Test', precond=None, effect='$$A') + + assert (ang1.convert(ang1.effect) == [expr('NotA')]) + assert (ang2.convert(ang2.effect) == [expr('PosYesA')]) + assert (ang3.convert(ang3.effect) == [expr('PosNotA')]) + assert (ang4.convert(ang4.effect) == [expr('PosYesNotA')]) + + +def test_is_primitive(): + """ + Tests whether a plan consists only of primitive HLA's (angelic HLA's) + """ + assert (not RealWorldPlanningProblem.is_primitive(plan1, library_1)) + assert
(RealWorldPlanningProblem.is_primitive(plan2, library_1)) + assert (RealWorldPlanningProblem.is_primitive(plan3, library_1)) + + +def test_angelic_action(): + """ + Finds the HLA actions that correspond to the HLA actions with angelic semantics + + h1 : precondition positive: B _______ (add A) or (add A and remove B) + effect: add A and possibly remove B + + h2 : precondition positive: A _______ (add A and add C) or (delete A and add C) or + (add C) or (add A and delete C) or + effect: possibly add/remove A and possibly add/remove C (delete A and delete C) or (delete C) or + (add A) or (delete A) or [] + + """ + h_1 = AngelicHLA(expr('h1'), 'B', 'A & $-B') + h_2 = AngelicHLA(expr('h2'), 'A', '$$A & $$C') + action_1 = AngelicHLA.angelic_action(h_1) + action_2 = AngelicHLA.angelic_action(h_2) + + assert ([a.effect for a in action_1] == [[expr('A'), expr('NotB')], [expr('A')]]) + assert ([a.effect for a in action_2] == [[expr('A'), expr('C')], [expr('NotA'), expr('C')], [expr('C')], + [expr('A'), expr('NotC')], [expr('NotA'), expr('NotC')], [expr('NotC')], + [expr('A')], [expr('NotA')], [None]]) + + +def test_optimistic_reachable_set(): + """ + Find optimistic reachable set given a problem initial state and a plan + """ + h_1 = AngelicHLA('h1', 'B', '$+A & $-B ') + h_2 = AngelicHLA('h2', 'A', '$$A & $$C') + f_1 = HLA('h1', 'B', 'A & ~B') + f_2 = HLA('h2', 'A', 'A & C') + problem = RealWorldPlanningProblem('B', 'A', [f_1, f_2]) + plan = AngelicNode(problem.initial, None, [h_1, h_2], [h_1, h_2]) + opt_reachable_set = RealWorldPlanningProblem.reach_opt(problem.initial, plan) + assert (opt_reachable_set[1] == [[expr('A'), expr('NotB')], [expr('NotB')], [expr('B'), expr('A')], [expr('B')]]) + assert (problem.intersects_goal(opt_reachable_set)) + + +def test_pessimistic_reachable_set(): + """ + Find pessimistic reachable set given a problem initial state and a plan + """ + h_1 = AngelicHLA('h1', 'B', '$+A & $-B ') + h_2 = AngelicHLA('h2', 'A', '$$A & $$C') + f_1 = HLA('h1', 
'B', 'A & ~B') + f_2 = HLA('h2', 'A', 'A & C') + problem = RealWorldPlanningProblem('B', 'A', [f_1, f_2]) + plan = AngelicNode(problem.initial, None, [h_1, h_2], [h_1, h_2]) + pes_reachable_set = RealWorldPlanningProblem.reach_pes(problem.initial, plan) + assert (pes_reachable_set[1] == [[expr('A'), expr('NotB')], [expr('NotB')], [expr('B'), expr('A')], [expr('B')]]) + assert (problem.intersects_goal(pes_reachable_set)) + + +def test_find_reachable_set(): + h_1 = AngelicHLA('h1', 'B', '$+A & $-B ') + f_1 = HLA('h1', 'B', 'A & ~B') + problem = RealWorldPlanningProblem('B', 'A', [f_1]) + reachable_set = {0: [problem.initial]} + action_description = [h_1] + + reachable_set = RealWorldPlanningProblem.find_reachable_set(reachable_set, action_description) + assert (reachable_set[1] == [[expr('A'), expr('NotB')], [expr('NotB')], [expr('B'), expr('A')], [expr('B')]]) + + +def test_intersects_goal(): + problem_1 = RealWorldPlanningProblem('At(SFO)', 'At(SFO)', []) + problem_2 = RealWorldPlanningProblem('At(Home) & Have(Cash) & Have(Car) ', 'At(SFO) & Have(Cash)', []) + reachable_set_1 = {0: [problem_1.initial]} + reachable_set_2 = {0: [problem_2.initial]} + + assert (RealWorldPlanningProblem.intersects_goal(problem_1, reachable_set_1)) + assert (not RealWorldPlanningProblem.intersects_goal(problem_2, reachable_set_2)) + + +def test_making_progress(): + """ + function not yet implemented + """ + + plan_1 = AngelicNode(prob_1.initial, None, [angelic_opt_description], [angelic_pes_description]) + + assert (not RealWorldPlanningProblem.making_progress(plan_1, initialPlan)) + + +def test_angelic_search(): + """ + Test angelic search for problem, hierarchy, initialPlan + """ + # test_1 + solution = RealWorldPlanningProblem.angelic_search(prob_1, library_1, initialPlan) + + assert (len(solution) == 2) + + assert (solution[0].name == drive_SFOLongTermParking.name) + assert (solution[0].args == drive_SFOLongTermParking.args) + + assert (solution[1].name == shuttle_SFO.name) + assert 
(solution[1].args == shuttle_SFO.args) + + # test_2 + solution_2 = RealWorldPlanningProblem.angelic_search(prob_1, library_2, initialPlan) + + assert (len(solution_2) == 2) + + assert (solution_2[0].name == 'Bus') + assert (solution_2[0].args == (expr('Home'), expr('MetroStop'))) + + assert (solution_2[1].name == 'Metro1') + assert (solution_2[1].args == (expr('MetroStop'), expr('SFO'))) + + +if __name__ == '__main__': + pytest.main() diff --git a/tests/test_probabilistic_learning.py b/tests/test_probabilistic_learning.py new file mode 100644 index 000000000..bd37b6ebb --- /dev/null +++ b/tests/test_probabilistic_learning.py @@ -0,0 +1,38 @@ +import random + +import pytest + +from learning import DataSet +from probabilistic_learning import * + +random.seed("aima-python") + + +def test_naive_bayes(): + iris = DataSet(name='iris') + # discrete + nbd = NaiveBayesLearner(iris, continuous=False) + assert nbd([5, 3, 1, 0.1]) == 'setosa' + assert nbd([6, 3, 4, 1.1]) == 'versicolor' + assert nbd([7.7, 3, 6, 2]) == 'virginica' + # continuous + nbc = NaiveBayesLearner(iris, continuous=True) + assert nbc([5, 3, 1, 0.1]) == 'setosa' + assert nbc([6, 5, 3, 1.5]) == 'versicolor' + assert nbc([7, 3, 6.5, 2]) == 'virginica' + # simple + data1 = 'a' * 50 + 'b' * 30 + 'c' * 15 + dist1 = CountingProbDist(data1) + data2 = 'a' * 30 + 'b' * 45 + 'c' * 20 + dist2 = CountingProbDist(data2) + data3 = 'a' * 20 + 'b' * 20 + 'c' * 35 + dist3 = CountingProbDist(data3) + dist = {('First', 0.5): dist1, ('Second', 0.3): dist2, ('Third', 0.2): dist3} + nbs = NaiveBayesLearner(dist, simple=True) + assert nbs('aab') == 'First' + assert nbs(['b', 'b']) == 'Second' + assert nbs('ccbcc') == 'Third' + + +if __name__ == "__main__": + pytest.main() diff --git a/tests/test_probability.py b/tests/test_probability.py index e974a7c89..8def79c68 100644 --- a/tests/test_probability.py +++ b/tests/test_probability.py @@ -1,7 +1,10 @@ -import random +import pytest + from probability import * from utils import 
rounder +random.seed("aima-python") + def tests(): cpt = burglary.variable_node('Alarm') @@ -9,7 +12,7 @@ def tests(): assert cpt.p(True, event) == 0.95 event = {'Burglary': False, 'Earthquake': True} assert cpt.p(False, event) == 0.71 - # #enumeration_ask('Earthquake', {}, burglary) + # enumeration_ask('Earthquake', {}, burglary) s = {'A': True, 'B': False, 'C': True, 'D': False} assert consistent_with(s, {}) @@ -30,12 +33,25 @@ def test_probdist_basic(): P = ProbDist('Flip') P['H'], P['T'] = 0.25, 0.75 assert P['H'] == 0.25 + assert P['T'] == 0.75 + assert P['X'] == 0.00 + + P = ProbDist('BiasedDie') + P['1'], P['2'], P['3'], P['4'], P['5'], P['6'] = 10, 15, 25, 30, 40, 80 + P.normalize() + assert P['2'] == 0.075 + assert P['4'] == 0.15 + assert P['6'] == 0.4 def test_probdist_frequency(): P = ProbDist('X', {'lo': 125, 'med': 375, 'hi': 500}) assert (P['lo'], P['med'], P['hi']) == (0.125, 0.375, 0.5) + P = ProbDist('Pascal-5', {'x1': 1, 'x2': 5, 'x3': 10, 'x4': 10, 'x5': 5, 'x6': 1}) + assert (P['x1'], P['x2'], P['x3'], P['x4'], P['x5'], P['x6']) == ( + 0.03125, 0.15625, 0.3125, 0.3125, 0.15625, 0.03125) + def test_probdist_normalize(): P = ProbDist('Flip') @@ -43,6 +59,12 @@ def test_probdist_normalize(): P = P.normalize() assert (P.prob['H'], P.prob['T']) == (0.350, 0.650) + P = ProbDist('BiasedDie') + P['1'], P['2'], P['3'], P['4'], P['5'], P['6'] = 10, 15, 25, 30, 40, 80 + P = P.normalize() + assert (P.prob['1'], P.prob['2'], P.prob['3'], P.prob['4'], P.prob['5'], P.prob['6']) == ( + 0.05, 0.075, 0.125, 0.15, 0.2, 0.4) + def test_jointprob(): P = JointProbDist(['X', 'Y']) @@ -66,6 +88,20 @@ def test_enumerate_joint(): assert enumerate_joint(['X'], dict(Y=2), P) == 0 assert enumerate_joint(['X'], dict(Y=1), P) == 0.75 + Q = JointProbDist(['W', 'X', 'Y', 'Z']) + Q[0, 1, 1, 0] = 0.12 + Q[1, 0, 1, 1] = 0.4 + Q[0, 0, 1, 1] = 0.5 + Q[0, 0, 1, 0] = 0.05 + Q[0, 0, 0, 0] = 0.675 + Q[1, 1, 1, 0] = 0.3 + assert enumerate_joint(['W'], dict(X=0, Y=0, Z=1), Q) == 0 + 
assert enumerate_joint(['W'], dict(X=0, Y=0, Z=0), Q) == 0.675 + assert enumerate_joint(['W'], dict(X=0, Y=1, Z=1), Q) == 0.9 + assert enumerate_joint(['Y'], dict(W=1, X=0, Z=1), Q) == 0.4 + assert enumerate_joint(['Z'], dict(W=0, X=0, Y=0), Q) == 0.675 + assert enumerate_joint(['Z'], dict(W=1, X=1, Y=1), Q) == 0.3 + def test_enumerate_joint_ask(): P = JointProbDist(['X', 'Y']) @@ -73,11 +109,12 @@ def test_enumerate_joint_ask(): P[0, 1] = 0.5 P[1, 1] = P[2, 1] = 0.125 assert enumerate_joint_ask( - 'X', dict(Y=1), P).show_approx() == '0: 0.667, 1: 0.167, 2: 0.167' + 'X', dict(Y=1), P).show_approx() == '0: 0.667, 1: 0.167, 2: 0.167' def test_bayesnode_p(): bn = BayesNode('X', 'Burglary', {T: 0.2, F: 0.625}) + assert bn.p(True, {'Burglary': True, 'Earthquake': False}) == 0.2 assert bn.p(False, {'Burglary': False, 'Earthquake': True}) == 0.375 assert BayesNode('W', '', 0.75).p(False, {'Random': True}) == 0.25 @@ -92,45 +129,171 @@ def test_bayesnode_sample(): def test_enumeration_ask(): assert enumeration_ask( - 'Burglary', dict(JohnCalls=T, MaryCalls=T), - burglary).show_approx() == 'False: 0.716, True: 0.284' - - -def test_elemination_ask(): - elimination_ask( - 'Burglary', dict(JohnCalls=T, MaryCalls=T), - burglary).show_approx() == 'False: 0.716, True: 0.284' + 'Burglary', dict(JohnCalls=T, MaryCalls=T), + burglary).show_approx() == 'False: 0.716, True: 0.284' + assert enumeration_ask( + 'Burglary', dict(JohnCalls=T, MaryCalls=F), + burglary).show_approx() == 'False: 0.995, True: 0.00513' + assert enumeration_ask( + 'Burglary', dict(JohnCalls=F, MaryCalls=T), + burglary).show_approx() == 'False: 0.993, True: 0.00688' + assert enumeration_ask( + 'Burglary', dict(JohnCalls=T), + burglary).show_approx() == 'False: 0.984, True: 0.0163' + assert enumeration_ask( + 'Burglary', dict(MaryCalls=T), + burglary).show_approx() == 'False: 0.944, True: 0.0561' + + +def test_elimination_ask(): + assert elimination_ask( + 'Burglary', dict(JohnCalls=T, MaryCalls=T), + 
burglary).show_approx() == 'False: 0.716, True: 0.284' + assert elimination_ask( + 'Burglary', dict(JohnCalls=T, MaryCalls=F), + burglary).show_approx() == 'False: 0.995, True: 0.00513' + assert elimination_ask( + 'Burglary', dict(JohnCalls=F, MaryCalls=T), + burglary).show_approx() == 'False: 0.993, True: 0.00688' + assert elimination_ask( + 'Burglary', dict(JohnCalls=T), + burglary).show_approx() == 'False: 0.984, True: 0.0163' + assert elimination_ask( + 'Burglary', dict(MaryCalls=T), + burglary).show_approx() == 'False: 0.944, True: 0.0561' + + +def test_prior_sample(): + random.seed(42) + all_obs = [prior_sample(burglary) for x in range(1000)] + john_calls_true = [observation for observation in all_obs if observation['JohnCalls']] + mary_calls_true = [observation for observation in all_obs if observation['MaryCalls']] + burglary_and_john = [observation for observation in john_calls_true if observation['Burglary']] + burglary_and_mary = [observation for observation in mary_calls_true if observation['Burglary']] + assert len(john_calls_true) / 1000 == 46 / 1000 + assert len(mary_calls_true) / 1000 == 13 / 1000 + assert len(burglary_and_john) / len(john_calls_true) == 1 / 46 + assert len(burglary_and_mary) / len(mary_calls_true) == 1 / 13 + + +def test_prior_sample2(): + random.seed(128) + all_obs = [prior_sample(sprinkler) for x in range(1000)] + rain_true = [observation for observation in all_obs if observation['Rain']] + sprinkler_true = [observation for observation in all_obs if observation['Sprinkler']] + rain_and_cloudy = [observation for observation in rain_true if observation['Cloudy']] + sprinkler_and_cloudy = [observation for observation in sprinkler_true if observation['Cloudy']] + assert len(rain_true) / 1000 == 0.476 + assert len(sprinkler_true) / 1000 == 0.291 + assert len(rain_and_cloudy) / len(rain_true) == 376 / 476 + assert len(sprinkler_and_cloudy) / len(sprinkler_true) == 39 / 291 def test_rejection_sampling(): random.seed(47) - 
rejection_sampling( - 'Burglary', dict(JohnCalls=T, MaryCalls=T), - burglary, 10000).show_approx() == 'False: 0.7, True: 0.3' + assert rejection_sampling( + 'Burglary', dict(JohnCalls=T, MaryCalls=T), + burglary, 10000).show_approx() == 'False: 0.7, True: 0.3' + assert rejection_sampling( + 'Burglary', dict(JohnCalls=T, MaryCalls=F), + burglary, 10000).show_approx() == 'False: 1, True: 0' + assert rejection_sampling( + 'Burglary', dict(JohnCalls=F, MaryCalls=T), + burglary, 10000).show_approx() == 'False: 0.987, True: 0.0128' + assert rejection_sampling( + 'Burglary', dict(JohnCalls=T), + burglary, 10000).show_approx() == 'False: 0.982, True: 0.0183' + assert rejection_sampling( + 'Burglary', dict(MaryCalls=T), + burglary, 10000).show_approx() == 'False: 0.965, True: 0.0348' + + +def test_rejection_sampling2(): + random.seed(42) + assert rejection_sampling( + 'Cloudy', dict(Rain=T, Sprinkler=T), + sprinkler, 10000).show_approx() == 'False: 0.56, True: 0.44' + assert rejection_sampling( + 'Cloudy', dict(Rain=T, Sprinkler=F), + sprinkler, 10000).show_approx() == 'False: 0.119, True: 0.881' + assert rejection_sampling( + 'Cloudy', dict(Rain=F, Sprinkler=T), + sprinkler, 10000).show_approx() == 'False: 0.951, True: 0.049' + assert rejection_sampling( + 'Cloudy', dict(Rain=T), + sprinkler, 10000).show_approx() == 'False: 0.205, True: 0.795' + assert rejection_sampling( + 'Cloudy', dict(Sprinkler=T), + sprinkler, 10000).show_approx() == 'False: 0.835, True: 0.165' def test_likelihood_weighting(): random.seed(1017) assert likelihood_weighting( - 'Burglary', dict(JohnCalls=T, MaryCalls=T), - burglary, 10000).show_approx() == 'False: 0.702, True: 0.298' + 'Burglary', dict(JohnCalls=T, MaryCalls=T), + burglary, 10000).show_approx() == 'False: 0.702, True: 0.298' + assert likelihood_weighting( + 'Burglary', dict(JohnCalls=T, MaryCalls=F), + burglary, 10000).show_approx() == 'False: 0.993, True: 0.00656' + assert likelihood_weighting( + 'Burglary', dict(JohnCalls=F, 
MaryCalls=T), + burglary, 10000).show_approx() == 'False: 0.996, True: 0.00363' + assert likelihood_weighting( + 'Burglary', dict(JohnCalls=F, MaryCalls=F), + burglary, 10000).show_approx() == 'False: 1, True: 0.000126' + assert likelihood_weighting( + 'Burglary', dict(JohnCalls=T), + burglary, 10000).show_approx() == 'False: 0.979, True: 0.0205' + assert likelihood_weighting( + 'Burglary', dict(MaryCalls=T), + burglary, 10000).show_approx() == 'False: 0.94, True: 0.0601' + + +def test_likelihood_weighting2(): + random.seed(42) + assert likelihood_weighting( + 'Cloudy', dict(Rain=T, Sprinkler=T), + sprinkler, 10000).show_approx() == 'False: 0.559, True: 0.441' + assert likelihood_weighting( + 'Cloudy', dict(Rain=T, Sprinkler=F), + sprinkler, 10000).show_approx() == 'False: 0.12, True: 0.88' + assert likelihood_weighting( + 'Cloudy', dict(Rain=F, Sprinkler=T), + sprinkler, 10000).show_approx() == 'False: 0.951, True: 0.0486' + assert likelihood_weighting( + 'Cloudy', dict(Rain=T), + sprinkler, 10000).show_approx() == 'False: 0.198, True: 0.802' + assert likelihood_weighting( + 'Cloudy', dict(Sprinkler=T), + sprinkler, 10000).show_approx() == 'False: 0.833, True: 0.167' def test_forward_backward(): - umbrella_prior = [0.5, 0.5] umbrella_transition = [[0.7, 0.3], [0.3, 0.7]] umbrella_sensor = [[0.9, 0.2], [0.1, 0.8]] umbrellaHMM = HiddenMarkovModel(umbrella_transition, umbrella_sensor) umbrella_evidence = [T, T, F, T, T] - assert (rounder(forward_backward(umbrellaHMM, umbrella_evidence, umbrella_prior)) == - [[0.6469, 0.3531], [0.8673, 0.1327], [0.8204, 0.1796], [0.3075, 0.6925], - [0.8204, 0.1796], [0.8673, 0.1327]]) + assert rounder(forward_backward(umbrellaHMM, umbrella_evidence)) == [ + [0.6469, 0.3531], [0.8673, 0.1327], [0.8204, 0.1796], [0.3075, 0.6925], [0.8204, 0.1796], [0.8673, 0.1327]] + + umbrella_evidence = [T, F, T, F, T] + assert rounder(forward_backward(umbrellaHMM, umbrella_evidence)) == [ + [0.5871, 0.4129], [0.7177, 0.2823], [0.2324, 0.7676], 
[0.6072, 0.3928], [0.2324, 0.7676], [0.7177, 0.2823]] + + +def test_viterbi(): + umbrella_transition = [[0.7, 0.3], [0.3, 0.7]] + umbrella_sensor = [[0.9, 0.2], [0.1, 0.8]] + umbrellaHMM = HiddenMarkovModel(umbrella_transition, umbrella_sensor) + + umbrella_evidence = [T, T, F, T, T] + assert viterbi(umbrellaHMM, umbrella_evidence)[0] == [T, T, F, T, T] + assert rounder(viterbi(umbrellaHMM, umbrella_evidence)[1]) == [0.8182, 0.5155, 0.1237, 0.0334, 0.0210] umbrella_evidence = [T, F, T, F, T] - assert rounder(forward_backward(umbrellaHMM, umbrella_evidence, umbrella_prior)) == [ - [0.5871, 0.4129], [0.7177, 0.2823], [0.2324, 0.7676], [0.6072, 0.3928], - [0.2324, 0.7676], [0.7177, 0.2823]] + assert viterbi(umbrellaHMM, umbrella_evidence)[0] == [T, F, F, F, T] + assert rounder(viterbi(umbrellaHMM, umbrella_evidence)[1]) == [0.8182, 0.1964, 0.0275, 0.0154, 0.0042] def test_fixed_lag_smoothing(): @@ -142,8 +305,7 @@ def test_fixed_lag_smoothing(): umbrellaHMM = HiddenMarkovModel(umbrella_transition, umbrella_sensor) d = 2 - assert rounder(fixed_lag_smoothing(e_t, umbrellaHMM, d, - umbrella_evidence, t)) == [0.1111, 0.8889] + assert rounder(fixed_lag_smoothing(e_t, umbrellaHMM, d, umbrella_evidence, t)) == [0.1111, 0.8889] d = 5 assert fixed_lag_smoothing(e_t, umbrellaHMM, d, umbrella_evidence, t) is None @@ -152,8 +314,7 @@ def test_fixed_lag_smoothing(): e_t = T d = 1 - assert rounder(fixed_lag_smoothing(e_t, umbrellaHMM, - d, umbrella_evidence, t)) == [0.9939, 0.0061] + assert rounder(fixed_lag_smoothing(e_t, umbrellaHMM, d, umbrella_evidence, t)) == [0.9939, 0.0061] def test_particle_filtering(): @@ -169,7 +330,7 @@ def test_particle_filtering(): def test_monte_carlo_localization(): - ## TODO: Add tests for random motion/inaccurate sensors + # TODO: Add tests for random motion/inaccurate sensors random.seed('aima-python') m = MCLmap([[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 0, 0, 1, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 0, 1, 1, 0], @@ -185,12 +346,12 @@ def 
test_monte_carlo_localization(): def P_motion_sample(kin_state, v, w): """Sample from possible kinematic states. - Returns from a single element distribution (no uncertainity in motion)""" + Returns from a single element distribution (no uncertainty in motion)""" pos = kin_state[:2] orient = kin_state[2] - + # for simplicity the robot first rotates and then moves - orient = (orient + w)%4 + orient = (orient + w) % 4 for _ in range(orient): v = (v[1], -v[0]) pos = vector_add(pos, v) @@ -210,7 +371,7 @@ def P_sensor(x, y): a = {'v': (0, 0), 'w': 0} z = (2, 4, 1, 6) S = monte_carlo_localization(a, z, 1000, P_motion_sample, P_sensor, m) - grid = [[0]*17 for _ in range(11)] + grid = [[0] * 17 for _ in range(11)] for x, y, _ in S: if 0 <= x < 11 and 0 <= y < 17: grid[x][y] += 1 @@ -220,7 +381,7 @@ def P_sensor(x, y): a = {'v': (0, 1), 'w': 0} z = (2, 3, 5, 7) S = monte_carlo_localization(a, z, 1000, P_motion_sample, P_sensor, m, S) - grid = [[0]*17 for _ in range(11)] + grid = [[0] * 17 for _ in range(11)] for x, y, _ in S: if 0 <= x < 11 and 0 <= y < 17: grid[x][y] += 1 @@ -230,6 +391,12 @@ def P_sensor(x, y): assert grid[6][7] > 700 +def test_gibbs_ask(): + possible_solutions = ['False: 0.16, True: 0.84', 'False: 0.17, True: 0.83', 'False: 0.15, True: 0.85'] + g_solution = gibbs_ask('Cloudy', dict(Rain=True), sprinkler, 200).show_approx() + assert g_solution in possible_solutions + + # The following should probably go in .ipynb: """ diff --git a/tests/test_probability4e.py b/tests/test_probability4e.py new file mode 100644 index 000000000..d07954e0a --- /dev/null +++ b/tests/test_probability4e.py @@ -0,0 +1,349 @@ +import pytest + +from probability4e import * + +random.seed("aima-python") + + +def tests(): + cpt = burglary.variable_node('Alarm') + event = {'Burglary': True, 'Earthquake': True} + assert cpt.p(True, event) == 0.95 + event = {'Burglary': False, 'Earthquake': True} + assert cpt.p(False, event) == 0.71 + # enumeration_ask('Earthquake', {}, burglary) + + s = 
{'A': True, 'B': False, 'C': True, 'D': False} + assert consistent_with(s, {}) + assert consistent_with(s, s) + assert not consistent_with(s, {'A': False}) + assert not consistent_with(s, {'D': True}) + + random.seed(21) + p = rejection_sampling('Earthquake', {}, burglary, 1000) + assert (p[True], p[False]) == (0.001, 0.999) + + random.seed(71) + p = likelihood_weighting('Earthquake', {}, burglary, 1000) + assert (p[True], p[False]) == (0.002, 0.998) + + +# test ProbDist + + +def test_probdist_basic(): + P = ProbDist('Flip') + P['H'], P['T'] = 0.25, 0.75 + assert P['H'] == 0.25 + assert P['T'] == 0.75 + assert P['X'] == 0.00 + + P = ProbDist('BiasedDie') + P['1'], P['2'], P['3'], P['4'], P['5'], P['6'] = 10, 15, 25, 30, 40, 80 + P.normalize() + assert P['2'] == 0.075 + assert P['4'] == 0.15 + assert P['6'] == 0.4 + + +def test_probdist_frequency(): + P = ProbDist('X', {'lo': 125, 'med': 375, 'hi': 500}) + assert (P['lo'], P['med'], P['hi']) == (0.125, 0.375, 0.5) + + P = ProbDist('Pascal-5', {'x1': 1, 'x2': 5, 'x3': 10, 'x4': 10, 'x5': 5, 'x6': 1}) + assert (P['x1'], P['x2'], P['x3'], P['x4'], P['x5'], P['x6']) == ( + 0.03125, 0.15625, 0.3125, 0.3125, 0.15625, 0.03125) + + +def test_probdist_normalize(): + P = ProbDist('Flip') + P['H'], P['T'] = 35, 65 + P = P.normalize() + assert (P.prob['H'], P.prob['T']) == (0.350, 0.650) + + P = ProbDist('BiasedDie') + P['1'], P['2'], P['3'], P['4'], P['5'], P['6'] = 10, 15, 25, 30, 40, 80 + P = P.normalize() + assert (P.prob['1'], P.prob['2'], P.prob['3'], P.prob['4'], P.prob['5'], P.prob['6']) == ( + 0.05, 0.075, 0.125, 0.15, 0.2, 0.4) + + +# test JointProbDist + + +def test_jointprob(): + P = JointProbDist(['X', 'Y']) + P[1, 1] = 0.25 + assert P[1, 1] == 0.25 + P[dict(X=0, Y=1)] = 0.5 + assert P[dict(X=0, Y=1)] == 0.5 + + +def test_event_values(): + assert event_values({'A': 10, 'B': 9, 'C': 8}, ['C', 'A']) == (8, 10) + assert event_values((1, 2), ['C', 'A']) == (1, 2) + + +def test_enumerate_joint(): + P = JointProbDist(['X', 
'Y']) + P[0, 0] = 0.25 + P[0, 1] = 0.5 + P[1, 1] = P[2, 1] = 0.125 + assert enumerate_joint(['Y'], dict(X=0), P) == 0.75 + assert enumerate_joint(['X'], dict(Y=2), P) == 0 + assert enumerate_joint(['X'], dict(Y=1), P) == 0.75 + + Q = JointProbDist(['W', 'X', 'Y', 'Z']) + Q[0, 1, 1, 0] = 0.12 + Q[1, 0, 1, 1] = 0.4 + Q[0, 0, 1, 1] = 0.5 + Q[0, 0, 1, 0] = 0.05 + Q[0, 0, 0, 0] = 0.675 + Q[1, 1, 1, 0] = 0.3 + assert enumerate_joint(['W'], dict(X=0, Y=0, Z=1), Q) == 0 + assert enumerate_joint(['W'], dict(X=0, Y=0, Z=0), Q) == 0.675 + assert enumerate_joint(['W'], dict(X=0, Y=1, Z=1), Q) == 0.9 + assert enumerate_joint(['Y'], dict(W=1, X=0, Z=1), Q) == 0.4 + assert enumerate_joint(['Z'], dict(W=0, X=0, Y=0), Q) == 0.675 + assert enumerate_joint(['Z'], dict(W=1, X=1, Y=1), Q) == 0.3 + + +def test_enumerate_joint_ask(): + P = JointProbDist(['X', 'Y']) + P[0, 0] = 0.25 + P[0, 1] = 0.5 + P[1, 1] = P[2, 1] = 0.125 + assert enumerate_joint_ask( + 'X', dict(Y=1), P).show_approx() == '0: 0.667, 1: 0.167, 2: 0.167' + + +def test_is_independent(): + P = JointProbDist(['X', 'Y']) + P[0, 0] = P[0, 1] = P[1, 1] = P[1, 0] = 0.25 + assert enumerate_joint_ask( + 'X', dict(Y=1), P).show_approx() == '0: 0.5, 1: 0.5' + assert is_independent(['X', 'Y'], P) + + +# test BayesNode + + +def test_bayesnode_p(): + bn = BayesNode('X', 'Burglary', {T: 0.2, F: 0.625}) + assert bn.p(True, {'Burglary': True, 'Earthquake': False}) == 0.2 + assert bn.p(False, {'Burglary': False, 'Earthquake': True}) == 0.375 + assert BayesNode('W', '', 0.75).p(False, {'Random': True}) == 0.25 + + +def test_bayesnode_sample(): + X = BayesNode('X', 'Burglary', {T: 0.2, F: 0.625}) + assert X.sample({'Burglary': False, 'Earthquake': True}) in [True, False] + Z = BayesNode('Z', 'P Q', {(True, True): 0.2, (True, False): 0.3, + (False, True): 0.5, (False, False): 0.7}) + assert Z.sample({'P': True, 'Q': False}) in [True, False] + + +# test continuous variable bayesian net + + +def test_gaussian_probability(): + param = 
{'sigma': 0.5, 'b': 1, 'a': {'h': 0.5}} + event = {'h': 0.6} + assert gaussian_probability(param, event, 1) == 0.6664492057835993 + + +def test_logistic_probability(): + param = {'mu': 0.5, 'sigma': 0.1} + event = {'h': 0.6} + assert logistic_probability(param, event, True) == 0.16857376940725355 + assert logistic_probability(param, event, False) == 0.8314262305927465 + + +def test_enumeration_ask(): + assert enumeration_ask( + 'Burglary', dict(JohnCalls=T, MaryCalls=T), + burglary).show_approx() == 'False: 0.716, True: 0.284' + assert enumeration_ask( + 'Burglary', dict(JohnCalls=T, MaryCalls=F), + burglary).show_approx() == 'False: 0.995, True: 0.00513' + assert enumeration_ask( + 'Burglary', dict(JohnCalls=F, MaryCalls=T), + burglary).show_approx() == 'False: 0.993, True: 0.00688' + assert enumeration_ask( + 'Burglary', dict(JohnCalls=T), + burglary).show_approx() == 'False: 0.984, True: 0.0163' + assert enumeration_ask( + 'Burglary', dict(MaryCalls=T), + burglary).show_approx() == 'False: 0.944, True: 0.0561' + + +def test_elimination_ask(): + assert elimination_ask( + 'Burglary', dict(JohnCalls=T, MaryCalls=T), + burglary).show_approx() == 'False: 0.716, True: 0.284' + assert elimination_ask( + 'Burglary', dict(JohnCalls=T, MaryCalls=F), + burglary).show_approx() == 'False: 0.995, True: 0.00513' + assert elimination_ask( + 'Burglary', dict(JohnCalls=F, MaryCalls=T), + burglary).show_approx() == 'False: 0.993, True: 0.00688' + assert elimination_ask( + 'Burglary', dict(JohnCalls=T), + burglary).show_approx() == 'False: 0.984, True: 0.0163' + assert elimination_ask( + 'Burglary', dict(MaryCalls=T), + burglary).show_approx() == 'False: 0.944, True: 0.0561' + + +# test sampling + + +def test_prior_sample(): + random.seed(42) + all_obs = [prior_sample(burglary) for x in range(1000)] + john_calls_true = [observation for observation in all_obs if observation['JohnCalls'] is True] + mary_calls_true = [observation for observation in all_obs if observation['MaryCalls'] 
is True] + burglary_and_john = [observation for observation in john_calls_true if observation['Burglary'] is True] + burglary_and_mary = [observation for observation in mary_calls_true if observation['Burglary'] is True] + assert len(john_calls_true) / 1000 == 46 / 1000 + assert len(mary_calls_true) / 1000 == 13 / 1000 + assert len(burglary_and_john) / len(john_calls_true) == 1 / 46 + assert len(burglary_and_mary) / len(mary_calls_true) == 1 / 13 + + +def test_prior_sample2(): + random.seed(128) + all_obs = [prior_sample(sprinkler) for x in range(1000)] + rain_true = [observation for observation in all_obs if observation['Rain'] is True] + sprinkler_true = [observation for observation in all_obs if observation['Sprinkler'] is True] + rain_and_cloudy = [observation for observation in rain_true if observation['Cloudy'] is True] + sprinkler_and_cloudy = [observation for observation in sprinkler_true if observation['Cloudy'] is True] + assert len(rain_true) / 1000 == 0.476 + assert len(sprinkler_true) / 1000 == 0.291 + assert len(rain_and_cloudy) / len(rain_true) == 376 / 476 + assert len(sprinkler_and_cloudy) / len(sprinkler_true) == 39 / 291 + + +def test_rejection_sampling(): + random.seed(47) + assert rejection_sampling( + 'Burglary', dict(JohnCalls=T, MaryCalls=T), + burglary, 10000).show_approx() == 'False: 0.7, True: 0.3' + assert rejection_sampling( + 'Burglary', dict(JohnCalls=T, MaryCalls=F), + burglary, 10000).show_approx() == 'False: 1, True: 0' + assert rejection_sampling( + 'Burglary', dict(JohnCalls=F, MaryCalls=T), + burglary, 10000).show_approx() == 'False: 0.987, True: 0.0128' + assert rejection_sampling( + 'Burglary', dict(JohnCalls=T), + burglary, 10000).show_approx() == 'False: 0.982, True: 0.0183' + assert rejection_sampling( + 'Burglary', dict(MaryCalls=T), + burglary, 10000).show_approx() == 'False: 0.965, True: 0.0348' + + +def test_rejection_sampling2(): + random.seed(42) + assert rejection_sampling( + 'Cloudy', dict(Rain=T, Sprinkler=T), + 
sprinkler, 10000).show_approx() == 'False: 0.56, True: 0.44' + assert rejection_sampling( + 'Cloudy', dict(Rain=T, Sprinkler=F), + sprinkler, 10000).show_approx() == 'False: 0.119, True: 0.881' + assert rejection_sampling( + 'Cloudy', dict(Rain=F, Sprinkler=T), + sprinkler, 10000).show_approx() == 'False: 0.951, True: 0.049' + assert rejection_sampling( + 'Cloudy', dict(Rain=T), + sprinkler, 10000).show_approx() == 'False: 0.205, True: 0.795' + assert rejection_sampling( + 'Cloudy', dict(Sprinkler=T), + sprinkler, 10000).show_approx() == 'False: 0.835, True: 0.165' + + +def test_likelihood_weighting(): + random.seed(1017) + assert likelihood_weighting( + 'Burglary', dict(JohnCalls=T, MaryCalls=T), + burglary, 10000).show_approx() == 'False: 0.702, True: 0.298' + assert likelihood_weighting( + 'Burglary', dict(JohnCalls=T, MaryCalls=F), + burglary, 10000).show_approx() == 'False: 0.993, True: 0.00656' + assert likelihood_weighting( + 'Burglary', dict(JohnCalls=F, MaryCalls=T), + burglary, 10000).show_approx() == 'False: 0.996, True: 0.00363' + assert likelihood_weighting( + 'Burglary', dict(JohnCalls=F, MaryCalls=F), + burglary, 10000).show_approx() == 'False: 1, True: 0.000126' + assert likelihood_weighting( + 'Burglary', dict(JohnCalls=T), + burglary, 10000).show_approx() == 'False: 0.979, True: 0.0205' + assert likelihood_weighting( + 'Burglary', dict(MaryCalls=T), + burglary, 10000).show_approx() == 'False: 0.94, True: 0.0601' + + +def test_likelihood_weighting2(): + random.seed(42) + assert likelihood_weighting( + 'Cloudy', dict(Rain=T, Sprinkler=T), + sprinkler, 10000).show_approx() == 'False: 0.559, True: 0.441' + assert likelihood_weighting( + 'Cloudy', dict(Rain=T, Sprinkler=F), + sprinkler, 10000).show_approx() == 'False: 0.12, True: 0.88' + assert likelihood_weighting( + 'Cloudy', dict(Rain=F, Sprinkler=T), + sprinkler, 10000).show_approx() == 'False: 0.951, True: 0.0486' + assert likelihood_weighting( + 'Cloudy', dict(Rain=T), + sprinkler, 
10000).show_approx() == 'False: 0.198, True: 0.802' + assert likelihood_weighting( + 'Cloudy', dict(Sprinkler=T), + sprinkler, 10000).show_approx() == 'False: 0.833, True: 0.167' + + +def test_gibbs_ask(): + g_solution = gibbs_ask('Cloudy', dict(Rain=True), sprinkler, 1000) + assert abs(g_solution.prob[False] - 0.2) < 0.05 + assert abs(g_solution.prob[True] - 0.8) < 0.05 + + +# The following should probably go in .ipynb: + +""" +# We can build up a probability distribution like this (p. 469): +>>> P = ProbDist() +>>> P['sunny'] = 0.7 +>>> P['rain'] = 0.2 +>>> P['cloudy'] = 0.08 +>>> P['snow'] = 0.02 + +# and query it like this: (Never mind this ELLIPSIS option +# added to make the doctest portable.) +>>> P['rain'] #doctest:+ELLIPSIS +0.2... + +# A Joint Probability Distribution is dealt with like this [Figure 13.3]: +>>> P = JointProbDist(['Toothache', 'Cavity', 'Catch']) +>>> T, F = True, False +>>> P[T, T, T] = 0.108; P[T, T, F] = 0.012; P[F, T, T] = 0.072; P[F, T, F] = 0.008 +>>> P[T, F, T] = 0.016; P[T, F, F] = 0.064; P[F, F, T] = 0.144; P[F, F, F] = 0.576 + +>>> P[T, T, T] +0.108 + +# Ask for P(Cavity|Toothache=T) +>>> PC = enumerate_joint_ask('Cavity', {'Toothache': T}, P) +>>> PC.show_approx() +'False: 0.4, True: 0.6' + +>>> 0.6-epsilon < PC[T] < 0.6+epsilon +True + +>>> 0.4-epsilon < PC[F] < 0.4+epsilon +True +""" + +if __name__ == '__main__': + pytest.main() diff --git a/tests/test_reinforcement_learning.py b/tests/test_reinforcement_learning.py new file mode 100644 index 000000000..d80ad3baf --- /dev/null +++ b/tests/test_reinforcement_learning.py @@ -0,0 +1,71 @@ +import pytest + +from reinforcement_learning import * +from mdp import sequential_decision_environment + +random.seed("aima-python") + +north = (0, 1) +south = (0, -1) +west = (-1, 0) +east = (1, 0) + +policy = { + (0, 2): east, (1, 2): east, (2, 2): east, (3, 2): None, + (0, 1): north, (2, 1): north, (3, 1): None, + (0, 0): north, (1, 0): west, (2, 0): west, (3, 0): west, +} + + +def 
test_PassiveDUEAgent(): + agent = PassiveDUEAgent(policy, sequential_decision_environment) + for i in range(200): + run_single_trial(agent, sequential_decision_environment) + agent.estimate_U() + # Agent does not always produce same results. + # Check if results are good enough. + # print(agent.U[(0, 0)], agent.U[(0,1)], agent.U[(1,0)]) + assert agent.U[(0, 0)] > 0.15 # In reality around 0.3 + assert agent.U[(0, 1)] > 0.15 # In reality around 0.4 + assert agent.U[(1, 0)] > 0 # In reality around 0.2 + + +def test_PassiveADPAgent(): + agent = PassiveADPAgent(policy, sequential_decision_environment) + for i in range(100): + run_single_trial(agent, sequential_decision_environment) + + # Agent does not always produce same results. + # Check if results are good enough. + # print(agent.U[(0, 0)], agent.U[(0,1)], agent.U[(1,0)]) + assert agent.U[(0, 0)] > 0.15 # In reality around 0.3 + assert agent.U[(0, 1)] > 0.15 # In reality around 0.4 + assert agent.U[(1, 0)] > 0 # In reality around 0.2 + + +def test_PassiveTDAgent(): + agent = PassiveTDAgent(policy, sequential_decision_environment, alpha=lambda n: 60. / (59 + n)) + for i in range(200): + run_single_trial(agent, sequential_decision_environment) + + # Agent does not always produce same results. + # Check if results are good enough. + assert agent.U[(0, 0)] > 0.15 # In reality around 0.3 + assert agent.U[(0, 1)] > 0.15 # In reality around 0.35 + assert agent.U[(1, 0)] > 0.15 # In reality around 0.25 + + +def test_QLearning(): + q_agent = QLearningAgent(sequential_decision_environment, Ne=5, Rplus=2, alpha=lambda n: 60. / (59 + n)) + + for i in range(200): + run_single_trial(q_agent, sequential_decision_environment) + + # Agent does not always produce same results. + # Check if results are good enough. 
+ assert q_agent.Q[((0, 1), (0, 1))] >= -0.5 # In reality around 0.1 + assert q_agent.Q[((1, 0), (0, -1))] <= 0.5 # In reality around -0.1 + + +if __name__ == '__main__': + pytest.main() diff --git a/tests/test_reinforcement_learning4e.py b/tests/test_reinforcement_learning4e.py new file mode 100644 index 000000000..287ec397b --- /dev/null +++ b/tests/test_reinforcement_learning4e.py @@ -0,0 +1,69 @@ +import pytest + +from mdp4e import sequential_decision_environment +from reinforcement_learning4e import * + +random.seed("aima-python") + +north = (0, 1) +south = (0, -1) +west = (-1, 0) +east = (1, 0) + +policy = {(0, 2): east, (1, 2): east, (2, 2): east, (3, 2): None, + (0, 1): north, (2, 1): north, (3, 1): None, + (0, 0): north, (1, 0): west, (2, 0): west, (3, 0): west} + + +def test_PassiveDUEAgent(): + agent = PassiveDUEAgent(policy, sequential_decision_environment) + for i in range(200): + run_single_trial(agent, sequential_decision_environment) + agent.estimate_U() + # Agent does not always produce same results. + # Check if results are good enough. + # print(agent.U[(0, 0)], agent.U[(0,1)], agent.U[(1,0)]) + assert agent.U[(0, 0)] > 0.15 # In reality around 0.3 + assert agent.U[(0, 1)] > 0.15 # In reality around 0.4 + assert agent.U[(1, 0)] > 0 # In reality around 0.2 + + +def test_PassiveADPAgent(): + agent = PassiveADPAgent(policy, sequential_decision_environment) + for i in range(100): + run_single_trial(agent, sequential_decision_environment) + + # Agent does not always produce same results. + # Check if results are good enough. + # print(agent.U[(0, 0)], agent.U[(0,1)], agent.U[(1,0)]) + assert agent.U[(0, 0)] > 0.15 # In reality around 0.3 + assert agent.U[(0, 1)] > 0.15 # In reality around 0.4 + assert agent.U[(1, 0)] > 0 # In reality around 0.2 + + +def test_PassiveTDAgent(): + agent = PassiveTDAgent(policy, sequential_decision_environment, alpha=lambda n: 60. 
/ (59 + n)) + for i in range(200): + run_single_trial(agent, sequential_decision_environment) + + # Agent does not always produce same results. + # Check if results are good enough. + assert agent.U[(0, 0)] > 0.15 # In reality around 0.3 + assert agent.U[(0, 1)] > 0.15 # In reality around 0.35 + assert agent.U[(1, 0)] > 0.15 # In reality around 0.25 + + +def test_QLearning(): + q_agent = QLearningAgent(sequential_decision_environment, Ne=5, Rplus=2, alpha=lambda n: 60. / (59 + n)) + + for i in range(200): + run_single_trial(q_agent, sequential_decision_environment) + + # Agent does not always produce same results. + # Check if results are good enough. + assert q_agent.Q[((0, 1), (0, 1))] >= -0.5 # In reality around 0.1 + assert q_agent.Q[((1, 0), (0, -1))] <= 0.5 # In reality around -0.1 + + +if __name__ == '__main__': + pytest.main() diff --git a/tests/test_rl.py b/tests/test_rl.py deleted file mode 100644 index 05f071266..000000000 --- a/tests/test_rl.py +++ /dev/null @@ -1,55 +0,0 @@ -import pytest - -from rl import * -from mdp import sequential_decision_environment - - -north = (0, 1) -south = (0,-1) -west = (-1, 0) -east = (1, 0) - -policy = { - (0, 2): east, (1, 2): east, (2, 2): east, (3, 2): None, - (0, 1): north, (2, 1): north, (3, 1): None, - (0, 0): north, (1, 0): west, (2, 0): west, (3, 0): west, -} - - - -def test_PassiveADPAgent(): - agent = PassiveADPAgent(policy, sequential_decision_environment) - for i in range(75): - run_single_trial(agent,sequential_decision_environment) - - # Agent does not always produce same results. - # Check if results are good enough. 
- assert agent.U[(0, 0)] > 0.15 # In reality around 0.3 - assert agent.U[(0, 1)] > 0.15 # In reality around 0.4 - assert agent.U[(1, 0)] > 0 # In reality around 0.2 - - - -def test_PassiveTDAgent(): - agent = PassiveTDAgent(policy, sequential_decision_environment, alpha=lambda n: 60./(59+n)) - for i in range(200): - run_single_trial(agent,sequential_decision_environment) - - # Agent does not always produce same results. - # Check if results are good enough. - assert agent.U[(0, 0)] > 0.15 # In reality around 0.3 - assert agent.U[(0, 1)] > 0.15 # In reality around 0.35 - assert agent.U[(1, 0)] > 0.15 # In reality around 0.25 - - -def test_QLearning(): - q_agent = QLearningAgent(sequential_decision_environment, Ne=5, Rplus=2, - alpha=lambda n: 60./(59+n)) - - for i in range(200): - run_single_trial(q_agent,sequential_decision_environment) - - # Agent does not always produce same results. - # Check if results are good enough. - assert q_agent.Q[((0, 1), (0, 1))] >= -0.5 # In reality around 0.1 - assert q_agent.Q[((1, 0), (0, -1))] <= 0.5 # In reality around -0.1 diff --git a/tests/test_search.py b/tests/test_search.py index f22ca6f89..9be3e4a47 100644 --- a/tests/test_search.py +++ b/tests/test_search.py @@ -1,10 +1,15 @@ import pytest from search import * +random.seed("aima-python") romania_problem = GraphProblem('Arad', 'Bucharest', romania_map) -vacumm_world = GraphProblemStochastic('State_1', ['State_7', 'State_8'], vacumm_world) +vacuum_world = GraphProblemStochastic('State_1', ['State_7', 'State_8'], vacuum_world) LRTA_problem = OnlineSearchProblem('State_3', 'State_5', one_dim_state_space) +eight_puzzle = EightPuzzle((1, 2, 3, 4, 5, 7, 8, 6, 0)) +eight_puzzle2 = EightPuzzle((1, 0, 6, 8, 7, 5, 4, 2, 3), (0, 1, 2, 3, 4, 5, 6, 7, 8)) +n_queens = NQueensProblem(8) + def test_find_min_edge(): assert romania_problem.find_min_edge() == 70 @@ -13,10 +18,11 @@ def test_breadth_first_tree_search(): assert breadth_first_tree_search( 
romania_problem).solution() == ['Sibiu', 'Fagaras', 'Bucharest'] + assert breadth_first_graph_search(n_queens).solution() == [0, 4, 7, 5, 2, 6, 1, 3] -def test_breadth_first_search(): - assert breadth_first_search(romania_problem).solution() == ['Sibiu', 'Fagaras', 'Bucharest'] +def test_breadth_first_graph_search(): + assert breadth_first_graph_search(romania_problem).solution() == ['Sibiu', 'Fagaras', 'Bucharest'] def test_best_first_graph_search(): @@ -38,6 +44,11 @@ def test_best_first_graph_search(): def test_uniform_cost_search(): assert uniform_cost_search( romania_problem).solution() == ['Sibiu', 'Rimnicu', 'Pitesti', 'Bucharest'] + assert uniform_cost_search(n_queens).solution() == [0, 4, 7, 5, 2, 6, 1, 3] + + +def test_depth_first_tree_search(): + assert depth_first_tree_search(n_queens).solution() == [7, 3, 0, 2, 5, 1, 6, 4] def test_depth_first_graph_search(): @@ -60,20 +71,120 @@ def test_depth_limited_search(): def test_bidirectional_search(): assert bidirectional_search(romania_problem) == 418 + assert bidirectional_search(eight_puzzle) == 12 + assert bidirectional_search(EightPuzzle((1, 2, 3, 4, 5, 6, 0, 7, 8))) == 2 def test_astar_search(): assert astar_search(romania_problem).solution() == ['Sibiu', 'Rimnicu', 'Pitesti', 'Bucharest'] + assert astar_search(eight_puzzle).solution() == ['LEFT', 'LEFT', 'UP', 'RIGHT', 'RIGHT', 'DOWN', 'LEFT', 'UP', + 'LEFT', 'DOWN', 'RIGHT', 'RIGHT'] + assert astar_search(EightPuzzle((1, 2, 3, 4, 5, 6, 0, 7, 8))).solution() == ['RIGHT', 'RIGHT'] + assert astar_search(n_queens).solution() == [7, 1, 3, 0, 6, 4, 2, 5] + + +def test_find_blank_square(): + assert eight_puzzle.find_blank_square((0, 1, 2, 3, 4, 5, 6, 7, 8)) == 0 + assert eight_puzzle.find_blank_square((6, 3, 5, 1, 8, 4, 2, 0, 7)) == 7 + assert eight_puzzle.find_blank_square((3, 4, 1, 7, 6, 0, 2, 8, 5)) == 5 + assert eight_puzzle.find_blank_square((1, 8, 4, 7, 2, 6, 3, 0, 5)) == 7 + assert eight_puzzle.find_blank_square((4, 8, 1, 6, 0, 2, 3, 5, 7)) == 4 + 
assert eight_puzzle.find_blank_square((1, 0, 6, 8, 7, 5, 4, 2, 3)) == 1 + assert eight_puzzle.find_blank_square((1, 2, 3, 4, 5, 6, 7, 8, 0)) == 8 + + +def test_actions(): + assert eight_puzzle.actions((0, 1, 2, 3, 4, 5, 6, 7, 8)) == ['DOWN', 'RIGHT'] + assert eight_puzzle.actions((6, 3, 5, 1, 8, 4, 2, 0, 7)) == ['UP', 'LEFT', 'RIGHT'] + assert eight_puzzle.actions((3, 4, 1, 7, 6, 0, 2, 8, 5)) == ['UP', 'DOWN', 'LEFT'] + assert eight_puzzle.actions((1, 8, 4, 7, 2, 6, 3, 0, 5)) == ['UP', 'LEFT', 'RIGHT'] + assert eight_puzzle.actions((4, 8, 1, 6, 0, 2, 3, 5, 7)) == ['UP', 'DOWN', 'LEFT', 'RIGHT'] + assert eight_puzzle.actions((1, 0, 6, 8, 7, 5, 4, 2, 3)) == ['DOWN', 'LEFT', 'RIGHT'] + assert eight_puzzle.actions((1, 2, 3, 4, 5, 6, 7, 8, 0)) == ['UP', 'LEFT'] + + +def test_result(): + assert eight_puzzle.result((0, 1, 2, 3, 4, 5, 6, 7, 8), 'DOWN') == (3, 1, 2, 0, 4, 5, 6, 7, 8) + assert eight_puzzle.result((6, 3, 5, 1, 8, 4, 2, 0, 7), 'LEFT') == (6, 3, 5, 1, 8, 4, 0, 2, 7) + assert eight_puzzle.result((3, 4, 1, 7, 6, 0, 2, 8, 5), 'UP') == (3, 4, 0, 7, 6, 1, 2, 8, 5) + assert eight_puzzle.result((1, 8, 4, 7, 2, 6, 3, 0, 5), 'RIGHT') == (1, 8, 4, 7, 2, 6, 3, 5, 0) + assert eight_puzzle.result((4, 8, 1, 6, 0, 2, 3, 5, 7), 'LEFT') == (4, 8, 1, 0, 6, 2, 3, 5, 7) + assert eight_puzzle.result((1, 0, 6, 8, 7, 5, 4, 2, 3), 'DOWN') == (1, 7, 6, 8, 0, 5, 4, 2, 3) + assert eight_puzzle.result((1, 2, 3, 4, 5, 6, 7, 8, 0), 'UP') == (1, 2, 3, 4, 5, 0, 7, 8, 6) + assert eight_puzzle.result((4, 8, 1, 6, 0, 2, 3, 5, 7), 'RIGHT') == (4, 8, 1, 6, 2, 0, 3, 5, 7) + + +def test_goal_test(): + assert not eight_puzzle.goal_test((0, 1, 2, 3, 4, 5, 6, 7, 8)) + assert not eight_puzzle.goal_test((6, 3, 5, 1, 8, 4, 2, 0, 7)) + assert not eight_puzzle.goal_test((3, 4, 1, 7, 6, 0, 2, 8, 5)) + assert eight_puzzle.goal_test((1, 2, 3, 4, 5, 6, 7, 8, 0)) + assert not eight_puzzle2.goal_test((4, 8, 1, 6, 0, 2, 3, 5, 7)) + assert not eight_puzzle2.goal_test((3, 4, 1, 7, 6, 0, 2, 8, 5)) + assert not 
eight_puzzle2.goal_test((1, 2, 3, 4, 5, 6, 7, 8, 0)) + assert eight_puzzle2.goal_test((0, 1, 2, 3, 4, 5, 6, 7, 8)) + assert n_queens.goal_test((7, 3, 0, 2, 5, 1, 6, 4)) + assert n_queens.goal_test((0, 4, 7, 5, 2, 6, 1, 3)) + assert n_queens.goal_test((7, 1, 3, 0, 6, 4, 2, 5)) + assert not n_queens.goal_test((0, 1, 2, 3, 4, 5, 6, 7)) + + +def test_check_solvability(): + assert eight_puzzle.check_solvability((0, 1, 2, 3, 4, 5, 6, 7, 8)) + assert eight_puzzle.check_solvability((6, 3, 5, 1, 8, 4, 2, 0, 7)) + assert eight_puzzle.check_solvability((3, 4, 1, 7, 6, 0, 2, 8, 5)) + assert eight_puzzle.check_solvability((1, 8, 4, 7, 2, 6, 3, 0, 5)) + assert eight_puzzle.check_solvability((4, 8, 1, 6, 0, 2, 3, 5, 7)) + assert eight_puzzle.check_solvability((1, 0, 6, 8, 7, 5, 4, 2, 3)) + assert eight_puzzle.check_solvability((1, 2, 3, 4, 5, 6, 7, 8, 0)) + assert not eight_puzzle.check_solvability((1, 2, 3, 4, 5, 6, 8, 7, 0)) + assert not eight_puzzle.check_solvability((1, 0, 3, 2, 4, 5, 6, 7, 8)) + assert not eight_puzzle.check_solvability((7, 0, 2, 8, 5, 3, 6, 4, 1)) + + +def test_conflict(): + assert not n_queens.conflict(7, 0, 1, 1) + assert not n_queens.conflict(0, 3, 6, 4) + assert not n_queens.conflict(2, 6, 5, 7) + assert not n_queens.conflict(2, 4, 1, 6) + assert n_queens.conflict(0, 0, 1, 1) + assert n_queens.conflict(4, 3, 4, 4) + assert n_queens.conflict(6, 5, 5, 6) + assert n_queens.conflict(0, 6, 1, 7) def test_recursive_best_first_search(): assert recursive_best_first_search( romania_problem).solution() == ['Sibiu', 'Rimnicu', 'Pitesti', 'Bucharest'] + assert recursive_best_first_search( + EightPuzzle((2, 4, 3, 1, 5, 6, 7, 8, 0))).solution() == [ + 'UP', 'LEFT', 'UP', 'LEFT', 'DOWN', 'RIGHT', 'RIGHT', 'DOWN'] + + def manhattan(node): + state = node.state + index_goal = {0: [2, 2], 1: [0, 0], 2: [0, 1], 3: [0, 2], 4: [1, 0], 5: [1, 1], 6: [1, 2], 7: [2, 0], 8: [2, 1]} + index_state = {} + index = [[0, 0], [0, 1], [0, 2], [1, 0], [1, 1], [1, 2], [2, 0], [2, 1], [2, 
2]] + + for i in range(len(state)): + index_state[state[i]] = index[i] + + mhd = 0 + + for i in range(8): + for j in range(2): + mhd = abs(index_goal[i][j] - index_state[i][j]) + mhd + + return mhd + + assert recursive_best_first_search( + EightPuzzle((2, 4, 3, 1, 5, 6, 7, 8, 0)), h=manhattan).solution() == [ + 'LEFT', 'UP', 'UP', 'LEFT', 'DOWN', 'RIGHT', 'DOWN', 'UP', 'DOWN', 'RIGHT'] def test_hill_climbing(): prob = PeakFindingProblem((0, 0), [[0, 5, 10, 20], - [-3, 7, 11, 5]]) + [-3, 7, 11, 5]]) assert hill_climbing(prob) == (0, 3) prob = PeakFindingProblem((0, 0), [[0, 5, 10, 8], [-3, 7, 9, 999], @@ -86,14 +197,13 @@ def test_hill_climbing(): def test_simulated_annealing(): - random.seed("aima-python") prob = PeakFindingProblem((0, 0), [[0, 5, 10, 20], - [-3, 7, 11, 5]]) - sols = {prob.value(simulated_annealing(prob)) for i in range(100)} + [-3, 7, 11, 5]], directions4) + sols = {prob.value(simulated_annealing(prob)) for _ in range(100)} assert max(sols) == 20 prob = PeakFindingProblem((0, 0), [[0, 5, 10, 8], [-3, 7, 9, 999], - [1, 2, 5, 11]]) + [1, 2, 5, 11]], directions8) sols = {prob.value(simulated_annealing(prob)) for i in range(100)} assert max(sols) == 999 @@ -114,37 +224,44 @@ def test_and_or_graph_search(): def run_plan(state, problem, plan): if problem.goal_test(state): return True - if len(plan) is not 2: + if len(plan) != 2: return False predicate = lambda x: run_plan(x, problem, plan[1][x]) return all(predicate(r) for r in problem.result(state, plan[0])) - plan = and_or_graph_search(vacumm_world) - assert run_plan('State_1', vacumm_world, plan) + + plan = and_or_graph_search(vacuum_world) + assert run_plan('State_1', vacuum_world, plan) + + +def test_online_dfs_agent(): + odfs_agent = OnlineDFSAgent(LRTA_problem) + keys = [key for key in odfs_agent('State_3')] + assert keys[0] in ['Right', 'Left'] + assert keys[1] in ['Right', 'Left'] + assert odfs_agent('State_5') is None def test_LRTAStarAgent(): - my_agent = LRTAStarAgent(LRTA_problem) - assert 
my_agent('State_3') == 'Right' - assert my_agent('State_4') == 'Left' - assert my_agent('State_3') == 'Right' - assert my_agent('State_4') == 'Right' - assert my_agent('State_5') is None + lrta_agent = LRTAStarAgent(LRTA_problem) + assert lrta_agent('State_3') == 'Right' + assert lrta_agent('State_4') == 'Left' + assert lrta_agent('State_3') == 'Right' + assert lrta_agent('State_4') == 'Right' + assert lrta_agent('State_5') is None - my_agent = LRTAStarAgent(LRTA_problem) - assert my_agent('State_4') == 'Left' + lrta_agent = LRTAStarAgent(LRTA_problem) + assert lrta_agent('State_4') == 'Left' - my_agent = LRTAStarAgent(LRTA_problem) - assert my_agent('State_5') is None + lrta_agent = LRTAStarAgent(LRTA_problem) + assert lrta_agent('State_5') is None def test_genetic_algorithm(): # Graph coloring - edges = { - 'A': [0, 1], - 'B': [0, 3], - 'C': [1, 2], - 'D': [2, 3] - } + edges = {'A': [0, 1], + 'B': [0, 3], + 'C': [1, 2], + 'D': [2, 3]} def fitness(c): return sum(c[n1] != c[n2] for (n1, n2) in edges.values()) @@ -165,7 +282,7 @@ def fitness(c): def fitness(q): non_attacking = 0 for row1 in range(len(q)): - for row2 in range(row1+1, len(q)): + for row2 in range(row1 + 1, len(q)): col1 = int(q[row1]) col2 = int(q[row2]) row_diff = row1 - row2 @@ -176,7 +293,6 @@ def fitness(q): return non_attacking - solution = genetic_algorithm(population, fitness, gene_pool=gene_pool, f_thres=25) assert fitness(solution) >= 25 @@ -201,13 +317,56 @@ def GA_GraphColoringInts(edges, fitness): return genetic_algorithm(population, fitness) +def test_simpleProblemSolvingAgent(): + class vacuumAgent(SimpleProblemSolvingAgentProgram): + def update_state(self, state, percept): + return percept + + def formulate_goal(self, state): + goal = [state7, state8] + return goal + + def formulate_problem(self, state, goal): + problem = state + return problem + + def search(self, problem): + if problem == state1: + seq = ["Suck", "Right", "Suck"] + elif problem == state2: + seq = ["Suck", "Left", 
"Suck"] + elif problem == state3: + seq = ["Right", "Suck"] + elif problem == state4: + seq = ["Suck"] + elif problem == state5: + seq = ["Suck"] + elif problem == state6: + seq = ["Left", "Suck"] + return seq + + state1 = [(0, 0), [(0, 0), "Dirty"], [(1, 0), "Dirty"]] + state2 = [(1, 0), [(0, 0), "Dirty"], [(1, 0), "Dirty"]] + state3 = [(0, 0), [(0, 0), "Clean"], [(1, 0), "Dirty"]] + state4 = [(1, 0), [(0, 0), "Clean"], [(1, 0), "Dirty"]] + state5 = [(0, 0), [(0, 0), "Dirty"], [(1, 0), "Clean"]] + state6 = [(1, 0), [(0, 0), "Dirty"], [(1, 0), "Clean"]] + state7 = [(0, 0), [(0, 0), "Clean"], [(1, 0), "Clean"]] + state8 = [(1, 0), [(0, 0), "Clean"], [(1, 0), "Clean"]] + + a = vacuumAgent(state1) + + assert a(state6) == "Left" + assert a(state1) == "Suck" + assert a(state3) == "Right" + # TODO: for .ipynb: """ >>> compare_graph_searchers() Searcher romania_map(A, B) romania_map(O, N) australia_map breadth_first_tree_search < 21/ 22/ 59/B> <1158/1159/3288/N> < 7/ 8/ 22/WA> - breadth_first_search < 7/ 11/ 18/B> < 19/ 20/ 45/N> < 2/ 6/ 8/WA> + breadth_first_graph_search < 7/ 11/ 18/B> < 19/ 20/ 45/N> < 2/ 6/ 8/WA> depth_first_graph_search < 8/ 9/ 20/B> < 16/ 17/ 38/N> < 4/ 5/ 11/WA> iterative_deepening_search < 11/ 33/ 31/B> < 656/1815/1812/N> < 3/ 11/ 11/WA> depth_limited_search < 54/ 65/ 185/B> < 387/1012/1125/N> < 50/ 54/ 200/WA> diff --git a/tests/test_text.py b/tests/test_text.py index 9e1aeb2f2..3aaa007f6 100644 --- a/tests/test_text.py +++ b/tests/test_text.py @@ -1,10 +1,12 @@ -import pytest -import os import random +import numpy as np +import pytest + from text import * -from utils import isclose, open_data +from utils import open_data +random.seed("aima-python") def test_text_models(): @@ -30,9 +32,9 @@ def test_text_models(): (13, ('as', 'well', 'as'))] # Test isclose - assert isclose(P1['the'], 0.0611, rel_tol=0.001) - assert isclose(P2['of', 'the'], 0.0108, rel_tol=0.01) - assert isclose(P3['so', 'as', 'to'], 0.000323, rel_tol=0.001) +
assert np.isclose(P1['the'], 0.0611, rtol=0.001) + assert np.isclose(P2['of', 'the'], 0.0108, rtol=0.01) + assert np.isclose(P3['so', 'as', 'to'], 0.000323, rtol=0.001) # Test cond_prob.get assert P2.cond_prob.get(('went',)) is None @@ -123,7 +125,7 @@ def test_char_models(): def test_samples(): story = open_data("EN-text/flatland.txt").read() - story += open_data("EN-text/gutenberg.txt").read() + story += open_data("gutenberg.txt").read() wordseq = words(story) P1 = UnigramWordModel(wordseq) P2 = NgramWordModel(2, wordseq) @@ -164,14 +166,15 @@ def test_shift_decoding(): def test_permutation_decoder(): - gutenberg = open_data("EN-text/gutenberg.txt").read() + gutenberg = open_data("gutenberg.txt").read() flatland = open_data("EN-text/flatland.txt").read() pd = PermutationDecoder(canonicalize(gutenberg)) assert pd.decode('aba') in ('ece', 'ete', 'tat', 'tit', 'txt') pd = PermutationDecoder(canonicalize(flatland)) - assert pd.decode('aba') in ('ded', 'did', 'ece', 'ele', 'eme', 'ere', 'eve', 'eye', 'iti', 'mom', 'ses', 'tat', 'tit') + assert pd.decode('aba') in ( + 'ded', 'did', 'ece', 'ele', 'eme', 'ere', 'eve', 'eye', 'iti', 'mom', 'ses', 'tat', 'tit') def test_rot13_encoding(): @@ -227,8 +230,7 @@ def verify_query(query, expected): Results(62.95, "aima-data/MAN/shred.txt"), Results(57.46, "aima-data/MAN/pico.txt"), Results(43.38, "aima-data/MAN/login.txt"), - Results(41.93, "aima-data/MAN/ln.txt"), - ]) + Results(41.93, "aima-data/MAN/ln.txt")]) q2 = uc.query("how do I delete a file") assert verify_query(q2, [ @@ -238,8 +240,7 @@ def verify_query(query, expected): Results(60.63, "aima-data/MAN/zip.txt"), Results(57.46, "aima-data/MAN/pico.txt"), Results(51.28, "aima-data/MAN/shred.txt"), - Results(26.72, "aima-data/MAN/tr.txt"), - ]) + Results(26.72, "aima-data/MAN/tr.txt")]) q3 = uc.query("email") assert verify_query(q3, [ @@ -247,8 +248,7 @@ def verify_query(query, expected): Results(12.01, "aima-data/MAN/info.txt"), Results(9.89, "aima-data/MAN/pico.txt"), 
Results(8.73, "aima-data/MAN/grep.txt"), - Results(8.07, "aima-data/MAN/zip.txt"), - ]) + Results(8.07, "aima-data/MAN/zip.txt")]) q4 = uc.query("word count for files") assert verify_query(q4, [ @@ -258,8 +258,7 @@ def verify_query(query, expected): Results(55.45, "aima-data/MAN/ps.txt"), Results(53.42, "aima-data/MAN/more.txt"), Results(42.00, "aima-data/MAN/dd.txt"), - Results(12.85, "aima-data/MAN/who.txt"), - ]) + Results(12.85, "aima-data/MAN/who.txt")]) q5 = uc.query("learn: date") assert verify_query(q5, []) @@ -267,8 +266,7 @@ def verify_query(query, expected): q6 = uc.query("2003") assert verify_query(q6, [ Results(14.58, "aima-data/MAN/pine.txt"), - Results(11.62, "aima-data/MAN/jar.txt"), - ]) + Results(11.62, "aima-data/MAN/jar.txt")]) def test_words(): @@ -281,7 +279,7 @@ def test_canonicalize(): def test_translate(): text = 'orange apple lemon ' - func = lambda x: ('s ' + x) if x ==' ' else x + func = lambda x: ('s ' + x) if x == ' ' else x assert translate(text, func) == 'oranges apples lemons ' @@ -291,6 +289,5 @@ def test_bigrams(): assert bigrams(['this', 'is', 'a', 'test']) == [['this', 'is'], ['is', 'a'], ['a', 'test']] - if __name__ == '__main__': pytest.main() diff --git a/tests/test_utils.py b/tests/test_utils.py index a07bc76ef..6c2a50808 100644 --- a/tests/test_utils.py +++ b/tests/test_utils.py @@ -2,28 +2,51 @@ from utils import * import random +random.seed("aima-python") -def test_removeall_list(): - assert removeall(4, []) == [] - assert removeall(4, [1, 2, 3, 4]) == [1, 2, 3] - assert removeall(4, [4, 1, 4, 2, 3, 4, 4]) == [1, 2, 3] +def test_sequence(): + assert sequence(1) == (1,) + assert sequence("helloworld") == "helloworld" + assert sequence({"hello": 4, "world": 5}) == ({"hello": 4, "world": 5},) + assert sequence([1, 2, 3]) == [1, 2, 3] + assert sequence((4, 5, 6)) == (4, 5, 6) + assert sequence([(1, 2), (2, 3), (4, 5)]) == [(1, 2), (2, 3), (4, 5)] + assert sequence(([1, 2], [3, 4], [5, 6])) == ([1, 2], [3, 4], [5, 6]) -def 
test_removeall_string(): - assert removeall('s', '') == '' - assert removeall('s', 'This is a test. Was a test.') == 'Thi i a tet. Wa a tet.' + +def test_remove_all_list(): + assert remove_all(4, []) == [] + assert remove_all(4, [1, 2, 3, 4]) == [1, 2, 3] + assert remove_all(4, [4, 1, 4, 2, 3, 4, 4]) == [1, 2, 3] + assert remove_all(1, [2, 3, 4, 5, 6]) == [2, 3, 4, 5, 6] + + +def test_remove_all_string(): + assert remove_all('s', '') == '' + assert remove_all('s', 'This is a test. Was a test.') == 'Thi i a tet. Wa a tet.' + assert remove_all('a', 'artificial intelligence: a modern approach') == 'rtificil intelligence: modern pproch' def test_unique(): assert unique([1, 2, 3, 2, 1]) == [1, 2, 3] assert unique([1, 5, 6, 7, 6, 5]) == [1, 5, 6, 7] + assert unique([1, 2, 3, 4, 5]) == [1, 2, 3, 4, 5] def test_count(): assert count([1, 2, 3, 4, 2, 3, 4]) == 7 assert count("aldpeofmhngvia") == 14 assert count([True, False, True, True, False]) == 3 - assert count([5 > 1, len("abc") == 3, 3+1 == 5]) == 2 + assert count([5 > 1, len("abc") == 3, 3 + 1 == 5]) == 2 + assert count("aima") == 4 + + +def test_multimap(): + assert multimap([(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (4, 5)]) == \ + {1: [2, 3, 4], 2: [3, 4], 4: [5]} + assert multimap([("a", 2), ("a", 3), ("a", 4), ("b", 3), ("b", 4), ("c", 5)]) == \ + {'a': [2, 3, 4], 'b': [3, 4], 'c': [5]} def test_product(): @@ -35,9 +58,15 @@ def test_first(): assert first('word') == 'w' assert first('') is None assert first('', 'empty') == 'empty' + assert first([1, 2, 3, 4, 5]) == 1 + assert first([]) is None assert first(range(10)) == 0 assert first(x for x in range(10) if x > 3) == 4 assert first(x for x in range(10) if x > 100) is None + assert first((1, 2, 3)) == 1 + assert first(range(2, 10)) == 2 + assert first([(1, 2), (1, 3), (1, 4)]) == (1, 2) + assert first({1: "one", 2: "two", 3: "three"}) == 1 def test_is_in(): @@ -49,80 +78,82 @@ def test_is_in(): def test_mode(): assert mode([12, 32, 2, 1, 2, 3, 2, 3, 2, 3, 44, 3, 12, 
4, 9, 0, 3, 45, 3]) == 3 assert mode("absndkwoajfkalwpdlsdlfllalsflfdslgflal") == 'l' + assert mode("artificialintelligence") == 'i' + +def test_power_set(): + assert power_set([1, 2, 3]) == [(1,), (2,), (3,), (1, 2), (1, 3), (2, 3), (1, 2, 3)] -def test_powerset(): - assert powerset([1, 2, 3]) == [(1,), (2,), (3,), (1, 2), (1, 3), (2, 3), (1, 2, 3)] +def test_histogram(): + assert histogram([1, 2, 4, 2, 4, 5, 7, 9, 2, 1]) == [(1, 2), (2, 3), (4, 2), (5, 1), (7, 1), (9, 1)] + assert histogram([1, 2, 4, 2, 4, 5, 7, 9, 2, 1], 0, lambda x: x * x) == \ + [(1, 2), (4, 3), (16, 2), (25, 1), (49, 1), (81, 1)] + assert histogram([1, 2, 4, 2, 4, 5, 7, 9, 2, 1], 1) == [(2, 3), (4, 2), (1, 2), (9, 1), (7, 1), (5, 1)] -def test_argminmax(): - assert argmin([-2, 1], key=abs) == 1 - assert argmax([-2, 1], key=abs) == -2 - assert argmax(['one', 'to', 'three'], key=len) == 'three' +def test_euclidean(): + distance = euclidean_distance([1, 2], [3, 4]) + assert round(distance, 2) == 2.83 -def test_histogram(): - assert histogram([1, 2, 4, 2, 4, 5, 7, 9, 2, 1]) == [(1, 2), (2, 3), - (4, 2), (5, 1), - (7, 1), (9, 1)] - assert histogram([1, 2, 4, 2, 4, 5, 7, 9, 2, 1], 0, lambda x: x*x) == [(1, 2), (4, 3), - (16, 2), (25, 1), - (49, 1), (81, 1)] - assert histogram([1, 2, 4, 2, 4, 5, 7, 9, 2, 1], 1) == [(2, 3), (4, 2), - (1, 2), (9, 1), - (7, 1), (5, 1)] + distance = euclidean_distance([1, 2, 3], [4, 5, 6]) + assert round(distance, 2) == 5.2 + distance = euclidean_distance([0, 0, 0], [0, 0, 0]) + assert distance == 0 -def test_dotproduct(): - assert dotproduct([1, 2, 3], [1000, 100, 10]) == 1230 +def test_cross_entropy(): + loss = cross_entropy_loss([1, 0], [0.9, 0.3]) + assert round(loss, 2) == 0.23 -def test_element_wise_product(): - assert element_wise_product([1, 2, 5], [7, 10, 0]) == [7, 20, 0] - assert element_wise_product([1, 6, 3, 0], [9, 12, 0, 0]) == [9, 72, 0, 0] + loss = cross_entropy_loss([1, 0, 0, 1], [0.9, 0.3, 0.5, 0.75]) + assert round(loss, 2) == 0.36 + loss = 
cross_entropy_loss([1, 0, 0, 1, 1, 0, 1, 1], [0.9, 0.3, 0.5, 0.75, 0.85, 0.14, 0.93, 0.79]) + assert round(loss, 2) == 0.26 -def test_matrix_multiplication(): - assert matrix_multiplication([[1, 2, 3], - [2, 3, 4]], - [[3, 4], - [1, 2], - [1, 0]]) == [[8, 8], [13, 14]] - assert matrix_multiplication([[1, 2, 3], - [2, 3, 4]], - [[3, 4, 8, 1], - [1, 2, 5, 0], - [1, 0, 0, 3]], - [[1, 2], - [3, 4], - [5, 6], - [1, 2]]) == [[132, 176], [224, 296]] +def test_rms_error(): + assert rms_error([2, 2], [2, 2]) == 0 + assert rms_error((0, 0), (0, 1)) == np.sqrt(0.5) + assert rms_error((1, 0), (0, 1)) == 1 + assert rms_error((0, 0), (0, -1)) == np.sqrt(0.5) + assert rms_error((0, 0.5), (0, -0.5)) == np.sqrt(0.5) -def test_vector_to_diagonal(): - assert vector_to_diagonal([1, 2, 3]) == [[1, 0, 0], [0, 2, 0], [0, 0, 3]] - assert vector_to_diagonal([0, 3, 6]) == [[0, 0, 0], [0, 3, 0], [0, 0, 6]] +def test_manhattan_distance(): + assert manhattan_distance([2, 2], [2, 2]) == 0 + assert manhattan_distance([0, 0], [0, 1]) == 1 + assert manhattan_distance([1, 0], [0, 1]) == 2 + assert manhattan_distance([0, 0], [0, -1]) == 1 + assert manhattan_distance([0, 0.5], [0, -0.5]) == 1 -def test_vector_add(): - assert vector_add((0, 1), (8, 9)) == (8, 10) +def test_mean_boolean_error(): + assert mean_boolean_error([1, 1], [0, 0]) == 1 + assert mean_boolean_error([0, 1], [1, 0]) == 1 + assert mean_boolean_error([1, 1], [0, 1]) == 0.5 + assert mean_boolean_error([0, 0], [0, 0]) == 0 + assert mean_boolean_error([1, 1], [1, 1]) == 0 -def test_scalar_vector_product(): - assert scalar_vector_product(2, [1, 2, 3]) == [2, 4, 6] +def test_mean_error(): + assert mean_error([2, 2], [2, 2]) == 0 + assert mean_error([0, 0], [0, 1]) == 0.5 + assert mean_error([1, 0], [0, 1]) == 1 + assert mean_error([0, 0], [0, -1]) == 0.5 + assert mean_error([0, 0.5], [0, -0.5]) == 0.5 -def test_scalar_matrix_product(): - assert rounder(scalar_matrix_product(-5, [[1, 2], [3, 4], [0, 6]])) == [[-5, -10], [-15, -20], - [0, 
-30]] - assert rounder(scalar_matrix_product(0.2, [[1, 2], [2, 3]])) == [[0.2, 0.4], [0.4, 0.6]] +def test_dot_product(): + assert dot_product([1, 2, 3], [1000, 100, 10]) == 1230 + assert dot_product([1, 2, 3], [0, 0, 0]) == 0 -def test_inverse_matrix(): - assert rounder(inverse_matrix([[1, 0], [0, 1]])) == [[1, 0], [0, 1]] - assert rounder(inverse_matrix([[2, 1], [4, 3]])) == [[1.5, -0.5], [-2.0, 1.0]] - assert rounder(inverse_matrix([[4, 7], [2, 6]])) == [[0.6, -0.7], [-0.2, 0.4]] +def test_vector_add(): + assert vector_add((0, 1), (8, 9)) == (8, 10) + assert vector_add((1, 1, 1), (2, 2, 2)) == (3, 3, 3) def test_rounder(): @@ -130,8 +161,7 @@ def test_rounder(): assert rounder(10.234566) == 10.2346 assert rounder([1.234566, 0.555555, 6.010101]) == [1.2346, 0.5556, 6.0101] assert rounder([[1.234566, 0.555555, 6.010101], - [10.505050, 12.121212, 6.030303]]) == [[1.2346, 0.5556, 6.0101], - [10.5051, 12.1212, 6.0303]] + [10.505050, 12.121212, 6.030303]]) == [[1.2346, 0.5556, 6.0101], [10.5051, 12.1212, 6.0303]] def test_num_or_str(): @@ -143,34 +173,10 @@ def test_normalize(): assert normalize([1, 2, 1]) == [0.25, 0.5, 0.25] -def test_norm(): - assert isclose(norm([1, 2, 1], 1), 4) - assert isclose(norm([3, 4], 2), 5) - assert isclose(norm([-1, 1, 2], 4), 18**0.25) - - -def test_clip(): - assert [clip(x, 0, 1) for x in [-1, 0.5, 10]] == [0, 0.5, 1] - - -def test_sigmoid(): - assert isclose(0.5, sigmoid(0)) - assert isclose(0.7310585786300049, sigmoid(1)) - assert isclose(0.2689414213699951, sigmoid(-1)) - - def test_gaussian(): - assert gaussian(1,0.5,0.7) == 0.6664492057835993 - assert gaussian(5,2,4.5) == 0.19333405840142462 - assert gaussian(3,1,3) == 0.3989422804014327 - - -def test_sigmoid_derivative(): - value = 1 - assert sigmoid_derivative(value) == 0 - - value = 3 - assert sigmoid_derivative(value) == -6 + assert gaussian(1, 0.5, 0.7) == 0.6664492057835993 + assert gaussian(5, 2, 4.5) == 0.19333405840142462 + assert gaussian(3, 1, 3) == 0.3989422804014327 
def test_weighted_choice(): @@ -191,27 +197,23 @@ def test_distance_squared(): assert distance_squared((1, 2), (5, 5)) == 25.0 -def test_vector_clip(): - assert vector_clip((-1, 10), (0, 0), (9, 9)) == (0, 9) - - def test_turn_heading(): - assert turn_heading((0, 1), 1) == (-1, 0) - assert turn_heading((0, 1), -1) == (1, 0) - assert turn_heading((1, 0), 1) == (0, 1) - assert turn_heading((1, 0), -1) == (0, -1) - assert turn_heading((0, -1), 1) == (1, 0) - assert turn_heading((0, -1), -1) == (-1, 0) - assert turn_heading((-1, 0), 1) == (0, -1) - assert turn_heading((-1, 0), -1) == (0, 1) + assert turn_heading((0, 1), 1) == (-1, 0) + assert turn_heading((0, 1), -1) == (1, 0) + assert turn_heading((1, 0), 1) == (0, 1) + assert turn_heading((1, 0), -1) == (0, -1) + assert turn_heading((0, -1), 1) == (1, 0) + assert turn_heading((0, -1), -1) == (-1, 0) + assert turn_heading((-1, 0), 1) == (0, -1) + assert turn_heading((-1, 0), -1) == (0, 1) def test_turn_left(): - assert turn_left((0, 1)) == (-1, 0) + assert turn_left((0, 1)) == (-1, 0) def test_turn_right(): - assert turn_right((0, 1)) == (1, 0) + assert turn_right((0, 1)) == (1, 0) def test_step(): @@ -252,56 +254,50 @@ def test_expr(): assert expr('P & Q <=> Q & P') == Expr('<=>', (P & Q), (Q & P)) assert expr('P(x) | P(y) & Q(z)') == (P(x) | (P(y) & Q(z))) # x is grandparent of z if x is parent of y and y is parent of z: - assert (expr('GP(x, z) <== P(x, y) & P(y, z)') - == Expr('<==', GP(x, z), P(x, y) & P(y, z))) - -def test_FIFOQueue() : - # Create an object - queue = FIFOQueue() - # Generate an array of number to be used for testing - test_data = [ random.choice(range(100)) for i in range(100) ] - # Index of the element to be added in the queue - front_head = 0 - # Index of the element to be removed from the queue - back_head = 0 - while front_head < 100 or back_head < 100 : - if front_head == 100 : # only possible to remove - # check for pop and append method - assert queue.pop() == test_data[back_head] - 
back_head += 1 - elif back_head == front_head : # only possible to push element into queue - queue.append(test_data[front_head]) - front_head += 1 - # else do it in a random manner - elif random.random() < 0.5 : - assert queue.pop() == test_data[back_head] - back_head += 1 - else : - queue.append(test_data[front_head]) - front_head += 1 - # check for __len__ method - assert len(queue) == front_head - back_head - # chek for __contains__ method - if front_head - back_head > 0 : - assert random.choice(test_data[back_head:front_head]) in queue - - # check extend method - test_data1 = [ random.choice(range(100)) for i in range(50) ] - test_data2 = [ random.choice(range(100)) for i in range(50) ] - # append elements of test data 1 - queue.extend(test_data1) - # append elements of test data 2 - queue.extend(test_data2) - # reset front_head - front_head = 0 - - while front_head < 50 : - assert test_data1[front_head] == queue.pop() - front_head += 1 - - while front_head < 100 : - assert test_data2[front_head - 50] == queue.pop() - front_head += 1 + assert (expr('GP(x, z) <== P(x, y) & P(y, z)') == Expr('<==', GP(x, z), P(x, y) & P(y, z))) + + +def test_min_priority_queue(): + queue = PriorityQueue(f=lambda x: x[1]) + queue.append((1, 100)) + queue.append((2, 30)) + queue.append((3, 50)) + assert queue.pop() == (2, 30) + assert len(queue) == 2 + assert queue[(3, 50)] == 50 + assert (1, 100) in queue + del queue[(1, 100)] + assert (1, 100) not in queue + queue.extend([(1, 100), (4, 10)]) + assert queue.pop() == (4, 10) + assert len(queue) == 2 + + +def test_max_priority_queue(): + queue = PriorityQueue(order='max', f=lambda x: x[1]) + queue.append((1, 100)) + queue.append((2, 30)) + queue.append((3, 50)) + assert queue.pop() == (1, 100) + + +def test_priority_queue_with_objects(): + class Test: + def __init__(self, a, b): + self.a = a + self.b = b + + def __eq__(self, other): + return self.a == other.a + + queue = PriorityQueue(f=lambda x: x.b) + queue.append(Test(1, 100)) + 
other = Test(1, 10) + assert queue[other] == 100 + assert other in queue + del queue[other] + assert len(queue) == 0 + if __name__ == '__main__': pytest.main() diff --git a/text.ipynb b/text.ipynb index f1c61e175..327bd1160 100644 --- a/text.ipynb +++ b/text.ipynb @@ -18,7 +18,8 @@ "outputs": [], "source": [ "from text import *\n", - "from utils import open_data" + "from utils import open_data\n", + "from notebook import psource" ] }, { @@ -55,46 +56,11 @@ }, { "cell_type": "code", - "execution_count": 2, - "metadata": { - "collapsed": true - }, - "outputs": [], - "source": [ - "%psource UnigramWordModel" - ] - }, - { - "cell_type": "code", - "execution_count": 3, - "metadata": { - "collapsed": true - }, - "outputs": [], - "source": [ - "%psource NgramWordModel" - ] - }, - { - "cell_type": "code", - "execution_count": 4, - "metadata": { - "collapsed": true - }, - "outputs": [], - "source": [ - "%psource UnigramCharModel" - ] - }, - { - "cell_type": "code", - "execution_count": 6, - "metadata": { - "collapsed": true - }, + "execution_count": null, + "metadata": {}, "outputs": [], "source": [ - "%psource NgramCharModel" + "psource(UnigramWordModel, NgramWordModel, UnigramCharModel, NgramCharModel)" ] }, { @@ -117,7 +83,7 @@ }, { "cell_type": "code", - "execution_count": 8, + "execution_count": 2, "metadata": {}, "outputs": [ { @@ -149,25 +115,25 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We see that the most used word in *Flatland* is 'the', with 2081 occurences, while the most used sequence is 'of the' with 368 occurences. Also, the probability of 'an' is approximately 0.003, while for 'i was' it is close to 0.001. Note that the strings used as keys are all lowercase. For the unigram model, the keys are single strings, while for n-gram models we have n-tuples of strings.\n", + "We see that the most used word in *Flatland* is 'the', with 2081 occurrences, while the most used sequence is 'of the' with 368 occurrences. 
Also, the probability of 'an' is approximately 0.003, while for 'i was' it is close to 0.001. Note that the strings used as keys are all lowercase. For the unigram model, the keys are single strings, while for n-gram models we have n-tuples of strings.\n", "\n", "Below we take a look at how we can get information from the conditional probabilities of the model, and how we can generate the next word in a sequence." ] }, { "cell_type": "code", - "execution_count": 12, + "execution_count": 3, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ - "Conditional Probabilities Table: {'myself': 1, 'to': 2, 'at': 2, 'pleased': 1, 'considered': 1, 'will': 1, 'intoxicated': 1, 'glad': 1, 'certain': 2, 'in': 2, 'now': 2, 'sitting': 1, 'unusually': 1, 'approaching': 1, 'by': 1, 'covered': 1, 'standing': 1, 'allowed': 1, 'surprised': 1, 'keenly': 1, 'afraid': 1, 'once': 2, 'crushed': 1, 'not': 4, 'rapt': 1, 'simulating': 1, 'rapidly': 1, 'quite': 1, 'describing': 1, 'wearied': 1} \n", + "Conditional Probabilities Table: {'now': 2, 'glad': 1, 'keenly': 1, 'considered': 1, 'once': 2, 'not': 4, 'in': 2, 'by': 1, 'simulating': 1, 'intoxicated': 1, 'wearied': 1, 'quite': 1, 'certain': 2, 'sitting': 1, 'to': 2, 'rapidly': 1, 'will': 1, 'describing': 1, 'allowed': 1, 'at': 2, 'afraid': 1, 'covered': 1, 'approaching': 1, 'standing': 1, 'myself': 1, 'surprised': 1, 'unusually': 1, 'rapt': 1, 'pleased': 1, 'crushed': 1} \n", "\n", "Conditional Probability of 'once' give 'i was': 0.05128205128205128 \n", "\n", - "Next word after 'i was': not\n" + "Next word after 'i was': wearied\n" ] } ], @@ -198,7 +164,7 @@ }, { "cell_type": "code", - "execution_count": 3, + "execution_count": 4, "metadata": {}, "outputs": [ { @@ -246,16 +212,16 @@ }, { "cell_type": "code", - "execution_count": 4, + "execution_count": 5, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ - "not it of before most regions multitudes the a three\n", - "the 
inhabitants of so also refers to the cube with\n", - "the service of education waxed daily more numerous than the\n" + "hearing as inside is confined to conduct by the duties\n", + "all and of voice being in a day of the\n", + "party they are stirred to mutual warfare and perish by\n" ] } ], @@ -283,23 +249,22 @@ }, { "cell_type": "code", - "execution_count": 19, + "execution_count": 6, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ - "it again stealing away through the ranks of his nephew but he laughed most immoderately\n", - "exclaiming that he henceforth exchanged them for the artist s pencil how great and glorious\n", - "compound now for nothing worse but however all that is quite out of the question\n", - "accordance with precedent and for the sake of secrecy he must condemn him to perpetual\n" + "leave them at cleveland this christmas now pray do not ask you to relate or\n", + "meaning and both of us sprang forward in the direction and no sooner had they\n", + "palmer though very unwilling to go as well from real humanity and good nature as\n", + "time about what they should do and they agreed he should take orders directly and\n" ] } ], "source": [ "data = open_data(\"EN-text/flatland.txt\").read()\n", - "data += open_data(\"EN-text/gutenberg.txt\").read()\n", "data += open_data(\"EN-text/sense.txt\").read()\n", "\n", "wordseq = words(data)\n", @@ -332,7 +297,7 @@ "\n", "We are given a string containing words of a sentence, but all the spaces are gone! It is very hard to read and we would like to separate the words in the string. We can accomplish this by employing the `Viterbi Segmentation` algorithm. It takes as input the string to segment and a text model, and it returns a list of the separate words.\n", "\n", - "The algorithm operates in a dynamic programming approach. It starts from the beginning of the string and iteratively builds the best solution using previous solutions. 
It accomplishes that by segmentating the string into \"windows\", each window representing a word (real or gibberish). It then calculates the probability of the sequence up that window/word occuring and updates its solution. When it is done, it traces back from the final word and finds the complete sequence of words." + "The algorithm follows a dynamic programming approach. It starts from the beginning of the string and iteratively builds the best solution using previous solutions. It accomplishes that by segmenting the string into \"windows\", each window representing a word (real or gibberish). It then calculates the probability of the sequence up to that window/word occurring and updates its solution. When it is done, it traces back from the final word and finds the complete sequence of words." ] }, { @@ -344,13 +309,11 @@ }, { "cell_type": "code", - "execution_count": 3, - "metadata": { - "collapsed": true - }, + "execution_count": null, + "metadata": {}, "outputs": [], "source": [ - "%psource viterbi_segment" + "psource(viterbi_segment)" ] }, { @@ -373,7 +336,7 @@ }, { "cell_type": "code", - "execution_count": 4, + "execution_count": 3, "metadata": {}, "outputs": [ { @@ -388,7 +351,7 @@ "source": [ "flatland = open_data(\"EN-text/flatland.txt\").read()\n", "wordseq = words(flatland)\n", - "P = UnigramTextModel(wordseq)\n", + "P = UnigramWordModel(wordseq)\n", "text = \"itiseasytoreadwordswithoutspaces\"\n", "\n", "s, p = viterbi_segment(text,P)\n", @@ -423,7 +386,7 @@ "\n", "How does an IR system determine which documents are relevant though? We can sign a document as relevant if all the words in the query appear in it, and sign it as irrelevant otherwise. We can even extend the query language to support boolean operations (for example, \"paint AND brush\") and then sign as relevant the outcome of the query for the document. This technique though does not give a level of relevancy.
All the documents are either relevant or irrelevant, but in reality some documents are more relevant than others.\n", "\n", - "So, instead of a boolean relevancy system, we use a *scoring function*. There are many scoring functions around for many different situations. One of the most used takes into account the frequency of the words appearing in a document, the frequency of a word appearing across documents (for example, the word \"a\" appears a lot, so it is not very important) and the length of a document (since large documents will have higher occurences for the query terms, but a short document with a lot of occurences seems very relevant). We combine these properties in a formula and we get a numeric score for each document, so we can then quantify relevancy and pick the best documents.\n", + "So, instead of a boolean relevancy system, we use a *scoring function*. There are many scoring functions around for many different situations. One of the most used takes into account the frequency of the words appearing in a document, the frequency of a word appearing across documents (for example, the word \"a\" appears a lot, so it is not very important) and the length of a document (since large documents will have higher occurrences for the query terms, but a short document with a lot of occurrences seems very relevant). We combine these properties in a formula and we get a numeric score for each document, so we can then quantify relevancy and pick the best documents.\n", "\n", "These scoring functions are not perfect though and there is room for improvement. For instance, for the above scoring function we assume each word is independent. That is not the case though, since words can share meaning. For example, the words \"painter\" and \"painters\" are closely related. 
If in a query we have the word \"painter\" and in a document the word \"painters\" appears a lot, this might be an indication that the document is relevant but we are missing out since we are only looking for \"painter\". There are a lot of ways to combat this. One of them is to reduce the query and document words into their stems. For example, both \"painter\" and \"painters\" have \"paint\" as their stem form. This can improve slightly the performance of algorithms.\n", "\n", @@ -447,7 +410,7 @@ }, "outputs": [], "source": [ - "%psource IRSystem" + "psource(IRSystem)" ] }, { @@ -490,7 +453,7 @@ }, "outputs": [], "source": [ - "%psource UnixConsultant" + "psource(UnixConsultant)" ] }, { @@ -504,7 +467,7 @@ }, { "cell_type": "code", - "execution_count": 9, + "execution_count": 4, "metadata": {}, "outputs": [ { @@ -533,7 +496,7 @@ }, { "cell_type": "code", - "execution_count": 10, + "execution_count": 5, "metadata": {}, "outputs": [ { @@ -564,7 +527,7 @@ "source": [ "## INFORMATION EXTRACTION\n", "\n", - "**Information Extraction (IE)** is a method for finding occurences of object classes and relationships in text. Unlike IR systems, an IE system includes (limited) notions of syntax and semantics. While it is difficult to extract object information in a general setting, for more specific domains the system is very useful. One model of an IE system makes use of templates that match with strings in a text.\n", + "**Information Extraction (IE)** is a method for finding occurrences of object classes and relationships in text. Unlike IR systems, an IE system includes (limited) notions of syntax and semantics. While it is difficult to extract object information in a general setting, for more specific domains the system is very useful. One model of an IE system makes use of templates that match with strings in a text.\n", "\n", "A typical example of such a model is reading prices from web pages. 
Prices usually appear after a dollar and consist of numbers, maybe followed by two decimal points. Before the price, usually there will appear a string like \"price:\". Let's build a sample template.\n",
    "\n",
@@ -572,7 +535,7 @@
    "\n",
    "`[$][0-9]+([.][0-9][0-9])?`\n",
    "\n",
-    "Where `+` means 1 or more occurences and `?` means at most 1 occurence. Usually a template consists of a prefix, a target and a postfix regex. In this template, the prefix regex can be \"price:\", the target regex can be the above regex and the postfix regex can be empty.\n",
+    "Where `+` means 1 or more occurrences and `?` means at most 1 occurrence. Usually a template consists of a prefix, a target and a postfix regex. In this template, the prefix regex can be \"price:\", the target regex can be the above regex and the postfix regex can be empty.\n",
    "\n",
    "A template can match with multiple strings. If this is the case, we need a way to resolve the multiple matches. Instead of having just one template, we can use multiple templates (ordered by priority) and pick the match from the highest-priority template. We can also use other ways to pick. For the dollar example, we can pick the match closer to the numerical half of the highest match. For the text \"Price $90, special offer $70, shipping $5\" we would pick \"$70\" since it is closer to the half of the highest match (\"$90\")."
   ]
  },
@@ -628,7 +591,7 @@
  },
  {
   "cell_type": "code",
-   "execution_count": 5,
+   "execution_count": 6,
   "metadata": {},
   "outputs": [
    {
@@ -656,7 +619,7 @@
  },
  {
   "cell_type": "code",
-   "execution_count": 6,
+   "execution_count": 7,
   "metadata": {},
   "outputs": [
    {
@@ -743,25 +706,23 @@
   "metadata": {},
   "source": [
    "### Permutation Decoder\n",
-    "Now let us try to decode messages encrypted by a general monoalphabetic substitution cipher. The letters in the alphabet can be replaced by any permutation of letters. 
For example if the alpahbet consisted of `{A B C}` then it can be replaced by `{A C B}`, `{B A C}`, `{B C A}`, `{C A B}`, `{C B A}` or even `{A B C}` itself. Suppose we choose the permutation `{C B A}`, then the plain text `\"CAB BA AAC\"` would become `\"ACB BC CCA\"`. We can see that Caesar cipher is also a form of permutation cipher where the permutation is a cyclic permutation. Unlike the Caesar cipher, it is infeasible to try all possible permutations. The number of possible permutations in Latin alphabet is `26!` which is of the order $10^{26}$. We use graph search algorithms to search for a 'good' permutation." + "Now let us try to decode messages encrypted by a general mono-alphabetic substitution cipher. The letters in the alphabet can be replaced by any permutation of letters. For example, if the alphabet consisted of `{A B C}` then it can be replaced by `{A C B}`, `{B A C}`, `{B C A}`, `{C A B}`, `{C B A}` or even `{A B C}` itself. Suppose we choose the permutation `{C B A}`, then the plain text `\"CAB BA AAC\"` would become `\"ACB BC CCA\"`. We can see that Caesar cipher is also a form of permutation cipher where the permutation is a cyclic permutation. Unlike the Caesar cipher, it is infeasible to try all possible permutations. The number of possible permutations in Latin alphabet is `26!` which is of the order $10^{26}$. We use graph search algorithms to search for a 'good' permutation." ] }, { "cell_type": "code", - "execution_count": 10, - "metadata": { - "collapsed": true - }, + "execution_count": null, + "metadata": {}, "outputs": [], "source": [ - "%psource PermutationDecoder" + "psource(PermutationDecoder)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Each state/node in the graph is represented as a letter-to-letter map. If there no mapping for a letter it means the letter is unchanged in the permutation. These maps are stored as dictionaries. Each dictionary is a 'potential' permutation. 
We use the word 'potential' because every dictionary doesn't necessarily represent a valid permutation since a permutation cannot have repeating elements. For example the dictionary `{'A': 'B', 'C': 'X'}` is invalid because `'A'` is replaced by `'B'`, but so is `'B'` because the dictionary doesn't have a mapping for `'B'`. Two dictionaries can also represent the same permutation e.g. `{'A': 'C', 'C': 'A'}` and `{'A': 'C', 'B': 'B', 'C': 'A'}` represent the same permutation where `'A'` and `'C'` are interchanged and all other letters remain unaltered. To ensure we get a valid permutation a goal state must map all letters in the alphabet. We also prevent repetions in the permutation by allowing only those actions which go to new state/node in which the newly added letter to the dictionary maps to previously unmapped letter. These two rules togeter ensure that the dictionary of a goal state will represent a valid permutation.\n", + "Each state/node in the graph is represented as a letter-to-letter map. If there is no mapping for a letter, it means the letter is unchanged in the permutation. These maps are stored as dictionaries. Each dictionary is a 'potential' permutation. We use the word 'potential' because every dictionary doesn't necessarily represent a valid permutation since a permutation cannot have repeating elements. For example the dictionary `{'A': 'B', 'C': 'X'}` is invalid because `'A'` is replaced by `'B'`, but so is `'B'` because the dictionary doesn't have a mapping for `'B'`. Two dictionaries can also represent the same permutation e.g. `{'A': 'C', 'C': 'A'}` and `{'A': 'C', 'B': 'B', 'C': 'A'}` represent the same permutation where `'A'` and `'C'` are interchanged and all other letters remain unaltered. To ensure that we get a valid permutation, a goal state must map all letters in the alphabet. 
We also prevent repetitions in the permutation by allowing only those actions which go to a new state/node in which the newly added letter to the dictionary maps to a previously unmapped letter. These two rules together ensure that the dictionary of a goal state will represent a valid permutation.\n",
    "The score of a state is determined using word scores, unigram scores, and bigram scores. Experiment with different weightages for word, unigram and bigram scores and see how they affect the decoding."
   ]
  },
@@ -791,7 +752,7 @@
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-    "As evident from the above example, permutation decoding using best first search is sensitive to initial text. This is because not only the final dictionary, with substitutions for all letters, must have good score but so must the intermediate dictionaries. You could think of it as performing a local search by finding substitutons for each letter one by one. We could get very different results by changing even a single letter because that letter could be a deciding factor for selecting substitution in early stages which snowballs and affects the later stages. To make the search better we can use different definition of score in different stages and optimize on which letter to substitute first."
+    "As evident from the above example, permutation decoding using best-first search is sensitive to the initial text. This is because not only must the final dictionary, with substitutions for all letters, have a good score, but so must the intermediate dictionaries. You could think of it as performing a local search by finding substitutions for each letter one by one. We could get very different results by changing even a single letter, because that letter could be a deciding factor for selecting substitutions in the early stages, which snowballs and affects the later stages. To make the search better we can use different definitions of score in different stages and optimize which letter to substitute first."
] } ], @@ -811,7 +772,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.5.2+" + "version": "3.5.3" } }, "nbformat": 4, diff --git a/text.py b/text.py index c62c1627a..11a5731f1 100644 --- a/text.py +++ b/text.py @@ -1,50 +1,57 @@ -"""Statistical Language Processing tools. (Chapter 22) +""" +Statistical Language Processing tools (Chapter 22) + We define Unigram and Ngram text models, use them to generate random text, -and show the Viterbi algorithm for segmentatioon of letters into words. +and show the Viterbi algorithm for segmentation of letters into words. Then we show a very simple Information Retrieval system, and an example -working on a tiny sample of Unix manual pages.""" - -from utils import argmin, argmax, hashabledict -from learning import CountingProbDist -import search +working on a tiny sample of Unix manual pages. +""" -from math import log, exp -from collections import defaultdict import heapq -import re import os +import re +from collections import defaultdict +import numpy as np -class UnigramWordModel(CountingProbDist): +import search +from probabilistic_learning import CountingProbDist +from utils import hashabledict + +class UnigramWordModel(CountingProbDist): """This is a discrete probability distribution over words, so you can add, sample, or get P[word], just like with CountingProbDist. You can also generate a random text, n words long, with P.samples(n).""" + def __init__(self, observations, default=0): + # Call CountingProbDist constructor, + # passing the observations and default parameters. + super(UnigramWordModel, self).__init__(observations, default) + def samples(self, n): """Return a string of n words, random according to the model.""" return ' '.join(self.sample() for i in range(n)) class NgramWordModel(CountingProbDist): - """This is a discrete probability distribution over n-tuples of words. You can add, sample or get P[(word1, ..., wordn)]. 
The method P.samples(n) builds up an n-word sequence; P.add_cond_prob and P.add_sequence add data.""" - def __init__(self, n, observation_sequence=[], default=0): + def __init__(self, n, observation_sequence=None, default=0): # In addition to the dictionary of n-tuples, cond_prob is a # mapping from (w1, ..., wn-1) to P(wn | w1, ... wn-1) CountingProbDist.__init__(self, default=default) self.n = n self.cond_prob = defaultdict() - self.add_sequence(observation_sequence) + self.add_sequence(observation_sequence or []) # __getitem__, top, sample inherited from CountingProbDist # Note that they deal with tuples, not strings, as inputs def add_cond_prob(self, ngram): - """Builds the conditional probabilities P(wn | (w1, ..., wn-1)""" + """Build the conditional probabilities P(wn | (w1, ..., wn-1)""" if ngram[:-1] not in self.cond_prob: self.cond_prob[ngram[:-1]] = CountingProbDist() self.cond_prob[ngram[:-1]].add(ngram[-1]) @@ -68,7 +75,7 @@ def samples(self, nwords): output = list(self.sample()) for i in range(n, nwords): - last = output[-n+1:] + last = output[-n + 1:] next_word = self.cond_prob[tuple(last)].sample() output.append(next_word) @@ -83,14 +90,17 @@ def add_sequence(self, words): class UnigramCharModel(NgramCharModel): - def __init__(self, observation_sequence=[], default=0): + def __init__(self, observation_sequence=None, default=0): CountingProbDist.__init__(self, default=default) self.n = 1 self.cond_prob = defaultdict() - self.add_sequence(observation_sequence) + self.add_sequence(observation_sequence or []) def add_sequence(self, words): - [self.add(char) for word in words for char in list(word)] + for word in words: + for char in word: + self.add(char) + # ______________________________________________________________________________ @@ -104,7 +114,7 @@ def viterbi_segment(text, P): words = [''] + list(text) best = [1.0] + [0.0] * n # Fill in the vectors best words via dynamic programming - for i in range(n+1): + for i in range(n + 1): for j in 
range(0, i): w = text[j:i] curr_score = P[w] * best[i - len(w)] @@ -126,7 +136,6 @@ def viterbi_segment(text, P): # TODO(tmrts): Expose raw index class IRSystem: - """A very simple Information Retrieval System, as discussed in Sect. 23.2. The constructor s = IRSystem('the a') builds an empty system with two stopwords. Next, index several documents with s.index_document(text, url). @@ -147,8 +156,7 @@ def index_collection(self, filenames): """Index a whole collection of files.""" prefix = os.path.dirname(__file__) for filename in filenames: - self.index_document(open(filename).read(), - os.path.relpath(filename, prefix)) + self.index_document(open(filename).read(), os.path.relpath(filename, prefix)) def index_document(self, text, url): """Index the text of a document.""" @@ -170,15 +178,14 @@ def query(self, query_text, n=10): return [] qwords = [w for w in words(query_text) if w not in self.stopwords] - shortest = argmin(qwords, key=lambda w: len(self.index[w])) + shortest = min(qwords, key=lambda w: len(self.index[w])) docids = self.index[shortest] return heapq.nlargest(n, ((self.total_score(qwords, docid), docid) for docid in docids)) def score(self, word, docid): """Compute a score for this word on the document with this docid.""" # There are many options; here we take a very simple approach - return (log(1 + self.index[word][docid]) / - log(1 + self.documents[docid].nwords)) + return np.log(1 + self.index[word][docid]) / np.log(1 + self.documents[docid].nwords) def total_score(self, words, docid): """Compute the sum of the scores of these words on the document with this docid.""" @@ -188,9 +195,7 @@ def present(self, results): """Present the results as a list.""" for (score, docid) in results: doc = self.documents[docid] - print( - ("{:5.2}|{:25} | {}".format(100 * score, doc.url, - doc.title[:45].expandtabs()))) + print("{:5.2}|{:25} | {}".format(100 * score, doc.url, doc.title[:45].expandtabs())) def present_results(self, query_text, n=10): """Get results for 
the query and present them.""" @@ -198,23 +203,20 @@ def present_results(self, query_text, n=10): class UnixConsultant(IRSystem): - """A trivial IR system over a small collection of Unix man pages.""" def __init__(self): IRSystem.__init__(self, stopwords="how do i the a of") - + import os aima_root = os.path.dirname(__file__) mandir = os.path.join(aima_root, 'aima-data/MAN/') - man_files = [mandir + f for f in os.listdir(mandir) - if f.endswith('.txt')] + man_files = [mandir + f for f in os.listdir(mandir) if f.endswith('.txt')] self.index_collection(man_files) class Document: - """Metadata for a document: title and url; maybe add others later.""" def __init__(self, title, url, nwords): @@ -249,6 +251,7 @@ def canonicalize(text): alphabet = 'abcdefghijklmnopqrstuvwxyz' + # Encoding @@ -303,11 +306,11 @@ def bigrams(text): """ return [text[i:i + 2] for i in range(len(text) - 1)] + # Decoding a Shift (or Caesar) Cipher class ShiftDecoder: - """There are only 26 possible encodings, so we can try all of them, and return the one with the highest probability, according to a bigram probability distribution.""" @@ -328,7 +331,7 @@ def score(self, plaintext): def decode(self, ciphertext): """Return the shift decoding of text with the best score.""" - return argmax(all_shifts(ciphertext), key=lambda shift: self.score(shift)) + return max(all_shifts(ciphertext), key=lambda shift: self.score(shift)) def all_shifts(text): @@ -336,11 +339,11 @@ def all_shifts(text): yield from (shift_encode(text, i) for i, _ in enumerate(alphabet)) + # Decoding a General Permutation Cipher class PermutationDecoder: - """This is a much harder problem than the shift decoder. There are 26! permutations, so we can't try them all. Instead we have to search. 
We want to search well, but there are many things to consider: @@ -363,9 +366,9 @@ def decode(self, ciphertext): """Search for a decoding of the ciphertext.""" self.ciphertext = canonicalize(ciphertext) # reduce domain to speed up search - self.chardomain = {c for c in self.ciphertext if c is not ' '} + self.chardomain = {c for c in self.ciphertext if c != ' '} problem = PermutationDecoderProblem(decoder=self) - solution = search.best_first_graph_search( + solution = search.best_first_graph_search( problem, lambda node: self.score(node.state)) solution.state[' '] = ' ' @@ -383,25 +386,25 @@ def score(self, code): # add small positive value to prevent computing log(0) # TODO: Modify the values to make score more accurate - logP = (sum([log(self.Pwords[word] + 1e-20) for word in words(text)]) + - sum([log(self.P1[c] + 1e-5) for c in text]) + - sum([log(self.P2[b] + 1e-10) for b in bigrams(text)])) - return -exp(logP) + logP = (sum(np.log(self.Pwords[word] + 1e-20) for word in words(text)) + + sum(np.log(self.P1[c] + 1e-5) for c in text) + + sum(np.log(self.P2[b] + 1e-10) for b in bigrams(text))) + return -np.exp(logP) class PermutationDecoderProblem(search.Problem): def __init__(self, initial=None, goal=None, decoder=None): - self.initial = initial or hashabledict() + super().__init__(initial or hashabledict(), goal) self.decoder = decoder def actions(self, state): search_list = [c for c in self.decoder.chardomain if c not in state] target_list = [c for c in alphabet if c not in state.values()] - # Find the best charater to replace - plainchar = argmax(search_list, key=lambda c: self.decoder.P1[c]) - for cipherchar in target_list: - yield (plainchar, cipherchar) + # Find the best character to replace + plain_char = max(search_list, key=lambda c: self.decoder.P1[c]) + for cipher_char in target_list: + yield (plain_char, cipher_char) def result(self, state, action): new_state = hashabledict(state) # copy to prevent hash issues diff --git a/utils.py b/utils.py index 
d2720abe1..3158e3793 100644 --- a/utils.py +++ b/utils.py @@ -3,12 +3,15 @@ import bisect import collections import collections.abc +import functools +import heapq import operator import os.path import random -import math -import functools from itertools import chain, combinations +from statistics import mean + +import numpy as np # ______________________________________________________________________________ @@ -16,27 +19,45 @@ def sequence(iterable): - """Coerce iterable to sequence, if it is not already one.""" - return (iterable if isinstance(iterable, collections.abc.Sequence) - else tuple(iterable)) + """Converts iterable to sequence, if it is not already one.""" + return iterable if isinstance(iterable, collections.abc.Sequence) else tuple([iterable]) -def removeall(item, seq): - """Return a copy of seq (or string) with all occurences of item removed.""" +def remove_all(item, seq): + """Return a copy of seq (or string) with all occurrences of item removed.""" if isinstance(seq, str): return seq.replace(item, '') + elif isinstance(seq, set): + rest = seq.copy() + rest.remove(item) + return rest else: return [x for x in seq if x != item] -def unique(seq): # TODO: replace with set +def unique(seq): """Remove duplicate elements from seq. 
Assumes hashable elements.""" return list(set(seq)) def count(seq): """Count the number of items in sequence that are interpreted as true.""" - return sum(bool(x) for x in seq) + return sum(map(bool, seq)) + + +def multimap(items): + """Given (key, val) pairs, return {key: [val, ....], ...}.""" + result = collections.defaultdict(list) + for (key, val) in items: + result[key].append(val) + return dict(result) + + +def multimap_items(mmap): + """Yield all (key, val) pairs stored in the multimap.""" + for (key, vals) in mmap.items(): + for val in vals: + yield key, val def product(numbers): @@ -48,13 +69,8 @@ def product(numbers): def first(iterable, default=None): - """Return the first element of an iterable or the next element of a generator; or default.""" - try: - return iterable[0] - except IndexError: - return default - except TypeError: - return next(iterable, default) + """Return the first element of an iterable; or default.""" + return next(iter(iterable), default) def is_in(elt, seq): @@ -68,30 +84,35 @@ def mode(data): return item -def powerset(iterable): - """powerset([1,2,3]) --> (1,) (2,) (3,) (1,2) (1,3) (2,3) (1,2,3)""" +def power_set(iterable): + """power_set([1,2,3]) --> (1,) (2,) (3,) (1,2) (1,3) (2,3) (1,2,3)""" s = list(iterable) - return list(chain.from_iterable(combinations(s, r) for r in range(len(s)+1)))[1:] + return list(chain.from_iterable(combinations(s, r) for r in range(len(s) + 1)))[1:] + + +def extend(s, var, val): + """Copy dict s and extend it by setting var to val; return copy.""" + return {**s, var: val} + + +def flatten(seqs): + return sum(seqs, []) # ______________________________________________________________________________ # argmin and argmax - identity = lambda x: x -argmin = min -argmax = max - def argmin_random_tie(seq, key=identity): """Return a minimum element of seq; break ties at random.""" - return argmin(shuffled(seq), key=key) + return min(shuffled(seq), key=key) def argmax_random_tie(seq, key=identity): """Return 
an element with highest fn(seq[i]) score; break ties at random.""" - return argmax(shuffled(seq), key=key) + return max(shuffled(seq), key=key) def shuffled(iterable): @@ -117,85 +138,40 @@ def histogram(values, mode=0, bin_function=None): bins[val] = bins.get(val, 0) + 1 if mode: - return sorted(list(bins.items()), key=lambda x: (x[1], x[0]), - reverse=True) + return sorted(list(bins.items()), key=lambda x: (x[1], x[0]), reverse=True) else: return sorted(bins.items()) -def dotproduct(X, Y): - """Return the sum of the element-wise product of vectors X and Y.""" - return sum(x * y for x, y in zip(X, Y)) - +def dot_product(x, y): + """Return the sum of the element-wise product of vectors x and y.""" + return sum(_x * _y for _x, _y in zip(x, y)) -def element_wise_product(X, Y): - """Return vector as an element-wise product of vectors X and Y""" - assert len(X) == len(Y) - return [x * y for x, y in zip(X, Y)] +def element_wise_product(x, y): + """Return vector as an element-wise product of vectors x and y.""" + assert len(x) == len(y) + return np.multiply(x, y) -def matrix_multiplication(X_M, *Y_M): - """Return a matrix as a matrix-multiplication of X_M and arbitary number of matrices *Y_M""" - def _mat_mult(X_M, Y_M): - """Return a matrix as a matrix-multiplication of two matrices X_M and Y_M - >>> matrix_multiplication([[1, 2, 3], - [2, 3, 4]], - [[3, 4], - [1, 2], - [1, 0]]) - [[8, 8],[13, 14]] - """ - assert len(X_M[0]) == len(Y_M) - - result = [[0 for i in range(len(Y_M[0]))] for j in range(len(X_M))] - for i in range(len(X_M)): - for j in range(len(Y_M[0])): - for k in range(len(Y_M)): - result[i][j] += X_M[i][k] * Y_M[k][j] - return result +def matrix_multiplication(x, *y): + """Return a matrix as a matrix-multiplication of x and arbitrary number of matrices *y.""" - result = X_M - for Y in Y_M: - result = _mat_mult(result, Y) + result = x + for _y in y: + result = np.matmul(result, _y) return result -def vector_to_diagonal(v): - """Converts a vector to a 
diagonal matrix with vector elements - as the diagonal elements of the matrix""" - diag_matrix = [[0 for i in range(len(v))] for j in range(len(v))] - for i in range(len(v)): - diag_matrix[i][i] = v[i] - - return diag_matrix - - def vector_add(a, b): """Component-wise addition of two vectors.""" return tuple(map(operator.add, a, b)) -def scalar_vector_product(X, Y): +def scalar_vector_product(x, y): """Return vector as a product of a scalar and a vector""" - return [X * y for y in Y] - - -def scalar_matrix_product(X, Y): - """Return matrix as a product of a scalar and a matrix""" - return [scalar_vector_product(X, y) for y in Y] - - -def inverse_matrix(X): - """Inverse a given square matrix of size 2x2""" - assert len(X) == 2 - assert len(X[0]) == 2 - det = X[0][0] * X[1][1] - X[0][1] * X[1][0] - assert det != 0 - inv_mat = scalar_matrix_product(1.0/det, [[X[1][1], -X[0][1]], [-X[1][0], X[0][0]]]) - - return inv_mat + return np.multiply(x, y) def probability(p): @@ -208,7 +184,6 @@ def weighted_sample_with_replacement(n, seq, weights): probability of each element in proportion to its corresponding weight.""" sample = weighted_sampler(seq, weights) - return [sample() for _ in range(n)] @@ -217,22 +192,33 @@ def weighted_sampler(seq, weights): totals = [] for w in weights: totals.append(w + totals[-1] if totals else w) - return lambda: seq[bisect.bisect(totals, random.uniform(0, totals[-1]))] +def weighted_choice(choices): + """A weighted version of random.choice""" + # NOTE: should be replaced by random.choices if we port to Python 3.6 + + total = sum(w for _, w in choices) + r = random.uniform(0, total) + upto = 0 + for c, w in choices: + if upto + w >= r: + return c, w + upto += w + + def rounder(numbers, d=4): """Round a single number, or sequence of numbers, to d decimal places.""" if isinstance(numbers, (int, float)): return round(numbers, d) else: - constructor = type(numbers) # Can be list, set, tuple, etc. 
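For reference, the `weighted_sampler` kept by this diff builds a cumulative-totals table once and then bisects into it on every draw. Here is a minimal standalone sketch of that behavior; the sequence, weights, and sample count are invented for illustration:

```python
import bisect
import random

def weighted_sampler(seq, weights):
    """Return a zero-argument sampler: seq[i] is drawn with probability
    proportional to weights[i], via a cumulative-totals table and bisect."""
    totals = []
    for w in weights:
        totals.append(w + totals[-1] if totals else w)
    return lambda: seq[bisect.bisect(totals, random.uniform(0, totals[-1]))]

random.seed(0)  # make the demo deterministic
sample = weighted_sampler(['a', 'b', 'c'], [1, 2, 7])
counts = {letter: 0 for letter in 'abc'}
for _ in range(10000):
    counts[sample()] += 1
# 'c' carries 7/10 of the total weight, so it should dominate the tallies
print(counts)
```

With weights 1:2:7 the draws land near 10%/20%/70% of the time, which is why building the totals table up front pays off when many samples are needed.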
+ constructor = type(numbers) # Can be list, set, tuple, etc. return constructor(rounder(n, d) for n in numbers) -def num_or_str(x): - """The argument is a string; convert to a number if - possible, or strip it.""" +def num_or_str(x): # TODO: rename as `atom` + """The argument is a string; convert to a number if possible, or strip it.""" try: return int(x) except ValueError: @@ -242,35 +228,97 @@ def num_or_str(x): return str(x).strip() +def euclidean_distance(x, y): + return np.sqrt(sum((_x - _y) ** 2 for _x, _y in zip(x, y))) + + +def manhattan_distance(x, y): + return sum(abs(_x - _y) for _x, _y in zip(x, y)) + + +def hamming_distance(x, y): + return sum(_x != _y for _x, _y in zip(x, y)) + + +def cross_entropy_loss(x, y): + return (-1.0 / len(x)) * sum(_x * np.log(_y) + (1 - _x) * np.log(1 - _y) for _x, _y in zip(x, y)) + + +def mean_squared_error_loss(x, y): + return (1.0 / len(x)) * sum((_x - _y) ** 2 for _x, _y in zip(x, y)) + + +def rms_error(x, y): + return np.sqrt(ms_error(x, y)) + + +def ms_error(x, y): + return mean((_x - _y) ** 2 for _x, _y in zip(x, y)) + + +def mean_error(x, y): + return mean(abs(_x - _y) for _x, _y in zip(x, y)) + + +def mean_boolean_error(x, y): + return mean(_x != _y for _x, _y in zip(x, y)) + + def normalize(dist): """Multiply each number by a constant such that the sum is 1.0""" if isinstance(dist, dict): total = sum(dist.values()) for key in dist: dist[key] = dist[key] / total - assert 0 <= dist[key] <= 1, "Probabilities must be between 0 and 1." 
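The `normalize` helper being revised in this hunk scales a distribution so it sums to 1.0, handling both dicts and plain sequences. A self-contained copy with made-up example inputs:

```python
def normalize(dist):
    """Scale the numbers in dist so that they sum to 1.0.
    Dicts are updated in place; sequences come back as new lists."""
    if isinstance(dist, dict):
        total = sum(dist.values())
        for key in dist:
            dist[key] = dist[key] / total
            assert 0 <= dist[key] <= 1  # probabilities must be between 0 and 1
        return dist
    total = sum(dist)
    return [(n / total) for n in dist]

print(normalize([1, 2, 5]))         # [0.125, 0.25, 0.625]
print(normalize({'a': 3, 'b': 1}))  # {'a': 0.75, 'b': 0.25}
```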
+ assert 0 <= dist[key] <= 1 # probabilities must be between 0 and 1 return dist total = sum(dist) return [(n / total) for n in dist] -def norm(X, n=2): - """Return the n-norm of vector X""" - return sum([x**n for x in X])**(1/n) +def random_weights(min_value, max_value, num_weights): + return [random.uniform(min_value, max_value) for _ in range(num_weights)] -def clip(x, lowest, highest): - """Return x clipped to the range [lowest..highest].""" - return max(lowest, min(x, highest)) +def sigmoid(x): + """Return activation value of x with sigmoid function.""" + return 1 / (1 + np.exp(-x)) def sigmoid_derivative(value): return value * (1 - value) -def sigmoid(x): - """Return activation value of x with sigmoid function""" - return 1/(1 + math.exp(-x)) +def elu(x, alpha=0.01): + return x if x > 0 else alpha * (np.exp(x) - 1) + + +def elu_derivative(value, alpha=0.01): + return 1 if value > 0 else alpha * np.exp(value) + + +def tanh(x): + return np.tanh(x) + + +def tanh_derivative(value): + return 1 - (value ** 2) + + +def leaky_relu(x, alpha=0.01): + return x if x > 0 else alpha * x + + +def leaky_relu_derivative(value, alpha=0.01): + return 1 if value > 0 else alpha + + +def relu(x): + return max(0, x) + + +def relu_derivative(value): + return 1 if value > 0 else 0 def step(x): @@ -280,28 +328,29 @@ def step(x): def gaussian(mean, st_dev, x): """Given the mean and standard deviation of a distribution, it returns the probability of x.""" - return 1/(math.sqrt(2*math.pi)*st_dev)*math.e**(-0.5*(float(x-mean)/st_dev)**2) + return 1 / (np.sqrt(2 * np.pi) * st_dev) * np.e ** (-0.5 * (float(x - mean) / st_dev) ** 2) -try: # math.isclose was added in Python 3.5; but we might be in 3.4 - from math import isclose -except ImportError: - def isclose(a, b, rel_tol=1e-09, abs_tol=0.0): - """Return true if numbers a and b are close to each other.""" - return abs(a - b) <= max(rel_tol * max(abs(a), abs(b)), abs_tol) +def linear_kernel(x, y=None): + if y is None: + y = x + return 
np.dot(x, y.T) -def weighted_choice(choices): - """A weighted version of random.choice""" - # NOTE: Shoule be replaced by random.choices if we port to Python 3.6 +def polynomial_kernel(x, y=None, degree=2.0): + if y is None: + y = x + return (1.0 + np.dot(x, y.T)) ** degree - total = sum(w for _, w in choices) - r = random.uniform(0, total) - upto = 0 - for c, w in choices: - if upto + w >= r: - return c, w - upto += w + +def rbf_kernel(x, y=None, gamma=None): + """Radial-basis function kernel (aka squared-exponential kernel).""" + if y is None: + y = x + if gamma is None: + gamma = 1.0 / x.shape[1] # 1.0 / n_features + return np.exp(-gamma * (-2.0 * np.dot(x, y.T) + + np.sum(x * x, axis=1).reshape((-1, 1)) + np.sum(y * y, axis=1).reshape((1, -1)))) # ______________________________________________________________________________ @@ -328,26 +377,33 @@ def distance(a, b): """The distance between two (x, y) points.""" xA, yA = a xB, yB = b - return math.hypot((xA - xB), (yA - yB)) + return np.hypot((xA - xB), (yA - yB)) def distance_squared(a, b): """The square of the distance between two (x, y) points.""" xA, yA = a xB, yB = b - return (xA - xB)**2 + (yA - yB)**2 - - -def vector_clip(vector, lowest, highest): - """Return vector, except if any element is less than the corresponding - value of lowest or more than the corresponding value of highest, clip to - those values.""" - return type(vector)(map(clip, vector, lowest, highest)) + return (xA - xB) ** 2 + (yA - yB) ** 2 # ______________________________________________________________________________ # Misc Functions +class injection: + """Dependency injection of temporary values for global functions/classes/etc. 
+ E.g., `with injection(DataBase=MockDataBase): ...`""" + + def __init__(self, **kwds): + self.new = kwds + + def __enter__(self): + self.old = {v: globals()[v] for v in self.new} + globals().update(self.new) + + def __exit__(self, type, value, traceback): + globals().update(self.old) + def memoize(fn, slot=None, maxsize=32): """Memoize fn: make it remember the computed value for any argument list. @@ -400,20 +456,26 @@ def print_table(table, header=None, sep=' ', numfmt='{}'): table = [[numfmt.format(x) if isnumber(x) else x for x in row] for row in table] - sizes = list( - map(lambda seq: max(map(len, seq)), - list(zip(*[map(str, row) for row in table])))) + sizes = list(map(lambda seq: max(map(len, seq)), list(zip(*[map(str, row) for row in table])))) for row in table: - print(sep.join(getattr( - str(x), j)(size) for (j, size, x) in zip(justs, sizes, row))) + print(sep.join(getattr(str(x), j)(size) for (j, size, x) in zip(justs, sizes, row))) def open_data(name, mode='r'): aima_root = os.path.dirname(__file__) aima_file = os.path.join(aima_root, *['aima-data', name]) - return open(aima_file) + return open(aima_file, mode=mode) + + +def failure_test(algorithm, tests): + """Grades the given algorithm based on how many tests it passes. + Most algorithms have arbitrary output on correct execution, which is difficult + to check for correctness. On the other hand, a lot of algorithms output something + particular on fail (for example, False, or None). 
+ tests is a list with each element in the form: (values, failure_output).""" + return mean(int(algorithm(x) != y) for x, y in tests) # ______________________________________________________________________________ @@ -422,7 +484,7 @@ def open_data(name, mode='r'): # See https://docs.python.org/3/reference/expressions.html#operator-precedence # See https://docs.python.org/3/reference/datamodel.html#special-method-names -class Expr(object): +class Expr: """A mathematical expression with an operator and 0 or more arguments. op is a str like '+' or 'sin'; args are Expressions. Expr('x') or Symbol('x') creates a symbol (a nullary Expr). @@ -529,32 +591,35 @@ def __rmatmul__(self, lhs): return Expr('@', lhs, self) def __call__(self, *args): - "Call: if 'f' is a Symbol, then f(0) == Expr('f', 0)." + """Call: if 'f' is a Symbol, then f(0) == Expr('f', 0).""" if self.args: - raise ValueError('can only do a call for a Symbol, not an Expr') + raise ValueError('Can only do a call for a Symbol, not an Expr') else: return Expr(self.op, *args) # Equality and repr def __eq__(self, other): - "'x == y' evaluates to True or False; does not build an Expr." 
-        return (isinstance(other, Expr)
-                and self.op == other.op
-                and self.args == other.args)
+        """'x == y' evaluates to True or False; does not build an Expr."""
+        return isinstance(other, Expr) and self.op == other.op and self.args == other.args
+
+    def __lt__(self, other):
+        return isinstance(other, Expr) and str(self) < str(other)

-    def __hash__(self): return hash(self.op) ^ hash(self.args)
+    def __hash__(self):
+        return hash(self.op) ^ hash(self.args)

     def __repr__(self):
         op = self.op
         args = [str(arg) for arg in self.args]
-        if op.isidentifier(): # f(x) or f(x, y)
+        if op.isidentifier():  # f(x) or f(x, y)
             return '{}({})'.format(op, ', '.join(args)) if args else op
-        elif len(args) == 1: # -x or -(x + 1)
+        elif len(args) == 1:  # -x or -(x + 1)
             return op + args[0]
-        else: # (x - y)
+        else:  # (x - y)
             opp = (' ' + op + ' ')
             return '(' + opp.join(args) + ')'
+

 # An 'Expression' is either an Expr or a Number.
 # Symbol is not an explicit type; it is any Expr with 0 args.
@@ -588,11 +653,13 @@ def arity(expression):
     else:  # expression is a number
         return 0

+
 # For operators that are not defined in Python, we allow new InfixOps:


 class PartialExpr:
     """Given 'P |'==>'| Q, first form PartialExpr('==>', P), then combine with Q."""
+
     def __init__(self, op, lhs):
         self.op, self.lhs = op, lhs

@@ -611,10 +678,7 @@ def expr(x):
     >>> expr('P & Q ==> Q')
     ((P & Q) ==> Q)
     """
-    if isinstance(x, str):
-        return eval(expr_handle_infix_ops(x), defaultkeydict(Symbol))
-    else:
-        return x
+    return eval(expr_handle_infix_ops(x), defaultkeydict(Symbol)) if isinstance(x, str) else x


 infix_ops = '==> <== <=>'.split()
@@ -635,141 +699,82 @@ class defaultkeydict(collections.defaultdict):
     >>> d = defaultkeydict(len); d['four']
     4
     """
+
     def __missing__(self, key):
         self[key] = result = self.default_factory(key)
         return result


 class hashabledict(dict):
-    """Allows hashing by representing a dictionary as tuple of key:value pairs
-    May cause problems as the hash value may change during runtime
-    """
-    def 
__tuplify__(self): - return tuple(sorted(self.items())) + """Allows hashing by representing a dictionary as tuple of key:value pairs. + May cause problems as the hash value may change during runtime.""" def __hash__(self): - return hash(self.__tuplify__()) - - def __lt__(self, odict): - assert isinstance(odict, hashabledict) - return self.__tuplify__() < odict.__tuplify__() - - def __gt__(self, odict): - assert isinstance(odict, hashabledict) - return self.__tuplify__() > odict.__tuplify__() - - def __le__(self, odict): - assert isinstance(odict, hashabledict) - return self.__tuplify__() <= odict.__tuplify__() - - def __ge__(self, odict): - assert isinstance(odict, hashabledict) - return self.__tuplify__() >= odict.__tuplify__() + return 1 # ______________________________________________________________________________ # Queues: Stack, FIFOQueue, PriorityQueue - -# TODO: queue.PriorityQueue -# TODO: Priority queues may not belong here -- see treatment in search.py +# Stack and FIFOQueue are implemented as list and collection.deque +# PriorityQueue is implemented here -class Queue: +class PriorityQueue: + """A Queue in which the minimum (or maximum) element (as determined by f and + order) is returned first. + If order is 'min', the item with minimum f(x) is + returned first; if order is 'max', then it is the item with maximum f(x). + Also supports dict-like lookup.""" - """Queue is an abstract class/interface. There are three types: - Stack(): A Last In First Out Queue. - FIFOQueue(): A First In First Out Queue. - PriorityQueue(order, f): Queue in sorted order (default min-first). - Each type supports the following methods and functions: - q.append(item) -- add an item to the queue - q.extend(items) -- equivalent to: for item in items: q.append(item) - q.pop() -- return the top item from the queue - len(q) -- number of items in q (also q.__len()) - item in q -- does q contain item? 
- Note that isinstance(Stack(), Queue) is false, because we implement stacks - as lists. If Python ever gets interfaces, Queue will be an interface.""" + def __init__(self, order='min', f=lambda x: x): + self.heap = [] + if order == 'min': + self.f = f + elif order == 'max': # now item with max f(x) + self.f = lambda x: -f(x) # will be popped first + else: + raise ValueError("Order must be either 'min' or 'max'.") - def __init__(self): - raise NotImplementedError + def append(self, item): + """Insert item at its correct position.""" + heapq.heappush(self.heap, (self.f(item), item)) def extend(self, items): + """Insert each item in items at its correct position.""" for item in items: self.append(item) - -def Stack(): - """Return an empty list, suitable as a Last-In-First-Out Queue.""" - return [] - - -class FIFOQueue(Queue): - - """A First-In-First-Out Queue.""" - - def __init__(self, maxlen=None, items=[]): - self.queue = collections.deque(items, maxlen) - - def append(self, item): - if not self.queue.maxlen or len(self.queue) < self.queue.maxlen: - self.queue.append(item) - else: - raise Exception('FIFOQueue is full') - - def extend(self, items): - if not self.queue.maxlen or len(self.queue) + len(items) <= self.queue.maxlen: - self.queue.extend(items) - else: - raise Exception('FIFOQueue max length exceeded') - def pop(self): - if len(self.queue) > 0: - return self.queue.popleft() + """Pop and return the item (with min or max f(x) value) + depending on the order.""" + if self.heap: + return heapq.heappop(self.heap)[1] else: - raise Exception('FIFOQueue is empty') + raise Exception('Trying to pop from empty PriorityQueue.') def __len__(self): - return len(self.queue) + """Return current capacity of PriorityQueue.""" + return len(self.heap) - def __contains__(self, item): - return item in self.queue - - -class PriorityQueue(Queue): - - """A queue in which the minimum (or maximum) element (as determined by f and - order) is returned first. 
If order is min, the item with minimum f(x) is - returned first; if order is max, then it is the item with maximum f(x). - Also supports dict-like lookup.""" - - def __init__(self, order=min, f=lambda x: x): - self.A = [] - self.order = order - self.f = f - - def append(self, item): - bisect.insort(self.A, (self.f(item), item)) - - def __len__(self): - return len(self.A) - - def pop(self): - if self.order == min: - return self.A.pop(0)[1] - else: - return self.A.pop()[1] - - def __contains__(self, item): - return any(item == pair[1] for pair in self.A) + def __contains__(self, key): + """Return True if the key is in PriorityQueue.""" + return any([item == key for _, item in self.heap]) def __getitem__(self, key): - for _, item in self.A: + """Returns the first value associated with key in PriorityQueue. + Raises KeyError if key is not present.""" + for value, item in self.heap: if item == key: - return item + return value + raise KeyError(str(key) + " is not in the priority queue") def __delitem__(self, key): - for i, (value, item) in enumerate(self.A): - if item == key: - self.A.pop(i) + """Delete the first occurrence of key.""" + try: + del self.heap[[item == key for _, item in self.heap].index(True)] + except ValueError: + raise KeyError(str(key) + " is not in the priority queue") + heapq.heapify(self.heap) # ______________________________________________________________________________ @@ -777,7 +782,7 @@ def __delitem__(self, key): class Bool(int): - """Just like `bool`, except values display as 'T' and 'F' instead of 'True' and 'False'""" + """Just like `bool`, except values display as 'T' and 'F' instead of 'True' and 'False'.""" __str__ = __repr__ = lambda self: 'T' if self else 'F' diff --git a/utils4e.py b/utils4e.py new file mode 100644 index 000000000..65cb9026f --- /dev/null +++ b/utils4e.py @@ -0,0 +1,807 @@ +"""Provides some utilities widely used by other modules""" + +import bisect +import collections +import collections.abc +import functools 
+import heapq +import os.path +import random +from itertools import chain, combinations +from statistics import mean + +import numpy as np + + +# part1. General data structures and their functions +# ______________________________________________________________________________ +# Queues: Stack, FIFOQueue, PriorityQueue +# Stack and FIFOQueue are implemented as list and collection.deque +# PriorityQueue is implemented here + + +class PriorityQueue: + """A Queue in which the minimum (or maximum) element (as determined by f and order) is returned first. + If order is 'min', the item with minimum f(x) is + returned first; if order is 'max', then it is the item with maximum f(x). + Also supports dict-like lookup.""" + + def __init__(self, order='min', f=lambda x: x): + self.heap = [] + + if order == 'min': + self.f = f + elif order == 'max': # now item with max f(x) + self.f = lambda x: -f(x) # will be popped first + else: + raise ValueError("Order must be either 'min' or 'max'.") + + def append(self, item): + """Insert item at its correct position.""" + heapq.heappush(self.heap, (self.f(item), item)) + + def extend(self, items): + """Insert each item in items at its correct position.""" + for item in items: + self.append(item) + + def pop(self): + """Pop and return the item (with min or max f(x) value) + depending on the order.""" + if self.heap: + return heapq.heappop(self.heap)[1] + else: + raise Exception('Trying to pop from empty PriorityQueue.') + + def __len__(self): + """Return current capacity of PriorityQueue.""" + return len(self.heap) + + def __contains__(self, key): + """Return True if the key is in PriorityQueue.""" + return any([item == key for _, item in self.heap]) + + def __getitem__(self, key): + """Returns the first value associated with key in PriorityQueue. 
+ Raises KeyError if key is not present.""" + for value, item in self.heap: + if item == key: + return value + raise KeyError(str(key) + " is not in the priority queue") + + def __delitem__(self, key): + """Delete the first occurrence of key.""" + try: + del self.heap[[item == key for _, item in self.heap].index(True)] + except ValueError: + raise KeyError(str(key) + " is not in the priority queue") + heapq.heapify(self.heap) + + +# ______________________________________________________________________________ +# Functions on Sequences and Iterables + + +def sequence(iterable): + """Converts iterable to sequence, if it is not already one.""" + return (iterable if isinstance(iterable, collections.abc.Sequence) + else tuple([iterable])) + + +def remove_all(item, seq): + """Return a copy of seq (or string) with all occurrences of item removed.""" + if isinstance(seq, str): + return seq.replace(item, '') + elif isinstance(seq, set): + rest = seq.copy() + rest.remove(item) + return rest + else: + return [x for x in seq if x != item] + + +def unique(seq): + """Remove duplicate elements from seq. Assumes hashable elements.""" + return list(set(seq)) + + +def count(seq): + """Count the number of items in sequence that are interpreted as true.""" + return sum(map(bool, seq)) + + +def multimap(items): + """Given (key, val) pairs, return {key: [val, ....], ...}.""" + result = collections.defaultdict(list) + for (key, val) in items: + result[key].append(val) + return dict(result) + + +def multimap_items(mmap): + """Yield all (key, val) pairs stored in the multimap.""" + for (key, vals) in mmap.items(): + for val in vals: + yield key, val + + +def product(numbers): + """Return the product of the numbers, e.g. 
product([2, 3, 10]) == 60""" + result = 1 + for x in numbers: + result *= x + return result + + +def first(iterable, default=None): + """Return the first element of an iterable; or default.""" + return next(iter(iterable), default) + + +def is_in(elt, seq): + """Similar to (elt in seq), but compares with 'is', not '=='.""" + return any(x is elt for x in seq) + + +def mode(data): + """Return the most common data item. If there are ties, return any one of them.""" + [(item, count)] = collections.Counter(data).most_common(1) + return item + + +def power_set(iterable): + """power_set([1,2,3]) --> (1,) (2,) (3,) (1,2) (1,3) (2,3) (1,2,3)""" + s = list(iterable) + return list(chain.from_iterable(combinations(s, r) for r in range(len(s) + 1)))[1:] + + +def extend(s, var, val): + """Copy dict s and extend it by setting var to val; return copy.""" + return {**s, var: val} + + +def flatten(seqs): + return sum(seqs, []) + + +# ______________________________________________________________________________ +# argmin and argmax + + +identity = lambda x: x + + +def argmin_random_tie(seq, key=identity): + """Return a minimum element of seq; break ties at random.""" + return min(shuffled(seq), key=key) + + +def argmax_random_tie(seq, key=identity): + """Return an element with highest fn(seq[i]) score; break ties at random.""" + return max(shuffled(seq), key=key) + + +def shuffled(iterable): + """Randomly shuffle a copy of iterable.""" + items = list(iterable) + random.shuffle(items) + return items + + +# part2. Mathematical and Statistical util functions +# ______________________________________________________________________________ + + +def histogram(values, mode=0, bin_function=None): + """Return a list of (value, count) pairs, summarizing the input values. + Sorted by increasing value, or if mode=1, by decreasing count. 
+ If bin_function is given, map it over values first.""" + if bin_function: + values = map(bin_function, values) + + bins = {} + for val in values: + bins[val] = bins.get(val, 0) + 1 + + if mode: + return sorted(list(bins.items()), key=lambda x: (x[1], x[0]), reverse=True) + else: + return sorted(bins.items()) + + +def element_wise_product(x, y): + if hasattr(x, '__iter__') and hasattr(y, '__iter__'): + assert len(x) == len(y) + return [element_wise_product(_x, _y) for _x, _y in zip(x, y)] + elif hasattr(x, '__iter__') == hasattr(y, '__iter__'): + return x * y + else: + raise Exception('Inputs must be in the same size!') + + +def vector_add(a, b): + """Component-wise addition of two vectors.""" + if not (a and b): + return a or b + if hasattr(a, '__iter__') and hasattr(b, '__iter__'): + assert len(a) == len(b) + return list(map(vector_add, a, b)) + else: + try: + return a + b + except TypeError: + raise Exception('Inputs must be in the same size!') + + +def scalar_vector_product(x, y): + """Return vector as a product of a scalar and a vector recursively.""" + return [scalar_vector_product(x, _y) for _y in y] if hasattr(y, '__iter__') else x * y + + +def map_vector(f, x): + """Apply function f to iterable x.""" + return [map_vector(f, _x) for _x in x] if hasattr(x, '__iter__') else list(map(f, [x]))[0] + + +def probability(p): + """Return true with probability p.""" + return p > random.uniform(0.0, 1.0) + + +def weighted_sample_with_replacement(n, seq, weights): + """Pick n samples from seq at random, with replacement, with the + probability of each element in proportion to its corresponding + weight.""" + sample = weighted_sampler(seq, weights) + + return [sample() for _ in range(n)] + + +def weighted_sampler(seq, weights): + """Return a random-sample function that picks from seq weighted by weights.""" + totals = [] + for w in weights: + totals.append(w + totals[-1] if totals else w) + + return lambda: seq[bisect.bisect(totals, random.uniform(0, totals[-1]))] + + 
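The cumulative-totals construction in `weighted_sampler` above can be sanity-checked in isolation. The sketch below copies the function body from the diff and draws from a made-up two-element distribution (the sequence `['a', 'b']` and weights `[1, 9]` are illustration values, not from the repository):

```python
import bisect
import random

def weighted_sampler(seq, weights):
    """Return a random-sample function that picks from seq weighted by weights."""
    totals = []
    for w in weights:
        # Running sum: totals becomes the cumulative weight distribution.
        totals.append(w + totals[-1] if totals else w)
    # A uniform draw in [0, total) is located in the cumulative totals by bisect.
    return lambda: seq[bisect.bisect(totals, random.uniform(0, totals[-1]))]

sample = weighted_sampler(['a', 'b'], [1, 9])
draws = [sample() for _ in range(10000)]
# With weights 1:9, 'b' should be drawn roughly nine times as often as 'a'.
print(draws.count('a'), draws.count('b'))
```

Note that `random.choices` (available since Python 3.6) provides the same weighted sampling in the standard library, which is what the NOTE inside `weighted_choice` below alludes to.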
+def weighted_choice(choices):
+    """A weighted version of random.choice"""
+    # NOTE: Should be replaced by random.choices if we port to Python 3.6
+
+    total = sum(w for _, w in choices)
+    r = random.uniform(0, total)
+    upto = 0
+    for c, w in choices:
+        if upto + w >= r:
+            return c, w
+        upto += w
+
+
+def rounder(numbers, d=4):
+    """Round a single number, or sequence of numbers, to d decimal places."""
+    if isinstance(numbers, (int, float)):
+        return round(numbers, d)
+    else:
+        constructor = type(numbers)  # Can be list, set, tuple, etc.
+        return constructor(rounder(n, d) for n in numbers)
+
+
+def num_or_str(x):  # TODO: rename as `atom`
+    """The argument is a string; convert to a number if
+    possible, or strip it."""
+    try:
+        return int(x)
+    except ValueError:
+        try:
+            return float(x)
+        except ValueError:
+            return str(x).strip()
+
+
+def euclidean_distance(x, y):
+    return np.sqrt(sum((_x - _y) ** 2 for _x, _y in zip(x, y)))
+
+
+def manhattan_distance(x, y):
+    return sum(abs(_x - _y) for _x, _y in zip(x, y))
+
+
+def hamming_distance(x, y):
+    return sum(_x != _y for _x, _y in zip(x, y))
+
+
+def rms_error(x, y):
+    return np.sqrt(ms_error(x, y))
+
+
+def ms_error(x, y):
+    return mean((x - y) ** 2 for x, y in zip(x, y))
+
+
+def mean_error(x, y):
+    return mean(abs(x - y) for x, y in zip(x, y))
+
+
+def mean_boolean_error(x, y):
+    return mean(_x != _y for _x, _y in zip(x, y))
+
+
+# part3. Neural network util functions
+# ______________________________________________________________________________
+
+
+def cross_entropy_loss(x, y):
+    """Cross entropy loss function. x and y are 1D iterable objects."""
+    return (-1.0 / len(x)) * sum(_x * np.log(_y) + (1 - _x) * np.log(1 - _y) for _x, _y in zip(x, y))
+
+
+def mean_squared_error_loss(x, y):
+    """Mean squared error loss function. 
x and y are 1D iterable objects.""" + return (1.0 / len(x)) * sum((_x - _y) ** 2 for _x, _y in zip(x, y)) + + +def normalize(dist): + """Multiply each number by a constant such that the sum is 1.0""" + if isinstance(dist, dict): + total = sum(dist.values()) + for key in dist: + dist[key] = dist[key] / total + assert 0 <= dist[key] <= 1 # probabilities must be between 0 and 1 + return dist + total = sum(dist) + return [(n / total) for n in dist] + + +def random_weights(min_value, max_value, num_weights): + return [random.uniform(min_value, max_value) for _ in range(num_weights)] + + +def conv1D(x, k): + """1D convolution. x: input vector; K: kernel vector.""" + return np.convolve(x, k, mode='same') + + +def gaussian_kernel(size=3): + return [gaussian((size - 1) / 2, 0.1, x) for x in range(size)] + + +def gaussian_kernel_1D(size=3, sigma=0.5): + return [gaussian((size - 1) / 2, sigma, x) for x in range(size)] + + +def gaussian_kernel_2D(size=3, sigma=0.5): + x, y = np.mgrid[-size // 2 + 1:size // 2 + 1, -size // 2 + 1:size // 2 + 1] + g = np.exp(-((x ** 2 + y ** 2) / (2.0 * sigma ** 2))) + return g / g.sum() + + +def step(x): + """Return activation value of x with sign function.""" + return 1 if x >= 0 else 0 + + +def gaussian(mean, st_dev, x): + """Given the mean and standard deviation of a distribution, it returns the probability of x.""" + return 1 / (np.sqrt(2 * np.pi) * st_dev) * np.exp(-0.5 * (float(x - mean) / st_dev) ** 2) + + +def linear_kernel(x, y=None): + if y is None: + y = x + return np.dot(x, y.T) + + +def polynomial_kernel(x, y=None, degree=2.0): + if y is None: + y = x + return (1.0 + np.dot(x, y.T)) ** degree + + +def rbf_kernel(x, y=None, gamma=None): + """Radial-basis function kernel (aka squared-exponential kernel).""" + if y is None: + y = x + if gamma is None: + gamma = 1.0 / x.shape[1] # 1.0 / n_features + return np.exp(-gamma * (-2.0 * np.dot(x, y.T) + + np.sum(x * x, axis=1).reshape((-1, 1)) + np.sum(y * y, axis=1).reshape((1, -1)))) + + +# 
part4. Self defined data structures +# ______________________________________________________________________________ +# Grid Functions + + +orientations = EAST, NORTH, WEST, SOUTH = [(1, 0), (0, 1), (-1, 0), (0, -1)] +turns = LEFT, RIGHT = (+1, -1) + + +def turn_heading(heading, inc, headings=orientations): + return headings[(headings.index(heading) + inc) % len(headings)] + + +def turn_right(heading): + return turn_heading(heading, RIGHT) + + +def turn_left(heading): + return turn_heading(heading, LEFT) + + +def distance(a, b): + """The distance between two (x, y) points.""" + xA, yA = a + xB, yB = b + return np.hypot((xA - xB), (yA - yB)) + + +def distance_squared(a, b): + """The square of the distance between two (x, y) points.""" + xA, yA = a + xB, yB = b + return (xA - xB) ** 2 + (yA - yB) ** 2 + + +# ______________________________________________________________________________ +# Misc Functions + + +class injection: + """Dependency injection of temporary values for global functions/classes/etc. + E.g., `with injection(DataBase=MockDataBase): ...`""" + + def __init__(self, **kwds): + self.new = kwds + + def __enter__(self): + self.old = {v: globals()[v] for v in self.new} + globals().update(self.new) + + def __exit__(self, type, value, traceback): + globals().update(self.old) + + +def memoize(fn, slot=None, maxsize=32): + """Memoize fn: make it remember the computed value for any argument list. + If slot is specified, store result in that slot of first argument. 
+ If slot is false, use lru_cache for caching the values.""" + if slot: + def memoized_fn(obj, *args): + if hasattr(obj, slot): + return getattr(obj, slot) + else: + val = fn(obj, *args) + setattr(obj, slot, val) + return val + else: + @functools.lru_cache(maxsize=maxsize) + def memoized_fn(*args): + return fn(*args) + + return memoized_fn + + +def name(obj): + """Try to find some reasonable name for the object.""" + return (getattr(obj, 'name', 0) or getattr(obj, '__name__', 0) or + getattr(getattr(obj, '__class__', 0), '__name__', 0) or + str(obj)) + + +def isnumber(x): + """Is x a number?""" + return hasattr(x, '__int__') + + +def issequence(x): + """Is x a sequence?""" + return isinstance(x, collections.abc.Sequence) + + +def print_table(table, header=None, sep=' ', numfmt='{}'): + """Print a list of lists as a table, so that columns line up nicely. + header, if specified, will be printed as the first row. + numfmt is the format for all numbers; you might want e.g. '{:.2f}'. + (If you want different formats in different columns, + don't use print_table.) sep is the separator between columns.""" + justs = ['rjust' if isnumber(x) else 'ljust' for x in table[0]] + + if header: + table.insert(0, header) + + table = [[numfmt.format(x) if isnumber(x) else x for x in row] + for row in table] + sizes = list( + map(lambda seq: max(map(len, seq)), + list(zip(*[map(str, row) for row in table])))) + + for row in table: + print(sep.join(getattr( + str(x), j)(size) for (j, size, x) in zip(justs, sizes, row))) + + +def open_data(name, mode='r'): + aima_root = os.path.dirname(__file__) + aima_file = os.path.join(aima_root, *['aima-data', name]) + + return open(aima_file, mode=mode) + + +def failure_test(algorithm, tests): + """Grades the given algorithm based on how many tests it passes. + Most algorithms have arbitrary output on correct execution, which is difficult + to check for correctness. 
On the other hand, a lot of algorithms output something + particular on fail (for example, False, or None). + tests is a list with each element in the form: (values, failure_output).""" + return mean(int(algorithm(x) != y) for x, y in tests) + + +# ______________________________________________________________________________ +# Expressions + +# See https://docs.python.org/3/reference/expressions.html#operator-precedence +# See https://docs.python.org/3/reference/datamodel.html#special-method-names + + +class Expr: + """A mathematical expression with an operator and 0 or more arguments. + op is a str like '+' or 'sin'; args are Expressions. + Expr('x') or Symbol('x') creates a symbol (a nullary Expr). + Expr('-', x) creates a unary; Expr('+', x, 1) creates a binary.""" + + def __init__(self, op, *args): + self.op = str(op) + self.args = args + + # Operator overloads + def __neg__(self): + return Expr('-', self) + + def __pos__(self): + return Expr('+', self) + + def __invert__(self): + return Expr('~', self) + + def __add__(self, rhs): + return Expr('+', self, rhs) + + def __sub__(self, rhs): + return Expr('-', self, rhs) + + def __mul__(self, rhs): + return Expr('*', self, rhs) + + def __pow__(self, rhs): + return Expr('**', self, rhs) + + def __mod__(self, rhs): + return Expr('%', self, rhs) + + def __and__(self, rhs): + return Expr('&', self, rhs) + + def __xor__(self, rhs): + return Expr('^', self, rhs) + + def __rshift__(self, rhs): + return Expr('>>', self, rhs) + + def __lshift__(self, rhs): + return Expr('<<', self, rhs) + + def __truediv__(self, rhs): + return Expr('/', self, rhs) + + def __floordiv__(self, rhs): + return Expr('//', self, rhs) + + def __matmul__(self, rhs): + return Expr('@', self, rhs) + + def __or__(self, rhs): + """Allow both P | Q, and P |'==>'| Q.""" + if isinstance(rhs, Expression): + return Expr('|', self, rhs) + else: + return PartialExpr(rhs, self) + + # Reverse operator overloads + def __radd__(self, lhs): + return Expr('+', lhs, 
self) + + def __rsub__(self, lhs): + return Expr('-', lhs, self) + + def __rmul__(self, lhs): + return Expr('*', lhs, self) + + def __rdiv__(self, lhs): + return Expr('/', lhs, self) + + def __rpow__(self, lhs): + return Expr('**', lhs, self) + + def __rmod__(self, lhs): + return Expr('%', lhs, self) + + def __rand__(self, lhs): + return Expr('&', lhs, self) + + def __rxor__(self, lhs): + return Expr('^', lhs, self) + + def __ror__(self, lhs): + return Expr('|', lhs, self) + + def __rrshift__(self, lhs): + return Expr('>>', lhs, self) + + def __rlshift__(self, lhs): + return Expr('<<', lhs, self) + + def __rtruediv__(self, lhs): + return Expr('/', lhs, self) + + def __rfloordiv__(self, lhs): + return Expr('//', lhs, self) + + def __rmatmul__(self, lhs): + return Expr('@', lhs, self) + + def __call__(self, *args): + """Call: if 'f' is a Symbol, then f(0) == Expr('f', 0).""" + if self.args: + raise ValueError('Can only do a call for a Symbol, not an Expr') + else: + return Expr(self.op, *args) + + # Equality and repr + def __eq__(self, other): + """'x == y' evaluates to True or False; does not build an Expr.""" + return isinstance(other, Expr) and self.op == other.op and self.args == other.args + + def __lt__(self, other): + return isinstance(other, Expr) and str(self) < str(other) + + def __hash__(self): + return hash(self.op) ^ hash(self.args) + + def __repr__(self): + op = self.op + args = [str(arg) for arg in self.args] + if op.isidentifier(): # f(x) or f(x, y) + return '{}({})'.format(op, ', '.join(args)) if args else op + elif len(args) == 1: # -x or -(x + 1) + return op + args[0] + else: # (x - y) + opp = (' ' + op + ' ') + return '(' + opp.join(args) + ')' + + +# An 'Expression' is either an Expr or a Number. +# Symbol is not an explicit type; it is any Expr with 0 args. 
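To see how the `op`/`args` representation, the operator overloads, and `__repr__` fit together, here is a trimmed-down sketch of the `Expr` idea (only a handful of the overloads from the full class above; the class name `MiniExpr` and the symbols `x` and `y` are illustration choices, not part of the repository):

```python
class MiniExpr:
    """Cut-down sketch of Expr: an operator string plus a tuple of arguments."""

    def __init__(self, op, *args):
        self.op = str(op)
        self.args = args

    def __add__(self, rhs):
        # x + y builds MiniExpr('+', x, y) instead of evaluating anything.
        return MiniExpr('+', self, rhs)

    def __invert__(self):
        return MiniExpr('~', self)

    def __eq__(self, other):
        return isinstance(other, MiniExpr) and self.op == other.op and self.args == other.args

    def __hash__(self):
        return hash(self.op) ^ hash(self.args)

    def __repr__(self):
        args = [str(arg) for arg in self.args]
        if self.op.isidentifier():  # f(x, y), or a bare symbol with no args
            return '{}({})'.format(self.op, ', '.join(args)) if args else self.op
        elif len(args) == 1:  # unary: -x, ~x
            return self.op + args[0]
        else:  # binary and up: (x + y)
            return '(' + (' ' + self.op + ' ').join(args) + ')'

x, y = MiniExpr('x'), MiniExpr('y')
print(x + y)                # builds and pretty-prints the tree: (x + y)
print(~x)                   # ~x
print(MiniExpr('f', x, y))  # f(x, y)
```

The full class simply repeats this pattern for every arithmetic and logical operator, plus the reverse (`__radd__`, ...) variants shown above.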
+ + +Number = (int, float, complex) +Expression = (Expr, Number) + + +def Symbol(name): + """A Symbol is just an Expr with no args.""" + return Expr(name) + + +def symbols(names): + """Return a tuple of Symbols; names is a comma/whitespace delimited str.""" + return tuple(Symbol(name) for name in names.replace(',', ' ').split()) + + +def subexpressions(x): + """Yield the subexpressions of an Expression (including x itself).""" + yield x + if isinstance(x, Expr): + for arg in x.args: + yield from subexpressions(arg) + + +def arity(expression): + """The number of sub-expressions in this expression.""" + if isinstance(expression, Expr): + return len(expression.args) + else: # expression is a number + return 0 + + +# For operators that are not defined in Python, we allow new InfixOps: + + +class PartialExpr: + """Given 'P |'==>'| Q, first form PartialExpr('==>', P), then combine with Q.""" + + def __init__(self, op, lhs): + self.op, self.lhs = op, lhs + + def __or__(self, rhs): + return Expr(self.op, self.lhs, rhs) + + def __repr__(self): + return "PartialExpr('{}', {})".format(self.op, self.lhs) + + +def expr(x): + """Shortcut to create an Expression. x is a str in which: + - identifiers are automatically defined as Symbols. + - ==> is treated as an infix |'==>'|, as are <== and <=>. + If x is already an Expression, it is returned unchanged. Example: + >>> expr('P & Q ==> Q') + ((P & Q) ==> Q) + """ + if isinstance(x, str): + return eval(expr_handle_infix_ops(x), defaultkeydict(Symbol)) + else: + return x + + +infix_ops = '==> <== <=>'.split() + + +def expr_handle_infix_ops(x): + """Given a str, return a new str with ==> replaced by |'==>'|, etc. + >>> expr_handle_infix_ops('P ==> Q') + "P |'==>'| Q" + """ + for op in infix_ops: + x = x.replace(op, '|' + repr(op) + '|') + return x + + +class defaultkeydict(collections.defaultdict): + """Like defaultdict, but the default_factory is a function of the key. 
+ >>> d = defaultkeydict(len); d['four'] + 4 + """ + + def __missing__(self, key): + self[key] = result = self.default_factory(key) + return result + + +class hashabledict(dict): + """Allows hashing by representing a dictionary as tuple of key:value pairs. + May cause problems as the hash value may change during runtime.""" + + def __hash__(self): + return 1 + + +# ______________________________________________________________________________ +# Monte Carlo tree node and ucb function + + +class MCT_Node: + """Node in the Monte Carlo search tree, keeps track of the children states.""" + + def __init__(self, parent=None, state=None, U=0, N=0): + self.__dict__.update(parent=parent, state=state, U=U, N=N) + self.children = {} + self.actions = None + + +def ucb(n, C=1.4): + return np.inf if n.N == 0 else n.U / n.N + C * np.sqrt(np.log(n.parent.N) / n.N) + + +# ______________________________________________________________________________ +# Useful Shorthands + + +class Bool(int): + """Just like `bool`, except values display as 'T' and 'F' instead of 'True' and 'False'.""" + __str__ = __repr__ = lambda self: 'T' if self else 'F' + + +T = Bool(True) +F = Bool(False) diff --git a/vacuum_world.ipynb b/vacuum_world.ipynb new file mode 100644 index 000000000..6b05254c7 --- /dev/null +++ b/vacuum_world.ipynb @@ -0,0 +1,701 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# THE VACUUM WORLD \n", + "\n", + "In this notebook, we will be discussing **the structure of agents** through an example of the **vacuum agent**. The job of AI is to design an **agent program** that implements the agent function: the mapping from percepts to actions. We assume this program will run on some sort of computing device with physical sensors and actuators: we call this the **architecture**:\n", + "\n", + "

    agent = architecture + program

    " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Before moving on, please review [agents.ipynb](https://github.com/aimacode/aima-python/blob/master/agents.ipynb)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## CONTENTS\n", + "\n", + "* Agent\n", + "* Random Agent Program\n", + "* Table-Driven Agent Program\n", + "* Simple Reflex Agent Program\n", + "* Model-Based Reflex Agent Program\n", + "* Goal-Based Agent Program\n", + "* Utility-Based Agent Program\n", + "* Learning Agent" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## AGENT PROGRAMS\n", + "\n", + "An agent program takes the current percept as input from the sensors and returns an action to the actuators. There is a difference between an agent program and an agent function: an agent program takes the current percept as input whereas an agent function takes the entire percept history.\n", + "\n", + "The agent program takes just the current percept as input because nothing more is available from the environment; if the agent's actions depend on the entire percept sequence, the agent will have to remember the percept.\n", + "\n", + "We'll discuss the following agent programs here with the help of the vacuum world example:\n", + "\n", + "* Random Agent Program\n", + "* Table-Driven Agent Program\n", + "* Simple Reflex Agent Program\n", + "* Model-Based Reflex Agent Program\n", + "* Goal-Based Agent Program\n", + "* Utility-Based Agent Program" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Random Agent Program\n", + "\n", + "A random agent program, as the name suggests, chooses an action at random, without taking into account the percepts. \n", + "Here, we will demonstrate a random vacuum agent for a trivial vacuum environment, that is, the two-state environment." 
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's begin by importing all the functions from the agents module:" + ] + }, + { + "cell_type": "code", + "execution_count": 38, + "metadata": {}, + "outputs": [], + "source": [ + "from agents import *\n", + "from notebook import psource" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let us first see how we define the TrivialVacuumEnvironment. Run the next cell to see how the class TrivialVacuumEnvironment is defined in the agents module:" + ] + }, + { + "cell_type": "code", + "execution_count": 39, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "\n", + "\n", + " Codestin Search App\n", + " \n", + " \n", + "\n", + "\n", + "

    \n", + "\n", + "
    class TrivialVacuumEnvironment(Environment):\n",
    +       "\n",
    +       "    """This environment has two locations, A and B. Each can be Dirty\n",
    +       "    or Clean. The agent perceives its location and the location's\n",
    +       "    status. This serves as an example of how to implement a simple\n",
    +       "    Environment."""\n",
    +       "\n",
    +       "    def __init__(self):\n",
    +       "        super().__init__()\n",
    +       "        self.status = {loc_A: random.choice(['Clean', 'Dirty']),\n",
    +       "                       loc_B: random.choice(['Clean', 'Dirty'])}\n",
    +       "\n",
    +       "    def thing_classes(self):\n",
    +       "        return [Wall, Dirt, ReflexVacuumAgent, RandomVacuumAgent,\n",
    +       "                TableDrivenVacuumAgent, ModelBasedVacuumAgent]\n",
    +       "\n",
    +       "    def percept(self, agent):\n",
    +       "        """Returns the agent's location, and the location status (Dirty/Clean)."""\n",
    +       "        return (agent.location, self.status[agent.location])\n",
    +       "\n",
    +       "    def execute_action(self, agent, action):\n",
    +       "        """Change agent's location and/or location's status; track performance.\n",
    +       "        Score 10 for each dirt cleaned; -1 for each move."""\n",
    +       "        if action == 'Right':\n",
    +       "            agent.location = loc_B\n",
    +       "            agent.performance -= 1\n",
    +       "        elif action == 'Left':\n",
    +       "            agent.location = loc_A\n",
    +       "            agent.performance -= 1\n",
    +       "        elif action == 'Suck':\n",
    +       "            if self.status[agent.location] == 'Dirty':\n",
    +       "                agent.performance += 10\n",
    +       "            self.status[agent.location] = 'Clean'\n",
    +       "\n",
    +       "    def default_location(self, thing):\n",
    +       "        """Agents start in either location at random."""\n",
    +       "        return random.choice([loc_A, loc_B])\n",
    +       "
    \n", + "\n", + "\n" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "psource(TrivialVacuumEnvironment)" + ] + }, + { + "cell_type": "code", + "execution_count": 40, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "State of the Environment: {(0, 0): 'Clean', (1, 0): 'Dirty'}.\n" + ] + } + ], + "source": [ + "# These are the two locations for the two-state environment\n", + "loc_A, loc_B = (0, 0), (1, 0)\n", + "\n", + "# Initialize the two-state environment\n", + "trivial_vacuum_env = TrivialVacuumEnvironment()\n", + "\n", + "# Check the initial state of the environment\n", + "print(\"State of the Environment: {}.\".format(trivial_vacuum_env.status))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's create our agent now. This agent will choose any of the actions from 'Right', 'Left', 'Suck' and 'NoOp' (No Operation) randomly." + ] + }, + { + "cell_type": "code", + "execution_count": 41, + "metadata": {}, + "outputs": [], + "source": [ + "# Create the random agent\n", + "random_agent = Agent(program=RandomAgentProgram(['Right', 'Left', 'Suck', 'NoOp']))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We will now add our agent to the environment." + ] + }, + { + "cell_type": "code", + "execution_count": 42, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "RandomVacuumAgent is located at (1, 0).\n" + ] + } + ], + "source": [ + "# Add agent to the environment\n", + "trivial_vacuum_env.add_thing(random_agent)\n", + "\n", + "print(\"RandomVacuumAgent is located at {}.\".format(random_agent.location))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's run our environment now." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 43, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "State of the Environment: {(0, 0): 'Clean', (1, 0): 'Dirty'}.\n", + "RandomVacuumAgent is located at (1, 0).\n" + ] + } + ], + "source": [ + "# Running the environment\n", + "trivial_vacuum_env.step()\n", + "\n", + "# Check the current state of the environment\n", + "print(\"State of the Environment: {}.\".format(trivial_vacuum_env.status))\n", + "\n", + "print(\"RandomVacuumAgent is located at {}.\".format(random_agent.location))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## TABLE-DRIVEN AGENT PROGRAM\n", + "\n", + "A table-driven agent program keeps track of the percept sequence and then uses it to index into a table of actions to decide what to do. The table explicitly represents the agent function that the agent program embodies. \n", + "In the two-state vacuum world, the table would consist of entries for all the possible percept sequences of the agent." + ] + }, + { + "cell_type": "code", + "execution_count": 44, + "metadata": {}, + "outputs": [], + "source": [ + "table = {((loc_A, 'Clean'),): 'Right',\n", + " ((loc_A, 'Dirty'),): 'Suck',\n", + " ((loc_B, 'Clean'),): 'Left',\n", + " ((loc_B, 'Dirty'),): 'Suck',\n", + " ((loc_A, 'Dirty'), (loc_A, 'Clean')): 'Right',\n", + " ((loc_A, 'Clean'), (loc_B, 'Dirty')): 'Suck',\n", + " ((loc_B, 'Clean'), (loc_A, 'Dirty')): 'Suck',\n", + " ((loc_B, 'Dirty'), (loc_B, 'Clean')): 'Left',\n", + " ((loc_A, 'Dirty'), (loc_A, 'Clean'), (loc_B, 'Dirty')): 'Suck',\n", + " ((loc_B, 'Dirty'), (loc_B, 'Clean'), (loc_A, 'Dirty')): 'Suck'\n", + " }" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We will now create a table-driven agent program for our two-state environment."
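Stripped of the notebook plumbing, the lookup described above fits in a few lines. The `table_driven_program` helper below is a hypothetical stand-in for aima-python's `TableDrivenAgentProgram`, shown only to make the percept-history indexing explicit:

```python
# A minimal sketch of a table-driven agent program: the growing percept
# history is used as a key into a fixed table of actions.
# `table_driven_program` is a hypothetical helper, not the aima-python class.
loc_A, loc_B = (0, 0), (1, 0)


def table_driven_program(table):
    percepts = []  # the program remembers the entire percept sequence

    def program(percept):
        percepts.append(percept)
        return table.get(tuple(percepts))  # None if the history has no entry

    return program


table = {((loc_A, 'Dirty'),): 'Suck',
         ((loc_A, 'Dirty'), (loc_A, 'Clean')): 'Right'}
program = table_driven_program(table)
first = program((loc_A, 'Dirty'))   # first percept alone indexes the table
second = program((loc_A, 'Clean'))  # now the two-percept history is the key
print(first, second)
```

Because the key is the whole history, the table grows exponentially with the length of the percept sequence, which is why this scheme is only practical for tiny worlds like this one.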
+ ] + }, + { + "cell_type": "code", + "execution_count": 45, + "metadata": {}, + "outputs": [], + "source": [ + "# Create a table-driven agent\n", + "table_driven_agent = Agent(program=TableDrivenAgentProgram(table=table))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Since we are using the same environment, let's remove the previously added random agent from the environment to avoid confusion." + ] + }, + { + "cell_type": "code", + "execution_count": 46, + "metadata": {}, + "outputs": [], + "source": [ + "trivial_vacuum_env.delete_thing(random_agent)" + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "TableDrivenVacuumAgent is located at (0, 0).\n" + ] + } + ], + "source": [ + "# Add the table-driven agent to the environment\n", + "trivial_vacuum_env.add_thing(table_driven_agent)\n", + "\n", + "print(\"TableDrivenVacuumAgent is located at {}.\".format(table_driven_agent.location))" + ] + }, + { + "cell_type": "code", + "execution_count": 48, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "State of the Environment: {(0, 0): 'Clean', (1, 0): 'Dirty'}.\n", + "TableDrivenVacuumAgent is located at (1, 0).\n" + ] + } + ], + "source": [ + "# Run the environment\n", + "trivial_vacuum_env.step()\n", + "\n", + "# Check the current state of the environment\n", + "print(\"State of the Environment: {}.\".format(trivial_vacuum_env.status))\n", + "\n", + "print(\"TableDrivenVacuumAgent is located at {}.\".format(table_driven_agent.location))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## SIMPLE REFLEX AGENT PROGRAM\n", + "\n", + "A simple reflex agent program selects actions on the basis of the *current* percept, ignoring the rest of the percept history. 
These agents work on a **condition-action rule** (also called **situation-action rule**, **production** or **if-then rule**), which tells the agent the action to trigger when a particular situation is encountered. \n", + "\n", + "The schematic diagram shown in **Figure 2.9** of the book will make this more clear:\n", + "\n", + "\"![simple reflex agent](images/simple_reflex_agent.jpg)\"" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let us now create a simple reflex agent for the environment." + ] + }, + { + "cell_type": "code", + "execution_count": 49, + "metadata": {}, + "outputs": [], + "source": [ + "# Delete the previously added table-driven agent\n", + "trivial_vacuum_env.delete_thing(table_driven_agent)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To create our agent, we need two functions: the INTERPRET-INPUT function, which generates an abstracted description of the current state from the percept, and the RULE-MATCH function, which returns the first rule in the set of rules that matches the given state description." + ] + }, + { + "cell_type": "code", + "execution_count": 50, + "metadata": {}, + "outputs": [], + "source": [ + "\n", + "loc_A = (0, 0)\n", + "loc_B = (1, 0)\n", + "\n", + "\"\"\"We change the simpleReflexAgentProgram so that it doesn't make use of the Rule class\"\"\"\n", + "def SimpleReflexAgentProgram():\n", + " \"\"\"This agent takes action based solely on the percept. 
[Figure 2.10]\"\"\"\n", + " \n", + " def program(percept):\n", + " loc, status = percept\n", + " return ('Suck' if status == 'Dirty' \n", + " else 'Right' if loc == loc_A \n", + " else 'Left')\n", + " return program\n", + "\n", + " \n", + "# Create a simple reflex agent for the two-state environment\n", + "program = SimpleReflexAgentProgram()\n", + "simple_reflex_agent = Agent(program)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now add the agent to the environment:" + ] + }, + { + "cell_type": "code", + "execution_count": 51, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "SimpleReflexVacuumAgent is located at (1, 0).\n" + ] + } + ], + "source": [ + "trivial_vacuum_env.add_thing(simple_reflex_agent)\n", + "\n", + "print(\"SimpleReflexVacuumAgent is located at {}.\".format(simple_reflex_agent.location))" + ] + }, + { + "cell_type": "code", + "execution_count": 52, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "State of the Environment: {(0, 0): 'Clean', (1, 0): 'Clean'}.\n", + "SimpleReflexVacuumAgent is located at (1, 0).\n" + ] + } + ], + "source": [ + "# Run the environment\n", + "trivial_vacuum_env.step()\n", + "\n", + "# Check the current state of the environment\n", + "print(\"State of the Environment: {}.\".format(trivial_vacuum_env.status))\n", + "\n", + "print(\"SimpleReflexVacuumAgent is located at {}.\".format(simple_reflex_agent.location))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## MODEL-BASED REFLEX AGENT PROGRAM\n", + "\n", + "A model-based reflex agent maintains some sort of **internal state** that depends on the percept history and thereby reflects at least some of the unobserved aspects of the current state. 
In addition to this, it also requires a **model** of the world, that is, knowledge about \"how the world works\".\n", + "\n", + "The schematic diagram shown in **Figure 2.11** of the book will make this more clear:\n", + "" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We will now create a model-based reflex agent for the environment:" + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": {}, + "outputs": [], + "source": [ + "# Delete the previously added simple reflex agent\n", + "trivial_vacuum_env.delete_thing(simple_reflex_agent)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We need another function UPDATE-STATE which will be responsible for creating a new state description." + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "ModelBasedVacuumAgent is located at (0, 0).\n" + ] + } + ], + "source": [ + "# TODO: Implement this function for the two-dimensional environment\n", + "def update_state(state, action, percept, model):\n", + " pass\n", + "\n", + "# Create a model-based reflex agent\n", + "model_based_reflex_agent = ModelBasedVacuumAgent()\n", + "\n", + "# Add the agent to the environment\n", + "trivial_vacuum_env.add_thing(model_based_reflex_agent)\n", + "\n", + "print(\"ModelBasedVacuumAgent is located at {}.\".format(model_based_reflex_agent.location))" + ] + }, + { + "cell_type": "code", + "execution_count": 18, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "State of the Environment: {(0, 0): 'Clean', (1, 0): 'Clean'}.\n", + "ModelBasedVacuumAgent is located at (1, 0).\n" + ] + } + ], + "source": [ + "# Run the environment\n", + "trivial_vacuum_env.step()\n", + "\n", + "# Check the current state of the environment\n", + "print(\"State of the Environment: {}.\".format(trivial_vacuum_env.status))\n", + "\n", + 
"print(\"ModelBasedVacuumAgent is located at {}.\".format(model_based_reflex_agent.location))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## GOAL-BASED AGENT PROGRAM\n", + "\n", + "A goal-based agent needs some sort of **goal** information that describes situations that are desirable, apart from the current state description.\n", + "\n", + "**Figure 2.13** of the book shows a model-based, goal-based agent:\n", + "\n", + "\n", + "**Search** (Chapters 3 to 5) and **Planning** (Chapters 10 to 11) are the subfields of AI devoted to finding action sequences that achieve the agent's goals.\n", + "\n", + "## UTILITY-BASED AGENT PROGRAM\n", + "\n", + "A utility-based agent maximizes its **utility** using the agent's **utility function**, which is essentially an internalization of the agent's performance measure.\n", + "\n", + "**Figure 2.14** of the book shows a model-based, utility-based agent:\n", + "" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## LEARNING AGENT\n", + "\n", + "Learning allows the agent to operate in initially unknown environments and to become more competent than its initial knowledge alone might allow. Here, we will briefly introduce the main ideas of learning agents. \n", + "\n", + "A learning agent can be divided into four conceptual components. The **learning element** is responsible for making improvements. It uses the feedback from the **critic** on how the agent is doing and determines how the performance element should be modified to do better in the future. The **performance element** is responsible for selecting external actions for the agent: it takes in percepts and decides on actions. The critic tells the learning element how well the agent is doing with respect to a fixed performance standard. It is necessary because the percepts themselves provide no indication of the agent's success. The last component of the learning agent is the **problem generator**. 
It is responsible for suggesting actions that will lead to new and informative experiences. \n", + "\n", + "**Figure 2.15** of the book sums up the components and their working: \n", + "" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.4" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/viterbi_algorithm.ipynb b/viterbi_algorithm.ipynb new file mode 100644 index 000000000..9c23c4f75 --- /dev/null +++ b/viterbi_algorithm.ipynb @@ -0,0 +1,418 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Probabilistic Reasoning over Time\n", + "---\n", + "# Finding the Most Likely Sequence with the Viterbi Algorithm\n", + "\n", + "## Introduction\n", + "A ***Hidden Markov Model*** (HMM) network is parameterized by two distributions:\n", + "\n", + "- the *emission or sensor probabilities* giving the conditional probability of observing evidence values for each hidden state;\n", + "- the *transition probabilities* giving the conditional probability of moving between states during the sequence. \n", + "\n", + "Additionally, an *initial distribution* describes the probability of a sequence starting in each state.\n", + "\n", + "At each time $t$, $X_t$ represents the *hidden state* and $E_t$ represents an *observation* at that time."
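As a rough sketch of this parameterization, the container below mirrors the `HiddenMarkovModel` class shown in the next cell; the probability values are illustrative placeholders, not values from the notebook:

```python
# A two-state HMM container mirroring the parameterization described above.
# The numbers here are illustrative; each model is indexed per state.
class SimpleHMM:
    def __init__(self, transition_model, sensor_model, prior=None):
        self.transition_model = transition_model  # P(X_t | X_{t-1}), one row per previous state
        self.sensor_model = sensor_model          # one row per evidence value
        self.prior = prior or [0.5, 0.5]          # initial distribution P(X_0)

    def sensor_dist(self, ev):
        # Row 0 holds P(ev=True | X_t) per state, row 1 holds P(ev=False | X_t)
        return self.sensor_model[0] if ev else self.sensor_model[1]


hmm = SimpleHMM(transition_model=[[0.7, 0.3], [0.3, 0.7]],
                sensor_model=[[0.9, 0.2], [0.1, 0.8]])
print(hmm.sensor_dist(True))   # distribution of the evidence over the hidden states
```

Note that the prior defaults to a uniform distribution over the two states when none is supplied, just as in the class shown below.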
+ ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [], + "source": [ + "from probability import *" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mclass\u001b[0m \u001b[0mHiddenMarkovModel\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"A Hidden markov model which takes Transition model and Sensor model as inputs\"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m__init__\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mtransition_model\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0msensor_model\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mprior\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mNone\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtransition_model\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mtransition_model\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0msensor_model\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0msensor_model\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mprior\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mprior\u001b[0m \u001b[0;32mor\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0;36m0.5\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;36m0.5\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0msensor_dist\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mev\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m 
\u001b[0mev\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0msensor_model\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0msensor_model\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource HiddenMarkovModel" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Finding the Most Likely Sequence\n", + "\n", + "There is a linear-time algorithm for finding the most likely sequence: the easiest way to think about the problem is to view each sequence as a path through a graph whose nodes are the possible states at each time step. Now consider the task of finding the most likely path through this graph, where the likelihood of any path is the product of the transition probabilities along the path and the probabilities of the given observations at each state. There is a recursive relationship between most likely paths to each state $x_{t+1}$ and most likely paths to each state $x_t$ . 
We can write this relationship as an equation connecting the probabilities of the paths:\n", + "\n", + "$$ \n", + "\\begin{align*}\n", + "m_{1:t+1} &= \\max_{x_{1:t}} \\textbf{P}(\\textbf{x}_{1:t}, \\textbf{X}_{t+1} | \\textbf{e}_{1:t+1}) \\\\\n", + "&= \\alpha \\textbf{P}(\\textbf{e}_{t+1} | \\textbf{X}_{t+1}) \\max_{x_t} \\Big(\\textbf{P}\n", + "(\\textbf{X}_{t+1} | \\textbf{x}_t) \\max_{x_{1:t-1}} P(\\textbf{x}_{1:t-1}, \\textbf{x}_{t} | \\textbf{e}_{1:t})\\Big)\n", + "\\end{align*}\n", + "$$\n", + "\n", + "The *Viterbi algorithm* is a dynamic programming algorithm for *finding the most likely sequence of hidden states*, called the Viterbi path, that results in a sequence of observed events in the context of HMMs.\n", + "This algorithm is useful in many applications, including *speech recognition*, where the aim is to find the most likely sequence of words given a series of sounds, and the *reconstruction of bit strings transmitted over a noisy channel*." + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "\u001b[0;32mdef\u001b[0m \u001b[0mviterbi\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mHMM\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mev\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m\"\"\"\u001b[0m\n", + "\u001b[0;34m [Equation 15.11]\u001b[0m\n", + "\u001b[0;34m Viterbi algorithm to find the most likely sequence. 
Computes the best path and the\u001b[0m\n", + "\u001b[0;34m corresponding probabilities, given an HMM model and a sequence of observations.\"\"\"\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mt\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mev\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mev\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mev\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcopy\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mev\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0minsert\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;32mNone\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mm\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m0.0\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;36m0.0\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0m_\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mrange\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mev\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m-\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# the recursion is initialized with m1 = forward(P(X0), e1)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mm\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mforward\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mHMM\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mHMM\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mprior\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mev\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + 
"\u001b[0;34m\u001b[0m \u001b[0;31m# keep track of maximizing predecessors\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mbacktracking_graph\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mi\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mrange\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mt\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mm\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mi\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0melement_wise_product\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mHMM\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0msensor_dist\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mev\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mi\u001b[0m \u001b[0;34m+\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0mmax\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0melement_wise_product\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mHMM\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtransition_model\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mm\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mi\u001b[0m \u001b[0;34m-\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mmax\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0melement_wise_product\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mHMM\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtransition_model\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mm\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mi\u001b[0m \u001b[0;34m-\u001b[0m 
\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mbacktracking_graph\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margmax\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0melement_wise_product\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mHMM\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtransition_model\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mm\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mi\u001b[0m \u001b[0;34m-\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margmax\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0melement_wise_product\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mHMM\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtransition_model\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mm\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mi\u001b[0m \u001b[0;34m-\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# computed probabilities\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mml_probabilities\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0;36m0.0\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0;34m(\u001b[0m\u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mev\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m-\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# most likely 
sequence\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mml_path\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0;32mTrue\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0;34m(\u001b[0m\u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mev\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m-\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# the construction of the most likely sequence starts in the final state with the largest probability, and\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;31m# runs backwards; the algorithm needs to store for each xt its predecessor xt-1 maximizing its probability\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mi_max\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margmax\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mm\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mi\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mrange\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mt\u001b[0m \u001b[0;34m-\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m-\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m-\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mml_probabilities\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mi\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mm\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mi\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mi_max\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m 
\u001b[0mml_path\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mi\u001b[0m\u001b[0;34m]\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;32mTrue\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mi_max\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0;36m0\u001b[0m \u001b[0;32melse\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mi\u001b[0m \u001b[0;34m>\u001b[0m \u001b[0;36m0\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0mi_max\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mbacktracking_graph\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mi\u001b[0m \u001b[0;34m-\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mi_max\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\n", + "\u001b[0;34m\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mml_path\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mml_probabilities\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "%psource viterbi" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Umbrella World\n", + "---\n", + "\n", + "> You are the security guard stationed at a secret underground installation. Each day, you try to guess whether it’s raining today, but your only access to the outside world occurs each morning when you see the director coming in with, or without, an umbrella.\n", + "\n", + "In this problem, $t$ corresponds to each day of the week, the hidden state $X_t$ represents the *weather* outside at day $t$ (whether it is rainy or sunny), and the observation $E_t$ records whether at day $t$ the security guard sees the director carrying an *umbrella* or not."
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Observation Emission or Sensor Probabilities $P(E_t := Umbrella_t | X_t := Weather_t)$\n", + "We need to assume that we have some prior knowledge about the director's behavior to estimate the emission probabilities for each hidden state:\n", + "\n", + "| | $yes$ | $no$ |\n", + "| --- | --- | --- |\n", + "| $Sunny$ | 0.10 | 0.90 |\n", + "| $Rainy$ | 0.80 | 0.20 |\n", + "\n", + "#### Initial Probability $P(X_0 := Weather_0)$\n", + "We will assume that we don't know anything useful about the likelihood of a sequence starting in either state. If the sequences start each week on Monday and end each week on Friday (so each week is a new sequence), then this assumption means that it's equally likely that the weather on a Monday may be Rainy or Sunny. We can assign equal probability to each starting state:\n", + "\n", + "| $Sunny$ | $Rainy$ |\n", + "| --- | --- |\n", + "| 0.5 | 0.5 |\n", + "\n", + "#### State Transition Probabilities $P(X_{t} := Weather_t | X_{t-1} := Weather_{t-1})$\n", + "Finally, we will assume that we can estimate transition probabilities from something like historical weather data for the area.
Under this assumption, we get the conditional probability:\n", + "\n", + "| | $Sunny$ | $Rainy$ |\n", + "| --- | --- | --- |\n", + "|$Sunny$| 0.70 | 0.30 |\n", + "|$Rainy$| 0.30 | 0.70 |" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": {}, + "outputs": [], + "source": [ + "umbrella_transition = [[0.7, 0.3], [0.3, 0.7]]\n", + "umbrella_sensor = [[0.9, 0.2], [0.1, 0.8]]\n", + "umbrellaHMM = HiddenMarkovModel(umbrella_transition, umbrella_sensor)" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": {}, + "outputs": [], + "source": [ + "from graphviz import Digraph" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "[graphviz state diagram of the umbrella HMM]" + ] + }, + "execution_count": 11, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "dot = Digraph()\n", + "\n", + "dot.node('I', 'Start', shape='doublecircle')\n", + "dot.node('R', 'Rainy')\n", + "dot.node('S', 'Sunny')\n", + "\n", + "# initial probabilities\n", + "dot.edge('I', 'R', label='0.5')\n", + "dot.edge('I', 'S', label='0.5')\n", + "\n", + "# transition probabilities (matching umbrella_transition)\n", + "dot.edge('R', 'R', label='0.7')\n", + "dot.edge('R', 'S', label='0.3')\n", + "dot.edge('S', 'R', label='0.3')\n", + "dot.edge('S', 'S', label='0.7')\n", + "\n", + "dot.node('Y', 'Yes')\n", + "dot.node('N', 'No')\n", + "\n", + "# emission probabilities (matching umbrella_sensor)\n", + "dot.edge('R', 'Y', label='0.8')\n", + "dot.edge('R', 'N', label='0.2')\n", + "dot.edge('S', 'Y', label='0.1')\n", + "dot.edge('S', 'N', label='0.9')\n", + "\n", + "dot" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Suppose that $[true, true, false, true, true]$ is the umbrella sequence for the security guard’s first five days on the job. What is the weather sequence most likely to explain this?"
+ ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": {}, + "outputs": [], + "source": [ + "from utils import rounder" + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "([1, 1, 0, 1, 1], [0.8182, 0.5155, 0.1237, 0.0334, 0.021])" + ] + }, + "execution_count": 14, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "umbrella_evidence = [True, True, False, True, True]\n", + "\n", + "rounder(viterbi(umbrellaHMM, umbrella_evidence))" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.7.3" + } + }, + "nbformat": 4, + "nbformat_minor": 4 +}